How do I implement a window function in Apache Flink?



Hi everyone,

    SUM(amount) OVER (PARTITION BY ...)

Can I use DataStream functions for these operations? Or,

how do I transform the Kafka data into a Table and use sqlQuery on it?

The destination is another Kafka topic.

    val stream = senv
      .addSource(new FlinkKafkaConsumer[String]("flink", new SimpleStringSchema(), properties))

I tried this:

val tableA = tableEnv.fromDataStream(stream, 'user, 'product, 'amount)

but I got the following error:

Exception in thread "main" org.apache.flink.table.api.ValidationException: Too many fields referenced from an atomic type.

Test data

1,"beer",3
1,"beer",1
2,"beer",3
3,"diaper",4
4,"diaper",1
5,"diaper",5
6,"rubber",2

Example query

    SELECT
      user, product, amount,
      COUNT(user) OVER (PARTITION BY product) AS count_product
    FROM table;

Expected output

1,"beer",3,3
1,"beer",1,3
2,"beer",3,3
3,"diaper",4,3
4,"diaper",1,3
5,"diaper",5,3
6,"rubber",2,1
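As a sanity check, the expected `count_product` column can be reproduced with plain Scala collections. This is only an illustrative sketch of the `OVER (PARTITION BY product)` semantics on the sample rows above, not Flink code, and all names in it are made up:

```scala
// Plain Scala sketch (not Flink): reproduce COUNT(user) OVER (PARTITION BY product)
// on the sample rows. Names like Row and countsByProduct are illustrative only.
object PartitionCountSketch {
  case class Row(user: Int, product: String, amount: Int)

  val rows: List[Row] = List(
    Row(1, "beer", 3), Row(1, "beer", 1), Row(2, "beer", 3),
    Row(3, "diaper", 4), Row(4, "diaper", 1), Row(5, "diaper", 5),
    Row(6, "rubber", 2)
  )

  // Count the rows in each product partition...
  val countsByProduct: Map[String, Int] =
    rows.groupBy(_.product).map { case (p, rs) => p -> rs.size }

  // ...then attach that count to every row in its partition.
  val withCount: List[(Int, String, Int, Int)] =
    rows.map(r => (r.user, r.product, r.amount, countsByProduct(r.product)))

  def main(args: Array[String]): Unit = withCount.foreach(println)
}
```

Every `beer` and `diaper` row gets count 3 and the single `rubber` row gets count 1, matching the table above.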

1 Answer



You need to parse the strings into fields first, and then rename them.

import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api._
import org.apache.flink.table.api.bridge.scala._

val env = StreamExecutionEnvironment.getExecutionEnvironment
val tEnv = StreamTableEnvironment.create(env)

val stream = env.fromElements(
  "1,beer,3", "1,beer,1", "2,beer,3",
  "3,diaper,4", "4,diaper,1", "5,diaper,5", "6,rubber,2")

// Parse each CSV line into a (user, product, amount) tuple.
val parsed = stream.map { x =>
  val arr = x.split(",")
  (arr(0).toInt, arr(1), arr(2).toInt)
}

// Rename the tuple fields while converting the stream to a Table.
val tableA = tEnv.fromDataStream(parsed,
  $"_1" as "user", $"_2" as "product", $"_3" as "amount")

// Example query. `user` is a reserved keyword in Flink SQL, so escape it with backticks.
val result = tEnv.sqlQuery(s"SELECT `user`, product, amount FROM $tableA")

val rs = result.toAppendStream[(Int, String, Int)]
rs.print()
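The parsing step, pulled out as a plain function, looks like this. Note it assumes simple comma-separated input without quoting; the sample data in the question uses quoted products like `"beer"`, so in that case the quotes would need to be stripped first:

```scala
object ParseSketch {
  // Split a CSV line such as "1,beer,3" into a typed (user, product, amount) tuple.
  // Assumption: no quoting and no embedded commas in the fields.
  def parse(line: String): (Int, String, Int) = {
    val arr = line.split(",")
    (arr(0).toInt, arr(1), arr(2).toInt)
  }
}
```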

I'm not sure how to implement the desired window function in Flink SQL. Alternatively, it can be implemented in the DataStream API:

parsed
  // Key by product. Note: TumblingEventTimeWindows requires timestamps and
  // watermarks to be assigned upstream, or the windows will never fire.
  .keyBy(x => x._2)
  .window(TumblingEventTimeWindows.of(Time.milliseconds(2)))
  .process(new ProcessWindowFunction[
    (Int, String, Int), (Int, String, Int, Int), String, TimeWindow
  ]() {
    override def process(key: String, context: Context,
                         elements: Iterable[(Int, String, Int)],
                         out: Collector[(Int, String, Int, Int)]): Unit = {
      // Emit every element in the window together with the window's element count.
      val lst = elements.toList
      lst.foreach(x => out.collect((x._1, x._2, x._3, lst.size)))
    }
  })
  .print()
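The body of that `ProcessWindowFunction` reduces to a plain function that can be checked independently of Flink (illustrative sketch; the name `processWindow` is made up). Keep in mind it counts elements per key *per window*, not per whole partition, so it only matches the expected output if all rows for a product land in the same window:

```scala
object WindowCountSketch {
  // For one window's worth of elements, re-emit each element with the
  // window's element count appended, mirroring the process() body above.
  def processWindow(elements: Iterable[(Int, String, Int)]): List[(Int, String, Int, Int)] = {
    val lst = elements.toList
    lst.map(x => (x._1, x._2, x._3, lst.size))
  }
}
```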
