Structured streaming, Writing Kafka topic to BigQuery table, throws error

2021-02-23 Thread Mich Talebzadeh
With the old Spark Streaming API (example in Scala), this would have been easier through RDDs. You could read the data:

    val dstream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      streamingContext, kafkaParams, topicsValue)
    dstream.foreachRDD { pricesRDD => if
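[The archived message is truncated mid-snippet. Purely for illustration, a minimal sketch of how this legacy DStream pattern (the spark-streaming-kafka-0-8 API) is typically completed might look as follows; the broker address, topic name, batch interval, and the println sink are assumptions, not Mich's actual code.]

    // Sketch of the legacy direct-stream pattern, assuming spark-streaming-kafka-0-8.
    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object LegacyKafkaStream {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf().setAppName("LegacyKafkaStream")
        val streamingContext = new StreamingContext(sparkConf, Seconds(10))

        // Assumed broker and topic settings; not from the original post.
        val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
        val topicsValue = Set("prices")

        val dstream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          streamingContext, kafkaParams, topicsValue)

        dstream.foreachRDD { pricesRDD =>
          // Skip empty micro-batches, then hand each RDD to ordinary batch code.
          if (!pricesRDD.isEmpty()) {
            // Records arrive as (key, value) pairs; print the values on the executors.
            pricesRDD.map(_._2).foreach(record => println(record))
          }
        }

        streamingContext.start()
        streamingContext.awaitTermination()
      }
    }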

Re: Structured streaming, Writing Kafka topic to BigQuery table, throws error

2021-02-23 Thread Jungtaek Lim
If your code doesn't require "end to end exactly-once", then you could leverage foreachBatch, which enables you to use a batch sink. If your code requires "end to end exactly-once", well, that's a different story. I'm not familiar with BigQuery and have no idea how the sink is implemented, but
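[To illustrate Jungtaek's suggestion, here is a minimal sketch of foreachBatch driving a batch write from a Kafka source, assuming the Google spark-bigquery-connector is on the classpath. The table name, staging bucket, Kafka settings, and checkpoint path are placeholders, not from the thread.]

    // Sketch: Structured Streaming from Kafka, writing each micro-batch to
    // BigQuery through the connector's batch API inside foreachBatch.
    import org.apache.spark.sql.{DataFrame, SparkSession}

    object KafkaToBigQuery {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder.appName("KafkaToBigQuery").getOrCreate()

        val kafkaDF = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092") // assumed broker
          .option("subscribe", "prices")                       // assumed topic
          .load()
          .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

        val query = kafkaDF.writeStream
          // foreachBatch exposes each micro-batch as a plain DataFrame, so any
          // batch sink (here BigQuery) can be used even though there is no
          // native streaming sink for it.
          .foreachBatch { (batchDF: DataFrame, batchId: Long) =>
            println(s"Writing micro-batch $batchId to BigQuery")
            batchDF.write
              .format("bigquery")
              .option("table", "my_dataset.my_table")            // hypothetical target
              .option("temporaryGcsBucket", "my-staging-bucket") // hypothetical bucket
              .mode("append")
              .save()
          }
          .option("checkpointLocation", "/tmp/kafka-to-bq-checkpoint")
          .start()

        query.awaitTermination()
      }
    }

Because foreachBatch only guarantees at-least-once delivery (a micro-batch can be replayed after a failure), the supplied batchId can be used to deduplicate writes if the target table must not see duplicates; that is the trade-off Jungtaek is pointing at versus true end-to-end exactly-once.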