--
From: Amit Joshi
Sent: Monday, August 10, 2020 02:37
To: user
Subject: [Spark-Kafka-Streaming] Verifying the approach for multiple queries
Hi,
I have a scenario where a Kafka topic is written with different types
of JSON records. I have to regroup the records by type, then fetch the
matching schema, parse the records, and write them out as Parquet.
I have tried Structured Streaming, but the dynamic schema is a constraint.
So I have used DStream
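The regroup-by-type step described above can be sketched without Spark. This is a minimal, hypothetical illustration only: it assumes each JSON record carries a "type" field (a naming assumption, not stated in the original), and it stands in for what a DStream transformation would do per batch before per-type schema parsing and the Parquet write.

```python
import json
from collections import defaultdict

def regroup_by_type(raw_records):
    # Parse each raw JSON string and bucket it by its (assumed) "type"
    # field, so each group can later be parsed against its own schema
    # and written out separately (e.g. as Parquet).
    groups = defaultdict(list)
    for raw in raw_records:
        record = json.loads(raw)
        groups[record["type"]].append(record)
    return dict(groups)

# Example batch of mixed record types, as might arrive from one topic:
records = [
    '{"type": "order", "id": 1}',
    '{"type": "click", "page": "/home"}',
    '{"type": "order", "id": 2}',
]
grouped = regroup_by_type(records)
print(sorted(grouped))        # type keys present in this batch
print(len(grouped["order"]))  # number of "order" records
```

In an actual DStream job, the same grouping would typically happen inside a `foreachRDD`/`transform` over each micro-batch rather than over a plain Python list.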