Re: Writing streaming data to cassandra creates duplicates

2015-07-30 Thread Priya Ch
Hi All, Can someone throw insights on this?

On Wed, Jul 29, 2015 at 8:29 AM, Priya Ch wrote:
> Hi TD,
>
> Thanks for the info. I have a scenario like this.
>
> I am reading the data from a Kafka topic. Let's say Kafka has 3 partitions
> for the topic. In my streaming application, I woul
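The duplicates described in this thread are the classic symptom of at-least-once delivery: when a micro-batch is replayed after a failure and the sink does blind appends, the same rows land twice. A minimal sketch of that failure mode, in plain Python with no Spark or Kafka required (the `append_writer` helper is hypothetical, standing in for an unkeyed write to the database):

```python
# Hypothetical append-only sink: no notion of a primary key,
# so every write adds new rows unconditionally.
def append_writer(store, records):
    store.extend(records)

batch = [("user1", 10), ("user2", 20)]
table = []

append_writer(table, batch)
append_writer(table, batch)  # same micro-batch replayed after a failure

# The replay doubled the data: 4 rows instead of 2.
assert len(table) == 4
assert table.count(("user1", 10)) == 2
```

This is why the fix discussed below centers on making each write idempotent rather than trying to prevent replays, which at-least-once semantics cannot guarantee.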

Re: Writing streaming data to cassandra creates duplicates

2015-07-27 Thread Tathagata Das
You have to partition that data in Spark Streaming by the primary key, and then make sure you insert the data into Cassandra atomically per key, or per set of keys in the partition. You can use the combination of the batch time and partition id of the RDD inside foreachRDD as the unique id for the d
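TD's suggestion can be sketched in plain Python without a Spark or Cassandra dependency. The idea, as an assumption-laden illustration: key every write by a deterministic id built from the batch time and partition id (both available inside foreachRDD), plus the row's own key. Since Cassandra INSERTs are upserts on the primary key, a replayed batch overwrites the same rows instead of duplicating them. The `upsert` helper and the dict-as-table are hypothetical stand-ins for the real connector write:

```python
# Hypothetical idempotent sink: a dict keyed by
# (batch_time, partition_id, row_key) mimics Cassandra's
# upsert-on-primary-key behavior.
def upsert(table, batch_time, partition_id, rows):
    for row_key, value in rows:
        # Writing the same key twice overwrites, never duplicates.
        table[(batch_time, partition_id, row_key)] = value

table = {}
rows = [("user1", 10), ("user2", 20)]

upsert(table, batch_time=1438128000, partition_id=0, rows=rows)
upsert(table, batch_time=1438128000, partition_id=0, rows=rows)  # replay

# The replayed batch hit the same primary keys: still 2 rows.
assert len(table) == 2
```

The design point is that the deterministic (batch time, partition id) pair makes the write a pure function of the batch's identity, so at-least-once replays become harmless.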