After watching Ryan Weald's talk about integrating Spark Streaming with 
Algebird and Storehaus, I decided that I had to give this a try! :-)  But I'm 
having some rather basic problems.

My code looks a lot like the example Ryan gives.  Here's the basic structure:

        dstream.foreach { rdd =>
          if (rdd.count != 0) {
            val data = rdd.collect.toMap
            store.multiMerge(data)
          }
        }

where dstream is my Spark DStream and store is a Storehaus MySqlLongStore.

The problem is that the Spark Streaming CheckpointWriter wants to serialize the 
store object and it can't (I end up getting a NotSerializableException).
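In case it helps frame the question: one workaround I've seen suggested (but haven't verified myself) is to construct the store lazily inside a singleton object on the workers, so the closure captures only the object reference rather than the store itself. The `StoreHolder` name and the construction details below are just placeholders for illustration. Roughly:

```scala
import com.twitter.storehaus.mysql.MySqlLongStore

// Hypothetical holder object: because it's a Scala object with a lazy val,
// the store is constructed once per worker JVM on first use, and nothing
// here needs to be serialized with the closure.
object StoreHolder {
  lazy val store: MySqlLongStore =
    ??? // construct the MySqlLongStore with your connection params here
}

dstream.foreach { rdd =>
  if (rdd.count != 0) {
    val data = rdd.collect.toMap
    StoreHolder.store.multiMerge(data)
  }
}
```

But I don't know whether that's the intended pattern or whether the CheckpointWriter still trips over it, which is why I'm asking.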

Anybody have an example of working code?  Is this problem specific to 
MySqlLongStore, or is it generic to all stores?


Thanks,

Jim Donahue
Adobe
