Hi Harold,

Yes, that is the problem :) Sorry for the confusion; I will make this clear in 
the docs ;) since master is work toward the next version.

All you need to do is use 
Spark 1.1.0 as you have it already, with 
"com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0-beta1"
and build the assembly - not from master: check out branch b1.1 and run 
sbt ;clean ;reload ;assembly
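
In build.sbt that looks roughly like this (just a sketch - the project name
and Scala version are placeholders, and it assumes you already have the
sbt-assembly plugin configured):

// build.sbt (sketch)
name := "my-streaming-app"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Spark itself is "provided" so it is not bundled into the assembly jar
  "org.apache.spark" %% "spark-streaming" % "1.1.0" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0-beta1"
)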

Cheers,
- Helena
@helenaedelson


On Oct 31, 2014, at 1:35 PM, Harold Nguyen <har...@nexgate.com> wrote:

> Hi Helena,
> 
> Thanks very much! I'm using Spark 1.1.0, and 
> spark-cassandra-connector-assembly-1.2.0-SNAPSHOT
> 
> Best wishes,
> 
> Harold
> 
> On Fri, Oct 31, 2014 at 10:31 AM, Helena Edelson 
> <helena.edel...@datastax.com> wrote:
> Hi Harold,
> Can you include the versions of spark and spark-cassandra-connector you are 
> using?
> 
> Thanks!
> 
> Helena
> @helenaedelson
> 
> On Oct 30, 2014, at 12:58 PM, Harold Nguyen <har...@nexgate.com> wrote:
> 
> > Hi all,
> >
> > I'd like to be able to modify values in a DStream, and then send it off to 
> > an external source like Cassandra, but I keep getting Serialization errors 
> > and am not sure how to use the correct design pattern. I was wondering if 
> > you could help me.
> >
> > I'd like to be able to do the following:
> >
> >  wordCounts.foreachRDD( rdd => {
> >
> >        val arr = rdd.toArray
> >        ...
> >
> > })
> >
> > I would like to use "arr" to send data back to Cassandra, for instance 
> > like this:
> >
> > val collection = sc.parallelize(Seq(arr.head._1, arr.head._2))
> > collection.saveToCassandra(....)
> >
> > Or something like that, but as you know, I can't do this within 
> > "foreachRDD" but only at the driver level. How do I use the "arr" variable 
> > to do something like that?
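> >
> > For reference, I gather the connector can write straight from the workers 
> > with something like the sketch below (the keyspace "test", table "words", 
> > and column names are just placeholders), but I don't know whether this is 
> > the right pattern:
> >
> > import com.datastax.spark.connector._
> >
> > wordCounts.foreachRDD( rdd => {
> >   // Write each batch from the workers; nothing is collected back to
> >   // the driver, so the closure has no unserializable references.
> >   rdd.saveToCassandra("test", "words", SomeColumns("word", "count"))
> > })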
> >
> > Thanks for any help,
> >
> > Harold
> >
> 
> 
