I'm running out of options trying to integrate Cassandra, Spark, and the
spark-cassandra-connector.

I quickly found out that just grabbing the latest versions of everything
(drivers, etc.) doesn't work; there appear to be binary incompatibilities.

So then I tried using the driver versions from the spark-cassandra-connector's
own build.  Better, but still no dice.  Any successes out there?  I'd really
love to use this stack.
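
For reference, the shape of build.sbt I've been experimenting with is roughly
the following.  The version numbers are just my current guess at a compatible
pairing (not a known-good combination), and I'm letting the connector pull in
its matching Cassandra Java driver transitively:

    // build.sbt -- sketch only; versions are guesses, not a verified combination
    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      // Spark itself marked "provided", assuming the job runs against an installed Spark
      "org.apache.spark"   %% "spark-core"                % "1.1.0" % "provided",
      // the connector should bring in a matching cassandra-driver-core transitively,
      // so I'm not pinning the DataStax Java driver separately
      "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0"
    )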

If curious, my ridiculously trivial example is here:
https://github.com/gzoller/doesntwork

If you run 'sbt test' you'll get a NoHostAvailableException complaining that it
tried /10.0.0.194:9042.  I have no idea where that address came from; I was
trying to connect to a local Cassandra instance.

Any ideas appreciated!
Greg


