For anyone revisiting this at a later point: the issue was that Spark 2.1.0
upgrades netty to version 4.0.42, which is not binary compatible with version
4.0.37 used by version 3.1.0 of the Cassandra Java Driver. The newer netty
version works with Cassandra, but because the Maven artifacts differ (Spark
depends on netty-all while the Cassandra driver depends on netty-transport),
the conflict was not automatically resolved by SBT. Adding an explicit
dependency on netty-transport version 4.0.42 solved the problem.
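In build.sbt terms, the fix might look like the sketch below. Note this is an illustration, not the poster's exact build file: the driver artifact coordinates and the ".Final" suffix on the netty version (which is how netty publishes its releases to Maven Central) are my assumptions.

```scala
// build.sbt (sketch) -- pin netty-transport so it matches the netty-all
// 4.0.42 that Spark 2.1.0 brings in. SBT's version conflict resolution only
// reconciles versions of the *same* artifact, so netty-all (from Spark) and
// netty-transport (from the Cassandra driver) are never reconciled against
// each other automatically.
libraryDependencies ++= Seq(
  "org.apache.spark"       %% "spark-core"            % "2.1.0",
  "com.datastax.cassandra" %  "cassandra-driver-core" % "3.1.0",
  // Explicit override: force netty-transport up to the version that is
  // binary compatible with Spark's netty-all.
  "io.netty"               %  "netty-transport"       % "4.0.42.Final"
)
```

Running `sbt dependencyTree` (via the sbt-dependency-graph plugin) before and after is a handy way to confirm which netty-transport version actually ends up on the classpath.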



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/New-runtime-exception-after-switch-to-Spark-2-1-0-tp28263p28319.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
