Hi

We just tested a switch from Spark 2.0.2 to Spark 2.1.0 on our codebase. It
compiles fine, but the following runtime exception is now thrown when our
Cassandra connection is initialized. I can't find any clues in the release
notes. Has anyone experienced this?

Morten

sbt.ForkMain$ForkError: java.lang.reflect.InvocationTargetException
        at com.google.common.base.Throwables.propagate(Throwables.java:160)
        at com.datastax.driver.core.NettyUtil.newEventLoopGroupInstance(NettyUtil.java:136)
        at com.datastax.driver.core.NettyOptions.eventLoopGroup(NettyOptions.java:96)
        at com.datastax.driver.core.Connection$Factory.<init>(Connection.java:713)
        at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1375)
        at com.datastax.driver.core.Cluster.init(Cluster.java:163)
        at com.datastax.driver.core.Cluster.connectAsync(Cluster.java:334)
        at com.datastax.driver.core.Cluster.connectAsync(Cluster.java:309)
        at com.datastax.driver.core.Cluster.connect(Cluster.java:251)
        at com.websudos.phantom.connectors.DefaultSessionProvider$$anonfun$3$$anonfun$4.apply(DefaultSessionProvider.scala:66)
        at com.websudos.phantom.connectors.DefaultSessionProvider$$anonfun$3$$anonfun$4.apply(DefaultSessionProvider.scala:66)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.package$.blocking(package.scala:123)
        at com.websudos.phantom.connectors.DefaultSessionProvider$$anonfun$3.apply(DefaultSessionProvider.scala:65)
        at com.websudos.phantom.connectors.DefaultSessionProvider$$anonfun$3.apply(DefaultSessionProvider.scala:64)
        at scala.util.Try$.apply(Try.scala:192)
        at com.websudos.phantom.connectors.DefaultSessionProvider.createSession(DefaultSessionProvider.scala:64)
        at com.websudos.phantom.connectors.DefaultSessionProvider.<init>(DefaultSessionProvider.scala:81)
        at com.websudos.phantom.connectors.KeySpaceDef.<init>(Keyspace.scala:92)
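One guess (not confirmed): since the failure is an InvocationTargetException
inside NettyUtil.newEventLoopGroupInstance, this could be a Netty version
clash, i.e. Spark 2.1.0 pulling a different Netty onto the classpath than the
Datastax driver was built against. A small sketch like the following should
show which jar each relevant class is actually loaded from (class names here
are just the ones I'd check first):

```scala
// Hypothetical diagnostic: print the jar/location each class is loaded
// from, to spot a Netty version conflict between Spark and the Cassandra
// driver (my assumed cause, not confirmed).
object WhichJar {
  /** URL of the .class resource for `className`, if it is on the classpath. */
  def locate(className: String): Option[String] = {
    val resource = "/" + className.replace('.', '/') + ".class"
    Option(getClass.getResource(resource)).map(_.toString)
  }

  def main(args: Array[String]): Unit =
    Seq("io.netty.channel.EventLoopGroup", "com.datastax.driver.core.Cluster")
      .foreach(c => println(s"$c -> ${locate(c).getOrElse("not on classpath")}"))
}
```

Running `sbt evicted` should also list the dependency conflicts sbt resolved,
which would show whether the Netty version changed with the Spark upgrade.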



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/New-runtime-exception-after-switch-to-Spark-2-1-0-tp28263.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

