Oops, I think that should fix it. I'm going to try it now.

Great catch! I feel like an idiot.
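
For reference, the fix is to make every Scala-suffixed artifact match the Scala binary version Spark itself was built with (2.11 here); mixing a _2.10 connector with a Scala 2.11 Spark is exactly what produces the NoClassDefFoundError for scala.runtime classes below, since the specialized runtime classes differ between Scala binary versions. A minimal sketch of a corrected launch command, assuming the connector's 2.0.0-RC1 release was also published for Scala 2.11 (the exact coordinates should be checked against the connector's release notes):

```shell
# Sketch only: swap the _2.10 connector jar for the _2.11 build so that
# Spark 2.0.2 (built against Scala 2.11) and the connector agree on the
# Scala binary version. Coordinates are illustrative.
spark-shell \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-RC1
```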

On Fri, Feb 17, 2017 at 10:02 AM, Russell Spitzer <russell.spit...@gmail.com> wrote:

> Great catch Anastasios!
>
> On Fri, Feb 17, 2017 at 9:59 AM Anastasios Zouzias <zouz...@gmail.com> wrote:
>
>> Hey,
>>
>> Can you try the 2.11 build of the spark-cassandra-connector? You just
>> reported that you are using spark-cassandra-connector*_2.10*-2.0.0-RC1.jar.
>>
>> Best,
>> Anastasios
>>
>> On Fri, Feb 17, 2017 at 6:40 PM, kant kodali <kanth...@gmail.com> wrote:
>>
>> Hi,
>>
>>
>> val df = spark.read.format("org.apache.spark.sql.cassandra")
>>   .options(Map("table" -> "hello", "keyspace" -> "test"))
>>   .load()
>>
>> This line works fine; I can see that it actually pulled the table schema
>> from Cassandra. However, when I run df.count I get the error below.
>>
>>
>> I am using the following versions:
>>
>> Spark 2.0.2
>>
>> spark-sql_2.11-2.0.2.jar
>>
>> spark-cassandra-connector_2.10-2.0.0-RC1.jar
>>
>> Java 8
>>
>> Scala 2.11.8
>>
>>
>>
>> java.lang.NoClassDefFoundError: scala/runtime/AbstractPartialFunction$mcJL$sp
>> at java.lang.ClassLoader.defineClass1(Native Method)
>> at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>> at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>> at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>> at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>> at com.datastax.spark.connector.rdd.CassandraLimit$.limitForIterator(CassandraLimit.scala:21)
>> at com.datastax.spark.connector.rdd.CassandraTableScanRDD.compute(CassandraTableScanRDD.scala:367)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
>> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
>> at org.apache.spark.scheduler.Task.run(Task.scala:86)
>> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>> at java.lang.Thread.run(Thread.java:745)
>> Caused by: java.lang.ClassNotFoundException: scala.runtime.AbstractPartialFunction$mcJL$sp
>> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>> ... 35 more
>>
>> 17/02/17 17:35:33 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
>> java.lang.NoClassDefFoundError: com/datastax/spark/connector/rdd/CassandraLimit$$anonfun$limitForIterator$1
>> at com.datastax.spark.connector.rdd.CassandraLimit$.limitForIterator(CassandraLimit.scala:21)
>> at com.datastax.spark.connector.rdd.CassandraTableScanRDD.compute(CassandraTableScanRDD.scala:367)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
>> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
>> at org.apache.spark.scheduler.Task.run(Task.scala:86)
>> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> 17/02/17 17:35:33 ERROR Executor: Exception in task 2.0 in stage 0.0 (TID 2)
>> java.lang.NoClassDefFoundError: com/datastax/spark/connector/rdd/CassandraLimit$$anonfun$limitForIterator$1
>> [identical stack trace omitted]
>>
>> 17/02/17 17:35:33 ERROR Executor: Exception in task 3.0 in stage 0.0 (TID 3)
>> java.lang.NoClassDefFoundError: com/datastax/spark/connector/rdd/CassandraLimit$$anonfun$limitForIterator$1
>> [identical stack trace omitted]
>>
>> 17/02/17 17:35:33 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, localhost): java.lang.NoClassDefFoundError: scala/runtime/AbstractPartialFunction$mcJL$sp
>> [identical stack trace omitted]
>>
>>
>> 17/02/17 17:35:33 ERROR TaskSetManager: Task 1 in stage 0.0 failed 1
>> times; aborting job
>>
>> 17/02/17 17:35:33 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.NoClassDefFoundError: com/datastax/spark/connector/rdd/CassandraLimit$$anonfun$limitForIterator$1
>> [identical stack trace omitted]
>>
>>
>>
>>
>> --
>> -- Anastasios Zouzias
>> <a...@zurich.ibm.com>
>>
>
