Hi Steve

Since you are familiar with Groovy, I will go a bit deeper into the details. My (simple) Groovy scripts work fine with Apache Spark: a closure, once dehydrated, serializes nicely. The trouble starts when I run my scripts through GroovyShell (my ultimate goal is to integrate with Apache Zeppelin, which is why I need GroovyShell to run the scripts), and that is where I get the exception I posted earlier.
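For what it's worth, the failure mode can be reproduced without Spark or Groovy at all: plain Java deserialization throws the same ClassNotFoundException whenever the receiving classloader cannot see the class bytes, which is exactly the situation for classes GroovyShell compiles in memory. A minimal sketch (the class and method names here are mine, purely illustrative):

```java
import java.io.*;
import java.net.URL;
import java.net.URLClassLoader;

public class CnfeDemo {
    // Stand-in for a generated closure class such as Script1$_run_closure1:
    // its bytes exist only in the "driver" JVM's classloader.
    static class Closure1 implements Serializable {
        private static final long serialVersionUID = 1L;
        int addend = 1;
    }

    // Serialize the closure (driver side), then deserialize it through a
    // classloader with no access to the application classes (executor side).
    static String roundTrip() throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new Closure1());
        }
        // Empty classpath, bootstrap parent only: it can load java.* classes
        // but not CnfeDemo$Closure1 -- much like an executor JVM that never
        // saw the classes GroovyShell compiled in memory.
        ClassLoader executorLoader = new URLClassLoader(new URL[0], null);
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray())) {
            @Override
            protected Class<?> resolveClass(ObjectStreamClass desc)
                    throws ClassNotFoundException {
                return Class.forName(desc.getName(), false, executorLoader);
            }
        }) {
            ois.readObject();
            return "deserialized fine";
        } catch (ClassNotFoundException e) {
            return "ClassNotFoundException: " + e.getMessage();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip());
    }
}
```

Which presumably also explains why adding jars to the executors doesn't help in my case: GroovyShell keeps the compiled classes in memory rather than writing them anywhere an executor could fetch them from.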
Sure, you may question the use of Groovy when Scala and Python are nicely supported. For me it is more a way to support the wider Java community... and after all, Scala, Groovy and Java all run on the JVM! Beyond that, I would be interested to hear from the Spark community whether there is any plan for closer integration with Java, especially with a Java REPL landing in Java 9.

Cheers
Guillaume

On 18 November 2015 at 21:51, Steve Loughran <ste...@hortonworks.com> wrote:
>
> Looks like groovy scripts don't serialize over the wire properly.
>
> Back in 2011 I hooked groovy up to MapReduce, so that you could write mappers
> and reducers there: "grumpy"
>
> https://github.com/steveloughran/grumpy
>
> slides: http://www.slideshare.net/steve_l/hadoop-gets-groovy
>
> What I ended up doing (see slide 13) was send the raw script around as
> text and compile it into a Script instance at the far end. Compilation
> took some time, but the real barrier is that groovy is not at all fast.
>
> It used to be 10x slower; maybe now, with static compilation and the java7
> invoke-dynamic JARs, things are better. I'm still unsure I'd use it in
> production, and, given Spark's focus on Scala and Python, I'd pick one of
> those two.
>
> On 18 Nov 2015, at 20:35, tog <guillaume.all...@gmail.com> wrote:
>
> Hi
>
> I started playing with both Apache projects and quickly hit the exception below.
> Can anyone give me a hint about the problem so that I can dig further?
> It seems Spark fails to load some of the Groovy classes...
>
> Any idea?
> Thanks
> Guillaume
>
> tog GroovySpark $ $GROOVY_HOME/bin/groovy GroovySparkThroughGroovyShell.groovy
>
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 1
> in stage 0.0 failed 1 times, most recent failure: Lost task 1.0 in stage
> 0.0 (TID 1, localhost): java.lang.ClassNotFoundException: Script1$_run_closure1
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>   at java.lang.Class.forName0(Native Method)
>   at java.lang.Class.forName(Class.java:348)
>   at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
>   at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
>   at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
>   at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
>   at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
>   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
>   at org.apache.spark.scheduler.Task.run(Task.scala:88)
>   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
>   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>   at java.lang.Thread.run(Thread.java:745)
>
> Driver stacktrace:
>   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283)
>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271)
>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270)
>   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270)
>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
>   at scala.Option.foreach(Option.scala:236)
>   at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
>   at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496)
>   at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458)
>   at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447)
>   at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
>   at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:1824)
>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:1837)
>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:1850)
>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:1921)
>   at org.apache.spark.rdd.RDD.count(RDD.scala:1125)
>   at org.apache.spark.api.java.JavaRDDLike$class.count(JavaRDDLike.scala:445)
>   at org.apache.spark.api.java.AbstractJavaRDDLike.count(JavaRDDLike.scala:47)
>   at org.apache.spark.api.java.JavaRDDLike$count.call(Unknown Source)
>   at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
>   at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
>   at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:117)
>   at Script1.run(Script1.groovy:24)
>   at groovy.lang.GroovyShell.evaluate(GroovyShell.java:585)
>   at groovy.lang.GroovyShell.evaluate(GroovyShell.java:623)
>   at groovy.lang.GroovyShell.evaluate(GroovyShell.java:594)
>   at groovy.lang.GroovyShell$evaluate.call(Unknown Source)
>   at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
>   at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
>   at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
>   at GroovySparkThroughGroovyShell.run(GroovySparkThroughGroovyShell.groovy:47)
>   at groovy.lang.GroovyShell.runScriptOrMainOrTestOrRunnable(GroovyShell.java:263)
>   at groovy.lang.GroovyShell.run(GroovyShell.java:518)
>   at groovy.lang.GroovyShell.run(GroovyShell.java:507)
>   at groovy.ui.GroovyMain.processOnce(GroovyMain.java:594)
>   at groovy.ui.GroovyMain.run(GroovyMain.java:326)
>   at groovy.ui.GroovyMain.process(GroovyMain.java:312)
>   at groovy.ui.GroovyMain.processArgs(GroovyMain.java:129)
>   at groovy.ui.GroovyMain.main(GroovyMain.java:109)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:497)
>   at org.codehaus.groovy.tools.GroovyStarter.rootLoader(GroovyStarter.java:107)
>   at org.codehaus.groovy.tools.GroovyStarter.main(GroovyStarter.java:129)
> Caused by: java.lang.ClassNotFoundException: Script1$_run_closure1
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>   at java.lang.Class.forName0(Native Method)
>   at java.lang.Class.forName(Class.java:348)
>   at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
>   at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
>   at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
>   at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
>   at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
>   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
>   at org.apache.spark.scheduler.Task.run(Task.scala:88)
>   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
>   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>   at java.lang.Thread.run(Thread.java:745)
>
> org.apache.spark.SparkConf@2ca5f1ed
>
> [the identical SparkException and stack trace are printed a second time]
>
> --
> PGP KeyID: 2048R/EA31CFC9 subkeys.pgp.net


--
PGP KeyID: 2048R/EA31CFC9 subkeys.pgp.net
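PS: Steve's slide-13 approach, shipping the script source as a plain String and compiling it at the far end, sidesteps the classloading problem entirely, since a String always deserializes. A rough JDK-only sketch of the pattern (javac standing in for Groovy's compiler; all names here are illustrative, and the real thing would use GroovyShell or GroovyClassLoader on the receiving side):

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class ShipSource {
    // "Far end": receive source text, compile it locally, load and run it.
    static int compileAndRun(String className, String source) throws Exception {
        Path dir = Files.createTempDirectory("shipped");
        Path src = dir.resolve(className + ".java");
        Files.write(src, source.getBytes("UTF-8"));

        // Compile the shipped source into the temp directory (requires a JDK).
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        int rc = javac.run(null, null, null, src.toString(), "-d", dir.toString());
        if (rc != 0) throw new IllegalStateException("compilation failed");

        // Load the freshly compiled class and invoke its entry point.
        try (URLClassLoader cl =
                 new URLClassLoader(new URL[]{dir.toUri().toURL()})) {
            Class<?> c = cl.loadClass(className);
            return (Integer) c.getMethod("run").invoke(null);
        }
    }

    public static void main(String[] args) throws Exception {
        // Only this String crosses the wire -- no class bytes are needed remotely.
        String script =
            "public class RemoteScript {" +
            "  public static int run() { return 6 * 7; }" +
            "}";
        System.out.println(compileAndRun("RemoteScript", script));
    }
}
```

The cost is paying for compilation on every receiving JVM, which matches the compile-time overhead Steve mentions for grumpy.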