The issue and a workaround can be found here:
https://github.com/apache/spark/pull/181
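
For reference, the workaround discussed below boils down to deserializing with
the thread context classloader (which sees jars shipped via addJar) instead of
the root classloader that a plain ObjectInputStream uses. A minimal, untested
sketch using commons-io's ClassLoaderObjectInputStream; the roundTrip helper
and its type parameter are illustrative only, not from the thread:

import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectOutputStream}
import org.apache.commons.io.input.ClassLoaderObjectInputStream

// Serialize a value, then read it back while resolving classes against
// the thread context classloader rather than the root classloader.
def roundTrip[T](value: T): T = {
  val bytes = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(bytes)
  out.writeObject(value)
  out.close()

  val in = new ClassLoaderObjectInputStream(
    Thread.currentThread().getContextClassLoader,
    new ByteArrayInputStream(bytes.toByteArray))
  try in.readObject().asInstanceOf[T]
  finally in.close()
}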


On Wed, Mar 26, 2014 at 10:12 PM, Aniket Mokashi <aniket...@gmail.com> wrote:

> context.objectFile[ReIdDataSetEntry]("data") - I'm not sure how this is
> compiled in Scala. But if it uses some sort of ObjectInputStream, you need
> to be careful: ObjectInputStream uses the root classloader to load classes,
> and it does not work with jars that are added to the thread context
> classloader (TCCL). Apache Commons has ClassLoaderObjectInputStream to work
> around this.
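>
> For illustration only, here is a rough, untested sketch of the same idea
> done by hand (the wrapper class name is made up; the fallback mirrors what
> the commons-io class does for primitives and system classes):
>
> import java.io.{InputStream, ObjectInputStream, ObjectStreamClass}
>
> // Resolve classes against an explicit loader, falling back to the
> // default behaviour when the loader does not know the class.
> class LoaderAwareObjectInputStream(loader: ClassLoader, in: InputStream)
>     extends ObjectInputStream(in) {
>   override def resolveClass(desc: ObjectStreamClass): Class[_] =
>     try Class.forName(desc.getName, false, loader)
>     catch { case _: ClassNotFoundException => super.resolveClass(desc) }
> }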
>
>
> On Wed, Mar 26, 2014 at 1:38 PM, Jaonary Rabarisoa <jaon...@gmail.com> wrote:
>
>> It seems to be an old problem:
>>
>>
>> http://mail-archives.apache.org/mod_mbox/spark-user/201311.mbox/%3c7f6aa9e820f55d4a96946a87e086ef4a4bcdf...@eagh-erfpmbx41.erf.thomson.com%3E
>>
>> https://groups.google.com/forum/#!topic/spark-users/Q66UOeA2u-I
>>
>> Has anyone found the solution?
>>
>>
>> On Wed, Mar 26, 2014 at 5:50 PM, Yana Kadiyska <yana.kadiy...@gmail.com> wrote:
>>
>>> I might be way off here, but are you looking at the logs on the worker
>>> machines? I am running an older version (0.8), and when I look at the
>>> error log for the executor process I see the exact location where the
>>> executor tries to load the jar from, with a line like this:
>>>
>>> 14/03/26 13:57:11 INFO executor.Executor: Adding file:/dirs/dirs/spark/work/app-20140326135710-0029/0/./spark-test.jar to class loader
>>>
>>> You said "The jar file is present in each node". Do you see any
>>> information on the executor indicating that it's trying to load the
>>> jar, or where it's loading it from? I can't tell for sure by looking at
>>> your logs, but they seem to come from the master and driver, not from
>>> the executor itself.
>>>
>>> On Wed, Mar 26, 2014 at 11:46 AM, Ognen Duzlevski
>>> <og...@plainvanillagames.com> wrote:
>>> > Have you looked through the logs fully? I have seen this (in my limited
>>> > experience) pop up as a result of previous exceptions/errors, and also as
>>> > a result of being unable to serialize objects, etc.
>>> > Ognen
>>> >
>>> >
>>> > On 3/26/14, 10:39 AM, Jaonary Rabarisoa wrote:
>>> >
>>> > I notice that I get this error when I'm trying to load an objectFile with
>>> > val viperReloaded = context.objectFile[ReIdDataSetEntry]("data")
>>> >
>>> >
>>> > On Wed, Mar 26, 2014 at 3:58 PM, Jaonary Rabarisoa <jaon...@gmail.com>
>>> > wrote:
>>> >>
>>> >> Here is the output that I get:
>>> >>
>>> >> [error] (run-main-0) org.apache.spark.SparkException: Job aborted: Task 1.0:1 failed 4 times (most recent failure: Exception failure in TID 6 on host 172.166.86.36: java.lang.ClassNotFoundException: value.models.ReIdDataSetEntry)
>>> >> org.apache.spark.SparkException: Job aborted: Task 1.0:1 failed 4 times (most recent failure: Exception failure in TID 6 on host 172.166.86.36: java.lang.ClassNotFoundException: value.models.ReIdDataSetEntry)
>>> >>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1011)
>>> >>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1009)
>>> >>   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>>> >>   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>>> >>   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1009)
>>> >>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:596)
>>> >>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:596)
>>> >>   at scala.Option.foreach(Option.scala:236)
>>> >>   at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:596)
>>> >>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:146)
>>> >>   at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
>>> >>   at akka.actor.ActorCell.invoke(ActorCell.scala:456)
>>> >>   at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
>>> >>   at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>>> >>   at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
>>> >>   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>>> >>   at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>>> >>   at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>>> >>   at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>>> >>
>>> >> Spark says that the jar is added:
>>> >>
>>> >> 14/03/26 15:49:18 INFO SparkContext: Added JAR target/scala-2.10/value-spark_2.10-1.0.jar
>>> >>
>>> >>
>>> >>
>>> >>
>>> >>
>>> >> On Wed, Mar 26, 2014 at 3:34 PM, Ognen Duzlevski
>>> >> <og...@plainvanillagames.com> wrote:
>>> >>>
>>> >>> Have you looked at the individual nodes' logs? Can you post a bit more
>>> >>> of the exception's output?
>>> >>>
>>> >>>
>>> >>> On 3/26/14, 8:42 AM, Jaonary Rabarisoa wrote:
>>> >>>>
>>> >>>> Hi all,
>>> >>>>
>>> >>>> I got java.lang.ClassNotFoundException even with "addJar" called. The
>>> >>>> jar file is present in each node.
>>> >>>>
>>> >>>> I use the version of Spark from GitHub master.
>>> >>>>
>>> >>>> Any ideas?
>>> >>>>
>>> >>>>
>>> >>>> Jaonary
>>> >
>>> >
>>>
>>
>>
>
>
> --
> "...:::Aniket:::... Quetzalco@tl"
>
