Re: Spark can't find jars

2014-10-27 Thread twinkle sachdeva
Hi, Try running the following in the Spark folder: bin/run-example SparkPi 10. If this runs fine, just look at the set of arguments being passed via this script, and try yours in a similar way. Thanks, On Thu, Oct 16, 2014 at 2:59 PM, Christophe Préaud <christophe.pre...@kelkoo.com> wrote: > Hi, > > I ha
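(For reference, on Spark 1.1 the run-example script is essentially a wrapper around spark-submit with the bundled examples assembly jar; a rough sketch of the equivalent call, assuming a binary distribution where that jar lives under lib/, would be:

  ./bin/spark-submit --master yarn-cluster \
    --class org.apache.spark.examples.SparkPi \
    lib/spark-examples-*.jar 10

The exact jar name and location depend on how Spark was built.)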

Re: Spark can't find jars

2014-10-16 Thread Christophe Préaud
Hi, I have created a JIRA (SPARK-3967), can you please confirm that you are hitting the same issue? Thanks, Christophe. On 15/10/2014 09:49, Christophe Préaud wrote: Hi Jimmy, Did you try my patch? The problem on my side was that the hadoop.tmp.

Re: Spark can't find jars

2014-10-15 Thread Christophe Préaud
Hi Jimmy, Did you try my patch? The problem on my side was that hadoop.tmp.dir (in the Hadoop core-site.xml) was not handled properly by Spark when it is set to multiple partitions/disks, i.e.: hadoop.tmp.dir = file:/d1/yarn/local,file:/d2/yarn/local,file:/d3/yarn/local,file:/d4/yarn/local,
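For clarity, that setting corresponds to a core-site.xml property along these lines (the directory paths are just the ones quoted above):

  <property>
    <name>hadoop.tmp.dir</name>
    <value>file:/d1/yarn/local,file:/d2/yarn/local,file:/d3/yarn/local,file:/d4/yarn/local</value>
  </property>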

Re: Spark can't find jars

2014-10-14 Thread Jimmy McErlain
So the only way that I could make this work was to build a fat jar file, as suggested earlier. To me (and I am no expert) it seems like this is a bug. Everything was working for me prior to our upgrade to Spark 1.1 on Hadoop 2.2, but now it seems to not... i.e. packaging my jars locally then pushing
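For anyone following along, a minimal sketch of the fat-jar approach with sbt-assembly (the plugin version and jar name below are illustrative, not taken from this thread):

  // project/assembly.sbt
  addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

  // build.sbt: mark Spark as provided so it is not bundled into the fat jar
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"

Then build and submit the single assembled jar:

  sbt assembly
  ./bin/spark-submit --class myEngine --master yarn-cluster target/scala-2.10/myapp-assembly-1.0.jar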

Re: Spark can't find jars

2014-10-14 Thread Christophe Préaud
Hello, I have already posted a message about the exact same problem, and proposed a patch (the subject is "Application failure in yarn-cluster mode"). Can you test it and see if it works for you? I would also be glad if someone could confirm that it is a bug in Spark 1.1.0. Regards, Christophe. On

Re: Spark can't find jars

2014-10-13 Thread HARIPRIYA AYYALASOMAYAJULA
Or, if it has something to do with the way you package your files, try an alternative packaging method and see if it works. On Monday, October 13, 2014, HARIPRIYA AYYALASOMAYAJULA <aharipriy...@gmail.com> wrote: > Well in the cluster, can you try copying the entire folder and then run? > For example m

Re: Spark can't find jars

2014-10-13 Thread HARIPRIYA AYYALASOMAYAJULA
Well, in the cluster, can you try copying the entire folder and then running? For example, my home folder, say helloWorld, consists of src, target, etc. Can you copy the entire folder to the cluster? I suspect it is looking for some dependencies and missing them when it runs your jar file. Or if you

Re: Spark can't find jars

2014-10-13 Thread Jimmy McErlain
BTW this has always worked for me until we upgraded the cluster to Spark 1.1.1... J

Re: Spark can't find jars

2014-10-13 Thread Jimmy McErlain
That didn't seem to work... the jar files are in the target/scala-2.10 folder when I package, then I move the jar to the cluster and launch the app... still the same error... Thoughts? J

Re: Spark can't find jars

2014-10-13 Thread Sean McNamara
I've run into this as well. I haven't had a chance to troubleshoot what exactly was going on, but I got around it by building my app as a single uberjar. Sean On Oct 13, 2014, at 6:40 PM, HARIPRIYA AYYALASOMAYAJULA <aharipriy...@gmail.com> wrote: Hello, Can you check if the jar file

Re: Spark can't find jars

2014-10-13 Thread HARIPRIYA AYYALASOMAYAJULA
Hello, Can you check if the jar file is available in the target/scala-2.10 folder? When you use sbt package to make the jar file, that is where the jar file would be located. The following command works well for me: spark-submit --class "Classname" --master yarn-cluster jarfile (with complete
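Spelled out with a concrete (hypothetical) class name and jar path, the command would look something like:

  spark-submit --class "com.example.HelloWorld" --master yarn-cluster /home/user/helloWorld/target/scala-2.10/helloworld_2.10-1.0.jar

where the jar is the one produced by sbt package and its path is given in full.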

Re: Spark can't find jars

2014-10-13 Thread Jimmy
Having the exact same error with the exact same jar... Do you work for Altiscale? :) J > On Oct 13, 2014, at 5:33 PM, Andy Srine wrote: > > Hi Guys, > > Spark rookie here. I am getting a file not found exception on the --jars. > This is on the yarn cluster mode and I am

Spark can't find jars

2014-10-13 Thread Andy Srine
Hi Guys, Spark rookie here. I am getting a file not found exception on the --jars. This is in yarn-cluster mode, and I am running the following command on our recently upgraded Spark 1.1.1 environment. ./bin/spark-submit --verbose --master yarn --deploy-mode cluster --class myEngine --driver
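(The original command is cut off above. As a generic illustration of passing extra jars in yarn-cluster mode, not the poster's actual values, such a submission usually takes this shape:

  ./bin/spark-submit --verbose --master yarn --deploy-mode cluster \
    --class myEngine \
    --jars /local/path/dep1.jar,/local/path/dep2.jar \
    /local/path/myEngine-app.jar <app args>

--jars takes a comma-separated list of local jar paths, which spark-submit is supposed to upload and add to the application classpath on the cluster.)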