When I build Spark with

  SPARK_HADOOP_VERSION=2.3.0 sbt/sbt assembly

and copy the generated jar to the lib/ directory of my application, it seems
that sbt cannot find the dependencies in the jar. Everything works with the
pre-built jar files downloaded from the link provided by Patrick, though.
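For what it's worth, my understanding is that sbt picks up any jar placed
under lib/ automatically as an unmanaged dependency, so the layout I am
assuming is something like this (the jar name is just what I'd expect the
assembly build above to produce for Spark 0.9.1; the app file name is made
up):

  my-app/
    build.sbt
    lib/
      spark-assembly-0.9.1-hadoop2.3.0.jar   <- unmanaged jar, goes on the classpath
    src/
      main/scala/MyApp.scala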
Best,
--
Nan Zhu
Hi,
I have written code that works just fine in the Spark shell on EC2.
The ec2 script helped me configure my master and worker nodes. Now I want to
run the Scala Spark code outside the interactive shell. How do I go about
doing that?
I was referring to the instructions mentioned here:
htt
Hi AJ,
You might find this helpful -
http://blog.cloudera.com/blog/2014/04/how-to-run-a-simple-apache-spark-app-in-cdh-5/
-Sandy
On Sat, May 3, 2014 at 8:42 AM, Ajay Nair wrote:
> Hi,
>
> I have written code that works just fine in the Spark shell on EC2.
> The ec2 script helped me configure my master and worker nodes.
Hey AJ,
I created a little sample app using Spark's quick start.
Have a look here.
Assuming you're using Scala, sbt is a good way to run your application in
standalone mode.
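In standalone mode the app is just an object with a main method that creates
its own SparkContext, along the lines of the quick start; a minimal sketch
(the file path and master URL here are assumptions, adjust them to your
setup):

  import org.apache.spark.SparkContext
  import org.apache.spark.SparkContext._

  object SimpleApp {
    def main(args: Array[String]) {
      // Any text file readable from the workers will do; this path is an assumption.
      val logFile = "/root/spark/README.md"
      // "local" runs everything in-process; on a cluster, use spark://<master>:7077.
      val sc = new SparkContext("local", "Simple App")
      val logData = sc.textFile(logFile, 2).cache()
      val numAs = logData.filter(line => line.contains("a")).count()
      val numBs = logData.filter(line => line.contains("b")).count()
      println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
    }
  }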
The configuration file, simple.sbt in my repo, holds all the dependencies
needed to build your app.
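A minimal simple.sbt for this kind of app would look something like this
(the Spark and Scala versions below are assumptions, so match them to your
cluster; older sbt needs the blank lines between settings):

  name := "Simple Project"

  version := "1.0"

  scalaVersion := "2.10.4"

  libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"

  resolvers += "Akka Repository" at "http://repo.akka.io/releases/"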
Hope this helps.
Thank you for the reply. Have you posted a link where I can follow the steps?
Sorry, the link was wrong. I meant here:
https://github.com/ngarneau/spark-standalone
On 2014-05-03, at 13:23, Nicolas Garneau wrote:
> Hey AJ,
>
> I created a little sample app using Spark's quick start.
> Have a look here.
> Assuming you're using Scala, sbt is a good way to run your application in
> standalone mode.
Thank you. Let me try this quickly!
Quick question: where should I place your folder? Inside the Spark directory?
My Spark directory is /root/spark.
So currently I have pulled your GitHub code into /root/spark/spark-examples
and modified my home Spark directory in the Scala code.
I copied the sbt folder into the spark-examples folder.
Hey AJ,
As far as I can see, the path you are running sbt from is:
> $root/spark/spark-examples: sbt/sbt package
You should instead be inside the app's folder, the one that contains
simple.sbt, which is spark-standalone/:
> $root/spark/spark-examples/spark-standalone: sbt/sbt package
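So the whole sequence would look something like this (assuming the sbt
launcher still lives under /root/spark/sbt, as in your setup; sbt's "run"
task then launches the app's main class):

  $ cd /root/spark/spark-examples/spark-standalone
  $ /root/spark/sbt/sbt package
  $ /root/spark/sbt/sbt run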
Hi Nicolas,
Good catches on these things.
> Your website seems a little bit incomplete. I have found this page [1] which
> lists the two main mailing lists, users and dev. But I see a reference to a
> mailing list about "issues", which tracked Spark's issues when it was hosted
> at Atlassian.