I'm new and don't know much either, but this is what helped me:

a) Check that the compiled assembly jar is in
/spark-0.9.0-incubating/assembly/target/scala-2.10/
(the directory is named for the Scala binary version, 2.10, even if
you installed 2.10.1; note that your error message also points at
scala-2.10/)
b) Build with sbt/sbt assembly; plain "sbt package" does not produce
the assembly jar that spark-shell looks for
c) spark-shell only runs from the root of the spark-0.9.0-incubating
directory, and the script itself is bin/spark-shell (not sbin)
d) Check build.sbt for errors (see the command sketch below)
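
Put together, here is roughly the sequence that worked for me. This
is just a sketch; I'm assuming Spark was unpacked at
~/spark-0.9.0-incubating, so adjust the path for your machine:

    # run everything from the root of the Spark source tree
    cd ~/spark-0.9.0-incubating

    # build the assembly jar (this is what spark-shell looks for)
    sbt/sbt assembly

    # confirm the jar landed where the shell expects it
    ls assembly/target/scala-2.10/

    # launch the shell from this same directory
    ./bin/spark-shell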

HTH,
Shivani


On Mon, Mar 17, 2014 at 2:10 PM, Yexi Jiang <yexiji...@gmail.com> wrote:

> Hi,
>
> I am a beginner with Spark.
> Currently I am trying to install Spark on my laptop.
>
> I followed the tutorial at
> http://spark.apache.org/screencasts/1-first-steps-with-spark.html (The
> only difference is that I installed scala-2.10.1 instead of 2.9.2).
>
> I packaged Spark successfully with "sbt package" and configured
> spark-env.sh according to the tutorial.
>
> Now when I execute spark-shell, I got the following error:
>
> Failed to find Spark assembly in
> "PATH/spark-0.9.0-incubating/assembly/target/scala-2.10/"
> You need to build Spark with 'sbt/sbt assembly' before running this
> program.
>
> Could anyone tell me what the problem is?
>
> Thank you very much!
>
> Regards,
> Yexi
>


-- 
Software Engineer
Analytics Engineering Team @ Box
Mountain View, CA
