Yana: Thanks.  Can you give me a transcript of the actual commands you are
running?

Thanks!
Diana


On Mon, Mar 24, 2014 at 3:59 PM, Yana Kadiyska <yana.kadiy...@gmail.com> wrote:

> I am able to run standalone apps. I think you are making one mistake
> that throws you off from there onwards. You don't need to put your app
> under SPARK_HOME. I would create it in its own folder somewhere; it
> follows the rules of any standalone Scala program (including the
> layout). In the guide, $SPARK_HOME is only relevant for finding the
> README file which they are parsing/word-counting. Otherwise, the
> compile-time dependencies on Spark are resolved via the sbt file (or
> the pom file if you look at the Java example).
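>
> For concreteness, the whole sbt file is just a handful of settings,
> something like this (take the exact Spark and Scala versions from the
> guide you're on; the ones here are only illustrative, and the blank
> lines between settings matter to older sbt versions):
>
>     name := "Simple Project"
>
>     version := "1.0"
>
>     scalaVersion := "2.10.3"
>
>     libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"
>
>     resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
>
> The libraryDependencies line is what resolves Spark at compile time,
> which is why the project doesn't need to live anywhere near SPARK_HOME.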
>
> So, for example, I put my app under C:\Source\spark-code and the jar
> gets created in C:\Source\spark-code\target\scala-2.9.3 (or scala-2.10
> if you're running with Scala 2.10, as the example shows). For that
> part of the guide, it's no different from building any Scala app.
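>
> To be concrete about the commands (folder names are my own choices,
> and this assumes a standalone sbt install on the PATH rather than the
> sbt/sbt script that ships with Spark): with the sources laid out as
>
>     C:\Source\spark-code\
>         simple.sbt
>         src\main\scala\SimpleApp.scala
>
> I just run
>
>     cd C:\Source\spark-code
>     sbt package
>
> and the jar lands under target\scala-2.10\ (or target\scala-2.9.3\) as
> described above.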
>
> On Mon, Mar 24, 2014 at 3:44 PM, Diana Carroll <dcarr...@cloudera.com>
> wrote:
> > Has anyone successfully followed the instructions on the Quick Start
> > page of the Spark home page to run a "standalone" Scala application?
> > I can't, and I figure I must be missing something obvious!
> >
> > I'm trying to follow the instructions here as close to "word for word"
> > as possible:
> > http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala
> >
> > 1.  The instructions don't say what directory to create my test
> > application in, but later I'm instructed to run "sbt/sbt", so I
> > conclude that my working directory must be $SPARK_HOME.  (Temporarily
> > ignoring that it is a little weird to be working directly in the Spark
> > distro.)
> >
> > 2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.
> > Copy & paste in the code from the instructions exactly (roughly
> > reproduced below, after step 4), replacing YOUR_SPARK_HOME with my
> > Spark home path.
> >
> > 3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copy & paste in the
> > sbt file from the instructions.
> >
> > 4.  From $SPARK_HOME I run "sbt/sbt package".  It runs through the
> > ENTIRE Spark project!  This takes several minutes, and at the end, it
> > says "Done packaging".  Unfortunately, there's nothing in the
> > $SPARK_HOME/mysparktest/ folder other than what I already had there.
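> >
> > For reference, the code from that page looks roughly like this (typed
> > from memory, so the link above is authoritative; the jar name comes
> > from the name and version in simple.sbt):
> >
> >     /*** SimpleApp.scala ***/
> >     import org.apache.spark.SparkContext
> >     import org.apache.spark.SparkContext._
> >
> >     object SimpleApp {
> >       def main(args: Array[String]) {
> >         // Should be some text file on your system
> >         val logFile = "YOUR_SPARK_HOME/README.md"
> >         val sc = new SparkContext("local", "Simple App", "YOUR_SPARK_HOME",
> >           List("target/scala-2.10/simple-project_2.10-1.0.jar"))
> >         val logData = sc.textFile(logFile, 2).cache()
> >         val numAs = logData.filter(line => line.contains("a")).count()
> >         val numBs = logData.filter(line => line.contains("b")).count()
> >         println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
> >       }
> >     }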
> >
> > (Just for fun, I also did what I thought was more logical, which is to
> > set my working directory to $SPARK_HOME/mysparktest and run
> > $SPARK_HOME/sbt/sbt package, but that was even less successful; I got
> > an error:
> >
> > awk: cmd. line:1: fatal: cannot open file `./project/build.properties'
> > for reading (No such file or directory)
> > Attempting to fetch sbt
> > /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
> > directory
> > /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
> > directory
> > Our attempt to download sbt locally to sbt/sbt-launch-.jar failed.
> > Please install sbt manually from http://www.scala-sbt.org/
> >
> >
> > So, help?  I'm sure these instructions work because people are following
> > them every day, but I can't tell what they are supposed to do.
> >
> > Thanks!
> > Diana
> >
>
