Hi Xiangrui,

Thank you for the response. I tried both `sbt package` and `sbt compile`; both commands report success:
sbt compile
[info] Set current project to machine-learning (in build file:/opt/mapr/spark/spark-1.2.1/SparkTraining/machine-learning/)
[info] Updating {file:/opt/mapr/spark/spark-1.2.1/SparkTraining/machine-learning/}machine-learning...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[success] Total time: 1 s, completed Apr 6, 2015 5:14:43 AM

sbt package
[info] Set current project to machine-learning (in build file:/opt/mapr/spark/spark-1.2.1/SparkTraining/machine-learning/)
[success] Total time: 1 s, completed Apr 6, 2015 5:15:04 AM

How do I proceed from here?

Regards
Phani Kumar

-----Original Message-----
From: Xiangrui Meng [mailto:men...@gmail.com]
Sent: Monday, April 06, 2015 9:50 AM
To: Phani Yadavilli -X (pyadavil)
Cc: user@spark.apache.org
Subject: Re: Need help with ALS Recommendation code

Could you try `sbt package` or `sbt compile` and see whether there are errors? It seems that you haven't reached the ALS code yet.

-Xiangrui

On Sat, Apr 4, 2015 at 5:06 AM, Phani Yadavilli -X (pyadavil) <pyada...@cisco.com> wrote:
> Hi,
>
> I am trying to run the following command in the Movie Recommendation
> example provided by the ampcamp tutorial:
>
> Command: sbt package "run /movielens/medium"
>
> Exception: sbt.TrapExitSecurityException thrown from the
> UncaughtExceptionHandler in thread "run-main-0"
>
> java.lang.RuntimeException: Nonzero exit code: 1
>         at scala.sys.package$.error(package.scala:27)
> [trace] Stack trace suppressed: run last compile:run for the full output.
> [error] (compile:run) Nonzero exit code: 1
> [error] Total time: 0 s, completed Apr 4, 2015 12:00:18 PM
>
> I am unable to identify the cause of the error. Can someone help me with this?
>
> Regards
> Phani Kumar

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
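[Editorial note: the quoted sbt trace itself names the next step. The line "[trace] Stack trace suppressed: run last compile:run for the full output" means sbt's `last` command can replay the full error log of the failed `run` task. A minimal sketch of how one might proceed, assuming sbt is on the PATH; the project path is taken from the sbt output in this thread:]

```shell
# Project directory from the sbt output above (assumption: unchanged).
PROJECT_DIR="/opt/mapr/spark/spark-1.2.1/SparkTraining/machine-learning"

# Print the command to execute on the cluster: cd into the project and
# ask sbt to replay the full log of the last (failed) compile:run task,
# which includes the stack trace that was suppressed in the quoted error.
echo "cd $PROJECT_DIR && sbt \"last compile:run\""
```

The full stack trace printed by `last compile:run` should show which line of the example program actually failed (e.g. a missing input path such as /movielens/medium), rather than just the nonzero exit code.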