Abraham is correct. spark-shell is an interactive REPL for typing Spark code into; you can't pass it a .scala file as an argument. spark-submit runs a packaged Spark application, and it won't take a .scala file either. You need to compile your application into a .jar and give that to spark-submit. The same applies to Spark Streaming programs.
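For example, a program structured roughly like this (a minimal sketch; the object name Try1, the socket source, and the 10-second batch interval are placeholders, not taken from your program) gives spark-submit a main class to load:

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object Try1 {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("Try1")
      // Batch interval of 10 seconds; adjust to whatever your job needs.
      val ssc = new StreamingContext(conf, Seconds(10))

      // Example input: a text stream from a socket. Swap in your real source.
      val lines = ssc.socketTextStream("localhost", 9999)
      lines.count().print()

      ssc.start()
      ssc.awaitTermination()
    }
  }

Package it into a jar (e.g. with "sbt package") and then point spark-submit at the jar instead of the .scala file, something along these lines (the jar path and Scala version below are just examples):

  spark-submit --class Try1 --master local[2] target/scala-2.10/try1_2.10-1.0.jar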
On Wed, Oct 8, 2014 at 12:09 AM, spr <s...@yarcdata.com> wrote:
> || Try using spark-submit instead of spark-shell
>
> Two questions:
> - What does spark-submit do differently from spark-shell that makes you
> think that may be the cause of my difficulty?
>
> - When I try spark-submit it complains about "Error: Cannot load main class
> from JAR: file:/Users/spr/.../try1.scala". My program is not structured as
> a main class. Does it have to be to run with Spark Streaming? Or with
> spark-submit?
>
> Thanks much.