Hello,

Thanks very much. I was able to start the service.

When I run my program, the launcher is unable to find the app class:

java.lang.ClassNotFoundException: SparkSubmitter
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:274)
        at org.apache.spark.util.Utils$.classForName(Utils.scala:173)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:639)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Spark job complete. Exit code: 101

My launch code is as follows:
val spark = new SparkLauncher()
  .setSparkHome("C:\\spark-1.5.1-bin-hadoop2.6")
  .setAppResource("C:\\SparkService\\Scala\\RequestSubmitter\\target\\scala-2.10\\spark-submitter_2.10-0.0.1.jar")
  .setMainClass("SparkSubmitter")
  .addAppArgs(inputQuery)
  .setMaster("spark://157.54.189.70:7077")
  .launch()
spark.waitFor()

I also added spark-submitter_2.10-0.0.1.jar to the classpath,
but that didn't help.
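One common cause of this ClassNotFoundException is that SparkLauncher.setMainClass() needs the fully qualified class name (package included) exactly as it appears inside the jar passed to setAppResource(). A minimal sketch of the lookup SparkSubmit performs internally, using a hypothetical package com.example (substitute the real package of SparkSubmitter, if any):

```scala
// Sketch: SparkSubmit resolves the main class with Class.forName, which
// needs the fully qualified name. The package "com.example" below is
// hypothetical -- replace it with SparkSubmitter's real package.
package com.example {
  object SparkSubmitter {
    def main(args: Array[String]): Unit = println("submitted")
  }
}

object Demo {
  // Returns true if the class can be loaded by name, false otherwise,
  // mirroring the lookup that fails in the stack trace above.
  def lookup(name: String): Boolean =
    try { Class.forName(name); true }
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit = {
    println(lookup("SparkSubmitter"))             // bare name: not found
    println(lookup("com.example.SparkSubmitter")) // fully qualified: found
  }
}
```

If SparkSubmitter really is in the default package, the bare name is correct, and the next thing to check is whether the jar given to setAppResource() actually contains SparkSubmitter.class (e.g. with jar tf on the jar path).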

Thanks & regards
Arko

On Fri, Feb 19, 2016 at 6:49 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> Cycling old bits:
>
> http://search-hadoop.com/m/q3RTtHrxMj2abwOk2
>
> On Fri, Feb 19, 2016 at 6:40 PM, Arko Provo Mukherjee
> <arkoprovomukher...@gmail.com> wrote:
>>
>> Hi,
>>
>> Thanks for your response. Is there a similar link for Windows? I am
>> not sure the .sh scripts would run on Windows.
>>
>> By default, start-all.sh doesn't work, and I don't see anything at
>> localhost:8080
>>
>> I will do some more investigation and come back.
>>
>> Thanks again for all your help!
>>
>> Thanks & regards
>> Arko
>>
>>
>> On Fri, Feb 19, 2016 at 6:35 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>> > Please see https://spark.apache.org/docs/latest/spark-standalone.html
>> >
>> > On Fri, Feb 19, 2016 at 6:27 PM, Arko Provo Mukherjee
>> > <arkoprovomukher...@gmail.com> wrote:
>> >>
>> >> Hi,
>> >>
>> >> Thanks for your response, that really helped.
>> >>
>> >> However, I don't believe the job is being submitted. When I run Spark
>> >> from the shell, I don't need to start it up explicitly. Do I need to
>> >> start up Spark on my machine before running this program?
>> >>
>> >> I see the following in the SPARK_HOME\bin directory:
>> >> Name
>> >> ----
>> >> beeline.cmd
>> >> load-spark-env.cmd
>> >> pyspark.cmd
>> >> pyspark2.cmd
>> >> run-example.cmd
>> >> run-example2.cmd
>> >> spark-class.cmd
>> >> spark-class2.cmd
>> >> spark-shell.cmd
>> >> spark-shell2.cmd
>> >> spark-submit.cmd
>> >> spark-submit2.cmd
>> >> sparkR.cmd
>> >> sparkR2.cmd
>> >>
>> >> Do I need to run any one of them before submitting the job via the
>> >> program?
>> >>
>> >> Thanks & regards
>> >> Arko
>> >>
>> >> On Fri, Feb 19, 2016 at 6:01 PM, Holden Karau <hol...@pigscanfly.ca>
>> >> wrote:
>> >> > How are you trying to launch your application? Do you have the Spark
>> >> > jars on
>> >> > your class path?
>> >> >
>> >> >
>> >> > On Friday, February 19, 2016, Arko Provo Mukherjee
>> >> > <arkoprovomukher...@gmail.com> wrote:
>> >> >>
>> >> >> Hello,
>> >> >>
>> >> >> I am trying to submit a spark job via a program.
>> >> >>
>> >> >> When I run it, I receive the following error:
>> >> >> Exception in thread "Thread-1" java.lang.NoClassDefFoundError:
>> >> >> org/apache/spark/launcher/SparkLauncher
>> >> >>         at Spark.SparkConnector.run(MySpark.scala:33)
>> >> >>         at java.lang.Thread.run(Thread.java:745)
>> >> >> Caused by: java.lang.ClassNotFoundException:
>> >> >> org.apache.spark.launcher.SparkLauncher
>> >> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>> >> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>> >> >>         at java.security.AccessController.doPrivileged(Native
>> >> >> Method)
>> >> >>         at
>> >> >> java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>> >> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>> >> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>> >> >>         ... 2 more
>> >> >>
>> >> >> It seems it cannot find the SparkLauncher class. Any clue to what I
>> >> >> am
>> >> >> doing wrong?
>> >> >>
>> >> >> Thanks & regards
>> >> >> Arko
>> >> >>
>> >> >>
>> >> >> ---------------------------------------------------------------------
>> >> >> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> >> >> For additional commands, e-mail: user-h...@spark.apache.org
>> >> >>
>> >> >
>> >> >
>> >> > --
>> >> > Cell : 425-233-8271
>> >> > Twitter: https://twitter.com/holdenkarau
>> >> >
>> >>
>> >>
>> >
>
>

