Hi,

I want to deploy my application on a standalone cluster, but spark-submit
behaves strangely. When I deploy the application in *"client"* deploy mode,
everything works well and my application can see the additional jar files.

Here is the command:
>   spark-submit --master spark://1.2.3.4:7077 --deploy-mode client \
>     --supervise --jars $(echo /myjars/*.jar | tr ' ' ',') \
>     --class com.algorithm /my/path/algorithm.jar
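
For reference, the $(echo /myjars/*.jar | tr ' ' ',') substitution just turns
the directory listing into the comma-separated list that --jars expects; with
hypothetical jar names it expands to something like:

>   $ echo /myjars/*.jar | tr ' ' ','
>   /myjars/algo-deps.jar,/myjars/commons-util.jar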

However, when I submit the same command in *"cluster"* deploy mode, the
driver cannot see the additional jars and I always get a
*java.lang.ClassNotFoundException*.

Here is the command:
>   spark-submit --master spark://1.2.3.4:7077 --deploy-mode cluster \
>     --supervise --jars $(echo /myjars/*.jar | tr ' ' ',') \
>     --class com.algorithm /my/path/algorithm.jar
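
My understanding (an assumption on my part, not something I have verified) is
that in cluster mode the driver is launched on one of the worker nodes rather
than on the machine I submit from, so the jar paths would have to be readable
from every worker. As a rough sketch, assuming the jars were also copied to a
hypothetical HDFS location hdfs://namenode:8020/myjars/, I would expect
something like this to work:

>   spark-submit --master spark://1.2.3.4:7077 --deploy-mode cluster \
>     --supervise \
>     --jars hdfs://namenode:8020/myjars/algo-deps.jar,hdfs://namenode:8020/myjars/commons-util.jar \
>     --class com.algorithm hdfs://namenode:8020/my/path/algorithm.jar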


Am I missing something?

thanks,
Hisham


