On 17 Dec 2015, at 16:50, Saiph Kappa <saiph.ka...@gmail.com> wrote:

Hi,

Since it is not currently possible to submit a Spark job in cluster mode to a 
Spark cluster running standalone (the deploy mode cannot be specified from 
within the code), can I do it with YARN?

I tried to do something like this (in Scala):

«

// Client object - main method
import org.apache.spark.SparkConf
import org.apache.spark.deploy.yarn.{Client, ClientArguments}

// Tell Spark we are running against YARN before building the conf.
System.setProperty("SPARK_YARN_MODE", "true")
val sparkConf = new SparkConf()

try {
  // argStrings holds the usual spark-submit-style arguments
  val args = new ClientArguments(argStrings, sparkConf)
  new Client(args, sparkConf).run()
} catch {
  case e: Exception =>
    Console.err.println(e.getMessage)
    System.exit(1)
}

System.exit(0)

» (adapted from http://blog.sequenceiq.com/blog/2014/08/22/spark-submit-in-java/)


However, it is not possible to create a new instance of Client, since 
org.apache.spark.deploy.yarn.Client is private to the spark package.

The standard way to work around a problem like this is to place your code in a 
package which has access. File a JIRA asking for a public API too, one that 
doesn't require you to set system properties as a way of passing parameters down.
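To make the workaround concrete: `private[spark]` means "visible to anything compiled under the org.apache.spark package", so a small bridge object placed in that package hierarchy can call the hidden class. The sketch below uses a hypothetical HiddenClient as a stand-in for the real YARN Client (it is not Spark code, just an illustration of the visibility rule):

```scala
// HiddenClient stands in for the package-private yarn Client.
// private[spark] makes it visible anywhere under org.apache.spark.
package org.apache.spark.deploy.yarn {
  private[spark] class HiddenClient(args: Array[String]) {
    def run(): String = s"submitted with ${args.length} args"
  }
}

// Because this object also lives under org.apache.spark, it can see
// HiddenClient and re-expose it through a public method.
package org.apache.spark.deploy.yarn.bridge {
  object ClientBridge {
    def submit(args: Array[String]): String =
      new org.apache.spark.deploy.yarn.HiddenClient(args).run()
  }
}
```

Code outside the org.apache.spark hierarchy can then call `ClientBridge.submit(...)` even though HiddenClient itself stays package-private; the same placement trick works for the real `org.apache.spark.deploy.yarn.Client`.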



Is there any way I can submit Spark jobs from code in cluster mode, without 
using the spark-submit script?
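One supported route worth noting: since Spark 1.4 there is a public programmatic launcher, org.apache.spark.launcher.SparkLauncher, which drives the same machinery as spark-submit from code. A minimal sketch, assuming Spark 1.4+ on the classpath and hypothetical paths (/path/to/app.jar, com.example.MyApp, yarn master URL):

```scala
import org.apache.spark.launcher.SparkLauncher

object Submit {
  def main(args: Array[String]): Unit = {
    // Configure the launch the same way spark-submit flags would.
    val process = new SparkLauncher()
      .setAppResource("/path/to/app.jar")   // hypothetical application jar
      .setMainClass("com.example.MyApp")    // hypothetical main class
      .setMaster("yarn")
      .setDeployMode("cluster")
      .launch()                             // spawns a child submit process

    // Wait for the submission process to finish.
    process.waitFor()
  }
}
```

This avoids both the private Client class and the system-property hack, at the cost of spawning a child process; it requires SPARK_HOME (or an explicit setSparkHome) so the launcher can find the Spark distribution.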


Thanks.
