Re: execute native system commands in Spark

2015-11-02 Thread Deenar Toraskar
You can do the following; make sure the number of executors you request equals the number of executors in your cluster.

import scala.sys.process._
import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.deploy.SparkHadoopUtil

sc.parallelize(0 to 10).map { _ => ("hostname".!!).trim }
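A runnable sketch of the idea (a guess at the truncated snippet; it assumes a spark-shell session where sc is the SparkContext, and omits the Hadoop security imports since the truncated message does not show how they are used):

import scala.sys.process._

// One task per element; with as many executors as requested, the tasks
// spread across the cluster and `hostname` runs on every node.
val hostnames = sc.parallelize(0 to 10)
  .map(_ => ("hostname".!!).trim)  // !! runs the command and returns its stdout
  .collect()
  .distinct

hostnames.foreach(println)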

Re: execute native system commands in Spark

2015-11-02 Thread Adrian Tanase
Have you seen .pipe()?
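For reference, RDD.pipe forks the given external command once per partition, writes the partition's elements to the command's stdin (one per line), and returns its stdout lines as a new RDD[String]. A minimal sketch in spark-shell:

// Pipe each partition through the external `tr` command.
val words = sc.parallelize(Seq("hello", "world"), numSlices = 2)
val shouted = words.pipe("tr a-z A-Z")

shouted.collect().foreach(println)  // prints HELLO and WORLD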

execute native system commands in Spark

2015-11-02 Thread patcharee
Hi,

Is it possible to execute native system commands (in parallel) in Spark, like scala.sys.process?

Best,
Patcharee
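(For context, scala.sys.process is the Scala standard library's API for shelling out; a minimal standalone example, assuming a Unix-like OS:)

import scala.sys.process._

// !! runs the command and returns its stdout as a String
// (throws an exception if the exit code is nonzero).
val output: String = "uname -a".!!

// ! runs the command and returns only its exit code.
val status: Int = "ls /tmp".!

println(output.trim)
println(s"ls exit code: $status")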