Re: run multiple spark applications in parallel

2014-10-28 Thread Zhan Zhang
You can set your executor count with --num-executors. Also, changing to yarn-client saves you one container for the driver. Then check your YARN ResourceManager to make sure there are enough containers available to serve your extra apps.

Thanks,
Zhan Zhang

On Oct 28, 2014, at 5:31 PM, Soumya Simanta …
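A minimal sketch of what Zhan describes (the flag values here are illustrative, not from the thread):

    ./bin/spark-submit --class CLASSNAME --master yarn-client \
      --num-executors 2 --executor-memory 1g --executor-cores 1 \
      UBER.JAR <args>

In yarn-client mode the driver JVM runs on the submitting machine, so YARN only has to allocate the executor containers. To see what the ResourceManager has left, assuming a standard Hadoop 2.x install:

    yarn application -list    # apps currently ACCEPTED or RUNNING
    # or browse the RM web UI, typically http://<rm-host>:8088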

Re: run multiple spark applications in parallel

2014-10-28 Thread Soumya Simanta
Maybe changing --master yarn-cluster to --master yarn-client will help.

On Tue, Oct 28, 2014 at 7:25 PM, Josh J wrote:
> Sorry, I should've included some stats with my email
>
> I execute each job in the following manner
>
> ./bin/spark-submit --class CLASSNAME --master yarn-cluster --driver-memory …
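Concretely, taking Josh's full command (quoted in his message below) and swapping only the master, everything else unchanged, would look like:

    ./bin/spark-submit --class CLASSNAME --master yarn-client \
      --driver-memory 1g --executor-memory 1g --executor-cores 1 \
      UBER.JAR ${ZK_PORT_2181_TCP_ADDR} my-consumer-group1 1

This keeps the driver on the gateway box instead of in a YARN container, freeing one container per application.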

Re: run multiple spark applications in parallel

2014-10-28 Thread Josh J
Sorry, I should've included some stats with my email.

I execute each job in the following manner:

    ./bin/spark-submit --class CLASSNAME --master yarn-cluster \
      --driver-memory 1g --executor-memory 1g --executor-cores 1 \
      UBER.JAR ${ZK_PORT_2181_TCP_ADDR} my-consumer-group1 1

The box has 24 CPUs, Intel …
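Back-of-envelope arithmetic on why a second app can queue here (assuming Spark ~1.1 defaults, where spark.yarn.executor.memoryOverhead is 384 MB and YARN rounds each container request up to yarn.scheduler.minimum-allocation-mb, 1024 MB by default):

    per container: 1024 MB requested + 384 MB overhead = 1408 MB
                   -> rounded up to 2048 MB
    per app (yarn-cluster): AM/driver container + 1 executor = ~4096 MB

If the node's yarn.nodemanager.resource.memory-mb is small (the default is 8192 MB), a second application may not fit and will sit in the ACCEPTED state in the ResourceManager UI rather than run.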

Re: run multiple spark applications in parallel

2014-10-28 Thread Soumya Simanta
Try reducing the resources (cores and memory) of each application.

> On Oct 28, 2014, at 7:05 PM, Josh J wrote:
>
> Hi,
>
> How do I run multiple spark applications in parallel? I tried to run on a
> YARN cluster, but the second application submitted does not run.
>
> Thanks,
> Josh
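A sketch of what that could look like against the command Josh posted (the 512m figures are illustrative, not from the thread):

    ./bin/spark-submit --class CLASSNAME --master yarn-cluster \
      --driver-memory 512m --executor-memory 512m --executor-cores 1 \
      --num-executors 1 \
      UBER.JAR ${ZK_PORT_2181_TCP_ADDR} my-consumer-group1 1

Smaller per-container requests leave more headroom in the YARN queue, so a second application can be scheduled alongside the first.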