Do you mind sharing what your software does? What is the input data size? Which 
Spark version and APIs are you using? How many nodes? What is the input data 
format? Is compression used?
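One thing worth checking before tuning anything else: the resources requested in the spark-submit command below add up to more cores than the cluster has. A minimal sketch of the arithmetic (Python; the cluster figures and flags are taken from the message below, and the suggested alternative sizing is only an illustration, not a tested recommendation):

```python
# Cluster figures from the message below.
cluster_cores = 96
cluster_memory_gb = 600

# Flags from the spark-submit command below.
num_executors = 50
executor_cores = 2
executor_memory_gb = 10

requested_cores = num_executors * executor_cores          # 50 * 2 = 100
requested_memory_gb = num_executors * executor_memory_gb  # 50 * 10 = 500

print(f"cores requested:  {requested_cores} of {cluster_cores}")
print(f"memory requested: {requested_memory_gb} GB of {cluster_memory_gb} GB")

# 100 cores exceed the 96 available, so YARN cannot launch all 50
# executors as requested; fewer, larger executors (for example
# --num-executors 18 --executor-cores 5) would fit without
# oversubscribing cores. Note this ignores YARN's per-executor
# memory overhead, which reduces the usable memory further.
```

This is only a back-of-the-envelope check; actual container counts also depend on the YARN queue limits and memory overhead settings.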

> On 21 Sep 2016, at 13:37, Trinadh Kaja <ktr.hadoo...@gmail.com> wrote:
> 
> Hi all,
> 
> How can I increase Spark performance? I am using PySpark.
> 
> cluster info :
> 
> Total memory: 600 GB
> Cores:        96
> 
> command :
> spark-submit --master  yarn-client --executor-memory 10G --num-executors 50 
> --executor-cores 2 --driver-memory 10g --queue thequeue
>  
> 
> Please help with this.
> 
> -- 
> Thanks&Regards
> K.Trinadh
> Ph-7348826118
