Re: Increase no of tasks

2018-06-26 Thread Lalwani, Jayesh
You can use the repartition method of DataFrame to change the number of partitions: https://spark.apache.org/docs/2.3.0/api/scala/index.html#org.apache.spark.sql.Dataset@repartition(numPartitions:Int):org.apache.spark.sql.Dataset[T]
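A minimal sketch of what that could look like, assuming a SparkSession is available and using a hypothetical Parquet input path (df, the path, and the target count of 25 are placeholders, not from the thread):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("RepartitionExample").getOrCreate()

    // Hypothetical input; substitute your actual source.
    val df = spark.read.parquet("/path/to/input")

    // Redistribute the rows into 25 partitions so that a stage over
    // this Dataset can run 25 tasks, one per partition. Note that
    // repartition performs a full shuffle of the data.
    val df25 = df.repartition(25)

    println(df25.rdd.getNumPartitions) // 25

Since repartition shuffles everything, coalesce(n) is the cheaper choice when you only want to reduce the partition count.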

Re: Increase no of tasks

2018-06-22 Thread pratik4891
It's the default; I haven't changed that. Is there any specific way I can know that number? Thanks
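One way to inspect that number, as a short sketch assuming a spark-shell session where df is the DataFrame in question (the name is a placeholder):

    // Number of partitions backing the DataFrame; a stage that scans
    // it launches one task per partition.
    val numPartitions = df.rdd.getNumPartitions
    println(s"partitions: $numPartitions")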

Re: Increase no of tasks

2018-06-22 Thread Apostolos N. Papadopoulos
How many partitions do you have in your data?

On 22/06/2018 09:46 PM, pratik4891 wrote:
> Hi Gurus, I am running a Spark job, and in one stage it's creating 9
> tasks. So even though I have 25 executors, only 9 are getting utilized.
> The other executors go to dead status. How can I increase the number of
> tasks?
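For context on where that task count comes from: in DataFrame jobs, a scan stage gets one task per input partition, while shuffle stages are sized by spark.sql.shuffle.partitions (200 by default in Spark 2.3). A sketch of tuning that setting, assuming a spark-shell session where spark is predefined; the value 25 just mirrors the executor count in the thread:

    // Raise the parallelism of shuffle stages (joins, aggregations,
    // repartitions) to match the available executors.
    spark.conf.set("spark.sql.shuffle.partitions", "25")

    // Confirm the setting took effect.
    println(spark.conf.get("spark.sql.shuffle.partitions"))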