Kudos go to Josh.

Cheers

> On Nov 14, 2015, at 10:04 PM, Jerry Lam <chiling...@gmail.com> wrote:
> 
> Hi Ted, 
> 
> That looks like exactly what is happening. It has been 5 hours now. The 
> code was built for 1.4. Thank you very much! 
> 
> Best Regards,
> 
> Jerry
> 
> Sent from my iPhone
> 
>> On 14 Nov, 2015, at 11:21 pm, Ted Yu <yuzhih...@gmail.com> wrote:
>> 
>> Which release are you using?
>> If it is older than 1.5.0, you are missing some fixes, such as SPARK-9952.
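>> 
>> If you are not sure which release the cluster is actually running, a quick 
>> sanity check from spark-shell (a minimal sketch; `sc` is the SparkContext 
>> the shell predefines, and `spark-submit --version` works from the command 
>> line as well):
>> 
>> ```
>> // In spark-shell the SparkContext is predefined as `sc`; this prints the
>> // version of the Spark that is actually running, e.g. "1.4.1".
>> println(sc.version)
>> ```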
>> 
>> Cheers
>> 
>>> On Sat, Nov 14, 2015 at 6:35 PM, Jerry Lam <chiling...@gmail.com> wrote:
>>> Hi spark users and developers,
>>> 
>>> Has anyone experienced slow startup of a job when it contains a stage 
>>> with over 4 million tasks? 
>>> The job has been pending for 1.4 hours without doing anything (please refer 
>>> to the attached pictures). However, the driver is busy doing something. I 
>>> ran jstack on the driver and found the following relevant stack trace:
>>> 
>>> ```
>>> "dag-scheduler-event-loop" daemon prio=10 tid=0x00007f24a8c59800 nid=0x454 
>>> runnable [0x00007f23b3e29000]
>>>    java.lang.Thread.State: RUNNABLE
>>>         at 
>>> org.apache.spark.rdd.RDD$$anonfun$preferredLocations$2.apply(RDD.scala:231)
>>>         at 
>>> org.apache.spark.rdd.RDD$$anonfun$preferredLocations$2.apply(RDD.scala:231)
>>>         at scala.Option.getOrElse(Option.scala:120)
>>>         at org.apache.spark.rdd.RDD.preferredLocations(RDD.scala:230)
>>>         at 
>>> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal(DAGScheduler.scala:1399)
>>>         at 
>>> org.apache.spark.scheduler.DAGScheduler.getPreferredLocs(DAGScheduler.scala:1373)
>>>         at 
>>> org.apache.spark.scheduler.DAGScheduler$$anonfun$16.apply(DAGScheduler.scala:911)
>>>         at 
>>> org.apache.spark.scheduler.DAGScheduler$$anonfun$16.apply(DAGScheduler.scala:910)
>>>         at 
>>> scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>>>         at 
>>> scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>>>         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>>>         at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>>>         at 
>>> scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>>>         at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>>>         at 
>>> scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>>>         at scala.collection.AbstractTraversable.map(Traversable.scala:105)
>>>         at 
>>> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:910)
>>>         at 
>>> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:834)
>>>         at 
>>> org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$submitStage$4.apply(DAGScheduler.scala:837)
>>>         at 
>>> org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$submitStage$4.apply(DAGScheduler.scala:836)
>>>         at scala.collection.immutable.List.foreach(List.scala:318)
>>>         at 
>>> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:836)
>>>         at 
>>> org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:818)
>>>         at 
>>> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1453)
>>>         at 
>>> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1445)
>>>         at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
>>> ```
>>> 
>>> It seems that it takes a long time for the driver to create/schedule the 
>>> DAG for that many tasks. Is there a way to speed it up? 
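>>> 
>>> From the stack trace, the driver appears to be computing preferred 
>>> locations for every one of the ~4 million tasks before the stage can even 
>>> be submitted, so cutting the task count should cut the startup cost. One 
>>> workaround I am considering is coalescing the RDD down to far fewer 
>>> partitions before this stage; a rough sketch (the input path and target 
>>> partition count below are made-up placeholders):
>>> 
>>> ```
>>> import org.apache.spark.{SparkConf, SparkContext}
>>> 
>>> object CoalesceSketch {
>>>   def main(args: Array[String]): Unit = {
>>>     val sc = new SparkContext(new SparkConf().setAppName("coalesce-sketch"))
>>> 
>>>     // Hypothetical input that would otherwise yield millions of partitions.
>>>     val hugeRdd = sc.textFile("hdfs:///path/to/input")
>>> 
>>>     // coalesce(n) merges partitions without a shuffle, so the stage runs
>>>     // with far fewer tasks for the DAGScheduler to compute locations for.
>>>     val fewer = hugeRdd.coalesce(10000)
>>> 
>>>     println(fewer.count())
>>>     sc.stop()
>>>   }
>>> }
>>> ```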
>>> 
>>> Best Regards,
>>> 
>>> Jerry
>>> 
>>> 
>> 
