Hi,

I have a Hidden Markov Model job running on about 200MB of data. Once the program finishes (i.e. all stages/jobs are done), it hangs for 20 minutes or so and then the master is killed.

The following error appears in the Spark master log:

2015-02-12 13:00:05,035 ERROR akka.actor.ActorSystemImpl: Uncaught fatal error from thread [sparkMaster-akka.actor.default-dispatcher-31] shutting down ActorSystem [sparkMaster]
java.lang.OutOfMemoryError: GC overhead limit exceeded
        at scala.collection.immutable.List$.newBuilder(List.scala:396)
        at scala.collection.generic.GenericTraversableTemplate$class.genericBuilder(GenericTraversableTemplate.scala:69)
        at scala.collection.AbstractTraversable.genericBuilder(Traversable.scala:105)
        at scala.collection.generic.GenTraversableFactory$GenericCanBuildFrom.apply(GenTraversableFactory.scala:58)
        at scala.collection.generic.GenTraversableFactory$GenericCanBuildFrom.apply(GenTraversableFactory.scala:53)
        at scala.collection.TraversableLike$class.builder$1(TraversableLike.scala:239)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:243)
        at scala.collection.AbstractTraversable.map(Traversable.scala:105)
        at org.json4s.MonadicJValue$$anonfun$org$json4s$MonadicJValue$$findDirectByName$1.apply(MonadicJValue.scala:26)
        at org.json4s.MonadicJValue$$anonfun$org$json4s$MonadicJValue$$findDirectByName$1.apply(MonadicJValue.scala:22)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
        at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
        at org.json4s.MonadicJValue.org$json4s$MonadicJValue$$findDirectByName(MonadicJValue.scala:22)
        at org.json4s.MonadicJValue.$bslash(MonadicJValue.scala:16)
        at org.apache.spark.util.JsonProtocol$.taskStartFromJson(JsonProtocol.scala:450)
        at org.apache.spark.util.JsonProtocol$.sparkEventFromJson(JsonProtocol.scala:423)
        at org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2$$anonfun$apply$1.apply(ReplayListenerBus.scala:71)
        at org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2$$anonfun$apply$1.apply(ReplayListenerBus.scala:69)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2.apply(ReplayListenerBus.scala:69)
        at org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2.apply(ReplayListenerBus.scala:55)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
        at org.apache.spark.scheduler.ReplayListenerBus.replay(ReplayListenerBus.scala:55)
        at org.apache.spark.deploy.master.Master.rebuildSparkUI(Master.scala:726)
        at org.apache.spark.deploy.master.Master.removeApplication(Master.scala:675)
        at org.apache.spark.deploy.master.Master.finishApplication(Master.scala:653)
        at org.apache.spark.deploy.master.Master$$anonfun$receiveWithLogging$1$$anonfun$applyOrElse$29.apply(Master.scala:399)
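From the trace it looks like the OutOfMemoryError happens while the master replays the application's event log to rebuild the web UI after the app finishes (Master.rebuildSparkUI via ReplayListenerBus.replay), rather than in the job itself. As a rough sketch of what I'm thinking of trying, the settings and values below are only my guesses and not something I've verified yet:

    # conf/spark-env.sh -- give the standalone master daemon a larger heap
    export SPARK_DAEMON_MEMORY=2g

    # conf/spark-defaults.conf -- or avoid the replay by disabling event logging
    spark.eventLog.enabled    false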

Can anyone help?

..Manas
