So I need to reconfigure my SparkContext this way:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local")
  .setAppName("CountingSheep")
  .set("spark.executor.memory", "1g")
  .set("spark.akka.frameSize", "20") // raise the Akka frame size above the 10 MB default
val sc = new SparkContext(conf)
And then start a new SparkContext with this configuration?
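To confirm the setting at least registers on the driver, I can read it
back (though per SPARK-1112 the executors may still not see it):

// Should print "20" on the driver; before the fix, executors could
// still be running with the 10 MB default.
println(sc.getConf.get("spark.akka.frameSize"))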
news20.binary's feature dimension is 1.35M, so the serialized task
size is above the default limit of 10M (the weight vector alone is
1.35M doubles x 8 bytes, roughly 10.8 MB). You need to set
spark.akka.frameSize to, e.g., 20. Due to a bug (SPARK-1112), this
parameter is not passed to executors automatically, which causes Spark
to freeze. This was fixed in the latest master.
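For reference, the flow in question looks roughly like this (a minimal
sketch assuming the sc configured above; the file path and iteration
count are placeholders, not from this thread):

import org.apache.spark.mllib.classification.SVMWithSGD
import org.apache.spark.mllib.util.MLUtils

// Load news20.binary in LIBSVM format (placeholder path).
val examples = MLUtils.loadLibSVMFile(sc, "data/news20.binary")

// Each SGD iteration serializes the current weight vector into the
// task closure: 1.35M doubles * 8 bytes, about 10.8 MB, which overflows
// the 10 MB default frame size unless spark.akka.frameSize is raised.
val model = SVMWithSGD.train(examples, 100)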
I tried the newest branch, but I still get stuck on the same task:
(kill) runJob at SlidingRDD.scala:74
It means pulling the code from the latest development branch of the git
repository (i.e., building from master at https://github.com/apache/spark,
not the 1.0.0 release).
On Jul 9, 2014 9:45 AM, "AlexanderRiggers" wrote:
> By latest branch, do you mean Apache Spark 1.0.0? And what do you mean
> by master? Because I am using v1.0.0. - Alex