Hi!
I wrote a standalone-cluster app in Scala and set some properties like
this:

// Set before the SparkContext is created
System.setProperty("spark.akka.frameSize", "100")
System.setProperty("spark.executor.memory", "3g")
val sc = new SparkContext(...)
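
For reference, the same settings could presumably also be passed through a SparkConf (added in Spark 0.9) instead of System properties. A minimal sketch, where the master URL and app name are placeholders, not my real values:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: carry the settings on a SparkConf and hand it to the context,
// so they do not depend on JVM system properties being read at the right time.
val conf = new SparkConf()
  .setMaster("spark://Master.Hadoop:7077") // assumed master URL
  .setAppName("MyApp")                     // assumed app name
  .set("spark.akka.frameSize", "100")      // MB
  .set("spark.executor.memory", "3g")
val sc = new SparkContext(conf)
```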

Strangely, "spark.executor.memory" takes effect, but
"spark.akka.frameSize" does not.
Here is the executor launch command shown in the Spark UI:
Spark Executor Command: "/usr/java/jdk1.6.0_31/bin/java" "-cp"
":/usr/spark/conf:/usr/spark/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop1.2.1.jar"
"-Xms3072M" "-Xmx3072M"
"org.apache.spark.executor.CoarseGrainedExecutorBackend"
"akka.tcp://spark@Master.Hadoop:55755/user/CoarseGrainedScheduler" "8"
"Salve2.Hadoop" "8" "akka.tcp://sparkWorker@Salve2.Hadoop:56145/user/Worker"
"app-20140301165638-0001" 

Because "spark.akka.frameSize" defaults to 10 (MB), my app keeps showing
"Lost TID".
I build with sbt package and launch with sbt run.

So, what is the reason, and how can I make "spark.akka.frameSize" take
effect?
Thank you very much!



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-properties-setting-doesn-t-take-effect-tp2201.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
