Hi All,
I see the messages below on my worker nodes: every executor launched for my ApproxStrMatch application finishes with state FAILED (exit code 1). Are these failures due to improper cleanup or a wrong configuration? Any help with this would be great!
14/06/25 12:30:55 INFO SecurityManager: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/06/25 12:30:55 INFO SecurityManager: Changing view acls to: userid
14/06/25 12:30:55 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(p529444)
14/06/25 12:30:56 INFO Slf4jLogger: Slf4jLogger started
14/06/25 12:30:56 INFO Remoting: Starting remoting
14/06/25 12:30:56 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkWorker@worker1ip:60276]
14/06/25 12:30:57 INFO Worker: Starting Spark worker worker1ip:60276 with 1 cores, 2.7 GB RAM
14/06/25 12:30:57 INFO Worker: Spark home: /apps/software/spark-1.0.0-bin-hadoop1
14/06/25 12:30:57 INFO WorkerWebUI: Started WorkerWebUI at http://worker1ip:8081
14/06/25 12:30:57 INFO Worker: Connecting to master spark://serverip:7077...
14/06/25 12:30:57 INFO Worker: Successfully registered with master spark://serverip:7077
14/06/25 12:32:05 INFO Worker: Asked to launch executor app-20140625123205-0000/2 for ApproxStrMatch
14/06/25 12:32:05 INFO ExecutorRunner: Launch command: "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/bin/java" "-cp" "::/apps/software/spark-1.0.0-bin-hadoop1/conf:/apps/software/spark-1.0.0-bin-hadoop1/lib/spark-assembly-1.0.0-hadoop1.0.4.jar:/apps/hadoop/hadoop-conf" "-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@localhost:56569/user/CoarseGrainedScheduler" "2" "worker1ip" "1" "akka.tcp://sparkWorker@worker1ip:60276/user/Worker" "app-20140625123205-0000"
14/06/25 12:32:09 INFO Worker: Executor app-20140625123205-0000/2 finished with state FAILED message Command exited with code 1 exitStatus 1
14/06/25 12:32:09 INFO Worker: Asked to launch executor app-20140625123205-0000/5 for ApproxStrMatch
14/06/25 12:32:09 INFO ExecutorRunner: Launch command: "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/bin/java" "-cp" "::/apps/software/spark-1.0.0-bin-hadoop1/conf:/apps/software/spark-1.0.0-bin-hadoop1/lib/spark-assembly-1.0.0-hadoop1.0.4.jar:/apps/hadoop/hadoop-conf" "-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@localhost:56569/user/CoarseGrainedScheduler" "5" "worker1ip" "1" "akka.tcp://sparkWorker@worker1ip:60276/user/Worker" "app-20140625123205-0000"
14/06/25 12:32:12 INFO Worker: Executor app-20140625123205-0000/5 finished with state FAILED message Command exited with code 1 exitStatus 1
14/06/25 12:32:12 INFO Worker: Asked to launch executor app-20140625123205-0000/9 for ApproxStrMatch
14/06/25 12:32:12 INFO ExecutorRunner: Launch command: "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/bin/java" "-cp" "::/apps/software/spark-1.0.0-bin-hadoop1/conf:/apps/software/spark-1.0.0-bin-hadoop1/lib/spark-assembly-1.0.0-hadoop1.0.4.jar:/apps/hadoop/hadoop-conf" "-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@localhost:56569/user/CoarseGrainedScheduler" "9" "worker1ip" "1" "akka.tcp://sparkWorker@worker1ip:60276/user/Worker" "app-20140625123205-0000"
14/06/25 12:32:16 INFO Worker: Asked to kill executor app-20140625123205-0000/9
14/06/25 12:32:16 INFO ExecutorRunner: Runner thread for executor app-20140625123205-0000/9 interrupted
14/06/25 12:32:16 INFO ExecutorRunner: Killing process!
14/06/25 12:32:16 INFO Worker: Executor app-20140625123205-0000/9 finished with state KILLED
14/06/25 13:28:44 INFO Worker: Asked to launch executor app-20140625132844-0001/2 for ApproxStrMatch
14/06/25 13:28:44 INFO ExecutorRunner: Launch command: "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/bin/java" "-cp" "::/apps/software/spark-1.0.0-bin-hadoop1/conf:/apps/software/spark-1.0.0-bin-hadoop1/lib/spark-assembly-1.0.0-hadoop1.0.4.jar:/apps/hadoop/hadoop-conf" "-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@localhost:46648/user/CoarseGrainedScheduler" "2" "worker1ip" "1" "akka.tcp://sparkWorker@worker1ip:60276/user/Worker" "app-20140625132844-0001"
14/06/25 13:28:48 INFO Worker: Executor app-20140625132844-0001/2 finished with state FAILED message Command exited with code 1 exitStatus 1
14/06/25 13:28:48 INFO Worker: Asked to launch executor app-20140625132844-0001/5 for ApproxStrMatch
14/06/25 13:28:48 INFO ExecutorRunner: Launch command: "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/bin/java" "-cp" "::/apps/software/spark-1.0.0-bin-hadoop1/conf:/apps/software/spark-1.0.0-bin-hadoop1/lib/spark-assembly-1.0.0-hadoop1.0.4.jar:/apps/hadoop/hadoop-conf" "-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@localhost:46648/user/CoarseGrainedScheduler" "5" "worker1ip" "1" "akka.tcp://sparkWorker@worker1ip:60276/user/Worker" "app-20140625132844-0001"
14/06/25 13:28:51 INFO Worker: Executor app-20140625132844-0001/5 finished with state FAILED message Command exited with code 1 exitStatus 1
14/06/25 13:28:51 INFO Worker: Asked to launch executor app-20140625132844-0001/8 for ApproxStrMatch
14/06/25 13:28:51 INFO ExecutorRunner: Launch command: "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/bin/java" "-cp" "::/apps/software/spark-1.0.0-bin-hadoop1/conf:/apps/software/spark-1.0.0-bin-hadoop1/lib/spark-assembly-1.0.0-hadoop1.0.4.jar:/apps/hadoop/hadoop-conf" "-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@localhost:46648/user/CoarseGrainedScheduler" "8" "worker1ip" "1" "akka.tcp://sparkWorker@worker1ip:60276/user/Worker" "app-20140625132844-0001"
14/06/25 13:28:54 INFO Worker: Executor app-20140625132844-0001/8 finished with state FAILED message Command exited with code 1 exitStatus 1
14/06/25 13:30:31 INFO Worker: Asked to launch executor app-20140625133031-0002/2 for ApproxStrMatch
14/06/25 13:30:31 INFO ExecutorRunner: Launch command: "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/bin/java" "-cp" "::/apps/software/spark-1.0.0-bin-hadoop1/conf:/apps/software/spark-1.0.0-bin-hadoop1/lib/spark-assembly-1.0.0-hadoop1.0.4.jar:/apps/hadoop/hadoop-conf" "-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@localhost:42235/user/CoarseGrainedScheduler" "2" "worker1ip" "1" "akka.tcp://sparkWorker@worker1ip:60276/user/Worker" "app-20140625133031-0002"
14/06/25 13:30:34 INFO Worker: Executor app-20140625133031-0002/2 finished with state FAILED message Command exited with code 1 exitStatus 1
14/06/25 13:30:34 INFO Worker: Asked to launch executor app-20140625133031-0002/5 for ApproxStrMatch
14/06/25 13:30:35 INFO ExecutorRunner: Launch command: "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/bin/java" "-cp" "::/apps/software/spark-1.0.0-bin-hadoop1/conf:/apps/software/spark-1.0.0-bin-hadoop1/lib/spark-assembly-1.0.0-hadoop1.0.4.jar:/apps/hadoop/hadoop-conf" "-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@localhost:42235/user/CoarseGrainedScheduler" "5" "worker1ip" "1" "akka.tcp://sparkWorker@worker1ip:60276/user/Worker" "app-20140625133031-0002"
14/06/25 13:30:36 INFO Worker: Asked to kill executor app-20140625133031-0002/5
14/06/25 13:30:36 INFO Worker: Executor app-20140625133031-0002/5 finished with state KILLED
14/06/25 13:30:36 INFO ExecutorRunner: Runner thread for executor app-20140625133031-0002/5 interrupted
14/06/25 13:30:36 INFO ExecutorRunner: Killing process!
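
One thing I notice in the launch commands above is that the executors are told to connect back to the driver at akka.tcp://spark@localhost:..., while everything else uses real hostnames (worker1ip, serverip). I don't know whether that is actually the cause of the failures, but if it is a driver-address problem, I assume the fix would be to pin the driver's advertised host explicitly when the SparkContext is created, roughly like the sketch below (this is not my actual application code, and "driver-host" is a placeholder hostname):

import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: advertise the driver under a hostname the workers can reach,
// instead of "localhost", so executors can register with the
// CoarseGrainedScheduler. "driver-host" is a placeholder, not my real machine.
val conf = new SparkConf()
  .setAppName("ApproxStrMatch")
  .setMaster("spark://serverip:7077")
  .set("spark.driver.host", "driver-host")

val sc = new SparkContext(conf)

Alternatively, I believe exporting SPARK_LOCAL_IP in conf/spark-env.sh on the driver machine is supposed to have a similar effect. Does that sound right, or should I be looking elsewhere?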