Hi Andrew,

Could you add the following log configuration to conf/log4j.properties
and run the job again? Please share the logs once it finishes.

log4j.logger.org.apache.spark.deploy.yarn.Client=DEBUG
log4j.logger.org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend=DEBUG
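In case conf/log4j.properties doesn't exist yet, one quick way to set it up is to copy the template Spark ships and append the two loggers. A sketch (the first two lines just fake a scratch Spark home for illustration; in practice run the copy/append from your real Spark home, e.g. /opt/mapr/spark/spark-1.5.2, where conf/log4j.properties.template already exists):

```shell
# Demo setup only: stand-in for a Spark home and its shipped template.
mkdir -p conf
printf 'log4j.rootCategory=INFO, console\n' > conf/log4j.properties.template

# The actual steps: copy the template, then append the DEBUG loggers
# for the YARN client code path.
cp conf/log4j.properties.template conf/log4j.properties
cat >> conf/log4j.properties <<'EOF'
log4j.logger.org.apache.spark.deploy.yarn.Client=DEBUG
log4j.logger.org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend=DEBUG
EOF
```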

You could also have a look at the logs in YARN itself. Go to the
ResourceManager UI at localhost:8088/cluster/apps and open your
application's logs there.
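If log aggregation is enabled on the cluster, you can also pull the same logs from the command line with the yarn CLI (the application id below is a placeholder; take the real one from the ResourceManager UI or from the list command):

```shell
# Find the application id of the failed run
yarn application -list -appStates ALL

# Fetch the aggregated container logs for it
# (replace the placeholder with your real application id)
yarn logs -applicationId application_XXXXXXXXXXXXX_XXXX
```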


Best regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Mon, May 9, 2016 at 9:45 AM, Andrew Holway
<andrew.hol...@otternetworks.de> wrote:
> Hi,
>
> I am having a hard time getting to the bottom of this problem. I'm really
> not sure where to start with it. Everything works fine in local mode.
>
> Cheers,
>
> Andrew
>
> [testing@instance-16826 ~]$ /opt/mapr/spark/spark-1.5.2/bin/spark-submit --num-executors 21 --executor-cores 5 --master yarn-client --executor-memory 5g ~/pi.py
>
> 16/05/09 13:35:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 16/05/09 13:35:08 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
> 16/05/09 13:35:14 ERROR SparkContext: Error initializing SparkContext.
> org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
>     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:123)
>     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
>     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
>     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
>     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
>     at py4j.Gateway.invoke(Gateway.java:214)
>     at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
>     at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
>     at py4j.GatewayConnection.run(GatewayConnection.java:207)
>     at java.lang.Thread.run(Thread.java:745)
>
> 16/05/09 13:35:14 ERROR Utils: Uncaught exception in thread Thread-3
> java.lang.NullPointerException
>     at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
>     at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1228)
>     at org.apache.spark.SparkEnv.stop(SparkEnv.scala:100)
>     at org.apache.spark.SparkContext$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1749)
>     at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
>     at org.apache.spark.SparkContext.stop(SparkContext.scala:1748)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:593)
>     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
>     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
>     at py4j.Gateway.invoke(Gateway.java:214)
>     at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
>     at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
>     at py4j.GatewayConnection.run(GatewayConnection.java:207)
>     at java.lang.Thread.run(Thread.java:745)
>
> Traceback (most recent call last):
>   File "/home/testing/pi.py", line 30, in <module>
>     sc = SparkContext(appName="PythonPi")
>   File "/opt/mapr/spark/spark-1.5.2/python/lib/pyspark.zip/pyspark/context.py", line 113, in __init__
>   File "/opt/mapr/spark/spark-1.5.2/python/lib/pyspark.zip/pyspark/context.py", line 170, in _do_init
>   File "/opt/mapr/spark/spark-1.5.2/python/lib/pyspark.zip/pyspark/context.py", line 224, in _initialize_context
>   File "/opt/mapr/spark/spark-1.5.2/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 701, in __call__
>   File "/opt/mapr/spark/spark-1.5.2/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value
> py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
> : org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
>     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:123)
>     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
>     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
>     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
>     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
>     at py4j.Gateway.invoke(Gateway.java:214)
>     at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
>     at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
>     at py4j.GatewayConnection.run(GatewayConnection.java:207)
>     at java.lang.Thread.run(Thread.java:745)

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
