Could you follow this guide? http://spark.apache.org/docs/latest/running-on-yarn.html#configuration
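In particular, make sure the shell is launched against YARN with the Hadoop client configuration visible. A minimal sketch of what that guide asks for (the HADOOP_CONF_DIR path below is only an example; point it at wherever your yarn-site.xml actually lives):

    # Spark on YARN discovers the ResourceManager through the Hadoop client
    # configuration, so this must point at the directory holding
    # core-site.xml / yarn-site.xml (example path; adjust to your install)
    export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
    export YARN_CONF_DIR=$HADOOP_CONF_DIR

    # Spark 1.6 syntax for an interactive shell in yarn-client mode
    ./bin/spark-shell --master yarn --deploy-mode client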
Thanks,
Yucai

-----Original Message-----
From: maheshmath [mailto:mahesh.m...@gmail.com]
Sent: Saturday, April 9, 2016 1:58 PM
To: user@spark.apache.org
Subject: Unable run Spark in YARN mode

I have set SPARK_LOCAL_IP=127.0.0.1 (see the spark-env.sh sketch at the end of this message) but am still getting the error below:

16/04/09 10:36:50 INFO spark.SecurityManager: Changing view acls to: mahesh
16/04/09 10:36:50 INFO spark.SecurityManager: Changing modify acls to: mahesh
16/04/09 10:36:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(mahesh); users with modify permissions: Set(mahesh)
16/04/09 10:36:51 INFO util.Utils: Successfully started service 'sparkDriver' on port 43948.
16/04/09 10:36:51 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/04/09 10:36:51 INFO Remoting: Starting remoting
16/04/09 10:36:52 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@127.0.0.1:32792]
16/04/09 10:36:52 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 32792.
16/04/09 10:36:52 INFO spark.SparkEnv: Registering MapOutputTracker
16/04/09 10:36:52 INFO spark.SparkEnv: Registering BlockManagerMaster
16/04/09 10:36:52 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-a2079037-6bbe-49ce-ba78-d475e38ad362
16/04/09 10:36:52 INFO storage.MemoryStore: MemoryStore started with capacity 517.4 MB
16/04/09 10:36:52 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/04/09 10:36:53 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/04/09 10:36:53 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/04/09 10:36:53 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
16/04/09 10:36:53 INFO ui.SparkUI: Started SparkUI at http://127.0.0.1:4040
16/04/09 10:36:53 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/04/09 10:36:54 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
16/04/09 10:36:54 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
16/04/09 10:36:54 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
16/04/09 10:36:54 INFO yarn.Client: Setting up container launch context for our AM
16/04/09 10:36:54 INFO yarn.Client: Setting up the launch environment for our AM container
16/04/09 10:36:54 INFO yarn.Client: Preparing resources for our AM container
16/04/09 10:36:56 INFO yarn.Client: Uploading resource file:/home/mahesh/Programs/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar -> hdfs://localhost:54310/user/mahesh/.sparkStaging/application_1460137661144_0003/spark-assembly-1.6.1-hadoop2.6.0.jar
16/04/09 10:36:59 INFO yarn.Client: Uploading resource file:/tmp/spark-f28e3fd5-4dcd-4199-b298-c7fc607dedb4/__spark_conf__5551799952710555772.zip -> hdfs://localhost:54310/user/mahesh/.sparkStaging/application_1460137661144_0003/__spark_conf__5551799952710555772.zip
16/04/09 10:36:59 INFO spark.SecurityManager: Changing view acls to: mahesh
16/04/09 10:36:59 INFO spark.SecurityManager: Changing modify acls to: mahesh
16/04/09 10:36:59 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(mahesh); users with modify permissions: Set(mahesh)
16/04/09 10:36:59 INFO yarn.Client: Submitting application 3 to ResourceManager
16/04/09 10:36:59 INFO impl.YarnClientImpl: Submitted application application_1460137661144_0003
16/04/09 10:37:00 INFO yarn.Client: Application report for application_1460137661144_0003 (state: ACCEPTED)
16/04/09 10:37:00 INFO yarn.Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1460178419692
     final status: UNDEFINED
     tracking URL: http://gubbi:8088/proxy/application_1460137661144_0003/
     user: mahesh
16/04/09 10:37:01 INFO yarn.Client: Application report for application_1460137661144_0003 (state: ACCEPTED)
16/04/09 10:37:02 INFO yarn.Client: Application report for application_1460137661144_0003 (state: ACCEPTED)
16/04/09 10:37:03 INFO yarn.Client: Application report for application_1460137661144_0003 (state: ACCEPTED)
16/04/09 10:37:04 INFO yarn.Client: Application report for application_1460137661144_0003 (state: ACCEPTED)
16/04/09 10:37:05 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
16/04/09 10:37:05 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> gubbi, PROXY_URI_BASES -> http://gubbi:8088/proxy/application_1460137661144_0003), /proxy/application_1460137661144_0003
16/04/09 10:37:05 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/04/09 10:37:05 INFO yarn.Client: Application report for application_1460137661144_0003 (state: ACCEPTED)
16/04/09 10:37:06 INFO yarn.Client: Application report for application_1460137661144_0003 (state: RUNNING)
16/04/09 10:37:06 INFO yarn.Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: 10.0.2.15
     ApplicationMaster RPC port: 0
     queue: default
     start time: 1460178419692
     final status: UNDEFINED
     tracking URL: http://gubbi:8088/proxy/application_1460137661144_0003/
     user: mahesh
16/04/09 10:37:06 INFO cluster.YarnClientSchedulerBackend: Application application_1460137661144_0003 has started running.
16/04/09 10:37:06 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42253.
16/04/09 10:37:06 INFO netty.NettyBlockTransferService: Server created on 42253
16/04/09 10:37:06 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/04/09 10:37:06 INFO storage.BlockManagerMasterEndpoint: Registering block manager 127.0.0.1:42253 with 517.4 MB RAM, BlockManagerId(driver, 127.0.0.1, 42253)
16/04/09 10:37:06 INFO storage.BlockManagerMaster: Registered BlockManager
16/04/09 10:37:07 INFO scheduler.EventLoggingListener: Logging events to file:/home/mahesh/logs/spark/application_1460137661144_0003
16/04/09 10:37:15 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
16/04/09 10:37:15 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> gubbi, PROXY_URI_BASES -> http://gubbi:8088/proxy/application_1460137661144_0003), /proxy/application_1460137661144_0003
16/04/09 10:37:15 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/04/09 10:37:15 ERROR cluster.YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/04/09 10:37:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/04/09 10:37:16 INFO ui.SparkUI: Stopped Spark web UI at http://127.0.0.1:4040
16/04/09 10:37:16 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
16/04/09 10:37:16 INFO cluster.YarnClientSchedulerBackend: Asking each executor to shut down
16/04/09 10:37:16 INFO cluster.YarnClientSchedulerBackend: Stopped
16/04/09 10:37:16 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/04/09 10:37:16 INFO storage.MemoryStore: MemoryStore cleared
16/04/09 10:37:16 INFO storage.BlockManager: BlockManager stopped
16/04/09 10:37:16 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
16/04/09 10:37:16 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/04/09 10:37:16 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/04/09 10:37:16 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/04/09 10:37:16 INFO spark.SparkContext: Successfully stopped SparkContext
16/04/09 10:37:16 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/04/09 10:37:23 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
16/04/09 10:37:23 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.NullPointerException
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
    at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
    at $line3.$read$$iwC$$iwC.<init>(<console>:15)
    at $line3.$read$$iwC.<init>(<console>:24)
    at $line3.$read.<init>(<console>:26)
    at $line3.$read$.<init>(<console>:30)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.<init>(<console>:7)
    at $line3.$eval$.<clinit>(<console>)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/04/09 10:37:23 INFO spark.SparkContext: SparkContext already stopped.
java.lang.NullPointerException
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
    at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
    at $iwC$$iwC.<init>(<console>:15)
    at $iwC.<init>(<console>:24)
    at <init>(<console>:26)
    at .<init>(<console>:30)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
java.lang.NullPointerException
    at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:15)
    at $iwC.<init>(<console>:24)
    at <init>(<console>:26)
    at .<init>(<console>:30)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

<console>:16: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:16: error: not found: value sqlContext
       import sqlContext.sql
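For reference, the SPARK_LOCAL_IP setting mentioned at the top was applied the usual spark-env.sh way, roughly like this (the install path matches the one visible in the log above):

    # /home/mahesh/Programs/spark-1.6.1-bin-hadoop2.6/conf/spark-env.sh
    # Bind the driver to the loopback address
    export SPARK_LOCAL_IP=127.0.0.1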