OK, this one had more information:

>  INFO [2019-03-15 02:00:46,364] ({pool-2-thread-3} Logging.scala[logInfo]:54) - Logging events to hdfs:///var/log/spark/applicationHistory/application_1551287663522_0145
> ERROR [2019-03-15 02:00:46,366] ({SparkListenerBus} Logging.scala[logError]:91) - uncaught error in thread SparkListenerBus, stopping SparkContext
> java.lang.NoSuchMethodError: org.json4s.Formats.emptyValueStrategy()Lorg/json4s/prefs/EmptyValueStrategy;
>     at org.json4s.jackson.JsonMethods$class.render(JsonMethods.scala:32)
>     at org.json4s.jackson.JsonMethods$.render(JsonMethods.scala:50)
>     at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$1.apply(EventLoggingListener.scala:136)
>     at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$1.apply(EventLoggingListener.scala:136)
>     at scala.Option.foreach(Option.scala:257)
>     at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:136)
>     at org.apache.spark.scheduler.EventLoggingListener.onBlockManagerAdded(EventLoggingListener.scala:168)
>     at org.apache.spark.scheduler.SparkListenerBus$class.doPostEvent(SparkListenerBus.scala:49)
>     at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:36)
>     at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:36)
>     at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:63)
>     at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:36)
>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:94)
>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
>     at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:78)
>     at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1245)
>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:77)
> ERROR [2019-03-15 02:00:46,367] ({SparkListenerBus} Logging.scala[logError]:91) - throw uncaught fatal error in thread SparkListenerBus
> java.lang.NoSuchMethodError: org.json4s.Formats.emptyValueStrategy()Lorg/json4s/prefs/EmptyValueStrategy;
>     at org.json4s.jackson.JsonMethods$class.render(JsonMethods.scala:32)
>     at org.json4s.jackson.JsonMethods$.render(JsonMethods.scala:50)
>     at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$1.apply(EventLoggingListener.scala:136)
>     at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$1.apply(EventLoggingListener.scala:136)
>     at scala.Option.foreach(Option.scala:257)
>     at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:136)
>     at org.apache.spark.scheduler.EventLoggingListener.onBlockManagerAdded(EventLoggingListener.scala:168)
>     at org.apache.spark.scheduler.SparkListenerBus$class.doPostEvent(SparkListenerBus.scala:49)
>     at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:36)
>     at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:36)
>     at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:63)
>     at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:36)
>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:94)
>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
>     at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:78)
>     at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1245)
>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:77)
>  INFO [2019-03-15 02:00:46,368] ({pool-2-thread-3} Logging.scala[logInfo]:54) - SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
>  INFO [2019-03-15 02:00:46,375] ({stop-spark-context} AbstractConnector.java[doStop]:306) - Stopped ServerConnector@718326a2{HTTP/1.1}{0.0.0.0:55600}
>  INFO [2019-03-15 02:00:46,376] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@69eb6481{/stages/stage/kill,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,376] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@1b8dfd51{/jobs/job/kill,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,377] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@6c8c0f44{/api,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,378] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@37481c3{/,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,378] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@5da5f62f{/static,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,379] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@7ee10d{/executors/threadDump/json,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,379] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@2b5899f0{/executors/threadDump,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,379] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@4bd2abab{/executors/json,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,380] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@260f612c{/executors,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,380] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@7d8ad0f6{/environment/json,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,380] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@4ee5c1e0{/environment,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,381] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@64c12e28{/storage/rdd/json,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,381] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@e9ab88b{/storage/rdd,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,381] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@72f8348e{/storage/json,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,382] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@322305bd{/storage,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,382] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@528716d6{/stages/pool/json,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,383] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@197cbc56{/stages/pool,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,383] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@fb9f45b{/stages/stage/json,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,383] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@3fabede6{/stages/stage,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,383] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@26fda9b6{/stages/json,null,UNAVAILABLE}
> ERROR [2019-03-15 02:00:46,383] ({pool-2-thread-3} NewSparkInterpreter.java[open]:127) - Fail to open SparkInterpreter
> ERROR [2019-03-15 02:00:46,384] ({pool-2-thread-3} Job.java[run]:190) - Job failed
> org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
>     at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:128)
>     at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:62)
>     at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
>     at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:616)
>     at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
>     at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.spark2CreateContext(BaseSparkScalaInterpreter.scala:259)
>     at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.createSparkContext(BaseSparkScalaInterpreter.scala:178)
>     at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:89)
>     at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:102)
>     ... 12 more
> Caused by: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
> This stopped SparkContext was created at:
>
> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> java.lang.reflect.Method.invoke(Method.java:498)
> org.apache.zeppelin.spark.BaseSparkScalaInterpreter.spark2CreateContext(BaseSparkScalaInterpreter.scala:259)
> org.apache.zeppelin.spark.BaseSparkScalaInterpreter.createSparkContext(BaseSparkScalaInterpreter.scala:178)
> org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:89)
> org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:102)
> org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:62)
> org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
> org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:616)
> org.apache.zeppelin.scheduler.Job.run(Job.java:188)
> org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> java.util.concurrent.FutureTask.run(FutureTask.java:266)
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>
> The currently active SparkContext was created at:
>
> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> java.lang.reflect.Method.invoke(Method.java:498)
> org.apache.zeppelin.spark.BaseSparkScalaInterpreter.spark2CreateContext(BaseSparkScalaInterpreter.scala:259)
> org.apache.zeppelin.spark.BaseSparkScalaInterpreter.createSparkContext(BaseSparkScalaInterpreter.scala:178)
> org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:89)
> org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:102)
> org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:62)
> org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
> org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:616)
> org.apache.zeppelin.scheduler.Job.run(Job.java:188)
> org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> java.util.concurrent.FutureTask.run(FutureTask.java:266)
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>
>     at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:100)
>     at org.apache.spark.sql.SparkSession.<init>(SparkSession.scala:82)
>     at org.apache.spark.sql.SparkSession.<init>(SparkSession.scala:79)
>     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:877)
>     ... 20 more
>  INFO [2019-03-15 02:00:46,384] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@12009a60{/stages,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,384] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@3cc19c74{/jobs/job/json,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,385] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@21371f10{/jobs/job,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,385] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@440e08ee{/jobs/json,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,385] ({stop-spark-context} ContextHandler.java[doStop]:865) - Stopped o.s.j.s.ServletContextHandler@491cfe62{/jobs,null,UNAVAILABLE}
>  INFO [2019-03-15 02:00:46,387] ({stop-spark-context} Logging.scala[logInfo]:54) - Stopped Spark web UI at http://10.1.50.128:55600
>  INFO [2019-03-15 02:00:46,387] ({pool-2-thread-3} SchedulerFactory.java[jobFinished]:120) - Job 20190222-204451_856915056 finished by scheduler interpreter_587572627
>  INFO [2019-03-15 02:00:46,502] ({Yarn application state monitor} Logging.scala[logInfo]:54) - Interrupting monitor thread
>  INFO [2019-03-15 02:00:46,549] ({stop-spark-context} Logging.scala[logInfo]:54) - Shutting down all executors
>  INFO [2019-03-15 02:00:46,549] ({dispatcher-event-loop-4} Logging.scala[logInfo]:54) - Asking each executor to shut down
>  INFO [2019-03-15 02:00:46,550] ({stop-spark-context} Logging.scala[logInfo]:54) - Stopping SchedulerExtensionServices
> (serviceOption=None,
>  services=List(),
>  started=false)
>  INFO [2019-03-15 02:00:46,552] ({stop-spark-context} Logging.scala[logInfo]:54) - Stopped
>  INFO [2019-03-15 02:00:46,554] ({dispatcher-event-loop-1} Logging.scala[logInfo]:54) - MapOutputTrackerMasterEndpoint stopped!
>  INFO [2019-03-15 02:00:46,560] ({stop-spark-context} Logging.scala[logInfo]:54) - MemoryStore cleared
>  INFO [2019-03-15 02:00:46,560] ({stop-spark-context} Logging.scala[logInfo]:54) - BlockManager stopped
>  INFO [2019-03-15 02:00:46,560] ({stop-spark-context} Logging.scala[logInfo]:54) - BlockManagerMaster stopped
>  INFO [2019-03-15 02:00:46,561] ({dispatcher-event-loop-3} Logging.scala[logInfo]:54) - OutputCommitCoordinator stopped!
>  INFO [2019-03-15 02:00:46,564] ({stop-spark-context} Logging.scala[logInfo]:54) - Successfully stopped SparkContext

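The first failure above looks like the real culprit: the event-logging listener dies with java.lang.NoSuchMethodError on org.json4s.Formats.emptyValueStrategy(), which stops the SparkContext, and everything after that (including the "Cannot call methods on a stopped SparkContext") is just fallout. A NoSuchMethodError like this usually means two incompatible json4s versions ended up on the interpreter's classpath, for example the one Spark ships versus one pulled in by Zeppelin or a user dependency. A minimal check, assuming a Scala REPL or Zeppelin paragraph running on the same classpath as the failing interpreter, is to ask the JVM which jar the class was actually loaded from:

    // Minimal sketch: print the jar that org.json4s.Formats was loaded from.
    // If this is not the json4s jar bundled with Spark, versions are mixed.
    println(classOf[org.json4s.Formats].getProtectionDomain.getCodeSource.getLocation)
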
On 3/14/19 9:42 PM, Jeff Zhang wrote:
> This log is the Zeppelin server log; the root cause should be in the Spark
> interpreter log. The file name is something like this:
> zeppelin-interpreter-spark*.log
> -- 
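
Once that interpreter log is in hand, the same NoSuchMethodError should show up there with more context. Another quick check, sketched below under the assumption that it runs inside the interpreter JVM (or any JVM launched with the same classpath), is to list every classpath entry that mentions json4s; two different versions there would explain the binary mismatch:

    // Rough sketch: list all json4s entries on the JVM's classpath.
    // Duplicate or mismatched versions would explain the NoSuchMethodError.
    import java.io.File
    System.getProperty("java.class.path")
      .split(File.pathSeparator)
      .filter(_.toLowerCase.contains("json4s"))
      .foreach(println)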

========= db...@incadencecorp.com ============
David W. Boyd
VP,  Data Solutions
10432 Balls Ford, Suite 240
Manassas, VA 20109
office:   +1-703-552-2862
cell:     +1-703-402-7908
============== http://www.incadencecorp.com/ ============
ISO/IEC JTC1 WG9, editor ISO/IEC 20547 Big Data Reference Architecture
Chair ANSI/INCITS TC Big Data
Co-chair NIST Big Data Public Working Group Reference Architecture
First Robotic Mentor - FRC, FTC - www.iliterobotics.org
Board Member- USSTEM Foundation - www.usstem.org

