Re: Trying to get 0.7.3 running with Spark

2017-10-09 Thread Michael Segel
How about exporting the note, then re-importing it? I had the same problem, and a restart of Zeppelin fixed it for me. Someone else reported the same or a similar issue in a different thread. On Oct 9, 2017, at 11:04 AM, Healy, Terence D <the...@bnl.gov> wrote: Any suggestions on how to move forward…
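
For reference, a full restart of the Zeppelin service from the command line is usually done with the bundled daemon script; the note export/import itself happens in the notebook UI. A minimal sketch, assuming a default installation with ZEPPELIN_HOME as the working directory:

    # Restart the whole Zeppelin service (this also restarts all interpreters)
    cd $ZEPPELIN_HOME
    bin/zeppelin-daemon.sh restart

    # Confirm it came back up
    bin/zeppelin-daemon.sh status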

Re: Trying to get 0.7.3 running with Spark

2017-10-09 Thread Healy, Terence D
Any suggestions on how to move forward to get this running? Even with Mongo and Impala out of the picture, I get the same error, even on a different system (a Mac). From: Terence Healy Date: Friday, October 6, 2017 at 10:35 AM To: "users@zeppelin.apache.org" Subject: Trying to get 0.7.3 running with Spark…
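
One way to rule out a corrupted interpreter configuration is to let Zeppelin regenerate its defaults. A sketch, assuming the 0.7.x layout where settings persist in conf/interpreter.json; this resets all interpreter settings, so back the file up first:

    cd $ZEPPELIN_HOME
    bin/zeppelin-daemon.sh stop
    # Keep a copy, then remove the persisted settings so Zeppelin
    # rebuilds the default interpreter list on the next start
    cp conf/interpreter.json conf/interpreter.json.bak
    rm conf/interpreter.json
    bin/zeppelin-daemon.sh start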

Re: Trying to get 0.7.3 running with Spark

2017-10-07 Thread Jianfeng (Jeff) Zhang
Could you check the log again? There should be another exception above the one you pasted; most likely the SparkContext failed to be created. Best Regards, Jeff Zhang From: Terry Healy <the...@bnl.gov> Reply-To: "users@zeppelin.apache.org"…
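
To find the root cause Jeff describes, look at the interpreter log rather than the notebook output. A sketch, assuming the default log directory; the actual file names include the user and host, so the wildcards are illustrative:

    cd $ZEPPELIN_HOME/logs
    # The Spark interpreter writes its own log; the real exception
    # usually appears above the one surfaced in the notebook
    grep -n -B 5 "Exception" zeppelin-interpreter-spark-*.log | less

    # The main server log can also show interpreter launch failures
    grep -n "ERROR" zeppelin-*.log | less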

Re: Trying to get 0.7.3 running with Spark

2017-10-06 Thread Michael Segel
I know. If it's not working… what do you see in the spark interpreter? Do you see %spark and %spark.sql, or just %sql? I'm sorry, I futzed with it and don't remember what I did to change it. On Oct 6, 2017, at 9:47 AM, Terry Healy <the...@bnl.gov> wrote: Hi Michael- I'm not sure…

Re: Trying to get 0.7.3 running with Spark

2017-10-06 Thread Terry Healy
Hi Michael- I'm not sure what the values are supposed to be, so I have nothing to compare against. The interpreter is running; a save and restart still gives the same result. On 10/06/2017 10:41 AM, Michael Segel wrote: What do you see when you check out the spark interpreter? Something with %spark, or %spark.sql…
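
For comparison, a stock 0.7.x install runs Spark in embedded local mode unless told otherwise; the usual values to check are SPARK_HOME in conf/zeppelin-env.sh and the master property in the spark interpreter settings. A sketch of a minimal known-good setup (the path is illustrative):

    # conf/zeppelin-env.sh
    # Leave SPARK_HOME unset to use Zeppelin's embedded Spark, or point
    # it at a local Spark install to launch via spark-submit instead
    export SPARK_HOME=/opt/spark

    # Interpreter settings (web UI -> Interpreter -> spark):
    #   master = local[*]    # the default; runs Spark in-process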

Re: Trying to get 0.7.3 running with Spark

2017-10-06 Thread Michael Segel
What do you see when you check out the spark interpreter? Something with %spark, or %spark.sql? (Sorry, going from memory.) It may also be that the spark interpreter isn't running; if you manually restart the interpreter and then re-run the notebook, it should work. HTH
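
A quick way to confirm that both the interpreter binding and the SparkContext are healthy after a restart is a one-line test paragraph in the note. A sketch, assuming the default binding where %spark is available:

    %spark
    // If this prints a version string, the SparkContext was created
    sc.version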