Re: PySpark with livy

2017-10-01 Thread Jeff Zhang
I see your test with Livy via the curl command, but it seems you are submitting the code as a batch. Could you do it via an interactive Livy session instead? That is what Zeppelin's Livy interpreter does.
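The batch-versus-interactive distinction Jeff points at maps to two different Livy REST endpoints: `POST /batches` runs a whole file as a one-shot job, while `POST /sessions` plus `POST /sessions/{id}/statements` keeps a live interpreter, which is what Zeppelin's `%livy.pyspark` uses. A minimal sketch of the request shapes, assuming the standard Livy REST API; the host, port, and file path are placeholders, not values from this thread:

```python
import json

# Placeholder Livy endpoint (default port is 8998); not taken from the thread.
host = "http://livy-host:8998"

# Batch mode: submit a whole PySpark file as a single job (one-shot).
batch_url = f"{host}/batches"
batch_payload = {"file": "/path/to/job.py"}  # placeholder path

# Interactive mode (what Zeppelin's Livy interpreter does):
# 1) create a long-lived PySpark session,
session_url = f"{host}/sessions"
session_payload = {"kind": "pyspark"}
# 2) then run individual code snippets against it.
statement_url = f"{host}/sessions/0/statements"  # 0 = id returned at creation
statement_payload = {"code": "sc.parallelize(range(10)).sum()"}

# These JSON bodies are what a curl test would send with -d '...'
print(json.dumps(batch_payload))
print(json.dumps(session_payload))
print(json.dumps(statement_payload))
```

With curl, the interactive flow is the session-create call followed by statement posts, whereas a single `POST /batches` (as in Mauro's test) exercises a different code path in Livy.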

Re: PySpark with livy

2017-09-30 Thread Mauro Schneider
Hi Jeff, yes, the code works with the PySpark shell and with spark-submit on the same server where Zeppelin and Livy are running. I also did another test: I executed the same code against Livy using curl, and it worked fine. Mauro Schneider

Re: PySpark with livy

2017-09-29 Thread Jeff Zhang
It is more likely a Spark configuration issue on your side. Could you run this code in the PySpark shell? Mauro Schneider wrote on Friday, September 29, 2017 at 11:24 PM:

PySpark with livy

2017-09-29 Thread Mauro Schneider
Hi, I'm trying to execute PySpark code with Zeppelin and Livy, but without success. Scala with Livy works well, but when I execute the code below I get an exception from Zeppelin.

%livy.pyspark
txtFile = sc.textFile("/data/staging/zeppelin_test/data.txt")
counts = txtFile.flatMap(lambda line: l
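The snippet is cut off by the archive, but it looks like the classic RDD word count. A minimal pure-Python sketch of the full pipeline it appears to be building; the continuation past the truncated `lambda line: l` is an assumption, not text from the message:

```python
# In PySpark the usual word-count chain would read (completion is assumed):
#   counts = txtFile.flatMap(lambda line: line.split(" ")) \
#                   .map(lambda word: (word, 1)) \
#                   .reduceByKey(lambda a, b: a + b)
# Below, the same three steps in plain Python, with a small in-memory
# stand-in for the text file.

lines = ["spark and livy", "zeppelin and spark"]

# flatMap: split each line into words and flatten the result
words = [w for line in lines for w in line.split(" ")]

# map: pair each word with an initial count of 1
pairs = [(w, 1) for w in words]

# reduceByKey: sum the counts per word
counts = {}
for word, n in pairs:
    counts[word] = counts.get(word, 0) + n

print(counts)  # e.g. {'spark': 2, 'and': 2, 'livy': 1, 'zeppelin': 1}
```

Since the same chain reportedly works in the PySpark shell and via curl against Livy, the code itself is unlikely to be the problem; the difference lies in how Zeppelin's Livy interpreter submits it.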