If I start Zeppelin via zeppelin.cmd, only the Zeppelin log appears.
The interpreter log is created only when I start the interpreter manually, and
even then it contains nothing more than the information that the interpreter
was started (see my preceding mail with attachment).

 INFO [2016-11-29 08:43:59,757] ({Thread-0} RemoteInterpreterServer.java[run]:81) - Starting remote interpreter server on port 55492
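As a next diagnostic step, a minimal probe along these lines (only a sketch, assuming the port 55492 from the log above and that ZeppelinServer runs on the same machine) exercises the same plain-socket connect that ZeppelinServer attempts when it later reports "Connection refused":

// Quick port probe (sketch): tries the same java.net.Socket connect that
// ZeppelinServer's Thrift client performs against the interpreter port.
import java.net.{InetSocketAddress, Socket}

val socket = new Socket()
try {
  socket.connect(new InetSocketAddress("localhost", 55492), 2000) // 2-second timeout
  println("Port 55492 is reachable, an interpreter process is listening.")
} catch {
  case e: Exception => println(s"Connection failed: $e") // would mirror the ConnectException Zeppelin shows
} finally {
  socket.close()
}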


From: Jeff Zhang [mailto:zjf...@gmail.com]
Sent: Tuesday, November 29, 2016 9:48 AM
To: users@zeppelin.apache.org
Subject: Re: Unable to connect with Spark Interpreter

According to your log, the Spark interpreter fails to start. Do you see any Spark
interpreter log?



Jan Botorek <jan.boto...@infor.com> wrote on Tuesday, November 29, 2016 at 4:08 PM:
Hello,
Thanks for the advice, but nothing seems to be wrong when I start the
interpreter manually. I attach the logs from the interpreter and from Zeppelin.
This is the cmd output from interpreter launched manually:

D:\zeppelin-0.6.2\bin> interpreter.cmd -d D:\zeppelin-0.6.2\interpreter\spark -p 55492
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/zeppelin-0.6.2/interpreter/spark/zeppelin-spark_2.11-0.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/zeppelin-0.6.2/lib/zeppelin-interpreter-0.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

Could you, please, think of any possible next steps?

Best regards,
Jan

From: moon soo Lee [mailto:m...@apache.org]
Sent: Monday, November 28, 2016 5:36 PM

To: users@zeppelin.apache.org
Subject: Re: Unable to connect with Spark Interpreter

According to your log, your interpreter process seems to have failed to start.
Check the following lines in your log.
You can try running the interpreter process manually and see why it is failing,
i.e. run

D:\zeppelin-0.6.2\bin\interpreter.cmd -d D:\zeppelin-0.6.2\interpreter\spark -p 55492

-------

 INFO [2016-11-28 10:34:02,837] ({pool-1-thread-2} RemoteInterpreterProcess.java[reference]:148) - Run interpreter process [D:\zeppelin-0.6.2\bin\interpreter.cmd, -d, D:\zeppelin-0.6.2\interpreter\spark, -p, 55492, -l, D:\zeppelin-0.6.2/local-repo/2C36NT8YK]

 INFO [2016-11-28 10:34:03,491] ({Exec Default Executor} RemoteInterpreterProcess.java[onProcessFailed]:288) - Interpreter process failed {}

org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1)


On Mon, Nov 28, 2016 at 1:42 AM Jan Botorek <jan.boto...@infor.com> wrote:
Hello again,
I am sorry to ask again, but has really nobody else run into the same issue?

I have now tried the new version (0.6.2), both the binary distribution and a
build from source, but the issue remains the same. I have tried it on several
laptops and servers, always with the same result.

Please, do you have any idea what to check or repair?

Best regards,
Jan
From: Jan Botorek [mailto:jan.boto...@infor.com]
Sent: Wednesday, November 16, 2016 12:54 PM
To: users@zeppelin.apache.org
Subject: RE: Unable to connect with Spark Interpreter

Hello Alexander,
Thank you for the quick response. Please see the server log attached.
Unfortunately, I don't have any zeppelin-interpreter-spark*.log in the logs
directory.

To your questions:

- It happens every time, even if I try to run several paragraphs in a row.

- Yes, it keeps happening even after the interpreter is restarted.
--
Jan

From: Alexander Bezzubov [mailto:b...@apache.org]
Sent: Wednesday, November 16, 2016 12:47 PM
To: users@zeppelin.apache.org
Subject: Re: Unable to connect with Spark Interpreter

Hi Jan,

this is a rather generic error saying that ZeppelinServer somehow could not
connect to the interpreter process on your machine.

Could you please share more from logs/*, in particular the .out and .log of the
Zeppelin server AND zeppelin-interpreter-spark*.log - usually this is enough to
identify the reason.

Two more questions:
- does this happen on every paragraph run, even if you click Run multiple
times in a row?
- does it still happen if you restart the Spark interpreter manually from the GUI?
("Anonymous"->Interpreters->Spark->restart)

--
Alex

On Wed, Nov 16, 2016, 12:37 Jan Botorek <jan.boto...@infor.com> wrote:
Hello,
I am not able to run any Spark code in Zeppelin. I tried pre-built versions of
Zeppelin as well as compiling the source code on my own, following the
https://github.com/apache/zeppelin steps.
My configuration is Scala 2.11 and Spark 2.0.1. I also tried different versions
of Zeppelin available on GitHub (master, 0.6, 0.5.6).

The result is always the same: Zeppelin starts, but when any code is run
(e.g. “2 + 1”, “sc.version”), the exception below is thrown.
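For concreteness, the failing paragraphs are nothing more exotic than the following (assuming only the standard `sc` SparkContext binding the Spark interpreter is supposed to provide):

println(2 + 1)      // plain Scala expression, no Spark API involved
println(sc.version) // uses the SparkContext that the Spark interpreter binds as `sc`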

java.net.ConnectException: Connection refused: connect
    at java.net.DualStackPlainSocketImpl.connect0(Native Method)
    at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:79)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at org.apache.thrift.transport.TSocket.open(TSocket.java:182)
    at org.apache.zeppelin.interpreter.remote.ClientFactory.create(ClientFactory.java:51)
    at org.apache.zeppelin.interpreter.remote.ClientFactory.create(ClientFactory.java:37)
    at org.apache.commons.pool2.BasePooledObjectFactory.makeObject(BasePooledObjectFactory.java:60)
    at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:861)
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435)
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterProcess.getClient(RemoteInterpreterProcess.java:189)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.init(RemoteInterpreter.java:163)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getFormType(RemoteInterpreter.java:328)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getFormType(LazyOpenInterpreter.java:105)
    at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:260)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.RemoteScheduler$JobRunner.run(RemoteScheduler.java:328)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Based on my searching and my assumptions, something is wrong with the Spark
interpreter in relation to Zeppelin.
I also tried to point the Spark interpreter at a Spark instance running
externally (in Zeppelin's interpreter settings), but it didn't work either.
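(If the interpreter ever did start, a paragraph like the following, only a sketch assuming the usual `sc` binding, would at least show which master it actually picked up, the embedded one or the external cluster; execution currently never gets that far.)

// Sketch of a Zeppelin paragraph (assumes the standard `sc` binding) that would
// confirm which Spark master the interpreter actually connected to.
println(s"Spark ${sc.version} connected to master '${sc.master}'")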

Do you have any ideas about what could possibly be wrong?
Thank you for any help – any ideas and insights would be appreciated.

Best regards,
Jan Botorek
