Re: Too many open files exception

2015-12-23 Thread Amirhossein Aleyasin
Thanks for your reply. The default max open files limit was 4096, and it seems that's not enough for Spark. I followed this guide (https://easyengine.io/tutorials/linux/increase-open-files-limit/) and increased it to a larger value. It works perfectly now. Thanks. On Tue, Dec 22, 2015 at 10:01 PM, Alex
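A quick way to confirm which limit the Spark/Zeppelin JVM actually sees is to query the operating-system MXBean. This is a minimal sketch, assuming a HotSpot JVM on Linux where the com.sun.management extension is available:

    import java.lang.management.ManagementFactory
    import com.sun.management.UnixOperatingSystemMXBean

    // Print current vs. maximum open file descriptors as seen by this JVM.
    // If the maximum still reads 4096 here, the ulimit change did not reach
    // the process that runs the Spark interpreter.
    ManagementFactory.getOperatingSystemMXBean match {
      case unix: UnixOperatingSystemMXBean =>
        println(s"open fds: ${unix.getOpenFileDescriptorCount} / limit: ${unix.getMaxFileDescriptorCount}")
      case _ =>
        println("file-descriptor counts are not exposed on this platform")
    }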

Re: Can't run any new jobs because of OOME (spark interpreter)

2015-12-23 Thread Jungtaek Lim
Hi moon soo, Thanks for replying. I completely agree that freeing up the memory completely is hard. Actually, I'm experimenting with this behavior, and it seems the interpreter's memory usage grows even though I haven't run any further jobs. Maybe another issue is involved. I'll experiment more and

Re: Can't run any new jobs because of OOME (spark interpreter)

2015-12-23 Thread moon soo Lee
Hi Jungtaek Lim, SparkInterpreter uses the scala REPL inside. Please see the related issue https://issues.scala-lang.org/browse/SI-4331. There's a workaround in the description, but I believe there is no easy way to free up the memory completely unless we destroy and re-create the scala REPL. Thanks,
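The gist of SI-4331 is that each REPL (and notebook) line is compiled into a wrapper object whose fields keep the evaluated results strongly reachable for the life of the interpreter. A hedged sketch of the kind of workaround that helps, nulling out the only strong reference so the GC can reclaim it (a common pattern, not necessarily the exact workaround from the ticket):

    // Each REPL line becomes a wrapper object that holds its results,
    // so a large val stays strongly reachable until the REPL is destroyed.
    var big: Array[Byte] = Array.ofDim[Byte](256 * 1024 * 1024)

    // Dropping the only strong reference makes the array collectible again.
    big = null
    System.gc() // a request, not a guarantee; heap should shrink on the next GC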

Re: Can't run any new jobs because of OOME (spark interpreter)

2015-12-23 Thread Jungtaek Lim
Forgot to add error log and stack traces:

15/12/16 11:17:02 INFO SchedulerFactory: Job remoteInterpretJob_1450232220684 started by scheduler org.apache.zeppelin.spark.SparkInterpreter2005736637
15/12/16 11:17:08 ERROR Job: Job failed
java.lang.OutOfMemoryError: Java heap space
at scala.ref

Can't run any new jobs because of OOME (spark interpreter)

2015-12-23 Thread Jungtaek Lim
Hi users, I've hit an OOME when using the spark interpreter and want to resolve this issue.
- Spark version: 1.4.1 + applying SPARK-11818
- Spark cluster: Mesos 0.22.1
- Zeppelin: commit 1ba6e2a

Connection Exception using Spark 1.5.2 pre-built with Hadoop 2.6

2015-12-23 Thread Hoc Phan
Hi, I am downloading the prebuilt binary from the Downloads page on the Apache Spark site. The latest release of Spark is Spark 1.5.2, released on November 9, 2015 (release notes) (git tag).

Re: zeppelin behind apache reverse proxy

2015-12-23 Thread vincent gromakowski
Hello Girish, I am not sure that setting up a proxy on the same host as zeppelin and the web browser is relevant, because your websocket will be able to bypass the proxy. Here is what I get: GET http://public.hostname/ws ==> returns 200. I can see the "zeppelin Jetty" in the response header, then I
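For reference, the check described above can be reproduced in a few lines. This is a minimal sketch, with public.hostname standing in for the proxy's public address as in the message:

    import java.net.{HttpURLConnection, URL}

    // Issue a plain GET against the /ws endpoint and inspect the response,
    // as in the test above: a 200 with a Jetty Server header suggests the
    // request reached Zeppelin's embedded Jetty rather than the proxy alone.
    val conn = new URL("http://public.hostname/ws")
      .openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("GET")
    println(s"status: ${conn.getResponseCode}")
    println(s"Server: ${conn.getHeaderField("Server")}")
    conn.disconnect()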