Re: JVM Error while building spark

2014-08-08 Thread Rasika Pohankar
maven? As a curiosity. Thank you.

Re: JVM Error while building spark

2014-08-07 Thread Sean Owen
> in the spark configuration file (spark-env.sh), but it still gives the same error.
>
> Spark Version : 1.0.1
> Scala : 2.10.4
> Ubuntu : 12.04 LTS
> Java : 1.7.0_65
>
> How to solve the error? Please help.
>
> Thank you.

JVM Error while building spark

2014-08-07 Thread Rasika Pohankar
in the spark configuration file (spark-env.sh), but it still gives the same error.

Spark Version : 1.0.1
Scala : 2.10.4
Ubuntu : 12.04 LTS
Java : 1.7.0_65

How to solve the error? Please help.

Thank you.
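The archived snippets do not show how this thread was resolved; one common cause of JVM errors while building Spark 1.x is the build JVM running out of heap or PermGen space. A minimal sketch of that remedy, assuming a Maven build (the MAVEN_OPTS values mirror what the Spark 1.x build documentation recommended; for an sbt build the equivalent is raising the heap given to sbt/sbt):

    # Give the Maven build JVM more heap and PermGen before building Spark
    export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
    mvn -DskipTests clean package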

Re: JVM error

2014-02-28 Thread Bryn Keller
Hi Mohit,

Yes, in pyspark you only get one chance to initialize a spark context. If it goes wrong, you have to restart the process.

Thanks,
Bryn

On Fri, Feb 28, 2014 at 4:55 PM, Mohit Singh wrote:
> And I tried that but got the error:
>
> Traceback (most recent call last):
>   File "", line 1

Re: JVM error

2014-02-28 Thread Mohit Singh
And I tried that but got the error:

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/hadoop/spark/python/pyspark/context.py", line 83, in __init__
        SparkContext._ensure_initialized(self)
      File "/home/hadoop/spark/python/pyspark/context.py", line 165, in _ensure_initialized

Re: JVM error

2014-02-28 Thread Bryn Keller
Sorry, typo - that last line should be:

    sc = pyspark.SparkContext(conf = conf)

On Fri, Feb 28, 2014 at 9:37 AM, Mohit Singh wrote:
> Hi Bryn,
> Thanks for the suggestion.
> I tried that..
> conf = pyspark.SparkConf().set("spark.executor.memory","20G")
> But.. got an error here:
>
> sc = py
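Putting the pieces of this thread together, the working sequence looks roughly like the following (a minimal sketch against the pyspark API of that era; the master URL and memory value are the ones used in the thread, the app name is a placeholder):

    import pyspark

    # Executor memory has to be set on the SparkConf before the context is created.
    conf = (pyspark.SparkConf()
            .setMaster("spark://master:7077")    # master URL from the thread
            .setAppName("my-app")                # placeholder app name
            .set("spark.executor.memory", "20G"))

    # The conf object is passed to SparkContext, not to SparkConf
    # (passing it to SparkConf is the typo that raised the TypeError below).
    sc = pyspark.SparkContext(conf=conf)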

Re: JVM error

2014-02-28 Thread Mohit Singh
Hi Bryn,
Thanks for the suggestion. I tried that..

    conf = pyspark.SparkConf().set("spark.executor.memory","20G")

But.. got an error here:

    sc = pyspark.SparkConf(conf = conf)

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    TypeError: __init__() got an unexpected keyword argument 'conf'

Re: JVM error

2014-02-27 Thread Evgeniy Shishkin
On 27 Feb 2014, at 07:22, Aaron Davidson wrote:
> Setting spark.executor.memory is indeed the correct way to do this. If you
> want to configure this in spark-env.sh, you can use
> export SPARK_JAVA_OPTS=" -Dspark.executor.memory=20g"
> (make sure to append the variable if you've been using SPARK_JAVA_OPTS previously)

Re: JVM error

2014-02-26 Thread Aaron Davidson
Setting spark.executor.memory is indeed the correct way to do this. If you want to configure this in spark-env.sh, you can use

    export SPARK_JAVA_OPTS=" -Dspark.executor.memory=20g"

(make sure to append the variable if you've been using SPARK_JAVA_OPTS previously)

On Wed, Feb 26, 2014 at 7:50 PM,
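A sketch of the append form Aaron mentions, as it might look in conf/spark-env.sh (assuming SPARK_JAVA_OPTS may already carry other -D flags; 20g is the value discussed in this thread):

    # conf/spark-env.sh
    # Append rather than overwrite, in case SPARK_JAVA_OPTS already holds other flags.
    export SPARK_JAVA_OPTS="$SPARK_JAVA_OPTS -Dspark.executor.memory=20g"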

Re: JVM error

2014-02-26 Thread Bryn Keller
Hi Mohit,

You can still set SPARK_MEM in spark-env.sh, but that is deprecated. This is from SparkContext.scala:

    if (!conf.contains("spark.executor.memory") && sys.env.contains("SPARK_MEM")) {
      logWarning("Using SPARK_MEM to set amount of memory to use per executor process is " +
        "deprecated

Re: JVM error

2014-02-26 Thread Mohit Singh
Hi Bryn,
Thanks for responding. Is there a way I can permanently configure this setting? Like SPARK_EXECUTOR_MEMORY or something like that?

On Wed, Feb 26, 2014 at 2:56 PM, Bryn Keller wrote:
> Hi Mohit,
>
> Try increasing the *executor* memory instead of the worker memory - the
> most appro

Re: JVM error

2014-02-26 Thread Bryn Keller
Hi Mohit,

Try increasing the *executor* memory instead of the worker memory - the most appropriate place to do this is actually when you're creating your SparkContext, something like:

    conf = pyspark.SparkConf()
               .setMaster("spark://master:7077")
               .setAp

JVM error

2014-02-26 Thread Mohit Singh
Hi,
I am experimenting with pyspark lately... Every now and then, I see this error being streamed to the pyspark shell .. and most of the times.. the computation/operation completes.. and sometimes, it just gets stuck... My setup is an 8 node cluster.. with loads of RAM (256GB's) and space (TB's) per node