Re: error with pyspark

2014-08-11 Thread Baoqiang Cao
Thanks Daves and Ron! It was indeed due to the ulimit issue. Thanks a lot!

Best,
Baoqiang Cao
Blog: http://baoqiang.org
Email: bqcaom...@gmail.com

On Aug 11, 2014, at 3:08 AM, Ron Gonzalez wrote:
> If you're running on Ubuntu, do ulimit -n, which gives the max number of
> allowed
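For reference, a minimal sketch of checking and raising the open-file limit in the shell before launching Spark; the value 65536 is illustrative, not one from this thread, and a permanent change would go through /etc/security/limits.conf rather than the shell.

    # Show the current per-process limit on open files (what the quoted reply refers to)
    ulimit -n
    # Raise it for the current shell session before starting Spark; 65536 is illustrative
    ulimit -n 65536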

set SPARK_LOCAL_DIRS issue

2014-08-09 Thread Baoqiang Cao
?

Best,
Baoqiang Cao
Blog: http://baoqiang.org
Email: bqcaom...@gmail.com
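For reference, SPARK_LOCAL_DIRS controls where Spark writes its shuffle and spill files. A minimal sketch of setting it, assuming a hypothetical /mnt/spark-tmp directory rather than any path from this thread:

    # In conf/spark-env.sh, or exported in the shell that launches Spark;
    # several disks can be listed comma-separated.
    export SPARK_LOCAL_DIRS=/mnt/spark-tmp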

error with pyspark

2014-08-08 Thread Baoqiang Cao
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)

The m1.txt file is about 4 GB, and I have >120 GB of RAM and used -Xmx120GB. It is on Ubuntu.
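For reference, the usual way to size the Spark JVM heaps is through spark-submit's own flags rather than a raw -Xmx setting. A minimal sketch; the sizes and the script name my_script.py are illustrative, not values from this thread.

    # --driver-memory and --executor-memory size the respective JVM heaps
    bin/spark-submit --driver-memory 32g --executor-memory 16g my_script.py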

Re: memory issue on standalone master

2014-08-07 Thread Baoqiang Cao
My problem was that I didn't know how to add it. For what it's worth, it was solved by editing spark-env.sh. Thanks anyway!

Baoqiang Cao
Blog: http://baoqiang.org
Email: bqcaom...@gmail.com

On Aug 7, 2014, at 3:27 PM, maddenpj wrote:
> It looks like your Java heap space is too
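For reference, a minimal sketch of the kind of spark-env.sh entries that control memory in standalone mode; the values are illustrative, not the ones used here.

    # In conf/spark-env.sh on the standalone master/worker machines
    export SPARK_DAEMON_MEMORY=2g    # heap for the standalone master and worker daemons
    export SPARK_WORKER_MEMORY=64g   # total memory a worker may hand out to executors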