Re: Error: Could not delete temporary files.

2014-07-08 Thread Marcelo Vanzin
Have you tried the obvious (increase the heap size of your JVM)?

On Tue, Jul 8, 2014 at 2:02 PM, Rahul Bhojwani wrote:
> Thanks Marcelo.
> I was having another problem. My code was running properly and then it
> suddenly stopped with the error:
>
> java.lang.OutOfMemoryError: Java heap space
>

Re: Error: Could not delete temporary files.

2014-07-08 Thread Rahul Bhojwani
Thanks Marcelo. I was having another problem. My code was running properly and then it suddenly stopped with the error:

java.lang.OutOfMemoryError: Java heap space
        at java.io.BufferedOutputStream.<init>(Unknown Source)
        at org.apache.spark.api.python.PythonRDD$$anon$2.run(PythonRDD.scala:

Re: Error: Could not delete temporary files.

2014-07-08 Thread Marcelo Vanzin
Sorry, that would be sc.stop() (not close).

On Tue, Jul 8, 2014 at 1:31 PM, Marcelo Vanzin wrote:
> Hi Rahul,
>
> Can you try calling "sc.close()" at the end of your program, so Spark
> can clean up after itself?
>
> On Tue, Jul 8, 2014 at 12:40 PM, Rahul Bhojwani wrote:
>> Here I am adding my

Re: Error: Could not delete temporary files.

2014-07-08 Thread Marcelo Vanzin
Hi Rahul,

Can you try calling "sc.close()" at the end of your program, so Spark can clean up after itself?

On Tue, Jul 8, 2014 at 12:40 PM, Rahul Bhojwani wrote:
> Here I am adding my code. If you can have a look to help me out.
> Thanks
> ###
>
> import tokenizer
> import ge

Re: Error: Could not delete temporary files.

2014-07-08 Thread Rahul Bhojwani
I have pasted the logs below:

PS F:\spark-0.9.1\codes\sentiment analysis> pyspark .\naive_bayes_analyser.py
Running python with PYTHONPATH=F:\spark-0.9.1\spark-0.9.1\bin\..\python;
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/F:/spark-0.9.1/spark-0.9.1/as

Re: Error: Could not delete temporary files.

2014-07-08 Thread Rahul Bhojwani
Here I am adding my code. If you can have a look to help me out. Thanks

###

import tokenizer
import gettingWordLists as gl
from pyspark.mllib.classification import NaiveBayes
from numpy import array
from pyspark import SparkContext, SparkConf

conf = (SparkConf().setMaster("loc

Re: Error: Could not delete temporary files.

2014-07-08 Thread Marcelo Vanzin
Note I didn't say that was your problem - it would be if (i) you're running your job on Yarn and (ii) you look at the Yarn NodeManager logs and see that it's actually killing your process. I just said that the exception shows up in those kinds of situations. You haven't provided enough information
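For case (ii), the tell-tale entry in the NodeManager log looks roughly like the sample below — a hedged sketch; the container id and exact wording vary by Hadoop version, and the log location on a real cluster is distribution-specific:

```shell
# Hedged sketch: the kind of line Yarn's NodeManager logs when it kills a
# container for exceeding its memory request. The sample text and container
# id here are illustrative; on a real cluster you would grep the actual
# NodeManager log file instead of an inline sample.
sample='Container [pid=1234,containerID=container_0001] is running beyond physical memory limits. Killing container.'
echo "$sample" | grep -o 'beyond physical memory limits'
```

If no such line appears in the NodeManager logs, the executor was likely lost for some other reason.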

Re: Error: Could not delete temporary files.

2014-07-08 Thread Rahul Bhojwani
Hi Marcelo. Thanks for the quick reply. Can you suggest how to increase the memory limits, or how else to tackle this problem? I am a novice. If you want I can post my code here. Thanks

On Wed, Jul 9, 2014 at 12:50 AM, Marcelo Vanzin wrote:
> This is generally a side effect of your executor bein

Re: Error: Could not delete temporary files.

2014-07-08 Thread Marcelo Vanzin
This is generally a side effect of your executor being killed. For example, Yarn will do that if you're going over the requested memory limits.

On Tue, Jul 8, 2014 at 12:17 PM, Rahul Bhojwani wrote:
> Hi,
>
> I am getting this error. Can anyone help explain why this error is
> coming?
>