Have you tried the obvious (increase the heap size of your JVM)?
On Tue, Jul 8, 2014 at 2:02 PM, Rahul Bhojwani wrote:
> Thanks Marcelo.
> I was having another problem. My code was running properly and then it
> suddenly stopped with the error:
>
> java.lang.OutOfMemoryError: Java heap space
>
Thanks Marcelo.
I was having another problem. My code was running properly and then it
suddenly stopped with the error:
java.lang.OutOfMemoryError: Java heap space
at java.io.BufferedOutputStream.&lt;init&gt;(Unknown Source)
at org.apache.spark.api.python.PythonRDD$$anon$2.run(PythonRDD.scala:
Sorry, that would be sc.stop() (not close).
On Tue, Jul 8, 2014 at 1:31 PM, Marcelo Vanzin wrote:
> Hi Rahul,
>
> Can you try calling "sc.close()" at the end of your program, so Spark
> can clean up after itself?
>
> On Tue, Jul 8, 2014 at 12:40 PM, Rahul Bhojwani wrote:
>> Here I am adding my code. If you can have a look to help me out.
Hi Rahul,
Can you try calling "sc.close()" at the end of your program, so Spark
can clean up after itself?
On Tue, Jul 8, 2014 at 12:40 PM, Rahul Bhojwani wrote:
> Here I am adding my code. If you can have a look to help me out.
> Thanks
> ###
>
> import tokenizer
> import gettingWordLists as gl
I have pasted the logs below:
PS F:\spark-0.9.1\codes\sentiment analysis> pyspark .\naive_bayes_analyser.py
Running python with PYTHONPATH=F:\spark-0.9.1\spark-0.9.1\bin\..\python;
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/F:/spark-0.9.1/spark-0.9.1/as
Here I am adding my code. If you can have a look to help me out.
Thanks
###
import tokenizer
import gettingWordLists as gl
from pyspark.mllib.classification import NaiveBayes
from numpy import array
from pyspark import SparkContext, SparkConf
conf = (SparkConf().setMaster("loc
Note I didn't say that was your problem - it would be if (i) you're
running your job on Yarn and (ii) you look at the Yarn NodeManager
logs and see that it's actually killing your process.
I just said that the exception shows up in those kinds of situations.
You haven't provided enough information
Hi Marcelo.
Thanks for the quick reply. Can you suggest how to increase the memory
limits or how to tackle this problem? I am a novice. If you want, I can
post my code here.
Thanks
On Wed, Jul 9, 2014 at 12:50 AM, Marcelo Vanzin wrote:
> This is generally a side effect of your executor being killed.
This is generally a side effect of your executor being killed. For
example, Yarn will do that if you're going over the requested memory
limits.
On Tue, Jul 8, 2014 at 12:17 PM, Rahul Bhojwani wrote:
> Hi,
>
> I am getting this error. Can anyone help explain why this error is
> occurring?
>
>