hey,
Thanks. Now it worked! :)
On Wed, Jun 15, 2016 at 6:59 PM, Jeff Zhang wrote:
Then the only solution is to increase your driver memory, though you are
still limited by your machine's physical memory: "--driver-memory"
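For reference, the driver heap is fixed when the driver JVM starts, so it has
to be set at launch time rather than on an already-running SparkContext. A
minimal sketch, where my_script.py and the 12g value are illustrative:

    # At submit time:
    spark-submit --driver-memory 12g my_script.py

    # Or persistently, in conf/spark-defaults.conf:
    spark.driver.memory    12g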
On Thu, Jun 16, 2016 at 9:53 AM, spR wrote:
Hey,
But I just have one machine. I am running everything on my laptop. Won't I
be able to do this processing in local mode then?
Regards,
Tejaswini
On Wed, Jun 15, 2016 at 6:32 PM, Jeff Zhang wrote:
You are using local mode; --executor-memory won't take effect in local
mode. Please use a cluster mode instead.
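In local mode the executors are just threads inside the single driver JVM, so
spark.executor.memory is ignored there and only the driver heap matters. A
sketch for inspecting what a running context actually picked up; note that
_conf is PySpark's internal handle to the context's SparkConf, so this is a
debugging aid rather than public API:

    # Inspect the live SparkContext (PySpark 1.x); the second argument
    # to get() is just a fallback for unset keys.
    print(sc.master)                                       # e.g. local[4]
    print(sc._conf.get("spark.executor.memory", "unset"))  # ignored in local mode
    print(sc._conf.get("spark.driver.memory", "unset"))    # the heap that matters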
On Thu, Jun 16, 2016 at 9:32 AM, Jeff Zhang wrote:
Specify --executor-memory in your spark-submit command.
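For example, with illustrative values (my_app.py stands in for the
application being submitted):

    spark-submit --executor-memory 8g my_app.py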
hey,
I did this in my notebook. But still I get the same error. Is this the
right way to do it?
from pyspark import SparkConf

conf = (SparkConf()
        .setMaster("local[4]")
        .setAppName("My app")
        .set("spark.executor.memory", "12g"))
sc.conf = conf
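As an aside, assigning to sc.conf does not reconfigure a running context; a
SparkConf is only read when the SparkContext is constructed. A minimal sketch
of that construction, assuming no context exists yet:

    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setMaster("local[4]")
            .setAppName("My app")
            .set("spark.executor.memory", "12g"))
    sc = SparkContext(conf=conf)  # the conf is read once, here

Even then, as noted above, spark.executor.memory has no effect under a local
master.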
On Thu, Jun 16, 2016 at 9:01 AM, spR wrote:
Thank you. Can you please tell me how to increase the executor memory?
On Wed, Jun 15, 2016 at 5:59 PM, Jeff Zhang wrote:
>>> Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
It is OOM on the executor (the JVM raises "GC overhead limit exceeded" when
it spends nearly all of its time in garbage collection while reclaiming
almost no heap). Please try to increase the executor memory:
"--executor-memory"
On Thu, Jun 16, 2016 at 8:54 AM, spR wrote:
Hey,

error trace -

---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
<ipython-input> in <module>()
----> 1 temp.take(2)
/Users/my/Documents/My_Study_folder/spark-1.6.1/python/pyspark/sql/dataframe.
Could you paste the full stacktrace?
On Thu, Jun 16, 2016 at 7:24 AM, spR wrote:
Hi,

I am getting this error while executing a query using sqlContext.sql.
The table has around 2.5 GB of data to be scanned. First I get an
out-of-memory exception, even though I have 16 GB of RAM. Then my
notebook dies and I get the error below:

Py4JNetworkError: An error occurred while trying to connect to the
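For context, the failing pattern was roughly the following; every name here
is a stand-in, and the real table held ~2.5 GB rather than this toy data
(PySpark 1.6 API):

    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext("local[4]", "oom-repro-sketch")
    sqlContext = SQLContext(sc)

    # Stand-in for the real ~2.5 GB table.
    df = sqlContext.createDataFrame([(i, i * 2) for i in range(1000)], ["a", "b"])
    df.registerTempTable("my_table")

    temp = sqlContext.sql("SELECT * FROM my_table")
    print(temp.take(2))  # materializing results is where the OOM surfaced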