Use unpersist(), even when the RDD was not persisted before.
Regards,
Raymond Liu
From: 牛兆捷 [mailto:nzjem...@gmail.com]
Sent: Thursday, September 04, 2014 2:57 PM
To: Liu, Raymond
Cc: Patrick Wendell; u...@spark.apache.org; dev@spark.apache.org
Subject: Re: memory size for caching RDD
Oh I see.

I want to implement something like this: sometimes I need to release some
memory for other usage even when it is occupied by some RDDs (can be …
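A minimal sketch of the unpersist() approach, assuming the Spark 1.x Scala
API (the app name and RDD contents are illustrative):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object UnpersistSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("unpersist-sketch"))

    // Cache an RDD; it now occupies part of the storage memory pool.
    val rdd = sc.parallelize(1 to 1000000).persist(StorageLevel.MEMORY_ONLY)
    rdd.count() // materialize the cached blocks

    // Drop the cached blocks so the memory can be reused elsewhere.
    // unpersist() is safe to call even if the RDD was never persisted.
    rdd.unpersist(blocking = true)

    sc.stop()
  }
}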
Thanks, Raymond. I duplicated the question. Please see the reply here.

2014-09-04 14:27 GMT+08:00 牛兆捷:
> But is it possible to make it resizable? When we don't have many RDDs to
> cache, we can give some memory to others.
>
> 2014-09-04 13:45 GMT+08:00 Patrick Wendell:
>
> Changing this is not supported; it is immutable, similar to other Spark
> configuration settings.
… fraction conf., e.g. spark.shuffle.memoryFraction, for which you also set
the upper limit.

Best Regards,
Raymond Liu
From: 牛兆捷 [mailto:nzjem...@gmail.com]
Sent: Thursday, September 04, 2014 2:27 PM
To: Patrick Wendell
Cc: u...@spark.apache.org; dev@spark.apache.org
Subject: Re: memory size for caching RDD
But is it possible to make it resizable? When we don't have many RDDs to
cache, we can give some memory to others.

2014-09-04 13:45 GMT+08:00 Patrick Wendell:
> Changing this is not supported; it is immutable, similar to other Spark
> configuration settings.
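For reference, both fractions Raymond mentions are ordinary configuration
keys read once at executor startup; a sketch of setting them together,
assuming Spark 1.x (the values are illustrative; the defaults were 0.6 for
storage and 0.2 for shuffle):

import org.apache.spark.{SparkConf, SparkContext}

// Shrinking one fraction is the configuration-time way to "give some
// memory to others"; neither value can be changed after the executors
// have started.
val conf = new SparkConf()
  .setAppName("memory-fractions")
  .set("spark.storage.memoryFraction", "0.4") // default 0.6
  .set("spark.shuffle.memoryFraction", "0.4") // default 0.2

val sc = new SparkContext(conf)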
Changing this is not supported; it is immutable, similar to other Spark
configuration settings.
On Wed, Sep 3, 2014 at 8:13 PM, 牛兆捷 wrote:
> Dear all:
>
> Spark uses memory to cache RDDs, and the size of that memory is specified
> by "spark.storage.memoryFraction".
>
> Once the Executor starts, does Spark support adjusting/resizing the memory
> size of this part dynamically?
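To illustrate the immutability Patrick describes, a small sketch assuming
the Spark 1.x Scala API: the fraction is fixed when the SparkContext (and
its executors) start, and SparkContext.getConf returns only a copy, so
later mutation has no effect.

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("immutable-conf")
  .set("spark.storage.memoryFraction", "0.5")

val sc = new SparkContext(conf)

// getConf returns a copy of the configuration; the effective value is
// whatever was set before the context started.
println(sc.getConf.get("spark.storage.memoryFraction", "0.6"))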
Dear all:

Spark uses memory to cache RDDs, and the size of that memory is specified by
"spark.storage.memoryFraction".

Once the Executor starts, does Spark support adjusting/resizing the memory
size of this part dynamically?

Thanks.
--
Regards,
Zhaojie
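While the pool itself cannot be resized, its usage can at least be observed
at runtime; a sketch using getExecutorMemoryStatus from the Spark 1.x Scala
API, which reports the maximum and remaining storage memory per executor:

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("storage-memory-status"))

// Map from executor address to (max memory available for caching,
// remaining memory available for caching), both in bytes. The maximum
// is derived from spark.storage.memoryFraction at executor startup.
sc.getExecutorMemoryStatus.foreach { case (executor, (maxMem, remaining)) =>
  println(s"$executor: max=${maxMem >> 20} MB, free=${remaining >> 20} MB")
}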