It's all in the same thread: for a particular RDD, I need to uncache it
every 2 minutes to clear out the data that is held in a Map inside it.
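One way to sketch that periodic uncache, assuming the work happens in the driver and the RDD can be rebuilt each cycle (the names `buildRdd` and `schedulePeriodicUncache` are illustrative, not from this thread):

```scala
import java.util.concurrent.{Executors, TimeUnit}
import org.apache.spark.rdd.RDD

// Illustrative sketch: every 2 minutes, drop the cached copy of the RDD
// so the Map data it holds can be garbage-collected, then cache a fresh one.
// `buildRdd` is a placeholder for however the RDD is actually produced.
def schedulePeriodicUncache[T](buildRdd: () => RDD[T]): Unit = {
  val scheduler = Executors.newSingleThreadScheduledExecutor()
  @volatile var current: RDD[T] = buildRdd().cache()

  scheduler.scheduleAtFixedRate(new Runnable {
    override def run(): Unit = {
      // blocking = false: request removal without waiting for all blocks
      current.unpersist(blocking = false)
      current = buildRdd().cache()
    }
  }, 2, 2, TimeUnit.MINUTES)
}
```

This calls `RDD.unpersist()` directly on the cached RDD, which sidesteps the DStream-level question entirely; whether that fits depends on whether you hold a reference to the RDD outside the streaming machinery.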

On Wed, Nov 4, 2015 at 11:54 PM, Saisai Shao <sai.sai.s...@gmail.com> wrote:

> Hi Swetha,
>
> Would you mind elaborating on your usage scenario for DStream unpersisting?
>
> From my understanding:
>
> 1. Spark Streaming will automatically unpersist outdated data (you already
> mentioned the configurations).
> 2. Once the streaming job is started, I think you lose direct control of
> it. When would you call this unpersist, and how would you call it (from
> another thread)?
>
> Thanks
> Saisai
>
>
> On Thu, Nov 5, 2015 at 3:13 PM, swetha kasireddy <
> swethakasire...@gmail.com> wrote:
>
>> Other than setting the following:
>>
>> sparkConf.set("spark.streaming.unpersist", "true")
>> sparkConf.set("spark.cleaner.ttl", "7200s")
>>
>>
>> On Wed, Nov 4, 2015 at 5:03 PM, swetha <swethakasire...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> How do I unpersist a DStream in Spark Streaming? I know that we can
>>> persist one using dStream.persist() or dStream.cache(), but I don't see
>>> any method to unpersist it.
>>>
>>> Thanks,
>>> Swetha
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-unpersist-a-DStream-in-Spark-Streaming-tp25281.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>>
>>
>
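Bottom-posting one more option: DStream exposes persist()/cache() but no public unpersist, so a workaround (a sketch, not from Saisai's reply; the socket source and port here are placeholders) is to manage the per-batch RDDs yourself inside foreachRDD, alongside the automatic cleanup config already mentioned above:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setAppName("unpersist-sketch")
  .set("spark.streaming.unpersist", "true") // let Spark drop old batch data

val ssc = new StreamingContext(conf, Seconds(10))
val lines = ssc.socketTextStream("localhost", 9999) // placeholder source

lines.foreachRDD { rdd =>
  rdd.cache()          // explicitly persist this batch's RDD
  rdd.count()          // ... do whatever work needs the cached data ...
  rdd.unpersist(false) // explicitly unpersist once the batch is done
}

ssc.start()
ssc.awaitTermination()
```

This only controls the batch RDDs, not the DStream object itself, but in practice the batch RDDs are what actually occupy cache memory.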
