Hi Prem,
I am experiencing the same problem on Spark 1.0.2 and Job Server 0.4.0.
Did you find a solution to this problem?
Thank you,
Hung
--
Hi premdass,
Where did you set spark.cleaner.referenceTracking = true/false?
Was this in your job-server conf?
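For comparison, setting it programmatically on the SparkConf would look
roughly like this (a sketch only; the app name is a placeholder), as opposed
to putting it in the job-server .conf:

    import org.apache.spark.{SparkConf, SparkContext}

    // Disabling reference tracking turns off the ContextCleaner entirely,
    // so cached RDDs are not unpersisted behind the application's back.
    val conf = new SparkConf()
      .setAppName("rdd-cleanup-test")  // placeholder name
      .set("spark.cleaner.referenceTracking", "false")
    val sc = new SparkContext(conf)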
Cheers.
--
We simply hold on to the reference to the RDD after it has been cached, so
we have a single Map[String, RDD[X]] of cached RDDs for the application.
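In rough outline it looks like the sketch below (simplified; the object and
method names are made up for illustration):

    import scala.collection.concurrent.TrieMap
    import org.apache.spark.rdd.RDD

    // One registry per application. Jobs look up cached RDDs by name
    // instead of recomputing them, and the held reference keeps the
    // ContextCleaner from releasing the cached blocks.
    object RddRegistry {
      private val cached = TrieMap.empty[String, RDD[_]]

      def getOrElseCache[T](name: String)(build: => RDD[T]): RDD[T] =
        cached.getOrElseUpdate(name, build.cache()).asInstanceOf[RDD[T]]
    }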
On Wed, Jul 9, 2014 at 11:00 AM, premdass wrote:
> Hi,
>
> Yes, I am caching the RDDs by calling the cache method.
>
>
> May I ask how you are sharing RDDs across jobs in the same context?
Hi,
Yes, I am caching the RDDs by calling the cache method.
May I ask how you are sharing RDDs across jobs in the same context? By the
RDD name? I tried printing the RDDs of the Spark context, and when
referenceTracking is enabled, I get an empty list after the cleanup.
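Roughly what I am doing to check (a simplified sketch; the input path and
RDD name are just placeholders):

    val users = sc.textFile("hdfs://host/data/users").cache()  // placeholder path
    users.setName("users")
    users.count()  // materialize the RDD so it actually lands in the cache

    // getPersistentRDDs lists every RDD currently marked as persistent;
    // with referenceTracking enabled it comes back empty after the cleanup.
    sc.getPersistentRDDs.values.foreach(rdd => println(rdd.id + " -> " + rdd.name))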
Thanks,
Prem
--
Did you explicitly cache the RDD? We cache RDDs and share them between jobs
just fine within one context in Spark 1.0.x, but we do not use the Ooyala
Job Server...
On Wed, Jul 9, 2014 at 10:03 AM, premdass wrote:
> Hi,
>
> I am using Spark 1.0.0 with the Ooyala Job Server for a low-latency query