Thanks Sabarish, I thought of the same; will try that. Hi Ted, good question. I guess one way is to have an API like `rdd.persist(storageLevel, compress)`, where `compress` can be true or false.
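For reference, here's a rough sketch of the workaround Sabarish describes: enable `spark.rdd.compress` globally and persist only the RDDs you want compressed with a `_SER` storage level, since the setting applies only to serialized partitions. The RDD names and input path below are placeholders, not anything from the actual job.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

// spark.rdd.compress only affects partitions stored in serialized form,
// so enabling it globally leaves deserialized (non-_SER) caches untouched.
val conf = new SparkConf()
  .setAppName("SelectiveRddCompression")
  .set("spark.rdd.compress", "true")

val sc = new SparkContext(conf)

// Placeholder data sources; names and path are illustrative only.
val largeRdd = sc.textFile("hdfs:///data/large-input")
val smallRdd = sc.parallelize(1 to 1000)

// Persisted serialized, so its cached partitions are compressed.
largeRdd.persist(StorageLevel.MEMORY_ONLY_SER)

// Deserialized storage level, so spark.rdd.compress does not apply here.
smallRdd.persist(StorageLevel.MEMORY_ONLY)
```

Not perfect, as noted, since it ties compression to the choice of storage level rather than being a per-RDD flag, but it does let you keep other RDDs uncompressed.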
On Tue, Mar 15, 2016 at 5:18 PM, Sabarish Sasidharan <sabarish....@gmail.com> wrote:

> It will compress only RDDs with serialization enabled in the persistence
> mode. So you could skip _SER modes for your other RDDs. Not perfect but
> something.
>
> On 15-Mar-2016 4:33 pm, "Nirav Patel" <npa...@xactlycorp.com> wrote:
>
>> Hi,
>>
>> I see that there's the following Spark config to compress an RDD. My guess
>> is it will compress all RDDs of a given SparkContext, right? If so, is
>> there a way to instruct the Spark context to compress only some RDDs and
>> leave others uncompressed?
>>
>> Thanks
>>
>> spark.rdd.compress (default: false): Whether to compress serialized RDD
>> partitions (e.g. for StorageLevel.MEMORY_ONLY_SER). Can save substantial
>> space at the cost of some extra CPU time.