TJ, what expansion factor did you see between image size on disk and in memory in PySpark? I'd expect the in-memory size to be larger due to Java object overhead, but I don't know exactly how much larger you should expect.
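If you want to put a number on it, something along these lines gives a rough comparison of the on-disk file, the decoded pixel array, and what Spark reports for a cached partition. This is only a sketch; the file path and the use of Pillow for decoding are illustrative assumptions, not anything from your setup:

# Rough sketch for comparing on-disk vs. decoded in-memory size of an image,
# and for seeing what Spark actually stores per partition. The path and the
# use of Pillow are illustrative assumptions.
import os
import numpy as np
from PIL import Image
from pyspark import SparkContext, StorageLevel

sc = SparkContext(appName="image-memory-check")

path = "/data/images/sample.png"            # hypothetical example file
disk_bytes = os.path.getsize(path)

img = np.asarray(Image.open(path))          # decoded pixel array
decoded_bytes = img.nbytes                  # raw ndarray size on the Python side

print("on disk: %d bytes, decoded: %d bytes (%.1fx)"
      % (disk_bytes, decoded_bytes, float(decoded_bytes) / disk_bytes))

# Cache a small sample and check the Storage tab of the web UI (port 4040)
# to see the size Spark reports after pickling plus any JVM-side overhead.
rdd = sc.parallelize([img])
rdd.persist(StorageLevel.MEMORY_ONLY)
rdd.count()                                 # force materialization so it shows up in the UI

Comparing the decoded ndarray size against what the Storage tab shows for the cached RDD is what I mean by the expansion factor.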
On Fri, Nov 14, 2014 at 12:50 AM, TJ Klein <tjkl...@gmail.com> wrote:
> Hi,
>
> I am using PySpark (1.1) for some image processing tasks. The images (RDDs)
> are on the order of several MB to low/mid double-digit MB. However, when I
> run operations on this data in Spark, memory usage blows up. Is there
> anything I can do about it? I played around with serialization and RDD
> compression, but that didn't really help. Any other ideas what I can do, or
> what I should be particularly aware of?
>
> Best,
> Tassilo
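For reference, the serialization and RDD compression settings you mention are normally set on the SparkConf before the context is created. This is only a sketch of where those knobs live, not a claim that they will fix the blow-up (as you note, they didn't help much in your case):

# Sketch of the standard RDD compression / serializer settings, set before
# the SparkContext is created. Values shown are the usual options, not
# recommendations tuned for this workload.
from pyspark import SparkConf, SparkContext, StorageLevel

conf = (SparkConf()
        .setAppName("image-pipeline")
        .set("spark.rdd.compress", "true")   # compress serialized cached blocks
        .set("spark.serializer",
             "org.apache.spark.serializer.KryoSerializer"))  # JVM-side serializer

sc = SparkContext(conf=conf)

# PySpark pickles Python objects regardless of serializer choice, so the
# storage level mainly controls whether cached blocks can spill to disk:
# rdd.persist(StorageLevel.MEMORY_AND_DISK)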