You could set "spark.executor.memory" to something bigger than the default
(512 MB)
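
For example, a sketch of the two usual places to raise it (the 2g value and the jar name are illustrative, not from the original thread):

```shell
# Option 1: pass it at submit time (value is illustrative)
spark-submit --conf spark.executor.memory=2g your-streaming-app.jar

# Option 2: set it for all jobs in conf/spark-defaults.conf:
# spark.executor.memory  2g
```

Note that since the program runs in local mode, the driver and executor share one JVM, so "spark.driver.memory" (or the --driver-memory flag) may be the setting that actually takes effect there.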


On Thu, Sep 11, 2014 at 8:31 AM, Aniket Bhatnagar <
aniket.bhatna...@gmail.com> wrote:

> I am running a simple Spark Streaming program that pulls in data from
> Kinesis at a batch interval of 10 seconds, windows it for 10 seconds, maps
> data and persists to a store.
>
> The program is running in local mode right now and runs out of memory
> after a while. I have yet to investigate heap dumps, but I think Spark
> isn't releasing memory after processing completes. I have even tried
> changing the storage level to disk only.
>
> Help!
>
> Thanks,
> Aniket
>
