We've had various OOM issues with Spark and have been trying to track them
down one by one.

Now we have one in spark-shell which is super surprising.

We currently allocate 6GB to spark-shell, as confirmed via 'ps'.

Why the heck would the *shell* need that much memory?

I'm going to try giving it more, of course, but it would be nice to know
whether this is a legitimate memory requirement or a bug somewhere.
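For reference, the driver heap for spark-shell can be raised at launch time; a quick sketch (the 8g figure is just an arbitrary example, not a recommendation):

```shell
# Launch spark-shell with a larger driver heap.
# --driver-memory sets the -Xmx of the JVM that hosts the shell/driver,
# which is the process that shows up with the big heap in 'ps'.
spark-shell --driver-memory 8g
```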

PS: One thought I had: it would be nice if Spark kept track of where an
OOM was encountered, and in which component.
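Until something like that exists, the JVM itself can capture the state at the moment of the OOM; a stopgap sketch, assuming the dump path is writable (the path and heap size here are arbitrary examples):

```shell
# Ask the driver JVM to write a heap dump when it hits an OutOfMemoryError.
# The resulting .hprof file can be opened in a heap analyzer (e.g. Eclipse MAT)
# to see which component was holding the memory.
spark-shell --driver-memory 8g \
  --conf "spark.driver.extraJavaOptions=-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp"
```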

Kevin


-- 

We’re hiring if you know of any awesome Java Devops or Linux Operations
Engineers!

Founder/CEO Spinn3r.com
Location: *San Francisco, CA*
blog: http://burtonator.wordpress.com
… or check out my Google+ profile
<https://plus.google.com/102718274791889610666/posts>
