Hi,

I have a piece of code in which an RDD is created in the main method.
Two threads then run in parallel, each doing work on that same RDD.

When running this code as part of a test with a local master, Spark will
sometimes hang (one task never completes).

If I make a copy of the RDD for each thread, the job completes fine.

I suspect it's a bad idea to use the same RDD from two threads, but I
could not find any documentation on the subject.

Should this be possible, and if not, can anyone point me to
documentation stating that it isn't supported?

--jelmer
