Hi everyone,

I am having problems submitting an app through spark-submit when the master
is not "local". The pi.py example that comes with Spark, however, works with
any master. I believe my script has the same structure as pi.py, but for
some reason it is not as flexible. Specifically, the failure occurs when
count() is called, which is the first action in the script, and Spark
complains that it is losing executors. Interactively in Jupyter, though,
everything works perfectly with any master passed to the SparkConf.
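For reference, here is a simplified sketch of the structure I'm describing
(the app name and input path are placeholders; the real script reads
different data):

    from pyspark.sql import SparkSession

    if __name__ == "__main__":
        # No master hardcoded here -- it is supposed to come from
        # the --master option of spark-submit
        spark = SparkSession.builder.appName("MyApp").getOrCreate()
        sc = spark.sparkContext

        rdd = sc.textFile("data.txt").map(lambda line: line.split(","))
        print(rdd.count())  # first action in the script; this is where it fails

        spark.stop()

I submit it roughly as "spark-submit --master spark://<host>:7077
my_script.py" (host is a placeholder), and it only succeeds when --master
is local.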

Does anyone know what might be happening? Is there anywhere I can look up
the requirements for spark-submit scripts?

Thanks,
Tobi
