Hi all,
I am running the Python driver process that communicates with Spark inside a
virtualenv. Is there a way to make sure that the workers' Python processes are
also started in (or pick up) the same virtualenv? Currently I am getting
ImportErrors when a worker tries to unpickle objects that depend on packages
which are only installed in the virtualenv, not system-wide. For now both the
workers and the driver run on the same machine in local mode.
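
To make the setup concrete, here is a minimal sketch of what I am doing; the
package name some_venv_only_pkg is just a placeholder for a library that only
exists inside my virtualenv:

    from pyspark import SparkContext

    # Placeholder for a package installed in my virtualenv only,
    # not in the system-wide site-packages.
    import some_venv_only_pkg

    sc = SparkContext("local[2]", "venv-test")

    def process(x):
        # The closure references the virtualenv-only package, so the
        # worker has to import it again when it unpickles the function.
        return some_venv_only_pkg.do_something(x)

    # This fails on the worker side with
    # "ImportError: No module named some_venv_only_pkg",
    # presumably because the worker Python is the system interpreter
    # rather than the one from my virtualenv.
    print(sc.parallelize(range(10)).map(process).collect())
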
Thanks in advance!