I believe I have solved my problem. The worker-node didn't know where to return
its answers. I set SPARK_LOCAL_IP and now the program runs as it should.
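For anyone who hits the same thing: the usual place to pin this down is conf/spark-env.sh on each node. The address below is just a placeholder for the node's own reachable IP:

```shell
# conf/spark-env.sh (placeholder address -- use the node's own reachable IP).
# Binds this node's Spark services to a specific interface so driver and
# workers can reach each other to return results.
export SPARK_LOCAL_IP=192.168.1.10
```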
On Mon, Feb 24, 2014 at 3:55 PM, Anders Bennehag wrote:
> Hello there,
>
> I'm having some trouble with my spark-cl
Hi there,
I am running Spark 0.9.0 standalone on a cluster. The documentation at
http://spark.incubator.apache.org/docs/latest/python-programming-guide.html
states that code-dependencies can be deployed through the pyFiles argument
to the SparkContext.
But in my case, the relevant code, let's call it myLib, cannot be imported
on the worker-nodes.
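For reference, a minimal sketch of the pyFiles route against Spark 0.9's Python API. The master URL, app name, and module path are placeholders, and this needs a running cluster, so treat it as illustrative rather than a drop-in script:

```python
from pyspark import SparkContext

# Ship local code-dependencies to the workers at context-creation time.
# Master URL and the path to myLib.py are placeholders for this sketch.
sc = SparkContext(
    "spark://master:7077",
    "myApp",
    pyFiles=["/path/to/myLib.py"],  # copied to each worker and added to its sys.path
)

# After this, tasks running on the workers can do `import myLib`.
# Files can also be added after startup with sc.addPyFile("/path/to/myLib.py").
```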
I just discovered that putting myLib in /usr/local/python2.7/dist-packages/
on the worker-nodes will let me import the module in a pyspark script...
That works as a solution, but it would be nice if modules on the PYTHONPATH
were picked up as well.
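The reason the dist-packages trick works comes down to sys.path: any directory already on the worker interpreter's path makes the module importable with a plain import. A small self-contained sketch of that mechanism, where myLib is a stand-in module written to a temporary directory:

```python
import importlib
import os
import sys
import tempfile

# Create a stand-in "myLib" module in a scratch directory. On a real
# worker this directory would be something like .../dist-packages.
pkg_dir = tempfile.mkdtemp()
with open(os.path.join(pkg_dir, "myLib.py"), "w") as f:
    f.write("def hello():\n    return 'hello from myLib'\n")

# Once the directory is on sys.path, a plain `import myLib` succeeds
# in any interpreter on that machine -- which is exactly what dropping
# the module into dist-packages achieves on the worker-nodes.
sys.path.insert(0, pkg_dir)
myLib = importlib.import_module("myLib")
print(myLib.hello())  # -> hello from myLib
```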
On Wed, Mar 5, 2014 at 1:34 PM, Anders Bennehag wrote: