Re: Problem with pyspark on Docker talking to YARN cluster

2016-04-06 Thread John Omernik
> … stabilize the driver-side endpoint (ref <https://spark.apache.org/docs/latest/configuration.html#networking>)
> 2. use host networking for your container, i.e. "docker run --net=host ..."
> 3. use yarn-cluster mode (see SPARK-5162 <https://issues.apache.org/jira/browse/SPARK-5162>)
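The workarounds quoted above can be sketched roughly as follows. This is a config sketch, not from the thread: the image name, hostnames, and port numbers are placeholders, and the `spark.driver.host`, `spark.driver.port`, and `spark.blockManager.port` properties come from the Spark networking configuration page linked above.

```shell
# Workaround 2: host networking, so the driver binds an address
# that YARN nodes can reach directly.
# "my-pyspark-image" is a placeholder image name.
docker run --net=host -it my-pyspark-image \
  pyspark --master yarn-client

# Workaround 1 (sketch): keep bridge networking, but pin the driver
# endpoint to fixed ports and publish them, and advertise the Docker
# host's address to the cluster. Ports 40000/40001 are arbitrary.
docker run -p 40000:40000 -p 40001:40001 -it my-pyspark-image \
  pyspark --master yarn-client \
  --conf spark.driver.host=$(hostname -f) \
  --conf spark.driver.port=40000 \
  --conf spark.blockManager.port=40001
```

With bridge networking the key point is that every port the driver listens on must be both fixed via configuration and published with `-p`, since the AM and executors connect back to the driver.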

Re: Problem with pyspark on Docker talking to YARN cluster

2015-06-10 Thread Ashwin Shankar
> … "docker run --net=host ..."
> 3. use yarn-cluster mode (see SPARK-5162 <https://issues.apache.org/jira/browse/SPARK-5162>)
>
> Hope this helps,
> Eron
>
> Date: Wed, 10 Jun 2015 13:43:04 -0700
> Subject: Problem with pyspark on Docker talking to YARN cluster

Problem with pyspark on Docker talking to YARN cluster

2015-06-10 Thread Ashwin Shankar
All, I was wondering if any of you have solved this problem: I have pyspark (IPython mode) running on Docker, talking to a YARN cluster (the AM/executors are NOT running on Docker). When I start pyspark in the Docker container, it binds to port *49460*. Once the app is submitted to YARN, the app (AM) o…
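One way to attack the ephemeral-port problem described above is to pin the driver endpoint from the application side before the context starts, so the AM has a stable address to call back to. This is a hedged sketch, not a fix confirmed in the thread: it assumes pyspark is installed, and the hostname and port are placeholders that must match whatever the container actually publishes.

```python
from pyspark import SparkConf, SparkContext

# Pin the driver to a fixed, routable endpoint instead of letting it
# bind an ephemeral in-container port (e.g. 49460 above).
# "docker-host.example.com" is a placeholder for an address the YARN
# nodes can actually reach; port 49460 must be published by Docker.
conf = (SparkConf()
        .setMaster("yarn-client")
        .set("spark.driver.host", "docker-host.example.com")
        .set("spark.driver.port", "49460"))

sc = SparkContext(conf=conf)
```

`spark.driver.host` and `spark.driver.port` are the standard Spark networking properties; in yarn-client mode the AM and executors use them to connect back to the driver, which is exactly the connection that fails when the driver sits behind Docker's bridge network.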