Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/5722#issuecomment-137254438
  
    Either way I agree with @vanzin that this is not the right fix. I've run 
Spark in a fire-walled environment where I had to assign a specific port to 
each component that would otherwise default to 0: e.g., the connection manager 
at port 9910, the block manager at port 9920, the executor at port 9930, and 
so on. This is effectively the same as specifying a port range if we set 
`spark.port.maxRetries = 10`.
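
    As a rough sketch, such a setup could be expressed in `spark-defaults.conf` 
like this (the property names below are assumptions based on Spark's 
documented configuration of that era; the exact set varies by version):

    ```
    # Pin each component to a fixed base port instead of letting it default to 0
    spark.driver.port        9900
    spark.blockManager.port  9920
    spark.executor.port      9930
    # With maxRetries = 10, each component effectively gets a range of 10 ports
    # (base, base+1, ..., base+9) to try before giving up
    spark.port.maxRetries    10
    ```

    In other words, a fixed base port plus `spark.port.maxRetries` already 
behaves like a port range, which is why a separate range mechanism is 
redundant.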


