Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/2746#issuecomment-60823170
  
    In my mind there are two modes: either the user wants to set an exact 
number of executors, or they want to set a range of executor counts and have 
elasticity.
    
    I would actually reject users that try to set both the absolute number and 
a min or max, since it is not obvious what the semantics should be (as @vanzin 
points out).
    
    So you'd have something like this:
    
    ```
    // Request a fixed number of executors
    ./bin/spark-submit --num-executors=10
    // Request a dynamically sized cluster
    ./bin/spark-submit --min-executors=5 --max-executors=15
    // INVALID
    ./bin/spark-submit --num-executors=10 --max-executors=15
    ```
    It seems to me like a type mismatch to accept both "num-executors" and 
"max-executors" in a single invocation.
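    The mutual-exclusion rule could be enforced with a small check at 
argument-parsing time. This is only an illustrative sketch, not actual Spark 
code; the function and parameter names are hypothetical:

    ```scala
    // Hypothetical validation: a fixed executor count and a dynamic
    // min/max range are mutually exclusive. Returns an error message
    // on the left when the combination is invalid.
    def validateExecutorArgs(
        numExecutors: Option[Int],
        minExecutors: Option[Int],
        maxExecutors: Option[Int]): Either[String, Unit] = {
      val rangeRequested = minExecutors.isDefined || maxExecutors.isDefined
      if (numExecutors.isDefined && rangeRequested) {
        Left("--num-executors cannot be combined with " +
          "--min-executors/--max-executors")
      } else {
        Right(())
      }
    }
    ```

    With this shape, the three invocations above map to 
`Right(())`, `Right(())`, and `Left(...)` respectively.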
    


