Github user hellertime commented on the pull request:

    https://github.com/apache/spark/pull/3074#issuecomment-61581145
  
    This code has been in use for a while now. I'm currently building a 
workflow that relies on Spark's ability to spin up its tasks inside Docker. 
    
    I'm not sure what exactly you mean by the other executor. If you are 
referring to coarse vs. fine mode: this patch only touches the fine-mode 
backend scheduler, since that is what we were using.
    
    Coarse mode should be easy to adapt; it would just require refactoring the 
`maybeDockerize` code so it does not expect an `ExecutorInfo`, since coarse mode 
puts the `ContainerInfo` directly into the task (that was a bit hand-wavy, apologies).
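
    To make that less hand-wavy, here is a minimal self-contained sketch of the idea, using stand-in classes rather than the real Mesos protobuf types (`org.apache.mesos.Protos.ExecutorInfo`, `TaskInfo`, `ContainerInfo`), and a hypothetical `maybeDockerize` signature. The point is that once `maybeDockerize` only builds the `ContainerInfo`, each backend can attach it wherever it belongs:

```java
import java.util.Optional;

// Stand-in types; in the real patch these would be the Mesos protobuf builders.
class ContainerInfo {
    final String image;
    ContainerInfo(String image) { this.image = image; }
}

class ExecutorInfo {
    ContainerInfo container;  // fine mode: container attached to the executor
}

class TaskInfo {
    ContainerInfo container;  // coarse mode: container attached to the task itself
}

public class DockerizeSketch {
    // Refactored maybeDockerize: builds a ContainerInfo when a Docker image is
    // configured, without touching any ExecutorInfo, so both backends can share it.
    static Optional<ContainerInfo> maybeDockerize(Optional<String> dockerImage) {
        return dockerImage.map(ContainerInfo::new);
    }

    public static void main(String[] args) {
        // Hypothetical image name, for illustration only.
        Optional<String> image = Optional.of("example/spark-executor:latest");

        // Fine-mode backend attaches the container to the ExecutorInfo...
        ExecutorInfo executor = new ExecutorInfo();
        maybeDockerize(image).ifPresent(c -> executor.container = c);

        // ...while the coarse-mode backend would attach it to the TaskInfo directly.
        TaskInfo task = new TaskInfo();
        maybeDockerize(image).ifPresent(c -> task.container = c);

        System.out.println(executor.container.image.equals(task.container.image)); // prints "true"
    }
}
```

    This is just a sketch of the refactoring shape, not the actual patch code.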
    


