Hi,

Has anyone tried to get Spark working on an HPC setup? If so, could you please share your learnings and how you went about doing it?

An HPC setup typically comes with a dynamically allocated cluster and a very efficient job scheduler. Configuring Spark standalone in this mode of operation is challenging, because the Hadoop dependencies need to be eliminated and the cluster has to be configured on the fly, inside whatever allocation the scheduler hands out.

Thanks,
Sid
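
For concreteness, below is a minimal sketch of what "configured on the fly" might look like when run inside a batch allocation. SLURM as the scheduler, a shared SPARK_HOME on every node, and passwordless ssh between nodes are all assumptions here; the script is illustrative rather than a tested recipe.

    # Sketch: bring up a standalone Spark cluster inside a SLURM allocation.
    # Assumes SLURM, SPARK_HOME available on every node, passwordless ssh.
    # Run from a batch script, e.g. `sbatch --nodes=4 ...`.
    import os
    import subprocess

    SPARK_HOME = os.environ["SPARK_HOME"]   # e.g. set by a module load (assumption)
    MASTER_PORT = 7077                      # Spark standalone default port

    def allocated_nodes():
        """Expand the SLURM node list into individual hostnames."""
        out = subprocess.run(
            ["scontrol", "show", "hostnames", os.environ["SLURM_JOB_NODELIST"]],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.split()

    def main():
        nodes = allocated_nodes()
        master, workers = nodes[0], nodes[1:]
        master_url = f"spark://{master}:{MASTER_PORT}"

        # Start the standalone master on the first allocated node ...
        subprocess.run(
            ["ssh", master, f"{SPARK_HOME}/sbin/start-master.sh"], check=True
        )
        # ... and a worker on each remaining node, pointed at that master.
        # (Older Spark releases call this script start-slave.sh.)
        for w in workers:
            subprocess.run(
                ["ssh", w, f"{SPARK_HOME}/sbin/start-worker.sh {master_url}"],
                check=True,
            )

        # Jobs submitted from the same batch script can then use:
        #   spark-submit --master spark://<master>:7077 my_job.py
        print("Standalone cluster up at", master_url)

    if __name__ == "__main__":
        main()

Tear-down at the end of the job would go through the matching stop-*.sh scripts in SPARK_HOME/sbin.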