Thanks Liang, Vadim and everyone for your inputs!!
With this clarity, I've tried client mode for both the main and sub Spark
jobs. Every main Spark job and its corresponding threaded Spark jobs show
up in the YARN applications list, and the jobs are executing properly. I
now need to test
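
In case it helps anyone following along, below is a rough sketch of the
pattern I'm describing, assuming each sub-job is submitted from its own
thread via SparkLauncher in client mode so that every launch registers as
a separate application with YARN. The jar path and class names are
placeholders, not my actual jobs:

    import org.apache.spark.launcher.SparkLauncher

    object SubJobLauncher {
      def main(args: Array[String]): Unit = {
        // Placeholder main classes for the threaded sub-jobs.
        val subJobs = Seq("com.example.SubJobA", "com.example.SubJobB")
        val threads = subJobs.map { mainClass =>
          new Thread(() => {
            // Each launch spawns its own spark-submit process in client
            // mode, so it appears as a separate YARN application.
            val proc = new SparkLauncher()
              .setAppResource("/path/to/sub-job.jar") // placeholder jar
              .setMainClass(mainClass)
              .setMaster("yarn")
              .setDeployMode("client")
              .launch()
            proc.waitFor() // block this thread until the sub-job finishes
          })
        }
        threads.foreach(_.start())
        threads.foreach(_.join())
      }
    }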
Using 0 for spark.mesos.mesosExecutor.cores is better than dynamic
allocation, but you have to pay a little more overhead for launching each
task, which should be OK if the tasks are not trivial.
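
If a concrete example helps, here is a minimal sketch of that setup in
fine-grained Mesos mode (the master URL and app name are placeholders,
and explicitly disabling coarse-grained mode is my assumption):

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch: fine-grained Mesos mode where the idle executor reserves
    // zero cores; cores are acquired per task instead, which is where
    // the extra per-task launch overhead comes from.
    val conf = new SparkConf()
      .setMaster("mesos://master.example.com:5050") // placeholder master URL
      .setAppName("fine-grained-sketch")
      .set("spark.mesos.coarse", "false")           // fine-grained mode
      .set("spark.mesos.mesosExecutor.cores", "0")  // executor itself holds no cores
    val sc = new SparkContext(conf)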
Since the direct result (up to 1M by default) will also go through
Mesos, it's better to tune it lower, otherwise Mesos could become a
bottleneck.
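
For what it's worth, I believe the 1M default mentioned above corresponds
to spark.task.maxDirectResultSize: results at or below that size are sent
back inline with the task status update (which in fine-grained mode
travels through Mesos), while larger results are fetched through the
block manager. A sketch of tuning it lower, with an illustrative value:

    import org.apache.spark.SparkConf

    // Assumption: the 1M threshold is spark.task.maxDirectResultSize.
    // Lowering it pushes more task results through the block manager
    // instead of inline through the Mesos status-update channel.
    val conf = new SparkConf()
      .set("spark.task.maxDirectResultSize", "131072") // illustrative: 128 KB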