Using 0 for spark.mesos.mesosExecutor.cores is better than dynamic
allocation, but you have to pay a little more overhead when launching a
task, which should be OK if the task is not trivial.
Since the direct result (up to 1M by default) will also go through
Mesos, it's better to tune it lower, otherwi
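As a rough Scala sketch of those two settings (the master URL, the 128 KB figure, and spark.task.maxDirectResultSize being the property behind that 1M default are assumptions on my part, not something stated above):

import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: master URL and values below are placeholders, not recommendations.
val conf = new SparkConf()
  .setAppName("mesos-fine-grained-example")
  .setMaster("mesos://zk://zk1:2181/mesos")
  // Give the Mesos executor itself 0 cores so every core stays free for tasks.
  .set("spark.mesos.mesosExecutor.cores", "0")
  // Assumed: spark.task.maxDirectResultSize (default ~1 MB) is the threshold below
  // which a task result is sent back inside the status update; lowering it keeps
  // larger results out of the Mesos channel and routes them via the block manager.
  .set("spark.task.maxDirectResultSize", (128 * 1024).toString) // 128 KB

val sc = new SparkContext(conf)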
Hi,
You can start multiple spark apps per cluster. You will have one stream
context per app.
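For example (the app name and the socket source are placeholders), each application builds its own single StreamingContext, and you submit the application once per streaming job you want:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MyStreamingApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("my-streaming-app")
    // One StreamingContext per application; submit the app again (with a
    // different name/checkpoint) to get a second, independent streaming job.
    val ssc = new StreamingContext(conf, Seconds(10))

    val lines = ssc.socketTextStream("localhost", 9999) // placeholder source
    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}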
On Dec 24, 2016 at 18:22, "shyla deshpande" wrote:
> Hi All,
>
> Thank you for the response.
>
> As per
>
> https://docs.cloud.databricks.com/docs/latest/databricks_guide/index.html#07%20Spark%20Streaming/15%20Streaming%20FAQs.html
Hi All,
Thank you for the response.
As per
https://docs.cloud.databricks.com/docs/latest/databricks_guide/index.html#07%20Spark%20Streaming/15%20Streaming%20FAQs.html
There can be only one streaming context in a cluster, which implies only one
streaming job.
So, I am still confused. Anyone havi
Hi,
Please use the SparkLauncher API class and invoke the jobs asynchronously using
Futures.
Using SparkLauncher, you can specify the class name, application resource,
arguments to be passed to the driver, deploy mode, etc.
I would suggest using Scala's Future if Scala code is possible.
https://spa
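A rough sketch of that pattern (the "yarn" master, "client" deploy mode, and the launchAsync helper name are illustrative choices of mine, not something prescribed here):

import java.util.concurrent.CountDownLatch
import java.util.concurrent.atomic.AtomicReference

import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

import scala.concurrent.{ExecutionContext, Future}

object LaunchChildJobs {
  implicit val ec: ExecutionContext = ExecutionContext.global

  // Launch one child Spark application without blocking the caller; the returned
  // Future completes with the application's final state (FINISHED, FAILED, KILLED).
  def launchAsync(appJar: String, mainClass: String, args: String*): Future[SparkAppHandle.State] =
    Future {
      val done       = new CountDownLatch(1)
      val finalState = new AtomicReference(SparkAppHandle.State.UNKNOWN)

      new SparkLauncher()
        .setAppResource(appJar)      // application resource (the sub-job's jar)
        .setMainClass(mainClass)     // driver class of the sub-job
        .addAppArgs(args: _*)        // arguments passed to the driver
        .setMaster("yarn")           // assumed cluster manager
        .setDeployMode("client")     // assumed deploy mode
        .startApplication(new SparkAppHandle.Listener {
          override def stateChanged(handle: SparkAppHandle): Unit =
            if (handle.getState.isFinal) { finalState.set(handle.getState); done.countDown() }
          override def infoChanged(handle: SparkAppHandle): Unit = ()
        })

      done.await()
      finalState.get()
    }
}

Each call returns immediately with a Future, so several sub-jobs can be launched in parallel and combined (e.g. with Future.sequence) before waiting on them.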
Thanks Liang, Vadim, and everyone for your inputs!
With this clarity, I've tried client mode for both the main and sub Spark
jobs. Every main Spark job and its corresponding threaded Spark jobs show up
in the YARN applications list, and the jobs are executing properly. I need to
now test