Sourav:

I'm using Spark Streaming 2.1.0 and can confirm that
spark.dynamicAllocation.enabled is enough.
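For reference, here is roughly what that looks like when building the context; the app name, batch interval, and executor bounds are just example values, and core dynamic allocation also needs the external shuffle service:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Example values only; the app name, batch interval, and bounds are assumptions.
val conf = new SparkConf()
  .setAppName("StreamingDynAllocSketch")
  // The core dynamic allocation switch confirmed above.
  .set("spark.dynamicAllocation.enabled", "true")
  // Required companion setting: executors serve shuffle data through the
  // external shuffle service so they can be released safely.
  .set("spark.shuffle.service.enabled", "true")
  // Optional bounds on how far the executor count may scale.
  .set("spark.dynamicAllocation.minExecutors", "2")
  .set("spark.dynamicAllocation.maxExecutors", "10")

val ssc = new StreamingContext(conf, Seconds(10))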
Best Regards
Richard
From: Sourav Mazumder
Date: Sunday, December 3, 2017 at 12:31 PM
To: user
Subject: Dynamic Resource allocation in Spark Streaming
Hi,

I see the following JIRA is resolved in Spark 2.0,
https://issues.apache.org/jira/browse/SPARK-12133, which is supposed to
support Dynamic Resource Allocation in Spark Streaming.

I also see the JIRA https://issues.apache.org/jira/browse/SPARK-22008, which
is about fixing the number of executors.
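For anyone finding this in the archives: SPARK-12133 added a streaming-specific allocation manager with its own configuration keys, separate from the core ones. A minimal sketch, with the executor bounds as assumed example values:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Streaming-specific dynamic allocation from SPARK-12133; numbers are examples.
// Leave the core spark.dynamicAllocation.enabled off here; the two
// mechanisms are separate.
val conf = new SparkConf()
  .setAppName("StreamingDRASketch")
  .set("spark.streaming.dynamicAllocation.enabled", "true")
  .set("spark.streaming.dynamicAllocation.minExecutors", "2")
  .set("spark.streaming.dynamicAllocation.maxExecutors", "10")

val ssc = new StreamingContext(conf, Seconds(10))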
Well, in Spark you can get the information you need from the driver UI
running on port 4040: click on the active job, then on the stages, and
inside the stages you will find the tasks and the address of the machine on
which each task is being executed. You can also check the CPU load on that
machine.
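If you want the same information programmatically rather than by clicking through the UI, the driver serves the same data as JSON under /api/v1 on that port. A rough sketch, assuming the driver is on localhost and using a placeholder application id:

import scala.io.Source

// Assumes the driver UI is reachable at localhost:4040; adjust the host.
val base = "http://localhost:4040/api/v1"

// List the applications this driver knows about (normally just one).
println(Source.fromURL(s"$base/applications").mkString)

// Placeholder id for illustration; take the real one from the response above.
val appId = "app-20171203123456-0000"

// Per-executor host, core count, and task counts, as JSON.
println(Source.fromURL(s"$base/applications/$appId/executors").mkString)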
I am not very clear about resource allocation (CPU/core/thread-level
allocation) and how it relates to parallelism when setting the number of
cores in Spark standalone mode.

Any guidelines for that?
--
Thanks & Regards,
Anshu Shukla
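Not a definitive answer, but a sketch of the standalone-mode settings that control this; the master URL and all numbers below are placeholders:

import org.apache.spark.{SparkConf, SparkContext}

// Example values only; master URL and numbers are placeholders.
val conf = new SparkConf()
  .setMaster("spark://master:7077") // hypothetical standalone master URL
  .setAppName("CoreAllocationSketch")
  // Total cores this application may claim across the whole cluster.
  .set("spark.cores.max", "8")
  // Cores per executor. By default standalone launches one executor per
  // worker with all its available cores; setting this allows several
  // smaller executors per worker.
  .set("spark.executor.cores", "2")
  // CPUs reserved per task; concurrent tasks per executor =
  // spark.executor.cores / spark.task.cpus.
  .set("spark.task.cpus", "1")

val sc = new SparkContext(conf)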