Re: Dynamic Resource allocation in Spark Streaming

2017-12-03 Thread Qiao, Richard
Sourav: I'm using Spark Streaming 2.1.0 and can confirm spark.dynamicAllocation.enabled is enough.

Best Regards,
Richard

From: Sourav Mazumder
Date: Sunday, December 3, 2017 at 12:31 PM
To: user
Subject: Dynamic Resource allocation in Spark Streaming

Hi, I see the following
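As a minimal sketch of the confirmation above: the only setting the thread names is spark.dynamicAllocation.enabled; the master, jar path, and class name below are placeholders, and the shuffle-service flag plus the min/max executor bounds are standard companions to dynamic allocation rather than settings mentioned in this thread.

```shell
# Hedged sketch: submitting a Spark Streaming job with dynamic allocation on.
# Master, class name, and jar path are hypothetical.
spark-submit \
  --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=2 \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  --class com.example.StreamingApp \
  streaming-app.jar
```

Dynamic allocation requires the external shuffle service on each worker so that shuffle files survive executor removal.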

Dynamic Resource allocation in Spark Streaming

2017-12-03 Thread Sourav Mazumder
Hi, I see the following JIRA is resolved in Spark 2.0: https://issues.apache.org/jira/browse/SPARK-12133, which is supposed to support Dynamic Resource Allocation in Spark Streaming. I also see JIRA https://issues.apache.org/jira/browse/SPARK-22008, which is about fixing the number of executor

Re: Resource allocation in SPARK streaming

2015-09-02 Thread Akhil Das
Well, in Spark you can get the information you need from the driver UI running on port 4040. Click on the active job, then on its stages; inside a stage you will find the tasks and the address of the machine on which each task is executing. You can also check the CPU load on that mach
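The same task and executor information shown in the driver UI is also exposed through the driver's REST monitoring API on the same port (available since Spark 1.4); a hedged sketch, where the host name and application id are placeholders you would read from the first response:

```shell
# Hedged sketch: querying the driver's monitoring REST API instead of
# clicking through the UI. Host and application id are hypothetical.

# List applications on this driver (returns the app id used below).
curl http://driver-host:4040/api/v1/applications

# Per-executor details: host/port of each executor, cores, active task counts.
curl http://driver-host:4040/api/v1/applications/app-20150902120000-0001/executors

# Stage details, including per-task information such as the host each task ran on.
curl http://driver-host:4040/api/v1/applications/app-20150902120000-0001/stages
```

This is handy when you want to script the same inspection rather than navigate the UI by hand.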

Resource allocation in SPARK streaming

2015-09-01 Thread anshu shukla
I am not very clear about resource allocation (CPU/core/thread-level allocation) with respect to the parallelism set via the number of cores in Spark standalone mode. Any guidelines for that? -- Thanks & Regards, Anshu Shukla
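A minimal sketch of the knobs involved in standalone mode (the master URL and jar path below are placeholders): the application-wide core cap and the per-executor core count are set at submit time, and the per-executor core count also bounds how many tasks that executor runs concurrently.

```shell
# Hedged sketch: controlling core allocation in Spark standalone mode.
# Master URL and jar path are hypothetical.
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 8 \
  --executor-cores 2 \
  --executor-memory 4g \
  my-streaming-app.jar
# --total-executor-cores caps the cores the application takes across the
# whole cluster (standalone/Mesos only); --executor-cores gives each
# executor 2 cores, which also limits it to 2 concurrent tasks.
```

With these values the cluster would launch up to 4 executors (8 total cores / 2 cores each), so the job can run at most 8 tasks in parallel.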