Have you tried passing --executor-cores or --total-executor-cores as arguments, depending on the Spark version?
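For example, a submission sketch for a standalone cluster (the master URL and application jar are placeholders, and flag availability depends on the Spark version and cluster manager):

```shell
# Sketch: control task concurrency when submitting to a standalone master.
# --total-executor-cores caps the cores the whole application may use
# (standalone/Mesos only); --executor-cores sets cores per executor.
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 16 \
  --executor-cores 4 \
  myapp.jar
```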
From: kant kodali [mailto:kanth...@gmail.com]
Sent: Friday, February 17, 2017 5:03 PM
To: Alex Kozlov
Cc: user @spark
Subject: Re: question on SPARK_WORKER_CORES
Standalone.
On Fri, Feb 17, 2017 at 5:01 PM, Alex Kozlov wrote:
What Spark mode are you running the program in?
On Fri, Feb 17, 2017 at 4:55 PM, kant kodali wrote:
> when I submit a job using spark shell I get something like this
>
> [Stage 0:>(36814 + 4) / 220129]
>
>
> Now all I want is to increase the number of parallel tasks running from
>
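In that progress bar, `(36814 + 4) / 220129` reads as (completed + currently running) / total tasks, so only 4 tasks are running concurrently. In standalone mode that concurrency is bounded by the cores the workers offer; one knob, matching the thread's subject line, is `SPARK_WORKER_CORES` in `conf/spark-env.sh` on each worker (a sketch with illustrative values):

```shell
# conf/spark-env.sh on each standalone worker (illustrative values).
# SPARK_WORKER_CORES: total cores this worker offers to applications,
# i.e. an upper bound on tasks it can run at once.
export SPARK_WORKER_CORES=8
# Optionally raise the memory the worker can hand out as well.
export SPARK_WORKER_MEMORY=16g
```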