Hi,
I think I faced the same problem with Spark 2.1.0 when I tried to define the
number of executor cores from SparkConf or the SparkSession builder in a
standalone cluster. It always takes all available cores.
There are three ways to do it:
1. Define spark.executor.cores in conf/spark-defaults.conf and then launch the application with spark-submit
2. Pass it on the command line, e.g. spark-submit --conf spark.executor.cores=1
3. Set it programmatically through SparkConf or the SparkSession builder
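For the third route, a minimal Scala sketch (the master URL and the
spark.cores.max value are placeholders, not from this thread; note that in
standalone mode it is spark.cores.max that actually bounds how many total
cores an application grabs):

    import org.apache.spark.sql.SparkSession

    // Sketch: cap each executor at one core and bound the app's total
    // core usage; without spark.cores.max a standalone-mode app takes
    // every core the cluster offers.
    val spark = SparkSession.builder()
      .appName("core-limit-sketch")
      .master("spark://master-host:7077")   // placeholder URL
      .config("spark.executor.cores", "1")
      .config("spark.cores.max", "4")       // placeholder value
      .getOrCreate()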
Thank you, Daniel and Yong!
On Wed, Jan 18, 2017 at 4:56 PM, Daniel Siegmann <dsiegm...@securityscorecard.io> wrote:
> I am not too familiar with Spark Standalone, so unfortunately I cannot
> give you any definite answer. I do want to clarify something though.
>
> The properties spark.sql.shuffle.partitions and spark.default.parallelism [...]
I am not too familiar with Spark Standalone, so unfortunately I cannot give
you any definite answer. I do want to clarify something though.
The properties spark.sql.shuffle.partitions and spark.default.parallelism
affect how your data is split up, which will determine the *total* number
of tasks, not the number of tasks that run in parallel at any one time.
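To make the distinction concrete, a hedged sketch (the values are arbitrary):
the two properties size the shuffles, while concurrency comes from cores.

    import org.apache.spark.sql.SparkSession

    // spark.default.parallelism sizes RDD shuffles;
    // spark.sql.shuffle.partitions sizes DataFrame/SQL shuffles.
    // Neither controls how many tasks execute at once.
    val spark = SparkSession.builder()
      .appName("partitioning-sketch")
      .config("spark.default.parallelism", "64")
      .config("spark.sql.shuffle.partitions", "64")
      .getOrCreate()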
From: Saliya Ekanayake
Sent: Wednesday, January 18, 2017 3:21 PM
To: Yong Zhang
Cc: spline_pal...@yahoo.com; jasbir.s...@accenture.com; User
Subject: Re: Spark #cores
So, I should be using spark.sql.shuffle.partitions to control the parallelism?
Is there a guide on how to tune this?
Thank you,
Saliya
On Wed, Jan 18, 2017, Yong Zhang wrote: [...]
> --
> *From:* Saliya Ekanayake
> *Sent:* Wednesday, January 18, 2017 12:33 PM
> *To:* spline_pal...@yahoo.com
> *Cc:* jasbir.s...@accenture.com; User
> *Subject:* Re: Spark #cores
>
> The Spark version I am using is 2.1.0. The language is Scala. [...]
llelism", instead of
"spark.sql.shuffle.partitions".
Yong
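For a plain RDD job like this, that property feeds the default partition
count of shuffles, as in the following sketch (assuming a SparkSession named
spark with spark.default.parallelism set; the input path is hypothetical):

    // With spark.default.parallelism=16 set, shuffles such as the one
    // behind reduceByKey default to 16 partitions.
    val counts = spark.sparkContext
      .textFile("hdfs:///data/input.txt")   // hypothetical path
      .flatMap(_.split("\\s+"))
      .map(w => (w, 1L))
      .reduceByKey(_ + _)                   // uses the default parallelism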
From: Saliya Ekanayake
Sent: Wednesday, January 18, 2017 12:33 PM
To: spline_pal...@yahoo.com
Cc: jasbir.s...@accenture.com; User
Subject: Re: Spark #cores
The Spark version I am using is 2.1.0. The language is Scala. [...]
The Spark version I am using is 2.1.0. The language is Scala. This is
running in standalone cluster mode.
Each worker is able to use all physical CPU cores in the cluster, as is the
default.
I was using the following parameters with spark-submit:
--conf spark.executor.cores=1 --conf spark.default.parallelism=[...]
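The same two flags expressed through SparkConf would look roughly like this
(a sketch; the parallelism value is a stand-in, since the original one is
cut off above):

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    // Programmatic equivalent of the spark-submit --conf flags above.
    val conf = new SparkConf()
      .set("spark.executor.cores", "1")
      .set("spark.default.parallelism", "16")   // stand-in value
    val spark = SparkSession.builder().config(conf).getOrCreate()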
Hi,
Can you please share how you are assigning CPU cores, and tell us the Spark
version and language you are using?
//Palash
Sent from Yahoo Mail on Android
On Wed, 18 Jan, 2017 at 10:16 pm, Saliya Ekanayake wrote:
Thank you for the quick response. No, this is not Spark SQL. I am running the
built-in PageRank.
Thank you for the quick response. No, this is not Spark SQL. I am running
the built-in PageRank.
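If "built-in PageRank" means the GraphX implementation, the call looks
roughly like this (a sketch; the edge-list path and the tolerance are
assumptions, and spark is an existing SparkSession):

    import org.apache.spark.graphx.GraphLoader

    // Load an edge list and run GraphX's PageRank until convergence.
    val graph = GraphLoader.edgeListFile(spark.sparkContext,
      "hdfs:///data/edges.txt")               // hypothetical path
    val ranks = graph.pageRank(tol = 0.0001).vertices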
On Wed, Jan 18, 2017 at 10:33 AM, jasbir.s...@accenture.com wrote:
> Are you talking here about Spark SQL?
>
> If yes, spark.sql.shuffle.partitions needs to be changed.
>
>
>
> *From:* Saliya Ekanayake [mailto:esal...@gmail.com] [...]
Are you talking here about Spark SQL?
If yes, spark.sql.shuffle.partitions needs to be changed.
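For reference, a sketch of changing it at runtime in Spark 2.x (200 is the
shipped default; 100 below is just an example value):

    // Lower the SQL shuffle partition count for this session.
    spark.conf.set("spark.sql.shuffle.partitions", "100")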
From: Saliya Ekanayake [mailto:esal...@gmail.com]
Sent: Wednesday, January 18, 2017 8:56 PM
To: User
Subject: Spark #cores
Hi,
I am running a Spark application setting the number of executor cores to 1 and [...]