Re: Will the setting for spark.default.parallelism be used for spark.sql.shuffle.output.partitions?

2017-03-30 Thread shyla deshpande
The Spark version I am using is Spark 2.1.

Will the setting for spark.default.parallelism be used for spark.sql.shuffle.output.partitions?

2017-03-30 Thread shyla deshpande
Thanks
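
A note on the question in the subject line: there is no Spark setting named spark.sql.shuffle.output.partitions. In Spark 2.x, shuffles triggered by the DataFrame/Dataset API take their partition count from spark.sql.shuffle.partitions (default 200), while spark.default.parallelism applies only to RDD operations. Below is a minimal sketch of the two knobs, assuming a local run (the object name and data are made up for illustration):

    import org.apache.spark.sql.SparkSession

    object ShufflePartitionsDemo {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("ShufflePartitionsDemo")
          .master("local[4]")                           // assumption: local run
          .config("spark.default.parallelism", "4")     // RDD shuffles only
          .config("spark.sql.shuffle.partitions", "20") // DataFrame/Dataset shuffles
          .getOrCreate()
        import spark.implicits._

        val counts = Seq(("a", 1), ("b", 2), ("a", 3)).toDF("k", "v")
          .groupBy("k").count()
        // The shuffle introduced by groupBy uses spark.sql.shuffle.partitions:
        println(counts.rdd.getNumPartitions) // 20

        spark.stop()
      }
    }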

Re: 'numBins' property not honoured in BinaryClassificationMetrics class when spark.default.parallelism is not set to 1

2016-07-03 Thread Sean Owen
    import org.apache.spark.{SparkConf, SparkContext}

    /** Created by sneha.shukla on 17/06/16. */
    object TestCode {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf().setAppName("HBaseRead").setMaster("local") …

Re: 'numBins' property not honoured in BinaryClassificationMetrics class when spark.default.parallelism is not set to 1

2016-07-03 Thread sneha29shukla
Hi, Any pointers? I'm not sure if this thread is reaching the right audience? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/numBins-property-not-honoured-in-BinaryClassificationMetrics-class-when-spark-default-parallelism-is1-tp27204p27269.html

'numBins' property not honoured in BinaryClassificationMetrics class when spark.default.parallelism is not set to 1

2016-06-22 Thread sneha29shukla
    import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
    import org.apache.spark.{SparkConf, SparkContext}

    object TestCode {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf().setAppName("HBaseRead").setMaster("local")
        sparkConf.set("spark.default.parallelism", "…

Fwd: 'numBins' property not honoured in BinaryClassificationMetrics class when spark.default.parallelism is not set to 1

2016-06-21 Thread Sneha Shukla
    import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
    import org.apache.spark.{SparkConf, SparkContext}

    /** Created by sneha.shukla on 17/06/16. */
    object TestCode {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf().setAppName("HBaseRead").setMaster("local")
        sparkConf.set("…
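
Background on the behaviour this thread reports: in MLlib's BinaryClassificationMetrics, numBins is an approximation target, not an exact count. Down-sampling groups consecutive points into bins of floor(count / numBins) elements within each partition, so the last bin of a partition can be smaller and each partition boundary can contribute an extra point; with spark.default.parallelism greater than 1 the curve can therefore contain more points than numBins. A minimal sketch of observing this (the constructor is real MLlib API; the object name and data are made up):

    import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
    import org.apache.spark.{SparkConf, SparkContext}

    object NumBinsDemo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("NumBinsDemo").setMaster("local[4]"))

        // (score, label) pairs: 1000 distinct scores spread over 4 partitions.
        val scoreAndLabels = sc.parallelize(
          (1 to 1000).map(i => (i / 1000.0, (i % 2).toDouble)), 4)

        // Ask for roughly 10 bins. Binning happens per partition, so the
        // resulting ROC curve may hold a few more points than numBins.
        val metrics = new BinaryClassificationMetrics(scoreAndLabels, numBins = 10)
        println(metrics.roc().count())

        sc.stop()
      }
    }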

Re: Spark.default.parallelism can not set reduce number

2016-05-20 Thread Ovidiu-Cristian MARCU
") .intConf .createWithDefault(200) > On 20 May 2016, at 13:17, 喜之郎 <251922...@qq.com> wrote: > > Hi all. > I set Spark.default.parallelism equals 20 in spark-default.conf. And send > this file to all nodes. > But I found reduce number is still default value,200. > Does any

Re: Spark.default.parallelism can not set reduce number

2016-05-20 Thread Takeshi Yamamuro
You need to use `spark.sql.shuffle.partitions`. // maropu On Fri, May 20, 2016 at 8:17 PM, 喜之郎 <251922...@qq.com> wrote: > Hi all. I set spark.default.parallelism to 20 in spark-defaults.conf and sent this file to all nodes, but I found the reduce number is still…

Spark.default.parallelism can not set reduce number

2016-05-20 Thread 喜之郎
Hi all. I set spark.default.parallelism to 20 in spark-defaults.conf and sent this file to all nodes, but I found the reduce number is still the default value, 200. Does anyone else encounter this problem? Can anyone give some advice? [Stage 9…
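
As Takeshi's reply above says, the 200 "reduce" tasks come from spark.sql.shuffle.partitions, not spark.default.parallelism. A minimal sketch of applying the fix with the Spark 1.6-era API used in this thread (setConf is real API; the object name and data are made up):

    import org.apache.spark.sql.SQLContext
    import org.apache.spark.{SparkConf, SparkContext}

    object ReduceNumberFix {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("ReduceNumberFix").setMaster("local[4]"))
        val sqlContext = new SQLContext(sc)

        // This, or the same key in spark-defaults.conf, sets the number of
        // post-shuffle ("reduce") partitions for SQL/DataFrame queries:
        sqlContext.setConf("spark.sql.shuffle.partitions", "20")

        val df = sqlContext.createDataFrame(Seq((1, "a"), (2, "b"), (1, "c")))
          .toDF("k", "v")
        println(df.groupBy("k").count().rdd.getNumPartitions) // 20

        sc.stop()
      }
    }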

spark.default.parallelism

2015-02-27 Thread Deep Pradhan
Hi, I have four single-core machines as slaves in my cluster. I set spark.default.parallelism to 4 and ran the SparkTC example; it took around 26 sec. Then I increased spark.default.parallelism to 8, but performance deteriorated: the same application now takes 32 sec. I have…
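
A possible explanation, offered as a guess rather than a diagnosis: with 4 single-core slaves, a parallelism of 8 just queues two waves of tasks per core, and a tiny job like SparkTC is often dominated by task-scheduling overhead, so extra partitions can make it slower. A quick sketch for checking what the setting actually changes (real API; the master URL and object name are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    object ParallelismCheck {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("ParallelismCheck")
          .setMaster("local[4]") // placeholder for the 4-slave cluster
          .set("spark.default.parallelism", "8")
        val sc = new SparkContext(conf)

        println(sc.defaultParallelism)     // 8
        val rdd = sc.parallelize(1 to 100)
        println(rdd.partitions.length)     // 8: parallelize uses the default
        val reduced = rdd.map(x => (x, x)).reduceByKey(_ + _)
        println(reduced.partitions.length) // 8: shuffles use it too

        sc.stop()
      }
    }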

RE: spark.default.parallelism bug?

2014-08-26 Thread Liu, Raymond
Hi Grzegorz, From my understanding, for the cogroup operation (which is used by intersection), if spark.default.parallelism is not set by the user, Spark won't fall back to a built-in default value; it will use the partition number (the max one among all the RDDs in the cogroup operation) to build up a…

spark.default.parallelism bug?

2014-08-26 Thread Grzegorz Białek
Hi, consider the following code:

    import org.apache.spark.{SparkContext, SparkConf}

    object ParallelismBug extends App {
      var sConf = new SparkConf()
        .setMaster("spark://hostName:7077") // .setMaster("local[4]")
        .set("spark.default.parallelism", "…
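
A small self-contained sketch of the behaviour Raymond describes above: with spark.default.parallelism unset, cogroup-based operations such as intersection adopt the largest partition count among their inputs; setting the property would override that. (Local master, object name, and partition counts are chosen for illustration.)

    import org.apache.spark.{SparkConf, SparkContext}

    object CogroupParallelism {
      def main(args: Array[String]): Unit = {
        // spark.default.parallelism deliberately left unset.
        val sc = new SparkContext(
          new SparkConf().setAppName("CogroupParallelism").setMaster("local[4]"))

        val a = sc.parallelize(1 to 100, 4)  // 4 partitions
        val b = sc.parallelize(50 to 150, 8) // 8 partitions

        // The cogroup inside intersection picks max(4, 8) partitions:
        println(a.intersection(b).partitions.length) // 8

        sc.stop()
      }
    }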