There are almost no cases in which you'd want a zero-partition RDD. The only one I can think of is an empty RDD, where the number of partitions is irrelevant. Still, I would not be surprised if other parts of the code assume at least 1 partition.
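For what it's worth, here is a minimal sketch of that case (local mode; the object name ZeroPartitionsSketch is just for illustration). An empty RDD reports 0 partitions, and HashPartitioner(0) passes the require check, though it could never actually place a key:

import org.apache.spark.{HashPartitioner, SparkConf, SparkContext}

object ZeroPartitionsSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("zero-partitions").setMaster("local[*]"))

    // The one natural source of a 0-partition RDD: an empty RDD.
    val empty = sc.emptyRDD[(Int, String)]
    println(empty.getNumPartitions)  // prints 0

    // require(partitions >= 0) lets this through...
    val p = new HashPartitioner(0)

    // ...but asking it to place a key would compute key.hashCode % 0,
    // i.e. throw ArithmeticException. It is only "safe" because an
    // empty RDD never calls getPartition.
    // p.getPartition(42)

    sc.stop()
  }
}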
Maybe this check could be tightened. It would be interesting to see whether the tests hit any scenario where a 0-partition RDD is created, and why.

On Fri, Sep 16, 2016 at 7:54 AM, WangJianfei <wangjianfe...@otcaix.iscas.ac.cn> wrote:
> class HashPartitioner(partitions: Int) extends Partitioner {
>   require(partitions >= 0, s"Number of partitions ($partitions) cannot be negative.")
>
> The source code requires partitions >= 0, but I don't know why it makes sense
> when partitions is 0.
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/What-s-the-meaning-when-the-partitions-is-zero-tp18957.html
> Sent from the Apache Spark Developers List mailing list archive at Nabble.com.