[
https://issues.apache.org/jira/browse/SPARK-17521?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15486894#comment-15486894
]
WangJianfei commented on SPARK-17521:
-------------------------------------
Hi @Sean Owen I can do this fix:just depreceted makeRDD and at the same time
fix the bug. By the way,I learn spark by looking the spark source code,and
find this problem occasionally, do you have some tips about learning spark?
Thank you very much
Best Wishes!
Wangjianfei
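The essence of the fix is just a guard on the slice count, which can be sketched in plain Scala without a Spark dependency (`safeSliceCount` is a hypothetical helper name for illustration, not Spark API):

```scala
// An empty Seq would otherwise produce zero slices and trip the
// "Positive number of slices required" check in ParallelCollectionRDD.
// Clamp the slice count to at least 1, as the proposed fix does.
object SliceGuard {
  def safeSliceCount[T](seq: Seq[(T, Seq[String])]): Int =
    math.max(seq.size, 1)
}
```

With this guard, `safeSliceCount(Seq.empty)` returns 1 instead of 0, so an empty input yields a single empty partition rather than an exception.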
> Error when I use sparkContext.makeRDD(Seq())
> --------------------------------------------
>
> Key: SPARK-17521
> URL: https://issues.apache.org/jira/browse/SPARK-17521
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.0.0
> Reporter: WangJianfei
> Priority: Minor
> Labels: easyfix
>
> When I use sc.makeRDD as below:
> ```
> val data3 = sc.makeRDD(Seq())
> println(data3.partitions.length)
> ```
> I get an error:
> Exception in thread "main" java.lang.IllegalArgumentException: Positive
> number of slices required
> We can fix this bug by modifying the last line to guard against an empty seq:
> ```
> def makeRDD[T: ClassTag](seq: Seq[(T, Seq[String])]): RDD[T] = withScope {
>   assertNotStopped()
>   val indexToPrefs = seq.zipWithIndex.map(t => (t._2, t._1._2)).toMap
>   new ParallelCollectionRDD[T](this, seq.map(_._1), math.max(seq.size, 1), indexToPrefs)
> }
> ```
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)