WangJianfei created SPARK-17533:
-----------------------------------
Summary: Add an overloaded union method to SparkContext
Key: SPARK-17533
URL: https://issues.apache.org/jira/browse/SPARK-17533
Project: Spark
Issue Type: New Feature
Components: Spark Core
Affects Versions: 2.0.0
Reporter: WangJianfei
Priority: Minor
I think it's necessary to have an overloaded union method in SparkContext, so
that the user can designate the number of partitions and the Partitioner.
A signature like this:
```
def union[T: ClassTag](
    rdds: Seq[RDD[T]],
    numPartitions: Int,
    partitioner: Partitioner): RDD[T] = withScope {
  // union the input RDDs, then redistribute them across
  // numPartitions partitions using the given partitioner
}
```
we can discuss here.
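For context, a rough sketch of what users can already do with the existing API (assuming `sc` is a live SparkContext; `partitionBy` is only available on key-value RDDs, which is part of why a built-in overload might be convenient):
```
import org.apache.spark.HashPartitioner
import org.apache.spark.rdd.RDD

// For generic RDDs: union, then force a shuffle into the desired
// number of partitions.
def unionWithNumPartitions[T: scala.reflect.ClassTag](
    rdds: Seq[RDD[T]], numPartitions: Int): RDD[T] =
  rdds.reduce(_ union _).coalesce(numPartitions, shuffle = true)

// For pair RDDs: union, then apply a custom Partitioner.
def unionWithPartitioner[K: scala.reflect.ClassTag, V: scala.reflect.ClassTag](
    rdds: Seq[RDD[(K, V)]], partitioner: org.apache.spark.Partitioner): RDD[(K, V)] =
  rdds.reduce(_ union _).partitionBy(partitioner)
```
The proposed overload would fold these two steps into a single SparkContext call.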
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)