Please take a look at the following files for examples:

./core/src/main/scala/org/apache/spark/api/python/PythonPartitioner.scala
./core/src/main/scala/org/apache/spark/Partitioner.scala
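
Here is a minimal sketch of what such a partitioner could look like. This is just an illustration, not code from those files: it assumes your keys are Ints in the range 0 to numPartitions - 1, and the class name ExactPartitioner is made up for the example.

import org.apache.spark.Partitioner

// Illustrative sketch: routes each integer key to its own partition,
// assuming keys fall in [0, numParts). Adapt getPartition to your key type.
class ExactPartitioner(numParts: Int) extends Partitioner {
  override def numPartitions: Int = numParts

  override def getPartition(key: Any): Int =
    key.asInstanceOf[Int] % numParts

  // Spark compares partitioners to decide whether a shuffle is needed,
  // so it is good practice to define equality.
  override def equals(other: Any): Boolean = other match {
    case p: ExactPartitioner => p.numPartitions == numPartitions
    case _                   => false
  }

  override def hashCode: Int = numPartitions
}

You would then apply it to your pair RDD with partitionBy:

val evenlyPartitioned = rdd.partitionBy(new ExactPartitioner(30))

With 30 distinct keys 0 through 29, every partition gets exactly one record. Keep in mind that which executor ends up running each partition is decided by the scheduler, not by the partitioner.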

Cheers

On Tue, Nov 17, 2015 at 9:24 AM, prateek arora <prateek.arora...@gmail.com>
wrote:

> Hi
> Thanks
> I am new to Spark development, so could you provide some help with
> writing a custom partitioner to achieve this?
> If you have a link to or an example of a custom partitioner, please
> share it with me.
>
> On Mon, Nov 16, 2015 at 6:13 PM, Sabarish Sasidharan <
> sabarish.sasidha...@manthan.com> wrote:
>
>> You can write your own custom partitioner to achieve this.
>>
>> Regards
>> Sab
>> On 17-Nov-2015 1:11 am, "prateek arora" <prateek.arora...@gmail.com>
>> wrote:
>>
>>> Hi
>>>
>>> I have an RDD with 30 records (key/value pairs) and am running 30
>>> executors. I want to repartition this RDD into 30 partitions so that
>>> every partition gets one record and is assigned to one executor.
>>>
>>> When I use rdd.repartition(30), it repartitions my RDD into 30
>>> partitions, but some partitions get 2 records, some get 1, and some
>>> get none at all.
>>>
>>> Is there any way in Spark to distribute my records evenly across all
>>> partitions?
>>>
>>> Regards
>>> Prateek
>>>
>>>
>>>
>
