Yes, so that brings me to another question: how do I do a batch insert from a
worker?
In prod we are planning to use a Kinesis stream with 3 shards, so the number of
partitions should be 3, right?
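With the receiver-based Kinesis integration of that era, the usual pattern is one
receiver-backed DStream per shard, unioned into a single stream; note that the
number of RDD partitions per batch is then driven by the block interval, not
directly by the shard count. A sketch, assuming the Spark 1.2-era
KinesisUtils.createStream signature and placeholder app, stream, and endpoint
names:

import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kinesis.KinesisUtils

val ssc = new StreamingContext(new SparkConf().setAppName("kinesis-demo"), Seconds(10))
val numShards = 3

// One receiver-backed DStream per shard, unioned into a single DStream.
val shardStreams = (1 to numShards).map { _ =>
  KinesisUtils.createStream(ssc, "myStream", "https://kinesis.us-east-1.amazonaws.com",
    Seconds(10), InitialPositionInStream.LATEST, StorageLevel.MEMORY_AND_DISK_2)
}
val unified = ssc.union(shardStreams)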
On Mar 8, 2015 8:57 PM, "Ted Yu" wrote:
What's the expected number of partitions in your use case?
Have you thought of doing batching in the workers?
Cheers
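A minimal sketch of batching in the workers, building on the ConnectionPool
snippet quoted below; sendBatch and the batch size of 500 are placeholders for
whatever the real client actually supports:

dstream.foreachRDD(rdd => {
  rdd.foreachPartition(partitionOfRecords => {
    val connection = ConnectionPool.getConnection()
    // Group the partition's records into fixed-size batches and issue one
    // call per batch instead of one call per record.
    partitionOfRecords.grouped(500).foreach { batch =>
      connection.sendBatch(batch)  // hypothetical batch-send API
    }
    ConnectionPool.returnConnection(connection)
  })
})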
On Sat, Mar 7, 2015 at 10:54 PM, A.K.M. Ashrafuzzaman <
ashrafuzzaman...@gmail.com> wrote:
For processing a DStream, the Spark Programming Guide suggests the following
usage of a connection:
dstream.foreachRDD(rdd => {
  rdd.foreachPartition(partitionOfRecords => {
    // ConnectionPool is a static, lazily initialized pool of connections
    val connection = ConnectionPool.getConnection()
    partitionOfRecords.foreach(record => connection.send(record))
    ConnectionPool.returnConnection(connection)  // return to the pool for future reuse
  })
})
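The guide leaves ConnectionPool unspecified; a minimal sketch of a static,
lazily initialized pool, with Connection and its send method as stand-ins for
the real client:

import java.util.concurrent.ConcurrentLinkedQueue

// Stand-in for the real client connection type.
class Connection {
  def send(record: Any): Unit = { /* write to the downstream store */ }
}

object ConnectionPool {
  // Lazily initialized on first use inside each executor JVM.
  private lazy val pool = new ConcurrentLinkedQueue[Connection]()

  def getConnection(): Connection = {
    val c = pool.poll()           // reuse a pooled connection if available
    if (c != null) c else new Connection()
  }

  def returnConnection(c: Connection): Unit = {
    pool.offer(c)                 // make it available for future batches
  }
}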