ase ?
>
> Have you thought of doing batching in the workers ?
>
> Cheers
>
> On Sat, Mar 7, 2015 at 10:54 PM, A.K.M. Ashrafuzzaman <
> ashrafuzzaman...@gmail.com> wrote:
>
>> While processing DStream in the Spark Programming Guide, the suggested
>> usage
A.K.M. Ashrafuzzaman
Lead Software Engineer
NewsCred
(M) 880-175-5592433
Twitter | Blog | Facebook
Check out The Academy, your #1 source
for free content marketing resources
Thanks Chris,
That is what I wanted to know :)
On Mar 2, 2015, at 2:04 AM, Chris Fregly wrote:
> hey AKM!
>
>
Sorry guys, my bad.
Here is a high-level code sample:

val unionStreams = ssc.union(kinesisStreams)
unionStreams.foreachRDD { rdd =>
  rdd.foreach { tweet =>
    // Kinesis hands each record over as Array[Byte]: decode, parse, persist
    val strTweet = new String(tweet, "UTF-8")
    val interaction = InteractionParser.parser(strTweet)
    interactionDAL.insert(interaction)
  }
}
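The batching suggested earlier in the thread usually means buffering records inside each partition and issuing one bulk insert per batch, rather than one insert per record. The grouping itself is plain Scala's `Iterator.grouped`; below is a minimal standalone sketch. The `insertBatch` sink and the batch size of 500 are hypothetical stand-ins for a DAL bulk-insert method, and in the real job the record iterator would come from `rdd.foreachPartition`:

```scala
object BatchingSketch {
  // Stand-in for a hypothetical DAL bulk insert; it records each batch
  // so the batching behavior is visible.
  val inserted = scala.collection.mutable.ArrayBuffer.empty[Seq[String]]
  def insertBatch(batch: Seq[String]): Unit = inserted += batch

  def main(args: Array[String]): Unit = {
    // Simulate one partition of 1200 already-parsed records.
    val records: Iterator[String] = Iterator.tabulate(1200)(i => s"tweet-$i")
    // Group into batches of (at most) 500 and insert each batch once.
    records.grouped(500).foreach(batch => insertBatch(batch.toList))
    println(inserted.map(_.size).mkString(","))  // 500,500,200
  }
}
```

This turns 1200 round-trips to the store into 3, which is typically the point of batching in the workers.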
g Scala and
Spark Streaming.
> numAvailableWorkers - 1), you can have fewer workers than the number
> of shards. Makes sense?
>
> On Sun Dec 14 2014 at 10:06:36 A.K.M. Ashrafuzzaman <
> ashrafuzzaman...@gmail.com> wrote:
>
>> Thanks Aniket,
>> The trick is to have the #workers >=
will do a memory leak test. But this
is a simple and small application; I don't see a leak there with the naked eye.
Can anyone help me with how I should investigate?
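As a first pass before reaching for jmap or VisualVM, you can sample the JVM's own heap counters from inside the application and watch whether the post-GC reading keeps climbing across batches. A minimal standalone sketch (`HeapCheck` and `usedMB` are names made up for illustration; in a real job you would log this periodically from the driver):

```scala
object HeapCheck {
  // Approximate used heap in MB, from the JVM's own counters.
  def usedMB: Long = {
    val rt = Runtime.getRuntime
    (rt.totalMemory - rt.freeMemory) / (1024L * 1024L)
  }

  def main(args: Array[String]): Unit = {
    System.gc()  // encourage a collection so the reading is less noisy
    println(s"approx used heap: ${usedMB} MB")
  }
}
```

If this number grows steadily over hours even after GC, take periodic `jmap -histo <pid>` snapshots and diff the object counts to see which classes are accumulating.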
rds.
On Nov 26, 2014, at 6:23 PM, A.K.M. Ashrafuzzaman wrote:
> Hi guys,
> When we are using Kinesis with 1 shard
from EC2 and now the Kinesis
is getting consumed.
4 cores, single machine -> works
2 cores, single machine -> does not work
2 cores, 2 workers -> does not work
So my question is: do we need a cluster of (#KinesisShards + 1) workers to
be able to consume from Kinesis?
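The works/does-not-work pattern above is consistent with how Spark Streaming receivers behave: each Kinesis receiver occupies one core for as long as the stream runs, so what matters is cores, not worker machines, and the application needs strictly more cores than receivers so that some cores are left to process the received batches. As a lower bound, with one receiver per shard, the arithmetic is just the following (a sketch; `minCores` is a hypothetical helper, and real jobs may need more headroom than this minimum):

```scala
object KinesisSizing {
  // One receiver per shard, each pinning a core for the lifetime of the
  // stream, plus at least one core left over to process the batches.
  def minCores(numReceivers: Int): Int = numReceivers + 1

  def main(args: Array[String]): Unit = {
    println(minCores(1))  // 2: even one shard needs more than one core
    println(minCores(3))  // 4
  }
}
```

So `local[1]`, or a cluster whose total core count equals the receiver count, leaves nothing free for processing and appears to consume nothing.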
/* Start the streaming context and await termination */
ssc.start()
ssc.awaitTermination()
using:
Scala: 2.10.4
Java: 1.8.0_25
Spark: 1.1.0
spark-streaming-kinesis-asl: 1.1.0