Hi, Selina,
Your question is not clear.
{quote}
When messages are sent to Kafka by KafkaProducer, it always fails once more
than 3000-4000 messages have been sent.
{quote}
What's failing? The error stack shows errors on the consumer side, but you
were referring to failures to produce to Kafka.
Hi, All
I am trying to write my first StreamTask class. I have a topic in Kafka
called "http-demo". I would like to read that topic and write it out to
another topic called "demo-duplicate".
However, no topic is written to Kafka.
My properties file and StreamTask are below. Can anyone t
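For reference, a minimal copy-through StreamTask usually looks something like the sketch below. This is only a sketch under assumptions: the Kafka system in the job properties is assumed to be named "kafka", and the class name is made up, since the attachment is truncated in this archive.

import org.apache.samza.system.IncomingMessageEnvelope;
import org.apache.samza.system.OutgoingMessageEnvelope;
import org.apache.samza.system.SystemStream;
import org.apache.samza.task.MessageCollector;
import org.apache.samza.task.StreamTask;
import org.apache.samza.task.TaskCoordinator;

// Hypothetical example class: forwards every message from the input topic
// (wired up via task.inputs=kafka.http-demo in the properties file) to demo-duplicate.
public class HttpDemoCopyTask implements StreamTask {
  private static final SystemStream OUTPUT = new SystemStream("kafka", "demo-duplicate");

  @Override
  public void process(IncomingMessageEnvelope envelope,
                      MessageCollector collector,
                      TaskCoordinator coordinator) {
    // Re-send the incoming payload unchanged to the output topic.
    collector.send(new OutgoingMessageEnvelope(OUTPUT, envelope.getMessage()));
  }
}

Beyond the task itself, the properties file needs task.class pointing at this class, task.inputs=kafka.http-demo, and serde settings for the payload; if any of those are missing or wrong, the job may consume nothing and the demo-duplicate topic will never appear.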
Hi,
After I got the error in the previous email, I tried to check the content of
the topic. It shows the error below. However, when I stop Samza and re-run
it, it works fine again. Does anyone know what the problem is?
[2015-07-23 17:50:08,391] WARN
[console-consumer-83311_Selinas-MacBook-Pro.loc
Hi,
When messages are sent to Kafka by KafkaProducer, it always fails once more
than 3000-4000 messages have been sent. The error is shown below.
I am wondering if there is any topic size setting I need to configure in Samza.
[2015-07-23 17:30:03,792] WARN
[console-consumer-84579_Selinas-MacBook-
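If the problem does turn out to be a message-size limit rather than a message count, the relevant knobs are on the Kafka side rather than a Samza "topic size": the broker's message.max.bytes, the old consumer's fetch.message.max.bytes, and the new producer's max.request.size. In a Samza job, client settings like these are generally passed through the systems.<name>.consumer.* and systems.<name>.producer.* prefixes in the job properties. A sketch with an assumed system name of "kafka" and purely illustrative values:

# Illustrative only: raise client-side size limits for a system named "kafka".
systems.kafka.consumer.fetch.message.max.bytes=10485760
systems.kafka.producer.max.request.size=10485760

The broker-side message.max.bytes would have to be raised in the Kafka server properties, not in the Samza config.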
I forgot to mention that for me the error always happened after restarting a
broker.
Sent from my iPhone
> On Jul 23, 2015, at 4:25 PM, Jordan Shaw wrote:
>
> Hey Roger,
> I restarted the producer and the error went away on the broker. If it comes
> back I'll switch over to lz4. Thanks for the
Yeah, that's why I added some test code in window() to call store.all()
and iterate through the results. I stepped through it in my local environment
and verified that the iterator returned by store.all() works.
-Yi
On Thu, Jul 23, 2015 at 4:26 PM, Shekar Tippur wrote:
> Yi,
>
> In my case, I am able to
Yi,
In my case, I am able to append to the key, but I am not able to get the
store and iterate through it.
If you look at http://pastebin.com/fKGpHwW6, at line 146 I am able to get the
store value, but in the window routine at line 187 I am unable to get the
values from the store.
- Shekar
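For comparison, a self-contained sketch of the pattern being discussed, counting per key in process() and dumping the store in window(); the class name, store name, and types below are made up for illustration and are not taken from the pastebin.

import org.apache.samza.config.Config;
import org.apache.samza.storage.kv.Entry;
import org.apache.samza.storage.kv.KeyValueIterator;
import org.apache.samza.storage.kv.KeyValueStore;
import org.apache.samza.system.IncomingMessageEnvelope;
import org.apache.samza.task.InitableTask;
import org.apache.samza.task.MessageCollector;
import org.apache.samza.task.StreamTask;
import org.apache.samza.task.TaskContext;
import org.apache.samza.task.TaskCoordinator;
import org.apache.samza.task.WindowableTask;

public class CounterTask implements StreamTask, WindowableTask, InitableTask {
  private KeyValueStore<String, Integer> store;

  @Override
  @SuppressWarnings("unchecked")
  public void init(Config config, TaskContext context) {
    // "counter-store" must match a stores.<name>.* definition in the job properties.
    store = (KeyValueStore<String, Integer>) context.getStore("counter-store");
  }

  @Override
  public void process(IncomingMessageEnvelope envelope,
                      MessageCollector collector,
                      TaskCoordinator coordinator) {
    String key = envelope.getKey() == null ? "(null)" : envelope.getKey().toString();
    Integer count = store.get(key);
    store.put(key, count == null ? 1 : count + 1);
  }

  @Override
  public void window(MessageCollector collector, TaskCoordinator coordinator) {
    // Iterate over everything in the store; the iterator must be closed when done.
    KeyValueIterator<String, Integer> it = store.all();
    try {
      while (it.hasNext()) {
        Entry<String, Integer> entry = it.next();
        System.out.println(entry.getKey() + " -> " + entry.getValue());
      }
    } finally {
      it.close();
    }
  }
}

One thing worth checking in the pastebin is whether every iterator returned by store.all() is closed after use; leaving them open can cause trouble with RocksDB-backed stores.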
Hey Roger,
I restarted the producer and the error went away on the broker. If it comes
back I'll switch over to lz4. Thanks for the reply.
-Jordan
On Thu, Jul 23, 2015 at 9:32 AM, Roger Hoover
wrote:
> Hi Jordan,
>
> I ran into a similar issue when using snappy compression and the new
> produce
Hi, Shekar,
I was merely testing whether the counter per key works, if that makes sense
for your use case.
-Yi
On Thu, Jul 23, 2015 at 3:25 PM, Shekar Tippur wrote:
> Yi,
>
> I am new to Scala. While it is readable, I am not sure where you are
> incrementing the count per application.
>
> - She
---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36545/#review92825
---
samza-core/src/main/java/org/apache/samza/checkpoint/CheckpointMana
Yi,
I am new to Scala. While it is readable, I am not sure where you are
incrementing the count per application.
- Shekar
On Wed, Jul 22, 2015 at 5:20 PM, Shekar Tippur wrote:
> Thanks Yi. I got the pastebin link.
> I am looking at it.
>
> Shekar
> On Jul 22, 2015 5:09 PM, "Yi Pan" wrote:
>
>
Hi Tommy,
It has not been implemented simply because no one is working on it, not for
any other reason. :) If you want to take a stab at it, feel free. That would
be great.
(copycat use case? :)
Cheers,
Fang, Yan
yanfang...@gmail.com
On Thu, Jul 23, 2015 at 1:23 PM, Tommy Becker wrote:
> I'm writ
I'm writing a Samza job that basically serves to pump data out of Kafka into
another system. For my particular use-case, I want to essentially process the
entire topic as it exists when the job starts and then exit. As far as I can
tell, there doesn't seem to be a way to do that right now bec
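There is indeed no built-in "consume up to the head of the topic and then stop" mode, which is the gap being described here. One rough workaround is to shut the container down from window() once a whole window interval passes with no new input; the sketch below assumes that an idle interval means the job has caught up, which is only an approximation.

import org.apache.samza.system.IncomingMessageEnvelope;
import org.apache.samza.task.MessageCollector;
import org.apache.samza.task.StreamTask;
import org.apache.samza.task.TaskCoordinator;
import org.apache.samza.task.WindowableTask;

public class DrainAndExitTask implements StreamTask, WindowableTask {
  private boolean sawMessageSinceLastWindow = false;

  @Override
  public void process(IncomingMessageEnvelope envelope,
                      MessageCollector collector,
                      TaskCoordinator coordinator) {
    sawMessageSinceLastWindow = true;
    // ... push envelope.getMessage() into the downstream system here ...
  }

  @Override
  public void window(MessageCollector collector, TaskCoordinator coordinator) {
    // window() fires every task.window.ms; if nothing arrived since the last
    // call, assume the topic has been drained and ask the container to stop.
    if (!sawMessageSinceLastWindow) {
      coordinator.shutdown(TaskCoordinator.RequestScope.ALL_TASKS_IN_CONTAINER);
    }
    sawMessageSinceLastWindow = false;
  }
}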
Hi Dan and Samza devs,
I have a use case for which I need to set an external version on
Elasticsearch documents. Versioning (
https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-index_.html#index-versioning)
lets you prevent duplicate messages from temporarily overwriting new
ver
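With the plain Elasticsearch Java client, external versioning comes down to setting VersionType.EXTERNAL and an explicit version (for example a Kafka offset or event timestamp) on each index request, roughly as sketched below; how this should be exposed through the Samza Elasticsearch system producer is exactly the open question, so the helper and its parameters here are only illustrative.

import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.index.VersionType;

public class ExternalVersionExample {
  // Hypothetical helper: builds an index request carrying an external version so
  // that a document with a lower version can never overwrite one with a higher version.
  public static IndexRequest versionedRequest(String index, String type, String id,
                                              String jsonSource, long externalVersion) {
    return new IndexRequest(index, type, id)
        .source(jsonSource)
        .versionType(VersionType.EXTERNAL)
        .version(externalVersion);
  }
}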
Hi Jordan,
I ran into a similar issue when using snappy compression and the new
producer. If you disable compression or switch to lz4 or gzip, does the
issue go away?
Cheers,
Roger
On Wed, Jul 22, 2015 at 11:54 PM, Jordan Shaw wrote:
> Hey Everyone,
> I'm getting an:
> "kafka.message.Inva
---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36728/
---
Review request for samza.
Repository: samza-hello-samza
Description
---
---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36727/
---
Review request for samza.
Repository: samza
Description
---
Upgraded ver
---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36692/
---
(Updated July 23, 2015, 11:23 a.m.)
Review request for samza.
Repository: sam
---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36692/
---
(Updated July 23, 2015, 11:16 a.m.)
Review request for samza.
Changes
---