-- but in your case, it would be assigned multiple times, depending
> > on the overall parallelism of your Streams app).
> >
> > For the time-based input, you can just read it regularly, and for each
> > record you do a look-up in the HashTable to compute the join.
>
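A minimal sketch of the lookup-based join described above, assuming the
small side fits in memory and has already been loaded (all names are
illustrative):

    import java.util.HashMap;
    import java.util.Map;

    public class HashJoin {
        // Small side, fully loaded into memory beforehand.
        private final Map<String, String> lookupTable = new HashMap<>();

        // Probe the table for each record of the time-based input;
        // returns the joined value, or null if there is no match.
        public String join(String key, String value) {
            String other = lookupTable.get(key);
            return (other == null) ? null : value + "," + other;
        }
    }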
that spills to disk). Of
> course, the disk usage will also be huge. Eventually, your disk might
> also become too small...
>
> Can you clarify why you want to join everything? This does not sound
> like a good idea. Very large windows are handleable, but "infinite"
> windows are not.
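For comparison, a very large but still finite window can be expressed
directly in the DSL. A sketch, assuming default string serdes and a
recent API (method names vary across Kafka Streams versions; topic names
are illustrative):

    import java.time.Duration;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.JoinWindows;
    import org.apache.kafka.streams.kstream.KStream;

    public class LargeWindowJoin {
        public static void build(StreamsBuilder builder) {
            KStream<String, String> left = builder.stream("left-topic");
            KStream<String, String> right = builder.stream("right-topic");
            // A one-year window: large, but bounded, so old window state
            // can eventually be purged.
            left.join(right,
                      (v1, v2) -> v1 + "," + v2,
                      JoinWindows.of(Duration.ofDays(365)))
                .to("joined-topic");
        }
    }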
Hi,
I've been trying to figure out how to solve one of my business processes
with Kafka Streams, so far without success. I hope someone can help me.
I am reading events like the following from two topics (I'll simplify the
problem at this point):
ObjectX
Key: String
Value: String
ObjectY
Key: String
Value: String
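Reading both topics into the DSL would look roughly like this (topic
names are illustrative, string serdes assumed):

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.KStream;

    public class TwoTopics {
        public static void build(StreamsBuilder builder) {
            KStream<String, String> xStream = builder.stream(
                "object-x", Consumed.with(Serdes.String(), Serdes.String()));
            KStream<String, String> yStream = builder.stream(
                "object-y", Consumed.with(Serdes.String(), Serdes.String()));
            // Both streams now carry <String, String> records and can be
            // joined or processed further.
        }
    }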
> >>>> unexpected behavior.
> >>>>
> >>>> 2) log compaction is a Kafka broker feature that Kafka Streams
> >>>> leverages:
> >>>>
> >>>> https://cwiki.apache.org/confl
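For illustration, a compacted topic can be created from the Java
AdminClient; a sketch (topic name and broker address are assumptions):

    import java.util.Collections;
    import java.util.Map;
    import org.apache.kafka.clients.admin.Admin;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;
    import org.apache.kafka.common.config.TopicConfig;

    public class CompactedTopic {
        public static void main(String[] args) throws Exception {
            try (Admin admin = Admin.create(Map.of(
                    AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"))) {
                // cleanup.policy=compact turns on log compaction for the topic.
                NewTopic topic = new NewTopic("my-changelog", 1, (short) 1)
                    .configs(Collections.singletonMap(
                        TopicConfig.CLEANUP_POLICY_CONFIG,
                        TopicConfig.CLEANUP_POLICY_COMPACT));
                admin.createTopics(Collections.singleton(topic)).all().get();
            }
        }
    }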
Hello,
I read in the docs that Kafka Streams stores the computed aggregations in a
local embedded key-value store (RocksDB by default), i.e., Kafka Streams
provides so-called state stores. I'm wondering about the relationship
between each state store and its replicated changelog Kafka topic.
If w
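To make the relationship concrete: every named state store is backed by
exactly one changelog topic, named <application.id>-<store name>-changelog.
A sketch (store and topic names are illustrative):

    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Materialized;

    public class CountsStore {
        public static void build(StreamsBuilder builder) {
            KTable<String, Long> counts = builder
                .<String, String>stream("input-topic")
                .groupByKey()
                // Local RocksDB store named "counts-store"; Kafka Streams
                // replicates it to the changelog topic
                // "<application.id>-counts-store-changelog".
                .count(Materialized.as("counts-store"));
        }
    }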
> 1) When a new record is received from the outer / inner table, output
> (a, null) or (null, b).
>
>
> 2) The result topic is also a changelog topic; although it will be log
> compacted on the key over time, if you consume it immediately the log may
> not yet be compacted.
>
>
> Guozhang
>
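A sketch of the outer-join behavior described above, assuming two
KTable<String, String> inputs (topic names are illustrative); note that
the ValueJoiner must tolerate a null on either side:

    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.KTable;

    public class OuterJoinExample {
        public static void build(StreamsBuilder builder) {
            KTable<String, String> left = builder.table("left-topic");
            KTable<String, String> right = builder.table("right-topic");
            // A record from either side fires the joiner with the other
            // side possibly null, producing (a, null) or (null, b).
            KTable<String, String> joined =
                left.outerJoin(right, (a, b) -> a + "|" + b);
        }
    }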
Hello,
+1, I hit the same problem when I tried it. However, I only dug into the
code examples, so I can't give you a solution.
2016-04-19 17:20 GMT+02:00 Ramanan, Buvana (Nokia - US) <
buvana.rama...@nokia.com>:
> Hello,
>
> I went through the QuickStart instructions at:
> http://docs.confluent.io/2.1.0-alpha1/stream
> 1) The streams are really "changelog" streams, hence you
> should create them as KTables and do a KTable-KTable join.
>
> 2) Could you elaborate on "achieving this"? What behavior do you require
> in the application logic?
>
>
> Guozhang
>
>
> On Thu, Apr 14,
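In code, reading each topic as a changelog and joining would look roughly
like this (topic names are illustrative):

    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.KTable;

    public class TableJoin {
        public static void build(StreamsBuilder builder) {
            // Each topic is interpreted as a changelog: the latest value
            // per key wins.
            KTable<String, String> t1 = builder.table("topic-one");
            KTable<String, String> t2 = builder.table("topic-two");
            KTable<String, String> joined =
                t1.join(t2, (v1, v2) -> v1 + "," + v2);
        }
    }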
Hi,
I am a newbie to Kafka Streams, and I am trying to use it to solve a
particular use case. Let me explain.
I have two sources of data, both shaped like this:
Key (string)
DateTime (hourly granularity)
Value
I need to join the two sources by key and date (hour of day) to obtain:
Key (string)
DateTime
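One way to express the hourly join, assuming the DateTime is embedded in
the record value, is to re-key both streams on (key, hour) and then use a
windowed join; a sketch with illustrative names and formats:

    import java.time.Duration;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.JoinWindows;
    import org.apache.kafka.streams.kstream.KStream;

    public class HourlyJoin {
        // Illustrative helper: derive the hour bucket from the value,
        // assuming it starts with an ISO timestamp like "2016-04-19T17...".
        static String extractHour(String value) {
            return value.substring(0, 13);
        }

        public static void build(StreamsBuilder builder) {
            KStream<String, String> a = builder.stream("source-a");
            KStream<String, String> b = builder.stream("source-b");

            // Re-key on (key, hour) so only records with the same key AND
            // the same hour can meet in the join.
            KStream<String, String> aByHour =
                a.selectKey((k, v) -> k + "@" + extractHour(v));
            KStream<String, String> bByHour =
                b.selectKey((k, v) -> k + "@" + extractHour(v));

            // The window only has to cover arrival skew between the two
            // sources; the hour itself is already part of the key.
            aByHour.join(bByHour,
                         (v1, v2) -> v1 + "," + v2,
                         JoinWindows.of(Duration.ofHours(1)))
                   .to("joined-by-hour");
        }
    }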