Hello,
I am having trouble implementing a stream-to-table join.
I have two POJOs, one representing the stream records and one the table records. The raw topic feeds the KStream and the cache topic feeds the KTable. The join is not happening: the print statement is never called.
Appreciate any pointers.
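Here is a simplified sketch of the topology (POJO serdes replaced with Strings for brevity; written against the 0.10.2-era API, so the exact types are an assumption):

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import org.apache.kafka.streams.kstream.KTable;

public class JoinDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "stream-table-join");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        KStreamBuilder builder = new KStreamBuilder();
        KStream<String, String> raw = builder.stream("raw");                   // stream side
        KTable<String, String> cache = builder.table("cache", "cache-store");  // table side

        // inner join: fires only when a raw record's (non-null) key matches a cache key
        raw.join(cache, (rawValue, cacheValue) -> rawValue + " / " + cacheValue)
           .print(); // this is the print that never gets called

        new KafkaStreams(builder, props).start();
    }
}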
Hi,
I am trying to send messages (synchronously) to a Kafka cluster (let's call it A). I get a 'Batch Expired' exception very frequently.
Also, the average time taken per send is very high, around 5 seconds.
However, with the same code, when I send messages to a different Kafka cluster B (with same netwo
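The send path is essentially the following (broker address, topic, and timeout values are anonymized placeholders). As I understand it, batches that sit in the producer's buffer longer than request.timeout.ms get expired with exactly this error, so a slow or hard-to-reach cluster A would produce this symptom:

import java.util.Properties;
import java.util.concurrent.ExecutionException;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class SyncSendDemo {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        props.put("bootstrap.servers", "clusterA:9092"); // placeholder for cluster A
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("request.timeout.ms", "30000"); // batches unsent after this long are expired

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        long start = System.currentTimeMillis();
        // blocking on get() makes the send synchronous: one round trip per record
        RecordMetadata md = producer.send(new ProducerRecord<>("test-topic", "key", "value")).get();
        System.out.println("send took " + (System.currentTimeMillis() - start) + " ms, offset " + md.offset());
        producer.close();
    }
}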
Hi all,
I'm trying to use a single message transform with Kafka Connect and running into issues.
The transformation configuration is:

"transforms": "GetAfter",
"transforms.GetAfter.type": "org.apache.kafka.connect.transforms.ExtractField$Value",
"transforms.GetAfter.field": "after",
Minglei,
The Apache mailing list does not allow file attachments.
From the "not find package" error you mentioned, I suspect your `gradle` setup has not been bootstrapped yet.
Try running "gradle" first, then `./gradlew ...`, and see if that fixes the issue.
Guozhang
On Sat, Jun 24, 2017 at 4:44 AM, 张明磊 wrote:
Hi Shekar,
Could you share a sample of your input data? More specifically, what are the key types of your input streams, and are the keys non-null? It seems the root cause is similar to the other thread you asked about on the mailing list.
Also, which Kafka Streams version are you using?
Guozhang
Hi Sameer,
From your shared logs it seems the number of "threads" is actually larger than the number of topic partitions (note that each machine seems to have many threads). More specifically, I saw at most 6 partitions for any topic, e.g.:
LIC3-43-lic3-deb-ci2-43-repartition-0
LIC3-43-l
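Streams creates roughly one task per input partition, and a thread with no task assigned just sits idle, so as a rule of thumb keep the total thread count at or below the total partition count. E.g., with 6 partitions spread over 3 instances (the instance count here is illustrative), something like:

import java.util.Properties;
import org.apache.kafka.streams.StreamsConfig;

Properties props = new Properties();
// 3 instances x 2 threads = 6 threads, one per partition
props.put(StreamsConfig.NUM_STREAM_THREADS_CONFIG, 2);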
Hello,
This question is more a request for suggestions: I am already using the plain API (Producer/Consumer) and am exploring whether the Streams or Connect API could solve my problem.
I need to perform ad hoc reads from a different server, and this is not event-driven. For example:
1) User logs in
2
Guozhang,
Thanks for responding.
The raw and cache keys are null. Both the KStream and KTable entries are JSON.
Here is the input to the cache (KTable):
{"user_name": "Joe", "location": "US", "gender": "male"}
{"user_name": "Julie", "location": "US", "gender": "female"}
{"user_name": "Kawasaki", "lo
Guozhang,
I am using version 0.10.2.1.
- Shekar
On Sun, Jun 25, 2017 at 10:36 AM, Guozhang Wang wrote:
> Hi Shekar,
>
> Could you share a sample of your input data? More specifically, what are the key
> types of your input streams, and are the keys non-null? It seems the
> root cause is similar to