...something to do with the topic that it listens to; still,
"sometimes" I can see a result.
What I want is to get the last value of that topic.
Kind Regards,
Furkan KAMACI
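
One common way to read just the last value of a topic is to assign the
partition directly, seek to the end, and step back one offset before polling.
A minimal sketch using the plain consumer API; the topic name, partition, and
broker address below are assumptions:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
props.put("key.deserializer",
    "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer",
    "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
TopicPartition tp = new TopicPartition("qps-output", 0); // hypothetical topic
consumer.assign(Collections.singletonList(tp));

consumer.seekToEnd(Collections.singletonList(tp));
long end = consumer.position(tp);
if (end > 0) {
    consumer.seek(tp, end - 1); // step back to the last written offset
    for (ConsumerRecord<String, String> record : consumer.poll(1000)) {
        System.out.printf("last value = %s%n", record.value());
    }
}
consumer.close();

Because the partition is assigned manually, no consumer group rebalancing is
involved, so this works as a one-off read.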
On Thu, Nov 3, 2016 at 1:36 PM, Furkan KAMACI
wrote:
> Hi Matthias,
>
> Thanks for the response. I stream output as follows:
ConsumerRecords<String, String> records = consumer.poll(1);
for (ConsumerRecord<String, String> record : records) {
    System.out.printf("Connected! offset = %d, key = %s, value = %s%n",
        record.offset(), record.key(), record.value());
}
I can see that there is data when I check the streamed topic (qps-...).
1) Does it calculate the last hour or everything from the beginning, because
I've set it to earliest?
2) Sometimes it gets reset and the numbers start from 1 again. What can be
the reason for that?
Kind Regards,
Furkan KAMACI
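
For reference, "earliest" here is the consumer's auto.offset.reset setting: a
fresh consumer group starts from the beginning of the topic, while an existing
group resumes from its committed offsets. A minimal sketch of where it is set
for a Streams application; the application id and broker address are
placeholders:

import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.streams.StreamsConfig;

Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "qps-app");           // placeholder
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
// Only takes effect when the group has no committed offsets yet:
props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");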
Congrats!
On Mon, Oct 31, 2016 at 8:30 PM, Becket Qin wrote:
> Thanks everyone! It is really awesome to be working with you on Kafka!!
>
> On Mon, Oct 31, 2016 at 11:26 AM, Jun Rao wrote:
>
> > Congratulations, Jiangjie. Thanks for all your contributions to Kafka.
> >
> > Jun
> >
> > On Mon, Oc...
> ...before you start your Kafka Streams application.
>
>
> -Matthias
>
> On 10/18/16 3:34 PM, Furkan KAMACI wrote:
> > Sorry about the concurrent questions. I tried the code below, didn't get
> > any error, but the output topic wasn't created:
> >
> >
> ...topic (the .to(...) method has multiple overloads)
>
> By default, each topic read/write operation uses the Serdes from the
> streams config. If your data has a different type, you need to provide
> appropriate Serdes for those operators.
>
>
> -Matthias
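
To illustrate the overloads Matthias mentions: .to(topic) uses the default
Serdes from the streams config, while another overload takes explicit Serdes.
A rough sketch against the 0.10.x DSL; the topic names and value type are
assumptions:

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

KStreamBuilder builder = new KStreamBuilder();
KStream<String, Long> counts =
    builder.stream(Serdes.String(), Serdes.Long(), "qps-input"); // hypothetical topic

counts.to("qps-output");                                 // default Serdes from config
counts.to(Serdes.String(), Serdes.Long(), "qps-output"); // explicit Serdes overload

The explicit overload is what's needed when the stream's types differ from the
configured defaults, e.g. Long values after a count.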
>
> On 10/18/16 2:01 PM, Fu...
...ning*
*28952314828122*
*28988681653726*
*29080089383233*
I know that I'm missing something, but I couldn't find it.
Kind Regards,
Furkan KAMACI
On Tue, Oct 18, 2016 at 10:34 PM, Matthias J. Sax
wrote:
> ...
Hi Matthias,
Thanks for your detailed answer. By the way, I couldn't find "KGroupedStream"
in version 0.10.0.1?
Kind Regards,
Furkan KAMACI
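
For context, KGroupedStream only appeared in 0.10.1.0; on 0.10.0.x the
windowed count is invoked directly on KStream. A rough sketch under that
assumption, with hypothetical topic, window, and store names:

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.TimeWindows;
import org.apache.kafka.streams.kstream.Windowed;

KStreamBuilder builder = new KStreamBuilder();
KStream<String, String> queries =
    builder.stream(Serdes.String(), Serdes.String(), "queries");

// 0.10.0.x style: windows carry a name, and countByKey is on KStream itself.
KTable<Windowed<String>, Long> counts = queries.countByKey(
    TimeWindows.of("qps-window", 60 * 1000L), Serdes.String());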
On Tue, Oct 18, 2016 at 8:41 PM, Matthias J. Sax
wrote:
>
> Hi,
...Kafka
Streams or Spark Streaming. My choice is to use Kafka Streams.
For the last 1 hour, or since the beginning, I have to calculate the queries
per second. How can I make such an aggregation in Kafka Streams?
Kind Regards,
Furkan KAMACI
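
One way to approach the QPS question, sketched against the 0.10.1+ DSL
(KGroupedStream): count events per key in a fixed time window, then divide the
count by the window length in seconds. The topic, store name, and keying below
are assumptions:

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.TimeWindows;
import org.apache.kafka.streams.kstream.Windowed;

KStreamBuilder builder = new KStreamBuilder();
KStream<String, String> queries =
    builder.stream(Serdes.String(), Serdes.String(), "queries");

// Count queries per key in 1-hour windows; QPS over the hour = count / 3600.0.
KTable<Windowed<String>, Long> counts = queries
    .groupByKey()
    .count(TimeWindows.of(60 * 60 * 1000L), "qps-store");

Counting everything since the beginning instead would simply be
groupByKey().count("total-store"), without a window.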