Hi,
We are trying to do a test using states, but we have not been able to
achieve our desired result. Basically, we have a data stream with data such
as [{"id":"11","value":123}] and we want to calculate the sum of all
values, grouping by ID. We were able to achieve this using windows, but not
with states.
> .keyBy(_._1)
> .mapWithState((in: (String, Int), sum: Option[Int]) => {
>   val newSum = in._2 + sum.getOrElse(0)
>   ( (in._1, newSum), Some(newSum) )
> })
>
>
> Does that help?
>
> Thanks also for pointing out the error in the sample code...
>
> Greetings
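The per-key running sum that the mapWithState snippet above maintains can be simulated without Flink. The following plain Java sketch (class and method names are illustrative, not Flink API) keeps an explicit per-key state map, standing in for the `Option[Int]` state, and emits a partial sum for every incoming record:

```java
import java.util.*;

public class RunningSumDemo {
    // Simulates the per-key state that mapWithState keeps: for each
    // incoming (id, value) record, emit (id, runningSum) and update state.
    static List<Map.Entry<String, Integer>> process(List<Map.Entry<String, Integer>> stream) {
        Map<String, Integer> state = new HashMap<>();      // stands in for Option[Int] per key
        List<Map.Entry<String, Integer>> out = new ArrayList<>();
        for (Map.Entry<String, Integer> in : stream) {
            int newSum = in.getValue() + state.getOrDefault(in.getKey(), 0);
            state.put(in.getKey(), newSum);                // Some(newSum)
            out.add(new AbstractMap.SimpleEntry<>(in.getKey(), newSum));
        }
        return out;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> stream = List.of(
            new AbstractMap.SimpleEntry<>("11", 123),
            new AbstractMap.SimpleEntry<>("11", 7),
            new AbstractMap.SimpleEntry<>("12", 5));
        // a partial sum is emitted for every record
        System.out.println(process(stream));  // [11=123, 11=130, 12=5]
    }
}
```

Note that, exactly as the thread discusses, this emits one updated sum per record rather than a single final sum per key.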
windows.
> >
> > If you do not want the partial sums, but only the final sum, you need to
> > define the window in which the sum is computed. At the end of that window,
> > the value is emitted. The window can be based on time, counts, or other
> > measures.
> >
> > Greetings
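The window behavior described in the reply above, buffering values and emitting a single sum per key when the window closes, can be sketched without Flink. This plain Java sketch (names and the count-based trigger are illustrative assumptions, not Flink API) fires once a key's buffer reaches a fixed count:

```java
import java.util.*;

public class WindowedSumDemo {
    // Sketch of a count-based window: buffer values per key and emit a
    // single (key, sum) once the window holds windowSize elements.
    static List<Map.Entry<String, Integer>> windowedSum(
            List<Map.Entry<String, Integer>> stream, int windowSize) {
        Map<String, List<Integer>> window = new HashMap<>();
        List<Map.Entry<String, Integer>> out = new ArrayList<>();
        for (Map.Entry<String, Integer> in : stream) {
            List<Integer> buf = window.computeIfAbsent(in.getKey(), k -> new ArrayList<>());
            buf.add(in.getValue());
            if (buf.size() == windowSize) {                 // window fires
                int sum = buf.stream().mapToInt(Integer::intValue).sum();
                out.add(new AbstractMap.SimpleEntry<>(in.getKey(), sum));
                buf.clear();                                // start the next window
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> stream = List.of(
            new AbstractMap.SimpleEntry<>("11", 1),
            new AbstractMap.SimpleEntry<>("11", 2),
            new AbstractMap.SimpleEntry<>("12", 10),
            new AbstractMap.SimpleEntry<>("12", 20));
        // one sum per key per window, no partial sums
        System.out.println(windowedSum(stream, 2));  // [11=3, 12=30]
    }
}
```

Unlike the running-sum variant, nothing is emitted until a window completes, which is the distinction the reply is making.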
Hi,
We are using the sink for ElasticSearch and when we try to run our job we
get the following exception:
java.lang.ExceptionInInitializerError
Caused by: java.lang.IllegalArgumentException: An SPI class of type
org.apache.lucene.codecs.Codec with name 'Lucene410' does not exist. You
need to add
> Aljoscha
> > On 12 Jan 2016, at 11:55, Lopez, Javier wrote:
> >
> > Hi,
> >
> > We are using the sink for ElasticSearch and when we try to run our job
> we get the following exception:
> >
> > java.lang.ExceptionInInitializerError
Hi guys,
I'm running a small test with the SNAPSHOT version in order to be able to
use Kafka 0.9 and I'm getting the following error:
cannot access org.apache.flink.api.java.operators.Keys
[ERROR] class file for org.apache.flink.api.java.operators.Keys not found
The code I'm using is as follows:
this is not the problem, I would try to update all Flink dependencies.
>
> Cheers, Fabian
>
> [1]
> https://cwiki.apache.org/confluence/display/FLINK/Maven+artifact+names+suffixed+with+Scala+version
>
> 2016-02-15 10:54 GMT+01:00 Lopez, Javier :
>
>> Hi guys,
>>
Hi guys,
We are using Flink 1.0-SNAPSHOT with Kafka 0.9 Consumer and we have not
been able to retrieve data from our Kafka Cluster. The DEBUG data reports
the following:
10:53:24,365 DEBUG org.apache.kafka.clients.NetworkClient
- Sending metadata request ClientRequest(expectResponse=true,
ca
properties.put("enable.auto.commit", "true");
properties.put("auto.commit.interval.ms", "1000");
properties.put("session.timeout.ms", "3");
properties.put("key.deserializer",
"org.apache.kafka.common.serialization.StringDeserializer");
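A minimal sketch of the consumer configuration from the excerpt above, collected into a plain Java Properties object. The bootstrap.servers and group.id values are placeholders (they are not in the excerpt), and the 30000 ms session timeout is an assumption: the excerpt shows session.timeout.ms = "3", i.e. 3 milliseconds, which is far below what Kafka brokers typically accept and may itself be a source of trouble.

```java
import java.util.Properties;

public class ConsumerProps {
    // Consumer configuration from the mail excerpt, plus placeholder
    // connection settings. session.timeout.ms is assumed to be 30000;
    // the excerpt's literal value "3" would mean 3 milliseconds.
    static Properties build() {
        Properties p = new Properties();
        p.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        p.setProperty("group.id", "test");                    // placeholder
        p.setProperty("enable.auto.commit", "true");
        p.setProperty("auto.commit.interval.ms", "1000");
        p.setProperty("session.timeout.ms", "30000");         // assumed; excerpt shows "3"
        p.setProperty("key.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        p.setProperty("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("session.timeout.ms"));
    }
}
```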
Hi Robert,
After we restarted our Kafka / Zookeeper cluster, the consumer worked. Some
of our topics had some problems. Flink's consumer for Kafka 0.9 works
as expected.
Thanks!
On 19 February 2016 at 12:03, Lopez, Javier wrote:
> Hi, these are the properties:
>
> Propert