It turns out that the Flink documentation has a whole section about this
problem: https://goo.gl/DMs9Dx
Now let's find the solution!
Bart
> concrete idea why that happens in your case,
> but maybe this info helps to track down the problem.
>
> Best,
> Stefan
>
> > On 22.06.2017 at 11:09, Bart van Deenen wrote:
> >
> > Hi All
> >
> > I have a simple avro file
> >
>
Hi All
I have a simple avro file
{"namespace": "com.kpn.datalab.schemas.omnicrm.contact_history.v1",
"type": "record",
"name": "contactHistory",
"fields": [
{"name": "events", "type": {"type":"array",
"items": "bytes"}},
{"name": "krn", "type": "string"}
]
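For reference, reading a file with this schema into Flink as GenericRecords could
look roughly like the sketch below. This is not taken from the thread: the path is
made up, and the AvroInputFormat import moved from org.apache.flink.api.java.io to
org.apache.flink.formats.avro in later Flink releases.

import org.apache.avro.generic.GenericRecord
import org.apache.flink.api.java.io.AvroInputFormat
import org.apache.flink.api.scala._
import org.apache.flink.core.fs.Path

object ReadContactHistory {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Hypothetical location; the real file path is not given in the thread.
    val format = new AvroInputFormat[GenericRecord](
      new Path("file:///tmp/contact_history.avro"), classOf[GenericRecord])

    // Each record exposes the "krn" and "events" fields from the schema above.
    val records: DataSet[GenericRecord] = env.createInput(format)
    records.map(r => r.get("krn").toString).first(10).print()
  }
}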
the dashboard, and also for logging.
>
> Regards,
> Gordon
>
>
>
> On September 12, 2016 at 3:44:58 PM, Bart van Deenen
> (bartvandee...@fastmail.fm) wrote:
>>
>>
>> Hi all
>>
>> I'm using Flink 1.1 with a streaming job, consisting of a few maps
ction=org.apache.flink.streaming.api.scala.function.util.ScalaFoldFunction@5c42d2b7},
EventTimeTrigger(), WindowedStream.fold(WindowedStream.java:238)) ->
Filter -> Map
Is it possible to give this a more human-readable name from my job
program?
Greetings
Bart van Deenen
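For what it's worth, the Scala DataStream API lets you attach a display name to an
operator with .name(...), which is what shows up in the web dashboard and in logs.
A minimal sketch using the 1.1-era fold, with a made-up pipeline rather than the
actual job:

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

object NamedOperatorsJob {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    env.socketTextStream("localhost", 9999)
      .map(line => (line, 1))
      .name("tokenize lines")          // shows up in the dashboard instead of "Map"
      .keyBy(_._1)
      .timeWindow(Time.seconds(10))
      .fold(("", 0)) { (acc, in) => (in._1, acc._2 + in._2) }
      .name("count per line")          // replaces the long TriggerWindow(...) label
      .filter(_._2 > 1)
      .name("repeated lines only")
      .print()

    env.execute("job with named operators")
  }
}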
that work?
>>
>> I apologize for just providing pointers to the docs in my reply but
>> checkpointing deserves a good explanation and I feel the docs get the
>> job done pretty well. I will gladly help you if you have any doubt.
>>
>> Hope I've been of some h
Hi all
I have a datastream transformation that updates a mutable
HashMap that exists outside of the stream.
So it's something like
object FlinkJob {
  val uriLookup = mutable.HashMap.empty[String, Int]

  def main(args: Array[String]) {
    val stream: DataStream = ...
    stream.keyBy
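(For completeness, a hypothetical reconstruction of the setup described above;
the source and the map updates are made up. Note that the map lives in whatever
JVM runs the task, so on a cluster with several TaskManagers it is not one shared
map; it only behaves as a single global map when everything runs in one JVM.)

import scala.collection.mutable
import org.apache.flink.streaming.api.scala._

object FlinkJob {
  // Mutable state outside the stream.
  val uriLookup = mutable.HashMap.empty[String, Int]

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Hypothetical source; the real one is not shown in the thread.
    val stream: DataStream[String] = env.fromElements("/home", "/cart", "/home")

    stream
      .keyBy(uri => uri)
      .map { uri =>
        uriLookup.synchronized {
          uriLookup(uri) = uriLookup.getOrElse(uri, 0) + 1
        }
        uri
      }
      .print()

    env.execute("FlinkJob")
  }
}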
Thanks
Copying jars into the lib directory works fine.
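(For anyone preferring the fat-jar route Balaji mentions below, a minimal
sbt-assembly sketch; the group/artifact coordinates are real, but the versions
are only illustrative:)

// build.sbt -- a sketch assuming sbt with the sbt-assembly plugin
// (project/plugins.sbt needs something like:
//  addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3"))
name := "flink-streaming-job"

libraryDependencies ++= Seq(
  // The Flink runtime is already on the server's classpath, so mark it
  // "provided" to keep it out of the fat jar.
  "org.apache.flink" %% "flink-streaming-scala" % "1.0.0" % "provided",
  // Jackson goes into the fat jar so the job does not rely on the server's lib folder.
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.7.3"
)

Running sbt assembly then produces a single jar that can be submitted to the cluster.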
On Tue, Mar 29, 2016, at 13:34, Balaji Rajagopalan wrote:
> You will have to include dependent jackson jar in flink server lib
> folder, or create a fat jar.
>
> balaji
>
> On Tue, Mar 29, 2016 at 4:47 PM, Bart van Deen
Hi all
I've successfully built a Flink streaming job, and it runs beautifully in
my IntelliJ IDE, with a Flink instance started on the fly. The job eats
Kafka events and outputs to file. All the I/O is JSON-encoded with
Jackson.
But I'm having trouble with deploying the jar on a Flink server
(ve
oded in the
> tuple itself, thus, it is independent of the wall-clock time).
>
> However, be aware that the order of tuples *within a window* might
> change!
>
> Thus, the timestamp of the "most recent event in the window" might
> change...
>
>
> -Matthias
t the same events in a Window, or are the limits of a window
somehow dependent on the real time of the run.
The windows I'm using are two sliding timeWindows and one timeWindowAll.
Thanks for any answers
Bart van Deenen
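For reference, a minimal event-time sketch of the situation discussed above
(the event type, values and source are made up): with timestamps taken from the
events themselves, window membership does not depend on the wall-clock time of
the run.

import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

// Hypothetical event type; the real one is not shown in the thread.
case class Click(sessionKey: String, timestamp: Long, page: String)

object EventTimeWindows {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

    val clicks = env
      .fromElements(
        Click("a", 1000L, "/home"),
        Click("a", 2500L, "/cart"),
        Click("b", 3000L, "/home"))
      // Window membership is decided by this per-event timestamp,
      // not by when the job happens to run.
      .assignAscendingTimestamps(_.timestamp)

    clicks
      .keyBy(_.sessionKey)
      .timeWindow(Time.seconds(10), Time.seconds(5)) // sliding window: 10s size, 5s slide
      .fold(0) { (count, _) => count + 1 }
      .print()

    env.execute("event-time sliding window example")
  }
}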
terable
> foreach { case (l, r) => collector.collect(l + r) } } result.print()
> env.execute("Flink Scala API Skeleton")
>
> I hope this helps you.
> Cheers,
> Till
>
>
> On Tue, Mar 22, 2016 at 12:40 PM, Bart van Deenen
> wrote:
>>
>> Hi all
>
pages)
})
out.collect(agg)
})
Pheew.
--
Bart van Deenen
bartvandee...@fastmail.fm
On Tue, Mar 22, 2016, at 12:40, Bart van Deenen wrote:
>
> Hi all
>
> I'm using 1.0, and have all my data nicely bundled in one allWindow, but
> I don't understand the syntax
    Collector out) throws Exception {
        int sum = 0;
        for (value t: values) {
            sum += t.f1;
        }
        out.collect(new Integer(sum));
    }
});
Help very appreciated!
Greetings
--
Bart van Deenen
bartvandee...@fastmail.fm
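For reference, an apply on an allWindow in the Scala API might look roughly like
the sketch below (the input and window size are made up): the function receives
the window, an Iterable over all of its elements, and a Collector for the results.

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.streaming.api.windowing.windows.TimeWindow
import org.apache.flink.util.Collector

object AllWindowSum {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Hypothetical input; the real stream is not shown in the thread.
    val counts: DataStream[(String, Int)] =
      env.fromElements(("a", 1), ("b", 2), ("a", 3))

    counts
      .timeWindowAll(Time.seconds(5))
      .apply { (window: TimeWindow, values: Iterable[(String, Int)], out: Collector[Int]) =>
        var sum = 0
        values.foreach { case (_, n) => sum += n }
        out.collect(sum)
      }
      .print()

    env.execute("allWindow apply example")
  }
}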
If I do a fold on a KeyedStream, I aggregate events for such-and-such
key.
My question is, what happens with the aggregate (and its key) when
events for this key stop coming?
My keys are browser session keys, and are virtually limitless.
Ideally, I'd like to send some sort of purge event on key
tural than some sort of micro batching.
Thanks
Bart
--
Bart van Deenen
bartvandee...@fastmail.fm
On Fri, Mar 18, 2016, at 12:16, Fabian Hueske wrote:
> Yes, that's possible.
>
> You have to implement a custom trigger for that. The
> Trigger.onElement() method will
Hi Fabian
So you're saying that with a windowed stream I can still emit a folded
aggregate for each event as it comes in? I didn't realize that; I
thought that windows were a sort of micro-batching.
I'll go read the link you posted
Thanks
--
Bart van Deenen
bartvandee...@fastm
Ok, thanks! I'll do it that way, with a custom trigger and a fold per
key.
Bart
--
Bart van Deenen
bartvandee...@fastmail.fm
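For reference, a per-element trigger along the lines Fabian describes could look
roughly like the sketch below (written against the 1.0-era Trigger API; details
differ slightly in later Flink versions). Flink's built-in CountTrigger.of(1)
does essentially the same thing.

import org.apache.flink.streaming.api.windowing.triggers.{Trigger, TriggerResult}
import org.apache.flink.streaming.api.windowing.windows.GlobalWindow

// Fires for every element, so a fold over a GlobalWindow emits an
// updated per-key aggregate for each incoming event.
class FireOnEveryElement[T] extends Trigger[T, GlobalWindow] {

  override def onElement(element: T, timestamp: Long, window: GlobalWindow,
                         ctx: Trigger.TriggerContext): TriggerResult =
    TriggerResult.FIRE

  override def onProcessingTime(time: Long, window: GlobalWindow,
                                ctx: Trigger.TriggerContext): TriggerResult =
    TriggerResult.CONTINUE

  override def onEventTime(time: Long, window: GlobalWindow,
                           ctx: Trigger.TriggerContext): TriggerResult =
    TriggerResult.CONTINUE
}

It would then be plugged in with something like
stream.keyBy(...).window(GlobalWindows.create()).trigger(new FireOnEveryElement[MyEvent]).fold(...),
where MyEvent stands for whatever type the stream carries.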
On Fri, Mar 18, 2016, at 13:31, Fabian Hueske wrote:
> The "weight" of a window depends on the function that you apply.
> If you apply a g