Hi,
you don't need the BlockedEventState class; you should be able to just do
this:
private transient ValueState blockedRoads;

@Override
public void open(final org.apache.flink.configuration.Configuration parameters) throws Exception {
    final ValueStateDescri
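A minimal sketch, not the original poster's code, of what the completed open() could look like: it assumes the state value is a Double (as mentioned in the question) and that the function is a RichCoFlatMapFunction on the keyed connected stream; the class, field, and descriptor names are illustrative.

    import org.apache.flink.api.common.state.ValueState;
    import org.apache.flink.api.common.state.ValueStateDescriptor;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.functions.co.RichCoFlatMapFunction;
    import org.apache.flink.util.Collector;

    public class BlockedRoadFunction extends RichCoFlatMapFunction<Double, Double, Double> {

        private transient ValueState<Double> blockedRoads;

        @Override
        public void open(final Configuration parameters) throws Exception {
            // Descriptor name and default value are placeholders; Double is
            // assumed based on the original question.
            final ValueStateDescriptor<Double> descriptor =
                    new ValueStateDescriptor<>("blockedRoads", Double.class, 0.0);
            blockedRoads = getRuntimeContext().getState(descriptor);
        }

        @Override
        public void flatMap1(Double value, Collector<Double> out) throws Exception {
            // remember the latest value for the current key
            blockedRoads.update(value);
        }

        @Override
        public void flatMap2(Double value, Collector<Double> out) throws Exception {
            // combine the incoming value with the stored state for this key
            out.collect(blockedRoads.value() + value);
        }
    }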
On Mon, Sep 12, 2016 at 8:35 PM, Chesnay Schepler
wrote:
> Hello,
>
> you cannot pass a configuration in the Streaming API. This way of
> configuration is more of a relic from past times.
>
> The common way to configure a function is to pass the parameters
> through the constructor and store
Hello,
the JDBC Sink completely ignores the taskNumber and parallelism.
Regards,
Chesnay
On 12.09.2016 08:41, Swapnil Chougule wrote:
Hi Team,
I want to know how taskNumber & numTasks help in opening a DB connection
in Flink's JDBCOutputFormat open(). I checked the docs, where it says:
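For background, those two parameters come from the generic OutputFormat.open(int taskNumber, int numTasks) contract: each parallel instance receives its 0-based index and the total parallelism of the sink. Below is a hedged sketch, not JDBCOutputFormat's code, of a custom OutputFormat that does use them, here to give each parallel instance its own output file; the class name, path, and record type are made up for illustration.

    import java.io.IOException;
    import java.io.PrintWriter;
    import org.apache.flink.api.common.io.OutputFormat;
    import org.apache.flink.configuration.Configuration;

    public class PerTaskFileOutputFormat implements OutputFormat<String> {

        private transient PrintWriter writer;

        @Override
        public void configure(Configuration parameters) {
            // nothing to configure in this sketch
        }

        @Override
        public void open(int taskNumber, int numTasks) throws IOException {
            // taskNumber is the index of this parallel instance,
            // numTasks is the total parallelism of the sink
            writer = new PrintWriter("/tmp/output-" + taskNumber + "-of-" + numTasks + ".txt");
        }

        @Override
        public void writeRecord(String record) throws IOException {
            writer.println(record);
        }

        @Override
        public void close() throws IOException {
            writer.close();
        }
    }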
Hello,
you cannot pass a configuration in the Streaming API. This way of
configuration is more of a relic from past times.
The common way to configure a function is to pass the parameters
through the constructor and store the values in a field.
Regards,
Chesnay
On 12.09.2016 18:27, Lui
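A minimal sketch of this constructor-based approach, not code from the thread; the class name and the two string fields are illustrative, mirroring the two strings mentioned in the question.

    import org.apache.flink.api.common.functions.RichMapFunction;

    public class MyMapper extends RichMapFunction<String, String> {

        // Strings are serializable, so they can be set on the client
        // and shipped to the task managers together with the function.
        private final String prefix;
        private final String suffix;

        public MyMapper(String prefix, String suffix) {
            this.prefix = prefix;
            this.suffix = suffix;
        }

        @Override
        public String map(String value) throws Exception {
            return prefix + value + suffix;
        }
    }

    // usage:
    // stream.map(new MyMapper("start-", "-end"));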
Hi,
is there a way to test window jobs?
I would like to build the job, give some inputs, "fast forward" to the next
window, collect the results and assert them.
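One possible approach, offered here only as a hedged sketch and not as an answer from the thread: run the job on event time with a bounded input. When the input is exhausted, a final watermark is emitted, all pending event-time windows fire, and the results can be collected into a static list and asserted after execute() returns. The element type, timestamp assigner, sink, and expected values below are all hypothetical test fixtures.

    import java.util.ArrayList;
    import java.util.List;

    import org.apache.flink.api.java.tuple.Tuple3;
    import org.apache.flink.streaming.api.TimeCharacteristic;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.api.functions.AssignerWithPunctuatedWatermarks;
    import org.apache.flink.streaming.api.functions.sink.SinkFunction;
    import org.apache.flink.streaming.api.watermark.Watermark;
    import org.apache.flink.streaming.api.windowing.time.Time;

    public class WindowJobTest {

        // collected on the task side; works for a local test where
        // everything runs in a single JVM
        public static final List<Tuple3<String, Long, Integer>> RESULTS = new ArrayList<>();

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
            env.setParallelism(1);

            env.fromElements(
                    Tuple3.of("a", 1_000L, 1),   // (key, event timestamp, value)
                    Tuple3.of("a", 2_000L, 2),
                    Tuple3.of("a", 12_000L, 5))  // falls into the next 10s window
                .assignTimestampsAndWatermarks(new AssignerWithPunctuatedWatermarks<Tuple3<String, Long, Integer>>() {
                    @Override
                    public long extractTimestamp(Tuple3<String, Long, Integer> element, long previousElementTimestamp) {
                        return element.f1;
                    }

                    @Override
                    public Watermark checkAndGetNextWatermark(Tuple3<String, Long, Integer> lastElement, long extractedTimestamp) {
                        // advance event time with every element
                        return new Watermark(extractedTimestamp);
                    }
                })
                .keyBy(0)
                .timeWindow(Time.seconds(10))
                .sum(2)
                .addSink(new SinkFunction<Tuple3<String, Long, Integer>>() {
                    @Override
                    public void invoke(Tuple3<String, Long, Integer> value) throws Exception {
                        RESULTS.add(value);
                    }
                });

            // When the bounded input is exhausted, a final watermark is emitted
            // and all remaining event-time windows fire before execute() returns.
            env.execute("window test");

            // assert on RESULTS here: two results for key "a", with sums 3 and 5
        }
    }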
Hi! I'm trying to pass Configuration parameters to a RichMapFunction in
Flink streaming and I can't find the way to do it.
I need to pass two strings to the MapFunction and I was getting a
serialization error, so I tried RichMapFunction and open() but I can't find
a way to set the parameters I
Hi Fabian,
I'm coding to check if your proposal works and hit an issue with a
ClassCastException.
// Here is my Value that has state information: an implementation of
// my value state... where the key is a Double value... on connected stream ks2
public class BlockedEventState im
You can just use plain JDBC. Just keep in mind that the classes will be
serialized and sent through the cluster. So you probably want to
initialize all the non-serializable database access objects in the open
method itself (as opposed to the constructor, which runs on the client side).
Cheers,
Konstantin
On 12.
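A sketch of what that can look like, not code from the thread: a transient JDBC Connection held in a RichMapFunction, opened in open() and released in close(). The JDBC URL, credentials, table, and query are placeholders, and the driver is assumed to be on the classpath.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    import org.apache.flink.api.common.functions.RichMapFunction;
    import org.apache.flink.configuration.Configuration;

    public class DatabaseLookupMapper extends RichMapFunction<String, String> {

        // Connection and statement are not serializable, so they are created
        // in open() on the task manager rather than in the constructor.
        private transient Connection connection;
        private transient PreparedStatement lookup;

        @Override
        public void open(Configuration parameters) throws Exception {
            connection = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/mydb", "user", "password");
            lookup = connection.prepareStatement("SELECT name FROM things WHERE id = ?");
        }

        @Override
        public String map(String id) throws Exception {
            lookup.setString(1, id);
            try (ResultSet rs = lookup.executeQuery()) {
                return rs.next() ? rs.getString(1) : null;
            }
        }

        @Override
        public void close() throws Exception {
            if (lookup != null) {
                lookup.close();
            }
            if (connection != null) {
                connection.close();
            }
        }
    }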
Thank you Konstantin, the amount of data I have to load into memory will be
very small, so that should be alright.
When opening and querying the database, would I use any sort of Flink magic
or just do plain JDBC?
I read about the JDBCInput concept which one could use with the DataSet API
and was wo
Thanks!
On Mon, Sep 12, 2016, at 10:18, Tzu-Li (Gordon) Tai wrote:
> Hi!
>
> Yes, you can set custom operator names by calling `.name(…)` on
> DataStreams after a transformation.
> For example, `.addSource(…).map(...).name(…)`. This name will be used
> for visualization on the dashboard, and also
Hi!
Yes, you can set custom operator names by calling `.name(…)` on DataStreams
after a transformation.
For example, `.addSource(…).map(...).name(…)`. This name will be used for
visualization on the dashboard, and also for logging.
Regards,
Gordon
On September 12, 2016 at 3:44:58 PM, Bart van
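A short illustrative example of the naming call; the source, function, and names here are made up. Without explicit names, the dashboard falls back to a generated description of the operator, which is what produces the long TriggerWindow(...) labels quoted below.

    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class NamedOperatorsJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            env.socketTextStream("localhost", 9999).name("socket source")
               .map(new MapFunction<String, String>() {
                   @Override
                   public String map(String value) {
                       return value.toUpperCase();
                   }
               }).name("to upper case")
               .print().name("stdout sink");

            env.execute("job with named operators");
        }
    }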
Hi all
I'm using Flink 1.1 with a streaming job, consisting of a few maps and a
few aggregations.
In the web dashboard for the job I see subtask names like:
TriggerWindow(SlidingEventTimeWindows(60, 5000),
FoldingStateDescriptor{serializer=null, initialValue=Res(0,List()),
foldFunction=org.ap