How are you verifying whether it is registered?
For the sake of covering all angles: Are you certain that
createPartitionIndex is called?
On 06.07.2017 08:51, wyphao.2007 wrote:
Hi Chesnay, thank you for your reply.
The metric from the code above does not get registered at all.
On 2017-07-06 14:45, "Chesnay Schepler" wrote:
Hello,
Please provide more information as to how it is not working as expected.
Does it throw an exception or log a warning, does the metric not get
registered at all, or does the value not change?
On 06.07.2017 08:10, wyphao.2007 wrote:
Hi all,
I want to know each element's latency before it is written to Elasticsearch,
so I registered a custom metric as follows:
class CustomElasticsearchSinkFunction extends ElasticsearchSinkFunction[EventEntry] {
  private var metricGroup: Option[MetricGroup] = None
  private var latency: Long = _
  pr
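For context, one way to register such a metric is through the RuntimeContext
that the Elasticsearch connector passes to process(). A minimal Java sketch;
the EventEntry accessors, the metric name, and the index/type names are
assumptions, not taken from this thread:

import org.apache.flink.api.common.functions.RuntimeContext;
import org.apache.flink.metrics.Gauge;
import org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction;
import org.apache.flink.streaming.connectors.elasticsearch.RequestIndexer;
import org.elasticsearch.client.Requests;

public class LatencyTrackingSinkFunction implements ElasticsearchSinkFunction<EventEntry> {

    private transient boolean metricRegistered;
    private volatile long latency;

    @Override
    public void process(EventEntry element, RuntimeContext ctx, RequestIndexer indexer) {
        if (!metricRegistered) {
            // Register the gauge lazily, via the RuntimeContext handed to process().
            ctx.getMetricGroup().gauge("elementLatency", new Gauge<Long>() {
                @Override
                public Long getValue() {
                    return latency;
                }
            });
            metricRegistered = true;
        }
        latency = System.currentTimeMillis() - element.getCreateTime(); // assumed accessor
        indexer.add(Requests.indexRequest()
                .index("events")             // placeholder index name
                .type("event")               // placeholder type name
                .source(element.toJson()));  // assumed serialization helper
    }
}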
Since you’re placing jars in the lib/ folder yourself instead of packaging an
uber jar, you also need to provide the Kafka dependency jars.
It usually isn’t recommended to place dependencies in the lib/ folder.
Packaging an uber jar is the recommended approach.
Using the maven-shade-plugin, you can build such an uber jar.
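A minimal shade-plugin configuration of the usual form (a sketch; the version
and any filters vary by project and are not taken from this thread):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>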
Hi Richard,
Producing to multiple topics is treated a bit differently in the Flink Kafka
producer.
You need to set a single default target topic, and in
`KeyedSerializationSchema#getTargetTopic()` you can override the default topic
with whatever is returned. The `getTargetTopic` method is invoked for each
record that is written.
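A sketch of such a schema in Java, assuming records are Tuple3 values of
(target topic, key, payload); the class name and record layout are illustrative:

import java.nio.charset.StandardCharsets;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.streaming.util.serialization.KeyedSerializationSchema;

// Routes each (topic, key, payload) record to the topic named in its first field.
public class RoutingSerializationSchema
        implements KeyedSerializationSchema<Tuple3<String, String, String>> {

    @Override
    public byte[] serializeKey(Tuple3<String, String, String> element) {
        return element.f1.getBytes(StandardCharsets.UTF_8);
    }

    @Override
    public byte[] serializeValue(Tuple3<String, String, String> element) {
        return element.f2.getBytes(StandardCharsets.UTF_8);
    }

    @Override
    public String getTargetTopic(Tuple3<String, String, String> element) {
        // A non-null return value here overrides the producer's default topic.
        return element.f0;
    }
}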
When using FlinkKafkaConsumer010 to subscribe to multiple topics, as in:

List<String> topics = Arrays.asList("test1", "test2");
DataStream<String> stream = env.addSource(new FlinkKafkaConsumer010<>(topics,
    new SimpleStringSchema(), properties));

how do I get the topic name for each record in my SinkFunction, i.e. in
stream.addSink(...)?
Thanks,
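No answer appears in this excerpt; one common approach is to attach the source
topic to each record at deserialization time with a KeyedDeserializationSchema,
so the sink can read it from the element itself. A sketch, where the
(topic, message) Tuple2 layout is an assumption:

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema;

// Emits (topic, message) pairs so downstream operators, including the sink,
// can see which topic each record came from.
public class TopicAwareSchema implements KeyedDeserializationSchema<Tuple2<String, String>> {

    @Override
    public Tuple2<String, String> deserialize(byte[] messageKey, byte[] message,
            String topic, int partition, long offset) throws IOException {
        return Tuple2.of(topic, new String(message, StandardCharsets.UTF_8));
    }

    @Override
    public boolean isEndOfStream(Tuple2<String, String> nextElement) {
        return false;
    }

    @Override
    public TypeInformation<Tuple2<String, String>> getProducedType() {
        return TypeInformation.of(new TypeHint<Tuple2<String, String>>() {});
    }
}

Passing this schema to FlinkKafkaConsumer010 instead of SimpleStringSchema
yields a DataStream<Tuple2<String, String>> whose first field is the topic.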
Hi Paolo,
Have you followed the instructions in this documentation [1]?
The connectors are not part of the binary distributions, so you would need to
bundle the dependencies with your code by building an uber jar.
Cheers,
Gordon
[1] https://ci.apache.org/projects/flink/flink-docs-release-1.3/d
Hi,
I am following the basic steps to implement a consumer and a producer with
Kafka for Flink. My Flink version is 1.2.0 and my Kafka version is 0.10.2.0,
so in my pom.xml I added:

<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.10_2.10</artifactId>
  <version>1.2.0</version>
</dependency>
The problem is that if I run the program with maven
I was trying to join two keyed streams in a particular way and get a combined
stream.
For example:
Let's say I call the two streams X and Y.
The X stream contains:
(Key,Value)
(A,P)
(A,Q)
(A,R)
(B,P)
(C,P)
(C,Q)
The Y stream contains:
(Key,Value,Flag1,Flag2)
(A,M1,0,0)
(A,M2,0,0)
(A,M3,1,0)
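The intended join semantics are not shown in this excerpt, but as a generic
starting point, two keyed streams can be combined with connect() and a
CoFlatMapFunction. A sketch, assuming X is a DataStream of Tuple2<String,
String> and Y a DataStream of Tuple4<String, String, Integer, Integer>:

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.tuple.Tuple4;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.co.CoFlatMapFunction;
import org.apache.flink.util.Collector;

DataStream<String> combined = x
    .connect(y)
    .keyBy(0, 0)  // key both streams by their first field (the common key)
    .flatMap(new CoFlatMapFunction<Tuple2<String, String>,
            Tuple4<String, String, Integer, Integer>, String>() {
        @Override
        public void flatMap1(Tuple2<String, String> xRec, Collector<String> out) {
            // Called for each element of X; keyed state could buffer X values
            // here until matching Y values arrive.
            out.collect("X: " + xRec);
        }

        @Override
        public void flatMap2(Tuple4<String, String, Integer, Integer> yRec,
                Collector<String> out) {
            out.collect("Y: " + yRec);
        }
    });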
Hi Jürgen,
one easy way is to disable the Kryo fallback with
env.getConfig().disableGenericTypes();
If Kryo were being used, you should see an exception that also states the
class for which it needs to fall back to Kryo. This fails on the first
class that falls back to Kryo, though. So depending on the other classes
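A minimal sketch of this check; with generic types disabled, Flink throws
an UnsupportedOperationException naming the offending class as soon as a
type would fall back to Kryo:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// Fails fast instead of silently serializing generic types with Kryo.
env.getConfig().disableGenericTypes();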
Hi Stephan,
do you know an easy way to find out whether Kryo or POJO serialization is
used? I have an object that would be a POJO, but it has one field whose
type has no public no-argument constructor. As I understood the
documentation, this should result in Kryo being used.
Thanks,
Jürgen
On 03.0
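For illustration, a hedged Java sketch of the situation described (the class
names are made up): the outer class is still analyzed as a POJO, while the
problematic field falls back to Kryo.

// Config has no public no-argument constructor, so it is not a POJO.
class Config {
    private final String value;
    Config(String value) { this.value = value; }
}

// Event is still analyzed as a POJO, but the 'config' field is treated as a
// generic type and serialized with Kryo.
class Event {
    public String id;
    public Config config;
    public Event() {}
}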