Re: doing demultiplexing using Apache flink

2020-05-02 Thread dhurandar S
Thank you, Alexander, this is really helpful. Can the input be Flink SQL? The idea is to provide the capability to take SQL as the input and create new streams on demand for the given SQL, so users of the system provide "SQL" in the configuration files and henceforth they can start listening to a topic…

Re: doing demultiplexing using Apache flink

2020-04-30 Thread Alexander Fedulov
This, too, should be possible. Flink uses `StreamingFileSink` to transfer data to S3 [1]. You can pass it your custom bucket assigner [2].
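A minimal sketch of the idea above (names and the record format are illustrative assumptions, not from the thread): the heart of a custom bucket assigner is a function from a record to a bucket id, which becomes the sub-path under the sink's base S3 prefix. In a real job this logic would live in `BucketAssigner#getBucketId` and be wired in via `StreamingFileSink.forRowFormat(...).withBucketAssigner(...)`; here it is shown as a plain static method so it can be read in isolation.

```java
public class S3BucketRouting {

    // Derive the bucket (the S3 "directory") from the record itself.
    // Assumption: records are CSV lines whose first field names the
    // logical stream, e.g. "orders,42,..." -> bucket "orders".
    public static String bucketIdFor(String csvRecord) {
        int comma = csvRecord.indexOf(',');
        String stream = comma < 0 ? csvRecord : csvRecord.substring(0, comma);
        return stream.isEmpty() ? "unknown" : stream;
    }

    public static void main(String[] args) {
        System.out.println(bucketIdFor("orders,42,2020-04-30")); // prints "orders"
    }
}
```

Because the assigner is evaluated per record, new "destinations" appear as soon as records carrying a new first field arrive; no sink reconfiguration is needed.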

Re: doing demultiplexing using Apache flink

2020-04-30 Thread Arvid Heise
Hi Dhurandar, if you use KafkaSerializationSchema [1], you can create a producer record where you explicitly set the output topic. The topic can be arbitrarily calculated. You pass the schema while constructing the sink: `stream.addSink(new FlinkKafkaProducer<>(topic, serSchema, …))`
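To make the "arbitrarily calculated topic" concrete, here is a hedged pure-Java sketch (the `toRecord` helper and the `prefix:payload` message format are assumptions for illustration). It mirrors what a `KafkaSerializationSchema#serialize` implementation would return as a `ProducerRecord<byte[], byte[]>`: a per-record (topic, value) pair, with the topic computed from the message itself.

```java
import java.nio.charset.StandardCharsets;
import java.util.AbstractMap;
import java.util.Map;

public class DynamicTopicSchema {

    // Compute the destination topic per record. Assumption: the topic
    // is the prefix before the first ':' in the message; messages
    // without a prefix fall back to a default topic.
    public static Map.Entry<String, byte[]> toRecord(String message) {
        int sep = message.indexOf(':');
        String topic = sep < 0 ? "default-topic" : message.substring(0, sep);
        byte[] value = message.getBytes(StandardCharsets.UTF_8);
        return new AbstractMap.SimpleEntry<>(topic, value);
    }
}
```

In the real schema the pair would be wrapped in `new ProducerRecord<>(topic, value)`; the default topic passed to the `FlinkKafkaProducer` constructor is only used when the schema does not choose one.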

Re: doing demultiplexing using Apache flink

2020-04-29 Thread dhurandar S
Thank you, Alexander, for the response. This is very helpful. Can I apply the same pattern to S3 as well, as in read from Kafka or Kinesis and write multiple files in S3 or multiple topics in Kinesis? regards, Rahul

Re: doing demultiplexing using Apache flink

2020-04-29 Thread Alexander Fedulov
Hi Dhurandar, it is not supported out of the box; however, I think it is possible by doing the following: 1) Create a wrapper type containing the original message and the topic destination where it is supposed to be sent. You can enrich the messages with it in accordance with the configuration you've…
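Step 1 above can be sketched as follows (a minimal illustration; the class and field names, and the CSV-keyed routing config, are assumptions, not prescribed by the thread). In a Flink job the `enrich` step would be a `MapFunction<String, Routed>`, with the routing configuration typically delivered via broadcast state so that users can add new routes at runtime.

```java
import java.util.Map;

public class RoutedEnrichment {

    // Wrapper type: the original message plus its destination topic.
    public static final class Routed {
        public final String targetTopic;
        public final String payload;
        Routed(String targetTopic, String payload) {
            this.targetTopic = targetTopic;
            this.payload = payload;
        }
    }

    // Enrichment according to user-supplied configuration.
    // Assumption: the first CSV field of the payload keys the route.
    public static Routed enrich(String payload, Map<String, String> routes) {
        String key = payload.split(",", 2)[0];
        return new Routed(routes.getOrDefault(key, "default-topic"), payload);
    }
}
```

A downstream sink (e.g. a serialization schema as in Arvid's reply) then only has to read `targetTopic` off the wrapper, keeping the routing decision in one place.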

doing demultiplexing using Apache flink

2020-04-29 Thread dhurandar S
Hi, we have a use case where we have to demultiplex the incoming stream into multiple output streams. We read from one Kafka topic, and as output we generate multiple Kafka topics. The logic for generating each new Kafka topic is different and not known beforehand. Users of the system keep adding new…
