Thank you, Alexander, for the response. This is very helpful.
Can I apply the same pattern to S3 as well, i.e. read from Kafka or
Kinesis and write to multiple files in S3 or multiple topics in Kinesis?

regards,
Rahul

On Wed, Apr 29, 2020 at 2:32 PM Alexander Fedulov <alexan...@ververica.com>
wrote:

> Hi Dhurandar,
>
> it is not supported out of the box, however, I think it is possible by
> doing the following:
> 1) Create a wrapper type containing the original message and the topic
> destination where it is supposed to be sent. You can enrich the messages
> with it in accordance with the configuration you've mentioned.
> 2) Extend `KeyedSerializationSchema` and make its `getTargetTopic` return
> the desired topic
> 3) Initialize `FlinkKafkaProducer011` with this custom
> `KeyedSerializationSchema`
> Please mind that `KeyedSerializationSchema` is marked as deprecated
> and is supposed to be superseded by the new `KafkaSerializationSchema`,
> which would require a slight modification, but, from what I can tell, it
> will still be possible to achieve such dynamic event dispatching.
>
> Best regards,
> Alexander Fedulov
>
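The three steps in the quoted reply can be sketched as follows. This is a minimal, self-contained illustration: `RoutedMessage`, `RoutingSchema`, and the field names are hypothetical stand-ins, and real code would implement Flink's `KafkaSerializationSchema<T>` (returning a Kafka `ProducerRecord` with the desired topic) and pass it to the `FlinkKafkaProducer` constructor.

```java
import java.nio.charset.StandardCharsets;

// Step 1: a hypothetical wrapper type pairing the original payload with the
// topic it should be routed to. An upstream enrichment step (e.g. a map
// function driven by configuration) would populate targetTopic.
class RoutedMessage {
    final String targetTopic;
    final String payload;

    RoutedMessage(String targetTopic, String payload) {
        this.targetTopic = targetTopic;
        this.payload = payload;
    }
}

// Step 2: a simplified stand-in for the serialization-schema contract.
// The producer asks getTargetTopic(...) per record, which is what enables
// dynamic, per-message topic dispatch (step 3 is wiring this into the
// producer's constructor).
class RoutingSchema {
    public String getTargetTopic(RoutedMessage element) {
        // Route to whatever topic the enrichment step chose.
        return element.targetTopic;
    }

    public byte[] serializeValue(RoutedMessage element) {
        return element.payload.getBytes(StandardCharsets.UTF_8);
    }
}

public class DynamicRoutingSketch {
    public static void main(String[] args) {
        RoutingSchema schema = new RoutingSchema();
        RoutedMessage msg = new RoutedMessage("orders-eu", "{\"id\":42}");
        // Each record carries its own destination, so one sink instance can
        // fan out to many topics.
        System.out.println(schema.getTargetTopic(msg));
    }
}
```

The same idea carries over to the S3 question above: Flink's file sink supports a bucket/path assigner that inspects each record, so a field in the wrapper type can likewise select the output directory per message.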
