Hi,
New PyFlink user here. Loving it so far. The first major problem I've run
into is that I cannot create a Kafka producer with dynamic topics. I see
that this has been possible for quite some time in Java using a
KeyedSerializationSchema and its getTargetTopic method (a rough sketch of
what I mean is below). Another way to do this in Java might be
stream.split(), adding a different sink for each split stream, but stream
splitting is also not available in PyFlink.
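For reference, here is a rough sketch of the Java approach I'm referring
to. The record type, topic naming scheme, and class names are made up for
illustration; the point is just that getTargetTopic lets the producer pick
a topic per record:

    import java.util.Properties;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
    import org.apache.flink.streaming.util.serialization.KeyedSerializationSchema;

    public class TenantRoutingSketch {

        // Hypothetical record type carrying a tenant id and a JSON payload.
        public static class TenantRecord {
            public String tenantId;
            public String payload;
        }

        public static FlinkKafkaProducer<TenantRecord> buildProducer(Properties props) {
            KeyedSerializationSchema<TenantRecord> schema =
                new KeyedSerializationSchema<TenantRecord>() {
                    @Override
                    public byte[] serializeKey(TenantRecord element) {
                        return element.tenantId.getBytes();
                    }

                    @Override
                    public byte[] serializeValue(TenantRecord element) {
                        return element.payload.getBytes();
                    }

                    @Override
                    public String getTargetTopic(TenantRecord element) {
                        // Route each record to its tenant-specific topic.
                        return "tenant-" + element.tenantId;
                    }
                };

            // "default-topic" is used when getTargetTopic returns null.
            return new FlinkKafkaProducer<>("default-topic", schema, props);
        }
    }

Something equivalent to that per-record getTargetTopic is what I'm hoping
to find (or approximate) in PyFlink.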
Am I missing something? Has anyone implemented this in PyFlink, or does
anyone know of a way to make it happen?
The use case is that I'm using a Debezium CDC connector to populate Kafka
topics from a multi-tenant database, and I'm trying to use PyFlink to
route the records into a separate topic for each tenant.

Thanks
