Ah, I see. Pity. You could always use reflection if you really had to, but that's of course not a long-term solution.
I will raise this issue with the KafkaSource/AWS contributors.

Best,
Piotr Nowojski

Mon, 10 Jan 2022 at 16:55 Clayton Wohl <claytonw...@gmail.com> wrote:

> Custom code can create subclasses of FlinkKafkaConsumer, because the
> constructors are public. Custom code can't create subclasses of KafkaSource,
> because the constructors are package-private. So the same solution of
> creating subclasses in custom code won't work for KafkaSource.
>
> Thank you for the response :)
>
> On Mon, Jan 10, 2022 at 6:22 AM Piotr Nowojski <pnowoj...@apache.org> wrote:
>
>> Hi Clayton,
>>
>> I think in principle this example should still be valid; however, instead
>> of providing a `CustomFlinkKafkaConsumer` and overriding its `open` method,
>> you would probably need to override
>> `org.apache.flink.connector.kafka.source.reader.KafkaSourceReader#start`.
>> So you would most likely need, at the very least, both a custom
>> `KafkaSourceReader` and a custom `KafkaSource` that instantiates your custom
>> `KafkaSourceReader`. But I'm not sure if anyone has ever tried this so far.
>>
>> Best,
>> Piotrek
>>
>> Fri, 7 Jan 2022 at 21:18 Clayton Wohl <claytonw...@gmail.com> wrote:
>>
>>> If I want to migrate from FlinkKafkaConsumer to KafkaSource, does the
>>> latter support this:
>>>
>>> https://docs.aws.amazon.com/kinesisanalytics/latest/java/example-keystore.html
>>>
>>> Basically, I'm running my Flink app in Amazon's Kinesis Analytics hosted
>>> Flink environment. I don't have reliable access to the local file system.
>>> At the documentation link above, Amazon recommends adding a hook to copy
>>> the keystore files from the classpath to a /tmp directory at runtime. Can
>>> KafkaSource do something similar?
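For context, the hook described in the linked AWS page follows the pattern below: subclass FlinkKafkaConsumer and copy the truststore bundled on the classpath into /tmp before the consumer opens, so that a path like /tmp/kafka.client.truststore.jks resolves on every task manager. This is only a minimal sketch of that pattern; the class name, resource name, and target path are assumptions, and the AWS documentation should be consulted for the exact code.

import java.io.File;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;
import java.util.Properties;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

// Minimal sketch of the classpath-to-/tmp keystore drop-in for the legacy
// FlinkKafkaConsumer. Resource and target names are placeholders.
public class CustomFlinkKafkaConsumer<T> extends FlinkKafkaConsumer<T> {

    public CustomFlinkKafkaConsumer(
            String topic, DeserializationSchema<T> deserializer, Properties props) {
        super(topic, deserializer, props);
    }

    @Override
    public void open(Configuration configuration) throws Exception {
        // Copy the truststore bundled in the job jar to the local /tmp directory
        // before the Kafka client is created, so that
        // ssl.truststore.location=/tmp/kafka.client.truststore.jks resolves locally.
        copyFromClasspath("kafka.client.truststore.jks", "/tmp/kafka.client.truststore.jks");
        super.open(configuration);
    }

    private static void copyFromClasspath(String resource, String target) throws Exception {
        try (InputStream in =
                CustomFlinkKafkaConsumer.class.getClassLoader().getResourceAsStream(resource)) {
            Files.copy(in, new File(target).toPath(), StandardCopyOption.REPLACE_EXISTING);
        }
    }
}

As discussed in the thread, the same trick does not carry over directly to KafkaSource, because its constructors are package-private; there, KafkaSourceReader#start would have to be overridden instead.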
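The reflection escape hatch mentioned at the top of the thread amounts to forcing access to KafkaSource's package-private constructors. A hypothetical probe of that idea is shown below; it only lists and unlocks the constructors, since the actual parameter lists are version-specific and not reproduced here.

import java.lang.reflect.Constructor;

import org.apache.flink.connector.kafka.source.KafkaSource;

public class KafkaSourceReflectionProbe {
    public static void main(String[] args) {
        // KafkaSource's constructors are package-private, so user code cannot call
        // them directly. Reflection can lift that restriction; newInstance(...) could
        // then be invoked with arguments matching the printed signature.
        for (Constructor<?> ctor : KafkaSource.class.getDeclaredConstructors()) {
            ctor.setAccessible(true); // bypass the package-private access check
            System.out.println(ctor);
        }
    }
}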