Savepoint class refactor in 1.11 causing restore from 1.9 savepoint to fail

2021-07-30 Thread Weston Woods
I am unable to restore a 1.9 savepoint into a 1.11 runtime for the very interesting reason that the Savepoint class was renamed and repackaged between those two releases. Apparently a Kryo serializer has that class registered in the 1.9 runtime. I can’t think of a good reason for that clas
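The failure mode described above can be illustrated without Flink at all. The sketch below uses built-in Java serialization as a stand-in for Kryo: any serializer that records the fully-qualified class name bakes that name into the saved bytes, so renaming or repackaging the class later makes old payloads unrestorable. All class names here (`RenameDemo`, `OldSavepoint`) are hypothetical, chosen only to mirror the situation in the report.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.charset.StandardCharsets;

// Illustration (not Flink code): serializers that record class names -- Java
// serialization here, Kryo class registration in the report above -- embed the
// fully-qualified name in the payload, so a rename breaks restore.
public class RenameDemo {
    static class OldSavepoint implements Serializable {
        private static final long serialVersionUID = 1L;
        int version = 2;
    }

    // Serialize an instance and check whether its class name ended up in the bytes.
    static boolean bytesContainClassName() throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new OldSavepoint());
        }
        // ISO-8859-1 maps each byte 1:1, so the embedded UTF class name is visible.
        String raw = new String(bytes.toByteArray(), StandardCharsets.ISO_8859_1);
        return raw.contains("OldSavepoint");
    }

    public static void main(String[] args) throws IOException {
        // Deserializing these bytes after renaming OldSavepoint would throw
        // ClassNotFoundException, analogous to the savepoint restore failure.
        System.out.println("class name embedded in payload: " + bytesContainClassName());
    }
}
```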

Re: Unable to use custom AWS credentials provider - 1.9.2

2021-07-30 Thread Arvid Heise
Well, usually the plugins should be properly isolated, but Flink 1.9 is quite old, so there is a chance the plugin classloader was not fully isolated. But I also have a hard time concluding anything from the small stacktrace. Do you need aws-java-sdk-core because of Kinesis? On Fri, Jul 30, 2021 at


RE: Unable to use custom AWS credentials provider - 1.9.2

2021-07-30 Thread Hailu, Andreas [Engineering]
Hi Arvid, Yes, we do have AWSCredentialsProvider in our user JAR. It’s coming from aws-java-sdk-core. Must we exclude that, then? // ah From: Arvid Heise Sent: Friday, July 30, 2021 11:26 AM To: Ingo Bürk Cc: user Subject: Re: Unable to use custom AWS credentials provider - 1.9.2 Can you do

Re: Unable to use custom AWS credentials provider - 1.9.2

2021-07-30 Thread Arvid Heise
Can you double-check if you have an AWSCredentialsProvider in your user jar or in your flink/lib/? Same for S3AUtils? On Fri, Jul 30, 2021 at 9:50 AM Ingo Bürk wrote: > Hi Andreas, > > Such an exception can occur if the class in question (your provider) and > the one being checked (AWSCredential

RE: Obtain JobManager Web Interface URL

2021-07-30 Thread Hailu, Andreas [Engineering]
Hello Yangze, thanks for responding. I'm attempting to perform this programmatically on YARN, so looking at a log just won't do :) What's the appropriate way to get an instance of a ClusterClient? Do you know of any examples I can look at? // ah -Original Message- From: Yangze Guo Sen
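One way to get a `ClusterClient` for a running YARN session is via the YARN client factory. This is a hedged sketch, not verified against 1.9: it assumes Flink 1.10+ client APIs (where `ClusterClientProvider` and `ClusterClient#getWebInterfaceURL` exist) plus the Hadoop YARN classes on the classpath, and assumes `yarn.application.id` has been set in the configuration.

```java
import org.apache.flink.client.program.ClusterClient;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.GlobalConfiguration;
import org.apache.flink.yarn.YarnClusterClientFactory;
import org.apache.flink.yarn.YarnClusterDescriptor;
import org.apache.hadoop.yarn.api.records.ApplicationId;

// Hedged sketch: retrieve the REST endpoint of an existing YARN session.
// Requires flink-yarn and Hadoop dependencies; Flink 1.10+ APIs assumed.
public class WebUrlSketch {
    public static void main(String[] args) throws Exception {
        // Loads flink-conf.yaml from FLINK_CONF_DIR; must contain yarn.application.id.
        Configuration config = GlobalConfiguration.loadConfiguration();

        YarnClusterClientFactory factory = new YarnClusterClientFactory();
        ApplicationId appId = factory.getClusterId(config);

        try (YarnClusterDescriptor descriptor = factory.createClusterDescriptor(config);
             ClusterClient<ApplicationId> client =
                     descriptor.retrieve(appId).getClusterClient()) {
            // The JobManager web/REST URL, resolved from YARN.
            System.out.println(client.getWebInterfaceURL());
        }
    }
}
```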

Re: Flink CDC job getting failed due to G1 old gc

2021-07-30 Thread Ayush Chauhan
I am using RocksDB as the state backend. My pipeline checkpoint size is hardly ~100kb. I will add GC and heap dump config and will let you know of any findings. Right now I suspect there is a memory leak either in the flink cdc code or in the iceberg sink https://iceberg.apache.org/flink/#over
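For the GC and heap dump configuration mentioned above, one common approach is to pass HotSpot flags through `env.java.opts` in flink-conf.yaml. This is a hedged example; the dump and log paths are placeholders, and the GC-logging flags shown are the JDK 8 form.

```yaml
# flink-conf.yaml -- hedged example; dump/log paths are placeholders.
# JDK 8 GC-logging flags shown; on JDK 9+ use -Xlog:gc*:file=... instead.
env.java.opts: -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/flink-dumps -XX:+PrintGCDetails -Xloggc:/tmp/flink-gc.log
```

The heap dump written on OutOfMemoryError can then be inspected with a tool such as Eclipse MAT to see whether the retained objects come from the CDC source or the Iceberg sink.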

Issue with writing to Azure blob storage using StreamingFileSink and FileSink

2021-07-30 Thread Sudhanva Purushothama
Hello, I have been trying to use StreamingFileSink to write Parquet files into Azure blob storage. I am getting the following error. I did see in the ticket https://issues.apache.org/jira/browse/FLINK-17444 that support for StreamingFileSink is not yet provided.

Re: Migrating Kafka Sources (major version change)

2021-07-30 Thread Arvid Heise
Hi Dan, sorry for the mixup. I think the idleness definition [1] is orthogonal to the used source interface. The new source interface just makes it more obvious to the user that he can override the watermark strategy. I'd still recommend having a look at the new Kafka source though. One interesti
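The idleness handling referred to above is configured on the `WatermarkStrategy`, independent of which Kafka connector is used. A hedged sketch, assuming Flink 1.12+ APIs (where `KafkaSource` and `WatermarkStrategy` exist); the topic, bootstrap server, and source name below are placeholders:

```java
import java.time.Duration;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Hedged sketch: the same WatermarkStrategy (with idleness) works with either
// the legacy FlinkKafkaConsumer or the new KafkaSource shown here.
public class IdlenessSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("broker:9092")          // placeholder
                .setTopics("events")                         // placeholder
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        WatermarkStrategy<String> watermarks =
                WatermarkStrategy.<String>forBoundedOutOfOrderness(Duration.ofSeconds(5))
                        // Partitions silent for 1 minute stop holding back the watermark.
                        .withIdleness(Duration.ofMinutes(1));

        env.fromSource(source, watermarks, "kafka-source").print();
        env.execute();
    }
}
```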

Over Window Aggregation Tuning

2021-07-30 Thread Wanghui (HiCampus)
Hi: When I use Over Window Aggregation with a window size of 1 hour, I find the processing speed decreases over time. How can I tune the over window? Best regards Hui Wang

Re: Flink 1.13 fails to load log4j2 yaml configuration file via jackson-dataformat-yaml

2021-07-30 Thread Yuval Itzchakov
Thanks Chesnay. Will try and report back. On Fri, Jul 30, 2021, 10:19 Chesnay Schepler wrote: > Of course it is finding the file, you are actively pointing it towards it. > The BashJavaUtils are supposed to use the log4j configuration file *that > is bundled in the BashJavaUtils.jar*, which you

Re: Unable to use custom AWS credentials provider - 1.9.2

2021-07-30 Thread Ingo Bürk
Hi Andreas, Such an exception can occur if the class in question (your provider) and the one being checked (AWSCredentialsProvider) were loaded from different class loaders. Any chance you can try once with 1.10+ to see if it would work? It does look like a Flink issue to me, but I'm not sure thi
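The mechanism Ingo describes can be demonstrated with a short, self-contained program: the same class loaded through two different class loaders is not the same class to the JVM, so `instanceof` and cast checks across loaders fail. `Provider` stands in for `AWSCredentialsProvider`; all names here are hypothetical.

```java
import java.io.IOException;
import java.io.InputStream;

// Illustration of the cross-classloader failure: a class defined by an
// isolating loader fails an instanceof check against the application
// loader's copy of the same interface.
public class LoaderDemo {
    public interface Provider {}
    public static class MyProvider implements Provider {}

    // A loader that defines the demo classes itself instead of delegating to
    // its parent -- similar to an isolated plugin/user-code class loader.
    static class IsolatingLoader extends ClassLoader {
        IsolatingLoader(ClassLoader parent) { super(parent); }

        @Override
        protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
            if (name.startsWith("LoaderDemo$")) {
                Class<?> loaded = findLoadedClass(name);
                if (loaded == null) {
                    byte[] b = readClassBytes(name);
                    loaded = defineClass(name, b, 0, b.length);
                }
                if (resolve) resolveClass(loaded);
                return loaded;
            }
            return super.loadClass(name, resolve);
        }

        private byte[] readClassBytes(String name) throws ClassNotFoundException {
            String resource = name.replace('.', '/') + ".class";
            try (InputStream in = LoaderDemo.class.getClassLoader().getResourceAsStream(resource)) {
                if (in == null) throw new ClassNotFoundException(name);
                return in.readAllBytes();
            } catch (IOException e) {
                throw new ClassNotFoundException(name, e);
            }
        }
    }

    // Does an instance from the isolated loader satisfy the application
    // loader's Provider interface? (It does not.)
    public static boolean crossLoaderInstanceOf() throws Exception {
        Class<?> duplicate = new IsolatingLoader(LoaderDemo.class.getClassLoader())
                .loadClass("LoaderDemo$MyProvider");
        Object o = duplicate.getDeclaredConstructor().newInstance();
        return o instanceof Provider; // false: two distinct Provider classes
    }

    public static void main(String[] args) throws Exception {
        System.out.println("cross-loader instanceof: " + crossLoaderInstanceOf());
    }
}
```

This is why checking both the user jar and flink/lib/ matters: if `AWSCredentialsProvider` appears in both, the provider implementation and the interface it is checked against can come from different loaders.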

Re: Flink 1.13 fails to load log4j2 yaml configuration file via jackson-dataformat-yaml

2021-07-30 Thread Chesnay Schepler
Of course it is finding the file; you are actively pointing it towards it. The BashJavaUtils are supposed to use the log4j configuration file *that is bundled in the BashJavaUtils.jar*, which you are now interfering with. That's also why it doesn't require all of lib/ to be on the classpath; th