Found a simple workaround: use the sources jar and don't attach the
JavaDocs jar. I use IntelliJ and Gradle, so for me the change was simple:
[image: Screenshot 2024-08-27 at 08.26.15.png]
As usual with IntelliJ and Gradle, a round of cache flushing was
required after the change.
I think so
Hi Jose,
I was facing a similar issue when working on schema evolution in the
Iceberg connector. RowData is optimized in a way that it is expected
to have the same schema for the lifetime of the deployment. This avoids any
extra serialization for every record.
To work around this I see 2 opt
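(A quick aside from me to illustrate the point above; this sketch is not part of the original reply.) RowData carries no schema of its own: a GenericRowData only knows its arity and its positional fields, so writer and reader must agree on the schema out of band. A minimal Java sketch, assuming an INT field at position 0 and a STRING field at position 1:

    import org.apache.flink.table.data.GenericRowData;
    import org.apache.flink.table.data.RowData;
    import org.apache.flink.table.data.StringData;

    public class RowDataSchemaExample {
        public static void main(String[] args) {
            // Arity is fixed at construction time; field types are purely positional.
            GenericRowData row = new GenericRowData(2);
            row.setField(0, 42);                              // assumed INT field at position 0
            row.setField(1, StringData.fromString("hello")); // assumed STRING field at position 1

            // A reader has to already know the schema to interpret the positions.
            RowData asRowData = row;
            int id = asRowData.getInt(0);
            String name = asRowData.getString(1).toString();
            System.out.println(id + " / " + name);
        }
    }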
Hi,
I want to build a custom Sink that receives a Row (or GenericRowData or
RowData, depending on your reply) and needs to do some processing before
sending it to the external sink.
So it should be something like this:
Input -> ROW
Then I need to process that element, depending on what the Type is
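(Not part of the original question; just a rough sketch of one way this could look.) Assuming the Sink V2 interface and a RowData input with an INT field at position 0 and a STRING field at position 1, a custom sink with a per-record processing step might be shaped roughly like this; the class name and field layout are made up:

    import org.apache.flink.api.connector.sink2.Sink;
    import org.apache.flink.api.connector.sink2.SinkWriter;
    import org.apache.flink.table.data.RowData;

    import java.io.IOException;

    public class MyProcessingSink implements Sink<RowData> {

        @Override
        public SinkWriter<RowData> createWriter(InitContext context) throws IOException {
            return new SinkWriter<RowData>() {
                @Override
                public void write(RowData element, Context ctx) {
                    // Per-record processing before handing off to the external system.
                    int id = element.getInt(0);                       // assumed INT at position 0
                    String payload = element.getString(1).toString(); // assumed STRING at position 1
                    // ... transform id/payload and write to the external sink here ...
                }

                @Override
                public void flush(boolean endOfInput) {
                    // Flush any buffered records to the external system.
                }

                @Override
                public void close() {
                    // Release client/connection resources.
                }
            };
        }
    }

The sink would then be attached with stream.sinkTo(new MyProcessingSink()); whether you receive Row or RowData depends on how the pipeline is wired, so the field accesses above are only an assumption.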
I'm fetching JavaDocs from Maven Central and use them extensively during
development. After the 1.20 upgrade the headers and other parts of the
JavaDocs are now in Chinese. The descriptions are still in english, so it's
still usable and only slightly confusing.
Are english JavaDocs published elsew
Hi.
After upgrading from 1.18 to either 1.19 or 1.20, onTimer doesn’t fire in one of
our classes that extends KeyedProcessFunction when executing tests running on a
mini cluster.
If I add a breakpoint in InternalTimeServiceImpl line 126 the callback is
executed, but without a breakpoint it doesn’
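(For context only; this is my own minimal sketch of the pattern being described, not the poster's actual class.) A function of this shape registers a processing-time timer per element and relies on the timer service to call onTimer back; the names and the one-second delay are made up:

    import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
    import org.apache.flink.util.Collector;

    public class DelayedEmitFunction extends KeyedProcessFunction<String, String, String> {

        @Override
        public void processElement(String value, Context ctx, Collector<String> out) throws Exception {
            // Register a processing-time timer one second in the future for the current key.
            long fireAt = ctx.timerService().currentProcessingTime() + 1000L;
            ctx.timerService().registerProcessingTimeTimer(fireAt);
        }

        @Override
        public void onTimer(long timestamp, OnTimerContext ctx, Collector<String> out) {
            // Expected callback once the registered timer fires.
            out.collect("timer fired for key " + ctx.getCurrentKey() + " at " + timestamp);
        }
    }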
Hi Flink Community,
I'm a hands-on Apache Flink Software Engineer looking for job opportunities
in India or the UK (with Tier 2 sponsorship). If anyone knows of openings
or can point me in the right direction, please let me know.
Thanks,
Sri Tummala
Hello,
we use logback instead of log4j and get this error after upgrading Flink to
1.20. It seems that 1.20 does not support logback 1.2.13. We cannot upgrade
because, as the documentation states, "Logback 1.3+ requires SLF4J 2, which is
currently not supported". We see that the 1.20 implementation of