> …that upgrading Kryo isn't
> such a problem in the future; in essence reworking how Kryo is integrated
> into Flink.
>
> That said, the v5 migration guide is quite interesting; specifically that
> Kryo offers a versioned jar.
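For reference, the versioned jar mentioned here ships under separate Maven coordinates with relocated packages (com.esotericsoftware.kryo.kryo5.*), so it can sit on the same classpath as the Kryo 2.x that Flink bundles. A sketch from memory of the Kryo 5 migration guide; verify the exact coordinates there:

```xml
<!-- Versioned Kryo 5 artifact: classes are relocated, so no conflict
     with the unversioned com.esotericsoftware:kryo 2.x on the classpath. -->
<dependency>
    <groupId>com.esotericsoftware.kryo</groupId>
    <artifactId>kryo5</artifactId>
    <version>5.4.0</version>
</dependency>
```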
>
> On 09/02/2023 17:32, Clayton Wohl wrote:
> Is there any possibility a future release of Flink can upgrade to a
> recent version of Kryo serialization?
>
> Of course there is, but someone needs to figure out a way to do this
> without breaking everything or providing a reasonable upgrade path,
> which has been blocking us so far.
>
> On 0
I've noticed the latest Flink is using the Kryo serializer library version
2.24.0, which dates back to 2015!
The Kryo project is actively maintained; it's on version 5.4.0, so 2.24.0
is really quite ancient. I presume the concern is maintaining compatibility
with persisted savepoints. That's a valid
x directly. It works! Thank you :)
On Wed, Nov 23, 2022 at 4:39 PM Clayton Wohl wrote:
> When upgrading an application from Flink 1.14.6 to Flink 1.16.0, I get the
> following exception:
>
> ERROR org.apache.flink.runtime.metrics.ReporterSetup - Could not
> instantiate metrics
At my job, we are using the Spotify Flink Operator in production. Are there
any pros/cons of the Spotify Flink Operator versus the Apache Flink
Operator? We are particularly interested in the forthcoming autoscaling
functionality, but I understand that functionality isn't ready yet. Are
there any
When upgrading an application from Flink 1.14.6 to Flink 1.16.0, I get the
following exception:
ERROR org.apache.flink.runtime.metrics.ReporterSetup - Could not
instantiate metrics reporter prom. Metrics might not be exposed/reported.
java.lang.InstantiationException:
org.apache.flink.metrics.pro
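If this is the Prometheus reporter, the likely cause is that Flink 1.16 dropped reflection-based reporter instantiation, so a reporter configured via `metrics.reporter.<name>.class` needs the factory style instead. A minimal sketch, assuming the truncated class name is the Prometheus reporter and the jar is on the plugins path (the port value is illustrative):

```yaml
# flink-conf.yaml, 1.16+ style: replace the old
#   metrics.reporter.prom.class: org.apache.flink.metrics.prometheus.PrometheusReporter
# with the factory-based configuration:
metrics.reporter.prom.factory.class: org.apache.flink.metrics.prometheus.PrometheusReporterFactory
metrics.reporter.prom.port: 9249
```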
I have a streaming Flink job that runs 24/7 on a Kubernetes cluster hosted
in AWS. Every few weeks, or sometimes months, the job fails with
network errors like the following error in the logs. This is with Flink
1.14.5.
Is there anything that I can do to help my application automatically retry
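Assuming the job actually fails (rather than hanging), a restart strategy in flink-conf.yaml makes Flink retry automatically after such transient network errors. An exponential-delay sketch; the values are illustrative, not recommendations:

```yaml
# Retry with growing backoff; backoff resets after 10 min of healthy running.
restart-strategy: exponential-delay
restart-strategy.exponential-delay.initial-backoff: 10 s
restart-strategy.exponential-delay.max-backoff: 2 min
restart-strategy.exponential-delay.backoff-multiplier: 2.0
restart-strategy.exponential-delay.reset-backoff-threshold: 10 min
restart-strategy.exponential-delay.jitter-factor: 0.1
```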
+1
At my employer, we maintain several Flink jobs in Scala. We've been writing
newer jobs in Java, and we'd be fine with porting our Scala jobs over to
the Java API.
I'd like to request Java 17 support. Specifically, Java records are a
feature our Flink code would use heavily, and they would make the Java sy
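As a tiny illustration of the point about records: a hypothetical event type written as a record (Java 16+), where the accessors, equals/hashCode, and toString all come for free instead of the usual POJO boilerplate:

```java
public class RecordDemo {
    // Hypothetical event type; with a record, the constructor, accessors,
    // equals/hashCode, and toString are generated by the compiler.
    record SensorReading(String sensorId, double value, long timestampMillis) {}

    public static void main(String[] args) {
        SensorReading r = new SensorReading("s-1", 21.5, 1700000000000L);
        System.out.println(r.sensorId() + " -> " + r.value());
    }
}
```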
May I ask if there are any plans to support Java 17 in a future version of
Flink?
I see this ticket, but it is inactive:
https://issues.apache.org/jira/browse/FLINK-15736
There are several Java 17 features that would help in writing+supporting
Flink applications. Thank you :)
I have a Flink job that has been running with Flink 1.14.4 perfectly for a
few months.
I tried upgrading to Flink 1.15.0. There are no error messages or
exceptions; it runs fine for several hours, but then the Flink
app starts to lag in processing an input Kafka topic. I can
Is there any way to save a custom application-global cache into Flink state
so that it is used with checkpoints + savepoints? This cache is used by a
RichAsyncFunction that queries an external database, and RichAsyncFunction
doesn't support the Flink state functionality directly.
I asked this last
I have a RichAsyncFunction that does async queries to an external database.
I'm using a Guava cache within the Flink app. I'd like this Guava cache to
be serialized with the rest of Flink state in checkpoint/savepoints.
However, RichAsyncFunction doesn't support the state functionality at all.
The
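Since AsyncFunction cannot hold Flink state, one workaround is to keep the cache itself Flink-agnostic and snapshot/restore it from a neighboring stateful operator (for example, a CheckpointedFunction writing the entries to ListState). A plain-Java sketch of such a snapshot-able LRU cache; all names are hypothetical and the Flink wiring is omitted:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class CacheSnapshotDemo {
    // Access-ordered LinkedHashMap gives LRU eviction. In a real job,
    // snapshot() would be called from snapshotState() and restore() from
    // initializeState() of a CheckpointedFunction outside the async operator.
    static final class LruCache extends LinkedHashMap<String, String> {
        private final int capacity;

        LruCache(int capacity) {
            super(16, 0.75f, true);  // true = access order, for LRU semantics
            this.capacity = capacity;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
            return size() > capacity;  // evict least-recently-used entry
        }

        // Dump entries into a serializable-friendly list of key/value pairs.
        List<String[]> snapshot() {
            List<String[]> out = new ArrayList<>();
            for (Map.Entry<String, String> e : entrySet()) {
                out.add(new String[] {e.getKey(), e.getValue()});
            }
            return out;
        }

        static LruCache restore(int capacity, List<String[]> entries) {
            LruCache c = new LruCache(capacity);
            for (String[] e : entries) {
                c.put(e[0], e[1]);
            }
            return c;
        }
    }

    public static void main(String[] args) {
        LruCache cache = new LruCache(2);
        cache.put("a", "1");
        cache.put("b", "2");
        cache.get("a");       // touch "a" so "b" becomes the eldest entry
        cache.put("c", "3");  // exceeds capacity: evicts "b"
        System.out.println(cache.containsKey("b"));  // false
        LruCache restored = LruCache.restore(2, cache.snapshot());
        System.out.println(restored.get("c"));       // 3
    }
}
```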
-Dlogback.configurationFile=file:/opt/flink/conf/logback.xml
that all makes sense. thank you.
On Fri, Jan 28, 2022 at 4:02 PM Clayton Wohl wrote:
> When I run my Flink app via the Google Flink Operator, log4j2 logging
> doesn't show up in logs. System.out.println messages work.
>
When I run my Flink app via the Google Flink Operator, log4j2 logging
doesn't show up in logs. System.out.println messages work.
Everything should be very plain-vanilla-standard setup. I have a log4j2.xml
config file in my classpath included in my application .jar file. I'm
building a custom Docke
>
> Best,
> Piotrek
>
> Fri, 7 Jan 2022 at 21:18, Clayton Wohl wrote:
>
>> If I want to migrate from FlinkKafkaConsumer to KafkaSource, does the
>> latter support this:
>>
>>
>> https://docs.aws.amazon.com/kinesisanalytics/latest/java/example-keysto
Are there any plans for Flink to support Java 17 and provide Java 17-based
Docker images?
There are a variety of new language/VM features we'd like to use and we
were hoping Flink would support Java 17.
thanks
Kurt
The latest version of flink-connector-kafka still uses kafka-clients
2.4.1. There have been a lot of upgrades in the Kafka consumer/producer
library since then.
May I request that the Flink project upgrade to a recent version of the
Kafka library?
thanks!
If I want to migrate from FlinkKafkaConsumer to KafkaSource, does the
latter support this:
https://docs.aws.amazon.com/kinesisanalytics/latest/java/example-keystore.html
Basically, I'm running my Flink app in Amazon's Kinesis Analytics hosted
Flink environment. I don't have reliable access to the
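For what it's worth, KafkaSource forwards arbitrary Kafka client properties through its builder (setProperty/setProperties), so the keystore settings from the linked example should carry over unchanged. A sketch of the client properties; the paths and passwords are hypothetical placeholders:

```properties
# Passed through to the underlying Kafka consumer by KafkaSource.
security.protocol=SSL
ssl.truststore.location=/tmp/kafka.client.truststore.jks
ssl.keystore.type=PKCS12
ssl.keystore.location=/tmp/kafka.client.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
```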