Hi Georg,
I recently made a demo using the Flink SQL Client + Schema Registry + Kafka as
well, to test my own schema registry catalog. To help us locate the root
cause, I think you can add "SET 'sql-client.verbose' = 'true';" in your
launched SQL Client so that it outputs more information.
Yes, restarting the app with a clean state does seem to fix the issue, but
I think I may have found a bug in Flink.
Here's how we can replicate it:
- Create a simple application with a KeyedProcessFunction (with onTimer())
- Send a few records with the same key. In processElement(), register a
timer (see the sketch below)
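For what it's worth, here is a minimal sketch of such a function, assuming processing-time timers and String keys/values; the delay and the class name are placeholders of mine, not from the original report:

import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

// Hypothetical function mirroring the replication steps: every element
// registers a processing-time timer, and onTimer() fires once it expires.
public class TimerRegisteringFunction extends KeyedProcessFunction<String, String, String> {

    @Override
    public void processElement(String value, Context ctx, Collector<String> out) {
        // Register a timer 10 seconds into the future (assumption: processing time;
        // the original report may well use event-time timers instead).
        long fireAt = ctx.timerService().currentProcessingTime() + 10_000L;
        ctx.timerService().registerProcessingTimeTimer(fireAt);
        out.collect(value);
    }

    @Override
    public void onTimer(long timestamp, OnTimerContext ctx, Collector<String> out) {
        out.collect("timer fired for key " + ctx.getCurrentKey() + " at " + timestamp);
    }
}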
Hi,
When trying to set up a demo for the kafka-sql-client reading an Avro topic
from Kafka, I run into problems with regard to the additional dependencies.
In the spark-shell there is a --packages option which automatically
resolves any additional required jars (transitively) using the provided
Maven coordinates.
Dear Flink community,
On the 30th of March we will host a meetup on the upcoming Flink 1.15 release.
Get all the information here [1].
There will also be an AMA with Matthias and Chesnay. If you already have a
question on your mind, let me know.
You might want to have a look at the release wiki
Is anyone able to comment on the below? My worry is this class isn’t well
supported, so I may need to find an alternative to bulk copy data into SQL Server,
e.g. use a simple file sink and then have some process bulk copy the files.
From: Sandys-Lumsdaine, James
Sent
Hi, we are running Flink 1.13.2 on Kinesis Analytics. Our source is
a Kafka topic with one partition so far, and we are using the
FlinkKafkaConsumer (kafka-connector-1.13.2).
Sometimes we get errors from the consumer like the one below:
"locationInformation":"org.apache.kafka.clients.FetchS
Hi Guowei,
Will check the doc out. Thanks for your help.
Best regards,
Chen-Che
On Mon, Mar 21, 2022 at 4:05 PM Guowei Ma wrote:
> Hi, Huang
> From the document[1] it seems that you need to close the iteration stream,
> such as with `iteration.closeWith(feedback);`
> BTW You also could get a detailed
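For reference, a minimal, self-contained sketch of closing an iteration with the DataStream API; the step function, feedback predicate, and threshold are made up for illustration:

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.IterativeStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class IterationDemo {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<Long> input = env.fromSequence(0, 100);

        // Start the iteration.
        IterativeStream<Long> iteration = input.iterate();

        // Step function: increment every element (placeholder logic).
        DataStream<Long> step = iteration.map(v -> v + 1).returns(Types.LONG);

        // Elements below the threshold are fed back into the loop;
        // closeWith() is what actually closes the iteration.
        DataStream<Long> feedback = step.filter(v -> v < 1_000L);
        iteration.closeWith(feedback);

        // Everything else leaves the loop as output.
        step.filter(v -> v >= 1_000L).print();

        env.execute("iteration-demo");
    }
}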
Hi Georg,
I just noticed that the replies between Jeff and me didn't go through the
mailing list. For reference, Jeff moved it to
https://github.com/zjffdu/flink-scala-shell
Best regards,
Martijn
On Tue, 22 Mar 2022 at 18:24, Georg Heiler
wrote:
> Many thanks.
>
> In the linked discussion it
Hi Ian,
Unfortunately, configuring the naming is only possible when using the
FileSystem connector from the DataStream API. If this would be an option for you,
the configuration is explained here:
https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/datastream/filesystem/#part-file-configura
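For reference, a minimal sketch of the part-file naming configuration with the DataStream FileSink; the path, prefix, and suffix below are placeholders:

import org.apache.flink.api.common.serialization.SimpleStringEncoder;
import org.apache.flink.connector.file.sink.FileSink;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.filesystem.OutputFileConfig;

public class FileSinkNamingDemo {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Finished part files are named <prefix>-<subtask>-<counter><suffix>.
        OutputFileConfig fileConfig = OutputFileConfig.builder()
                .withPartPrefix("my-data")   // placeholder prefix
                .withPartSuffix(".json")     // placeholder suffix
                .build();

        FileSink<String> sink = FileSink
                .forRowFormat(new Path("/tmp/output"), new SimpleStringEncoder<String>("UTF-8"))
                .withOutputFileConfig(fileConfig)
                .build();

        env.fromElements("a", "b", "c").sinkTo(sink);
        env.execute("file-sink-naming-demo");
    }
}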