Hi
Afaik the commit-files action happens in the committer operator instead of
on the JM side after the new sink API [1].
This means it would not happen if you use the new `FlinkSink` [2].
[1]
https://cwiki.apache.org/confluence/display/FLINK/FLIP-143%3A+Unified+Sink+API
[2]
https://github.com/apach
Can anyone help me with this?
Thanks in advance,
On Tue, Apr 19, 2022 at 4:28 PM Dongwon Kim wrote:
> Hi,
>
> I'm using Flink-1.14.4 and failed to load in WindowReaderFunction the
> state of a stateful trigger attached to a session window.
> I found that the following data become available in W
Yeah, I already tried that way. However, we did not use DataStream in the
first place, so we cannot make use of savepoints: according to the docs, when
we use the Table API (SQL API) the operator uids are generated automatically,
which means we cannot restore state if the system crashes.
Best,
Quynh
Sent from Mail for Windows
From:
DebeziumAvroRowDeserializationSchema and
DebeziumJsonRowDeserializationSchema are still not supported in
Python DataStream API.
Just took a further look at the Java implementations of
DebeziumAvroDeserializationSchema and DebeziumJsonDeserializationSchema:
the result type is RowData instead of Row.
Yes, we should support them.
For now, if you want to use them, you could create them in your own
project. You could refer to AvroRowDeserializationSchema [1] as an example.
It should not be complicated as it's simply a wrapper of the
Java implementation.
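To make the "thin wrapper" idea concrete, here is a minimal sketch of the pattern. The names `FakeJvm` and `create_debezium_json_schema` are hypothetical stand-ins used only for illustration; real PyFlink code obtains the Java object through the Py4J gateway (`pyflink.java_gateway.get_gateway()`) instead.

```python
# Illustrative sketch of the "thin wrapper" pattern PyFlink uses for its
# deserialization schemas: the Python class only stores a reference to the
# underlying Java object, which does all the real work.

class DeserializationSchema:
    """Base class that just holds the wrapped Java schema object."""

    def __init__(self, j_deserialization_schema=None):
        self._j_deserialization_schema = j_deserialization_schema


class DebeziumJsonRowDeserializationSchema(DeserializationSchema):
    """Python-side wrapper; the actual deserialization happens in Java."""

    def __init__(self, jvm, type_info):
        # In real PyFlink this would instantiate the Java Debezium JSON
        # schema (and would also need to account for its result type being
        # RowData rather than Row, as noted above).
        super().__init__(jvm.create_debezium_json_schema(type_info))


class FakeJvm:
    """Stand-in for the Py4J gateway's `jvm` view, for demonstration only."""

    def create_debezium_json_schema(self, type_info):
        return f"JavaDebeziumJsonSchema({type_info})"


schema = DebeziumJsonRowDeserializationSchema(FakeJvm(), "ROW<id INT>")
print(schema._j_deserialization_schema)  # JavaDebeziumJsonSchema(ROW<id INT>)
```

The Python class adds no logic of its own, which is why wrapping a new Java schema "should not be complicated".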
Regards,
Dian
[1]
https://github.com/apach
Regarding the problem of `python setup.py install` vs `pip install
apache-flink==1.14.4`, I have created an issue
https://issues.apache.org/jira/browse/FLINK-27373 to track it.
On Mon, Apr 25, 2022 at 9:42 AM Dian Fu wrote:
> Hi John,
>
> I'm also using MacOS. This is the steps I'm following which I
Hi all,
Since FlinkKafkaConsumer and FlinkKafkaProducer will be deprecated in
Flink 1.15 [1], we are trying to migrate to the KafkaSource and
KafkaSink APIs [2]. At the same time, we also modified the serializers [3].
Our Kafka settings are not changed [4].
The services are very stable befor
Thanks Dian!! I really appreciate this. However, I have another question
related to this: does DataStream support DebeziumAvroRowDeserializationSchema
and DebeziumJsonRowDeserializationSchema in PyFlink, either in the current
version or in a future release? I looked at the documentation and it seems it
is not
Hi,
We are trying to migrate our application from `Flink on standalone Kubernetes`
to `Application mode on the Flink operator`. However, we cannot successfully
configure RocksDB state to use the local SSD. Any thoughts?
Detail:
In original `Flink on standalone Kubernetes`:
- set `io.tmp.dirs` to local SSD an
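For comparison, a minimal sketch of the flink-conf.yaml keys involved. The mount path /mnt/ssd and the volume mechanism are assumptions here; with the operator, the pod template for the TaskManager pods would have to mount the local SSD at that path (e.g. via a hostPath or local-PV volumeMount).

```yaml
# flink-conf.yaml (sketch; /mnt/ssd is an assumed mount point provided
# by the TaskManager pod template)
state.backend: rocksdb
state.backend.rocksdb.localdir: /mnt/ssd/rocksdb   # RocksDB working directory
io.tmp.dirs: /mnt/ssd/tmp                          # local temp/spill directories
```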
I start a standalone session on a single server with only one TaskManager.
The JVM metaspace grows after submitting a new job, and even if I cancel the
submitted job, the metaspace does not shrink. After about 15 submissions, the
task manager was shut down because of an OOM:
2022-04-2
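Metaspace growth across repeated submissions is typically caused by user-code classloaders that are not released when a job is cancelled. As a stop-gap while the leak is investigated, the metaspace limit can be raised in flink-conf.yaml (512m below is an illustrative value, not a recommendation):

```yaml
# flink-conf.yaml (sketch): raise the TaskManager metaspace limit
# as a mitigation; the default is considerably lower
taskmanager.memory.jvm-metaspace.size: 512m
```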
Hi all,
We recently encountered a random issue. When our Flink application is
creating a checkpoint, it occasionally fails because it thinks the metadata
file of the checkpoint already exists. However, the metadata file does not
actually exist. We use Flink version 1.14.4 and the checkpoints are st
Hi John,
I'm also using MacOS. These are the steps I'm following, which I have run
successfully:
1) python3 -m venv .venv
2) source .venv/bin/activate
3) pip install apache-flink==1.14.4
4) python -c "import pyflink;import
os;print(os.path.dirname(os.path.abspath(pyflink.__file__))+'/log')"
It will p
And now when I add further dependencies to the classpath to remove all
ClassNotFound exceptions, I get a different error which I don't understand
(TypeError: Could not found the Java class
'EnvironmentSettings.inStreamingMode'.), see the logs below:
$ python test_table_api.py TableTests.test_s
I get a bit further when I add all of the transitive dependencies to the
classpath, where I download these by calling mvn twice:
mkdir -p out
mvn org.apache.maven.plugins:maven-dependency-plugin:2.10:copy
-Dartifact=org.apache.flink:flink-python_2.11:1.14.4:pom
-DoutputDirectory=$(pwd)/out
m
Hi Dian,
Thank you very much, that's very helpful. I'm seeing a couple of errors when I
try to run the example though (Python 3.8 on Mac OS).
1. I create a fresh Python virtual env: `python -m venv .venv`
2. `source .venv/bin/activate`
3. When I tried to configure the project by runni