What is the parallelism error you're facing?
Att,
Pedro Mázala
Be awesome
On Mon, 26 May 2025 at 10:13, Sambhav Gupta wrote:
> Hi Team,
>
> We are migrating our Flink codebase to the v2.1 version. We were using
> DataSet jobs which we need to migrate to DataStream now, and while doin
Hi Pavel,
Hadoop is already covered by the parent-first classloading patterns by
default [1], but I have tried this as well. Switching to parent-first
classloading does not help either.
1 -
https://nightlies.apache.org/flink/flink-docs-release-1.20/docs/deployment/config/#classloader-parent-first-patterns-defaul
Hello Pedro!
I am adding the jar under /opt/flink/plugins/s3-fs-hadoop inside a Docker
image. It's definitely not happening in the TaskManager, and I don't
believe it's happening in the JobManager either. The error is coming from
the FlinkSessionJob Kubernetes Custom Resource Definition.
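For reference, a minimal sketch of how the s3-fs-hadoop plugin is typically laid out in a custom image: Flink discovers filesystem plugins in dedicated subdirectories under /opt/flink/plugins, one plugin per folder. The base image tag and the jar version below are assumptions; match them to your actual Flink version.

```dockerfile
# Hypothetical sketch: copy the bundled s3-fs-hadoop jar from /opt/flink/opt
# into its own plugin directory, so the plugin classloader can pick it up.
FROM flink:1.20
RUN mkdir -p /opt/flink/plugins/s3-fs-hadoop && \
    cp /opt/flink/opt/flink-s3-fs-hadoop-1.20.0.jar /opt/flink/plugins/s3-fs-hadoop/
```

If the jar ends up on the classpath (e.g. under /opt/flink/lib) instead of in a plugin subdirectory, the `s3://` scheme may not be registered the way the plugin mechanism expects.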
Hey,
There is no error with the parallelism. I want to increase it for this
function because it is creating a disk-space bottleneck, but I am not able
to.
I tried using setParallelism() here, but I think it doesn't comply with the
Flink schema and uses the default parallelism instead.
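For what it's worth, one detail that often causes this: keyBy() is a repartitioning step rather than an operator, so setParallelism() has to be called on the operator that follows it (the window apply/process), not on the keyed stream itself. A rough fragment of what that might look like; all names here (Event, MyWindowFunction) are placeholders, not anything from your job:

```
// Hypothetical sketch: set parallelism on the windowed operator,
// which is the first real operator after the keyBy repartitioning.
DataStream<Result> joined = leftStream
    .keyBy(Event::getKey)
    .window(TumblingEventTimeWindows.of(Duration.ofMinutes(1)))
    .apply(new MyWindowFunction())
    .setParallelism(32); // must not exceed the job's max parallelism
```

If the value still falls back to the default, it would help to see the exact call site and whether the operator shows the expected parallelism in the web UI.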
Can you p
Hi Aleksandr,
you could try the following configuration option:
classloader.parent-first-patterns.additional: "org.apache.hadoop"
to force Flink to load Hadoop classes with the parent ClassLoader.
No guarantees, but maybe it will solve your problem.
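For reference, that option goes into the Flink configuration file (flink-conf.yaml, or config.yaml on newer versions); multiple patterns are separated by semicolons:

```yaml
# Append Hadoop classes to the parent-first classloading patterns.
# Semicolon-separate the list if you need more than one pattern.
classloader.parent-first-patterns.additional: "org.apache.hadoop"
```

This only appends to the default parent-first patterns; it does not replace them.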
On Mon, 2025-05-12 at 11:26 +0100, Aleksandr Pilipen
Hi Team,
We are migrating our Flink codebase to the v2.1 version. We were using
DataSet jobs which we need to migrate to DataStream now, and while doing
this we hit a parallelism error in the keyBy and window functions of our
full outer join, which is creating a bottleneck for us in case of
Hello there Jean!
This may be related to a network issue between Flink and MinIO/S3. I had
this in the past, and I configured Flink to not start if the state could
not be restored. So every time I received one of those errors, Flink would
restart and try again.
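The restart-and-retry behaviour described above can be approximated with Flink's restart strategy configuration; the exact attempt count and delay below are assumptions, not values from the original setup:

```yaml
# Hypothetical sketch: fail fast when state cannot be restored, and let
# the restart strategy retry the job a bounded number of times.
restart-strategy.type: fixed-delay
restart-strategy.fixed-delay.attempts: 5
restart-strategy.fixed-delay.delay: 30 s
```

With a fixed-delay strategy, a transient network blip between Flink and the object store gets retried automatically instead of leaving the job in a failed state.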
Att,
Pedro Mázala
Be awesome
On Thu, 22 May
Hello there Bryan!
It looks like Flink cannot find the s3 scheme in your packages. How are you
adding the jars? Is the error happening on the TM or on the JM?
Att,
Pedro Mázala
Be awesome
On Thu, 22 May 2025 at 19:45, Bryan Cantos wrote:
> Hello,
>
> I have deployed the Flink Operator via helm chart