Hi Martijn,

Many thanks for your reply. Yes, I have seen the examples. I removed the 
flink-scala dependency and only use the Java libraries for everything, so 
there should be no flink-scala API references in the stack.

These are the flink dependencies we are using:

"org.apache.flink" % "flink-core" % flinkVersion,
"org.apache.flink" % "flink-streaming-java" % flinkVersion,
"org.apache.flink" % "flink-table-api-java" % flinkVersion,
"org.apache.flink" % "flink-table-api-java-bridge" % flinkVersion,
"org.apache.flink" % "flink-table-runtime" % flinkVersion,
"org.apache.flink" % "flink-clients" % flinkVersion,
"org.apache.flink" % "flink-connector-base" % flinkVersion,
"org.apache.flink" % "flink-table-planner-loader" % flinkVersion,
"org.apache.flink" % "flink-connector-kafka" % flinkVersion,
"org.apache.flink" % "flink-statebackend-rocksdb" % flinkVersion % Provided,
"org.apache.flink" % "flink-avro" % flinkVersion,
"org.apache.flink" % "flink-avro-confluent-registry" % flinkVersion,

The code path where it fails is not directly related to Flink (it does not 
fail in the unit tests):


Map(
  PsNowTag -> new OutputTag[T](PsNowTag, typeInformation),
  SCRNTag -> new OutputTag[T](SCRNTag, typeInformation),
  GMSTag -> new OutputTag[T](GMSTag, typeInformation),
  MuleTag -> new OutputTag[T](MuleTag, typeInformation),
  FutureTag -> new OutputTag[T](FutureTag, typeInformation),
)
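For context, the Map(...) varargs call above is likely what drags ScalaRunTime into the picture: the compiler wraps the argument array via scala.runtime.ScalaRunTime.wrapRefArray, and that method's return type changed between Scala versions (WrappedArray in 2.12, immutable.ArraySeq in 2.13). A minimal standalone sketch of roughly what the compiler emits (the object and values are hypothetical, just for illustration):

```scala
// Hypothetical sketch: approximately what the compiler desugars a varargs
// Map(...) call into. Compiled against Scala 2.13, wrapRefArray is expected
// to return scala.collection.immutable.ArraySeq; if a Scala 2.12
// scala-library ends up on the runtime classpath instead, its wrapRefArray
// has a different return type (WrappedArray), which surfaces exactly as the
// NoSuchMethodError quoted below.
object WrapRefArrayDemo {
  def main(args: Array[String]): Unit = {
    val entries: Array[(String, Int)] = Array("a" -> 1, "b" -> 2)
    // The explicit wrapRefArray call the compiler inserts for varargs:
    val m = Map(scala.runtime.ScalaRunTime.wrapRefArray(entries): _*)
    println(m("a")) // 1, when a matching 2.13 scala-library is loaded
  }
}
```

So the failure point is not the OutputTag code itself but the Scala collection machinery it is compiled against.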

The failure looks like a Scala runtime incompatibility; Scala 2.12 and 
Scala 2.13 collections are not binary compatible:

java.lang.NoSuchMethodError: 'scala.collection.immutable.ArraySeq 
scala.runtime.ScalaRunTime$.wrapRefArray(java.lang.Object[])'

This seems odd: it looks like Flink is still using the Scala 2.12 runtime 
even though the flink-scala packages are not installed. The question is why.
I also wonder whether it is possible to completely exclude Scala from the 
Flink Docker image.
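On that last point: per the Scala-free Flink blog post [1], the distribution's bundled Scala artifacts live in the lib/ directory, so a custom image could delete them before the job's own Scala 2.13 library is loaded. A hypothetical sketch (the exact jar names in lib/ depend on the Flink version, so verify them first):

```dockerfile
# Hypothetical sketch: build on the official image and remove the bundled
# Scala 2.12 artifacts from lib/, leaving the job's own Scala 2.13 library
# as the only scala-library on the classpath. Check the actual jar names
# shipped in lib/ for your Flink version before relying on this.
FROM flink:1.17
RUN rm -f "$FLINK_HOME"/lib/flink-scala*.jar
```

Removing the flink-scala jar disables Flink's Scala APIs, which should be acceptable for a job that only uses the Java APIs, as above.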

Best,

Patrick
--
Patrick Eifler
Staff Software Engineer
FTG Data Engineering
Sony Interactive Entertainment
Kemperplatz 1
10785 Berlin

From: Martijn Visser <martijnvis...@apache.org>
Date: Tuesday, 9. January 2024 at 16:41
To: Eifler, Patrick <patrick.eif...@sony.com>
Cc: user@flink.apache.org <user@flink.apache.org>
Subject: Re: Flink 1.17 with Scala 2.13 or Scala 3
Hi Patrick,

You're on the right track: you can't use any of the Flink Scala APIs if
you want to use an arbitrary Scala version. Have you seen the examples
with Scala 3? [2] Do you have an example of your code/setup?

Best regards,

Martijn

[1] https://flink.apache.org/2022/02/22/scala-free-in-one-fifteen/
[2] https://github.com/sjwiesman/flink-scala-3

On Tue, Jan 9, 2024 at 4:16 PM Eifler, Patrick <patrick.eif...@sony.com> wrote:
>
> Hi,
>
>
>
> The Flink image still ships Scala 2.12 dependencies. I tried to run a Flink 
> job written in Scala 2.13, avoiding all flink-scala APIs, but I am getting an 
> incompatibility issue (regarding scala.collection) that I would normally 
> expect in a project where Scala 2.13 and Scala 2.12 run alongside each other.
>
> So I wonder what I am missing, as the docs and Jira tickets say it now 
> works with any Scala version.
> Any pointers are appreciated. Thanks.
>
>
>
> Best,
>
> --
>
> Patrick Eifler
