By the way, I also upgraded flink-connector-kafka to version 3.0.2-1.18, to no
avail.
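In case it helps narrow this down: a quick way to check what the test JVM's
classpath actually contains is a throwaway helper like the sketch below
(whereIs is an ad-hoc name, not a Flink API):

```scala
// Ad-hoc diagnostic (not from Flink): report whether a class is visible to the
// current classloader and, if so, which jar it was loaded from.
def whereIs(name: String): String =
  try {
    val cls = Class.forName(name)
    Option(cls.getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)
      .getOrElse("<bootstrap/jdk>") // JDK core classes have no code source
  } catch {
    case _: ClassNotFoundException => "<not on classpath>"
  }

// Run inside the failing test JVM to see where (or whether) the class resolves:
println(whereIs("org.apache.flink.api.common.ExecutionConfig"))
```

If it prints "<not on classpath>" in the test run, the dependency really is
missing from the test scope; if it prints a jar location, the problem is more
likely a classloader mismatch than a missing dependency.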

On Sun, Dec 3, 2023 at 7:45 AM Barak Ben-Nathan <barak...@xmcyber.com>
wrote:

> Thanks, Jim,
>
> Unfortunately, this did not resolve the issue.
>
> I tried downgrading to 1.17.2, and everything works fine.
> In 1.18.0 I also added flink-clients (in addition to flink-core), but the
> problem persists.
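> For reference, the additions amounted to roughly the following in build.sbt
> (a sketch; the exact scopes are my guess):
>
> ```scala
> // Sketch: explicitly adding flink-core and flink-clients, since connectors
> // no longer pull in flink-core transitively.
> libraryDependencies ++= Seq(
>   "org.apache.flink" % "flink-core"    % flinkVersion,
>   "org.apache.flink" % "flink-clients" % flinkVersion
> )
> ```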
>
>
> On Fri, Dec 1, 2023 at 10:44 PM Jim Hughes via user <user@flink.apache.org>
> wrote:
>
>> Hi Barak,
>>
>> The missing class lives in "flink-core"; I think adding that dependency
>> will provide it.
>>
>> The 1.14 release notes mention that connectors no longer bundle
>> "flink-core". I suspect this is what caused your issue.
>>
>> https://nightlies.apache.org/flink/flink-docs-release-1.18/release-notes/flink-1.14/#connector-base-exposes-dependency-to-flink-core
>>
>> Cheers,
>>
>> Jim
>>
>> On Fri, Dec 1, 2023 at 3:30 PM Barak Ben-Nathan <barak...@xmcyber.com>
>> wrote:
>>
>>>
>>> Hi,
>>>
>>> I am trying to upgrade my app to Flink 1.18.
>>>
>>> I have tests that run my stream from/to an embedded (in-memory) Kafka.
>>> That is, they create a Flink cluster like this:
>>>
>>> val flinkCluster = new MiniClusterWithClientResource(
>>>   new MiniClusterResourceConfiguration.Builder()
>>>     .setNumberSlotsPerTaskManager(2)
>>>     .setNumberTaskManagers(1)
>>>     .build)
>>>
>>> before {
>>>   flinkCluster.before()
>>> }
>>>
>>> after {
>>>   flinkCluster.after()
>>> }
>>>
>>> get an execution environment:
>>>
>>> val env = StreamExecutionEnvironment.getExecutionEnvironment
>>>
>>> build the pipeline and execute it.
>>>
>>> These tests fail due to:
>>>
>>> org.apache.flink.runtime.client.JobInitializationException: Could not start the JobMaster.
>>>   at org.apache.flink.runtime.jobmaster.DefaultJobMasterServiceProcess.lambda$new$0(DefaultJobMasterServiceProcess.java:97)
>>>   at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
>>>   at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
>>>   at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
>>>   at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705)
>>>   at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>>>   at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>>>   at java.base/java.lang.Thread.run(Thread.java:829)
>>> Caused by: java.util.concurrent.CompletionException: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.flink.api.common.ExecutionConfig
>>>   at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)
>>>   at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)
>>>   at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)
>>>   ... 3 more
>>> Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.flink.api.common.ExecutionConfig
>>>   at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:321)
>>>   at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedSupplier$4(FunctionUtils.java:114)
>>>   at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
>>>   ... 3 more
>>> Caused by: java.lang.ClassNotFoundException: org.apache.flink.api.common.ExecutionConfig
>>>   at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
>>>   at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
>>>   at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:527)
>>>   at java.base/java.lang.Class.forName0(Native Method)
>>>   at java.base/java.lang.Class.forName(Class.java:398)
>>>   at org.apache.flink.util.InstantiationUtil$ClassLoaderObjectInputStream.resolveClass(InstantiationUtil.java:78)
>>>   at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2003)
>>>   at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1870)
>>>   at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2201)
>>>   at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1687)
>>>   at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:489)
>>>   at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:447)
>>>   at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:539)
>>>   at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:527)
>>>   at org.apache.flink.util.SerializedValue.deserializeValue(SerializedValue.java:67)
>>>   at org.apache.flink.runtime.scheduler.DefaultSchedulerFactory.createInstance(DefaultSchedulerFactory.java:101)
>>>   at org.apache.flink.runtime.jobmaster.DefaultSlotPoolServiceSchedulerFactory.createScheduler(DefaultSlotPoolServiceSchedulerFactory.java:122)
>>>   at org.apache.flink.runtime.jobmaster.JobMaster.createScheduler(JobMaster.java:379)
>>>   at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:356)
>>>   at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.internalCreateJobMasterService(DefaultJobMasterServiceFactory.java:128)
>>>   at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.lambda$createJobMasterService$0(DefaultJobMasterServiceFactory.java:100)
>>>   at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedSupplier$4(FunctionUtils.java:112)
>>>   ... 4 more
>>>
>>> My flink dependencies are:
>>>
>>> val flinkDependencies = Seq(
>>>   // "org.apache.flink" %% "flink-scala" % flinkVersion % Provided,
>>>   "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % Provided,
>>>   // "org.apache.flink" % "flink-streaming-java" % flinkVersion % Provided,
>>>   "org.apache.flink" % "flink-connector-base" % "1.18.0",
>>>   "org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18")
>>>
>>> val testFlinkDependencies = Seq(
>>>   // flink-test-utils brings log4j-slf4j-impl, which causes java.lang.NoSuchMethodError when running tests
>>>   // https://stackoverflow.com/questions/75710149/java-lang-nosuchmethoderror-void-org-apache-logging-slf4j-log4jloggerfactory
>>>   "org.apache.flink" % "flink-test-utils" % flinkVersion % Test excludeAll (
>>>     ExclusionRule(organization = "org.apache.logging.log4j", name = "log4j-slf4j-impl")
>>>   ),
>>>   "org.apache.flink" % "flink-runtime" % flinkVersion % Test,
>>>   "org.apache.flink" % "flink-streaming-java" % flinkVersion % Test classifier "tests"
>>> )
>>>
>>> This used to work fine in Flink 1.13.
>>>
>>> Does anyone have an idea what is causing this?
>>>
