Hi Leonard, Chesnay, thanks for having a look. I was able to sort this out - it 
was caused by the default classloading policy changing to child-first, 
introduced in 1.10 through https://issues.apache.org/jira/browse/FLINK-13749. 
Once I changed it back to parent-first, I was able to submit jobs.
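For reference, the change amounts to a single line in flink-conf.yaml (classloader.resolve-order is the standard Flink option for this; adjust to however you manage configuration):

    classloader.resolve-order: parent-first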

Hopefully any other devs who have similar issues will find this thread useful :)

// ah

From: Leonard Xu <xbjt...@gmail.com>
Sent: Friday, October 16, 2020 1:10 AM
To: Chesnay Schepler <ches...@apache.org>
Cc: Hailu, Andreas [Engineering] <andreas.ha...@ny.email.gs.com>; 
user@flink.apache.org
Subject: Re: Runtime Dependency Issues Upgrading to Flink 1.11.2 from 1.9.2

Hi Chesnay,

@Leonard I noticed you handled a similar case on the Chinese ML in July 
(http://apache-flink.147419.n8.nabble.com/flink1-11-td5154.html), do you have 
any insights?

The case on the Chinese ML was that the user added jakarta.ws.rs-api-3.0.0-M1.jar 
to Flink/lib, which led to the dependency conflict; Hailu's case looks different.

Hi @Hailu,
The Hadoop dependency jersey-core-1.9.jar contains the class 
javax.ws.rs.RuntimeDelegate, and the javax.ws.rs:javax.ws.rs-api dependency in 
your shaded jar also contains javax.ws.rs.RuntimeDelegate. I suspect the 
ClassCastException comes from there.
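A quick way to confirm this (just a diagnostic sketch, assuming the javax.ws.rs API classes are on the runtime classpath) is to print which jar RuntimeDelegate actually resolves to:

    import java.net.URL;
    import javax.ws.rs.ext.RuntimeDelegate;

    public class WhichJar {
        public static void main(String[] args) {
            // getCodeSource() can be null for bootstrap-loaded classes,
            // but not for classes provided by a jar on the classpath.
            URL location = RuntimeDelegate.class
                    .getProtectionDomain()
                    .getCodeSource()
                    .getLocation();
            System.out.println("RuntimeDelegate loaded from: " + location);
        }
    }

Running this from the client and from the job should show whether the class comes from jersey-core-1.9.jar or from your shaded jar.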

Best
Leonard





On 10/15/2020 7:51 PM, Hailu, Andreas wrote:
Hi Chesnay, no, we haven't changed our Hadoop version. The only changes were 
the update to the 1.11.2 runtime dependencies listed earlier, as well as compiling 
against flink-clients in some of our modules, since we were previously relying on 
the transitive dependency. Our 1.9.2 jobs are still able to run just fine, which is 
interesting.

// ah

From: Chesnay Schepler <ches...@apache.org>
Sent: Thursday, October 15, 2020 7:34 AM
To: Hailu, Andreas [Engineering] <andreas.ha...@ny.email.gs.com>; 
user@flink.apache.org
Subject: Re: Runtime Dependency Issues Upgrading to Flink 1.11.2 from 1.9.2

I'm not aware of any Flink module bundling this class. Note that this class is 
also bundled in jersey-core (which is also on your classpath), so it appears 
that there is a conflict between this jar and your shaded one.
Have you changed the Hadoop version you are using, or how you provide the Hadoop 
dependencies to Flink?
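One way to check for the conflict (a minimal sketch, not something Flink ships) is to list every copy of the class that the classpath exposes:

    import java.io.IOException;
    import java.net.URL;
    import java.util.Enumeration;

    public class FindDuplicateClasses {
        public static void main(String[] args) throws IOException {
            // Lists every jar/directory on the classpath that contains the class file
            Enumeration<URL> copies = Thread.currentThread()
                    .getContextClassLoader()
                    .getResources("javax/ws/rs/ext/RuntimeDelegate.class");
            while (copies.hasMoreElements()) {
                System.out.println(copies.nextElement());
            }
        }
    }

Seeing both jersey-core-1.9.jar and the fat-shaded jar in the output would confirm the duplicate.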

On 10/14/2020 6:56 PM, Hailu, Andreas wrote:
Hi team! We're trying to upgrade our applications from 1.9.2 to 1.11.2. After 
re-compiling and updating our runtime dependencies to use 1.11.2, we see this 
LinkageError:

Caused by: java.lang.LinkageError: ClassCastException: attempting to cast jar:file:/local/data/scratch/hailua_p2epdlsuat/flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar!/javax/ws/rs/ext/RuntimeDelegate.class to jar:file:/local/data/scratch/hailua_p2epdlsuat/flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar!/javax/ws/rs/ext/RuntimeDelegate.class
        at javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:146) ~[flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar:?]
        at javax.ws.rs.ext.RuntimeDelegate.getInstance(RuntimeDelegate.java:120) ~[flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar:?]
        at javax.ws.rs.core.MediaType.valueOf(MediaType.java:179) ~[flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar:?]
        at com.sun.jersey.core.header.MediaTypes.<clinit>(MediaTypes.java:64) ~[jersey-core-1.9.jar:1.9]
        at com.sun.jersey.core.spi.factory.MessageBodyFactory.initReaders(MessageBodyFactory.java:182) ~[jersey-core-1.9.jar:1.9]
        at com.sun.jersey.core.spi.factory.MessageBodyFactory.initReaders(MessageBodyFactory.java:175) ~[jersey-core-1.9.jar:1.9]
        at com.sun.jersey.core.spi.factory.MessageBodyFactory.init(MessageBodyFactory.java:162) ~[jersey-core-1.9.jar:1.9]
        at com.sun.jersey.api.client.Client.init(Client.java:342) ~[flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar:?]
        at com.sun.jersey.api.client.Client.access$000(Client.java:118) ~[flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar:?]
        at com.sun.jersey.api.client.Client$1.f(Client.java:191) ~[flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar:?]
        at com.sun.jersey.api.client.Client$1.f(Client.java:187) ~[flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar:?]
        at com.sun.jersey.spi.inject.Errors.processWithErrors(Errors.java:193) ~[jersey-core-1.9.jar:1.9]
        at com.sun.jersey.api.client.Client.<init>(Client.java:187) ~[flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar:?]
        at com.sun.jersey.api.client.Client.<init>(Client.java:170) ~[flink-ingest-refiner-sandbox-SNAPSHOT-fat-shaded.jar:?]
        at org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl.serviceInit(TimelineClientImpl.java:285) ~[hadoop-yarn-common-2.7.3.2.6.3.0-235.jar:?]
        at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
        at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getTimelineDelegationToken(YarnClientImpl.java:355) ~[hadoop-yarn-client-2.7.3.2.6.3.0-235.jar:?]
        at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.addTimelineDelegationToken(YarnClientImpl.java:331) ~[hadoop-yarn-client-2.7.3.2.6.3.0-235.jar:?]
        at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.submitApplication(YarnClientImpl.java:250) ~[hadoop-yarn-client-2.7.3.2.6.3.0-235.jar:?]
        at org.apache.flink.yarn.YarnClusterDescriptor.startAppMaster(YarnClusterDescriptor.java:1002) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
        at org.apache.flink.yarn.YarnClusterDescriptor.deployInternal(YarnClusterDescriptor.java:524) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
        at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:424) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
        at org.apache.flink.client.deployment.executors.AbstractJobClusterExecutor.execute(AbstractJobClusterExecutor.java:70) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
        at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:973) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
        at org.apache.flink.client.program.ContextEnvironment.executeAsync(ContextEnvironment.java:124) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
        at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:72) ~[flink-dist_2.11-1.11.2.jar:1.11.2]

I'll note that the flink-ingest-refiner jar is the shaded application JAR that we 
use to submit jobs.
Looking into which dependencies have changed: on 1.9.2, our runtime dependencies 
from the available artifacts (sourced from one of the many mirrors) are:
1. flink-dist_2.11-1.9.2.jar
2. flink-table-blink_2.11-1.9.2.jar
3. flink-table_2.11-1.9.2.jar
4. log4j-1.2.17.jar
5. slf4j-log4j12-1.7.15.jar

Whereas 1.11.2's dependencies are:
1. flink-dist_2.11-1.11.2.jar
2. flink-table-blink_2.11-1.11.2.jar
3. flink-table_2.11-1.11.2.jar
4. log4j-1.2-api-2.12.1.jar
5. log4j-api-2.12.1.jar
6. log4j-core-2.12.1.jar
7. log4j-slf4j-impl-2.12.1.jar

RuntimeDelegate comes from the javax.ws.rs:javax.ws.rs-api module, which we use 
internally for some of our REST implementations. I've been trying a few things 
here to no avail, such as declaring our dependency on rs-api as compileOnly. 
Curious to hear your thoughts - was there a version change in one of the above 
listed 1.11.2 modules?

____________

Andreas Hailu
Data Lake Engineering | Goldman Sachs & Co.


________________________________

Your Personal Data: We may collect and process information about you that may 
be subject to data protection laws. For more information about how we use and 
disclose your personal data, how we protect your information, our legal basis 
to use your information, your rights and who you can contact, please refer to: 
www.gs.com/privacy-notices<http://www.gs.com/privacy-notices>

