Hi,

It is possible that one of the jars in the sharelib also contains an embedded
(and not correctly shaded) Jackson 2.9.0. I had a similar problem once with
hive-exec.jar. You can check the contents of the jars with a simple unzip.
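For example, a quick scan along these lines (just a sketch: it assumes you have
first pulled the sharelib jars down to a local directory, e.g. with hdfs dfs -get,
and the object name and directory argument are only placeholders) lists every jar
that bundles the jackson-databind class from your stack trace:

import java.io.File
import java.util.jar.JarFile
import scala.collection.JavaConverters._

// Rough sketch: point this at a local copy of a sharelib directory and it
// prints every jar that embeds jackson-databind classes. The marker class is
// the one from the NoClassDefFoundError; the directory path is a placeholder.
object FindEmbeddedJackson {
  private val Marker = "com/fasterxml/jackson/databind/SerializationConfig.class"

  def main(args: Array[String]): Unit = {
    val dir  = new File(args(0))
    val jars = Option(dir.listFiles()).getOrElse(Array.empty[File])
      .filter(_.getName.endsWith(".jar"))

    for (jarPath <- jars) {
      val jar = new JarFile(jarPath)
      try {
        // Scan the jar's entries for the embedded jackson-databind class.
        val embedsJackson = jar.entries().asScala.exists(_.getName == Marker)
        if (embedsJackson)
          println(s"${jarPath.getName} contains jackson-databind classes")
      } finally {
        jar.close()
      }
    }
  }
}

Running it against a local copy of, say, /user/oozie/share/lib/lib_{TS}/spark/
should narrow down which jar drags in the extra Jackson copy; unzip -l on the
suspect jar then shows exactly what is embedded.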
Best,
Sala

On Mon, Apr 15, 2019 at 8:51 AM Lian Jiang <jiangok2...@gmail.com> wrote:

> Thanks guys for the reply.
>
> I am using jackson 2.6.7, which is in the spark 2.3.1 library.
>
> I deleted all files under /user/oozie/share/lib/lib_{TS}/oozie/ (except
> oozie-hadoop-utils-hadoop-2-4.3.1.3.0.0.0-1634.jar and
> oozie-sharelib-oozie-4.3.1.3.0.0.0-1634.jar). Also, I made
> /user/oozie/share/lib/lib_{TS}/spark/ have exactly the same jars as those
> used in building my spark assembly (which works via spark-submit). But I
> still got the above exception.
>
> Searching for 2.9.0 in the spark job log returns nothing. I guess some of
> the loaded jars have jackson 2.9.0. Is there a way to search for jackson
> 2.9.0 in the loaded jars?
>
> I attached the loaded jars. Thanks for any help!
>
> On Sun, Apr 14, 2019 at 11:03 PM Andras Salamon
> <asala...@cloudera.com.invalid> wrote:
>
>> Hi,
>>
>> Probably multiple jars in the spark sharelib contain the jackson library
>> and the system loads an incorrect one. If you turn on Java verbose class
>> loading, you can find which jar is loaded and you can clean up the
>> sharelib.
>>
>> Best,
>> Sala
>>
>> On Sat, Apr 13, 2019 at 8:05 PM Lian Jiang <jiangok2...@gmail.com> wrote:
>>
>> > I tried spark.yarn.user.classpath.first = true and
>> > oozie.launcher.mapreduce.user.classpath.first = true but no luck.
>> >
>> > On Wed, Apr 10, 2019 at 11:47 PM Lian Jiang <jiangok2...@gmail.com> wrote:
>> >
>> > > Hi,
>> > >
>> > > I am using Hortonworks HDP 3.0, which has oozie 4.3.1 and spark 2.3.1.
>> > > My spark job throws an intermittent dependency error:
>> > >
>> > > 2019-04-11 06:29:07,178 [Driver] ERROR
>> > > org.apache.spark.deploy.yarn.ApplicationMaster - User class threw
>> > > exception: java.lang.NoClassDefFoundError: Could not initialize class
>> > > com.fasterxml.jackson.databind.SerializationConfig
>> > >
>> > > java.lang.NoClassDefFoundError: Could not initialize class
>> > > com.fasterxml.jackson.databind.SerializationConfig
>> > >
>> > > at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:565)
>> > > at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:480)
>> > >
>> > > Caused by: com.fasterxml.jackson.databind.JsonMappingException:
>> > > Incompatible Jackson version: 2.9.0
>> > >
>> > > at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
>> > > at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
>> > > at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:751)
>> > > at org.apache.spark.util.JsonProtocol$.<init>(JsonProtocol.scala:59)
>> > > at org.apache.spark.util.JsonProtocol$.<clinit>(JsonProtocol.scala)
>> > >
>> > > The same spark job always passes when run via spark-submit. Why does
>> > > the spark action have this issue? Thanks for any clue.