Hey, Ted

> <fasterxml.jackson.version>2.4.4</fasterxml.jackson.version>
>
> Looks like Tranquility uses a different version of jackson.
>
> How do you build your jar?
I'm building a jar with dependencies using the maven-assembly-plugin. Below are all of the jackson dependencies:

[INFO] com.fasterxml.jackson.module:jackson-module-scala_2.10:jar:2.4.5:compile
[INFO] com.fasterxml.jackson.core:jackson-databind:jar:2.4.6:compile
[INFO] com.fasterxml.jackson.datatype:jackson-datatype-joda:jar:2.4.6:compile
[INFO] com.fasterxml.jackson.dataformat:jackson-dataformat-smile:jar:2.4.6:compile
[INFO] com.fasterxml.jackson.jaxrs:jackson-jaxrs-smile-provider:jar:2.4.6:compile
[INFO] com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:jar:2.4.6:compile
[INFO] com.fasterxml.jackson.module:jackson-module-jaxb-annotations:jar:2.4.6:compile
[INFO] com.fasterxml.jackson.core:jackson-core:jar:2.4.6:compile
[INFO] com.fasterxml.jackson.jaxrs:jackson-jaxrs-json-provider:jar:2.4.6:compile
[INFO] com.fasterxml.jackson.datatype:jackson-datatype-guava:jar:2.4.6:compile
[INFO] com.fasterxml.jackson.core:jackson-annotations:jar:2.4.6:compile
[INFO] org.json4s:json4s-jackson_2.10:jar:3.2.10:provided
[INFO] org.codehaus.jackson:jackson-xc:jar:1.8.3:provided
[INFO] org.codehaus.jackson:jackson-jaxrs:jar:1.8.3:provided
[INFO] org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13:compile
[INFO] org.codehaus.jackson:jackson-core-asl:jar:1.9.13:compile
[INFO] com.google.http-client:google-http-client-jackson2:jar:1.15.0-rc:compile

As you can see, my jar requires fasterxml 2.4.6. In that case, what does Spark do? Does it run my jar with my jackson lib (inside my jar), or does it use the jackson version (2.4.4) that Spark uses? Note that one of my dependencies is:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.6.1</version>
  <scope>provided</scope>
</dependency>

and jackson version 2.4.4 was not listed in the maven dependencies...

> Consider using maven-shade-plugin to resolve the conflict if you use maven.
>
> Cheers
>
> On Thu, Mar 31, 2016 at 9:50 AM, Marcelo Oikawa <marcelo.oik...@webradar.com> wrote:
>
>> Hi, list.
>>
>> We are working on a spark application that sends messages to Druid. For that, we're using Tranquility core.
>> In my local test, I'm using the "spark-1.6.1-bin-hadoop2.6" distribution and the following dependencies in my app:
>>
>> <dependency>
>>   <groupId>org.apache.spark</groupId>
>>   <artifactId>spark-streaming_2.10</artifactId>
>>   <version>1.6.1</version>
>>   <scope>provided</scope>
>> </dependency>
>> <dependency>
>>   <groupId>io.druid</groupId>
>>   <artifactId>tranquility-core_2.10</artifactId>
>>   <version>0.7.4</version>
>> </dependency>
>>
>> But I'm getting the error below when Tranquility tries to create the Tranquilizer object:
>>
>> tranquilizer = DruidBeams.fromConfig(dataSourceConfig).buildTranquilizer(tranquilizerBuider);
>>
>> The stacktrace is below:
>>
>> java.lang.IllegalAccessError: tried to access method com.fasterxml.jackson.databind.introspect.AnnotatedMember.getAllAnnotations()Lcom/fasterxml/jackson/databind/introspect/AnnotationMap; from class com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector
>> at com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector.findInjectableValueId(GuiceAnnotationIntrospector.java:39)
>> at com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.findInjectableValueId(AnnotationIntrospectorPair.java:269)
>> at com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._addDeserializerConstructors(BasicDeserializerFactory.java:433)
>> at com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._constructDefaultValueInstantiator(BasicDeserializerFactory.java:325)
>> at com.fasterxml.jackson.databind.deser.BasicDeserializerFactory.findValueInstantiator(BasicDeserializerFactory.java:266)
>> at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:266)
>> at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:168)
>> at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:399)
>> at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:348)
>> at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:261)
>> at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:241)
>> at com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:142)
>> at com.fasterxml.jackson.databind.DeserializationContext.findContextualValueDeserializer(DeserializationContext.java:380)
>> at com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.construct(PropertyBasedCreator.java:96)
>> at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.resolve(BeanDeserializerBase.java:413)
>> at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:292)
>> at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:241)
>> at com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:142)
>> at com.fasterxml.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:394)
>> at com.fasterxml.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3169)
>> at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:2767)
>> at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:2700)
>> at com.metamx.tranquility.druid.DruidBeams$.fromConfigInternal(DruidBeams.scala:192)
>> at com.metamx.tranquility.druid.DruidBeams$.fromConfig(DruidBeams.scala:119)
>> at com.metamx.tranquility.druid.DruidBeams.fromConfig(DruidBeams.scala)
>>
>> Has anyone faced this problem too?
>>
>> I know that it's related to a jackson lib conflict, but could anyone please shed some light? I created a jar with dependencies, and when I submit a job to Spark, it runs with just the libraries inside the jar, right? Where is the conflict between the jackson libraries?
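Ted, about the maven-shade-plugin suggestion: is relocating the jackson packages inside my assembly what you had in mind? Below is a rough sketch of what I'm thinking of adding to my pom; the plugin version and the shaded package name are just placeholders I picked, and I haven't tested it yet:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- Move my jackson 2.4.6 classes into a private package so they
               cannot clash with the jackson 2.4.4 that ships with Spark. -->
          <relocation>
            <pattern>com.fasterxml.jackson</pattern>
            <shadedPattern>shaded.com.fasterxml.jackson</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>

Or would it be better to leave the jar as it is and set spark.driver.userClassPathFirst / spark.executor.userClassPathFirst (still marked experimental in 1.6, as far as I can tell) so that Spark prefers the jackson 2.4.6 inside my jar over its own 2.4.4?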