Shading worked pretty well for me when I ran into an issue similar to yours. The POM is all you need to change (there is also a rough relocation sketch at the end of this message for the Jackson renaming part).
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>1.6</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>com.group.id.Launcher1</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>

> On Feb 16, 2016, at 5:10 PM, Sean Owen <so...@cloudera.com> wrote:
>
> Shading is the answer. It should be transparent to you though if you
> only apply it at the module where you create the deployable assembly
> JAR.
>
> On Tue, Feb 16, 2016 at 5:08 PM, Martin Skøtt <mar...@z3n.dk> wrote:
>> Hi,
>>
>> I recently started experimenting with Spark Streaming for ingesting and
>> enriching content from a Kafka stream. Being new to Spark I expected a bit
>> of a learning curve, but not with something as simple as using JSON data!
>>
>> I have a JAR with common classes used across a number of Java projects
>> which I would also like to use in my Spark projects, but it uses a version
>> of Jackson which is newer than the one packaged with Spark - I can't (and
>> won't) downgrade to the older version in Spark. Any suggestions on how to
>> solve this?
>>
>> I have looked at using the shade plugin to rename my version of Jackson,
>> but that would require me to change my common code, which I would like to
>> avoid.
>>
>>
>> --
>> Kind regards
>> Martin
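On the renaming concern in Martin's original mail: the relocation can live entirely in the shade configuration of the module that builds the deployable JAR, so the common library's source and published artifact don't have to change; the plugin rewrites the bytecode references when it packages the assembly. A minimal sketch, assuming the newer Jackson lives under com.fasterxml.jackson and using "shaded." as an arbitrary, illustrative prefix, added inside the <configuration> block shown above:

  <relocations>
    <relocation>
      <!-- Rewrites references in the assembly JAR only; the common
           library itself is left untouched. -->
      <pattern>com.fasterxml.jackson</pattern>
      <!-- "shaded." is just an example prefix; anything that doesn't
           clash with the Jackson packages Spark ships will do. -->
      <shadedPattern>shaded.com.fasterxml.jackson</shadedPattern>
    </relocation>
  </relocations>

With that in place, Spark's older Jackson stays at its original package, while the application and common-library classes in the assembly are rewritten to use the relocated, newer copy.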