@Victor,

I'm pretty sure I built it correctly; I specified -Dhadoop.version=2.6.0
on top of the -Phadoop-2.4 profile, which is what the build docs describe
for Hadoop 2.4 and later. Am I missing something here? I'm open to
suggestions.

./make-distribution.sh --name hadoop2.6 --tgz -Pyarn -Phadoop-2.4 \
  -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests clean package

@Ted
Well, it is building now with -Djackson.version=1.9.3; I can report back
in a few minutes on whether it works.
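
For reference, the full command now running is the one above plus Ted's
override (assuming nothing else needs to change):

./make-distribution.sh --name hadoop2.6 --tgz -Pyarn -Phadoop-2.4 \
  -Dhadoop.version=2.6.0 -Djackson.version=1.9.3 \
  -Phive -Phive-thriftserver -DskipTests clean package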

@Sean
Since it is in the middle of building, I will let it finish and try it
out, but do you see any other possible issues with the approach I have
taken?
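
If it helps narrow things down, once the build finishes I can also check
what the hadoop-2.4 profile actually resolves the Jackson property to,
run from the spark-1.2.1 source directory (assuming the property name
matches Ted's flag), and compare that against what lands in the assembly:

mvn help:evaluate -Dexpression=jackson.version -Phadoop-2.4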

Thanks all for the quick responses.

-Todd
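
P.S. Re-reading my grep output below, both matches sit under
parquet/org/codehaus/jackson/..., i.e. the copies shaded into Parquet,
not the plain org.codehaus.jackson package the stack trace asks for. Once
the new build is done I plan to grep for the unshaded class explicitly,
something along these lines:

jar -tvf spark-assembly-1.2.1-hadoop2.6.0.jar \
  | grep 'org/codehaus/jackson/map/deser/std/StdDeserializer' \
  | grep -v parquet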

On Thu, Mar 5, 2015 at 1:20 PM, Sean Owen <so...@cloudera.com> wrote:

> Jackson 1.9.13? And codehaus.jackson.version? That's already set by
> the hadoop-2.4 profile.
>
> On Thu, Mar 5, 2015 at 6:13 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> > Please add the following to the build command:
> > -Djackson.version=1.9.3
> >
> > Cheers
> >
> > On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist <tsind...@gmail.com> wrote:
> >>
> >> I am running Spark on a Hortonworks HDP cluster. I have deployed the
> >> prebuilt version, but it is only for Spark 1.2.0, not 1.2.1, and 1.2.1
> >> has a few fixes and features that I would like to leverage.
> >>
> >> I just downloaded the spark-1.2.1 source and built it to support Hadoop
> >> 2.6 by doing the following:
> >>
> >> radtech:spark-1.2.1 tnist$ ./make-distribution.sh --name hadoop2.6 --tgz
> >> -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver
> >> -DskipTests clean package
> >>
> >> When I deploy this to my Hadoop cluster and kick off a spark-shell,
> >>
> >> $> spark-1.2.1-bin-hadoop2.6]# ./bin/spark-shell --master yarn-client
> >> --driver-memory 512m --executor-memory 512m
> >>
> >> it fails with java.lang.NoClassDefFoundError:
> >> org/codehaus/jackson/map/deser/std/StdDeserializer
> >>
> >> The full stack trace is below. I have validated that
> >> $SPARK_HOME/lib/spark-assembly-1.2.1-hadoop2.6.0.jar does in fact
> >> contain the class in question:
> >>
> >> jar -tvf spark-assembly-1.2.1-hadoop2.6.0.jar | grep
> >> 'org/codehaus/jackson/map/deser/std'
> >>
> >> ...
> >>  18002 Thu Mar 05 11:23:04 EST 2015 parquet/org/codehaus/jackson/map/deser/std/StdDeserializer.class
> >>   1584 Thu Mar 05 11:23:04 EST 2015 parquet/org/codehaus/jackson/map/deser/std/StdKeyDeserializer$BoolKD.class
> >> ...
> >>
> >> Any guidance on what I missed? If I start the spark-shell standalone
> >> ($SPARK_HOME/bin/spark-shell), it comes up fine, so from what I can
> >> tell the problem is related to starting it under YARN.
> >>
> >> TIA for the assistance.
> >>
> >> -Todd
> >>
> >> Stack Trace
> >>
> >> 15/03/05 12:12:38 INFO spark.SecurityManager: Changing view acls to: root
> >> 15/03/05 12:12:38 INFO spark.SecurityManager: Changing modify acls to: root
> >> 15/03/05 12:12:38 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
> >> 15/03/05 12:12:38 INFO spark.HttpServer: Starting HTTP Server
> >> 15/03/05 12:12:39 INFO server.Server: jetty-8.y.z-SNAPSHOT
> >> 15/03/05 12:12:39 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:36176
> >> 15/03/05 12:12:39 INFO util.Utils: Successfully started service 'HTTP class server' on port 36176.
> >> Welcome to
> >>       ____              __
> >>      / __/__  ___ _____/ /__
> >>     _\ \/ _ \/ _ `/ __/  '_/
> >>    /___/ .__/\_,_/_/ /_/\_\   version 1.2.1
> >>       /_/
> >>
> >> Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
> >> Type in expressions to have them evaluated.
> >> Type :help for more information.
> >> 15/03/05 12:12:43 INFO spark.SecurityManager: Changing view acls to: root
> >> 15/03/05 12:12:43 INFO spark.SecurityManager: Changing modify acls to: root
> >> 15/03/05 12:12:43 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
> >> 15/03/05 12:12:44 INFO slf4j.Slf4jLogger: Slf4jLogger started
> >> 15/03/05 12:12:44 INFO Remoting: Starting remoting
> >> 15/03/05 12:12:44 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkdri...@hadoopdev01.opsdatastore.com:50544]
> >> 15/03/05 12:12:44 INFO util.Utils: Successfully started service 'sparkDriver' on port 50544.
> >> 15/03/05 12:12:44 INFO spark.SparkEnv: Registering MapOutputTracker
> >> 15/03/05 12:12:44 INFO spark.SparkEnv: Registering BlockManagerMaster
> >> 15/03/05 12:12:44 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-16402794-cc1e-42d0-9f9c-99f15eaa1861/spark-118bc6af-4008-45d7-a22f-491bcd1856c0
> >> 15/03/05 12:12:44 INFO storage.MemoryStore: MemoryStore started with capacity 265.4 MB
> >> 15/03/05 12:12:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> >> 15/03/05 12:12:45 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-5d7da34c-58d4-4d60-9b6a-3dce43cab39e/spark-4d65aacb-78bd-40fd-b6c0-53b47e288199
> >> 15/03/05 12:12:45 INFO spark.HttpServer: Starting HTTP Server
> >> 15/03/05 12:12:45 INFO server.Server: jetty-8.y.z-SNAPSHOT
> >> 15/03/05 12:12:45 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:56452
> >> 15/03/05 12:12:45 INFO util.Utils: Successfully started service 'HTTP file server' on port 56452.
> >> 15/03/05 12:12:45 INFO server.Server: jetty-8.y.z-SNAPSHOT
> >> 15/03/05 12:12:45 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
> >> 15/03/05 12:12:45 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
> >> 15/03/05 12:12:45 INFO ui.SparkUI: Started SparkUI at http://hadoopdev01.opsdatastore.com:4040
> >> 15/03/05 12:12:46 INFO impl.TimelineClientImpl: Timeline service address: http://hadoopdev02.opsdatastore.com:8188/ws/v1/timeline/
> >> java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer
> >>     at java.lang.ClassLoader.defineClass1(Native Method)
> >>     at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> >>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> >>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> >>     at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> >>     at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> >>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >>     at java.security.AccessController.doPrivileged(Native Method)
> >>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >>     at org.apache.hadoop.yarn.webapp.YarnJacksonJaxbJsonProvider.configObjectMapper(YarnJacksonJaxbJsonProvider.java:57)
> >>     at org.apache.hadoop.yarn.util.timeline.TimelineUtils.<clinit>(TimelineUtils.java:47)
> >>     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:166)
> >>     at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
> >>     at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:65)
> >>     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
> >>     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:140)
> >>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:348)
> >>     at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:986)
> >>     at $iwC$$iwC.<init>(<console>:9)
> >>     at $iwC.<init>(<console>:18)
> >>     at <init>(<console>:20)
> >>     at .<init>(<console>:24)
> >>     at .<clinit>(<console>)
> >>     at .<init>(<console>:7)
> >>     at .<clinit>(<console>)
> >>     at $print(<console>)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>     at java.lang.reflect.Method.invoke(Method.java:606)
> >>     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
> >>     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
> >>     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
> >>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
> >>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
> >>     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
> >>     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
> >>     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
> >>     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
> >>     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
> >>     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:270)
> >>     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
> >>     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:60)
> >>     at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:945)
> >>     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:147)
> >>     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:60)
> >>     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
> >>     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:60)
> >>     at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:962)
> >>     at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
> >>     at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
> >>     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> >>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
> >>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
> >>     at org.apache.spark.repl.Main$.main(Main.scala:31)
> >>     at org.apache.spark.repl.Main.main(Main.scala)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>     at java.lang.reflect.Method.invoke(Method.java:606)
> >>     at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
> >>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
> >>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >> Caused by: java.lang.ClassNotFoundException: org.codehaus.jackson.map.deser.std.StdDeserializer
> >>     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> >>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >>     at java.security.AccessController.doPrivileged(Native Method)
> >>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >>     ... 66 more
> >>
> >>
> >> scala> exit
> >
> >
>
