Spark uses slf4j 1.7.5, and you will probably see 1.7.{4,5} in use via Hadoop, but those versions are compatible.
That method appears to have been around since slf4j 1.3. What version does Pig want? I usually run "mvn -Dverbose dependency:tree" to see both what the final dependencies are and what got overridden, to diagnose things like this. My hunch is that something in your build depends on an old slf4j and it is overriding the newer one from Spark et al.

On Tue, May 27, 2014 at 10:45 PM, Ryan Compton <compton.r...@gmail.com> wrote:
> I use both Pig and Spark. All my code is built with Maven into a giant
> *-jar-with-dependencies.jar. I recently upgraded to Spark 1.0 and now
> all my pig scripts fail with:
>
> Caused by: java.lang.RuntimeException: Could not resolve error that
> occured when launching map reduce job: java.lang.NoSuchMethodError:
> org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
> at
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$JobControlThreadExceptionHandler.uncaughtException(MapReduceLauncher.java:598)
> at java.lang.Thread.dispatchUncaughtException(Thread.java:1874)
>
> Did Spark 1.0 change the version of slf4j? I can't seem to find it via
> mvn dependency:tree
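To narrow it down, something like "mvn dependency:tree -Dverbose -Dincludes=org.slf4j" restricts the output to the slf4j artifacts and shows which ones Maven omitted because of conflicts. If it turns out a transitive dependency is dragging in an old slf4j-api, one option is to pin the version in your pom. A rough sketch, assuming 1.7.5 is the version you want to win (adjust to whatever your tree actually resolves to):

    <!-- Sketch: force a single slf4j-api version across the jar-with-dependencies build. -->
    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>org.slf4j</groupId>
          <artifactId>slf4j-api</artifactId>
          <version>1.7.5</version>
        </dependency>
      </dependencies>
    </dependencyManagement>

Adding an <exclusion> on whichever dependency pulls in the old slf4j works too; dependencyManagement is just less whack-a-mole when several dependencies do it.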