It sounds like you're using a version of Pig that wasn't compiled for Hadoop 2.x (the "23" API line). Try recompiling with 'ant clean jar -Dhadoopversion=23'.
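A minimal sketch of the rebuild, assuming a Pig 0.11.1 source checkout with Ant on the PATH (the directory names below are illustrative, not from your setup):

```shell
# From the root of the Pig source checkout:
cd pig-0.11.1

# Rebuild Pig itself against the Hadoop 2.x ("23") APIs:
ant clean jar -Dhadoopversion=23

# Since AvroStorage comes from piggybank, piggybank likely needs the
# same treatment, or it will still link against the Hadoop 1.x classes:
cd contrib/piggybank/java
ant clean jar -Dhadoopversion=23
```

After that, make sure your REGISTER statements point at the freshly built jars rather than any stale copies of pig/piggybank on the classpath.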
-Mark

On Thu, Sep 19, 2013 at 9:23 AM, j.barrett Strausser <[email protected]> wrote:

> Running
>
> Hadoop-2.1.0-Beta
> Pig-0.11.1
> Hive-0.11.1
>
> 1. Created an Avro-backed table in Hive.
> 2. Loaded the table in Pig:
>    records = LOAD '/path' USING org.apache.pig.piggybank.storage.avro.AvroStorage();
> 3. Can successfully describe the relation.
>
> I registered the following on Pig start:
>
> REGISTER piggybank.jar
> REGISTER avro-*.jar
> REGISTER jackson-core-asl-1.8.8.jar
> REGISTER jackson-mapper-asl-1.8.8.jar
> REGISTER json-simple-1.1.jar
> REGISTER snappy-java-1.0.3.2.jar
>
> The Avro tools are at 1.7.5.
>
> Running DUMP produces the following:
>
> 2013-09-19 12:08:21,639 [JobControl] ERROR org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl - Error while trying to run jobs.
> java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
>         at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:441)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:340)
>
> 2013-09-19 12:08:21,651 [main] ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backend error: Unexpected System Error Occured:
> java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
>         at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:441)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:340)
>
> Running ILLUSTRATE:
>
> Pig Stack Trace
> ---------------
> ERROR 1070: Could not resolve org.apache.pig.piggybank.storage.avro.AvroStorage using imports: [, org.apache.pig.builtin., org.apache.pig.impl.builtin.]
>
> Pig Stack Trace
> ---------------
> ERROR 2998: Unhandled internal error.
> java.lang.NoSuchMethodError: org.apache.hadoop.mapreduce.Mapper$Context.<init>(Lorg/apache/hadoop/mapreduce/Mapper;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapreduce/TaskAttemptID;Lorg/apach$
>
> Any thoughts?
>
> --
>
> https://github.com/bearrito
> @deepbearrito
