Hi Milind,

>> Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected

This error occurs when the Hadoop version that Pig was compiled against doesn't match the Hadoop version that Pig runs on.

>> AttemptID:attempt_1357708865500_6931_m_000000_1 Info:Container killed by the ApplicationMaster.

It looks like you're running Hadoop 2.0.x. If so, please recompile Pig against Hadoop 2.0.x:

ant clean jar-withouthadoop -Dhadoopversion=23

Thanks,
Cheolsoo

On Thu, Jan 10, 2013 at 11:40 AM, Milind Vaidya <[email protected]> wrote:
> Avro schema with an int field:
>
> {
>   "type" : "record",
>   "name" : "employee",
>   "fields" : [
>     {"name" : "name",   "type" : "string", "default" : "NU"},
>     {"name" : "age",    "type" : "int",    "default" : 0},
>     {"name" : "dept",   "type" : "string", "default" : "DU"},
>     {"name" : "office", "type" : "string", "default" : "OU"},
>     {"name" : "salary", "type" : "int",    "default" : 0}
>   ]
> }
>
> Avro schema with a float field:
>
> {
>   "type" : "record",
>   "name" : "employee",
>   "fields" : [
>     {"name" : "name",   "type" : "string", "default" : "NU"},
>     {"name" : "age",    "type" : "int",    "default" : 0},
>     {"name" : "dept",   "type" : "string", "default" : "DU"},
>     {"name" : "office", "type" : "string", "default" : "OU"},
>     {"name" : "salary", "type" : "float",  "default" : 0.0}
>   ]
> }
>
> I built a new piggybank.jar (Pig 0.11) and used it with the two schemas
> above, one containing an int and the other a float.
>
> Script:
>
> REGISTER /homes/immilind/HadoopLocal/Jars/avro-1.7.1.jar
> REGISTER /homes/immilind/HadoopLocal/Jars/piggybank.jar
>
> employee = load '/user/immilind/AvroData' using
>     org.apache.pig.piggybank.storage.avro.AvroStorage();
> dump employee;
>
> But I am getting the following error:
>
> Backend error message
> ---------------------
> AttemptID:attempt_1357708865500_6931_m_000000_0 Info:Error: Found interface
> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>
> Backend error message
> ---------------------
> AttemptID:attempt_1357708865500_6931_m_000000_0 Info:Container killed by
> the ApplicationMaster.
>
> Backend error message
> ---------------------
> AttemptID:attempt_1357708865500_6931_m_000000_1 Info:Error: Found interface
> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>
> Backend error message
> ---------------------
> AttemptID:attempt_1357708865500_6931_m_000000_1 Info:Container killed by
> the ApplicationMaster.
>
> Backend error message
> ---------------------
> AttemptID:attempt_1357708865500_6931_m_000000_2 Info:Error: Found interface
> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>
> Backend error message
> ---------------------
> AttemptID:attempt_1357708865500_6931_m_000000_2 Info:Container killed by
> the ApplicationMaster.
>
> Backend error message
> ---------------------
> AttemptID:attempt_1357708865500_6931_m_000000_3 Info:Error: Found interface
> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>
> Pig Stack Trace
> ---------------
> ERROR 2997: Unable to recreate exception from backed error:
> AttemptID:attempt_1357708865500_6931_m_000000_3 Info:Error: Found interface
> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>
> org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to
> open iterator for alias employee. Backend error : Unable to recreate
> exception from backed error:
> AttemptID:attempt_1357708865500_6931_m_000000_3 Info:Error: Found interface
> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>     at org.apache.pig.PigServer.openIterator(PigServer.java:826)
>     at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:696)
>     at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
>     at org.apache.pig.tools.grunt.GruntParser.loadScript(GruntParser.java:531)
>     at org.apache.pig.tools.grunt.GruntParser.processScript(GruntParser.java:474)
>     at org.apache.pig.tools.pigscript.parser.PigScriptParser.Script(PigScriptParser.java:804)
>     at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:449)
>     at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
>     at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
>     at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
>     at org.apache.pig.Main.run(Main.java:539)
>     at org.apache.pig.Main.main(Main.java:158)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:601)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 2997:
> Unable to recreate exception from backed error:
> AttemptID:attempt_1357708865500_6931_m_000000_3 Info:Error: Found interface
> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.Launcher.getErrorMessages(Launcher.java:217)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.Launcher.getStats(Launcher.java:149)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:400)
>     at org.apache.pig.PigServer.launchPlan(PigServer.java:1264)
>     at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1249)
>     at org.apache.pig.PigServer.storeEx(PigServer.java:931)
>     at org.apache.pig.PigServer.store(PigServer.java:898)
>     at org.apache.pig.PigServer.openIterator(PigServer.java:811)
>     ... 16 more
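As a side note for readers comparing the two schemas in the question: the only difference between them is the type of the "salary" field (int vs. float). A quick, hedged sketch of checking this programmatically, using only the Python standard library (the `diff_fields` helper below is hypothetical, not part of Avro or Pig):

```python
import copy
import json

def diff_fields(schema_a, schema_b):
    """Hypothetical helper: list (name, type_a, type_b) for every field
    that exists in both Avro record schemas but with different types."""
    types_b = {f["name"]: f["type"] for f in schema_b["fields"]}
    return [
        (f["name"], f["type"], types_b[f["name"]])
        for f in schema_a["fields"]
        if f["name"] in types_b and f["type"] != types_b[f["name"]]
    ]

# The int-salary schema from the thread.
int_schema = json.loads('''{
  "type": "record", "name": "employee",
  "fields": [
    {"name": "name",   "type": "string", "default": "NU"},
    {"name": "age",    "type": "int",    "default": 0},
    {"name": "dept",   "type": "string", "default": "DU"},
    {"name": "office", "type": "string", "default": "OU"},
    {"name": "salary", "type": "int",    "default": 0}
  ]
}''')

# The float-salary variant differs only in the last field.
float_schema = copy.deepcopy(int_schema)
float_schema["fields"][-1] = {"name": "salary", "type": "float", "default": 0.0}

print(diff_fields(int_schema, float_schema))
```

This only compares top-level field types, which is all the two schemas in the question need; it does not implement Avro's full schema-resolution rules.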
