There are two versions of pig.jar: one built against Hadoop 1.x, the other against Hadoop 2.x. If you are using Pig 0.12 or earlier, you will need to recompile with the -Dhadoopversion=23 flag to get a pig.jar for Hadoop 2.x. If you are using Pig 0.13 or later, the release tarball contains both pig-h1.jar and pig-h2.jar, and you should pick pig-h2.jar.
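For Pig 0.12.x, the rebuild is roughly the following (a sketch, assuming an Ant build run from an unpacked source tarball; the directory name is illustrative, adjust it to your checkout):

```shell
# Rebuild pig.jar against Hadoop 2.x from a Pig 0.12.x source tree.
cd pig-0.12.1
ant clean jar -Dhadoopversion=23
# The resulting pig.jar in the build output is compiled against Hadoop 2.x
# and can replace the Hadoop-1.x pig.jar on your test classpath.
```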
Daniel

On 3/3/15, 9:24 AM, "Luke Lovett" <[email protected]> wrote:

> Hello all,
>
> I'm trying to use PigUnit with Hadoop 2.4.1. I suspect that there's some
> mismatch between the version of Hadoop PigUnit is expecting and the
> version I have when I get the following Exception:
>
> java.lang.InstantiationError: org.apache.hadoop.mapreduce.JobContext
>     at org.apache.pig.backend.hadoop.executionengine.shims.HadoopShims.createJobContext(HadoopShims.java:66)
>     at org.apache.pig.backend.hadoop.executionengine.fetch.FetchPOStoreImpl.createStoreFunc(FetchPOStoreImpl.java:58)
>     at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POStore.setUp(POStore.java:103)
>     at org.apache.pig.backend.hadoop.executionengine.fetch.FetchLauncher.init(FetchLauncher.java:121)
>     at org.apache.pig.backend.hadoop.executionengine.fetch.FetchLauncher.launchPig(FetchLauncher.java:78)
>     at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:275)
>     at org.apache.pig.PigServer.launchPlan(PigServer.java:1367)
>     at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1352)
>     at org.apache.pig.PigServer.storeEx(PigServer.java:1011)
>     at org.apache.pig.PigServer.store(PigServer.java:974)
>     at org.apache.pig.PigServer.openIterator(PigServer.java:887)
>     at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:752)
>     at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:372)
>     at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:228)
>     at org.apache.pig.pigunit.pig.PigServer.registerScript(PigServer.java:55)
>     at org.apache.pig.pigunit.PigTest.registerScript(PigTest.java:170)
>     at org.apache.pig.pigunit.PigTest.assertOutput(PigTest.java:249)
>     at com.mongodb.hadoop.pig.PigTest.testMongoUpdateStorage(PigTest.java:82)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>     at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>     at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>     at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>     at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>     at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:86)
>     at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:49)
>     at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:69)
>     at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:48)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
>     at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
>     at org.gradle.messaging.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
>     at org.gradle.messaging.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
>     at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
>     at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:105)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
>     at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
>     at org.gradle.messaging.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:360)
>     at org.gradle.internal.concurrent.DefaultExecutorFactory$StoppableExecutorImpl$1.run(DefaultExecutorFactory.java:64)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>     at java.lang.Thread.run(Thread.java:695)
>
> I think JobContext is an interface in Hadoop 2.x, but a class in Hadoop
> 1.x. Will PigUnit work with Hadoop 2.x? I'm using Pig version 0.13.0,
> which does support Hadoop 2.x. If it is possible, what is going wrong
> here? Thanks for your help.
>
> Luke
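For anyone hitting this, the class-vs-interface hypothesis is easy to confirm with a small reflection check. This standalone helper is illustrative (the class and method names below are not part of Pig or Hadoop; only org.apache.hadoop.mapreduce.JobContext is the real name being probed):

```java
// Illustrative diagnostic: JobContext is a concrete class in Hadoop 1.x but an
// interface in Hadoop 2.x, so Pig bytecode compiled against one line fails with
// java.lang.InstantiationError on the other. This reports which flavor is on
// the current classpath.
public class JobContextCheck {

    // Describes how className resolves on the classpath: interface, class,
    // or absent entirely.
    static String describe(String className) {
        try {
            return Class.forName(className).isInterface()
                    ? "interface (Hadoop 2.x style -> use pig-h2.jar)"
                    : "class (Hadoop 1.x style -> use pig-h1.jar)";
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        System.out.println("org.apache.hadoop.mapreduce.JobContext is: "
                + describe("org.apache.hadoop.mapreduce.JobContext"));
    }
}
```

Run it with the same classpath your Gradle test task uses; if it reports the Hadoop 2.x flavor while your Pig jar was built for Hadoop 1.x (or vice versa), that mismatch is the cause of the InstantiationError.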
