I ran the pig script without Oozie by passing all the parameters, and it
is running fine.
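
For reference, a standalone run of that kind looks roughly like the following
(a sketch, not the exact command I used; the values are the ones derived from
${output} = /run/yyy/MM/dd, and <namenode-host> is a placeholder):

    pig -param UBIDATA=/run/yyy/MM/dd/ubi \
        -param UPIDATA=/run/yyy/MM/dd/upi \
        -param OUTPUT=/run/yyy/MM/dd \
        -param JAR_PATH=hdfs://<namenode-host>/workflows/mr-workflow/lib/ \
        -f report.pig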


On Tue, Jul 30, 2013 at 4:36 PM, Serega Sheypak <serega.shey...@gmail.com> wrote:

> Fix is correct. Now your pig script is trying to run, but it has an error. It looks
> like you have a problem with an alias.
> You need to fix your pig script.
>
>
> 2013/7/30 Kamesh Bhallamudi <kamesh.had...@gmail.com>
>
> > > I suggest you fix: *<delete path="${nameNode}/${output}/Report"/>*
> > Thanks Serega, that issue is fixed now.
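> >
> > For reference, the prepare block presumably looks like this after applying the
> > suggestion (a sketch; ${nameNode} and ${output} are the same workflow properties
> > used elsewhere in the action):
> >
> >     <prepare>
> >         <delete path="${nameNode}/${output}/Report"/>
> >     </prepare>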
> >
> > I am getting one more issue now.
> >
> > Pig Stack Trace
> > ---------------
> > ERROR 2998: Unhandled internal error. name
> >
> > java.lang.NoSuchFieldError: name
> >         at org.apache.pig.parser.QueryParserStringStream.<init>(QueryParserStringStream.java:32)
> >         at org.apache.pig.parser.QueryParserDriver.tokenize(QueryParserDriver.java:194)
> >         at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:162)
> >         at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1633)
> >         at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1584)
> >         at org.apache.pig.PigServer.registerQuery(PigServer.java:584)
> >         at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:942)
> >         at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:386)
> >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:188)
> >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:164)
> >         at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
> >         at org.apache.pig.Main.run(Main.java:435)
> >         at org.apache.pig.PigRunner.run(PigRunner.java:49)
> >         at org.apache.oozie.action.hadoop.PigMain.runPigJob(PigMain.java:283)
> >         at org.apache.oozie.action.hadoop.PigMain.run(PigMain.java:223)
> >         at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
> >         at org.apache.oozie.action.hadoop.PigMain.main(PigMain.java:76)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:491)
> >         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
> >         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1122)
> >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> >
> >
> >
> > On Tue, Jul 30, 2013 at 4:01 PM, Serega Sheypak <serega.shey...@gmail.com> wrote:
> >
> > > 1. You have a nameNode variable and a jobTracker variable.
> > > 2. You didn't post the whole workflow.
> > > 3. What makes you think the pig script was launched?
> > > I suppose the pig action failed on the prepare step.
> > > I suggest you fix: *<delete path="${nameNode}/${output}/Report"/>*
> > > If the nameNode variable is correct, it should help.
> > >
> > >
> > > 2013/7/30 Kamesh Bhallamudi <kamesh.had...@gmail.com>
> > >
> > > > Thanks Serega for the quick response.
> > > > There is only one variable from which all the parameter values of the pig
> > > > script are derived, and it is ${output}.
> > > >
> > > > The value of ${output} is /run/yyy/MM/dd. I am passing the same value when
> > > > evaluating the values for the UBIDATA, UPIDATA and OUTPUT parameters.
> > > >
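> > > > With ${output} = /run/yyy/MM/dd, the values presumably work out to something
> > > > like this (my reading of the action posted below, not verified output):
> > > >
> > > >     UBIDATA = /run/yyy/MM/dd/ubi
> > > >     UPIDATA = /run/yyy/MM/dd/upi
> > > >     OUTPUT  = /run/yyy/MM/dd
> > > >     prepare delete path = /run/yyy/MM/dd/Report  (no scheme, which matches
> > > >     the "Scheme of the path xxxx is null" error)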
> > > >
> > > > On Tue, Jul 30, 2013 at 2:59 PM, Serega Sheypak <serega.shey...@gmail.com> wrote:
> > > >
> > > > > It is not the whole workflow definition, it is just the action tag with
> > > > > nested tags.
> > > > > 1. You have to provide values for the variables. It is hard to guess
> > > > > variable values from their names, sorry.
> > > > > 2. I can guess that *<delete path="${output}/Report"/>* is wrong;
> > > > >    it should be: *<delete path="${nameNode}/${output}/Report"/>*
> > > > >
> > > > > For more specific help, you need to provide the exact variable values used
> > > > > during the workflow run.
> > > > >
> > > > >
> > > > > 2013/7/30 Kamesh Bhallamudi <kamesh.had...@gmail.com>
> > > > >
> > > > > > This is the workflow definition:
> > > > > >
> > > > > > <action name="Report">
> > > > > >     <pig>
> > > > > >         <job-tracker>${jobTracker}</job-tracker>
> > > > > >         <name-node>${nameNode}</name-node>
> > > > > >         <prepare>
> > > > > >             <delete path="${output}/Report"/>
> > > > > >         </prepare>
> > > > > >         <script>/workflows/hive-workflow/scripts/report.pig</script>
> > > > > >         <argument>-param</argument>
> > > > > >         <argument>UBIDATA=${output}/ubi</argument>
> > > > > >         <argument>-param</argument>
> > > > > >         <argument>UPIDATA=${output}/upi</argument>
> > > > > >         <argument>-param</argument>
> > > > > >         <argument>OUTPUT=${output}</argument>
> > > > > >         <argument>-param</argument>
> > > > > >         <argument>JAR_PATH=${nameNode}/workflows/mr-workflow/lib/</argument>
> > > > > >     </pig>
> > > > > >     <ok to="end"/>
> > > > > >     <error to="fail"/>
> > > > > > </action>
> > > > > >
> > > > > >
> > > > > > On Tue, Jul 30, 2013 at 2:05 PM, Serega Sheypak <serega.shey...@gmail.com> wrote:
> > > > > >
> > > > > > > It would be great if you could post your workflow definition.
> > > > > > >
> > > > > > >
> > > > > > > 2013/7/30 Kamesh Bhallamudi <kamesh.had...@gmail.com>
> > > > > > >
> > > > > > > > Hi All,
> > > > > > > > I am facing a problem while configuring a pig action. Please help me
> > > > > > > > figure out where I am going wrong. Please find the exception below:
> > > > > > > >
> > > > > > > > Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.PigMain],
> > > > > > > > exception invoking main(), Scheme of the path xxxx is null
> > > > > > > > org.apache.oozie.action.hadoop.LauncherException: Scheme of the path xxxx is null
> > > > > > > > at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:443)
> > > > > > > > at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
> > > > > > > > at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
> > > > > > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
> > > > > > > > at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > > > > > at javax.security.auth.Subject.doAs(Subject.java:396)
> > > > > > > > at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1122)
> > > > > > > > at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > > > > Caused by: org.apache.oozie.action.hadoop.LauncherException: Scheme of the path xxxx is null
> > > > > > > > at org.apache.oozie.action.hadoop.FileSystemActions.validatePath(FileSystemActions.java:100)
> > > > > > > > at org.apache.oozie.action.hadoop.FileSystemActions.delete(FileSystemActions.java:57)
> > > > > > > > at org.apache.oozie.action.hadoop.FileSystemActions.execute(FileSystemActions.java:48)
> > > > > > > > at org.apache.oozie.action.hadoop.PrepareActionsDriver.doOperations(PrepareActionsDriver.java:64)
> > > > > > > > at org.apache.oozie.action.hadoop.LauncherMapper.executePrepare(LauncherMapper.java:669)
> > > > > > > > at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:439)
> > > > > > > > ... 8 more
> > > > > > > > I have been getting the same exception even after trying
> > > > > > > > *hdfs://{nameNode}/xxxx*.
> > > > > > > > I am using Oozie version 3.3.1.
> > > > > > > > --
> > > > > > > > Kamesh.
> > > > > > > >
> > > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > > > --
> > > > > > Bh.V.S.Kamesh,
> > > > > > Software Development Engineer,
> > > > > > HomeShop18.
> > > > > >
> > > > >
> > > >
> > > >
> > > >
> > > > --
> > > > Bh.V.S.Kamesh,
> > > > Software Development Engineer,
> > > > HomeShop18.
> > > >
> > >
> >
> >
> >
> > --
> > Bh.V.S.Kamesh,
> > Software Development Engineer,
> > HomeShop18.
> >
>



-- 
Bh.V.S.Kamesh,
Software Development Engineer,
HomeShop18.
