Thanks Serega for the quick response. There is only one variable from which all the parameter values of the Pig script are derived, and it is ${output}.
The value of ${output} is /run/yyy/MM/dd. I am passing the same value while evaluating the values for the parameters UBIDATA, UPIDATA and OUTPUT.

On Tue, Jul 30, 2013 at 2:59 PM, Serega Sheypak <serega.shey...@gmail.com> wrote:

> It's not the workflow definition. It's an action tag with nested tags.
> 1. You have to provide values for the variables. It's hard to guess
>    variable values by their names. Sorry.
> 2. I can guess that <delete path="${output}/Report"/> is wrong;
>    it should be: <delete path="${nameNode}/${output}/Report"/>
>
> For more specific help you need to provide the exact variable values
> during the workflow run.
>
> 2013/7/30 Kamesh Bhallamudi <kamesh.had...@gmail.com>
>
> > This is the workflow definition:
> >
> > <action name="Report">
> >     <pig>
> >         <job-tracker>${jobTracker}</job-tracker>
> >         <name-node>${nameNode}</name-node>
> >         <prepare>
> >             <delete path="${output}/Report"/>
> >         </prepare>
> >         <script>/workflows/hive-workflow/scripts/report.pig</script>
> >         <argument>-param</argument>
> >         <argument>UBIDATA=${output}/ubi</argument>
> >         <argument>-param</argument>
> >         <argument>UPIDATA=${output}/upi</argument>
> >         <argument>-param</argument>
> >         <argument>OUTPUT=${output}</argument>
> >         <argument>-param</argument>
> >         <argument>JAR_PATH=${nameNode}/workflows/mr-workflow/lib/</argument>
> >     </pig>
> >     <ok to="end"/>
> >     <error to="fail"/>
> > </action>
> >
> > On Tue, Jul 30, 2013 at 2:05 PM, Serega Sheypak <serega.shey...@gmail.com> wrote:
> >
> > > It would be great if you post your workflow definition.
> > >
> > > 2013/7/30 Kamesh Bhallamudi <kamesh.had...@gmail.com>
> > >
> > > > Hi All,
> > > > I am facing a problem while configuring the pig action. Please help
> > > > me find where I am going wrong.
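For reference, the prepare-time delete needs a path the launcher can resolve to a filesystem, which usually means qualifying it with the name-node URI. A minimal sketch of the corrected block, assuming ${nameNode} resolves to a URI like hdfs://namenode-host:8020 and ${output} is an absolute path such as /run/yyy/MM/dd (since ${output} already begins with a slash, no extra separator is inserted):

```xml
<prepare>
    <!-- Prefix with ${nameNode} so the path carries an explicit
         hdfs:// scheme instead of being a bare absolute path -->
    <delete path="${nameNode}${output}/Report"/>
</prepare>
```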
> > > > Please find the exception below:
> > > >
> > > > Failing Oozie Launcher, Main class
> > > > [org.apache.oozie.action.hadoop.PigMain], exception invoking main(),
> > > > Scheme of the path xxxx is null
> > > > org.apache.oozie.action.hadoop.LauncherException: Scheme of the path xxxx is null
> > > >     at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:443)
> > > >     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
> > > >     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
> > > >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
> > > >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > >     at java.security.AccessController.doPrivileged(Native Method)
> > > >     at javax.security.auth.Subject.doAs(Subject.java:396)
> > > >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1122)
> > > >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > Caused by: org.apache.oozie.action.hadoop.LauncherException: Scheme of the path xxxx is null
> > > >     at org.apache.oozie.action.hadoop.FileSystemActions.validatePath(FileSystemActions.java:100)
> > > >     at org.apache.oozie.action.hadoop.FileSystemActions.delete(FileSystemActions.java:57)
> > > >     at org.apache.oozie.action.hadoop.FileSystemActions.execute(FileSystemActions.java:48)
> > > >     at org.apache.oozie.action.hadoop.PrepareActionsDriver.doOperations(PrepareActionsDriver.java:64)
> > > >     at org.apache.oozie.action.hadoop.LauncherMapper.executePrepare(LauncherMapper.java:669)
> > > >     at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:439)
> > > >     ... 8 more
> > > >
> > > > I have been getting the same exception even after trying
> > > > hdfs://{nameNode}/xxxx.
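One alternative, sketched below under the assumption that the variables come from the job.properties submitted with the workflow (host names and ports here are illustrative placeholders, not values from the thread), is to bake the scheme into ${output} itself, so every use of it in the workflow is already fully qualified. Note also that the workaround quoted above writes {nameNode} without a leading $; Oozie only substitutes ${nameNode}, so that literal would not be expanded.

```properties
# job.properties (illustrative placeholder values)
nameNode=hdfs://namenode-host:8020
jobTracker=jobtracker-host:8021
# Fully qualified with a scheme, so <delete path="${output}/Report"/>
# in the prepare block no longer fails path validation
output=${nameNode}/run/yyy/MM/dd
```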
> > > > I am using Oozie version 3.3.1.
> > > >
> > > > --
> > > > Kamesh.

--
Bh.V.S.Kamesh,
Software Development Engineer,
HomeShop18.