Do you have a full stack trace here? Also, what Hive/Hadoop versions? It looks like Hive somehow thinks that the local copy of the JAR that was downloaded (/tmp/f5dc5d85-903e-422d-af1a-892b994ecfda_resources/hive-udf.jar) is an HDFS path, and the distributed cache is trying to copy the file from HDFS when it should be looking in the local filesystem. I've seen someone else with this issue (can't find the thread), though we weren't able to track it down.
Jason

On May 16, 2015, at 8:36 AM, Shushant Arora <shushantaror...@gmail.com> wrote:

> I have a hive script, where I call a udf.
> Script works fine when called from local shell script.
> But when called from within oozie workflow, it throws an exception saying jar not found.
>
> add jar hdfs://hdfspath of jar;
>
> create temporary function duncname as 'pkg.className';
>
> then on calling function, it throws exception
>
> converting to local hdfs://nameservice1/hdfspath of jar
> Added /tmp/f5dc5d85-903e-422d-af1a-892b994ecfda_resources/hive-udf.jar to class path
> Added resource: /tmp/f5dc5d85-903e-422d-af1a-892b994ecfda_resources/hive-udf.jar
> OK
>
> java.io.FileNotFoundException: File does not exist: hdfs://nameservice1/tmp/f5dc5d85-903e-422d-af1a-892b994ecfda_resources/hive-udf.jar
> at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1093)
>
> script works in local mode. But failing on calling from oozie.
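One workaround worth trying (a sketch only, not confirmed to fix this particular bug): instead of `add jar` with an hdfs:// URI, ship the JAR into the action's working directory with Oozie's `<file>` element in the hive action, then add it by its bare local name so Hive never has to resolve an HDFS path for the downloaded copy. The function name and class below are the ones from the quoted script; the local-path form is my assumption:

```sql
-- Sketch, assuming the JAR was shipped into the action's working directory
-- (e.g. via <file>hdfs://nameservice1/path/to/hive-udf.jar</file> in the
-- Oozie hive action), so it can be added by a local relative path:
ADD JAR hive-udf.jar;

-- Same registration as in the original script:
CREATE TEMPORARY FUNCTION duncname AS 'pkg.className';
```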