Do you have a full stack trace here?
Also, what Hive/Hadoop versions?
It looks like Hive somehow thinks that the local copy of the JAR that was
downloaded (/tmp/f5dc5d85-903e-422d-af1a-892b994ecfda_resources/hive-udf.jar)
is an HDFS path for some reason, and the distributed cache is trying to
resolve it there, which would explain the "jar not found" error.
I have a Hive script where I call a UDF.
The script works fine when called from a local shell script,
but when called from within an Oozie workflow it throws an exception
saying the JAR was not found.
The script does:

    add jar hdfs://<hdfs path of jar>;
    create temporary function funcname as 'pkg.className';

Then, on calling the function, the exception is thrown.
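
For reference, a minimal sketch of that kind of script is below; the JAR path,
function name, class name, and table are placeholders I am assuming, not the
actual values from the failing job:

    -- placeholder jar location and class; substitute the real ones
    add jar hdfs://namenode:8020/user/hive/udfs/hive-udf.jar;
    create temporary function my_udf as 'com.example.udf.MyUdf';

    -- per the description above, the error only shows up once the
    -- function is actually used in a query
    select my_udf(some_column) from some_table limit 10;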