I have resolved this, so I'll share what the issue was.

I had set HIVE_AUX_JARS_PATH in my hive-env.sh as:

HIVE_AUX_JARS_PATH=$HIVE_AUX_JARS_PATH,$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar

Because HIVE_AUX_JARS_PATH started out empty, that assignment left a leading comma in the value, and the resulting empty path entry was causing the exception.
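
To spell out the mechanism (a rough sketch; the jar names and paths are just placeholders): with the variable unset, the old assignment produces a list that begins with a comma, so the comma-separated jar list handed to the job contains an empty entry, and that empty entry appears to be what JobClient is choking on in the stack trace below.

# sketch of the broken case in hive-env.sh
unset HIVE_AUX_JARS_PATH
HIVE_AUX_JARS_PATH=$HIVE_AUX_JARS_PATH,$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar
echo "$HIVE_AUX_JARS_PATH"
# prints something like ,/path/to/hive/lib/jar1.jar,... -- note the leading
# comma, i.e. an empty first element in the list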

The following fix made it work:

if [ -z "$HIVE_AUX_JARS_PATH" ]; then
  # nothing set yet, so start the list without a leading comma
  HIVE_AUX_JARS_PATH=$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar
else
  HIVE_AUX_JARS_PATH=$HIVE_AUX_JARS_PATH,$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar
fi
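
An equivalent one-liner, if you prefer it (just a sketch using shell ${var:+...} expansion, which adds the separating comma only when the variable is already non-empty):

HIVE_AUX_JARS_PATH=${HIVE_AUX_JARS_PATH:+$HIVE_AUX_JARS_PATH,}$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar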


Thanks,
Sam


On Jan 31, 2012, at 11:50 AM, Sam William wrote:

> 
> I have a new Hive installation. I'm able to create tables and do select *
> queries from them. But as soon as I try to execute a query that would
> involve a Hadoop M/R job, I get this exception:
> 
> 
> 
> java.lang.IllegalArgumentException: Can not create a Path from an empty string
>        at org.apache.hadoop.fs.Path.checkPathArg(Path.java:82)
>        at org.apache.hadoop.fs.Path.<init>(Path.java:90)
>        at org.apache.hadoop.fs.Path.<init>(Path.java:50)
>        at org.apache.hadoop.mapred.JobClient.copyRemoteFiles(JobClient.java:608)
>        at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:713)
>        at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:637)
>        at org.apache.hadoop.mapred.JobClient.access$300(JobClient.java:170)
>        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:848)
>        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at javax.security.auth.Subject.doAs(Subject.java:396)
>        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1157)
>        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
>        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
> 
> 
> 
> The table is pretty simple. It is an external table on HDFS and does
> not have any partitions. Any idea why this could be happening?
> 
> 
> 
> Thanks,
> Sam William
> sa...@stumbleupon.com
> 
> 
> 

Sam William
sa...@stumbleupon.com


