It does return this:

-rw-r--r--   3 hadoop supergroup         99 2011-09-21 13:07 /user/hive/warehouse/supplier/supplier.txt

On Sep 21, 2011, at 3:01 PM, Ayon Sinha wrote:

> I'm a bit concerned about port 9000 for the HDFS location. Is your namenode 
> at port 9000? Can you run
> hadoop dfs -ls  hdfs://localhost:9000/user/hive/warehouse/supplier     
>  
> -Ayon
> See My Photos on Flickr
> Also check out my Blog for answers to commonly asked questions.
> 
> From: Krish Khambadkone <kkhambadk...@apple.com>
> To: user@hive.apache.org
> Sent: Wednesday, September 21, 2011 2:45 PM
> Subject: Re: Exception when joining HIVE tables
> 
> Here is the table info, and the query is "select acctbal, availqty, partkey 
> from partsupp JOIN supplier ON (partsupp.suppkey == supplier.suppkey);"
> 
>  desc formatted supplier;
> OK
> # col_name                    data_type               comment             
>                
> key                   string                  None                
> acctbal               string                  None                
> address               string                  None                
> name                  string                  None                
> nationkey             bigint                  None                
> phone                 string                  None                
> suppkey               bigint                  None                
>                
> # Detailed Table Information           
> Database:             default                  
> Owner:                hadoop                   
> CreateTime:           Wed Sep 21 13:05:50 PDT 2011     
> LastAccessTime:       UNKNOWN                  
> Protect Mode:         None                     
> Retention:            0                        
> Location:             hdfs://localhost:9000/user/hive/warehouse/supplier
> Table Type:           MANAGED_TABLE            
> Table Parameters:              
>       transient_lastDdlTime   1316635649          
>                
> # Storage Information          
> SerDe Library:        org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
> InputFormat:          org.apache.hadoop.mapred.TextInputFormat         
> OutputFormat:         org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
> Compressed:           No                       
> Num Buckets:          -1                       
> Bucket Columns:       []                       
> Sort Columns:         []                       
> Storage Desc Params:           
>       field.delim             ,                   
>       serialization.format    ,                   
> Time taken: 0.213 seconds
> 
> 
> desc formatted partsupp;
> OK
> # col_name                    data_type               comment             
>                
> key                   string                  None                
> availqty              int                     None                
> partkey               bigint                  None                
> suppkey               bigint                  None                
> supplycost            double                  None                
>                
> # Detailed Table Information           
> Database:             default                  
> Owner:                hadoop                   
> CreateTime:           Wed Sep 21 13:05:37 PDT 2011     
> LastAccessTime:       UNKNOWN                  
> Protect Mode:         None                     
> Retention:            0                        
> Location:             hdfs://localhost:9000/user/hive/warehouse/partsupp
> Table Type:           MANAGED_TABLE            
> Table Parameters:              
>       transient_lastDdlTime   1316635698          
>                
> # Storage Information          
> SerDe Library:        org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
> InputFormat:          org.apache.hadoop.mapred.TextInputFormat         
> OutputFormat:         org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
> Compressed:           No                       
> Num Buckets:          -1                       
> Bucket Columns:       []                       
> Sort Columns:         []                       
> Storage Desc Params:           
>       field.delim             ,                   
>       serialization.format    ,                   
> Time taken: 2.192 seconds
> 
> 
> On Sep 21, 2011, at 2:10 PM, Ayon Sinha wrote:
> 
>> If you can share details of your tables and query, we might be able to help. 
>> Do a desc formatted <tablename>.
>>  
>> -Ayon
>> 
>> From: Krish Khambadkone <kkhambadk...@apple.com>
>> To: user@hive.apache.org
>> Sent: Wednesday, September 21, 2011 1:11 PM
>> Subject: Exception when joining HIVE tables
>> 
>> Hi,  I get this exception when I try to join two Hive tables, or even when I 
>> use a specific WHERE clause.  "SELECT *" from any individual table seems to 
>> work fine.  Any idea what is missing here?  I am on Hive version 
>> hive-0.7.0-cdh3u0.
>> 
>> java.lang.IllegalArgumentException: Can not create a Path from an empty string
>>     at org.apache.hadoop.fs.Path.checkPathArg(Path.java:82)
>>     at org.apache.hadoop.fs.Path.<init>(Path.java:90)
>>     at org.apache.hadoop.fs.Path.<init>(Path.java:50)
>>     at org.apache.hadoop.mapred.JobClient.copyRemoteFiles(JobClient.java:608)
>>     at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:713)
>>     at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:637)
>>     at org.apache.hadoop.mapred.JobClient.access$300(JobClient.java:170)
>>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:848)
>>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:396)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>>     at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
>>     at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
>>     at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:657)
>>     at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:123)
>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:130)
>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1063)
>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:900)
>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:748)
>>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)
>>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:241)
>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:456)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>     at java.lang.reflect.Method.invoke(Method.java:597)
>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
>> Job Submission failed with exception 'java.lang.IllegalArgumentException(Can not create a Path from an empty string)'
>> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
>> 
>> 
> 
> 
> 
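For anyone hitting the same question, the port check Ayon suggests can be sketched in plain shell: pull the scheme and authority out of the table's Location and confirm it matches the namenode the cluster is configured for. The core-site.xml path below is an assumption about a typical CDH3 layout, not something confirmed in this thread.

```shell
# The Location reported by `desc formatted supplier` above.
table_location='hdfs://localhost:9000/user/hive/warehouse/supplier'

# Strip the path portion to get just scheme://host:port.
authority=${table_location%%/user/*}
echo "$authority"   # hdfs://localhost:9000

# Then compare against fs.default.name in the cluster config
# (path is a guess; adjust for your install):
#   grep -A1 'fs.default.name' /etc/hadoop/conf/core-site.xml
# and run the listing Ayon asked for:
#   hadoop dfs -ls "$authority/user/hive/warehouse/supplier"
```

If the authority and `fs.default.name` disagree, that mismatch is the first thing to rule out before digging into the stack trace.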
