Is the extra 's' in your path intended? The file you catted is /user/dept.txt, but the LOAD DATA statement references /users/dept.txt:


LOAD DATA INPATH '/users/dept.txt' overwrite into table DEPT;
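Separately, the stack trace shows the scratch-directory URI contains an unresolved variable (hdfs://informatica:8020${build.dir}/scratchdir/...), and the failure comes out of Context.getScratchDir. That usually means hive.exec.scratchdir (in hive-site.xml or a hive-default.xml copied from a source build) is set to a value containing ${build.dir}, which never gets substituted at runtime. A minimal sketch of the fix, assuming the property lives in your hive-site.xml — the literal path below is just an example, adjust it for your cluster:

```xml
<!-- hive-site.xml: replace the unresolved ${build.dir} value with a
     literal HDFS path so the scratch-dir URI parses cleanly. -->
<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive-scratch</value>
</property>
```

After changing it, make sure the directory exists and is writable by the user running Hive, then restart the CLI session so the new value is picked up.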



-----Original Message-----
From: Muthukumar Somasundaram (ETS) [mailto:muthukum...@hcl.com]
Sent: Monday, May 12, 2014 12:59 AM
To: user@hive.apache.org
Subject: Hive dataload issue.



Hi



I need help resolving a data-load issue with Hive.



Background:



I am testing Informatica connectivity with big data tools (Hadoop and Hive).



I have installed Hadoop 1.0.3 (HDFS) and Hive 0.7.1 on RHEL 5.5. I am able to perform
all HDFS operations.



When I try to load a Hive table from the Hive command line, I get the error below.
If you have faced this before or know the solution, please let me know.



I tried loading both a local file and an HDFS file; both give the same error. I may
be missing some configuration. Please find the attached screenshot.



I tested the same script on a Cloudera distribution, and it works fine there.



Thanks in advance.



Error Info.



hive> describe dept;
OK
deptid  int
dname   string
Time taken: 3.792 seconds
hive> ! cat /user/dept.txt;
Command failed with exit code = 1
cat: /user/dept.txt: No such file or directory
hive> ! hadoop fs -cat /user/dept.txt;
1,IT
2,Finance
3,Sales
hive> LOAD DATA INPATH '/users/dept.txt' overwrite into table DEPT;
FAILED: Hive Internal Error: java.lang.IllegalArgumentException(java.net.URISyntaxException: Relative path in absolute URI: hdfs://informatica:8020$%7Bbuild.dir%7D/scratchdir/hive_2014-05-12_12-11-29_340_565872632113593986)
java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: hdfs://informatica:8020$%7Bbuild.dir%7D/scratchdir/hive_2014-05-12_12-11-29_340_565872632113593986
        at org.apache.hadoop.fs.Path.initialize(Path.java:148)
        at org.apache.hadoop.fs.Path.<init>(Path.java:132)
        at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:142)
        at org.apache.hadoop.hive.ql.Context.getExternalScratchDir(Context.java:202)
        at org.apache.hadoop.hive.ql.Context.getExternalTmpFileURI(Context.java:294)
        at org.apache.hadoop.hive.ql.parse.LoadSemanticAnalyzer.analyzeInternal(LoadSemanticAnalyzer.java:238)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:238)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:340)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:736)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)

Regards

Muthukumar.S






