'./usr/local/hadoop_dir/hadoop/big_data/vender_details.txt'  ... why is there
a "." at the beginning?

On Thu, Jul 12, 2012 at 2:53 PM, shaik ahamed <shaik5...@gmail.com> wrote:

> I tried that, Nitin, and it still fails. Before sending this to the mailing
> list I had already tried all the variations, and I am still getting the
> error below:
>
> hive> load data local inpath
> './usr/local/hadoop_dir/hadoop/big_data/vender_details.txt' into table
> vender;
> FAILED: Error in semantic analysis: Line 1:23 Invalid path
> ''./usr/local/hadoop_dir/hadoop/big_data/vender_details.txt'': No files
> matching path
> file:/usr/local/hive-0.9.0/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt
>
> hive>
>
>
>
> On Thu, Jul 12, 2012 at 1:42 PM, Nitin Pawar <nitinpawar...@gmail.com>wrote:
>
>> Can you give just the table name after INTO TABLE while loading the data?
>>
>> It should be:  load data local inpath
>> './usr/local/hadoop_dir/hadoop/big_data/vender_details.txt' into
>> table vender
>>
>>
>> On Thu, Jul 12, 2012 at 1:35 PM, shaik ahamed <shaik5...@gmail.com>wrote:
>>
>>> Hi Bejoy,
>>>
>>>  hive> Create external table vender(vender string,supplier
>>> string,order_date string,quantity int) row format delimited fields
>>> terminated by ' ' stored as textfile LOCATION
>>> '/usr/local/hadoop_dir/hadoop/big_data';
>>> OK
>>> Time taken: 0.276 seconds
>>> I created the table above, as you said, at the HDFS location, and it was
>>> created successfully.
>>>
>>> Then, when I load the data into the table, I get an error. Can you
>>> please correct it?
>>>
>>> hive> load data local inpath
>>> './usr/local/hadoop_dir/hadoop/big_data/vender_details.txt' into table
>>> './usr/local/hive-0.9.0/vender' ;
>>>
>>> FAILED: Parse Error: line 1:94 mismatched input
>>> ''./usr/local/hive-0.9.0/vender'' expecting Identifier near 'table' in
>>> table name
>>> Please help me with this.
>>>
>>> Thanks,
>>> Shaik
>>>
>>>  On Thu, Jul 12, 2012 at 12:39 PM, Bejoy KS <bejoy...@yahoo.com> wrote:
>>>
>>>> Hi Shaik,
>>>>
>>>> Step 1
>>>> Create an external table with the desired location in hdfs. Your data
>>>> files for the hive table will be stored in this location/directory in hdfs.
>>>>
>>>> Step 2
>>>> Now use the LOAD DATA command to load data from any other location into
>>>> this table. On successful execution of this command the data files will be
>>>> moved to the table's location (specified in the previous step).
>>>>
>>>> Alternatively, you can move or copy files within HDFS using the hadoop fs
>>>> copy commands.
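>>>>
>>>> For example (the paths below are only illustrative, substitute your own):
>>>>
>>>>   hadoop fs -put /local/path/vender_details.txt /user/hive/vender_data/   # local fs -> HDFS
>>>>   hadoop fs -cp  /tmp/vender_details.txt        /user/hive/vender_data/   # within HDFS
>>>>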
>>>> Regards
>>>> Bejoy KS
>>>>
>>>> Sent from handheld, please excuse typos.
>>>>  ------------------------------
>>>> *From: *shaik ahamed <shaik5...@gmail.com>
>>>> *Date: *Thu, 12 Jul 2012 12:30:23 +0530
>>>> *To: *<user@hive.apache.org>; Bejoy Ks<bejoy...@yahoo.com>
>>>> *ReplyTo: *user@hive.apache.org
>>>> *Subject: *Re: unable to create external table, please correct the syntax
>>>>
>>>> Thanks for the reply guys
>>>>
>>>>                I have tried doing it with the LOAD command.
>>>>
>>>> I need the HDFS file to be placed in the Hive path below:
>>>>
>>>>
>>>> /usr/local/hive-0.9.0                                      -- the Hive path
>>>> /usr/local/hadoop_dir/hadoop/big_data/vender_details.txt   -- the HDFS path
>>>>
>>>> i.e. the file vender_details.txt should end up under the Hive path
>>>> /usr/local/hive-0.9.0.
>>>>
>>>> Please reply with the correct syntax; I have also tried all the ways with
>>>> an external table.
>>>>
>>>>
>>>> Thanks in advance
>>>>
>>>> Shaik
>>>> On Wed, Jul 11, 2012 at 9:03 PM, Bejoy Ks <bejoy...@yahoo.com> wrote:
>>>>
>>>>>  Hi Shaik
>>>>>
>>>>> For the correct syntax of the CREATE TABLE statement, please refer to
>>>>>
>>>>> https://cwiki.apache.org/Hive/languagemanual-ddl.html#LanguageManualDDL-CreateTable
>>>>>
>>>>>
>>>>> Please try out this command to avoid the syntax error:
>>>>>
>>>>> Create external table vender(vender string, supplier string, order_date
>>>>> string, quantity int)
>>>>> row format delimited fields terminated by ' '
>>>>> stored as textfile
>>>>> LOCATION '<hdfs dir>';
>>>>>
>>>>> Replace '<hdfs dir>' with the required directory path in HDFS.
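>>>>>
>>>>> For instance, if the target directory in HDFS were /user/hive/vender_data
>>>>> (a made-up path, just for illustration), the last line would read:
>>>>>
>>>>> LOCATION '/user/hive/vender_data';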
>>>>>
>>>>>
>>>>> Then try out the LOAD DATA LOCAL command. Since you are loading data from
>>>>> the local filesystem into HDFS, it will take some time if the data volume
>>>>> is large (100 GB).
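>>>>>
>>>>> A minimal sketch of that step, assuming the table is named vender and the
>>>>> file really sits at /usr/local/hadoop_dir/hadoop/big_data/vender_details.txt
>>>>> on the local filesystem:
>>>>>
>>>>> hive> LOAD DATA LOCAL INPATH
>>>>>     > '/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt'
>>>>>     > INTO TABLE vender;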
>>>>>
>>>>>
>>>>> Regards
>>>>>
>>>>> Bejoy KS
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>   ------------------------------
>>>>> *From:* shaik ahamed <shaik5...@gmail.com>
>>>>> *To:* user@hive.apache.org
>>>>> *Sent:* Wednesday, July 11, 2012 8:38 PM
>>>>> *Subject:* unable to create external table, please correct the syntax
>>>>>
>>>>>  Thanks for the reply guys,
>>>>>
>>>>> I have tried using the command below:
>>>>>
>>>>>  usr/local/hive-0.9.0# load data local inpath
>>>>> ‘/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt’ into table
>>>>> vender;
>>>>>
>>>>> Can't we load the data from that Hive path using the command above?
>>>>>
>>>>> The statement below has a syntax error;
>>>>> please correct it:
>>>>>
>>>>> hive> create external table vender(vender string,supplier
>>>>> string,order_date string,quantity
>>>>> int)['./usr/local/hadoop_dir/hadoop/big_data/vender_details.txt'] [ row
>>>>> format delimited fields terminated by ' ' stored as textfile] ;
>>>>>
>>>>> FAILED: Parse Error: line 1:90 mismatched input '[' expecting EOF near
>>>>> ')'
>>>>> Thanks in advance
>>>>>
>>>>> Regards
>>>>> Shaik.
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>>
>> --
>> Nitin Pawar
>>
>>
>


-- 
Nitin Pawar
