Can you also cat the file a.txt?

You may have to create the table as

create table ship_type(id int, name string) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

if your data is tab separated, or use whatever field separator your file actually has.

You get incorrect results when your table definition does not match your actual data.
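For example, here is a minimal JDBC sketch along the lines of the wiki sample (assuming your a.txt really is Ctrl-A ('\001') delimited, and assuming the ship_type table and /tmp/a.txt path from your mails; con is the java.sql.Connection from HiveJdbcClient.java):

    // Recreate the table with a field delimiter that matches the data,
    // then reload the file and read it back.
    Statement stmt = con.createStatement();
    stmt.execute("drop table if exists ship_type");
    stmt.execute("create table ship_type (id int, name string) "
        + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\001'");  // use '\\t' instead if the file is tab separated
    stmt.execute("load data local inpath '/tmp/a.txt' into table ship_type");
    ResultSet res = stmt.executeQuery("select * from ship_type");
    while (res.next()) {
        System.out.println(res.getInt(1) + "\t" + res.getString(2));
    }

If it still comes back as 0 and null, the bytes in a.txt probably do not contain the separator the table expects, which is why seeing the raw file (cat a.txt) helps.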


On Fri, Feb 21, 2014 at 3:05 PM, Jone Lura <jone.l...@ecc.no> wrote:

> I used this from the example;
>
> stmt.execute("create table " + tableName + " (key int, value string)");
>
> In my application it is very similar;
>
> stmt.execute("create table ship_type (id int, name string)");
>
>
> On 21 Feb 2014, at 10:27, Nitin Pawar <nitinpawar...@gmail.com> wrote:
>
> can you share your create table statement?
>
>
> On Fri, Feb 21, 2014 at 2:55 PM, Jone Lura <jone.l...@ecc.no> wrote:
>
>> Hi,
>>
>> I am new to Hadoop and Hive, and I am trying to figure out what is
>> going wrong.
>>
>> In my application I connect successfully to Hive and I am able to
>> load data into it.
>>
>> When I try to run a select statement however, things are not as I
>> expected.
>>
>> The select query returns the correct number of rows, but the values are
>> 0 for int and null for String.
>>
>> I also tried the HiveJdbcClient.java code found on the wiki page;
>> https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-JDBCClientSampleCode,
>> and I am experiencing the same problems.
>>
>> Running: show tables 'testHiveDriverTable'
>> testhivedrivertable
>> Running: describe testHiveDriverTable
>> key                  int
>> value                string
>> Running: load data local inpath '/tmp/a.txt' into table testHiveDriverTable
>> Running: select * from testHiveDriverTable
>> 0 null
>> 0 null
>> Running: select count(1) from testHiveDriverTable
>>
>> The content of a.txt is as follows;
>>
>> 1\001Test
>> 2\001Test1
>>
>> Does anyone know what could possibly be the reason for this?
>>
>> Hadoop and Hive are locally installed, but not embedded.
>>
>> Best regards,
>>
>> Jone
>>
>
>
>
> --
> Nitin Pawar
>
>
>


-- 
Nitin Pawar
