Hi All,
I have tried multiple ways to create the Hive table and retrieve data
using JSONSerDe, but here are the errors I encounter:
hive> select * from jobs;
OK
Failed with exception
java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException:
java.io.EOFException: No content to map to Object due to end of input
Hi,
I am able to parse the input JSON file and load it into hive. I do not see
any errors with create table, so I am assuming that step is fine. But when
I try to read the data, I get null:
hive> select * from jobs;
OK
null
I have validated the JSON with JSONLint and the Notepad++ JSON plugin and it
is valid JSON.
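A frequent cause of both the EOFException above and all-NULL rows (an assumption on my part, not confirmed in this thread) is a file that is valid JSON as a whole but not one JSON document per line: Hive's line-oriented JSON SerDes parse each line independently, so pretty-printed JSON and blank lines break them even though JSONLint accepts the file. A quick pre-check, sketched in Python:

```python
import json

def check_hive_json(path):
    """Report lines a line-oriented Hive JSON SerDe could not parse.

    The SerDe is handed one line of the file at a time, so every line
    must be a complete JSON document; a pretty-printed file that
    JSONLint accepts as a whole will still fail here.
    """
    bad = []
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            line = line.strip()
            if not line:
                bad.append((lineno, "empty line"))
                continue
            try:
                json.loads(line)
            except ValueError as e:
                bad.append((lineno, str(e)))
    return bad
```

Running this over the table's data file before pointing Hive at it shows exactly which lines the SerDe will choke on.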
select * from table is as good as hdfs -cat
>
> are you sure there is any data in the table?
>
>
> On Tue, Jun 18, 2013 at 11:54 PM, Sunita Arvind
>
> > wrote:
>
>> Hi,
>>
>> I am able to parse the input JSON file and load it into hive. I do not
>> s
step.
>
> 1. how big is it? hadoop fs -ls
> 2. cat a bit of it and see if anything is there. hadoop fs -text <hdfs dir>/* | head -10
>
> do you see any data from step #2?
>
>
>
>
> On Tue, Jun 18, 2013 at 3:58 PM, Sunita Arvind wrote:
>
>> I ran
MapReduce CPU Time Spent: 880 msec
OK
null
Time taken: 9.591 seconds
regards
Sunita
On Tue, Jun 18, 2013 at 9:35 PM, Sunita Arvind wrote:
> Ok.
> The data files are quite small. Around 35 KB and 1 KB each.
>
> [sunita@node01 tables]$ hadoop fs -ls /user/sunita/tables/jobs
> F
!!
>
> Richa
>
>
>
> On Wed, Jun 19, 2013 at 7:28 AM, Sunita Arvind wrote:
>
>> Having a column name the same as the table name is a problem, due to
>> which I was not able to reference jobs.values.id from jobs. Changing the
>> table name to jobs1 resolved it.
>
>
> On Wed, Jun 19, 2013 at 4:34 AM, Sunita Arvind wrote:
>
>> Thanks for sharing your experience Richa.
>> I do have timestamps but in the format of year : INT, day : INT, month :
>> INT.
>> As per your suggestion, I changed them all to string, but still ge
> I am guessing your query "select jobs.values.position.title from jobs1;"
> may have some issue. Maybe it should be:
>
> select jobs.values[0].position.title from jobs1;
>
>
> Regards,
> Ramki.
>
>
> On Wed, Jun 19, 2013 at 8:24 AM, Sunita Arvind wrote:
>
>> Thanks Stephen,
>>
>> That's
On Wed, Jun 19, 2013 at 12:11 PM, Ramki Palle wrote:
>>
>>> Can you run some other queries from job1 table and see if any query
>>> returns some data?
>>>
>>> I am guessing your query "select jobs.values.position.title from
>>> jobs1;" may have some issue.
> select jobs.values[0].company.name, jobs.values[0].position.title,
jobs.values[0].locationdescription from linkedin_jobsearch;
CyberCoders Software Engineer-Hadoop, HDFS, HBase, Pig- Vertica
Analytics Pittsburgh, PA
Time taken: 8.543 seconds
But if I want to get the whole list, this does not work.
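To get every element rather than just values[0], the usual Hive idiom is LATERAL VIEW with explode() on the array column. As a plain-Python illustration of the difference (the field names come from the query above; the record contents are invented):

```python
# Nested record shaped like the LinkedIn job-search JSON in this thread
# (field names taken from the query above; the data itself is made up).
job_doc = {
    "values": [
        {"company": {"name": "CyberCoders"},
         "position": {"title": "Software Engineer-Hadoop"},
         "locationdescription": "Pittsburgh, PA"},
        {"company": {"name": "Acme"},
         "position": {"title": "Data Engineer"},
         "locationdescription": "Dallas, TX"},
    ]
}

# values[0].position.title -- a single value, like the query above
first_title = job_doc["values"][0]["position"]["title"]

# LATERAL VIEW explode(values) -- one row per array element
all_rows = [
    (v["company"]["name"], v["position"]["title"], v["locationdescription"])
    for v in job_doc["values"]
]
```

Indexing pulls one element; exploding flattens the array into one row per element, which is what "the whole list" needs.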
Your issue seems familiar. Try logging out of the hive session and logging back in.
Sunita
On Wed, Jun 19, 2013 at 8:53 PM, Mohammad Tariq wrote:
> Hello list,
>
> I have a hive(0.9.0) setup on my Ubuntu box running hadoop-1.0.4.
> Everything was going smooth till now. But today when I issued *s
piping that
> gnarly stuff into python (or whatever) and having it come out the other end
> all nice and pretty -- and then posting that here. :)
>
>
> On Wed, Jun 19, 2013 at 7:54 PM, Sunita Arvind wrote:
>
>> Finally I could get it to work. The issue resolved once I re-logged into the hive session.
Hi,
I am unable to create a partitioned table.
The error I get is:
FAILED: ParseException line 37:16 mismatched input
'"jobs.values.postingDate.year"' expecting Identifier near '(' in column
specification
I tried referring to the columns in various ways,
e.g. S.jobs.values.postingDate.year, with quotes and without.
as always my 2 cents only.
>
>
> On Wed, Jun 26, 2013 at 3:47 PM, Sunita Arvind wrote:
>
>> Hi,
>>
>> I am unable to create a partitioned table.
>> The error I get is:
>> FAILED: ParseException line 37:16 mismatched input
>> '"jobs.values.posti
Hi Jim,
I am new to hive too, so cannot suggest much on that front. However, I'm
pretty sure that this error indicates that a particular class is missing from
your classpath; that is, your hive runtime is not able to locate the class
org.apache.hadoop.mapreduce.util.HostUtil. Double check your Hadoop
classpath.
Hi Praveen / All,
I also have a requirement similar to the one explained (by Praveen) below:
distinct rows on a single column with corresponding data from other columns.
http://mail-archives.apache.org/mod_mbox/hive-user/201211.mbox/%3ccahmb8ta+r0h5a+armutookhkp8fxctown68qoz6lkfmwbrk...@mail.gmai
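For "distinct on one column, keeping the corresponding other columns", one common approach is to group on the key column and keep the first (or min/max) row per group. The intended semantics, sketched in Python (the sample rows below are made up):

```python
def first_row_per_key(rows, key_index=0):
    """Return one row per distinct value in the key column,
    keeping the first row seen for each key -- the usual intent
    behind "distinct rows on a single column with corresponding
    data from other columns"."""
    seen = set()
    out = []
    for row in rows:
        key = row[key_index]
        if key not in seen:
            seen.add(key)
            out.append(row)
    return out
```

Plain DISTINCT cannot do this, because it deduplicates whole rows; the grouping step is what lets the other columns ride along.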
Hi,
I have written a script which generates JSON files, loads them into a
dictionary, adds a few attributes, and uploads the modified files to HDFS.
After the files are generated, if I perform a select * from..; on the table
which points to this location, I get "null, null" as the result. I also
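If the script serializes each dictionary with an indent, or otherwise spreads one object over several lines, a line-oriented Hive JSON SerDe will return NULLs (my guess at the cause; the thread does not confirm it). Writing one compact JSON document per line avoids that:

```python
import json

def write_for_hive(records, path):
    # One complete, single-line JSON document per record ("JSON Lines").
    # Hive's line-oriented JSON SerDes parse line by line, so avoid
    # indent= here: pretty-printing spreads a record over several lines
    # and the SerDe then sees fragments, which surface as NULL columns.
    with open(path, "w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")
```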
> queries
> which are common in a world of unstructured and un-clean data.
>
>
> -Marcin
>
>
> *From:* Sunita Arvind [mailto:sunitarv...@gmail.com]
> *Sent:* Tuesday, July 30, 2013 11:00 AM
> *To:* user@hive.apache.org
> *Subject:* Hive Jo
Have you tried restarting the cluster? Hive is a JVM process; a hang in
the JVM may cause issues like this. I have experienced a similar issue;
however, I did not get any output at all. After typing the query and
hitting enter, the prompt never returned, and neither did I see any counter
information.
Hello,
I am using sqoop to import data from oracle into hive. Below is my SQL:
nohup sqoop import --connect "jdbc:oracle:thin:@(DESCRIPTION = (ADDRESS =
(PROTOCOL = TCP)(HOST = xxx)(PORT = )) (CONNECT_DATA = (SERVER =
DEDICATED) (SERVICE_NAME = CDWQ.tms.toyota.com) (FAILOVER_MODE=
(TYPE=s
rt command
2. Import everything in the table (not feasible in most cases)
However, I still need to know how to get the exact stack trace.
regards
Sunita
On Mon, Nov 11, 2013 at 1:48 PM, Sunita Arvind wrote:
> Hello,
>
> I am using sqoop to import data from oracle into hive. Below is m
to the failed map task log as that log
> usually contain entire exception including all the chained exceptions.
>
> Jarcec
>
> Links:
> 1: http://sqoop.apache.org/mail-lists.html
>
> On Mon, Nov 11, 2013 at 03:01:22PM -0800, Sunita Arvind wrote:
> > Just in case this acts as a workaround for someone:
Thanks David,
Very valuable input. Will update the group with my findings.
Regards
Sunita
On Monday, November 11, 2013, David Morel wrote:
> On 12 Nov 2013, at 0:01, Sunita Arvind wrote:
>
> Just in case this acts as a workaround for someone:
>> The issue is resolved if I elimi
Hi All,
I am trying to load database listener logs into hive tables. I am using
Regex Serde from
https://repository.cloudera.com/artifactory/public/org/apache/hive/hive-contrib/0.10.0-cdh4.2.0-SNAPSHOT/hive-contrib-0.10.0-cdh4.2.0.jar
Below is my create table:
CREATE external TABLE ListenerLog_
Hello Experts,
I am trying to write a UDF to parse a logline and provide the output in the
form of an array. Basically I want to be able to use LATERAL VIEW explode
subsequently to make it into columns.
This is how a typical log entry looks:
24-JUN-2012 05:00:42 *
(CONNECT_DATA=(SERVICE_NAME=abc
Can someone please suggest if this is doable or not? Is a generic UDF the
only option? How would using a generic vs. a simple UDF make any difference,
since I would be returning the same object either way?
Thank you
Sunita
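Before committing to a UDF, the parsing itself can be prototyped outside Hive. A sketch in Python of splitting a listener-log line into the flat array the UDF would return for LATERAL VIEW explode (the sample line extends the truncated entry above with invented fields, and the regexes are illustrative, not Oracle's actual listener-log grammar):

```python
import re

# Hypothetical full line modeled on the truncated sample above; only the
# timestamp and the (CONNECT_DATA=(SERVICE_NAME=... prefix come from the
# thread, the rest is made up for illustration.
LINE = ("24-JUN-2012 05:00:42 * "
        "(CONNECT_DATA=(SERVICE_NAME=abc)(CID=(PROGRAM=sqlplus)(HOST=node01)))")

def parse_listener_line(line):
    """Split a listener-log line into [timestamp, key=value, ...] --
    the flat array a UDF would return for a LATERAL VIEW explode."""
    m = re.match(r"(\d{2}-[A-Z]{3}-\d{4} \d{2}:\d{2}:\d{2})\s*\*\s*(.*)", line)
    if m is None:
        return []
    timestamp, rest = m.groups()
    # Pull out every KEY=value pair, skipping keys whose "value" is
    # itself a nested parenthesized group.
    pairs = re.findall(r"(\w+)=([^()]+)", rest)
    return [timestamp] + ["%s=%s" % (k, v) for k, v in pairs]
```

Once the logic is settled, porting it into a UDF is mostly boilerplate around the same regexes.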
-- Forwarded message --
From: *Sunita Arvind*
Date: Wednesday
> A generic UDF can return an array of structs, which is usually used in a
> lateral view.
>
> A good article on how to write a generic UDF is this one:
> http://www.baynote.com/2012/11/a-word-from-the-engineers/
>
>
> On Thu, Jan 30, 2014 at 7:06 AM, Sunita Arvind wrote:
>
>> Can someone plea