Re: Hive select shows null after successful data load

2013-06-18 Thread Richa Sharma
Do you have any timestamp fields in the table that might contain null values? I faced a similar situation some time back - changing the data type to string made it work. But I was working on delimited text files. Not sure if it applies to JSON, but it's still worth a try! Richa

Re: Export hive table format issue

2013-06-18 Thread Nitin Pawar
Jarek, Any chance that Hamza is hitting this one: SQOOP-188 (Problem with NULL values in MySQL export)? In that case I would recommend he use --input-null-string '\N' --input-null-non-string '\N'. Hamza, can you try the above options?
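Spelled out as a full command, the SQOOP-188 workaround might look like the sketch below. The connect string, table, export dir, and credentials are placeholder assumptions; the null marker is double-escaped as '\\N' so that sqoop receives a literal '\N', which is Hive's default on-disk NULL marker.

```shell
# Sketch of the null-handling options; all names below are placeholders.
sqoop export \
  --connect jdbc:mysql://localhost/mydb \
  --table my_table \
  --export-dir /user/hive/warehouse/mydb.db/my_table \
  --input-null-string '\\N' \
  --input-null-non-string '\\N' \
  --username myuser -P
```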

Re: Hive select shows null after successful data load

2013-06-18 Thread Sunita Arvind
Having a column name the same as the table name is a problem, due to which I was not able to reference jobs.values.id from jobs. Changing the table name to jobs1 resolved the semantic error. However, the query still returns null: hive> select jobs.values.position.title from jobs1; Total MapReduce j
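With the table renamed, the nested-struct reference from the message would run along these lines (a sketch against the names used in the thread, not the exact session):

```shell
# After renaming the table to jobs1, the struct path no longer clashes with
# the old table name and resolves cleanly.
hive -e 'SELECT jobs1.values.position.title FROM jobs1;'
```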

Re: Hive select shows null after successful data load

2013-06-18 Thread Sunita Arvind
Ok. The data files are quite small. Around 35 KB and 1 KB each. [sunita@node01 tables]$ hadoop fs -ls /user/sunita/tables/jobs Found 1 items -rw-r--r-- 3 sunita hdfs 35172 2013-06-18 18:31 /user/sunita/tables/jobs/jobs_noSite_parsed.json [sunita@node01 tables]$ hadoop fs -text /user/suni

Re: Hive select shows null after successful data load

2013-06-18 Thread Stephen Sprague
As Nitin alluded to, it's best to first confirm the data is definitely in HDFS, using hdfs semantics rather than Hive. 1. How big is it? hadoop fs -ls 2. Cat a bit of it and see if anything is there: hadoop fs -text / | head -10. Do you see any data from step #2?
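Stephen's two checks, written out against the warehouse path Sunita gives elsewhere in the thread (adjust for your own cluster):

```shell
# 1. Is the file there, and how big is it?
hadoop fs -ls /user/sunita/tables/jobs
# 2. Can you actually read data out of it? (-text also decompresses)
hadoop fs -text /user/sunita/tables/jobs/jobs_noSite_parsed.json | head -10
```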

Re: Export hive table format issue

2013-06-18 Thread Jarek Jarcec Cecho
Would you mind upgrading Sqoop to version 1.4.3? We've significantly improved error logging for cases when the input data can't be parsed during export. You should get a state dump (exception, input file, position in the file, entire input line) in the associated map task log. Jarcec

Re: Errors in one Hive script using LZO compression

2013-06-18 Thread Sanjay Subramanian
Ok guys, I solved it in a not-so-elegant way, but I need to go forward in production and deploy this because of time constraints :-) I divided the scripts into two stages. Stage 1: the Hive script creates TXT files and writes to HDFS. Stage 2: I wrote an LZO file creator and indexer that will convert
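A minimal sketch of what the two-stage workaround could look like; the paths and the hadoop-lzo jar location are assumptions, and DistributedLzoIndexer is the indexer class shipped with the hadoop-lzo project:

```shell
# Stage 1: the Hive script writes plain text files to HDFS as usual.
# Stage 2: compress each file with lzop, push it back, and build the split
# index so MapReduce can split the .lzo files.
hadoop fs -get /user/hive/output/part-00000 .
lzop part-00000                              # produces part-00000.lzo
hadoop fs -put part-00000.lzo /user/hive/output-lzo/
hadoop jar /usr/lib/hadoop/lib/hadoop-lzo.jar \
  com.hadoop.compression.lzo.DistributedLzoIndexer /user/hive/output-lzo
```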

Re: LZO compression implementation in Hive

2013-06-18 Thread Sanjay Subramanian
Thanks, I sent it. From: Lefty Leverenz <le...@hortonworks.com> Reply-To: "user@hive.apache.org" <user@hive.apache.org> Date: Tuesday, June 18, 2013 2:12 AM To: "user@hive.apache.org" <user@hive.apache.org> Subject: Re:

Re: Hive select shows null after successful data load

2013-06-18 Thread Nitin Pawar
Can you run a slightly more complex query - select distinct values across columns, or do some math - so we know when it fires up a MapReduce job?
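The point behind the request: a plain `select *` is served by a direct file read, while an aggregate forces Hive to launch a MapReduce job, surfacing execution-time errors a file read hides. A hedged example against the table name used in the thread:

```shell
# Any aggregate will do; count(1) is the simplest query that must run a job.
hive -e 'SELECT count(1) FROM jobs1;'
```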

Re: Hive select shows null after successful data load

2013-06-18 Thread Sunita Arvind
Thanks for responding, Nitin. Yes, I am sure the serde is working fine and the json file is being picked up, based on all the errors that showed up till this stage. What sort of error are you suspecting - file not present, or serde not parsing it?

Re: Hive select shows null after successful data load

2013-06-18 Thread Nitin Pawar
select * from table is as good as hdfs -cat. Are you sure there is any data in the table?

Hive select shows null after successful data load

2013-06-18 Thread Sunita Arvind
Hi, I am able to parse the input JSON file and load it into hive. I do not see any errors with create table, so I am assuming that worked. But when I try to read the data, I get null: hive> select * from jobs; OK null I have validated the JSON with JSONLint and Notepad++ JSON plugin and it is a valid JS
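Before blaming the SerDe, it helps to confirm locally that every line of the data file is a self-contained JSON record, since most Hive JSON SerDes expect exactly one object per line. The path /tmp/jobs.json and its record are stand-ins for the real file:

```shell
# Write a stand-in one-record file, then validate it line by line.
printf '{"values":{"position":{"title":"Engineer"}}}\n' > /tmp/jobs.json
while IFS= read -r line; do
  echo "$line" | python3 -m json.tool > /dev/null \
    || { echo "invalid record: $line"; exit 1; }
done < /tmp/jobs.json
echo "all records are one-object-per-line JSON"
```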

Re: FileNotFoundException when using hive local mode execution style

2013-06-18 Thread Guillaume Allain
> On 18 June 2013 12:25, Nitin Pawar wrote: > look at the discussion on this thread >https://groups.google.com/a/cloudera.org/forum/?fromgroups=#!topic/cdh-user/gHVq9C5H6RE Thanks for that pointer. Although not related to hive 'local-mode', I have set up the following variables in order to "keep

Re: Export hive table format issue

2013-06-18 Thread Arafat, Moiz
Can you try using a default value, e.g. 0 or 999, instead of storing NULL in the numeric column on the Hive side? Thanks, Moiz Arafat On Jun 18, 2013, at 9:14 AM, Hamza Asad <hamza.asa...@gmail.com> wrote: Nitin, Issue is not with the INT or BIGINT (as I have verified both), exception

Re: Export hive table format issue

2013-06-18 Thread Hamza Asad
Nitin, The issue is not with INT or BIGINT (I have verified both; the exception is the same). The issue is something else - please suggest a solution. The following exception is still raised (the # in the input string is not visible in the terminal and only shows as # when copied to Office Writer), which

Re: Export hive table format issue

2013-06-18 Thread Nitin Pawar
Can you change your MySQL schema to have BIGINT instead of just INT? For more, you can refer to this: http://stackoverflow.com/questions/16886668/why-sqoop-fails-on-numberformatexception-for-numeric-column-during-the-export-fr
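The change Nitin is asking for, as a one-liner against MySQL; the database, table, and column names are placeholders, since the real schema was only shared as an attachment:

```shell
# Widen the overflowing numeric column from INT to BIGINT (placeholder names).
mysql -u myuser -p mydb \
  -e 'ALTER TABLE my_table MODIFY COLUMN event_id BIGINT;'
```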

Re: Export hive table format issue

2013-06-18 Thread Hamza Asad
Attached are the schema files of both the Hive and MySQL tables.

Re: Export hive table format issue

2013-06-18 Thread Nitin Pawar
For the number format exception, can you share your MySQL schema (as an attachment, not inline in the mail)? If you have created the table with INT, try switching the column to BIGINT.

Re: Export hive table format issue

2013-06-18 Thread Hamza Asad
I copy-pasted the row into Office Writer, where I saw it is #-separated... and yes, the \N values represent NULL. The version of sqoop is Sqoop 1.4.2, git commit id , compiled by ag on Tue Aug 14 17:37:19 IST 2012.

Re: Export hive table format issue

2013-06-18 Thread Nitin Pawar
Is "#" your field separator? Also, the separator can normally be given as an octal representation, so you can give that a try. Why do your columns have \N as values - is it for NULL? What version of sqoop are you using?
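The octal point can be checked locally: '#' is octal 043, and Hive's default Ctrl-A delimiter is octal 001 (which is why it is invisible in a terminal).

```shell
# Confirm the octal-to-character mapping for '#' (decimal 35 = octal 043).
printf '\043\n'
# So in sqoop either form should name the same separator (octal form per the
# sqoop docs' \0ooo escape syntax):
#   --input-fields-terminated-by '#'
#   --input-fields-terminated-by '\043'
```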

Re: Export hive table format issue

2013-06-18 Thread Hamza Asad
I'm executing the following command: sqoop export --connect jdbc:mysql://localhost/ --table dump_hive_events_details --export-dir hive/warehouse/.db/events_details --input-null-non-string \N --input-fields-terminated-by '#' --username --password x 13/06/18 16:26:44 INFO mapre
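For readability, the same command with labeled placeholders where the archive elided values (<db>, <user>, <password> are stand-ins, not guesses). Note also that an unquoted --input-null-non-string \N reaches sqoop as a bare N, so '\\N' is the safer spelling:

```shell
# Reconstruction of the command from the message; elided values are left as
# <angle-bracket> placeholders.
sqoop export \
  --connect jdbc:mysql://localhost/<db> \
  --table dump_hive_events_details \
  --export-dir /user/hive/warehouse/<db>.db/events_details \
  --input-null-non-string '\\N' \
  --input-fields-terminated-by '#' \
  --username <user> --password <password>
```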

Re: FileNotFoundException when using hive local mode execution style

2013-06-18 Thread Nitin Pawar
look at the discussion on this thread: https://groups.google.com/a/cloudera.org/forum/?fromgroups=#!topic/cdh-user/gHVq9C5H6RE

FileNotFoundException when using hive local mode execution style

2013-06-18 Thread Guillaume Allain
Hi all, I plan to use hive local mode in order to speed up unit testing on (very) small data sets. (Data is still on hdfs.) I switch to local mode by setting the following variables: SET hive.exec.mode.local.auto=true; SET mapred.local.dir=/user; SET mapred.tmp.dir=file:///tmp; (plus creating neede
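The settings as they would be entered in one session; the final SELECT and its table name are a made-up smoke test, and hive.exec.mode.local.auto only kicks in when the input is small enough to pass Hive's auto-local thresholds:

```shell
hive -e "
SET hive.exec.mode.local.auto=true;
SET mapred.local.dir=/user;
SET mapred.tmp.dir=file:///tmp;
-- small inputs should now run in a single local JVM instead of on the cluster
SELECT count(1) FROM my_small_table;
"
```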

Re: Export hive table format issue

2013-06-18 Thread Nitin Pawar
Check the --input-fields-terminated-by option in sqoop export.

Export hive table format issue

2013-06-18 Thread Hamza Asad
I want to export my table to MySQL, and for that I'm using the sqoop export command, but in HDFS my data apparently has no field separator - yet it must contain one. The data is saved in the format shown below: 8119844144724992013-01-29 00:00:00.0141\N\N\N\N\N\N\N\N\N
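What the "invisible" separator most likely is: Hive's default field delimiter is the non-printing Ctrl-A character (octal 001), and \N is its NULL marker. A local simulation with a made-up row shaped like the one above:

```shell
# Build a stand-in Hive row: fields joined by the Ctrl-A byte (octal 001),
# with \N marking a NULL in the last field. Values are invented.
printf '811984\00114472499\0012013-01-29 00:00:00.0\001\\N\n' > /tmp/row.txt
# Split on the Ctrl-A byte and show that the "unseparated" row has 4 fields.
awk -F$'\001' '{print NF " fields; last = " ($NF == "\\N" ? "NULL" : $NF)}' /tmp/row.txt
```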

Re: LZO compression implementation in Hive

2013-06-18 Thread Lefty Leverenz
You can email hive-...@hadoop.apache.org asking for edit privileges on the Hive wiki. Here's an example from the archives. Once you have permission, you can use

Re: Errors in one Hive script using LZO compression

2013-06-18 Thread Sanjay Subramanian
Yes I am going to start debugging from the inner query working my way outwards….starting tomorrow AM… :-) From: Sanjay Subramanian <sanjay.subraman...@wizecommerce.com> Date: Monday, June 17, 2013 11:59 PM To: "user@hive.apache.org" <user@hive.apache.or

Errors in one Hive script using LZO compression

2013-06-18 Thread Sanjay Subramanian
Hi, I am using LZO compression in our scripts, but one script is still producing errors. Diagnostic Messages for this Task: Error: java.io.IOException: java.io.EOFException: Premature EOF from inputStream at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationExc