Attached are the schema files of both the Hive and MySQL tables.

On Tue, Jun 18, 2013 at 5:11 PM, Nitin Pawar <nitinpawar...@gmail.com> wrote:

>  For the NumberFormatException, can you share your MySQL schema (put it as
> an attachment, not inline in the mail)? If you created the table with int,
> try switching the column to bigint.
>
>
>
> On Tue, Jun 18, 2013 at 5:37 PM, Hamza Asad <hamza.asa...@gmail.com> wrote:
>
>> I copy-pasted the row into an office writer, where I saw it is # separated...
>> yeah, the \N values represent NULL.
>> The version of Sqoop is
>> *Sqoop 1.4.2
>> git commit id
>> Compiled by ag on Tue Aug 14 17:37:19 IST 2012*
>>
>>
>> On Tue, Jun 18, 2013 at 5:01 PM, Nitin Pawar <nitinpawar...@gmail.com> wrote:
>>
>>> Is "#" your field separator?
>>> Also, the separator can be given in its octal representation, so you can
>>> give that a try.
>>>
>>> Why do your columns have \N as values? Is it for NULL?
>>>
>>> What version of Sqoop are you using?
>>>
>>>
>>> On Tue, Jun 18, 2013 at 5:00 PM, Hamza Asad <hamza.asa...@gmail.com> wrote:
>>>
>>>> I'm executing the following command:*
>>>> sqoop export --connect jdbc:mysql://localhost/xxxx --table
>>>> dump_hive_events_details --export-dir hive/warehouse/xxxx.db/events_details
>>>> --input-null-non-string \N --input-fields-terminated-by '#' --username
>>>> xxxxxxxx --password xxxxxxxxx*
>>>> *
>>>> 13/06/18 16:26:44 INFO mapred.JobClient: Task Id :
>>>> attempt_201306170658_0106_m_000001_0, Status : FAILED
>>>> java.lang.NumberFormatException: For input string: "8119844 1 4472499
>>>> 2013-01-29 00:00:00.0 1 4 1 \N \N \N \N \N \N \N \N \N \N 8 \N \N \N \N \N
>>>> 1 \N \N 3 2 \N 1"
>>>>     at
>>>> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>>>>     at java.lang.Integer.parseInt(Integer.java:492)
>>>>     at java.lang.Integer.valueOf(Integer.java:582)
>>>>     at
>>>> dump_hive_events_details.__loadFromFields(dump_hive_events_details.java:949)
>>>>     at dump_hive_events_details.parse(dump_hive_events_details.java:901)
>>>>     at
>>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
>>>>     at
>>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>>>>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>>>>     at
>>>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>>>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>>>>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>>> *
>>>>
>>>>
>>>> On Tue, Jun 18, 2013 at 4:07 PM, Nitin Pawar <nitinpawar...@gmail.com> wrote:
>>>>
>>>>> check the option --input-fields-terminated-by in sqoop export
>>>>>
>>>>>
>>>>> On Tue, Jun 18, 2013 at 4:31 PM, Hamza Asad <hamza.asa...@gmail.com> wrote:
>>>>>
>>>>>> I want to export my table to MySQL, and for that I'm using the sqoop
>>>>>> export command. In HDFS the data apparently has no field separator,
>>>>>> but it must contain one. The data is saved in the format shown below:
>>>>>> *8119844 1 4472499 2013-01-29 00:00:00.0 1 4 1 \N \N \N \N \N \N \N
>>>>>> \N \N \N 8 \N \N \N \N \N 1 \N \N 3 2 \N 1*
>>>>>> How can I export this type of data to MySQL, and what field separator
>>>>>> should I specify there? Please help.
>>>>>>
>>>>>> --
>>>>>> *Muhammad Hamza Asad*
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Nitin Pawar
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> *Muhammad Hamza Asad*
>>>>
>>>
>>>
>>>
>>> --
>>> Nitin Pawar
>>>
>>
>>
>>
>> --
>> *Muhammad Hamza Asad*
>>
>
>
>
> --
> Nitin Pawar
>



-- 
*Muhammad Hamza Asad*
CREATE TABLE events_details(
  id int,
  event_id int,
  user_id BIGINT,
  event_date string,
  intval_1 int ,
  intval_2 int,
  intval_3 int,
  intval_4 int,
  intval_5 int,
  intval_6 int,
  intval_7 int,
  intval_8 int,
  intval_9 int,
  intval_10 int,
  intval_11 int,
  intval_12 int,
  intval_13 int,
  intval_14 int,
  intval_15 int,
  intval_16 int,
  intval_17 int,
  intval_18 int,
  intval_19 int,
  intval_20 int,
  intval_21 int,
  intval_22 int,
  intval_23 int,
  intval_24 int,
  intval_25 int,
  intval_26 int)
ROW FORMAT DELIMITED 
 FIELDS TERMINATED BY ',' 
STORED AS TEXTFILE; 
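A note on the sample row upthread: a delimiter that is invisible when a row is pasted into an editor is often Hive's default \001 (Ctrl-A), not a missing separator. This is a minimal local sketch, assuming the export files use \001; the row below is a stand-in built with printf, not read from HDFS:

```shell
# Build the delimiter byte once so the escape isn't mangled by the shell.
d="$(printf '\001')"

# A stand-in row in the shape of the sample upthread, joined with \001.
row="8119844${d}1${d}4472499${d}2013-01-29 00:00:00.0${d}\\N"

# od -c renders the non-printing delimiter visibly (as 001); on real data:
#   hadoop fs -cat <export-dir>/part-m-00000 | head -1 | od -c
printf '%s\n' "$row" | od -c | head -3

# Splitting on the same byte recovers the individual fields.
printf '%s\n' "$row" | awk -F"$d" '{print NF" fields; first="$1}'
```

On the real files, replacing the printf with a `hadoop fs -cat` of one part file shows whether the byte between fields is \001, a tab, or something else entirely.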
CREATE TABLE IF NOT EXISTS `dump_hive_events_details` (
  `id` int(11) NOT NULL auto_increment,
  `event_id` int(11) NOT NULL,
  `user_id` bigint(11) NOT NULL,
  `event_date` timestamp NULL default NULL,
  `intval_1` int(11) default NULL,
  `intval_2` int(11) default NULL,
  `intval_3` int(11) default NULL,
  `intval_4` int(11) default NULL,
  `intval_5` int(11) default NULL,
  `intval_6` int(11) default NULL,
  `intval_7` int(11) default NULL,
  `intval_8` int(11) default NULL,
  `intval_9` int(11) default NULL,
  `intval_10` int(11) default NULL,
  `intval_11` int(11) default NULL,
  `intval_12` int(11) default NULL,
  `intval_13` int(11) default NULL,
  `intval_14` int(11) default NULL,
  `intval_15` int(11) default NULL,
  `intval_16` int(11) default NULL,
  `intval_17` int(11) default NULL,
  `intval_18` int(11) default NULL,
  `intval_19` int(11) default NULL,
  `intval_20` int(11) default NULL,
  `intval_21` int(11) default NULL,
  `intval_22` int(11) default NULL,
  `intval_23` int(11) default NULL,
  `intval_24` int(11) default NULL,
  `intval_25` int(11) default NULL,
  `intval_26` int(11) default NULL,
  PRIMARY KEY  (`id`),
  UNIQUE KEY `event_id` (`event_id`,`user_id`,`event_date`),
  KEY `date` (`event_date`),
  KEY `intval_14` (`intval_14`),
  KEY `intval_22` (`intval_22`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 AUTO_INCREMENT=1 ;
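Putting the two schemas next to the error: every Hive column maps to a compatible MySQL type, and the NumberFormatException shows the whole row arriving as a single "integer" field, which points at the delimiter rather than the types. A hedged rework of the failing command, assuming the export dir really is \001-delimited (the connect string, paths, and credentials are the placeholders from the thread, not real values):

```shell
# Sketch only: assumes the data under --export-dir is \001-delimited.
# The doubled backslash hands the literal two characters '\N' to Sqoop,
# matching how Hive writes NULLs into text files.
sqoop export \
  --connect jdbc:mysql://localhost/xxxx \
  --table dump_hive_events_details \
  --export-dir hive/warehouse/xxxx.db/events_details \
  --input-fields-terminated-by '\001' \
  --input-null-string '\\N' \
  --input-null-non-string '\\N' \
  --username xxxxxxxx --password xxxxxxxxx
```

If the delimiter turns out to be something else (a tab, say), only the `--input-fields-terminated-by` value changes; the null handling stays the same.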

