Hi Wenxing,

I have created a table based on the column information you sent, but I won't be able to do this testing in the next couple of days. By the way, have you tried the import with smaller data sets? That is, have you tested what the largest data set is that you can import successfully?
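One way to size that is to stage fixed-size subsets on the MySQL side and point the import job at each staging table in turn. A rough sketch (here `orders` is just a placeholder, since I don't have your real table name):

    -- Hypothetical staging tables; `orders` stands in for the real source
    -- table. LIMIT without ORDER BY grabs an arbitrary sample, which is
    -- fine for sizing the import.
    CREATE TABLE orders_100k AS SELECT * FROM orders LIMIT 100000;
    CREATE TABLE orders_1m   AS SELECT * FROM orders LIMIT 1000000;
    CREATE TABLE orders_5m   AS SELECT * FROM orders LIMIT 5000000;

If the 100k import succeeds but the 1m import fails, that narrows down where your throttling settings start to matter.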
Szabolcs

On Wed, Jan 4, 2017 at 10:55 AM, wenxing zheng <wenxing.zh...@gmail.com> wrote:

> Hi Szabolcs,
>
> I am testing this scenario with our client's slave database, and I am
> sorry that I cannot share the table definition and the sample data here.
> But attached is a sample table definition with the column types.
>
> It's quite complex.
>
> Thanks, Wenxing
>
> On Wed, Jan 4, 2017 at 4:24 PM, Szabolcs Vasas <va...@cloudera.com> wrote:
>
>> Hi Wenxing,
>>
>> I haven't tried this scenario yet, but I would be happy to test it on
>> my side. Can you please send me the DDL statement for creating the
>> MySQL table and some sample data?
>> It would also be very helpful to send the details of the job you would
>> like to run.
>>
>> Regards,
>> Szabolcs
>>
>> On Wed, Jan 4, 2017 at 2:54 AM, wenxing zheng <wenxing.zh...@gmail.com> wrote:
>>
>>> Can anyone help to advise?
>>>
>>> I also ran into a problem when I set checkColumn to updated_time, but
>>> currently all the updated_time values are NULL. In this case, Sqoop
>>> fails to start the job. I think we need to support this kind of case.
>>>
>>> On Thu, Dec 29, 2016 at 9:18 AM, wenxing zheng <wenxing.zh...@gmail.com> wrote:
>>>
>>>> Dear all,
>>>>
>>>> Has anyone already tried to import more than 10 million rows from
>>>> MySQL to HDFS using Sqoop2?
>>>>
>>>> I always failed at the very beginning with various throttling
>>>> settings, and never got it to work.
>>>>
>>>> Appreciated for any advice.
>>>> Thanks, Wenxing
>>
>> --
>> Szabolcs Vasas
>> Software Engineer
>> <http://www.cloudera.com>

--
Szabolcs Vasas
Software Engineer
<http://www.cloudera.com>
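P.S. Regarding the NULL updated_time issue in the earlier message: until Sqoop2 handles a check column whose values are all NULL, one possible workaround is to backfill the column on the MySQL side before starting the incremental job. A rough sketch (the `orders` table and the `created_time` fallback column are guesses, since I don't have your schema):

    -- Hypothetical backfill; `orders` and `created_time` are placeholders.
    -- Gives every row a non-NULL check-column value so the incremental
    -- job has something to compare against.
    UPDATE orders
    SET updated_time = COALESCE(created_time, CURRENT_TIMESTAMP)
    WHERE updated_time IS NULL;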