Hi Guys,

I am trying to import the data from my database into Hive using Sqoop.

My use case is to load the data into Hive and later, if necessary, update the 
records in Hive. I learnt that ACID transactions require bucketed tables.
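
For reference, these are the ACID-related settings I understand are required 
(taken from the Hive transactions documentation; please correct me if 
something is missing):

```sql
-- Settings commonly needed for Hive ACID transactions
SET hive.support.concurrency=true;
SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
SET hive.compactor.initiator.on=true;
SET hive.compactor.worker.threads=1;
SET hive.enforce.bucketing=true;
```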

I created a bucketed table, and when I try to import the data with Sqoop 
import it throws the following error:

ERROR tool.ImportTool: Encountered IOException running import job: 
org.apache.hive.hcatalog.common.HCatException : 2016 : Error operation not 
supported : Store into a partition with bucket definition from Pig/Mapreduce is 
not supported 
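
For context, the import command looks roughly like this (the connection 
string, credentials, and table names below are placeholders, not my real 
values):

```shell
sqoop import \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username myuser -P \
  --table transactions \
  --hcatalog-database default \
  --hcatalog-table transactions_bucketed \
  --num-mappers 4
```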

If this is not supported, is there a workaround?

I also created a partitioned (non-bucketed) table and imported the data into 
it. Can I insert the data from that partitioned table into the bucketed 
table? If yes, can someone provide a good example?

I have tried an `insert into` statement for each partition, but it takes very 
long, because each of my years contains 241,972,735 records.

Also, this might not be good practice in a production environment.
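
In case it helps, here is roughly the approach I have in mind, sketched with 
made-up table and column names (assuming ORC storage and Hive with 
transactional tables enabled):

```sql
-- Staging table: partitioned, non-bucketed, so Sqoop can load into it.
CREATE TABLE transactions_staging (
  id BIGINT,
  amount DOUBLE
)
PARTITIONED BY (yr INT)
STORED AS ORC;

-- Target table: partitioned + bucketed + transactional, for ACID updates.
CREATE TABLE transactions_acid (
  id BIGINT,
  amount DOUBLE
)
PARTITIONED BY (yr INT)
CLUSTERED BY (id) INTO 16 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');

-- Copy all partitions in one statement using dynamic partitioning,
-- instead of running a separate INSERT for each partition.
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT INTO TABLE transactions_acid PARTITION (yr)
SELECT id, amount, yr FROM transactions_staging;
```

Would a single dynamic-partition insert like this be the recommended way, or 
is there a faster path?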

Appreciate your help.

Thanks
Sowjanya
