Hi all,

I have a 30 MB SQL file that consists of a single CREATE TABLE statement with about 800,000 columns, hence the size.
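For context, the file size is consistent with the column count. A toy generator (table and column names here are made up, just for illustration) shows that roughly 800k short column definitions comes to tens of MB, the same order as my file:

```python
# Hypothetical generator: builds a CREATE TABLE statement with n_cols columns,
# to sanity-check that ~800,000 column definitions yields a file in the tens of MB.
def make_wide_ddl(table_name, n_cols):
    # one short "col_N STRING" definition per column, comma-separated
    cols = ",\n".join(f"  col_{i} STRING" for i in range(n_cols))
    return f"CREATE TABLE {table_name} (\n{cols}\n);"

ddl = make_wide_ddl("wide_table", 800_000)
print(f"{len(ddl) / 1e6:.1f} MB")  # tens of MB, same order as the 30 MB file
```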
I am trying to execute it using hive -f <file>. Initially Hive ran the command with a 256 MB heap and gave me an OOM error. I increased the heap size by exporting HADOOP_HEAPSIZE, first to 1 GB and eventually to 2 GB, which made the OOM error go away. However, the hive command then ran for 5 hours without actually creating the table. The JVM was still running, but:

1. running strace on the process showed that it was stuck on a futex call;
2. I am using MySQL for the metastore, and no rows were added to either the TBLS or the COLUMNS table.

Questions:

1. Can Hive do this create table of 800k columns from a 30 MB SQL file?
2. If it is theoretically possible, what could be happening that takes over 5 hours and still does not succeed?

Any insight is much appreciated.

Sincerely,
Ameet