Hive to HDFS directly using INSERT OVERWRITE DIRECTORY incompatibility issue

2013-10-14 Thread Sonya Ling
Hi: Currently, our hive_to_hdfs function has two parts. The first part retrieves transaction records from Hive and puts them into a temporary file on the local file system. The second part puts the temporary file from the local file system into HDFS. The second part runs on the NameNode, outside of the Hadoop process, and
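The two-step detour through the local file system can usually be avoided: Hive's `INSERT OVERWRITE DIRECTORY` writes query results straight to an HDFS path from within the Hadoop job. A minimal sketch follows; the table name, column filter, and output path are hypothetical placeholders, not from the original post:

```sql
-- Write query results directly to an HDFS directory,
-- skipping the local-file staging step entirely.
-- (Table `transactions` and path '/user/etl/transactions_export'
-- are assumed names for illustration.)
INSERT OVERWRITE DIRECTORY '/user/etl/transactions_export'
SELECT *
FROM transactions
WHERE txn_date = '2013-10-14';
```

Note that the target directory's existing contents are replaced, and in older Hive versions the output is written with Hive's default delimiters (Ctrl-A field separators), which may be the source of the incompatibility the thread title refers to.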

Alter or Query a table with field name 'date' always get error

2013-08-22 Thread Sonya Ling
Hi: I have a table transaction3 with a field named 'date'. That table is the target of a table import from MySQL using Sqoop. The table in MySQL has a field named 'date', and Sqoop does not allow column name mapping. Therefore, the field name 'date' is kept in transaction3
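Because `date` is a keyword in HiveQL, referring to the column unquoted tends to fail at parse time. One common workaround, sketched here under the assumption that backquoted identifiers are enabled in the Hive version in use (column `amount` and the new name `txn_date` are illustrative, not from the original post):

```sql
-- Backquotes let Hive treat the reserved word as a plain identifier.
SELECT `date`, amount
FROM transaction3
WHERE `date` >= '2013-08-01';

-- Alternatively, rename the column after the Sqoop import so later
-- queries need no quoting at all.
ALTER TABLE transaction3 CHANGE `date` txn_date STRING;
```

Renaming after import sidesteps Sqoop's lack of column-name mapping while keeping the MySQL side untouched.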