Hi:
Currently, our hive_to_hdfs function has two parts. The first part
retrieves transaction records from Hive and writes them to a temporary
file on the local file system. The second part puts that temporary file
from the local file system into HDFS. The second part runs on the
NameNode and is outside of the Hadoop process.
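The two-part flow described above can be sketched as two shell commands. A minimal sketch in Python that only builds the command strings (the query, temp-file path, and HDFS path below are hypothetical placeholders, not taken from our actual function):

```python
import shlex

def hive_to_hdfs_cmds(query, local_tmp, hdfs_path):
    """Build the two commands behind the two parts of hive_to_hdfs."""
    # Part 1: run the Hive query and redirect the result rows
    # into a temporary file on the local file system.
    part1 = f"hive -e {shlex.quote(query)} > {shlex.quote(local_tmp)}"
    # Part 2: copy the temporary local file into HDFS; this step
    # runs on the NameNode host, outside of the Hadoop job itself.
    part2 = f"hadoop fs -put {shlex.quote(local_tmp)} {shlex.quote(hdfs_path)}"
    return part1, part2

p1, p2 = hive_to_hdfs_cmds("SELECT * FROM transactions",
                           "/tmp/tx.tsv", "/data/tx")
print(p1)  # hive -e 'SELECT * FROM transactions' > /tmp/tx.tsv
print(p2)  # hadoop fs -put /tmp/tx.tsv /data/tx
```

Because part 2 is a plain `hadoop fs -put` of a local file, it depends on the NameNode host having enough local disk for the temp file, which is one reason it sits outside the Hadoop process.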
Hi:
I have a table transaction3 with a field named 'date'. That table is the
target of a Sqoop import from MySQL. The table in MySQL has a field
named 'date', and Sqoop does not allow column name mapping, so the field
name 'date' is kept in transaction3.
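Since Sqoop has no column-rename flag for a plain `--table` import, one commonly used workaround is a free-form `--query` import that aliases the column in SQL (e.g. `date AS txn_date`). A minimal sketch that builds such a command line (the JDBC URL, username, alias, and target directory are hypothetical, and `$CONDITIONS` is the placeholder Sqoop requires in free-form queries):

```python
def sqoop_query_import_cmd(jdbc_url, username, query, target_dir):
    """Build a Sqoop free-form query import as an argv list."""
    # --query replaces --table, so the SQL alias controls the
    # column name that lands on the Hadoop side.
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--username", username,
        "--query", query,
        "--target-dir", target_dir,
        "-m", "1",  # single mapper; $CONDITIONS is still required
    ]

cmd = sqoop_query_import_cmd(
    "jdbc:mysql://dbhost/sales",          # hypothetical JDBC URL
    "etl_user",                           # hypothetical user
    "SELECT id, date AS txn_date FROM transactions WHERE $CONDITIONS",
    "/data/transaction3",                 # hypothetical HDFS dir
)
print(" ".join(cmd))
```

With this approach the imported data carries the alias (`txn_date`), avoiding the reserved-looking name 'date' in the target table, at the cost of writing the import as a query instead of a simple table import.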