Both tables have NO partitions.
From: "Sahibdeep Singh"
To: user@hive.apache.org
Date: 2019/03/11 13:09
Subject: Re: How to update the metadata of a Hive table?
Does the new location have the old partitions as well? Or do some partitions lie
in the old location and some in the new location?
On Sun, Mar 1
No, it is NOT an external table.
Both tables are managed tables and have no partitions.
The analyze command only updates the row count but not other metadata,
such as the table size.
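For reference, a minimal sketch of the statistics commands under discussion (the table name foo follows the original post; which statistics get refreshed may depend on your Hive version):

    -- Recompute basic table-level statistics (numRows, numFiles, totalSize, rawDataSize)
    ANALYZE TABLE foo COMPUTE STATISTICS;

    -- The NOSCAN variant refreshes only file-based statistics (numFiles, totalSize) without scanning the data
    ANALYZE TABLE foo COMPUTE STATISTICS NOSCAN;

    -- Inspect the table parameters to see which statistics were actually updated
    DESCRIBE FORMATTED foo;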
From: "Ashutosh Bapat"
To: user@hive.apache.org
Date: 2019/03/11 13:01
Subject: Re: How to update the metadata of a Hive table?
Hi, All,
We used the 'alter table foo set location new-hdfs-folder' command to force table
foo to use the latest data.
The command executed successfully and the new data can be accessed
through table foo.
But the metadata of table 'foo' is not correct, e.g. rowNumber and
table size.
I tried 'an
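A sketch of the relocation step described above, assuming the table is foo and using a made-up HDFS path:

    -- Point the managed table at the new HDFS directory (path is illustrative)
    ALTER TABLE foo SET LOCATION 'hdfs:///data/foo_new';

    -- SET LOCATION only changes the path stored in the metastore;
    -- the stored statistics still have to be regathered with ANALYZE TABLE (see above)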
Bulk import, and no partitions.
From: Jörn Franke
To: user@hive.apache.org
Date: 2018/01/02 14:56
Subject: Re: Does Hive SQL support reading data without locking?
How do you import the data? Bulk import?
What about using partitions (or is the data too small for daily
partitions)?
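To illustrate the daily-partition idea suggested here, a rough sketch (table, column, and staging names are made up):

    -- A date-partitioned target table so each daily load writes only one partition
    CREATE TABLE events (id BIGINT, payload STRING)
    PARTITIONED BY (load_date STRING)
    STORED AS ORC;

    -- The daily cron job overwrites only the new day's partition,
    -- leaving partitions being read by other users untouched
    INSERT OVERWRITE TABLE events PARTITION (load_date = '2018-01-02')
    SELECT id, payload FROM staging_events;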
On 2. Jan 2018, at
Hi, All,
We are using Hive to persist our data and we run cron jobs to import new
data into Hive daily.
At the same time, our users may query data from Hive through third-party
software like Tableau.
We found an issue where the importing job will fail while the table is being queried.
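One hedged way to check whether the failing import is being blocked by readers is to inspect the locks Hive holds on the table (foo is a placeholder name; the output depends on which lock manager is configured):

    -- Show locks currently held on the target table
    SHOW LOCKS foo;

    -- Or list all locks known to the lock manager
    SHOW LOCKS;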
Hi, All,
We are using Hive 1.1.0 on CDH 5.12 and I'm new to Hive.
Our client app imports data from Oracle DB to Hive.
The logic is like this:
get a connection to Hive, then
execute some SQL statements // the error occurs here
After importing several tables successfully, we got the following error