You raise an important point: "metadata" commands like CREATE TABLE and
ALTER TABLE only affect metadata, not the actual data itself. So you have
to write the files into the partition directories yourself, and in the
correct schema. One way to do the latter is to stage the raw data in a
"temporary" table, then use INSERT ... PARTITION to have Hive write the
partitioned output for you.
Hello, and thank you both for your answers...
I think I found the problem... keep in mind I'm quite new to all this
Hive/Hadoop stuff :)
I think my problem was due to the fact that the CREATE TABLE statement had
the partition defined, but the information was not partitioned on the file
system (it wasn't laid out in per-partition subdirectories).
A couple of clarifying questions and suggestions. First, keep in mind that
Hive doesn't care if you have a typo of some kind in your external location
;) Use DESCRIBE FORMATTED to verify the path is right. For an external
partitioned table, DESCRIBE FORMATTED table
PARTITION(col1=val1, col2=val2, ...) shows the location of that particular
partition.
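For example (table name and partition value invented):

  DESCRIBE FORMATTED logs;                              -- table-level Location:
  DESCRIBE FORMATTED logs PARTITION (dt='2012-12-11');  -- that partition's location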
Fernando,
It is more likely related to your SerDe not matching the underlying data
than to the table being external on S3.
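A quick way to check (names invented): if the delimiter or SerDe declared
on the table doesn't match the file, Hive typically returns NULLs rather
than an error.

  -- If the file is actually tab-separated, declaring ',' here means each
  -- row parses into one column and the rest come back NULL.
  CREATE EXTERNAL TABLE mytable (ts STRING, msg STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  LOCATION 's3://my-bucket/data/';

  SELECT * FROM mytable LIMIT 5;  -- NULL columns usually point at the SerDe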
Mark
On Tue, Dec 11, 2012 at 6:05 AM, Fernando Andrés Doglio Turissini
wrote:
> Long subject, I know.. let me explain a bit more about the problem:
Long subject, I know.. let me explain a bit more about the problem:
I'm trying to load a file into a Hive table (this is on an EMR instance).
For that, I create an external table and set the location to the folder on
an S3 bucket where the file resides.
The problem is that even though the table gets created without errors,
querying it returns no rows.
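The statement looks roughly like this (names, columns, and delimiter
changed, but the shape is the same):

  CREATE EXTERNAL TABLE mytable (ts STRING, msg STRING)
  PARTITIONED BY (dt STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  LOCATION 's3://my-bucket/data/';

  -- For a partitioned external table this registers no partitions, so a
  -- SELECT right after it comes back empty until partitions are added.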