I've told it to CREATE EXTERNAL TABLE, but I'm still getting the same errors.
fs.default.name=hdfs://ip-10-64-74-82.ec2.internal:9000
metastore.SDS.location: s3://mapreduce.dev.evite.com/table_out/events
Ideas? Should fs.default somehow point to S3?
Cheers,
B
On Mon, Sep 26, 2011 at 5:11 PM, Miguel wrote:
Hi Bradford,
For tables stored on S3, you have to specify:
create EXTERNAL table events …
Regards,
Miguel
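Spelling out Miguel's suggestion, the DDL would look something like the sketch below. It's EXTERNAL plus an explicit LOCATION that makes Hive read the data in place on S3 instead of resolving the table path against fs.default.name; the column names, types, and delimiter here are hypothetical placeholders, and the S3 path is the one from Bradford's error output:

```sql
-- Minimal sketch, assuming the table should point at the S3 path from
-- the original error. Columns and delimiter are placeholders.
CREATE EXTERNAL TABLE events (
  event_time STRING,
  payload    STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
LOCATION 's3://mapreduce.dev.evite.com/table_out/events';
```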
On 27 Sep 2011, at 00:28, Jonathan Seidman wrote:
Hey Bradford - from my experience that error occurs when there's a conflict
between the "fs.default.name" setting and the value in the
metastore.SDS.location column in the Hive metadata. For us this has occurred
when either migrating to a new cluster or changing the NN hostname. Not sure
how all th
Hey amigos,
I'm doing an EMR load from HDFS to S3. My example looks correct,
but I'm getting an odd error. Since all the EMR data is in one
directory, I'm copying the file to HDFS, then doing 'LOAD DATA INPATH'
to put it back into S3.
CREATE TABLE events(
..blahblah...
)
ROW FORMAT DELIMITED
F
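For reference, the load step described above would look roughly like this against an S3-backed external table; the HDFS staging path is illustrative, and note that LOAD DATA INPATH moves the files from the source path into the table's storage location rather than copying them:

```sql
-- Hedged sketch of the copy-to-HDFS-then-load flow; the staging path
-- is a hypothetical example, not from the original post.
LOAD DATA INPATH 'hdfs:///tmp/events_staging/'
INTO TABLE events;
```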