Hi all,
I have a large table partitioned by date in S3, and I would like to copy it
to a local partitioned table stored in HDFS. Any hints on how to do this
efficiently?
Thanks,
Rosanna
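One approach that often works for this (a sketch, not from the thread; `s3_table` and `local_table` are hypothetical names, and the table is assumed to be partitioned by a `dt` string column) is to create the HDFS-backed table with the same schema and let Hive's dynamic partitioning copy every partition in a single statement:

```sql
-- Copy all partitions from the S3-backed table into a local (HDFS) table.
-- The hive.exec.dynamic.partition* settings are real Hive settings;
-- the table and column names here are hypothetical.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

INSERT OVERWRITE TABLE local_table PARTITION (dt)
SELECT * FROM s3_table;
```

If the file format is identical on both sides, copying the underlying files with `hadoop distcp` and then registering the partitions (e.g. `ALTER TABLE ... ADD PARTITION`) avoids rewriting the data through MapReduce entirely.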
Hi Marcos,
Here's a JIRA query that will show you the list of unresolved Hive issues
related to the HBase storage handler:
https://issues.apache.org/jira/secure/IssueNavigator.jspa?reset=true&mode=hide&jqlQuery=component+%3D+%22HBase+Handler%22+AND+project+%3D+HIVE+AND+resolution+%3D+Unresolved+O
On 4/12/2011 2:13 PM, Jean-Daniel Cryans wrote:
Hello,
I have to use Hive 7 for a project, so I wanted to know some details about it.
Could someone also forward me the link to a tutorial so that I can go
ahead and implement it?
Thanks a lot.
Is there anything in particular you'd like to know?
I recently answered a more specific (but still general) question about
Hive/HBase here: http://search-hadoop.com/m/YZe7h1zxxoc1
I will also be giving a presentation at OSCON Data in July about our
experience using both together.
J-D
Hi Prash,
Try this:
create external table mslog (
  time_stamp string,
  seq string
)
row format delimited
fields terminated by '\t'
stored as textfile
location 's3://your/bucket/path/';
Important: your S3 bucket path may only contain files that share the same
schema and format; Hive doesn't handle mixed file layouts under a single
table location.
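Once the table is defined, a quick sanity check (assuming the DDL above) confirms that Hive can read and split the S3 files as expected:

```sql
-- Rows should come back split on tabs into the two declared columns;
-- NULLs in either column usually mean the delimiter or schema is wrong.
SELECT time_stamp, seq FROM mslog LIMIT 10;
```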