RE: How to change the separator of INSERT OVERWRITE LOCAL DIRECTORY

2013-06-19 Thread Tony Burton
Works for me like this, using the most recent Hive on AWS with a ~ delimiter: > create external table mydata () row format delimited fields terminated by '~' stored as textfile location 's3://mybucket/path/to/data'; > insert overwrite table mydata select * from otherdata; From: Felix.徐 [mai
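In other words, the delimiter is declared on an external table and the query writes into that table, sidestepping INSERT OVERWRITE LOCAL DIRECTORY altogether. A minimal HiveQL sketch of that approach, with hypothetical column names and placeholder S3 paths (the column list is elided in the original mail):

  -- Declare the field separator on the external table rather than on the
  -- output directory; files written under LOCATION will then use '~'.
  -- Column names, table names and the s3:// path are placeholders.
  create external table mydata (id int, label string)
    row format delimited fields terminated by '~'
    stored as textfile
    location 's3://mybucket/path/to/data';

  insert overwrite table mydata
  select id, label from otherdata;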

RE: S3/EMR Hive: Load contents of a single file

2013-03-27 Thread Tony Burton
one. A directory table uses all files in the directory while a file table uses one specific file and properly avoids sibling files. My bad. Thanks for the careful analysis and clarification. TIL! Cheers! On Mar 27, 2013, at 02:58, Tony Burton wrote: > A bit more info - do an exten
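The distinction settled in this thread: a table whose LOCATION is a directory picks up every file under it (including files added later), while pointing LOCATION at a single file limits the table to that file on the EMR Hive build discussed here. A hedged HiveQL sketch of the two forms, with placeholder paths, columns and file name (stock Hive generally expects LOCATION to be a directory):

  -- Directory-backed table: all files under the prefix are part of the
  -- table, including files added after the table is created.
  create external table alldata (a int, b string)
    row format delimited fields terminated by '\t'
    stored as textfile
    location 's3://mybucket/path/to/data/';

  -- File-backed table (per the behaviour reported above on EMR Hive):
  -- only the named file is read; sibling files are ignored.
  create external table onefile (a int, b string)
    row format delimited fields terminated by '\t'
    stored as textfile
    location 's3://mybucket/path/to/data/datafile.txt';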

RE: S3/EMR Hive: Load contents of a single file

2013-03-27 Thread Tony Burton
singly) "location:s3://mybucket/path/to/data/" From: Tony Burton [mailto:tbur...@sportingindex.com] Sent: 27 March 2013 08:46 To: 'user@hive.apache.org' Subject: RE: S3/EMR Hive: Load contents of a single file Thanks for the reply Keith. > you could have dispensed with the addi

RE: S3/EMR Hive: Load contents of a single file

2013-03-27 Thread Tony Burton
you know for certain that it isn't using other files also in that directory as part of the same table...or if it is currently empty, that if you add a new file to the directory after creating the table in your described fashion, it doesn't immediately become visible as part of the table?

RE: S3/EMR Hive: Load contents of a single file

2013-03-26 Thread Tony Burton
on, you have to point to a directory. You cannot point to a file (IMHO). Hope that helps sanjay From: Tony Burton <tbur...@sportingindex.com> Reply-To: "user@hive.apache.org" <user@hive.apache.org> Date: Tuesday,

S3/EMR Hive: Load contents of a single file

2013-03-26 Thread Tony Burton
ehouse Clearly different, but which is correct? Is there an easier way to load a single file into a hive table? Or should I just put each file in a directory and proceed as before? Thanks! Tony Tony Burton Senior Software Engineer e:

RE: HWI use on AWS/EMR

2013-01-21 Thread Tony Burton
uld be able to change the security group for the node. This feature was not there previously; it came in a few months back On Mon, Jan 21, 2013 at 5:09 PM, Tony Burton <tbur...@sportingindex.com> wrote: I've tracked down this document on AWS: http://docs.aws.amazon.com/AWSEC2/latest/Us

RE: HWI use on AWS/EMR

2013-01-21 Thread Tony Burton
i.com<http://www.openbi.com/> | ariel.mar...@openbi.com<mailto:ariel.mar...@openbi.com> 150 N Michigan Avenue, Suite 2800, Chicago, IL 60601 Cell: 314-827-4356 On Fri, Jan 18, 2013 at 12:17 PM, Tony Burton mailto:tbur...@sportingindex.com>> wrote: If you could provide the steps to do t

RE: HWI use on AWS/EMR

2013-01-18 Thread Tony Burton
correct. Best, Ariel - Ariel Marcus, Consultant www.openbi.com | ariel.mar...@openbi.com 150 N Michigan Avenue, Suite 2800, Chicago, IL 60601 Cell: 314-827-4356 On Fri, Jan 18, 2013 at 12:00 PM, Tony Burton <tbur...

RE: HWI use on AWS/EMR

2013-01-18 Thread Tony Burton
<mailto:ariel.mar...@openbi.com> 150 N Michigan Avenue, Suite 2800, Chicago, IL 60601 Cell: 314-827-4356 On Fri, Jan 18, 2013 at 11:26 AM, Tony Burton mailto:tbur...@sportingindex.com>> wrote: Hi Ariel, Thanks for the speedy reply. We'll be accessing the HWI from Window

RE: HWI use on AWS/EMR

2013-01-18 Thread Tony Burton
ot blocked by default. That's certainly true for ports 9XXX used by the JobTracker, etc. dean On Fri, Jan 18, 2013 at 9:54 AM, Tony Burton <tbur...@sportingindex.com> wrote: Hi, I'm trying to get HWI running and accessible from an Amazon Web Services EMR instance. I'v

RE: HWI use on AWS/EMR

2013-01-18 Thread Tony Burton
work, verify that port 80 is not blocked by default. That's certainly true for ports 9XXX used by the JobTracker, etc. dean On Fri, Jan 18, 2013 at 9:54 AM, Tony Burton <tbur...@sportingindex.com> wrote: Hi, I'm trying to get HWI running and accessible from an Amazon Web

HWI use on AWS/EMR

2013-01-18 Thread Tony Burton
Hi, I'm trying to get HWI running and accessible from an Amazon Web Services EMR instance. I've hit a blocker early on though, and the documentation is less than illuminating. Can you share any experiences you have had? Specifically, here's what I'm curious about. - Running on AWS. I've create
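This thread boils down to two moving parts: starting the HWI service on the EMR master node and making its port reachable from outside AWS. A rough sketch under the assumption of a standard Hive install, where HWI is configured via hive-site.xml and started with hive --service hwi; the war-file path and security-group details are environment-specific placeholders:

  # hive-site.xml properties for HWI (values shown are the usual defaults;
  # the war-file name depends on the installed Hive version):
  #   hive.hwi.listen.host = 0.0.0.0
  #   hive.hwi.listen.port = 9999
  #   hive.hwi.war.file    = lib/hive-hwi-<version>.war

  # Start the web interface on the master node:
  $ hive --service hwi

  # The listen port must also be allowed in the EC2 security group attached
  # to the EMR master node before the UI is reachable from outside AWS.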

RE: question on output hive table to file

2012-09-05 Thread Tony Burton
I use the following example to set my own delimiter; I hope it's easy to adjust for your own needs: hive> create external table input (a int, b string, c float) row format delimited fields terminated by "\t" stored as sequencefile location 's3://path/to/data/input/'; hive> create external table
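The snippet is cut off before the second table definition, but the usual shape of the example is: read from the sequencefile-backed input table and write through a second external table that declares the output delimiter. The completion below is illustrative only (the output table name, '|' delimiter and path are not from the original mail):

  hive> create external table output (a int, b string, c float)
          row format delimited fields terminated by '|'
          stored as textfile
          location 's3://path/to/data/output/';
  hive> insert overwrite table output select a, b, c from input;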

question about Hive 'recover partitions' on AWS S3

2012-04-24 Thread Tony Burton
Hi, Is it ever possible not to specify the partition variable name when discovering partitions? I'm sure I've seen this demonstrated but of course, when it's needed, I can't find it. Can anyone clarify? I have a number of date-named directories in Amazon AWS S3, containing data stored in sequen
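For context, the usual options here (assuming a partition column named dt and a table called events, both hypothetical): EMR's Hive can discover partitions in one statement, but only when the S3 directories follow the key=value naming convention; plain date-named directories have to be added one at a time. A sketch:

  -- Works on EMR Hive when the layout is .../dt=2012-04-24/... :
  alter table events recover partitions;

  -- With plain date-named directories (no "dt=" prefix), automatic
  -- discovery does not apply and each partition is mapped explicitly:
  alter table events add partition (dt='2012-04-24')
    location 's3://mybucket/events/2012-04-24/';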

RE: UnixODBC and Hive setup

2011-10-28 Thread Tony Burton
ssor::eventHandler_ boost::shared_ptr apache::thrift::TProcessor::eventHandler_ Any ideas? Thanks in advance. Tony From: Tony Burton [mailto:tbur...@sportingindex.com] Sent: 27 October 2011 13:49 To: user@hive.apache.org Subject: RE: UnixODBC and Hive setup ** to fix the missing uint32_t i had to

RE: UnixODBC and Hive setup

2011-10-27 Thread Tony Burton
' must be available [-fpermissive] (and one more similar) This required an extra #include "/usr/include/netinet.h" in /usr/local/include/thrift/protocol/TBinaryProtocol.tcc From: Tony Burton [mailto:tbur...@sportingindex.com] Sent: 27 October 2011 13:35 To: user@hive.apache

RE: UnixODBC and Hive setup

2011-10-27 Thread Tony Burton
mp;& make && make install. Bit more instructions are there in README in same dir. Hope it helps, Ashutosh On Wed, Oct 26, 2011 at 09:05, Tony Burton mailto:tbur...@sportingindex.com>> wrote: Hello, I'm having trouble installing and running UnixODBC for Hive. Can anyone h

RE: UnixODBC and Hive setup

2011-10-26 Thread Tony Burton
et install gcc $ sudo apt-get install g++ which will upgrade your gcc to the latest version. That may help. Ashutosh On Wed, Oct 26, 2011 at 10:02, Tony Burton <tbur...@sportingindex.com> wrote: Hi Ashutosh, Thanks for the speedy reply! I found the contrib/fb303 easily enough - the

RE: UnixODBC and Hive setup

2011-10-26 Thread Tony Burton
You need to build fb303 for this. Go to the dir where you untarred the thrift tarball, then cd contrib/fb303. Then ./bootstrap.sh && configure && make && make install. More instructions are in the README in the same dir. Hope it helps, Ashutosh On Wed, Oct 26, 2011 at 0
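A cleaned-up version of those build steps as a shell sketch; it assumes the thrift source tree is already unpacked and that ./configure needs no extra flags for the environment in question (the source path is a placeholder):

  # Build and install fb303 from inside the unpacked thrift source tree.
  $ cd /path/to/thrift-src/contrib/fb303
  $ ./bootstrap.sh
  $ ./configure
  $ make
  $ sudo make install
  # See the README in the same directory for more detail.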