On 13/02/13 17:34, Mark Grover wrote:
> Hi Marcin,
> Sorry to hear that you ran into this.
>
> My guess is you are using Yarn and this is, in fact, a known issue.
>
> The culprit line here is
> https://github.com/apache/hive/blob/branch-0.10/ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java#L209
I think we need to think a little bigger than this.
Recently I've been thinking that what would be most useful to the Hive user
community would be a CHAN – Comprehensive Hive Archive Network (analogous to
CPAN, CRAN, CTAN, etc.): a central place where user-contributed UD[A,T]Fs could
be uploaded
I stumbled across the little-documented reflect function today. I've always
known about it, but Java scares me if it's not in a cup, so I didn't dig.
Well, today I dug, and found an awesome use case for reflect (for me) and
wanted to share. I also thought it would be nice to validate some thoughts
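A minimal sketch of the kind of call reflect enables (the table and column names below are placeholders, not from the original mail):
SELECT reflect("java.lang.String", "valueOf", 1),
       reflect("java.net.URLDecoder", "decode", url, "UTF-8")
FROM some_table LIMIT 2;
reflect just invokes the named Java method via reflection on each row, so simple methods already on the classpath can be used without writing a dedicated UDF.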
Hi All,
In HIVE-3959, I'm working actively on guaranteeing accuracy of physical stats.
For context, the status quo in Hive is that both Table stats and Partition
stats exist but are quite unreliable (even with hive.stats.reliable set to
true). Either stats should be reliable or they should not
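For reference, the pieces involved look roughly like this from the CLI (the table name and partition spec are placeholders):
SET hive.stats.reliable=true;
ANALYZE TABLE some_table PARTITION (ds='2013-02-13') COMPUTE STATISTICS;
That is, stats are either autogathered on insert or computed explicitly with ANALYZE, and hive.stats.reliable is meant to fail the query if they cannot be collected correctly.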
Hi Robin,
Thanks for the response.
The point you mentioned is one of several serious issues in the code.
I have fixed all of them, and it is in pretty good shape now after some
testing.
You can find it at:
https://github.com/Abhishek2301/Hive/blob/master/src/UDAFTopNPercent.java
Hi,
You might well have found the error yourself by now, but if not, the problem is
that you left the "static" keyword off the declaration of
TopNPercentEvaluator. Line 18 of your code should read
public static class TopNPercentEvaluator implements UDAFEvaluator {
and then this error goes away.
I developed the inline udtf. Seems to work:
http://svn.apache.org/repos/asf/hive/trunk/ql/src/test/queries/clientpositive/udf_inline.q
SELECT inline(
  ARRAY(
    STRUCT (1, 'dude!'),
    STRUCT (2, 'Wheres'),
    STRUCT (3, 'my car?')
  )
) as (id, text) FROM SRC limit 2;
Hi all,
I'm running hive-0.9.0-cdh4.1.3. I've created an external table partitioned
across year, month, and day. It works fine until I write a query that
causes multiple MR jobs to be run. Then I get a FileNotFoundException when
it tries to run the second job.
Job Submission failed with exception
Figured it out, from
https://cwiki.apache.org/Hive/languagemanual-udf.html#LanguageManualUDF-ComplexTypeConstructors
It should be:
INSERT INTO TABLE oc SELECT named_struct('a', x, 'b', y) FROM tc;
--- On Wed, 2/13/13, Dean Wampler wrote:
From: Dean Wampler
Subject: Re: INSERT INTO table with STRU
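Spelled out end to end, a minimal sketch (the schemas below are assumed, not taken from the thread):
CREATE TABLE tc (x INT, y STRING);
CREATE TABLE oc (s STRUCT<a:INT, b:STRING>);
INSERT INTO TABLE oc SELECT named_struct('a', x, 'b', y) FROM tc;
named_struct lets you give the struct fields the names the target column expects, whereas plain struct(x, y) produces the generic field names col1, col2.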
Hmm. I tried the following hacks, but none of them would parse. Ideas?
I changed:
... select struct(x,y) ...
to
... select struct(x,y) as struct ...
... select cast(struct(x,y) as struct) ...
... select struct(x as a,y as b) ...
Okay, but there is a hack that does work: bypass INSERT INTO a
Hi Marcin,
Sorry to hear that you ran into this.
My guess is you are using Yarn and this is, in fact, a known issue.
The culprit line here is
https://github.com/apache/hive/blob/branch-0.10/ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java#L209
This issue is being tracked by
https://issu
Hi,
I'm experiencing the following problem when running a simple query under
hive-0.10. I'm using the stock release package with CDH-4.1.
The query needs to execute a map-reduce job for this to happen. For me,
something along these lines is sufficient:
select count(*) from sample_table;
The error I get
I'll mention some workarounds, but they all add overhead:
1. Use STRING for the column, then parse it with the date functions
Alexander mentioned.
2. Use STRING, then replace the offending '-' with a space, e.g.,
select printf("%s %s", substr('2013-02-13-08:11:22', 0, 10),
substr('2013-02-13-08:1
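Spelled out, a minimal sketch of option 2 (ts_raw and the table name are placeholders; the raw column is stored as STRING):
SELECT CAST(concat(substr(ts_raw, 1, 10), ' ', substr(ts_raw, 12)) AS TIMESTAMP)
FROM some_table;
i.e. splice a space in where the third '-' sits; the string then matches Hive's expected 'yyyy-MM-dd HH:mm:ss' layout and can be cast, or fed to unix_timestamp().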
May
https://cwiki.apache.org/Hive/languagemanual-udf.html#LanguageManualUDF-DateFunctions
help you?
- Alex
On Feb 13, 2013, at 10:43 AM, Chunky Gupta wrote:
> Hi,
>
> I have a log file which has timestamp in format "YYYY-MM-DD-HH:MM:SS". But
> since the timestamp datatype format in hive is
Hi,
I have a log file which has timestamp in format "YYYY-MM-DD-HH:MM:SS". But
since the timestamp datatype format in hive is "YYYY-MM-DD HH:MM:SS".
I created a table with the datatype of that column as TIMESTAMP. But when I
load the data it throws an error. I think it is because of the difference in
format.
Try to use the following:
http://hadoop.apache.org/docs/r0.18.2/hdfs_shell.html#cp
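For example, from the Hive CLI (the source and destination paths below are made up):
dfs -cp /user/hive/warehouse/mydb.db /data/new_warehouse/mydb.db;
(the same hadoop fs -cp from the page above works from a shell). Note that copying only duplicates the files; existing tables keep pointing at their old locations in the metastore until those are updated.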
On Wed, Feb 13, 2013 at 2:01 PM, Hamza Asad wrote:
> Thanks.. done it successfully. Now I want to transfer my data from the old
> location to the new location. How can I do that?
>
>
> On Wed, Feb 13, 2013 at 1:24 PM, Kugathasan
Thanks.. done it successfully. Now I want to transfer my data from the old
location to the new location. How can I do that?
On Wed, Feb 13, 2013 at 1:24 PM, Kugathasan Abimaran <
abimar...@hsenidmobile.com> wrote:
> If you don't have the following property in hive-site.xml, add it and
> change the locatio
If you don't have the following property in hive-site.xml, add it and
change the location to wherever you want.
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
  <description>location of default database for the warehouse</description>
</property>
On Wed, Feb 13, 2013 at 1:44 PM, Hamza Asad wrote:
> Dear all, how can I change def