To: user@hive.apache.org
Subject: Re: GenericUDF
Thanks Jason for your inputs.
I believe you are talking about the number of instances created, which
explains why the constructor was called thrice. But I'm still unclear about
the two calls made to the initialize method when I use the temporary
function in the query. Can you shed some more light on this?
- Created once when registering the function with the FunctionRegistry.
- The UDF is copied from the version in the registry during query compilation.
- The query plan is serialized, then deserialized by the tasks during query
execution, which constructs another instance of the UDF.
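The three steps above can be sketched in plain Java. This is a hypothetical stand-in (no Hive API involved; all names are illustrative) that just mimics the lifecycle: one construction at registration, one copy at compilation, one reconstruction when the task deserializes the plan, with initialize() called once at compile time and once on the task side.

```java
// Illustrative-only simulation of the GenericUDF instance lifecycle
// described above. Not Hive code: FakeUDF and copy() are stand-ins.
public class UdfLifecycleDemo {
    static int constructed = 0;
    static int initialized = 0;

    static class FakeUDF {
        FakeUDF() { constructed++; }
        void initialize() { initialized++; }
        // Stands in for both the registry copy and plan deserialization,
        // each of which produces a fresh instance.
        FakeUDF copy() { return new FakeUDF(); }
    }

    public static void main(String[] args) {
        FakeUDF registered = new FakeUDF();   // 1: registered in the FunctionRegistry
        FakeUDF compiled = registered.copy(); // 2: copied during query compilation
        compiled.initialize();                // initialize call #1 (compile time)
        FakeUDF taskSide = compiled.copy();   // 3: plan deserialized inside the task
        taskSide.initialize();                // initialize call #2 (task execution)
        System.out.println(constructed + " " + initialized); // prints "3 2"
    }
}
```

That accounts for the observation in the thread: three constructor calls, two initialize calls.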
Tried your example with Hive trunk. Didn't quite work out of the box, you'll
need to replace List with List.
Otherwise, this seemed to work:
hive> select ComplexUDFExample(array('a', 'b', 'c'), 'a') from src limit 3;
...
OK
true
true
true
Time taken: 6.271 seconds, Fetched: 3 row(s)
I want to do a simple test like this, but it is not working:
select ComplexUDFExample(List("a", "b", "c"), "b") from table1 limit 10;
FAILED: SemanticException [Error 10011]: Line 1:25 Invalid function 'List'
On Tuesday, February 4, 2014 2:34 PM, Raj Hadoop wrote:
How to test a Hive GenericUDF which accepts two parameters, List and T?
Can the List be the output of collect_set? Please advise.
I have a generic UDF which takes List and T. I want to test how it works
through Hive.
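The semantics being tested here, as the later query output in this thread suggests (true for each row), is a membership check: does the list contain the value? Below is a plain-Java sketch of that logic, without the Hive ObjectInspector plumbing; the null handling is an assumption, and the real UDF may behave differently on nulls.

```java
import java.util.Arrays;
import java.util.List;

// Plain-Java sketch of the evaluate() semantics behind a
// ComplexUDFExample-style UDF: list membership. Hive-specific argument
// inspection is omitted; null handling here is an assumption.
public class ContainsDemo {
    static boolean contains(List<String> list, String value) {
        if (list == null || value == null) {
            return false; // assumed safe default, not confirmed by the thread
        }
        return list.contains(value);
    }

    public static void main(String[] args) {
        System.out.println(contains(Arrays.asList("a", "b", "c"), "b")); // true
    }
}
```

In HiveQL itself, the list argument would come from an array literal like array('a', 'b', 'c'), or from collect_set over a column, as the question asks.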
On Monday, January 20, 2014 5:19 PM, Raj Hadoop wrote:
From: [mailto:edlinuxg...@gmail.com]
Sent: Tuesday, May 29, 2012 4:58 PM
To: user@hive.apache.org
Subject: Re: GenericUdf and Jdbc issues
So.
this.getClass().getResourceAsStream(filename) is a very tricky method to get
right, especially in Hive, where you have the hive classpath, the
hadoop classpath, and the hive-jdbc classpath. It is especially tricky when
you consider that launched map/reduce tasks get their own environment and
classpath.
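One common defensive pattern for this multi-classloader situation is to try several classloaders in turn before giving up. The sketch below is a generic Java idiom, not something from this thread: try the thread context classloader (which Hadoop tasks typically set), then the defining class's own loader, then the system loader. The resource names are illustrative.

```java
import java.io.InputStream;

// Defensive resource lookup for environments with several classpaths
// (client, task, JDBC driver). Generic Java idiom; names are illustrative.
public class ResourceLookup {
    static InputStream open(String name) {
        // 1. Thread context classloader: often set by the framework.
        ClassLoader ctx = Thread.currentThread().getContextClassLoader();
        InputStream in = (ctx != null) ? ctx.getResourceAsStream(name) : null;
        // 2. The loader that loaded this class (e.g. the UDF jar's loader).
        if (in == null) {
            in = ResourceLookup.class.getClassLoader().getResourceAsStream(name);
        }
        // 3. Last resort: the system classloader.
        if (in == null) {
            in = ClassLoader.getSystemResourceAsStream(name);
        }
        return in; // null if the resource is visible to none of them
    }
}
```

Bundling the resource inside the same jar as the UDF class and loading it through that class's own loader (step 2) is usually the most reliable of the three, since that jar travels with the task.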
I had the same i