Thank you Kumar, it works!
How about the second question: I added an archive file (*.tar.gz) using ADD
ARCHIVE *.tar.gz, but it seems this file is not unarchived automatically. I am
also confused by that.
From: "Manoj Kumar"
Hi,
Maybe you can try adding the native libraries (*.so) along with all their
dependent libs (*.so) in
*hadoop-{version}/lib/native*
$ ldd libcustom.so --> should show all shared libs resolved from the
local hadoop-{version}/lib/native folder
The above path is shared across the Hadoop ecosystem and make
Hi all:
I'm a beginner with Hive. Recently I wanted to implement a UDF in Hive; the
function is coded in Java but calls some methods written in C++, so my UDF needs
to load some native libraries (*.so). I have already added the .so to Hive by
using ADD FILE *.so, but it seems that Hive doesn't add
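A minimal sketch of the pattern discussed in this thread: load the .so once per task JVM from a static initializer in the UDF class. The class, library and method names here (NativeCallUDF, "custom", transform) are hypothetical, and the library must be resolvable from the java.library.path of the task JVMs, for example the hadoop-{version}/lib/native directory mentioned above.

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF wrapping a C++ routine exposed through JNI.
public class NativeCallUDF extends UDF {

  static {
    // Load libcustom.so once per task JVM; it must be resolvable from
    // java.library.path (for example hadoop-{version}/lib/native).
    System.loadLibrary("custom");
  }

  // JNI entry point implemented in the native library (assumed name).
  private static native String transform(String input);

  public Text evaluate(Text input) {
    if (input == null) {
      return null;
    }
    return new Text(transform(input.toString()));
  }
}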
Hi,
I want to update Hive UDFs without requiring a restart of hive. According
to:
https://www.cloudera.com/documentation/enterprise/5-14-x/topics/cm_mc_hive_udf.html#concept_zb2_rxr_lw
setting
hive.reloadable.aux.jars.path
is required. I have set it to /user/hive/libs/udf (which resides on HDFS).
CVE-2018-1284: Hive UDF series UDFXPath allow users to pass
carefully crafted XML to access arbitrary files
Severity: Important
Vendor: The Apache Software Foundation
Versions Affected: This vulnerability affects all versions from 0.6.0
Description: A malicious user might use any of the xpath UDFs
Is it possible to run a Hive UDF in Spark DataFrame?
in an MR task.
For the other queries try referencing the file as "MyData.txt".
From: Dayong
Sent: Tuesday, April 05, 2016 11:49 AM
To: user@hive.apache.org
Subject: Re: Hive UDF to fetch value from distributed cache not working with
outer query
What if you extend GenericUDF?
Thanks,
Dayong
> On Apr 5, 2016, at 2:11 PM, Abhishek Dubey wrote:
>
> Hi,
>
>
> We have written a Hive UDF in Java to fetch value from file added in
> distributed cache which works perfectly from a select query like :
>
> Query
Hi,
We have written a Hive UDF in Java to fetch a value from a file added to the
distributed cache, which works perfectly from a select query like:
Query 1.
select country_key, MyFunction(country_key,"/data/MyData.txt") as capital from
tablename;
But it is not working when trying to create
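For reference, a minimal sketch of such a lookup UDF, assuming the lookup file was registered with ADD FILE (so each task sees it in its working directory under the base name "MyData.txt", as suggested above) and is tab-separated. This is illustrative rather than the original code; the same lazy loading carries over if the class is rewritten as a GenericUDF, as Dayong suggests.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical sketch of MyFunction(country_key, "MyData.txt").
public class MyFunction extends UDF {
  private Map<String, String> lookup;   // built once per task JVM

  public Text evaluate(Text key, Text fileName) {
    if (key == null || fileName == null) {
      return null;
    }
    if (lookup == null) {
      lookup = new HashMap<String, String>();
      try {
        BufferedReader in = new BufferedReader(new FileReader(fileName.toString()));
        String line;
        while ((line = in.readLine()) != null) {
          // Assumes "key<TAB>value" lines in the distributed-cache file.
          String[] parts = line.split("\t", 2);
          if (parts.length == 2) {
            lookup.put(parts[0], parts[1]);
          }
        }
        in.close();
      } catch (IOException e) {
        throw new RuntimeException("Could not read " + fileName, e);
      }
    }
    String value = lookup.get(key.toString());
    return value == null ? null : new Text(value);
  }
}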
Hello,
I have a simple Hive UDF that works fine when executed against an external
table that uses test files, but returns no result when used with an external table
that uses Avro Snappy files. Is there any documentation about that, or
what do I have to change?
Cordially.
---
Can you guys help me check whether a sample https request can be accessed from
a Hive UDF, to determine whether this is a configuration issue on my end or a bug in
Hive.
Thanks,
Prabhu Joseph
On Mon, Jan 11, 2016 at 11:53 PM, Sergey Shelukhin
wrote:
> Hmm, I’ve no idea off the top of my head
To: "user@hive.apache.org<mailto:user@hive.apache.org>"
mailto:user@hive.apache.org>>
Cc: "d...@hive.apache.org<mailto:d...@hive.apache.org>"
mailto:d...@hive.apache.org>>
Subject: Re: Hive UDF accessing https request
Thanks Sergey for looking into this.
> javax.net.ssl.SSLHandshakeException:
>sun.security.validator.ValidatorException: PKIX path building failed:
>sun.security.provider.certpath.SunCertPathBuilderException: unable to
>find valid certification path to requested
There's a Linux package named ca-certificates(-java) which might be
missing
Thanks Sergey for looking into this.
Below is the exception we are getting when we call it from the Hive UDF, but from
a separate Java program it works fine:
javax.net.ssl.SSLHandshakeException:
sun.security.validator.ValidatorException: PKIX path building failed
Date: Friday, January 8, 2016 at 00:51
To: user@hive.apache.org,
d...@hive.apache.org
Hi Experts,
I am trying to write a Hive UDF which makes an https request and, based on
the response, returns the result. From plain Java the https response
comes back, but the https response accessed from the UDF is null.
Can anyone review the below and share the correct steps to do this?
create temporary
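For reference, a hedged sketch of a UDF that issues an HTTPS GET per row (class name illustrative). The CA certificate of the endpoint has to be present in the truststore of the task JVMs; if it is not, the PKIX error quoted later in this thread appears and the UDF ends up returning null.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical sketch: fetch the body of an https URL and return it as text.
public class HttpsGetUDF extends UDF {
  public Text evaluate(Text url) {
    if (url == null) {
      return null;
    }
    try {
      HttpsURLConnection conn =
          (HttpsURLConnection) new URL(url.toString()).openConnection();
      conn.setConnectTimeout(5000);
      conn.setReadTimeout(5000);
      BufferedReader in = new BufferedReader(
          new InputStreamReader(conn.getInputStream(), "UTF-8"));
      StringBuilder body = new StringBuilder();
      String line;
      while ((line = in.readLine()) != null) {
        body.append(line);
      }
      in.close();
      return new Text(body.toString());
    } catch (Exception e) {
      // Swallowing the SSLHandshakeException here is exactly what makes the
      // result look like a silent null from the query's point of view.
      return null;
    }
  }
}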
Do you have a full stack trace here?
Also what Hive/Hadoop versions?
It looks like Hive somehow thinks that the local copy of the JAR that was
downloaded (/tmp/f5dc5d85-903e-422d-af1a-892b994ecfda_resources/hive-udf.jar)
is an HDFS path for some reason, and the distributed cache is trying to
I have a Hive script where I call a UDF.
The script works fine when called from a local shell script.
But when called from within an Oozie workflow, it throws an exception saying the
jar is not found.
add jar hdfs://hdfspath of jar;
create temporary function funcname as 'pkg.className';
then on calling func
Resolved by deleting all files under /tmp (hdfs://namenode:/tmp/hive),
deleting from mysql --> hive --> funcs,
and recreating all the functions; it works.
On Thu, Apr 30, 2015 at 3:36 PM, Gerald-G wrote:
> HI
> My hive version is 0.14.0 installed from HDP2.2.4
>
> On Thu, Apr 30, 2015 at 3:34 PM, Gerald-G wrot
HI
My hive version is 0.14.0 installed from HDP2.2.4
On Thu, Apr 30, 2015 at 3:34 PM, Gerald-G wrote:
> Hi ALL:
>
> I have develop three UDF and compile them in one jar. Hive Explainn one
> udf to antother class
>
> Dump INFO as Follow: Hive explain userlost-->shiftAct(), but the return
> t
Hi all:
I have developed three UDFs and compiled them into one jar. Hive EXPLAIN resolves one
UDF to another class.
Dump info as follows: Hive explains userlost --> shiftAct(), but the return
type is boolean; the right return type I want
0: jdbc:hive2://10-4-32-53:1> explain select userlost(idayacti,1) f
Hi everyone,
Consider the SQL:
SELECT thumbnail(product_image)
FROM advertisements
WHERE product_name = 'Brownie';
The product_image field is a reference to a multi-megabyte image object.
The thumbnail method reads in this object,
As per this URL, how can I fetch these variables in a Hive UDF?
On 19 December 2014 at 14:30, Daniel Haviv
wrote:
>
> First result in google:
>
> http://stackoverflow.com/questions/12464636/how-to-set-variables-in-hive-scripts
>
> Daniel
>
> On 19 Dec 2014, at 10:54
fetching from a URL. I have to set this URL
> dynamically at the time the Hive script runs.
>
> I don't want to pass this URL as a separate argument to the UDF evaluate
> method. Is there a way to set this URL in the Hive script and get it from the Hive UDF,
> or set it in the user environment
to set this URL in the Hive script and get it from the Hive
UDF, or set it in the user environment and then fetch it. Please tell me the
full procedure to do this.
Thanks & Regards
Dilip Agarwal
+91 8287857554
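One possible approach, sketched under assumptions: if the value is set in the script with set my.udf.url=... so that it lands in the job configuration, a GenericUDF can read it in configure(MapredContext) (available in recent Hive releases). The property name and class below are illustrative, not an established API.

import org.apache.hadoop.hive.ql.exec.MapredContext;
import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.io.Text;

// Hypothetical sketch: pick up "set my.udf.url=..." from the job configuration
// instead of passing the URL to evaluate() on every call.
public class UrlAwareUDF extends GenericUDF {
  private String url;

  @Override
  public void configure(MapredContext context) {
    // Called in the task with the job configuration; hiveconf properties set
    // in the script are expected to be visible here (assumption to verify).
    if (context != null && context.getJobConf() != null) {
      url = context.getJobConf().get("my.udf.url");
    }
  }

  @Override
  public ObjectInspector initialize(ObjectInspector[] arguments)
      throws UDFArgumentException {
    return PrimitiveObjectInspectorFactory.writableStringObjectInspector;
  }

  @Override
  public Object evaluate(DeferredObject[] arguments) throws HiveException {
    // For illustration, just return the configured value.
    return new Text(url == null ? "" : url);
  }

  @Override
  public String getDisplayString(String[] children) {
    return "url_aware_udf()";
  }
}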
>
> Hi,
>
> Please help!
>
> I am using *hiveserver2 *on HIVE 0.13 on Hadoop 2.4.1, also
> nexr-hive-udf-0.2-SNAPSHOT.jar
>
> I can run query from CLI, e.g.
> hive> SELECT add_months(sysdate(), +12) FROM DUAL;
> Execution completed successfully
> MapredLocal task su
Hi,
Please help!
I am using hiveserver2 on HIVE 0.13 on Hadoop 2.4.1, also
nexr-hive-udf-0.2-SNAPSHOT.jar
I can run query from CLI, e.g.
hive> SELECT add_months(sysdate(), +12) FROM DUAL;
Execution completed successfully
MapredLocal task succeeded
OK
2015-12-17
Time taken: 7.393 seco
Hi,
Currently our production is using Hive 0.9.0. There is already a complex Hive
query running on Hadoop daily that generates millions of records of output. What I want
to do is transfer this result to Cassandra.
I tried to do it in a UDF, since then I can send the data at the reducer level, to
maximize the t
Hello all,
I have a simple UDF which has a boolean return type. Without doing any
checks I have returned true in my evaluate method (for testing purposes).
In a Hive query I used a join and invoked this UDF.
It keeps running with no output for a long time. After more than 23 hours,
I am getting a broken pipe exce
code for some examples)
On Jul 30, 2014, at 7:43 AM, Dan Fan wrote:
> Hi there
>
> I am writing a hive UDF function. The input could be string, int, double etc.
> The return is based on the data type. I was trying to use the generic method,
> however, hive seems not recognize it.
Hi there
I am writing a Hive UDF function. The input could be string, int, double, etc.
The return type is based on the input data type. I was trying to use a generic method;
however, Hive does not seem to recognize it.
Here is the piece of code I have as example.
public T evaluate(final T s, final String
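Background on why the generic method is not picked up: plain UDF evaluate() methods are resolved by reflection, so a Java type parameter such as public T evaluate(T s, ...) erases to Object and Hive cannot match it to a concrete Hive type. A GenericUDF with ObjectInspectors is the usual way to accept string, int, double and so on; a minimal, illustrative sketch that simply returns its argument with the same type:

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;

// Identity-style GenericUDF: accepts one primitive argument of any type
// (string, int, double, ...) and declares the same type as its return type.
public class PassThroughUDF extends GenericUDF {

  @Override
  public ObjectInspector initialize(ObjectInspector[] arguments)
      throws UDFArgumentException {
    if (arguments.length != 1) {
      throw new UDFArgumentLengthException("pass_through takes one argument");
    }
    if (arguments[0].getCategory() != ObjectInspector.Category.PRIMITIVE) {
      throw new UDFArgumentException("pass_through expects a primitive type");
    }
    // The output type follows the input type, which is what a Java type
    // parameter in a plain UDF cannot express to Hive.
    return arguments[0];
  }

  @Override
  public Object evaluate(DeferredObject[] arguments) throws HiveException {
    return arguments[0].get();
  }

  @Override
  public String getDisplayString(String[] children) {
    return "pass_through(" + children[0] + ")";
  }
}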
Yeah. After setting hive.cache.expr.evaluation=false, all queries output the
expected results.
And I found that it's related to the getDisplayString function in the UDF.
At first the function returned a string regardless of its parameters, and I
had to set hive.cache.expr.evaluation=false.
But after
Looks like it's caused by HIVE-7314. Could you try that with
"hive.cache.expr.evaluation=false"?
Thanks,
Navis
2014-07-24 14:34 GMT+09:00 丁桂涛(桂花) :
> Yes. The output is correct: ["tp","p","sp"].
>
> I developed the UDF using JAVA in eclipse and exported the jar file into
> the auxlib directory
Yes. The output is correct: ["tp","p","sp"].
I developed the UDF using Java in Eclipse and exported the jar file into
the auxlib directory of Hive. Then I added the following line to the
~/.hiverc file:
create temporary function getad as 'xxx';
The Hive version is 0.12.0. Perhaps the problem r
Have you tried this query without the UDF, say:
select
array(tp, p, sp) as ps
from
(
select
'tp' as tp,
'p' as p,
'sp' as sp
from
table_name
where
id =
) t;
And how did you implement the UDF?
Thanks,
金杰 (Jie Jin)
On Wed, Jul 23, 2014 at 1:34 PM, 丁桂涛(桂花) wrote:
> R
Recently I developed a Hive generic UDF *getad*. It accepts a map-type and
a string-type parameter and outputs a string value. But I found the UDF
output really confusing under different conditions.
Condition A:
select
getad(map_col, 'tp') as tp,
getad(map_col, 'p') as p,
getad(map_col, 'sp')
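An illustrative reconstruction (not the original code) of a getad-style GenericUDF that takes a map and a key, assuming the column type is map<string,string>. Its getDisplayString includes the actual arguments, so getad(map_col, 'tp') and getad(map_col, 'p') are not confused by the expression cache discussed earlier in this thread.

import java.util.Map;
import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.MapObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils;
import org.apache.hadoop.io.Text;

// Hypothetical getad(map_col, 'key'): returns the map value for that key.
public class GetAd extends GenericUDF {
  private MapObjectInspector mapOI;
  private PrimitiveObjectInspector keyOI;
  private PrimitiveObjectInspector valueOI;

  @Override
  public ObjectInspector initialize(ObjectInspector[] args)
      throws UDFArgumentException {
    if (args.length != 2) {
      throw new UDFArgumentLengthException("getad(map, key) takes two arguments");
    }
    if (!(args[0] instanceof MapObjectInspector)
        || !(args[1] instanceof PrimitiveObjectInspector)) {
      throw new UDFArgumentException("getad expects (map<string,string>, string)");
    }
    mapOI = (MapObjectInspector) args[0];
    keyOI = (PrimitiveObjectInspector) args[1];
    // Assumes the map values are primitive (string); a real UDF should check.
    valueOI = (PrimitiveObjectInspector) mapOI.getMapValueObjectInspector();
    return PrimitiveObjectInspectorFactory.writableStringObjectInspector;
  }

  @Override
  public Object evaluate(DeferredObject[] args) throws HiveException {
    Map<?, ?> map = mapOI.getMap(args[0].get());
    String key = PrimitiveObjectInspectorUtils.getString(args[1].get(), keyOI);
    if (map == null || key == null) {
      return null;
    }
    for (Map.Entry<?, ?> e : map.entrySet()) {
      if (e.getKey() != null && key.equals(e.getKey().toString())) {
        String v = PrimitiveObjectInspectorUtils.getString(e.getValue(), valueOI);
        return v == null ? null : new Text(v);
      }
    }
    return null;
  }

  // Include the children: a constant display string is what allows two calls
  // with different keys to collide in the expression cache.
  @Override
  public String getDisplayString(String[] children) {
    return "getad(" + children[0] + ", " + children[1] + ")";
  }
}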
in a single reducer.
>>>>>
>>>>> Thanks,
>>>>> Navis
>>>>>
>>>>>
>>>>> 2014-07-10 1:50 GMT+09:00 Malligarjunan S :
>>>>>
>>>>>> Hello All,
>>>>>> Is that the
s that the expected behavior from hive to take so much of time?
>>>>>
>>>>>
>>>>> Thanks and Regards,
>>>>> Sankar S
>>>>>
>>>>>
>>>>> On Tue, Jul 8, 2014 at 11:23 PM, Malligarjunan S <
>>>>> malligarju...@gmail.com> wrote:
>>>>>
>>>>>> Hello All,
>>>>>>
>>>>>> Can any one help me to answer to my question posted on Stackoverflow?
>>>>>>
>>>>>> http://stackoverflow.com/questions/24416373/hive-udf-performance-too-slow
>>>>>> It is pretty urgent. Please help me.
>>>>>>
>>>>>> Thanks and Regards,
>>>>>> Sankar S.
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>> --
>>> Sorry this was sent from mobile. Will do less grammar and spell check
>>> than usual.
>>>
>>
>>
>
> Hello All,
>>>> Is that the expected behavior from hive to take so much of time?
>>>>
>>>>
>>>> Thanks and Regards,
>>>> Sankar S
>>>>
>>>>
>>>> On Tue, Jul 8, 2014 at 11:23 PM, Malligarjunan
r from hive to take so much of time?
>>>
>>>
>>> Thanks and Regards,
>>> Sankar S
>>>
>>>
>>> On Tue, Jul 8, 2014 at 11:23 PM, Malligarjunan S <
>>> malligarju...@gmail.com> wrote:
>>>
>>>> Hello All,
>>
Is that the expected behavior from hive to take so much of time?
>>
>>
>> Thanks and Regards,
>> Sankar S
>>
>>
>> On Tue, Jul 8, 2014 at 11:23 PM, Malligarjunan S > > wrote:
>>
>>> Hello All,
>>>
>>> Can any one help m
M, Malligarjunan S
> wrote:
>
>> Hello All,
>>
>> Can any one help me to answer to my question posted on Stackoverflow?
>> http://stackoverflow.com/questions/24416373/hive-udf-performance-too-slow
>> It is pretty urgent. Please help me.
>>
>> Thanks and Regards,
>> Sankar S.
>>
>
>
om/questions/24416373/hive-udf-performance-too-slow
> It is pretty urgent. Please help me.
>
> Thanks and Regards,
> Sankar S.
>
It's a cross product, so it is not strange that it takes so much time even with small
tables.
Thanks,
Navis
2014-07-09 2:53 GMT+09:00 Malligarjunan S :
> Hello All,
>
> Can any one help me to answer to my question posted on Stackoverflow?
> http://stackoverflow.com/questions/24416373/hive-ud
Hello All,
Can anyone help me answer my question posted on Stack Overflow?
http://stackoverflow.com/questions/24416373/hive-udf-performance-too-slow
It is pretty urgent. Please help me.
Thanks and Regards,
Sankar S.
Just export that path in your classpath and restart the metastore service.
On Jun 27, 2014 7:26 PM, "Rishabh Bhardwaj" wrote:
> Hi all,
> I have a udf namely gwudf.jar
> I have it on my local dir at /tmp/gwudf.jar ,and
> also at hdfs at /user/hive/lib/gwudf.jar
>
> so when I am adding this jar, The
Hi all,
I have a UDF jar, namely gwudf.jar.
I have it in my local dir at /tmp/gwudf.jar, and
also on HDFS at /user/hive/lib/gwudf.jar.
So when I am adding this jar, the following error comes.
Adding it like this:
hive> add jar /tmp/gwudf.jar;
Added /tmp/gwudf.jar to class
What version of Hive are you running?
It looks like the error you're seeing might be from Hive trying to retrieve the
error message from the logs and might not be related to the actual error.
Might want to check the logs for the Hadoop task that was run as part of this
query, to see if that ha
Hi,
I'm trying to create a function that generates a UUID; I want to use it in a
query to insert data into another table.
Here is the function:
package com.udf.example;
import java.util.UUID;
import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import o
Try:
public class Uuid extends UDF {
On Thu, May 15, 2014 at 2:07 PM, Leena Gupta wrote:
> Hi,
>
> I'm trying to create a function that generates a UUID, want to use it in a
> query to insert data into another table.
>
> Here is the function:
>
> package com.udf.example;
>
> import java.util.UUI
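The snippet above is cut off; a minimal complete version of such a UUID UDF (assuming the intent is one fresh UUID string per call) might look like the following. Marking it non-deterministic keeps Hive from constant-folding a single value for every row.

package com.udf.example;

import java.util.UUID;

import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.hive.ql.udf.UDFType;
import org.apache.hadoop.io.Text;

@Description(name = "uuid", value = "_FUNC_() - returns a random UUID string")
@UDFType(deterministic = false)  // a new value per call, so do not constant-fold
public class Uuid extends UDF {
  public Text evaluate() {
    return new Text(UUID.randomUUID().toString());
  }
}

It would then be registered with ADD JAR and CREATE TEMPORARY FUNCTION uuid AS 'com.udf.example.Uuid'; before being used in the INSERT query.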
Hi all,
We have a few Hive UDFs where I work. These are deployed by a bootstrap
script so that the JAR files are in Hive's CLASSPATH before the server
starts.
This works to load the UDF whenever a cluster is started and then the UDF
can be loaded with the ADD JAR and CREATE TEMPORARY FUNCTION co
Thanks Nitin, it's done now.
The problem was that I had to store the class file in the same directory
structure as described in the package declaration line in the Java code.
Rishabh
On Wednesday, 9 April 2014 12:43 PM, Nitin Pawar
wrote:
Follow the steps as it is from the link I shar
Follow the steps exactly as given in the link I shared... it works.
Somehow your package is getting messed up and Hive is not able to find the class.
On Wed, Apr 9, 2014 at 12:27 PM, Rishabh Bhardwaj wrote:
> I added,
> package rishabh.udf.hive;
> in the above code.
> and repeated the steps.
> But Now getti
I added
package rishabh.udf.hive;
in the above code and repeated the steps.
But now I am getting the following error:
hive> create temporary function helloworld as
'rishabh.udf.hive.SimpleUDFExample';
FAILED: Class rishabh.udf.hive.SimpleUDFExample not found
FAILED: Execution Error, return code 1 f
in your code, and that code package is missing.
What you need to do is define a package, something like
package org.apache.hadoop.hive.ql.udf;
then your add-function definition becomes
CREATE TEMPORARY FUNCTION <name> AS
'org.apache.hadoop.hive.ql.udf.<ClassName>';
Feel free to use any package name you wish but mak
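To make that concrete, a hedged sketch of how the package line, the jar layout and the registration have to line up, using the SimpleUDFExample class from this thread (the body is illustrative):

package rishabh.udf.hive;  // must match the directory layout inside the jar

import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

@Description(name = "helloworld",
    value = "_FUNC_(x) - returns 'hello x', where x is whatever you give it")
public class SimpleUDFExample extends UDF {
  public Text evaluate(Text input) {
    if (input == null) {
      return null;
    }
    return new Text("hello " + input.toString());
  }
}

// The compiled class must sit at rishabh/udf/hive/SimpleUDFExample.class
// inside the jar, and the registration must use the same fully qualified name:
//   add jar /path/to/hiveudfs.jar;
//   create temporary function helloworld as 'rishabh.udf.hive.SimpleUDFExample';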
Hi Nitin,
Thanks for the concern.
Here is the code of the UDF,
import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;
@Description(
name="SimpleUDFExample",
value="returns 'hello x', where x is whatever you give it (STRI
Can you put the first few lines of your code here, or upload the code to GitHub and
share the link?
On Wed, Apr 9, 2014 at 11:59 AM, Rishabh Bhardwaj wrote:
> Hi all,
> I have done the following steps to create a UDF in hive but getting
> error.Please help me.
> 1. Created the udf as described
> here<
Hi all,
I have done the following steps to create a UDF in Hive but am getting an
error. Please help me.
1. Created the UDF as described here.
2. Compiled it successfully.
3. Copied the class file to a directory hiveudfs.
4. Added it to a jar with this command: jar -cf hiveudfs.jar
hiveudfs/SimpleUDFExamp
Hi Hive Gurus,
Is there a Hive UDF (built-in or 3rd party) that takes a struct and converts
it to a JSON string?
Queries display structs fine, but when inserting struct columns into a table
where the corresponding columns are typed as strings, the format is gone and
all field name and
Hi Jon,
Please refer to the following document:
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Select#LanguageManualSelect-REGEXColumnSpecification
Hope this helps.
Thanks
-Abdelrahman
On Tue, Aug 13, 2013 at 9:13 AM, Jon Bender wrote:
> Hi there,
>
> I'm trying to pass some
Hi there,
I'm trying to pass some external properties to a UDF. In the MapReduce
world I'm used to extending Configured in my classes, but in my UDF class
when initializing a new Configuration object or HiveConf object it doesn't
inherit any of those properties. I see it in the Job Configuration
The problem might be Java's limitation on having a single top-level class
in each file (as opposed to classes nested within a top-level class). You
would have to nest your UDFs in such a top-level class. That would work
fine, but when you define a TEMPORARY FUNCTION in Hive, I don't know if the
syn
in same UDF or in same jar?
On Tue, Jul 9, 2013 at 6:19 PM, Manickam P wrote:
> Hi,
>
> Can we write more than one function like to_upper and to_lower in same UDF
> ? Or do we need write separate UDF for each?
> Please let me know.
>
>
>
> Thanks,
> Manickam P
>
--
Nitin Pawar
Hi,
Can we write more than one function, like to_upper and to_lower, in the same UDF? Or
do we need to write a separate UDF for each? Please let me know.
Thanks,
Manickam P
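As the replies in this thread suggest, the usual pattern is one top-level class per function, with both classes compiled into the same jar and each registered under its own function name. A hedged sketch with illustrative names:

// --- ToUpper.java ---
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class ToUpper extends UDF {
  public Text evaluate(Text s) {
    return s == null ? null : new Text(s.toString().toUpperCase());
  }
}

// --- ToLower.java (compiled into the same jar) ---
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class ToLower extends UDF {
  public Text evaluate(Text s) {
    return s == null ? null : new Text(s.toString().toLowerCase());
  }
}

Each one then gets its own registration: create temporary function to_upper as 'ToUpper'; and create temporary function to_lower as 'ToLower'; (fully qualified if a package is declared).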
implemented it successfully?
Thanks
Rupinder
From: Rupinder Singh [mailto:rsi...@care.com]
Sent: Tuesday, July 02, 2013 10:40 AM
To: user@hive.apache.org
Subject: NoClassDefFoundError when creating a custom hive UDF
Hi,
I have created a custom Hive UDF that has external JAR dependencies. I have
added
Hi,
I have created a custom Hive UDF that has external JAR dependencies. I have
added those jars to the Hive session using 'add jar' but when I try to create
my function, I get a NoClassDefFoundError on the dependency class.
I am on Hive 0.8.1 running in Amazon EMR.
This is what happ
Ravi,
It looks like you are missing the
ADD JAR ...
command
Ruslan
On Tue, Sep 4, 2012 at 6:45 PM, Edward Capriolo wrote:
> You could start with this:
>
> https://github.com/edwardcapriolo/hive-geoip
>
> On Tue, Sep 4, 2012 at 10:42 AM, Ravi Shetye wrote:
>> Hi
>> I am trying to register a jav
You could start with this:
https://github.com/edwardcapriolo/hive-geoip
On Tue, Sep 4, 2012 at 10:42 AM, Ravi Shetye wrote:
> Hi
> I am trying to register a java udf which looks like
>
> public final class IP_2_GEO extends UDF {
> String geo_file;
> String geo_type;
> public IP_2_GEO(String geo_
The best way to write a UDF is to write a few test cases around it with a few expected
datasets, to catch the errors during development itself rather than in a
Hive session.
On Tue, Jun 26, 2012 at 8:06 PM, Jan Dolinár wrote:
> Hi,
>
> Check the hadoop logs of the failed task. My best guess is that there is
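A minimal sketch of the kind of test suggested at the top of this message: a plain JUnit test that calls evaluate() directly, using the SimpleUDFExample class sketched earlier in this archive as the unit under test (any UDF class would do).

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull;

import org.apache.hadoop.io.Text;
import org.junit.Test;

import rishabh.udf.hive.SimpleUDFExample;  // the UDF under test (from the other thread)

// Exercises the UDF outside Hive, so uncaught exceptions and wrong results
// show up at build time instead of inside a failed MapReduce task.
public class SimpleUDFExampleTest {
  private final SimpleUDFExample udf = new SimpleUDFExample();

  @Test
  public void appendsHello() {
    assertEquals("hello world", udf.evaluate(new Text("world")).toString());
  }

  @Test
  public void nullInGivesNullOut() {
    assertNull(udf.evaluate(null));
  }
}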
Hi,
Check the hadoop logs of the failed task. My best guess is that there is an
uncaught exception thrown somewhere in your code. The logs will tell where
and what caused the problem.
Best regards,
Jan
On Tue, Jun 26, 2012 at 4:20 PM, Yue Guan wrote:
> Hi, hive users
>
> I have the following u
Hi, hive users
I have the following udf:
package com.name.hadoop.hive.udf;
import java.util.Map;
import java.util.Set;
import org.apache.commons.lang.StringUtils;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;
public class MyUDF extends UDF {
private Map<String, Set<String>> aMapping;
private f
By instance I mean a set of MapReduce jobs (3 in this case). When
executing in the CLI only one instance runs and the output is displayed on the
screen, but this is not the case when using it with PowerPivot (multiple
instances one after the other, containing the same no. of HDFS reads,
writes, etc.)...a
There are multiple instances of 3 MapReduce jobs (executing one after the
other) when running the single query using PowerPivot.
I can find out the next instance when it shows up on the screen after about
2 instances of the 3 MapReduce jobs.
Hive history
file=/tmp/hadoop/hive_job_log_hadoop_201206121120_
Yes, understood. I do not have a problem defining the parameters in the
code. But the problem is, I am using PowerPivot as the visualization engine.
Now, when I give the query as a set like:
add jar /usr/local/hadoop/src/retweetlink1.jar;
create temporary function link as
I got that the -i option is not applicable to the Hive server.
Let me drill through to find out if there is any other option.
On the other side, to be curious: if you are writing the code to access
hiveserver, what's the harm in defining these parameters in your code?
On Tue, Jun 12, 2012 at 12:12 PM, Sreenat
Is there any way to make the .hiverc file be executed even in a
hiveserver instance?
A simple approach like this:
hive --service hiveserver -i .hiverc
does not work, Nitin.
Any other way, Nitin? I just want to add a single jar file and do not know
much about custom Hive builds. And this requirement may change at some other
point in time. It's not a good approach to build Hive each time I need a new
jar to be added.
Aniket, his problem is that he does not want to create that function each
time.
He wants it available in each session with the Hive server,
so we are suggesting a custom Hive build where he will bundle his UDF with
Hive and have it available with the Hive server.
On Tue, Jun 12, 2012 at 11:58 AM, Aniket Mokashi
I mean every time you connect to hive server-
execute-
create temporary function...;
your hive query...;
~Aniket
On Mon, Jun 11, 2012 at 11:27 PM, Aniket Mokashi wrote:
> put jar in hive-classpath (libs directory etc) and do a create temporary
> function every time you connect from server.
>
>
put the jar in the Hive classpath (libs directory etc.) and do a create temporary
function every time you connect to the server.
What version of hive are you on?
~Aniket
On Mon, Jun 11, 2012 at 11:12 PM, Sreenath Menon
wrote:
> I have a jar file : 'twittergen.jar', now how can I add it to hive lib.
> Kind
I have a jar file, 'twittergen.jar'; how can I add it to the Hive lib?
Kindly help. I need the function to be usable across sessions when running a
server instance. I am now stuck on this.
You can check out the Hive code, build your UDF and ship it with Hive :)
a custom Hive for yourself ... if the function is generic, feel free to share it
on git :)
On Mon, Jun 11, 2012 at 9:28 PM, Sreenath Menon wrote:
> Ya UDF do not live across section. But what if I just want the temporary
> function
Yes, UDFs do not live across sessions. But what if I just want the temporary
function to be created each time a new session starts? This is what is done
with the help of .hiverc. But again, this works only in CLI mode, not in
server mode.
BTW I am interested to know how to build the function into hive, k
UDFs do not live across sessions. This is why the syntax is "CREATE
TEMPORARY FUNCTION". You can build the function into Hive and then you
will not need to add the UDF.
On Mon, Jun 11, 2012 at 11:31 AM, Sreenath Menon
wrote:
> I have tried that before. It does not work. But anyways thanks for th
Nitin,
Any idea on invoking .hiverc when running: /usr/hive/bin/hive --service
hiveserver?
This works when I am using the Hive CLI,
i.e. when I run: select link(tweet) from tweetsdata; in the CLI with
the function 'link' defined in .hiverc.
But when I run /usr/hive --service hiveserver
and use the function in PowerPivot, it says that 'link' is not defined.
If you have created a file with a name other than ".hiverc", you will need to
start Hive with this file,
something like: hive -i hiverc
But when you create a file .hiverc in your home directory, the Hive CLI picks
it up automatically.
On Mon, Jun 11, 2012 at 6:13 PM, Sreenath Menon wrote:
> K..so i have cre
OK, so I have created a file 'sample.hiverc' in the home directory. How do I
run this particular file?
In your home directory (if you are using Linux with a VM); you will need
to create that file and add the entries exactly the same way you add them in
the Hive CLI.
On Mon, Jun 11, 2012 at 6:06 PM, Sreenath Menon wrote:
> Hi Nitin
>
> Can u kindly help me (briefly) on how to add to hiverc...no such loca
Hi Nitin
Can you kindly help me (briefly) with how to add to .hiverc... no such location
exists on my machine.
include it in your ~/.hiverc to have it across sessions
On Mon, Jun 11, 2012 at 5:42 PM, Sreenath Menon wrote:
> Hi
>
> I am using Hive with Microsoft PowerPivot as the visualization tool.
>
> When I am running a query involving UDF like this from PowerPivot:
> add jar /usr/local/hadoop/src/retwe
Hi
I am using Hive with Microsoft PowerPivot as the visualization tool.
When I am running a query involving a UDF like this from PowerPivot:
add jar /usr/local/hadoop/src/retweetlink1.jar;
create temporary function link as 'retweetlink';
Followed by a select statement, the query executes fine for t
Put hive-exec*.jar in your eclipse classpath. (project properties-> java
build path -> libraries)
On Tue, Jun 5, 2012 at 8:52 AM, kulkarni.swar...@gmail.com <
kulkarni.swar...@gmail.com> wrote:
> Did you try this[1]? It had got me most of my way through the process.
>
> [1] https://cwiki.apache.o
Did you try this[1]? It had got me most of my way through the process.
[1] https://cwiki.apache.org/Hive/gettingstarted-eclipsesetup.html
On Tue, Jun 5, 2012 at 8:49 AM, Arun Prakash wrote:
> Hi Friends,
> I tried to develop a UDF for Hive but I am getting a package import error
> in Eclipse.
>
> im
.exec.TableScanOperator.processOp(TableScanOperator.java:83)
>>> > at
>>> > org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
>>> > at
>>> > org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:
pache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:531)
>> > ... 9 more
>> > Caused by: java.lang.reflect.InvocationTargetException
>> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>> > Method)
>> > at
>
ssorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at
> >
> org.apache.hadoop.h
hod.java:601)
> at
> org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:824)
> ... 18 more
> Caused by: java.lang.NumberFormatException: For input string: "1"
> at
> java.lang.NumberFormatException.for
.lang.Long.parseLong(Long.java:441)
at java.lang.Long.(Long.java:702)
at com.musigma.hive.udf.ip2int.evaluate(ip2int.java:11)
... 23 more
If I am running the Hive UDF like --- select ip2int("102.134.123.1") from
sample_data; it's not giving any error.
St
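The trace above ends in Long.parseLong inside ip2int.evaluate; one robust way to write such a conversion is to combine the four octets rather than handing a dotted string straight to Long. A sketch (the class name is taken from the stack trace, the implementation is illustrative):

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;

// Converts a dotted IPv4 string such as "102.134.123.1" to its numeric value.
public class ip2int extends UDF {
  public LongWritable evaluate(Text ip) {
    if (ip == null) {
      return null;
    }
    String[] octets = ip.toString().trim().split("\\.");
    if (octets.length != 4) {
      return null;  // not a well-formed IPv4 address
    }
    long value = 0;
    for (String octet : octets) {
      int part;
      try {
        part = Integer.parseInt(octet);
      } catch (NumberFormatException e) {
        return null;
      }
      if (part < 0 || part > 255) {
        return null;
      }
      value = (value << 8) + part;
    }
    return new LongWritable(value);
  }
}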
2 4:10:19 AM
Subject: The Confused question of hive udf
Hi,
we have a UDF called minf which maps the current time to a point.
For example: 20120510:00:00:00 --> minf 2012051,
20120510:00:06:00 -> minf 20120510001
20120510:10:51:38 -> minf 20120510130
we test the minf function