Try declaring the class public:

public class Uuid extends UDF {

Hive loads UDF classes by reflection, so a package-private class (the default when you write just "class Uuid") can't be instantiated from outside your package. Also note the NullPointerException below is thrown by Hive's JobDebugger while it tries to fetch the task logs; the underlying task failure is what the missing "public" causes.
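A minimal, self-contained sketch of why the modifier matters: the snippet below instantiates a class reflectively, roughly the way Hive resolves a registered function name, and calls its evaluate() method. The class and method names here (ReflectionDemo, the nested Uuid) are illustrative, not Hive internals; the point is that both the class and evaluate() must be public for the reflective lookup to succeed.

```java
import java.util.UUID;

public class ReflectionDemo {

    // Stand-in for the UDF body: note the class and evaluate() are public.
    // With package-private access, the reflective construction below would
    // fail from another package, which is the situation Hive is in.
    public static class Uuid {
        public String evaluate() {
            return UUID.randomUUID().toString();
        }
    }

    public static void main(String[] args) throws Exception {
        // Roughly what happens when Hive resolves a registered function name
        // to a class and instantiates it.
        Object udf = Class.forName("ReflectionDemo$Uuid")
                          .getDeclaredConstructor()
                          .newInstance();
        String id = (String) udf.getClass().getMethod("evaluate").invoke(udf);
        // A canonical UUID string is 36 characters (32 hex digits + 4 hyphens).
        System.out.println(id.length());
    }
}
```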

On Thu, May 15, 2014 at 2:07 PM, Leena Gupta <gupta.le...@gmail.com> wrote:

> Hi,
>
> I'm trying to create a function that generates a UUID, want to use it in a
> query to insert data into another table.
>
> Here is the function:
>
> package com.udf.example;
>
> import  java.util.UUID;
> import org.apache.hadoop.hive.ql.exec.Description;
> import  org.apache.hadoop.hive.ql.exec.UDF;
> import  org.apache.hadoop.io.Text;
>
>
> @Description(
> name = "Uuid",
> value = "_FUNC_() - Generate a unique uuid",
> extended="Select Uuid from foo limit 1;"
> )
>
> class Uuid extends UDF{
>   public Text evaluate(){
>     return new Text(UUID.randomUUID().toString());
>   }
> }
>
> I registered it successfully in Hive, but when I try to use it in a query
> I get a NullPointerException (see below). The same function, when I run it
> outside of Hive by including main(), is able to return a UUID.
> Could someone please help shed some light on why I'm getting this error?
>
> select entity_volume,Uuid() from test_volume limit 5;
> Total MapReduce jobs = 1
> Launching Job 1 out of 1
> Number of reduce tasks is set to 0 since there's no reduce operator
> Starting Job = job_201311092117_0312, Tracking URL =
> http://bi1-xxx.com:50030/jobdetails.jsp?jobid=job_201311092117_0312
> Kill Command = /usr/lib/hadoop/bin/hadoop job  -Dmapred.job.tracker=
> bi1-xxx.com:8021 -kill job_201311092117_0312
> Hadoop job information for Stage-1: number of mappers: 1; number of
> reducers: 0
> 2014-05-15 10:12:10,825 Stage-1 map = 0%,  reduce = 0%
> 2014-05-15 10:12:29,916 Stage-1 map = 100%,  reduce = 100%
> Ended Job = job_201311092117_0312 with errors
> Error during job, obtaining debugging information...
> Examining task ID: task_201311092117_0312_m_000002 (and more) from job
> job_201311092117_0312
> Exception in thread "Thread-23" java.lang.NullPointerException
> at
> org.apache.hadoop.hive.shims.Hadoop23Shims.getTaskAttemptLogUrl(Hadoop23Shims.java:44)
>  at
> org.apache.hadoop.hive.ql.exec.JobDebugger$TaskInfoGrabber.getTaskInfos(JobDebugger.java:186)
> at
> org.apache.hadoop.hive.ql.exec.JobDebugger$TaskInfoGrabber.run(JobDebugger.java:142)
>  at java.lang.Thread.run(Thread.java:745)
> FAILED: Execution Error, return code 2 from
> org.apache.hadoop.hive.ql.exec.MapRedTask
> MapReduce Jobs Launched:
> Job 0: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
>
> Thanks!
>