ok.... I found the reason: I had modified the jar file, but even though I re-ran "ADD
.....MyUdf.jar;  create temporary function ....; ", the updated jar didn't take effect.
I had to get out of the hive session and then rerun both commands in a fresh session.
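In other words, quitting and restarting the CLI and redoing both steps in the fresh
session picks up the new class. Something like the following worked for me (same jar
path and function name as in the quoted script; the exact invocation may differ in
your setup):

hive> quit;

$ hive
hive> ADD JAR ./MyUdf.jar;
hive> CREATE TEMPORARY FUNCTION gen_uniq2 AS 'yy.MyUdf';
hive> select gen_uniq2(field1), field2 from yy_mapping limit 10;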


On Mon, Sep 30, 2013 at 1:47 PM, Yang <teddyyyy...@gmail.com> wrote:

> I wrote a super simple UDF, but got some errors:
>
> UDF:
>
> package yy;
> import org.apache.hadoop.hive.ql.exec.UDF;
> import java.util.Random;
> import java.util.UUID;
> import java.lang.management.*;
>
> public class MyUdf extends UDF {
>         static Random rand = new Random(System.currentTimeMillis()
>                 + Thread.currentThread().getId() * 1000000);
>         String name = ManagementFactory.getRuntimeMXBean().getName();
>         long startValue = Long.valueOf(name.replaceAll("[^\\d]+", "")) * 10000
>                 + Thread.currentThread().getId() * 1000;
>
>         public long evaluate(long x) {
>                 //return (long)UUID.randomUUID().hashCode();
>                 //return rand.nextLong();
>                 return startValue++;
>         }
> }
>
>
>
>
>
> sql script:
>
> CREATE TEMPORARY FUNCTION gen_uniq2 AS 'yy.MyUdf';
> select gen_uniq2(field1), field2
> from yy_mapping limit 10;
>
> field1 is bigint, field2 is int
>
>
>
>
>
> error:
>
> hive> source aa.sql;
> Added ./MyUdf.jar to class path
> Added resource: ./MyUdf.jar
> OK
> Time taken: 0.0070 seconds
> FAILED: SemanticException [Error 10014]: Line 2:7 Wrong arguments
> 'field1': No matching method for class yy.MyUdf with (bigint). Possible
> choices: _FUNC_()
>
>
>
>
>
> so I'm declaring a UDF with an arg of long, which should work for a bigint
> (more importantly, it's complaining not about long vs bigint, but about bigint
> vs a no-arg signature). I tried changing both to int, same failure
>
>
> thanks!
> yang
>
>
>
