It's been ages since I wrote one, but the differences to mine:

a) I use LongWritable:

    public LongWritable evaluate(LongWritable startAt) {

b) I have annotations on the class (but I think they are just for docs):

    @Description(name = "row_sequence",
            value = "_FUNC_() - Returns a generated row sequence number starting from 1")
    @UDFType(deterministic = false)
    public class UDFRowSequence extends UDF {
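Putting those pieces together, a minimal sketch of that kind of class looks roughly like this (the body below is a guess at the general shape from memory, not the exact UDFRowSequence source):

    import org.apache.hadoop.hive.ql.exec.Description;
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.hive.ql.udf.UDFType;
    import org.apache.hadoop.io.LongWritable;

    @Description(name = "row_sequence",
            value = "_FUNC_() - Returns a generated row sequence number starting from 1")
    @UDFType(deterministic = false)
    public class UDFRowSequence extends UDF {
        // Reuse one Writable instance; Hive calls evaluate() once per row.
        private final LongWritable result = new LongWritable(0);
        private boolean initialized = false;

        public LongWritable evaluate(LongWritable startAt) {
            if (!initialized) {
                // Begin the sequence at the supplied value (or 1 if none is given).
                result.set(startAt == null ? 1L : startAt.get());
                initialized = true;
            } else {
                result.set(result.get() + 1);
            }
            return result;
        }
    }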
Hope this helps!
Tim

On Mon, Sep 30, 2013 at 10:47 PM, Yang <teddyyyy...@gmail.com> wrote:
> I wrote a super simple UDF, but got some errors:
>
> UDF:
>
> package yy;
> import org.apache.hadoop.hive.ql.exec.UDF;
> import java.util.Random;
> import java.util.UUID;
> import java.lang.management.*;
>
> public class MyUdf extends UDF {
>     static Random rand = new Random(System.currentTimeMillis() +
>             Thread.currentThread().getId() * 1000000);
>     String name = ManagementFactory.getRuntimeMXBean().getName();
>     long startValue = Long.valueOf(name.replaceAll("[^\\d]+", "")) * 10000 +
>             Thread.currentThread().getId() * 1000;
>
>     public long evaluate(long x) {
>         // return (long) UUID.randomUUID().hashCode();
>         // return rand.nextLong();
>         return startValue++;
>     }
> }
>
> sql script:
>
> CREATE TEMPORARY FUNCTION gen_uniq2 AS 'yy.MyUdf';
> select gen_uniq2(field1), field2
> from yy_mapping limit 10;
>
> field1 is bigint, field2 is int
>
> error:
>
> hive> source aa.sql;
> Added ./MyUdf.jar to class path
> Added resource: ./MyUdf.jar
> OK
> Time taken: 0.0070 seconds
> FAILED: SemanticException [Error 10014]: Line 2:7 Wrong arguments
> 'field1': No matching method for class yy.MyUdf with (bigint). Possible
> choices: _FUNC_()
>
> So I'm declaring a UDF with an arg of long, which should work for a bigint
> (more importantly, it's complaining not about long vs bigint, but about
> bigint vs no arguments at all). I tried changing both to int, same failure.
>
> thanks!
> yang
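Applied to the MyUdf above, Tim's suggestion would amount to switching the evaluate signature to the Writable type and annotating the class, roughly like this (trimmed to the relevant fields; whether the long -> LongWritable change is actually what the SemanticException is about is only a guess):

    package yy;

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.hive.ql.udf.UDFType;
    import org.apache.hadoop.io.LongWritable;
    import java.lang.management.ManagementFactory;

    @UDFType(deterministic = false)
    public class MyUdf extends UDF {
        // Per-process starting value derived from the JVM name (pid@host), as in the original class.
        private long startValue =
                Long.valueOf(ManagementFactory.getRuntimeMXBean().getName()
                        .replaceAll("[^\\d]+", "")) * 10000
                + Thread.currentThread().getId() * 1000;

        // BIGINT column -> LongWritable argument; return a LongWritable as well.
        public LongWritable evaluate(LongWritable x) {
            return new LongWritable(startValue++);
        }
    }

It would still be registered and called exactly as in the script above (CREATE TEMPORARY FUNCTION gen_uniq2 AS 'yy.MyUdf').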