Hi Jeff,
The ticket is certainly relevant, thanks for digging it out, but as I said
I can repro in 1.6.0-rc2. I will try again just to make sure.
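For reference, the registration flow described in my original message below (jar passed via --jars at thriftserver startup, then CREATE TEMPORARY FUNCTION sent over JDBC) can be sketched roughly as follows; the jar path is hypothetical and the class name follows the posted gist:

```shell
# Start the thriftserver with the UDF jar on its classpath
# (path to the jar is a placeholder):
./sbin/start-thriftserver.sh --jars /path/to/hive-udfs.jar

# Then, from any JDBC client (beeline shown here; RJDBC behaves the same):
./bin/beeline -u jdbc:hive2://localhost:10000 -e "
  CREATE TEMPORARY FUNCTION NVL2 AS 'khanolkar.HiveUDFs.NVL2GenericUDF';
  SHOW FUNCTIONS;
  DESCRIBE FUNCTION NVL2;
"
```

In my case the CREATE statement appears to succeed silently, but NVL2 is then neither usable nor visible to SHOW FUNCTIONS.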

On Tue, Dec 15, 2015 at 5:17 PM Jeff Zhang <zjf...@gmail.com> wrote:

> It should be resolved by this ticket
> https://issues.apache.org/jira/browse/SPARK-11191
>
>
>
> On Wed, Dec 16, 2015 at 3:14 AM, Antonio Piccolboni <
> anto...@piccolboni.info> wrote:
>
>> Hi,
>> I am trying to create a UDF using the thriftserver. I followed this
>> example <https://gist.github.com/airawat/7461612>, which was originally
>> written for Hive. My understanding is that the thriftserver creates a
>> HiveContext, so Hive UDFs should be supported. I then sent this query to
>> the thriftserver (I use the RJDBC package for R, but I doubt any other
>> JDBC client would behave differently):
>>
>>
>> CREATE TEMPORARY FUNCTION NVL2 AS 'khanolkar.HiveUDFs.NVL2GenericUDF'
>>
>> I only changed some names with respect to the posted example, but I think
>> the class was found just fine because: 1) there are no errors in the log
>> or console; 2) I can generate a class-not-found error by mistyping the
>> class name, and I see it in the logs; 3) I can use the reflect builtin to
>> invoke a different function that I wrote and supplied to Spark in the
>> same way (the --jars option to start-thriftserver).
>>
>> After this, I can't use the NVL2 function in a query, I can't even run a
>> DESCRIBE query on it, and it isn't listed by SHOW FUNCTIONS. I tried both
>> 1.5.1 and 1.6.0-rc2, built with thriftserver support, for Hadoop 2.6.
>>
>> I know the HiveContext is slightly behind the latest Hive as far as
>> features go, I believe by one or two revs, so that may be one potential
>> problem, but all these features I believe are present in Hive 0.11 and
>> should have made it into Spark. At the very least, I would like to see
>> some message in the logs and console so that I can find the error of my
>> ways, repent, and fix my code. Any suggestions? Anything I should post to
>> support troubleshooting? Is this JIRA-worthy? Thanks
>>
>> Antonio
>>
>
>
> --
> Best Regards
>
> Jeff Zhang
>
