It is fixed in 1.5.3
https://issues.apache.org/jira/browse/SPARK-11191
On Wed, Dec 9, 2015 at 12:58 AM, Deenar Toraskar wrote:
Hi Trystan,
I am facing the same issue. It only appears with the Thrift server; the
same call works fine via the spark-sql shell. Do you have any workarounds,
and have you filed a JIRA/bug for the same?
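For reference, the kind of comparison I mean looks roughly like this (the
jar path, class name and table below are placeholders, not our actual
setup):

    -- works when run in the spark-sql shell
    ADD JAR /tmp/my-udfs.jar;
    CREATE TEMPORARY FUNCTION my_udf AS 'com.example.udf.MyUDF';
    SELECT my_udf(some_col) FROM some_table;

The same statements issued through beeline against the Thrift server
(e.g. beeline -u jdbc:hive2://localhost:10000) fail to resolve the
function.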
Regards
Deenar
On 12 October 2015 at 18:01, Trystan Leftwich wrote:
Hi everyone,
Since upgrading to Spark 1.5 I've been unable to create and use UDFs when
we run in Thrift server mode.
Our setup:
We start the thrift-server running against YARN in client mode (we've also
built our own Spark from the github branch-1.5 with the following args:
-Pyarn -Phive -Phive-thriftserver).
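For anyone trying to reproduce, the build and launch steps look roughly
like this (the exact master and options are specific to our cluster, so
treat them as placeholders):

    # build branch-1.5 with Hive and Thrift server support
    build/mvn -Pyarn -Phive -Phive-thriftserver -DskipTests clean package

    # start the Thrift server against YARN in client mode
    ./sbin/start-thriftserver.sh --master yarn-client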
Sure, Spark SQL supports Hive UDFs.
ISTM that the UDF 'DATE_FORMAT' is just not registered in your metastore.
Did you run 'CREATE FUNCTION' in advance?
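For example, registering it permanently with a jar could look something
like this (the class name and jar path here are just placeholders):

    -- register the UDF in the metastore (run via Hive or a
    -- Hive-compatible client)
    CREATE FUNCTION DATE_FORMAT AS 'com.example.udf.DateFormatUDF'
      USING JAR 'hdfs:///user/hive/udfs/my-udfs.jar';

    -- then check that Spark SQL can see and resolve it
    SHOW FUNCTIONS;
    SELECT DATE_FORMAT(some_col, 'yyyy-MM-dd') FROM some_table;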
Thanks,
On Tue, Jul 14, 2015 at 6:30 PM, Ravisankar Mani wrote:
Hi Everyone,
As mentioned in the Spark SQL programming guide, Spark SQL supports Hive
UDFs. I have built the UDFs in the Hive metastore, and they work perfectly
over a Hive connection, but they do not work in Spark
("java.lang.RuntimeException: Couldn't find function DATE_FORMAT").
Could you please help how t