No, .hiverc works only for the Hive CLI.
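For CLI sessions, though, a .hiverc can carry exactly the per-session setup described further down in this thread. A minimal sketch (the JAR path and database name are placeholders, and the UDF class is the one John mentions below):

```
-- Sketch of ~/.hiverc; the CLI runs these statements at startup.
-- Path, database name, and class are examples, not real values.
ADD JAR /path/to/analytics-udfs.jar;
CREATE TEMPORARY FUNCTION uridecode AS 'org.domain.analytics.URIDECODE';
USE analytics;
```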

UDFs are tricky. The only way I can think of is to add them to the
function registry
(https://github.com/apache/hive/blob/trunk/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java)
and recompile Hive.
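For reference, registrations in that file live in a static initializer; a new UDF entry would look roughly like the sketch below. This is modeled on the existing entries in FunctionRegistry.java at that time, with the class name from this thread substituted in; the exact registerUDF signature may differ between Hive versions:

```java
// Inside FunctionRegistry.java's static block (sketch only):
// name, implementing UDF class, and whether it is an operator.
registerUDF("uridecode", org.domain.analytics.URIDECODE.class, false);
```

After adding the line you would rebuild Hive so the function is available in every session without a CREATE TEMPORARY FUNCTION statement.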

On Mon, Dec 10, 2012 at 8:01 AM, John Omernik <j...@omernik.com> wrote:
> Will that work for my thrift server connections?
>
>
> On Sun, Dec 9, 2012 at 7:56 PM, विनोद सिंह <vi...@vinodsingh.com> wrote:
>>
>> Put a .hiverc file containing your commands in your home directory; the
>> Hive CLI will execute all of them at startup.
>>
>> Thanks,
>> Vinod
>>
>> On Sun, Dec 9, 2012 at 10:25 PM, John Omernik <j...@omernik.com> wrote:
>>>
>>> I am looking for ways to streamline some of my analytics. One thing I
>>> notice is that when I use the Hive CLI, or connect to my Hive thrift
>>> server, there are some commands I always end up running for my session.
>>> If I have multiple CLIs or connections to Thrift, then I have to run them
>>> each time. If I lose a connection to the thrift server, I have to run
>>> them again, and so on.
>>>
>>> My thought was: is there a way to have certain commands executed
>>> automatically upon opening a Hive CLI session or a connection to the Hive
>>> thrift server?
>>>
>>> These commands include a USE statement to get me into a specific database
>>> (perhaps there is a default-database config variable?) and loading up all
>>> the temporary functions (UDFs) I use.
>>>
>>> For example, I have a UDF to do URL decoding:
>>>
>>> CREATE TEMPORARY FUNCTION uridecode AS 'org.domain.analytics.URIDECODE';
>>>
>>> Can I get this to run automagically at Hive CLI start or on a thrift
>>> server connection?
>>>
>>> If not, could we build in a way to add UDFs to Hive permanently, without
>>> doing a recompile?
>>>
>>> I would welcome discussion on this!
>>>
>>>
>>
>