I don't care about 'security issues' more than I care about the ability to
work quickly. One popular compute system lets you define mappers in Scala in
a shell, for example.

On Monday, June 5, 2017, Makoto Yui <m...@apache.org> wrote:

> Alan,
>
> Putting backported Hive UDFs into Hive branch-1 would create a dependency
> on that specific branch, the next stable release of v1.x.
> The artifact should instead be a distinct jar that includes only the
> backported UDFs, so that it can be used in existing Hive clusters.
>
> It is better to support all Hive versions from v0.13.0 onward,
> so a distinct Maven submodule is preferable.
>
> Edward,
>
> Gems-like dynamic plugin loading from a Maven repository (or from GitHub
> repos via jitpack.io) is possible using Eclipse Aether, but dynamic
> plugin/class loading involves security issues.
> https://stackoverflow.com/questions/35598239/load-maven-artifact-via-classloader
> https://github.com/treasure-data/digdag/tree/master/digdag-core/src/main/java/io/digdag/core/plugin
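(Editor's note: the dynamic class loading discussed above boils down to resolving a jar and loading its classes through an isolated classloader. The sketch below is illustrative only, not the digdag or Aether implementation: to stay self-contained it compiles a tiny plugin class on the fly instead of resolving one from a Maven repository, and the `Plugin` class and `greet()` method are made up. It also shows the security concern: the dynamically loaded code runs with full JVM privileges.)

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

// Minimal sketch of dynamic plugin loading. A real plugin system
// (digdag, or anything built on Eclipse Aether) would first resolve
// the jar from a Maven repository; here we compile a hypothetical
// Plugin class at runtime so the example needs no network access.
public class DynamicLoadDemo {

    private static final String SRC =
        "public class Plugin { public String greet() { return \"hello from plugin\"; } }";

    public static String loadAndGreet() throws Exception {
        // 1. Write the plugin source to a temp directory and compile it
        //    (requires a JDK, since a bare JRE has no system compiler).
        Path dir = Files.createTempDirectory("plugins");
        Path src = dir.resolve("Plugin.java");
        Files.write(src, SRC.getBytes(StandardCharsets.UTF_8));
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        javac.run(null, null, null, src.toString());

        // 2. Load the compiled class through an isolated URLClassLoader,
        //    just as one would for a downloaded plugin jar. This is the
        //    security issue: the loaded code executes with the same
        //    privileges as the host application.
        try (URLClassLoader loader =
                 new URLClassLoader(new URL[]{dir.toUri().toURL()},
                                    DynamicLoadDemo.class.getClassLoader())) {
            Class<?> cls = loader.loadClass("Plugin");
            Object plugin = cls.getDeclaredConstructor().newInstance();
            return (String) cls.getMethod("greet").invoke(plugin);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(loadAndGreet());
    }
}
```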
>
> Thanks,
> Makoto
>
> 2017-06-03 3:26 GMT+09:00 Edward Capriolo <edlinuxg...@gmail.com>:
> > Don't we currently support features that load functions from external
> > places like Maven, an HTTP server, etc.? I wonder if it would be easier
> > to back-port that than to back-port a handful of functions?
> >
> >> On Fri, Jun 2, 2017 at 2:22 PM, Alan Gates <alanfga...@gmail.com> wrote:
> >>
> >> Rather than put that code in hive/contrib, I was thinking that you could
> >> just backport the Hive 2.2 UDFs into the same locations in the Hive 1
> >> branch. That seems better than putting them into different locations on
> >> different branches.
> >>
> >> If you are willing to do the porting and post the patches (including
> >> relevant unit tests so we know they work), I and other Hive committers
> >> can review the patches and commit them to branch-1.
> >>
> >> Alan.
> >>
> >> On Thu, Jun 1, 2017 at 6:36 PM, Makoto Yui <m...@apache.org> wrote:
> >>>
> >>> That would be a help for existing Hive users.
> >>> You're welcome to put it into hive/contrib or something else.
> >>>
> >>> The minimum dependencies are Hive 0.13.0 and Hadoop 2.4.0,
> >>> so it will work in any Hive environment, version 0.13.0 or later.
> >>> https://github.com/myui/hive-udf-backports/blob/master/pom.xml#L49
> >>>
> >>> Thanks,
> >>> Makoto
> >>>
> >>> --
> >>> Makoto YUI <myui AT apache.org>
> >>> Research Engineer, Treasure Data, Inc.
> >>> http://myui.github.io/
> >>>
> >>> 2017-06-02 2:24 GMT+09:00 Alan Gates <alanfga...@gmail.com>:
> >>> > I'm curious why these can't be backported inside Hive.  If someone
> >>> > is willing to do the work to do the backport, we can check them into
> >>> > the Hive 1 branch.
> >>> >
> >>> > On Thu, Jun 1, 2017 at 1:44 AM, Makoto Yui <m...@apache.org> wrote:
> >>> >>
> >>> >> Hi,
> >>> >>
> >>> >> I created a repository for backporting recent Hive UDFs (as of
> >>> >> v2.2.0) to legacy Hive environments (v0.13.0 or later).
> >>> >>
> >>> >>    https://github.com/myui/hive-udf-backports
> >>> >>
> >>> >> Hope this helps those who are using an old Hive env :-(
> >>> >>
> >>> >> FYI
> >>> >>
> >>> >> Makoto
> >>> >>
> >>> >> --
> >>> >> Makoto YUI <myui AT apache.org>
> >>> >> Research Engineer, Treasure Data, Inc.
> >>> >> http://myui.github.io/
> >>> >
> >>> >
> >>
> >>
> >
>
>
>
> --
> Makoto YUI <myui AT apache.org>
> Research Engineer, Treasure Data, Inc.
> http://myui.github.io/
>


-- 
Sorry this was sent from mobile. Will do less grammar and spell check than
usual.