Re: Using JAR files located on HDFS for SerDe

2017-04-12 Thread Jörn Franke
I do not think it is supported. The jar for Hive must be on a local filesystem of the Hive server (not necessarily on all nodes).
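A minimal sketch of the supported setup described above, pointing hive.aux.jars.path at a local path in hive-site.xml on the HiveServer2 host (the path and jar name are hypothetical):

    <!-- hive-site.xml: local filesystem path on the Hive server, not HDFS -->
    <property>
      <name>hive.aux.jars.path</name>
      <value>file:///opt/hive/auxlib/my-serde.jar</value>
    </property>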

Using JAR files located on HDFS for SerDe

2017-04-12 Thread Mahdi Mohammadinasab
Hello, I am trying to add a JAR file which is located on HDFS so it can later be used as a SerDe. This is completely possible using the "ADD JAR" command, but I would prefer to use the *hive.aux.jars.path* setting in "*hive-site.xml*" or the "*HIVE_AUX_JARS_PATH*" environment variable (because then I don't need to update …
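For reference, a minimal sketch of the per-session approach that does work with an HDFS path (jar path, table and SerDe class names are hypothetical):

    ADD JAR hdfs:///user/mahdi/lib/my-serde.jar;

    CREATE TABLE example_tbl (line string)
    ROW FORMAT SERDE 'com.example.MySerDe'
    STORED AS TEXTFILE;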

Re: How to create auto increment key for a table in hive?

2017-04-12 Thread Gopal Vijayaraghavan
Yes, you need ACID to maintain multiple writers correctly. ACID does have a global primary key (which is not a single integer …
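The truncated part presumably refers to Hive's internal ROW__ID, which is a struct of transaction id, bucket id and row id rather than a single integer. As an illustration (not from the original message), on a transactional ORC table it can be inspected like this, assuming a hypothetical table acid_tbl:

    -- ROW__ID is a composite key, not an auto-incremented integer
    SELECT ROW__ID, t.*
    FROM acid_tbl t
    LIMIT 10;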

Re: How to create auto increment key for a table in hive?

2017-04-12 Thread Luis
Hi Jone, I'd like to point out that Hive does support ACID (still at a very early stage), but it is a feature that most people don't use for real production systems. I think there is nothing like the RDBMS "auto-increment key" because of the parallel nature of Hive and the Hadoop ecosystem. Still …
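Two common workarounds for surrogate keys, sketched with hypothetical table and column names (not from the original message):

    -- globally unique but non-sequential ids
    INSERT INTO TABLE target_tbl
    SELECT reflect('java.util.UUID', 'randomUUID') AS id, s.*
    FROM source_tbl s;

    -- sequential ids; the single window forces all rows through one reducer,
    -- so this does not scale to very large tables
    INSERT INTO TABLE target_tbl
    SELECT row_number() OVER (ORDER BY s.some_col) AS id, s.*
    FROM source_tbl s;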