Re: hive create table error

2015-10-12 Thread Sanjeev Verma
I am getting this exception:

Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:185)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache

Re: hive create table error

2015-10-12 Thread Syed Abulullah
Hi Sanjeev - Did you try changing your query to explicitly specify par.*?

CREATE TABLE sample_table AS
SELECT par.*
FROM parquet_table par
INNER JOIN parquet_table_counter ptc
  ON ptc.user_id = par.user_id;

Thanks.
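Why the explicit par.* would help is not spelled out in the thread; an assumption based on a common CTAS failure mode is that a bare select * over a self-equijoin exposes user_id from both sides, and CTAS cannot create a table with duplicate column names. A minimal sketch of the two forms:

```
-- Likely to fail: both sides of the join expose a column named user_id,
-- so the CTAS target would have two columns with the same name
-- CREATE TABLE sample_table AS
-- SELECT * FROM parquet_table par
-- INNER JOIN parquet_table_counter ptc ON ptc.user_id = par.user_id;

-- Restricting the projection to one side avoids the duplicate column name
CREATE TABLE sample_table AS
SELECT par.*
FROM parquet_table par
INNER JOIN parquet_table_counter ptc
  ON ptc.user_id = par.user_id;
```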

Re: Hive create table line terminated by '\n'

2015-01-13 Thread 王鹏飞
Can you give me a simple example, say a table that contains only one string column, where that string column contains '\n' in the content? On Wed, Jan 14, 2015 at 1:42 AM, Xuefu Zhang wrote: > Consider using a data format other than TEXT, such as sequence file. > > On Mon, Jan 12, 2015 at 10:54 PM, 王鹏飞

Re: Hive create table line terminated by '\n'

2015-01-13 Thread Xuefu Zhang
Consider using a data format other than TEXT, such as sequence file. On Mon, Jan 12, 2015 at 10:54 PM, 王鹏飞 wrote: > Thank you, maybe I didn't express my question explicitly. I know the Hive > create table clause, and there exists FIELDS TERMINATED BY etc. > For example, if I use FIELDS TERMINATED BY ' ,
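A minimal sketch of this suggestion (table and column names are illustrative, not from the thread): a container format such as SEQUENCEFILE stores records with explicit lengths rather than newline delimiters, so a string value may safely contain '\n':

```
-- Hypothetical single-column table; SEQUENCEFILE records are not
-- newline-delimited, so embedded '\n' in a value does not split the row
CREATE TABLE notes (body STRING)
STORED AS SEQUENCEFILE;

-- Populate from an existing table or query rather than loading raw text,
-- e.g.:
-- INSERT INTO TABLE notes SELECT some_col FROM source_table;
```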

Re: Hive create table line terminated by '\n'

2015-01-12 Thread 王鹏飞
Thank you, maybe I didn't express my question explicitly. I know the Hive create table clause, and there exists FIELDS TERMINATED BY etc. For example, if I use FIELDS TERMINATED BY ' ,', what if the ' ,' is contained in one field? Hive will use the rule to separate that one field. You might suggest me to c

RE: Hive create table line terminated by '\n'

2015-01-12 Thread Xiaoyong Zhu
I guess you could use the FIELDS TERMINATED BY clause:

CREATE TABLE IF NOT EXISTS default.table_name
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\001'
  COLLECTION ITEMS TERMINATED BY '\002'
  MAP KEYS TERMINATED BY '\003'
STORED AS TEXTFILE

Xiaoyong

Re: Hive Create Table command throws datanucleus error

2014-01-17 Thread Stephen Sprague
Good 'ole permissions, eh? A shame the error thrown by the datanucleus code didn't expose this important piece of information. Instead we get a red herring about "autocreate flag" nonsense and waste hours of time. Egads. +1 for sharing back! On Fri, Jan 17, 2014 at 6:26 PM, Mohammad Islam wrote:

Re: Hive Create Table command throws datanucleus error

2014-01-17 Thread Mohammad Islam
I never used Postgres with Hive. These links might be helpful:

1. http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-1.3.0/bk_dataintegration/content/ch_using-hive.html
   (look for "Using Postgres for the Hive Metastore")
2. http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH4/4.2.0/C

Re: Hive Create Table command throws datanucleus error

2014-01-17 Thread Leena Gupta
After several tries, I finally figured it out. Thought I'd post what I had to do in case others hit this problem. From what I observed, it looks like the Datanucleus error is related to the absence of proper access permissions for hiveuser in Postgres. I uninstalled both Hive and Postgres, install
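The fix described here is permissions-related, but the thread does not list the actual commands; a sketch of the kind of grants involved, assuming a metastore database named "metastore" and a role named "hiveuser" (both names are assumptions):

```
-- Run as the postgres superuser
GRANT ALL PRIVILEGES ON DATABASE metastore TO hiveuser;

-- Inside the metastore database, cover the schema objects the
-- metastore creates, including sequences such as SEQUENCE_TABLE
GRANT ALL ON ALL TABLES IN SCHEMA public TO hiveuser;
GRANT ALL ON ALL SEQUENCES IN SCHEMA public TO hiveuser;
```

With the grants in place, the "autoCreate flags" message goes away because DataNucleus can actually reach the objects it needs, which matches the "red herring" observation elsewhere in this thread.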

Re: Hive Create Table command throws datanucleus error

2014-01-16 Thread Stephen Sprague
Okay, so SEQUENCE_TABLE does indeed exist; that's the first thing to get out of the way then. Hmm. I suspect these other tables are just artifacts of the differences between the Hive versions. Yeah, I'm not sure where to go from here, and I couldn't find much via googling either. Would it be possib

Re: Hive Create Table command throws datanucleus error

2014-01-16 Thread Leena Gupta
Thanks for responding, Stephen. I checked the tables in Postgres and the SEQUENCE_TABLE exists. However, in comparison to the list you gave for Hive 0.12, the following tables are missing; not sure if these could be the cause of the datanucleus error:

DELEGATION_TOKENS
MASTER_KEYS
VERSION

Thank

Re: Hive Create Table command throws datanucleus error

2014-01-16 Thread Stephen Sprague
On Thu, Jan 16, 2014 at 4:17 PM, Leena Gupta wrote: > Could not create "increment"/"table" value-generation container > "SEQUENCE_TABLE" since autoCreate flags do not allow it. Interestingly enough this exact same question is posted here: http://stackoverflow.com/questions/19205318/cannot-cre

Re: Re: Hive create table

2011-05-29 Thread jinhang du
Thanks for all your help. I fixed my problem by editing hive-site.xml:

<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/lib/hive/lib/hive-contrib-0.7.0-CDH3B4.jar</value>
  <description>These JAR files are available to all users for all jobs</description>
</property>

Now I want to understand "input.regex". Is this a Java regular expression? Thanks

Re: Hive create table

2011-05-27 Thread jinhang du
Are there any documents to help me understand the meaning and usage of "input.regex"? I created the table as in your instructions, and "select * from table1" returns the right answer. However, "select table1.str1 from table1" raises the following exception. Total MapReduce jobs = 1 Launching Job 1 out of

Re: Hive create table

2011-05-25 Thread jinhang du
Thanks for your information. It works. On May 25, 2011 at 9:57 PM, valentina kroshilina wrote: > you can use something like this: > > CREATE EXTERNAL TABLE IF NOT EXISTS table1 ( >    str1 int, >    str2 int, >    str3 int > ) > ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe' > With SERD

Re: Hive create table

2011-05-25 Thread valentina kroshilina
You can use something like this:

CREATE EXTERNAL TABLE IF NOT EXISTS table1 (
  str1 int,
  str2 int,
  str3 int
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
WITH SERDEPROPERTIES (
  "input.regex" = "(\\d+)(\\d+)(\\d+)"
)
LOCATION '/path1';
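One caveat worth noting (an observation, not stated in the thread): input.regex must match the whole input line, and each capture group maps to one column in order. With no separator between adjacent (\\d+) groups, the boundary between digit runs is ambiguous, so a pattern that includes the actual field delimiter is usually safer. A sketch for comma-separated data (the table name here is hypothetical):

```
CREATE EXTERNAL TABLE IF NOT EXISTS table1_csv (
  str1 int,
  str2 int,
  str3 int
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
WITH SERDEPROPERTIES (
  -- one capture group per column, separated by the literal delimiter
  "input.regex" = "(\\d+),(\\d+),(\\d+)"
)
LOCATION '/path1';
```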

Re: Hive create table

2011-05-25 Thread bejoy_ks
Hi Jinhang, I don't think Hive supports multi-character delimiters. The hassle-free option here would be to preprocess the data using MapReduce, replacing the multi-character delimiter with another, permissible one that suits your data. Regards, Bejoy K S
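As an alternative to preprocessing (not suggested in this message, so treat it as an option to evaluate): the contrib RegexSerDe discussed elsewhere in this archive can split on a multi-character delimiter directly, at the cost of regex-based parsing on every read. A sketch assuming a '::' delimiter and three string columns (table name, column names, and path are all hypothetical):

```
CREATE EXTERNAL TABLE IF NOT EXISTS multi_delim_table (
  c1 STRING,
  c2 STRING,
  c3 STRING
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
WITH SERDEPROPERTIES (
  -- non-greedy groups so '::' acts as the field boundary
  "input.regex" = "(.*?)::(.*?)::(.*)"
)
LOCATION '/path/to/data';
```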