parquet
>> format on S3.
>>
>> 2. We copied one parquet file object (data) to a separate S3
>> bucket (target), so now our target bucket contains one parquet data file in
>> the following hierarchy on S3:
>> s3:///Test/00_0 (Size of object: 1218 Bytes)
>>
> 3. After that, we executed the following 3 commands in Apache Hive 2.1.1,
> managed by us on an EC2 cluster:
>
> (i) Create an external table on top of the above S3 location (a sketch of
> such a DDL follows below):
>
> CREATE EXTERNAL TABLE
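The statement itself is truncated in this excerpt. A minimal sketch of a DDL of this shape, with hypothetical column names and a placeholder bucket (the real schema and bucket name are not shown above):

CREATE EXTERNAL TABLE test_parquet (
  id INT,
  payload STRING)
STORED AS PARQUET
LOCATION 's3a://target-bucket/Test/';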
Hello,
I was running some create partitioned external table queries looking like:
# 30 partitions inside
CREATE EXTERNAL TABLE table1 (value string) PARTITIONED BY (shard string)
LOCATION 's3a://path/date=2021-02-01/';
INFO : Completed compiling command(queryId=); Time taken: 7.753 seconds
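A table declared this way starts with no partitions registered, so the 30 shard directories under the location still have to be added to the metastore before they are queryable. A sketch of the usual follow-up (the shard value and directory layout are hypothetical):

MSCK REPAIR TABLE table1;
-- or register a single partition explicitly:
ALTER TABLE table1 ADD PARTITION (shard='00')
LOCATION 's3a://path/date=2021-02-01/shard=00/';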
>
> at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
> ... 42 more
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:512)
> at org.apache.hadoop.hive.metas
Yes, it works. Thank you very much,
Garry
From: Suresh Kumar Sethuramaswamy
Reply-To: "user@hive.apache.org"
Date: Wednesday, November 7, 2018 at 3:10 PM
To: "user@hive.apache.org"
Subject: Re: Create external table with s3 location error
Thanks for the logs. Couple of things
From: Suresh Kumar Sethuramaswamy
Reply-To: "user@hive.apache.org"
Date: Wednesday, November 7, 2018 at 2:50 PM
To: "user@hive.apache.org"
Subject: Re: Create external table with s3 location error
Are you using EMR or Apache Hadoop open source?
Can you share
> reboot the server. Any suggestion?
>
>
>
> hive> create external table kv (key int, values string) location
> 's3://cu-iclick/test';
>
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask.
> MetaException(message:java.lang.NullPointerException)
>
>
>
> Garry
>
hi All,
I am trying to create an external table using S3 as the location but it
failed. I added my access key and secret key in hive-site.xml and rebooted the
server. Any suggestion?
hive> create external table kv (key int, values string) location
's3://cu-iclick/test';
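Since the failure surfaces as a NullPointerException out of the DDLTask, one thing worth checking is that the keys are actually visible to the Hive session. A sketch of setting them per-session instead of in hive-site.xml (shown for the s3a connector; the older s3:// scheme uses fs.s3.awsAccessKeyId / fs.s3.awsSecretAccessKey instead, and the key values here are placeholders):

SET fs.s3a.access.key=YOUR_ACCESS_KEY;
SET fs.s3a.secret.key=YOUR_SECRET_KEY;
create external table kv (key int, values string)
location 's3a://cu-iclick/test';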
> Hi,
> I've been struggling with this for a few hours, hopefully somebody here
> can help me out.
>
> We have a lot of data in parquet format on S3 and we want to use Hive to
> query it. I'm running on Ubuntu and we have a MySQL metadata store on AWS
> RDS.
>
> The command in the hive client I'm trying to run is:
> CREATE EXTERNAL TABLE
> my_schema.my_table
> (account_id INT,
> action VARCHAR(282),
> another_id INT,
> yaid INT,
> `date` TIMESTAMP,
> deleted_at TIMESTAMP,
> id INT,
> lastchanged TIMESTAMP,
> thing_index DOUBLE,
> old_id INT,
> parent_id INT,
> ru
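The DDL is cut off at this point. For context, a self-contained sketch of how such a statement for parquet data on S3 is typically completed (a hypothetical short column list and placeholder path; the poster's remaining columns and location are not shown):

-- assumes the schema my_schema already exists
CREATE EXTERNAL TABLE my_schema.example_parquet (
  account_id INT,
  `date` TIMESTAMP)
STORED AS PARQUET
LOCATION 's3a://my-bucket/path/';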
Hi,
I am currently using Hive 2.1 on EMR 5.0, and every time I try to create an
external table against a folder that does not exist on S3, Hive returns an S3
403 exception, even though I can read/write to that bucket using awscli on
the same host. I remember this working on an earlier version of Hive.
Hi,
I have a Kerberos and Ranger enabled cluster. Ranger plugins are enabled
on HDFS and Hive.
In case I create an external table and the location in HDFS is empty or there
is a small number of files, then the create table is OK via beeline. But in
case there are loads of files, for example 10,000 or more, then I cannot
create the external table via beeline. But I can
I have Kerberos enabled in my cluster.
When I create an external table using beeline, I see from the HDFS namenode
log that it does Kerberos auth for every single file, I guess.
That may be the reason why creating the external Hive table fails when I
have loads of directories and files under them.
/files_10k/f1966.txt
...
Connected to: Apache Hive (version 1.2.1.2.3.4.0-3485)
Driver: Hive JDBC (version 1.2.1.2.3.4.0-3485)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://bigdata29.webmedia.int:1/> create external table
files_10k (i int) row format delimited fie
://hadoopnn1.estpak.ee:1/def> create external table
files_10k (i int) row format delimited fields terminated by '\t'
location '/user/margusja/files_10k';
No rows affected (0.197 seconds)
2: jdbc:hive2://hadoopnn1.estpak.ee:1/def> drop table files_10k;
No rows
More information:
2016-05-11 13:31:17,086 INFO [HiveServer2-Handler-Pool: Thread-5867]:
parse.ParseDriver (ParseDriver.java:parse(185)) - Parsing command:
create external table files_10k (i int) row format delimited fields
terminated by '\t' location '/user/margusja/files_10k
0: jdbc:hive2://hadoopnn1.example.com:2181,hado> use default;
Getting log thread is interrupted, since query is done!
No rows affected (1.225 seconds)
0: jdbc:hive2://hadoopnn1.example.com:2181,hado> drop table if exists trips;
Getting log thread is interrupted, since query is done!
No rows affected (0.159 seconds)
Sadly in our environment:
Generated files like you did.
Connected to: Apache Hive (version 1.2.1.2.3.4.0-3485)
Driver: Hive JDBC (version 1.2.1.2.3.4.0-3485)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://hadoopnn1.estpak.ee:2181,hado> create external table
files_10k
Could not reproduce that issue on the Cloudera quickstart VM.
I've created an HDFS directory with 10,000 files.
I've created an external table from within beeline.
The creation was immediate.
Dudu
0: jdbc:hive2://hadoopnn1.example.com:2181,hado> CREATE EXTERNAL TABLE
`TRIPS`(
0: jdbc:hive2://hadoopnn1.example.com:2181,hado> `bike_nr` string,
0: jdbc:hive2://hadoopnn1.example.com:2181,hado> `duration` int,
0: jdbc:hive2://hadoopnn1.example.com:2181,hado> `start_date` string,
Try this simple external table creation in beeline (check first that
it connects OK):

use default;
drop table if exists trips;
CREATE EXTERNAL TABLE `TRIPS`(
`bike_nr` string, `duration` int, `start_date` string, `start_station`
string, `end_station` string
Hi again,
I opened hive (an old client), and exactly the same create external table
with location [path in hdfs to a place where there are loads of files] works,
while the same DDL does not work via beeline.
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780
On 10/05/16 23:03
Hi,
Can someone explain or provide documentation on how Hive creates external
tables?
I have a problem with creating an external table in case I am pointing the
location to a directory in HDFS where there are loads of files. Beeline just
hangs or there will be other errors.
In case I point the location to th
> And again: the same row is correct if I export a small set of data, and
> incorrect if I export a large set - so I think that file/data size has
> something to do with this.
My Phoenix vs LLAP benchmark hit size-related issues in ETL.
In my case, the tipping point was more than one HDFS block per CSV file.
Is there a workaround for this?
-Original Message-
From: Nicholas Hakobian [mailto:nicholas.hakob...@rallyhealth.com]
Sent: Thursday, January 28, 2016 3:15 PM
To: user@hive.apache.org
Subject: Re: "Create external table" nulling data from source table
Do you have any fields with embedded newline characters? If so,
certain hive output formats will parse the newline character as the
end of row, and when importing, chances are the missing fields (now
part of the next row) will be padded with nulls. This happens in Hive
as well if you are using a TextFile
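For illustration (not from the thread), two common ways around the embedded-newline problem, as a HiveQL sketch; src, id and note are hypothetical names:

-- Option 1: strip embedded newlines before writing a text-format export.
CREATE TABLE export_text
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
AS SELECT id, regexp_replace(note, '\n|\r', ' ') AS note FROM src;

-- Option 2: use a binary container format such as ORC, whose row
-- boundaries do not depend on newlines, so the characters survive intact.
CREATE TABLE export_orc STORED AS ORC
AS SELECT * FROM src;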
First time posting to this list. Please forgive me if I break etiquette. I'm
looking for some help with getting data from Hive to HBase.
I'm using HDP 2.2.8.
I have a compressed (zlib), ORC-based Hive table with 12 columns and billions
of rows.
In order to get the data into HBase, I have to cr
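The message is truncated here, but the standard route for this is Hive's HBase storage handler: create an HBase-backed table in Hive and insert into it from the ORC table. A sketch, with hypothetical table, key and column-family names:

CREATE TABLE hbase_target (rowkey string, col1 string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:col1")
TBLPROPERTIES ("hbase.table.name" = "target");

INSERT OVERWRITE TABLE hbase_target
SELECT id, col1 FROM orc_source;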
Hi,
I am trying to create an external table on an S3 bucket, however I'm
receiving the following error in the process:
hive> CREATE EXTERNAL TABLE ping_prod
> PARTITIONED BY(day string)
> ROW FORMAT SERDE
> 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
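For reference, a complete Avro external-table DDL usually also names the container input/output formats and a reader schema. A sketch of how the truncated statement above is commonly completed (the location and schema URL are placeholders, not from the thread):

CREATE EXTERNAL TABLE ping_prod
PARTITIONED BY (day string)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 's3n://my-bucket/ping/'
TBLPROPERTIES ('avro.schema.url'='s3n://my-bucket/schemas/ping.avsc');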
Hi,
I use Hive 0.10 from CDH 4.2.1.
I thought creating an external table does not require write permission on
the source directory.
But when I tried to create an external table from some files that I have read
permission on, an exception was thrown:
hive (default)>
> create external table xux
Sent from remote device, Please excuse typos
-Original Message-
From: Joseph D Antoni
Date: Fri, 15 Feb 2013 08:55:50
To: user@hive.apache.org
Reply-To: user@hive.apache.org
Subject: Re: CREATE EXTERNAL TABLE Fails on Some Directories
Not sure--I just truncated the file list from the ls-
: Friday, February 15, 2013 11:50 AM
Subject: Re: CREATE EXTERNAL TABLE Fails on Some Directories
Something's odd about this output; why is there no / in front of 715? I always
get the full path when I run a -ls command. I would expect either:
/715/file.csv
or
/user//715/file.csv
Or is that
the directory--wasn't clear on that..
>
> Joey
To: user@hive.apache.org; Joseph D Antoni
Sent: Friday, February 15, 2013 11:37 AM
Subject: Re: CREATE EXTERNAL TABLE Fails on Some Directories
You confirmed that 715 is an actual directory? It didn't become a file by
accident?
By the way, you don't need to include the file name in the LOCATION
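Dean's point as a sketch, applied to the script quoted below: the location should name the directory, not the file (the table is renamed here to avoid the digit-leading identifier; that rename is my assumption, not from the thread):

create external table table_715
(col1 string,
col2 string)
row format delimited fields terminated by ','
lines terminated by '\n'
stored as textfile
location '/715/';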
sed (changing 711 to 712, 713, etc.) to a file for each
day. All of my loads work, EXCEPT for 715 and 716.
Script is as follows:
create external table 715_table_name
(col1 string,
col2 string)
row format
delimited fields terminated by ','
lines terminated by '\n'
stored as textfile
location '/715/file.csv';
This is failing with:
Error in Metadata
Hi,
I enabled HUE on our Cloudera distribution, but I was not able to see any
command interface where I can create an externally managed table in HBase.
All I was presented with was a wizard to create the table. Is there a way I can
just create an externally managed table without the wizard, just like through hive?
Hi,
I am using Hive 0.7.x on my dev machine (yeah we will be upgrading soon
:) )
I used the statement indicated in the subject to create an external table:
create external table ext_sample_v1 like sample_v1 location
'/hive/warehouse/sample_v1/';
Since sample_v1 had partition
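A note on the likely sticking point: CREATE TABLE ... LIKE copies the schema (including partition columns) but not the partition metadata, so the new external table starts with no registered partitions. A sketch of registering them (the partition column dt and its value are hypothetical, since the original table's partition spec is truncated above):

ALTER TABLE ext_sample_v1 ADD PARTITION (dt='2012-07-27')
LOCATION '/hive/warehouse/sample_v1/dt=2012-07-27';
-- or, on later Hive versions, discover them from the directory layout:
MSCK REPAIR TABLE ext_sample_v1;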
Can you just give the table name while loading data, after "into"?
It should be: load data local inpath
'./usr/local/hadoop_dir/hadoop/big_data/vender_details.txt' into
table vender
On Thu, Jul 12, 2012 at 1:35 PM, shaik ahamed wrote:
Hi bejoy
hive> Create external table vender(vender string,supplier
string,order_date string,quantity int) row format delimited fields
terminated by ' ' stored as textfile LOCATION
'/usr/local/hadoop_dir/hadoop/big_data';
OK
Time taken: 0.276 seconds
I created the above
2 Jul 2012 12:30:23
To: ; Bejoy Ks
Reply-To: user@hive.apache.org
Subject: Re: unable to create external table plz corrrect the syntax
Thanks for the reply guys
I have tried doing it with the load cmd
I need the HDFS file to be placed in the below hive path
/usr/local/hive-0.9.0#
Hi Shaik
For the correct syntax of the create table statement please refer to
https://cwiki.apache.org/Hive/languagemanual-ddl.html#LanguageManualDDL-CreateTable
Please try out this command to avoid the syntax error:
Create external table vender(vender string,supplier string,order_date
string,quantity int)
row f
correct it
hive> create external table vender(vender string,supplier string,order_date
string,quantity
int)['./usr/local/hadoop_dir/hadoop/big_data/vender_details.txt'] [ row
format delimited fields terminated by ' ' stored as textfile] ;
FAILED: Parse Error: lin
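Putting Bejoy's advice together with the version that eventually worked (the "Time taken: 0.276 seconds" message above), the corrected sequence is roughly the following sketch: the bracketed fragments are removed and the DDL and the load become two separate statements. Whether the extra load is even needed depends on whether the file already sits under the table's location.

create external table vender(vender string, supplier string,
order_date string, quantity int)
row format delimited fields terminated by ' '
stored as textfile
location '/usr/local/hadoop_dir/hadoop/big_data';

load data local inpath
'./usr/local/hadoop_dir/hadoop/big_data/vender_details.txt'
into table vender;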
Naga"
To: user@hive.apache.org
Sent: Tuesday, June 19, 2012 6:16:31 PM
Subject: Re: create external table on existing hive partitioned table ?
Thanks Mark,
The reason to create the 2nd table is One of the column is defined as string in
the first table, I wanted to read the string int
Tuesday, June 19, 2012 6:16:31 PM
Subject: Re: create external table on existing hive partitioned table ?
Thanks Mark,
The reason to create the 2nd table is that one of the columns is defined as
string in the first table; I wanted to read the string into a Map data type.
i.e.
Existing table.
{"
partitions on the other table
before you access your data through it. If the schema for both the tables
is the same (except the fact that one is managed while the other is external),
any particular reason you'd like to create a new table?
Mark
- Original Message -
From: "Sai Naga"
To: user@hive.apache.org
Sent: Tuesday, June 19, 2012 4:19:25 PM
Subject: create external table on existi
Is it possible to create an external table on an existing hive table which is
partitioned?
I have an existing hive table which is partitioned by dt and group, like below:
desc page_access;
page string
country string
dt string ( partitioned column )
group string ( partitioned column
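Mark's suggestion, as a rough sketch against the table described above (the warehouse path and partition values are hypothetical):

create external table page_access_ext like page_access
location '/user/hive/warehouse/page_access';

alter table page_access_ext add partition (dt='2012-06-19', `group`='a')
location '/user/hive/warehouse/page_access/dt=2012-06-19/group=a';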
Is it possible to do something like this:
CREATE EXTERNAL TABLE T
LOCATION 's3://...'
AS
SELECT ...;
I would rather not fix the schema for T so that I can easily change the
SELECT query and its underlying tables.
Because of that, I don't want to define the table first. Instead,
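Older Hive releases reject CTAS onto an external table, so the usual two-step workaround is to let a managed CTAS infer the schema and then copy the definition. A sketch, with src, t_staging, t_ext and the bucket path all hypothetical:

-- Step 1: CTAS into a managed staging table; the schema is inferred
-- from the SELECT, so it can change freely with the query.
CREATE TABLE t_staging AS
SELECT id, name FROM src;

-- Step 2: clone the inferred definition as an external table on S3,
-- then move the data across.
CREATE EXTERNAL TABLE t_ext LIKE t_staging
LOCATION 's3://my-bucket/out/';

INSERT OVERWRITE TABLE t_ext
SELECT * FROM t_staging;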
Hi,
I found the answer for this issue. It was related to metastore_db. Server and
client were on different metastore_db instances.
Vivek
From: Vivek Mishra
Sent: Friday, December 03, 2010 1:48 PM
To: user@hive.apache.org
Subject: Running a HiveClient with create external table HBase
Hi,
Currently I am
"CREATE EXTERNAL TABLE". But somehow that table is not
getting created in Hive. The interesting point is that running the SQL command
from the command line with Hive runs fine.
This behavior is random. Sometimes it shows me all created tables in Hive (when
I use "SHOW TABLES").
Does it h