jdbc:hive2://hadoopnn1.estpak.ee:1/;principal=hive/_h...@testhadoop.com;auth=kerberos;kerberosAuthType=fromSubject.
- this works. I added a dot at the end of the string.
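For the archive, a minimal sketch of how a URL of that shape can be tested from the shell; the port (10000) and the full principal/realm are placeholders I filled in, since the string above is truncated:

[margroo@hadoopnn1 ~]$ klist
[margroo@hadoopnn1 ~]$ beeline -u "jdbc:hive2://hadoopnn1.estpak.ee:10000/;principal=hive/_HOST@TESTHADOOP.COM;auth=kerberos;kerberosAuthType=fromSubject"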
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780
On 16/09/16 14:38, Margus Roo wrote:
Hive
Hive version 1.2.1.2.3
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780
On 16/09/16 14:31, Margus Roo wrote:
Hi
I am trying to configure the SQL part of Toad for Hadoop. Kerberos is enabled in
my cluster.
I can see that Toad generates a JDBC string ending with ...;
auth
Hi
I am trying to configure the SQL part of Toad for Hadoop. Kerberos is enabled in
my cluster.
I can see that Toad generates a JDBC string ending with ...;
auth=kerberos;kerberosAuthType=fromSubject
When I test this string with beeline:
[margroo@hadoopnn1 ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_
Hi
I do not know whether this issue belongs on the Hive or the Ranger list - let's see.
I have successfully set up Hive with Kerberos, enabled the Ranger Hive plugin,
and configured Ranger audit to write to HDFS.
After a Hive restart I see that Hive takes Kerberos tickets and the Ranger
parameters are as I set them:
STARTUP
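For reference, the HDFS audit destination on my cluster is driven by properties along these lines in the Hive plugin audit configuration (ranger-hive-audit.xml); the audit directory below is only an example value, not my real path:

xasecure.audit.destination.hdfs = true
xasecure.audit.destination.hdfs.dir = hdfs://mycluster/ranger/audit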
Hi
The first idea that pops up is:
1. Use HDFS commands to copy your existing structure and data into the layout
the new partition structure needs.
2. Create a new, temporary Hive external table on top of it.
3. (optional) If you created a temporary table, then drop the old one and
insert ... select from the temporary table (a rough sketch follows below).
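A rough sketch of what those steps could look like; all the table names, columns and paths below are made up for illustration:

# 1. lay the existing data out under a partition-style directory
hdfs dfs -mkdir -p /data/mytable_new/dt=2016-05-01
hdfs dfs -cp /data/mytable/* /data/mytable_new/dt=2016-05-01/

-- 2. temporary external table on top of the new layout
CREATE EXTERNAL TABLE mytable_tmp (id INT, payload STRING)
PARTITIONED BY (dt STRING)
LOCATION '/data/mytable_new';
ALTER TABLE mytable_tmp ADD PARTITION (dt='2016-05-01');

-- 3. (optional) replace the old table and load from the temporary one
DROP TABLE mytable;
CREATE TABLE mytable (id INT, payload STRING) PARTITIONED BY (dt STRING);
INSERT INTO TABLE mytable PARTITION (dt='2016-05-01')
SELECT id, payload FROM mytable_tmp WHERE dt='2016-05-01';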
Margu
Looks like this is more a question for the Ranger user list. I disabled Ranger
authorization and now it is working as I expect.
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780
On 16/05/16 12:50, Margus Roo wrote:
Hi
I have a Kerberos and Ranger enabled cluster. Ranger plugins
Hi
I have a Kerberos and Ranger enabled cluster. The Ranger HDFS and Hive plugins
are enabled.
If I create an external table and the location in HDFS is empty or there is
only a small number of files, then the create table works fine via beeline. But
if there are loads of files, for example 10 000 or
.
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780
On 12/05/16 10:41, Margus Roo wrote:
Now I got closer and discovered that my problem is related to
permissions.
For example
drwxr-xr-x - margusja hdfs 0 2016-05-12 03:33 /tmp/files_10k
...
-rw-r--r-- 3
IAccessAllowed(755)) - Error getting
permissions for hdfs://mycluster/tmp/files_10k
java.io.IOException: Couldn't create proxy provider class
org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
I am confused. What extra rights does Hive expect?
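For reference, this is how I inspect the rights from the shell; that the check is done as the hive service user is my assumption:

[margusja@hadoopnn1 ~]$ hdfs dfs -ls -d /tmp/files_10k
[margusja@hadoopnn1 ~]$ hdfs dfs -ls /tmp/files_10k | head
# if the check runs as the hive service user, that user needs at least r-x
# on the directory and r on every file, e.g.:
[margusja@hadoopnn1 ~]$ hdfs dfs -chmod -R o+rX /tmp/files_10k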
Margus (margusja) Roo
http://
hive2://hadoopnn1.estpak.ee:1/def>
So from my point of view beeline for some reason looks at the data and the old
Hive client does not.
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780
On 11/05/16 13:35, Margus Roo wrote:
More information:
2016-05-11 13:31
process.
Also, I have Hive high availability configured.
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780
On 11/05/16 12:26, Margus Roo wrote:
Sadly in our environment:
I generated files like you did.
Connected to: Apache Hive (version 1.2.1.2.3.4.0-
registered as an existing table.
Dr Mich Talebzadeh
LinkedIn
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com
On 11 May 2016 at 06:16, Margus Roo <mailto:mar...@roo.e
Sadly in our environment:
I generated files like you did.
Connected to: Apache Hive (version 1.2.1.2.3.4.0-3485)
Driver: Hive JDBC (version 1.2.1.2.3.4.0-3485)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://hadoopnn1.estpak.ee:2181,hado> create external table
files_10k (i int
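The statement got cut off above; spelled out, it is essentially this (the row format clause and the location are assumptions based on the /tmp/files_10k path mentioned elsewhere in the thread):

CREATE EXTERNAL TABLE files_10k (i INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/tmp/files_10k';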
Hi
Thanks for your answer.
---
At first I create an empty HDFS directory (if the directory is empty I did
not have problems before either).
[margusja@hadoopnn1 ~]$ hdfs dfs -mkdir /user/margusja/trips
[margusja@hadoopnn1 ~]$ beeline -f create_externat_table_trips.hql -u
"jdbc:hive2://hadoopnn1.ex
, Margus Roo wrote:
Hi
Can someone explain or provide documentation on how Hive creates external
tables?
I have a problem creating an external table when I point the location in
HDFS to a directory where there are loads of files. Beeline just hangs or
there are other errors.
When I point
Hi
Can someone explain or provide documentation on how Hive creates external
tables?
I have a problem creating an external table when I point the location in
HDFS to a directory where there are loads of files. Beeline just hangs or
there are other errors.
When I point the location to th
Hi
I found that in my config there was
hive.exec.dynamic.partition = true;
I turned it to false and most of the time I can create the table now, but not
every time.
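For reference, the same switch can also be flipped per session from beeline instead of in hive-site.xml; the nonstrict mode line is only needed if dynamic partitions are turned back on:

SET hive.exec.dynamic.partition=false;
-- or, to re-enable it later for dynamic-partition inserts:
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;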
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780
On 29/02/16 09:27, Margus Roo wrote:
Hi
Can someone
://margus.roo.ee
skype: margusja
+372 51 48 780
On 26/02/16 17:27, Margus Roo wrote:
Basically the question is:
Does Hive check the files in the location before creating the table?
Because if I move the files away before creating the table it works, and after
the table is created I can move the files back and everything works :)
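In practice the workaround looks like this; the paths below are placeholders, not my real directories:

hdfs dfs -mv /tmp/small_files /tmp/small_files_staging
hdfs dfs -mkdir /tmp/small_files
# create the external table over the now-empty /tmp/small_files via beeline, then:
hdfs dfs -mv /tmp/small_files_staging/* /tmp/small_files/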
Margus
/02/16 16:40, Margus Roo wrote:
Hi
I am trying to create an external table and in the location there are 8960
small files.
Every time I am getting something like this:
GC pool 'PS MarkSweep' had collection(s): count=1 time=1672ms
GC pool 'PS Scavenge' had collection(s): count=1 tim
Hi
I am trying to create an external table and in the location there are 8960
small files.
Every time I am getting something like this:
GC pool 'PS MarkSweep' had collection(s): count=1 time=1672ms
GC pool 'PS Scavenge' had collection(s): count=1 time=45ms
2016-02-26 15:18:29,721 INFO
[org.apache.h
"embedded-only" so does not have its own datastore table.
16/02/10 03:34:23 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
"embedded-only" so does not have its own datastore table.
Margus (margusja) Roo
http://margus.r
Hi
I have two servers with the same Hive configuration, bigdata29 and bigdata2.
From bigdata29 I can connect to the metastore successfully:
[hive@bigdata29 ~]$ hive --service metatool -listFSRoot
WARNING: Use "yarn jar" to launch YARN applications.
Initializing HiveMetaTool..
16/02/10 03:34:21 INFO
.
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780
On 09/01/16 17:49, Margus Roo wrote:
Hi
I am trying to use beeline with Hive + Kerberos (Hortonworks sandbox 2.3).
The problem is that I can use HDFS but not beeline, and I do not know
what is wrong.
Console output
Hi
I am trying to use beeline with Hive + Kerberos (Hortonworks sandbox 2.3).
The problem is that I can use HDFS but not beeline, and I do not know
what is wrong.
Console output:
[margusja@sandbox ~]$ kdestroy
[margusja@sandbox ~]$ hdfs dfs -ls /user/
16/01/09 15:45:32 WARN ipc.Client: Exceptio