Hi,
Set the hive.security.authorization.enabled property to false in hive-default.xml.
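For example (a sketch using the standard Hive XML property format), the entry would look like:
<property>
  <name>hive.security.authorization.enabled</name>
  <value>false</value>
</property>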
Thanks,
Ankit
On Fri, Feb 3, 2012 at 11:38 AM, Ronak Bhatt wrote:
> hive> create table t1(t1 string);
>
> Authorization failed:No privilege 'Create' found for outputs {
> database:test1}. Use show grant to get more
Load data inpath will append data to the Hive table; this feature is also
supported in Hive 0.7.*.
Ex: load data local inpath './examples/files/kv1.txt' into table test1;
This command will append the data to the test1 table.
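If you want to replace the existing data rather than append, standard HiveQL also supports OVERWRITE:
hive> load data local inpath './examples/files/kv1.txt' overwrite into table test1;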
On Wed, Jan 18, 2012 at 10:55 AM, Aniket Mokashi wrote:
> how about Load dat
No buddy, we can't do that.
On Mon, Jan 9, 2012 at 3:57 PM, Bhavesh Shah wrote:
> Hello all,
>
> Can we write the Hive Jdbc code in UDF?
>
> --
> Regards,
> Bhavesh Shah
>
>
Hi,
It will use the Hadoop libraries by default. The requirement is that HADOOP_HOME must be set.
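For example (the installation path below is only an assumption for illustration):
$ export HADOOP_HOME=/usr/local/hadoop
$ export PATH=$HADOOP_HOME/bin:$PATH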
Thanks,
Ankit
On Mon, Jan 9, 2012 at 3:30 PM, wrote:
> I did see that, and to try and ensure the right Hadoop libs are being
> used, I’ve created the following entry in hive-site.xml:
>
>
>
It will use the Hadoop libraries by default; the requirement is that Hadoop must
be on the classpath.
On Mon, Jan 9, 2012 at 3:30 PM, wrote:
> I did see that, and to try and ensure the right Hadoop libs are being
> used, I’ve created the following entry in hive-site.xml:
>
>
>
>
>
eption(message:org.apache.hadoop.hbase.MasterNotRunningException:
> localhost:45966
>
> at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:394)
> at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:83)
apache.hadoop.hbase.zookeeper.ZKUtil.createAndFailSilent(ZKUtil.java:886)
> at
> org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:133)
> ... 26 more
>
> )
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
I think your HBase master is not running.
Open the HBase shell and run the command:
hbase> status
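From the command line, that looks like this (assuming HBASE_HOME points to your HBase installation):
$ $HBASE_HOME/bin/hbase shell
hbase(main):001:0> status
If the master is not running, this reports an error instead of the cluster summary.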
On Fri, Dec 2, 2011 at 2:00 PM, Alok Kumar wrote:
> Hi,
>
> Does Hive-Hbase integration require Hbase running in pseudo-distributed
> mode?
>
> I've build my Hadoop following this article
> http://www
>
>
> http://ria101.wordpress.com/2010/01/28/setup-hbase-in-pseudo-distributed-mode-and-connect-java-client/
>
> 127.0.0.1 localhost
> 23.201.99.100 hbase.mycompany.com hbase
>
> so i made the host file at my machine as such. is the host file
> configuration is
Hi,
Please use 127.0.0.1 instead of ubuntu.ubuntu-domain,
or open the HBase shell and run the command 'status'.
On Tue, Nov 29, 2011 at 1:34 PM, shashwat shriparv <
dwivedishash...@gmail.com> wrote:
> I have followed
> https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration my
> hbase is
Please read the above mail as "I think you are right".
On Tue, Oct 25, 2011 at 11:07 AM, Ankit Jain wrote:
> Hi,
>
> I think you are right??
>
> I have one question.
>
> How we can create and switch user in hive??. Your grantor name is hadoop
> and user name i
Hi,
I think you are right??
I have one question.
How can we create and switch users in Hive? Your grantor name is hadoop and the
user name is skrishnan. How do you log in as user skrishnan?
Thanks,
Ankit Jain
On Tue, Oct 25, 2011 at 10:54 AM, Sriram Krishnan wrote:
> The user "skrish
Hi,
Please try to run the following command and view the grant option:
hive> show grant user abc on database default;
output:
database        default
principalName   abc
principalType   USER
privilege       All
grantTime       1319518326
grantor         xyz
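For reference, the grant that produces such an entry would be along these lines (a sketch using the same syntax shown elsewhere in this thread):
hive> grant all on database default to user abc;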
Is skrishnan an ubuntu user, or have you cre
Hi all,
How can we add a user to a group in Hive?
I also went through the following site:
https://cwiki.apache.org/confluence/display/Hive/AuthDev
but didn't find anything about how we can add a user to a group.
Thanks,
Ankit Jain
> FROM user [, user] ...
>
>
>
> https://cwiki.apache.org/confluence/display/Hive/AuthDev#AuthDev-4.3grant%2Frevokestatement
>
> - Alex
>
>
> On Thu, Oct 20, 2011 at 3:45 PM, Ankit Jain wrote:
>
>> Hi Alexander,
>>
>> Thanks for reply..
>>
he.
So, I want to switch the user from abc to apache.
Thanks,
Ankit Jain
On Thu, Oct 20, 2011 at 7:07 PM, Alexander C.H. Lorenz <
wget.n...@googlemail.com> wrote:
> Hi,
>
> did you use the database:
> hive> use test_db;
> hive> create table test (one string
Hi all,
I have created the database test_db and granted all permissions to user 'apache' on
'test_db':
hive> create database test_db;
hive> grant all on database test_db to user apache;
hive> show grant user apache on database test_db;
OK
database        test
principalName   apache
principalType   USER
privi
ver class name for a JDBC metastore
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
  <description>username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>password</value>
</property>
Thanks,
Ankit Jain
On Wed, Oct 19, 2011 at 4:43 PM, Chinna Rao Lalam 72745 <
chinna...@huawei.com> wrote:
ndia) Pvt. Ltd.
> ww.impetus.com
>
>
> On Mon, Oct 17, 2011 at 1:06 PM, Ankit Jain wrote:
>
>> Hi all,
>>
>> I have created the database test_db in hive using command 'create database
>> test_db'. Then i have created the table test_db_tb inside
/
192.168.41.52:1/test_db' but it was using the Hive default database
instead of test_db. How can I specify the Hive database name?
Thanks,
Ankit Jain
Hi Ravindra,
Use hadoop-namenode, hadoop-datanode1, and hadoop-datanode2 in the master and
slave files instead of using the hostnames, and map them in the /etc/hosts file.
Example
master file:
hadoop-namenode
slave file :
hadoop-datanode1
hadoop-datanode2
/etc/hosts file :
IP_of_master hadoop-namenode
IP_of_datanode1 hadoop-datanode1
IP_of_datanode2 hadoop-datanode2
Is the Hive database created in MySQL?
On Tue, Oct 11, 2011 at 5:55 PM, Vikas Srivastava <
vikas.srivast...@one97.net> wrote:
> that also i have done.. i put the mysql connector in that lib
>
> On Tue, Oct 11, 2011 at 5:39 PM, Ankit Jain wrote:
>
>> Hello Vikas,
>>
Hello Vikas,
I think you have to put the MySQL JDBC driver JAR in the lib directory of Hive.
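For example (the connector file name and version below are only illustrative):
$ cp mysql-connector-java-5.1.18-bin.jar $HIVE_HOME/lib/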
Thanks,
Ankit
On Tue, Oct 11, 2011 at 5:18 PM, Vikas Srivastava <
vikas.srivast...@one97.net> wrote:
> hey thanks for reply but already have at that place..
>
> 2011/10/11 Harold(陳春宏)
>
>>
>>
>> You have to add jar file
Hi all,
I have created a new database 'test' in Hive, and I need to authenticate access to this
database.
Thanks,
Ankit
Hi Praveen,
Create the table 'social' before running the script.
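For example (the single-column schema below is only a placeholder; use the columns your script expects):
hive> create table social (line string);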
On Mon, Sep 12, 2011 at 9:24 AM, Vikas Srivastava <
vikas.srivast...@one97.net> wrote:
> remove the ; and check that table is present or not.
>
> On Mon, Sep 12, 2011 at 5:42 PM, Chalcy Raja <
> chalcy.r...@careerbuilder.com> wrot
Hi Siddharth,
I think port 1 is already in use.
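You can check this from the command line (assuming the default HiveServer port 10000):
$ netstat -an | grep 10000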
On Mon, Sep 12, 2011 at 2:28 AM, Siddharth Tiwari wrote:
> Please help. I am trying to start hiveserver using the following command, but
> it throws an error, pls help. I intend to integrate it with pentaho, if anyone hv
> relevant experience in doing
Can you provide me with sample log data?
Thanks,
Ankit
On Fri, Sep 9, 2011 at 2:33 AM, MIS wrote:
> I want to access individual columns from a table created with row delimited
> by RegexSerde.
>
> For example,
> I have created a table as below:
>
> create table test ( col1 STRING, col2 STRING )
> R
Hi all,
I tried to index the LZO file but got the following error while indexing:
java.lang.ClassCastException:
com.hadoop.compression.lzo.LzopCodec$LzopDecompressor cannot be cast to
com.hadoop.compression.lzo.LzopDecompressor
at
com.hadoop.mapreduce.LzoSplitRecordReader.initial
I have used the hadoop-0.20.2 version of Hadoop.
On Wed, Jul 27, 2011 at 1:29 PM, Koert Kuipers wrote:
> i just tried your step 10 and 11 on my setup and it works. See my earlier
> message about my setup.
>
>
> On Wed, Jul 27, 2011 at 9:37 AM, Ankit Jain wrote:
>
>> Hi all,
Hi all,
I tried to index the LZO file but got the following error while indexing:
java.lang.ClassCastException:
com.hadoop.compression.lzo.LzopCodec$LzopDecompressor cannot be cast to
com.hadoop.compression.lzo.LzopDecompressor
I have performed the following steps:
1. $sudo apt-get in
Hi,
Load the data into HBase using the HBase sink. Once the data is stored in HBase,
create a Hive external table corresponding to that HBase table.
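For example, the external table can be declared roughly like this (a sketch following the HBaseIntegration wiki pattern; the table name, column family, and column names are placeholders):
hive> CREATE EXTERNAL TABLE hbase_data (key string, value string)
    > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
    > TBLPROPERTIES ("hbase.table.name" = "my_hbase_table");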
On Sun, Jun 5, 2011 at 1:41 AM, bharath vissapragada <
bharathvissapragada1...@gmail.com> wrote:
> Hey,
>
> Hive tables are nothing but some meta-data overlay on t
Hi,
If the path already exists in HDFS, then you need to perform the following step.
$HADOOP_HOME/bin/hadoop fs -chmod g+w Exist_Path_hdfs
Example:
$HADOOP_HOME/bin/hadoop fs -chmod g+w /com/impetus/data
On Wed, May 18, 2011 at 1:25 AM, Ankit Jain wrote:
> Hi,
>
> At the time installation w
$HADOOP_HOME/bin/hadoop fs -chmod g+w /com/impetus/data
then your data will start being stored in the /com/impetus/data directory.
Regards,
Ankit Jain
On Wed, May 18, 2011 at 12:53 AM, jinhang du wrote:
> hi,
> The default value is "/user/hive/warehouse" in hive.site.xml. After I
> changed the directory t
Hi Stuart,
In the above case, we create a Hive external table, so the actual data resides in
HBase and Hive only contains the metadata. In that case, whenever we
run any query in Hive, it reads the data from HBase.
On Tue, Apr 19, 2011 at 6:17 AM, Ankit Jain wrote:
> Hi stuart,
> If the hbas
Hi Stuart,
If the HBase data is changed directly, Hive will be able to read that data. In
that case, Hive only contains the metadata.
On Tue, Apr 19, 2011 at 3:29 AM, Stuart Scott wrote:
> Hi,
>
>
>
> Wonder if anyone can help please?
>
>
>
> I’ve read that you can use Hive to create HBase ta
Hi,
I am able to launch the map-reduce job (select userid from user) from the Hive
shell. I am also passing the auxpath parameter to the shell (specifying the
Hive/HBase integration related jars).
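For reference, the jars are passed like this (the paths below are only illustrative):
$ hive --auxpath /path/to/hive-hbase-handler.jar,/path/to/hbase.jar,/path/to/zookeeper.jar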
However, when I try to launch the map-reduce job (select userid from User)
from the JDBC client, the map reduce