Not yet!
Sent from my iPhone
> On Feb 26, 2015, at 8:23 PM, Srinivas Thunga
> wrote:
>
> Hi,
>
> Thanks for the prompt response.
>
> Can't I insert the specific columns that I want?
>
> Like we do in Oracle.
>
> Thanks & Regards,
>
> Srinivas T
>
>> On Thu, Feb 26, 2015 at 8:55 PM, Ala
Compile the native libraries and put them on the path.
Sent from my iPhone
> On Feb 17, 2015, at 11:24 PM, Baveja, Ankush wrote:
>
> Hi,
>
> Newbie and trying to do some custom script. Not exactly hive related, so
> kindly bear
>
> When I run hadoop fs -ls, I get the output below
>
> [root@node3 co
Have you looked at HAWQ from Pivotal?
Sent from my iPhone
> On Jan 30, 2015, at 4:27 AM, Samuel Marks wrote:
>
> Since Hadoop came out, there have been various commercial and/or open-source
> attempts to expose some compatibility with SQL. Obviously by posting here I
> am not expecting an un
What's the value set in yarn-site? Can you paste it here?
Sent from my iPhone
> On Aug 16, 2014, at 8:19 PM, "karthik Srivasthava"
> wrote:
>
> So I tried using these
>
> set yarn.nodemanager.resource.memory-mb;
> set yarn.scheduler.maximum-allocation-mb;
>
> Even though, one yarn_tez
Limit the maximum usable memory by Yarn
Sent from my iPhone
> On Aug 16, 2014, at 12:16 PM, "karthik Srivasthava"
> wrote:
>
> I am using Tez as the Hive execution engine. When I run a query, YARN containers
> use all the space available on the node. This doesn't allow another query to
> run.
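For reference, the knobs involved live in yarn-site.xml; a minimal sketch with illustrative values (tune them to your node sizes):

```xml
<!-- yarn-site.xml: illustrative values, adjust for your hardware -->
<property>
  <!-- total memory YARN may allocate to containers on this node -->
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>8192</value>
</property>
<property>
  <!-- largest single container the scheduler will grant -->
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>4096</value>
</property>
```

With caps like these in place, one Tez query can no longer occupy every container slot on the node.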
Sorry Amjad, I am afraid you can't.
Sent from my iPhone
> On May 30, 2014, at 3:12 AM, "Amjad ALSHABANI" wrote:
>
> Hello Everybody,
>
> I know that this question may concern the Hadoop list, but I've made this
> mistake when using Hive.
> I created a new database giving the location on HDFS but I fo
You can write a SerDe to handle the control character.
**
Cheers !!!
Siddharth Tiwari
Have a refreshing day !!!
"Every duty is holy, and devotion to duty is the highest form of worship of
God.”
"Maybe other people will try to limit me but I don't limit myself"
> }
> This is just a very simple example for reference, but we have a complex JSON
> format with a huge amount of data.
>
> So, in this case how can we load it into hive tables and hdfs?
>> On Mar 26, 2014 10:59 PM, wrote:
>> Are you Swagatika Mohanty?
>>
>>
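For the JSON question above, one common route is a SerDe-backed external table. A sketch, assuming the hive-hcatalog JsonSerDe is available; the table name, columns, and location are made up for illustration:

```sql
-- ADD JAR /path/to/hive-hcatalog-core.jar;  -- make the SerDe visible first
CREATE EXTERNAL TABLE events_json (
  id      STRING,
  payload STRING
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION '/user/hive/warehouse/events_json';
```

Each line of the files under that location is then parsed as one JSON object whose top-level keys map to the declared columns.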
Hi Swagatika
You can create external tables to Mongo and process them using Hive. The new
Mongo connectors have added support for Hive. Did you try that?
Sent from my iPhone
> On Mar 26, 2014, at 9:59 AM, "Swagatika Tripathy"
> wrote:
>
> Hi,
> We have some files stored in MongoDB , mostly in k
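A sketch of such an external table using the mongo-hadoop connector's storage handler (the table layout, column mapping, and URI here are illustrative; the connector jars must be on Hive's classpath):

```sql
CREATE EXTERNAL TABLE mongo_users (
  id   STRING,
  name STRING
)
STORED BY 'com.mongodb.hadoop.hive.MongoStorageHandler'
WITH SERDEPROPERTIES (
  'mongo.columns.mapping' = '{"id":"_id","name":"name"}'
)
TBLPROPERTIES (
  'mongo.uri' = 'mongodb://localhost:27017/mydb.users'
);
```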
on: Hive Runtime Error
while processing row (tag=1)
{"key":{"joinkey0":""},"value":{"_col2":"92","_col11":"-60-01-21,00","_col12":"-03-07-04,00"},"alias":1}
at org.apache.hadoop.hiv
.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error
while processing row (tag=1)
{"key":{"joinkey0":""},"value":{"_col2":"
node cluster running CDH 4.3. Please help me locate what the issue could be.
-36_812
I am using hive-10.x, hadoop-2.0.0. I'd appreciate any help in understanding the
issue.
You are welcome !!!
Sent from my iPhone
> On Jan 23, 2014, at 8:59 PM, "prashant gokhale"
> wrote:
>
> Perfect! I was able to drop partition using IGNORE PROTECTION. Thanks
> Siddharth.
>
> -Prashant
>
>
>> On Thu, Jan 23, 2014 at 5:37 PM,
Try the IGNORE PROTECTION clause. I guess you used the NO DROP CASCADE clause
when you created the same.
Sent from my iPhone
> On Jan 23, 2014, at 4:28 PM, "prashant gokhale"
> wrote:
>
> Hello all,
>
> I am working with some existing hive tables and I am having trouble deleting
> partitions. I
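The shape of the statement in question, on a hypothetical table and partition spec:

```sql
-- IGNORE PROTECTION bypasses NO_DROP protection on the partition
-- (Hive 0.x syntax; the clause was removed in later Hive releases)
ALTER TABLE my_table DROP PARTITION (dt='2014-01-23') IGNORE PROTECTION;
```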
Hi Team,
I am getting the following error when I am trying to load a CSV file into my hive
table:
FAILED: Parse Error: line 1:71 character '' not supported here
Can you please explain what this error means and how to resolve it?
What's the easiest way to remove queues from Hadoop without restarting services?
Why can't we just refresh the queues?
Sent from my iPhone
Why don't you try implementing that and contributing it to the community?
Sent from my iPhone
On Jul 17, 2013, at 10:18 PM, "Omkar Joshi" wrote:
> I read of the term ‘JDBC Storage Handler’ at
> https://issues.apache.org/jira/browse/HIVE-1555
>
> The issues seems open but I just want to confirm
Please stop spamming the list.
> D
Sent from my iPhone
Begin forwarded message:
> From: Siddharth Tiwari
> Date: 21 March 2013 7:43:41 AM MST
> To: hive user list
> Subject: Hive accessing metastore
>
> Hi team,
>
> What we have observed is, hive creates so many connections to metastore
> (mysql
possible to limit the maximum number of connections Hive could make to the
MySQL metastore? Or is it possible to limit the maximum number of jobs that can
be submitted to Hive at any time?
Please help
Thanks in advance.
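For reference, a sketch of two knobs that bound the connection side (property names as documented by DataNucleus and MySQL; the values are illustrative):

```xml
<!-- hive-site.xml: cap the DataNucleus connection pool per Hive process -->
<property>
  <name>datanucleus.connectionPool.maxPoolSize</name>
  <value>10</value>
</property>
```

On the MySQL side, max_connections in my.cnf puts a server-wide ceiling on client connections.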
.
Hey Team,
We have huge tables in Mainframe DB2. Can someone tell me if it's possible to
pull data from DB2 on the Mainframe into Hive, then use MapReduce to sort the
data in Hive and push it back to a Mainframe table?
Please help
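One common bridge for this kind of transfer is Sqoop over DB2's JDBC driver; a sketch (host, port, database, table, and user are all placeholders, and the DB2 JDBC jar must be on Sqoop's classpath):

```shell
# import: DB2 -> Hive (placeholders throughout)
sqoop import \
  --connect jdbc:db2://mainframe-host:50000/MYDB \
  --username myuser -P \
  --table MYSCHEMA.MYTABLE \
  --hive-import --hive-table mytable

# export: processed Hive/HDFS data -> back to a DB2 table
sqoop export \
  --connect jdbc:db2://mainframe-host:50000/MYDB \
  --username myuser -P \
  --table MYSCHEMA.MYTABLE_OUT \
  --export-dir /user/hive/warehouse/mytable_out
```

The sort itself would run as the Hive query (or MapReduce job) between the import and the export.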
Can you paste your namenode log here?
There is something you gain and something you lose.
Compression reduces I/O at the cost of increased CPU work. You will also see
different behavior for different tasks, i.e. HDFS read, HDFS write, shuffle
and sort. So whether to go for compression or not depends on your usage.
Sent from my N8
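The era-appropriate switches for compressing intermediate (map-output) data, sketched (property names from Hadoop 0.20.x; the codec choice is an assumption):

```xml
<!-- mapred-site.xml: compress map output to trade CPU for shuffle I/O -->
<property>
  <name>mapred.compress.map.output</name>
  <value>true</value>
</property>
<property>
  <name>mapred.map.output.compression.codec</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec</value>
</property>
```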
Do this:
nano ~/.bashrc
Add the following lines:
export HADOOP_HOME= (path to your hadoop installation directory)
export PATH=$PATH:$HADOOP_HOME/bin
Then run source ~/.bashrc to apply it.
Date: S
Please help me out with the following error, which I encounter on Cygwin:
hive> show tables
> ;
FAILED: Hive Internal Error:
java.lang.IllegalArgumentException(java.net.URISyntaxException: Relative path
in absolute URI: file
03-11-05_208_1818592223695168110)
java.lang.IllegalArgumentException: java.ne
and port, and then run refreshNodes. It should work.
**
Cheers !!!
Siddharth Tiwari
TCS world wide Data warehouse and Analytic Team - Americas
Have a refreshing day !!!
Date: Mon, 19 Sep 2011 18:51:28 +0530
Subject: Re: Decommission of datanode(Urgent)
From: vikas.srivast
rminated at any time by editing the
configuration or the exclude files and repeating the -refreshNodes command.
hope it helps.
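For reference, the moving parts of decommissioning, sketched (the file path is illustrative):

```xml
<!-- hdfs-site.xml on the namenode: point at an exclude file -->
<property>
  <name>dfs.hosts.exclude</name>
  <value>/etc/hadoop/conf/excludes</value>
</property>
```

Add the datanode's hostname to that file, then run hadoop dfsadmin -refreshNodes; the node shows as "Decommission in progress" until its blocks are re-replicated elsewhere.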
Date: Mon, 19 Sep
Dear Mike,
Did you try the same using hive-0.7.1 and hadoop-0.20.2? Try it out once.
Date: Thu, 15 Sep 2011 15:59:00 +0200
Subject: Configuring Hive to automatically create _SUCCESS files
From: michael.g.n
Hi Jasper, I am on Pentaho BI Suite 4.0 and Hive 0.7.1... please help.
Date: Mon, 12 Sep 2011 11:47:06 +0200
Subject: Re: Pentaho issue with hive
From: jasper.knu...@vlc.nl
To: user@hive.apache.org
Hi Siddharth
You can write a MapReduce job to do it for you, leveraging its power of
parallel processing.
Date: Fri, 19 Aug 2011 18:46:22 +0530
Subject: Re: problem in reading data from table (fields terminated
You will have to parse this data accordingly.
Date: Fri, 19 Aug 2011 16:05:21 +0530
Subject: problem in reading data from table (fields terminated by '||')
From: vikas.srivast...@one97.ne
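Hive's FIELDS TERMINATED BY takes a single character, so a two-character delimiter like '||' typically needs either a RegexSerDe or a preprocessing pass. A sketch of the latter in Python (the sample record is made up):

```python
# Rewrite '||'-delimited records to use Ctrl-A (\x01), Hive's default
# field separator, so a plain delimited table can read them.
def convert_delimiters(line: str) -> str:
    return line.replace("||", "\x01")

if __name__ == "__main__":
    sample = "1||vikas||2011-08-19"
    converted = convert_delimiters(sample)
    print(converted.split("\x01"))  # -> ['1', 'vikas', '2011-08-19']
```

Run this over the raw files before the LOAD DATA step, and declare the table with the default separators.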
Okay Ed and Carl, I get the point. The only thing that bothered me was: would
it be able to run on Cygwin? What actually was wrong?
From: c...@cloudera.com
Date: Thu, 18 Aug 2011 13:13:37 -0700
Subject: Re
Hey Carl,
Isn't there any way to enable it? If not, what is this error about? What is
the problem?
From: c...@cloudera.com
Date: Thu, 18 Aug 2011 11:34:03 -0700
Subject: Re: Hive DDL issue
To: user