Hi,
What are the steps one should follow to move Hive from one server to another
along with Hadoop? I've moved my Hadoop master node from one server to another
and then moved my Hive also. I started all my Hadoop nodes successfully but
am getting an error while executing Hive queries. It shows the
The following should help:
http://hive.apache.org/docs/r0.10.0/api/org/apache/hadoop/hive/metastore/api/Table.html
http://hive.apache.org/docs/r0.10.0/api/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.html
~Aniket
On Thu, Jun 27, 2013 at 7:21 AM, Gelesh G Omathil
wrote:
> Hi,
>
> I would
Yes, it will. Have you changed all the env vars, like HADOOP_HOME, HIVE_HOME
etc., according to your new environment?
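For reference, after the move the environment would typically be re-exported along these lines (the paths here are placeholders, not the ones from the original setup):

```shell
# Placeholder install locations -- substitute your actual paths.
export HADOOP_HOME=/opt/hadoop
export HIVE_HOME=/opt/hive
export PATH="$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin"
echo "$HADOOP_HOME $HIVE_HOME"
```

Stale values of these variables pointing at the old server's paths are a common reason queries fail after such a migration.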
Warm Regards,
Tariq
cloudfront.blogspot.com
On Thu, Jun 27, 2013 at 11:43 PM, Manickam P wrote:
> Hi Tariq,
>
> It is a very simple query like select * from table.
> Actually i've
Hi Tariq,
It is a very simple query like select * from table. Actually I've moved my
master node from one server to another. I copied all the Hadoop temp files
and the dfs data directory. Then I installed Hive in the same machine and
configured it. I'm getting an exception while executing the query i
What's your query?
Warm Regards,
Tariq
cloudfront.blogspot.com
On Thu, Jun 27, 2013 at 11:17 PM, Manickam P wrote:
> Hi,
>
> I checked all the nodes. All are up and running. normal hive queries like
> show tables are working fine. Queries with map reduce is throwing exception.
>
>
> Thanks,
>
Hi,
I checked all the nodes. All are up and running. Normal Hive queries like show
tables are working fine. Queries with map reduce are throwing an exception.
Thanks,
Manickam P
From: donta...@gmail.com
Date: Thu, 27 Jun 2013 21:59:22 +0530
Subject: Re: Unable to execute a query in hive
To: user@hi
Hello Manickam,
Please make sure your hadoop daemons are running.
Warm Regards,
Tariq
cloudfront.blogspot.com
On Thu, Jun 27, 2013 at 9:55 PM, Manickam P wrote:
> Hi,
>
> I tried to execute a query in hive but i got the below exception. I dont
> know the reason.
> Please help me to
Hi,
I tried to execute a query in Hive but I got the below exception. I don't know
the reason. Please help me to resolve the issue.
java.net.ConnectException: Call to 192.168.99.33/192.168.99.33:5 failed on
connection exception: java.net.ConnectException: Connection refused at
org.apac
Awesome. Thanks for building that!
On Thu, Jun 27, 2013 at 8:06 AM, David Morel wrote:
> On 26 Jun 2013, at 15:21, Christian Schneider wrote:
>
> Hi,
> is JDBC the only way to connect to HiveServer 2?
>
> I can't find any documentation how to do it with java?
>
> There is a Thrift port open, b
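On the JDBC question raised above, a minimal sketch of that route looks like the following. The driver class and URL scheme are the standard Hive JDBC ones; the host, port, and database are placeholder assumptions (10000 is the usual HiveServer2 default port), and the live connection part is commented out since it needs hive-jdbc on the classpath and a running HiveServer2:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hypothetical sketch of connecting to HiveServer2 over JDBC.
public class Hs2JdbcExample {

    // Builds the standard HiveServer2 JDBC URL.
    static String buildUrl(String host, int port, String db) {
        return "jdbc:hive2://" + host + ":" + port + "/" + db;
    }

    public static void main(String[] args) throws Exception {
        String url = buildUrl("localhost", 10000, "default");
        System.out.println(url); // prints jdbc:hive2://localhost:10000/default

        // Needs hive-jdbc on the classpath and a running HiveServer2,
        // so the live part is commented out:
        // Class.forName("org.apache.hive.jdbc.HiveDriver");
        // try (Connection conn = DriverManager.getConnection(url, "user", "");
        //      Statement st = conn.createStatement();
        //      ResultSet rs = st.executeQuery("SHOW TABLES")) {
        //     while (rs.next()) System.out.println(rs.getString(1));
        // }
    }
}
```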
Well. You got a few suggestions there Peter. That in itself is reason to
celebrate!
And that was a good description and i fault you not for going into some
detail. The part about keeping it simple is always a challenge I know. :)
I get your point but i don't have anything more really to offer.
On 26 Jun 2013, at 15:21, Christian Schneider wrote:
Hi,
is JDBC the only way to connect to HiveServer2?
I can't find any documentation on how to do it with Java.
There is a Thrift port open, but how do I use it?
Best Regards,
Christian.
Generally you can generate some code using thrift (and
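Generating the client stubs from a Thrift IDL is done with the Thrift compiler, roughly like this (you need `thrift` installed; the IDL file name is the hive_service.thrift mentioned in the thread, and the output directory is an assumption):

```shell
# Generate Java client code from the service IDL (paths are placeholders).
thrift --gen java -o gen-src hive_service.thrift
# The generated stubs land under gen-src/gen-java/.
```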
Hi, I'm trying to access HiveServer2 through Thrift with Java. But where is
the IDL to generate the HiveServer2 client?
The only Thrift IDL I find is located here:
* https://github.com/apache/hive/tree/HIVE-4115/service/if/hive_service.thrift
(I don't know what TCLIService.thrift is)
I tried to fi
Is there a way to access Hive metadata in a Hive UDF? E.g., if I wanted to
implement a custom version of the Hive show columns command, like
show_columns(tablename), or use the metadata to perform certain functions on a
table, how does one go about doing it?
Hi,
If you’re suggesting that I use something like
SELECT * FROM data WHERE MyUdf(data.BLOCK__OFFSET__INSIDE__FILE);
rather than
SELECT * FROM data JOIN small ON data.BLOCK__OFFSET__INSIDE__FILE =
small.offset;
then, yes, I have thought of that. However the fact is that reading the
billions o
Hi,
OK. Imagine I've created a Hive table like this:
CREATE TABLE small (...)
STORED AS
INPUTFORMAT 'MyInputFormat'
...;
My class MyInputFormat looks like this:
public void configure(JobConf jc) {
String tableName = "BigTable";
String location = magicFunction(tab
A slightly less "hackish" way to do this without joins is to write a custom
UDF that takes data.BLOCK__OFFSET__INSIDE__FILE as an input parameter and
returns the corresponding data from the small file. If you mark it
"deterministic" using @UDFType(deterministic = true), the performance
should be quite
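A sketch of that idea, with the Hive-specific pieces shown as comments so the lookup logic stands alone (the class name, entries, and file contents are hypothetical; in a real UDF the class would extend org.apache.hadoop.hive.ql.exec.UDF and carry the @UDFType(deterministic = true) annotation):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the lookup a deterministic UDF would perform.
// In Hive this class would extend org.apache.hadoop.hive.ql.exec.UDF and
// be annotated @UDFType(deterministic = true) so results can be cached.
public class OffsetLookup {

    private Map<Long, String> smallFile; // loaded once per task

    private void loadOnce() {
        if (smallFile == null) {
            smallFile = new HashMap<>();
            // In a real UDF you would read the small HDFS file here;
            // these hard-coded entries stand in for that.
            smallFile.put(0L, "first");
            smallFile.put(1024L, "second");
        }
    }

    // evaluate() is the method name Hive's UDF dispatch looks for.
    public String evaluate(Long blockOffset) {
        loadOnce();
        return smallFile.get(blockOffset); // null when the offset is absent
    }
}
```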
Hi,
I have thought about a map-only join, but as I understand it this is still going
to do a full table scan on my large data file. If this is billions of records
then it's
still going to be slow, even if it only returns a handful of records.
Also I don't know of any way to get Hive to do a join
From Alex's blog:
http://mapredit.blogspot.in/2013/05/get-all-extended-hive-tables-with.html
If you want to do it programmatically then you will need to look at
HiveMetaStoreClient.
If both of these are not what you are looking for then, sorry, I will need a
few more details on your question.
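The programmatic route can be sketched as follows. The getTable() and getSd().getLocation() calls come from the metastore API linked earlier in the thread; the database and table names are assumptions, and the live calls are commented out since they need hive-metastore on the classpath and a reachable metastore:

```java
// Hypothetical sketch: reading a table's storage location via
// HiveMetaStoreClient. The live calls are commented out because they
// require a running metastore.
public class MetastoreLookup {

    // Simple helper to qualify a table name with its database.
    static String qualifiedName(String db, String table) {
        return db + "." + table;
    }

    public static void main(String[] args) throws Exception {
        String db = "default";  // assumption
        String table = "small"; // assumption

        // HiveConf conf = new HiveConf();
        // HiveMetaStoreClient client = new HiveMetaStoreClient(conf);
        // Table t = client.getTable(db, table);
        // System.out.println(t.getSd().getLocation());
        // client.close();

        System.out.println(qualifiedName(db, table));
    }
}
```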
On
A few thoughts:
If you have a smaller file (in the size of MBs), have you considered a
map-only join?
Also, if you are interested in particular records from a table and do not
want to go through the entire table to find them, then partitioning + indexing
will be handy.
ORCFile Format (still very new) ca
Hi,
Hopefully a simple question.
Given that I have a table name (as a String) is there an API call that I can
use to obtain the location of the table? Assume that my code is executing
in a custom InputFormat to define the context. (I'm running "inside" a Hive
query so I assume that there's a way
Well, I'm not very good at keeping things brief, unfortunately.
But I'll have a go, trying to keep things simple.
Suppose that I have a data table in Hive and it has many rows - say billions.
I have another file stored in HDFS (it can be a Hive table too if it helps)
and this file is small and con