Re: Hive Join-Query

2012-07-25 Thread Lefty Leverenz
Your query says "JOIN supplier s ON (s.supplierid=v.supplier)" but s.supplierid should be s.supplier_id. Also, the vender schema shows a "quantiry" column, which might just be a typo in the message, but if you cut-and-pasted the schema data into the message then you should change the name to "quantity".
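A minimal sketch of the corrected join, using the table names from the thread (other column names are not shown in the digest, so this is a sketch only):

  -- sketch only; assumes the schemas described in the thread
  SELECT v.*, s.*
  FROM vender v
  JOIN supplier s ON (s.supplier_id = v.supplier);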

Re: HBASE and HIVE Integration

2012-07-25 Thread vijay shinde
Hi Bejoy, I made some changes as per your suggestion. Here is the error from the job page http://0.0.0.0:50030/jobdetails.jsp?jobid=job_201207251858_0004 : Error: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException at java.net.URLClassLoader$1.run(URLClassLoader.java:202) at java.s

Re: HBaseSerDe

2012-07-25 Thread Ted Yu
The ctor is used in TestHBaseSerDe.java, so maybe change it to package-private? On Wed, Jul 25, 2012 at 12:43 PM, kulkarni.swar...@gmail.com < kulkarni.swar...@gmail.com> wrote: > While going through some code for HBase/Hive Integration, I came across > this constructor: > > public HBaseSerDe()

HBaseSerDe

2012-07-25 Thread kulkarni.swar...@gmail.com
While going through some code for HBase/Hive Integration, I came across this constructor: public HBaseSerDe() throws SerDeException { } Basically, the constructor does nothing but declare a thrown exception. The problem is that fixing this now will be a non-passive change. I couldn't really find an obvio

RE: Problem replacing existing Hive file with modified copy

2012-07-25 Thread Connell, Chuck
Gee thanks! That is great service. Chuck From: Bejoy Ks [mailto:bejoy...@yahoo.com] Sent: Wednesday, July 25, 2012 12:04 PM To: user@hive.apache.org Subject: Re: Problem replacing existing Hive file with modified copy The corresponding jira filed to track this bug is 'HIVE-3300' . https://issue

Re: Challenging priority : HIVE-2910 Improve the HWI interface

2012-07-25 Thread Bertrand Dechoux
And now I have to apologize. I was one version behind on Hive (0.8.1), and version 0.9 does include HWI with Bootstrap. The jira must be misleading or I don't understand what the issue is about... Bertrand https://issues.apache.org/jira/browse/HIVE-2910 On Wed, Jul 25, 2012 at 6:15 PM, Bertra

Re: HIVE ERROR

2012-07-25 Thread Bejoy Ks
Hi, It is because of space issues. Issue the 'df -h' command on the TT node that reported this error; the partition used for dfs.data.dir is probably full. Regards Bejoy KS From: abhiTowson cal To: user@hive.apache.org Sent: Wednesday, July 25, 2012 9:48 PM Subjec

Re: Challenging priority : HIVE-2910 Improve the HWI interface

2012-07-25 Thread Bertrand Dechoux
Great answer. Thanks a lot. 1) I understand the concern with branches, but I quickly reviewed the changes for 0.9.1 and not everything seemed to be a bug patch, so I thought: why not ask about HIVE-2910? 2) I wasn't sure about that; it seems logical though. That's great news. I will definitely t

Re: Problem replacing existing Hive file with modified copy

2012-07-25 Thread Bejoy Ks
The corresponding jira filed to track this bug is 'HIVE-3300' . https://issues.apache.org/jira/browse/HIVE-3300 Regards Bejoy KS From: Bejoy Ks To: "user@hive.apache.org" Sent: Wednesday, July 25, 2012 9:28 PM Subject: Re: Problem replacing existing Hive fi

Re: Problem replacing existing Hive file with modified copy

2012-07-25 Thread Bejoy Ks
Hi Connell, It looks like a bug in Hive; I checked with Hive 0.9. If you are loading data from the local fs to Hive tables using 'LOAD DATA LOCAL INPATH' and a file with the same name already exists in the table's location, then the new file will be suffixed with *_copy_1. But if we do the 'LOAD DATA IN
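A rough illustration of the behaviour described above, with hypothetical table and file names:

  -- hypothetical names; illustrates the _copy_1 behaviour described above
  LOAD DATA LOCAL INPATH '/tmp/names2.txt' INTO TABLE names;
  -- loading a local file with the same name again does not replace the old one;
  -- the new file lands in the table directory as names2_copy_1.txt
  LOAD DATA LOCAL INPATH '/tmp/names2.txt' INTO TABLE names;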

Re: Challenging priority : HIVE-2910 Improve the HWI interface

2012-07-25 Thread Edward Capriolo
Generally we only apply patches to trunk, so maintaining branches becomes too much trouble for us. You have to remember that most Hive major versions have no actual major changes. Most everything is hidden behind a query language. The only changes that have to be done carefully are changes to the

Challenging priority : HIVE-2910 Improve the HWI interface

2012-07-25 Thread Bertrand Dechoux
Hi, Here is my stand. Hive provides a DSL to easily explore data contained in Hadoop with only limited Java and MapReduce experience. And the Hive Web Interface provides an easy way to expose it: users need only a browser, and the Hadoop cluster can be well 'fire-walled' because the communication is only th

Problem replacing existing Hive file with modified copy

2012-07-25 Thread Connell, Chuck
I created a Hive table that consists of two files, names1.txt and names2.txt. The table works correctly and answers all queries etc. I want to REPLACE names2.txt with a modified version. I copied the new version of names2.txt to the /tmp/input folder within HDFS. Then I tried the command: hive

Hive Join-Query

2012-07-25 Thread prabhu k
Hi Users, I have 3 tables: vender, supplier and date. Using these tables I'm trying to generate a report like the one below: *Vendor Name, Supplier Name, Year, Quarter, Sum (quantity)* I have executed the query below, but after executing it I'm not getting any result on my console. hive>select v.ve
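A rough sketch of the kind of report query being attempted; the column and key names here are assumptions, since the actual DDL is not shown in the digest:

  -- sketch only; column and key names are assumed, not taken from the real schemas
  SELECT v.vendorname, s.suppliername, d.year, d.quarter, SUM(v.quantity)
  FROM vender v
  JOIN supplier s ON (s.supplier_id = v.supplier)
  JOIN date d ON (d.dateid = v.dateid)
  GROUP BY v.vendorname, s.suppliername, d.year, d.quarter;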

thrift server and CDH issue?

2012-07-25 Thread hadoopman
I recall recently reading somewhere on Cloudera's web site that it was not recommended to run more than one Thrift server connecting to Hive; however, it's been a couple of months since reading this. I'm still digging to find the article and was curious whether perhaps someone here can provide some insight

RE: Loading data into data_dim table

2012-07-25 Thread Bennie Schut
Hi Prabhu, Be careful when going in the direction of calendar dimensions. While strictly speaking this is a cleaner DWH design, you will for sure run into issues you might not expect. Consider: this is probably what you would want to do (roughly) to query a day: select count(*) from fact f j
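The kind of query being alluded to might look roughly like this (table and column names are illustrative only):

  -- with a calendar dimension, even a simple per-day count needs a join
  SELECT COUNT(*) FROM fact f
  JOIN dim_date d ON (f.dateid = d.dateid)
  WHERE d.ddate = '2012-07-25';

  -- versus filtering a date column (or partition) on the fact table directly
  SELECT COUNT(*) FROM fact f
  WHERE f.ddate = '2012-07-25';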

Re: Hive cdh4 and Lzo compression

2012-07-25 Thread Bejoy Ks
Hi Anson, If you have your external table point to a directory that has files compressed using LZO, everything should work as desired as long as you have the LZO codec listed in io.compression.codecs in core-site.xml. Regards Bejoy KS From: Anson Abraham To: user@hive.ap
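A minimal sketch of such an external table, with placeholder schema and path, assuming the LZO codec is already registered in io.compression.codecs:

  -- placeholder names; the files under the location are LZO-compressed text
  CREATE EXTERNAL TABLE logs_lzo (line STRING)
  ROW FORMAT DELIMITED
  LOCATION '/data/logs_lzo';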

Hive cdh4 and Lzo compression

2012-07-25 Thread Anson Abraham
With the release of CDH4, is LZO compression still supported, i.e. if I have my Hive table point to a path of files compressed with LZO? -anson

Re: HBASE and HIVE Integration

2012-07-25 Thread Bejoy Ks
Hi Vijay, You have provided the HBase master directly (it is fine for a single-node HBase installation), but can you still try providing the ZooKeeper quorum instead? If that doesn't work either, please post the error log from the MapReduce tasks. Just go to the jobtracker page and drill down o
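Providing the quorum from the Hive session could look roughly like this (host names are placeholders):

  -- placeholder hosts; points the HBase handler at ZooKeeper instead of the master
  SET hbase.zookeeper.quorum=zkhost1,zkhost2,zkhost3;
  SET hbase.zookeeper.property.clientPort=2181;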

Re: HBASE and HIVE Integration

2012-07-25 Thread kulkarni.swar...@gmail.com
Can you also post the logs from "/tmp//hive.log"? That might contain some info on your job failure. On Wed, Jul 25, 2012 at 8:28 AM, vijay shinde wrote: > Hi Bejoy, > > Thanks for quick reply. Here are some additional details > > Cloudera Version - CDH3U4 > > *hive-site.xml* > ** > * > hive.aux.jars.

Re: HBASE and HIVE Integration

2012-07-25 Thread vijay shinde
Hi Bejoy, Thanks for the quick reply. Here are some additional details. Cloudera Version - CDH3U4. hive-site.xml: hive.aux.jars.path = file:///usr/lib/hive/lib/hive-hbase-handler-0.7.1-cdh3u2.jar,file:///usr/lib/hive/lib/hbase-0.90.4-cdh3u2.jar,file:///usr/lib/hive/lib/zookeeper-3.3.1.jar,file:///

Re: Loading data into data_dim table

2012-07-25 Thread prabhu k
Thanks for your help :) The data has been loaded fine now. select * from dim_date; 7662 2020-12-22 00:00:00.000 2020 4 12 3 52 13 4 357 83 22 3 December Dec Tuesday Tue 7663 2020-12-23 00:00:00.000 2020 4 12 3 5

Re: Loading data into data_dim table

2012-07-25 Thread Bejoy KS
Hi Prabhu, Your data is tab delimited; use /t as the delimiter while creating the table: fields terminated by '/t'. Not sure whether this is the right slash or not; if this doesn't work try the other one. Regards Bejoy KS Sent from handheld, please excuse typos. -Original Message- From: prabhu k Date:
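For reference, a minimal sketch with placeholder columns; the tab escape in the DDL is normally written with a backslash:

  -- placeholder columns; '\t' (backslash-t) denotes the tab character
  CREATE TABLE dim_date_sketch (dateid INT, ddate STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';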

Re: Loading data into data_dim table

2012-07-25 Thread prabhu k
Thanks for the reply. I have tried with delimited fields terminated by '|' and delimited fields terminated by ','; while selecting from the table, in both cases I'm getting null. When I look at the HDFS file it looks like below. bin/hadoop fs -cat /user/hive/warehoure/time.txt 7666 2020-12-26 00:00:00.000 202

RE: Continuous log analysis requires 'dynamic' partitions, is that possible?

2012-07-25 Thread Ruslan Al-fakikh
Bertrand, Sorry, I don't have a link to the msck documentation. I haven't tried it myself, I just heard of it. Thanks From: Bertrand Dechoux [mailto:decho...@gmail.com] Sent: Wednesday, July 25, 2012 1:23 PM To: user@hive.apache.org Subject: Re: Continuous log analysis requires 'dynamic' p

Re: Loading data into data_dim table

2012-07-25 Thread Bertrand Dechoux
What Bejoy is saying implicitly is that the format is not verified by the load command; if it does not match, you will get NULLs. And it would be curious for your comma-separated value (csv) file to be using pipe (|), but why not. Bertrand On Wed, Jul 25, 2012 at 12:45 PM, Bejoy KS wrote: > ** > H

Re: Loading data into data_dim table

2012-07-25 Thread Bejoy KS
Hi Prabhu, Can you cat the file in HDFS and ensure that the fields are delimited by the '|' character? hadoop fs -text user/hive/warehouse/dim_date/time.csv Regards Bejoy KS Sent from handheld, please excuse typos. -Original Message- From: prabhu k Date: Wed, 25 Jul 2012 16:05:42 To: R

Loading data into data_dim table

2012-07-25 Thread prabhu k
Hi Users, I have created the dim_date table like below. The table was created successfully and I then loaded the data into the dim_date table; while selecting from the table, I am getting null values. My input file is the time.csv file. hive> create table dim_date(DateId int,ddate string,Year int,Quarter int,Month_Numbe

Re: HBASE and HIVE Integration

2012-07-25 Thread Bejoy KS
Hi Vijay, Can you share more details, like: the CDH version/Hive version you are using; the steps you followed for Hive-HBase integration, with the values you set; the DDL used for the Hive-HBase integration; and the actual error from the failed map reduce task. Regards Bejoy KS Sent from handheld, please excus

Re: Continuous log analysis requires 'dynamic' partitions, is that possible?

2012-07-25 Thread Bertrand Dechoux
Usage of msck: 'msck table ' and 'msck repair table '. BUT that won't help me. I am using an external table with 'external' partitions (which do not follow Hive conventions). So I first create an external table without a location and then I specify every partition with an absolute location. I don't think ther
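Adding such a partition with an explicit location typically looks something like this (table name, partition key and path are placeholders):

  -- placeholder names; each partition points at its own absolute directory
  ALTER TABLE access_logs ADD PARTITION (dt='2012-07-25')
  LOCATION '/logs/raw/2012/07/25';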

Re: Continuous log analysis requires 'dynamic' partitions, is that possible?

2012-07-25 Thread Bertrand Dechoux
@Puneet Khatod: I found that out, and that's why I am asking here. I guess non-AWS users might have the same problems and a way to solve them. @Ruslan Al-fakikh: It seems great. Is there any documentation for msck? I will find out with the diff file, but is there a wiki page or a blog post about it

HBASE and HIVE Integration

2012-07-25 Thread vijay shinde
I am facing an issue while executing Hive queries with the HBase-Hive integration. I followed the HBase-Hive integration wiki https://cwiki.apache.org/Hive/hbaseintegration.html I have already passed all the required jars for auxpath in the hive-site.xml file. I am using the Cloudera CDH demo VM. Any help would b
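For reference, the table mapping described on that wiki page follows this general shape (the names below are the wiki's placeholders, not the poster's actual DDL):

  -- HBase-backed Hive table via the HBase storage handler
  CREATE TABLE hbase_table_1 (key INT, value STRING)
  STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
  WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
  TBLPROPERTIES ("hbase.table.name" = "xyz");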