RE: Too many open files

2011-01-06 Thread Bennie Schut
In the past I ran into a similar problem which was actually caused by a bug in hadoop. Someone was nice enough to come up with a workaround for this. Perhaps you are running into a similar problem. I also had this problem when calling lots of "load file" commands. After adding this to the hive-s
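(The specific Hive setting referenced above is truncated, so here is only a hedged sketch of the generic OS-level side of such a workaround: checking and raising the open-file limit for the account that runs HiveServer. The user name and limit value are placeholder assumptions, not values from the thread.)

# Check the current per-process open-file limit (often a low default like 1024).
ulimit -Sn

# Raise it for the shell that launches HiveServer.
ulimit -n 65536

# To make it persistent for the account running Hive ("hive" is an assumed name),
# add to /etc/security/limits.conf:
#   hive  soft  nofile  65536
#   hive  hard  nofile  65536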

Re: Hive/Hbase Integration Error

2011-01-06 Thread John Sichi
On Jan 6, 2011, at 9:53 PM, Adarsh Sharma wrote: > I want to know why it occurs in hive.log > > 2011-01-05 15:19:36,783 ERROR DataNucleus.Plugin > (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires > "org.eclipse.core.resources" but it cannot be resolved. > That is a bogus

Re: Hive/Hbase Integration Error

2011-01-06 Thread Adarsh Sharma
John Sichi wrote:
Here is what you need to do:
1) Use svn to check out the source for Hive 0.6
I downloaded the Hive 0.6.0 source code with the command:
svn co http://svn.apache.org/repos/asf/hive/branches/branch-0.6/ hive-0.6.0
2) In your checkout, replace the HBase 0.20.3 jars with the ones f
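(A hedged shell sketch of steps 1-3; the jar file names and their location inside the checkout are assumptions, so locate them with find before deleting anything.)

# 1) Check out the Hive 0.6 branch (URL as given above).
svn co http://svn.apache.org/repos/asf/hive/branches/branch-0.6/ hive-0.6.0
cd hive-0.6.0

# 2) Find the bundled HBase 0.20.3 jars and swap in the 0.20.6 jars from the
#    running HBase installation (paths below are placeholders).
find . -name 'hbase-0.20.3*.jar'
find . -name 'hbase-0.20.3*.jar' -delete
cp /path/to/hbase-0.20.6/hbase-0.20.6.jar      hbase-handler/lib/
cp /path/to/hbase-0.20.6/hbase-0.20.6-test.jar hbase-handler/lib/

# 3) Build from source (Hive 0.6 built with ant).
ant clean package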

Re: Too many open files

2011-01-06 Thread Terje Marthinussen
No, the problem is connections to datanodes on port 50010. Terje On Fri, Jan 7, 2011 at 11:46 AM, Shrijeet Paliwal wrote: > You mentioned that you got the code from trunk so fair to assume you > are not hitting https://issues.apache.org/jira/browse/HIVE-1508 > Worth checking still. Are all the o
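(A hedged sketch of how one might confirm that the leaked descriptors really are DataNode sockets on port 50010 rather than regular files; locating HiveServer with pgrep is an assumption about how it was started.)

HIVE_PID=$(pgrep -f HiveServer | head -n 1)

# Total descriptors held by the HiveServer process.
lsof -p "$HIVE_PID" | wc -l

# Only the TCP connections to port 50010, with their state; a pile of
# CLOSE_WAIT entries would point at unclosed DFS output streams.
lsof -a -p "$HIVE_PID" -i TCP:50010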

Re: Too many open files

2011-01-06 Thread Shrijeet Paliwal
You mentioned that you got the code from trunk, so it's fair to assume you are not hitting https://issues.apache.org/jira/browse/HIVE-1508. Worth checking still. Are all the open files hive history files (they look like hive_job_log*.txt)? Like Viral suggested, you can check that by monitoring open fil
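(A hedged sketch of that check: count how many of the server's open files are hive history files; finding the process via pgrep is again an assumption.)

HIVE_PID=$(pgrep -f HiveServer | head -n 1)

# All open descriptors vs. those that are hive_job_log files.
lsof -p "$HIVE_PID" | wc -l
lsof -p "$HIVE_PID" | grep -c 'hive_job_log'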

how to load data into a array of array column

2011-01-06 Thread wd
hi, I have a file like:
1000^A1,2,3,4,5^B4,5,6,7,8^B4,5,6,9,7
I expect it to create a row like:
col1  col2
1     [[1,2,3,4,5],[4,5,6,7,8],[4,5,6,9,7]]
so we can select it like "select col2[2][1] from t1" and the result should be "4". The table can be created by sql: create table t1 ( col1 int,
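(A hedged sketch of the DDL side of this question, wrapped in a hive -e call; the input path is a placeholder. Hive's default delimiters are \001 (^A) for fields and \002 (^B) for the first collection level, which matches the sample line, but the comma-separated inner level is the open part: plain DDL only lets you name one collection delimiter, so the file may need preprocessing, e.g. commas rewritten to \003, or a custom SerDe.)

hive -e "
CREATE TABLE t1 (
  col1 INT,
  col2 ARRAY<ARRAY<INT>>
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\001'
  COLLECTION ITEMS TERMINATED BY '\002';

LOAD DATA LOCAL INPATH '/tmp/t1.txt' OVERWRITE INTO TABLE t1;
"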

Re: Too many open files

2011-01-06 Thread Viral Bajaria
Hi Terje, I have asked about this issue in an earlier thread but never got any response. I get this exception when I am using Hive over Thrift and submitting 1000s of LOAD FILE commands. If you actively monitor the open file count of the user under which I run the hive instance, it keeps on creep
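(A hedged sketch of the kind of monitoring described above: sample the descriptor count for the account running Hive once a minute and see whether it keeps growing; the user name "hive" is an assumption.)

while true; do
  printf '%s %s\n' "$(date '+%H:%M:%S')" "$(lsof -u hive | wc -l)"
  sleep 60
done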

Too many open files

2011-01-06 Thread Terje Marthinussen
Hi, While loading some 10k+ .gz files through HiveServer with LOAD FILE etc. etc.
11/01/06 22:12:42 INFO exec.CopyTask: Copying data from file:XXX.gz to hdfs://YYY
11/01/06 22:12:42 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.SocketException: Too many open files
11/01/06 22

Re: MySQL Metastore migration

2011-01-06 Thread David Burley
Chris, I've recently converted a metastore from derby to MySQL. It wasn't pain free, but we made it through without major issues. Here is what I recommend you try; though I make no promises it'll work for anyone else:
1. Backup your metastore files
2. Install razorsql (http://www.razorsql.com/)
3
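(A hedged sketch around steps 1-2 and the eventual hive-site.xml switch; the RazorSQL export/import itself is a GUI step and is not scripted here, and every path, host name, and credential below is a placeholder.)

# 1. Back up the embedded Derby metastore before touching anything.
cp -r /path/to/metastore_db /path/to/metastore_db.backup.$(date +%Y%m%d)

# Create the target MySQL database and user for the migrated metastore.
mysql -u root -p <<'SQL'
CREATE DATABASE hive_metastore;
CREATE USER 'hiveuser'@'%' IDENTIFIED BY 'hivepass';
GRANT ALL PRIVILEGES ON hive_metastore.* TO 'hiveuser'@'%';
SQL

# Afterwards point Hive at MySQL in hive-site.xml by setting:
#   javax.jdo.option.ConnectionURL        = jdbc:mysql://mysql-host:3306/hive_metastore
#   javax.jdo.option.ConnectionDriverName = com.mysql.jdbc.Driver
#   javax.jdo.option.ConnectionUserName   = hiveuser
#   javax.jdo.option.ConnectionPassword   = hivepass
# and put the MySQL JDBC driver jar on Hive's classpath (e.g. its lib directory).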

Re: HIVE ODBC test fails at testing with isql

2011-01-06 Thread Ning Zhang
I guess libodbc.so is the one you renamed from libodbchive.so? If so can you ldd libodbc.so and see what other .so files are linked? You can follow the links and do the ldd check for all necessary .so files (libodbchive.so, libhiveclient.so and libthrift.so). Also please make sure all these .so
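(A hedged sketch of that dependency check; /usr/local/lib is only an assumed install location for the Hive ODBC and unixODBC libraries.)

cd /usr/local/lib

# Look for "not found" entries in each library's dependency list.
ldd libodbc.so
ldd libodbchive.so
ldd libhiveclient.so
ldd libthrift.so

# Make sure the directory holding these .so files is on the runtime loader path.
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH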

Re: Hive/Hbase Integration Error

2011-01-06 Thread John Sichi
Here is what you need to do:
1) Use svn to check out the source for Hive 0.6
2) In your checkout, replace the HBase 0.20.3 jars with the ones from 0.20.6
3) Build Hive 0.6 from source
4) Use your new Hive build
JVS
On Jan 6, 2011, at 2:34 AM, Adarsh Sharma wrote: > Dear all, > > I am sorry

Hive/Hbase Integration Error

2011-01-06 Thread Adarsh Sharma
Dear all, I am sorry for posting this message again, but I haven't been able to locate the root cause after googling a lot. I have been trying Hive/HBase integration for the past 2 days. I am facing the below issue while creating an external table in Hive. I am using hadoop-0.20.2, hbase-0.20.6, hive-0.6.0
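(For context, a hedged sketch of the usual wiki-style Hive/HBase invocation and external-table DDL this thread is about; the jar paths, HBase master address, and table/column-family names are all placeholders rather than values from the thread.)

hive --auxpath /path/to/hive_hbase-handler.jar,/path/to/hbase-0.20.6.jar,/path/to/zookeeper.jar \
     -hiveconf hbase.master=hbase-master:60000 -e "
CREATE EXTERNAL TABLE hbase_ext (key INT, value STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf1:val')
TBLPROPERTIES ('hbase.table.name' = 'existing_hbase_table');
"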

Re: Can't drop table

2011-01-06 Thread wd
11/01/06 18:20:14 INFO metastore.HiveMetaStore: 0: get_table : db=default tbl=t1
11/01/06 18:20:14 INFO metastore.HiveMetaStore: 0: drop_table : db=default tbl=t1
11/01/06 18:20:14 INFO metastore.HiveMetaStore: 0: get_table : db=default tbl=t1
11/01/06 18:20:14 DEBUG metastore.ObjectStore: Executin

Re: Can't drop table

2011-01-06 Thread Carl Steinbach
The best first step is to enable logging to the console and then try the operation again through the CLI:
hive -hiveconf hive.root.logger=INFO,console
or, if you want even more logging info, try:
hive -hiveconf hive.root.logger=DEBUG,console
Thanks. Carl
On Thu, Jan 6, 2011 at 1:29 AM, wd wrote:

Re: HIVE ODBC test fails at testing with isql

2011-01-06 Thread vaibhav negi
Hi, Thanks for the tips, Carl, but it still doesn't run. I will try it on another server some other day. Thanks and Regards Vaibhav Negi On Thu, Jan 6, 2011 at 1:10 PM, Carl Steinbach wrote: > Hi Vaibhav, > > ror output when i run isql hive > > > > [-]SQL_SUCCESS > > [-]hEnv = $

Can't drop table

2011-01-06 Thread wd
hi, I've set up a single-node hadoop and hive. I can create tables in hive, but can't drop them; the hive cli just hangs there with no further info.
hive-0.6.0-bin
hadoop-0.20.2
jre1.6.0_23
postgresql-9.0-801.jdbc4.jar (have tried postgresql-8.4-701.jdbc4.jar)
pgsql 9.0.2
How to find what's wrong hap