So we got it, I hope!
We did take care of the ulimit max-open-files thing (e.g. 1.3.1.6.1. ulimit on
Ubuntu: http://hbase.apache.org/book/notsoquick.html). But after the switch
from "native" Hadoop to the Cloudera distribution CDH3u0 we forgot to do
this for the users "hdfs", "hbase" AND "mapred".
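For the record, under CDH3 each daemon runs as its own user, so the nofile limit has to be raised for all of them rather than for a single "hadoop" user. Roughly what that looks like in /etc/security/limits.conf (32768 is only the example value the HBase book uses; pam_limits must be active, e.g. "session required pam_limits.so" in /etc/pam.d/common-session on Ubuntu):

    # raise max open files for the CDH3 daemon users
    hdfs    -    nofile    32768
    mapred  -    nofile    32768
    hbase   -    nofile    32768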
It seems that the more dynamic partitions an import creates, the less data I am
able to import, or rather the smaller the input files have to be.
Any clues?
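A guess, in case it is Hive's own limits rather than ulimit this time: Hive caps how many dynamic partitions and how many files a single statement may create, and the defaults are fairly low. If the failures start once the partition count grows, raising these session settings could be worth a try (the property names are standard Hive ones, the values below are only examples):

    -- upper bounds checked while a statement creates dynamic partitions
    SET hive.exec.max.dynamic.partitions=10000;
    SET hive.exec.max.dynamic.partitions.pernode=1000;
    -- upper bound on the total number of files one MapReduce job may create
    SET hive.exec.max.created.files=150000;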
Original message
> Date: Wed, 13 Jul 2011 09:45:27 +0200
> From: "labtrax"
> To: user@hive.apache.org
>
Hi,
I always get
java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException:
Hive Runtime Error while processing row (tag=0)
{"key":{},"value":{"_col0":"1129","_col1":"Campaign","_col2":"34811433","_col3":"group","_col4":"1271859453","_col5":"Soundso","_col6":"93709590","_col
Hello,
I can't import files with dynamic partitioning. The query looks like this:

    FROM cost c
    INSERT OVERWRITE TABLE costp PARTITION (accountId, day)
    SELECT c.clientId, c.campaign, c.accountId, c.day
    DISTRIBUTE BY c.accountId, c.day

Strange thing is: sometimes it works, sometimes the mapred job fails with something like the exception above.
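For anyone hitting the same pattern: with both partition columns dynamic (no static value in the PARTITION clause), the session needs dynamic partitioning enabled and set to nonstrict mode, which is presumably already the case here since it sometimes works. For completeness, these are the standard settings to run before the INSERT above:

    -- allow dynamic partitions, and allow every partition column to be dynamic
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;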
I tracked the problem down myself, have a look here:
https://groups.google.com/a/cloudera.org/group/cdh-user/browse_frm/thread/40e01b40584da107/5d2fb9f2d36d66c6?tvc=1&q=labtrax#5d2fb9f2d36d66c6
The metastore is not actually the problem here; I have already configured the Hive
metastore with MySQL, and I use Hive via JDBC. The question is: what does "The
HiveServer is currently single threaded, which could present serious use
limitations" mean? Might the results be wrong because multiple queries run at
the same time?
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:80)
at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:84)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:110)
at java.sql.DriverManager.getConnection(DriverManager.java:582)
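The HiveServerHandler constructor in the first frame suggests the connection is being opened in embedded mode; as far as I know the driver only constructs that handler when the JDBC URL has no host part. For comparison, a rough sketch of talking to a standalone HiveServer over JDBC instead; the host name, port and query are made up, and since the server is single threaded it is safest to send it one statement at a time:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveJdbcSketch {
        public static void main(String[] args) throws Exception {
            // HiveServer1 driver, the same class that appears in the stack trace
            Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");

            // Hypothetical host/port; an empty host ("jdbc:hive://") runs Hive embedded instead
            Connection con =
                DriverManager.getConnection("jdbc:hive://hiveserver-host:10000/default", "", "");
            Statement stmt = con.createStatement();

            // Keep statements sequential: HiveServer handles one request at a time
            ResultSet rs = stmt.executeQuery("SELECT * FROM cost LIMIT 10");
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
            rs.close();
            stmt.close();
            con.close();
        }
    }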
Any clues?
Thanks
labtrax
/hive-hbase-handler-0.7.0-cdh3u0.jar,/usr/lib/hive/lib/hbase-0.90.1-cdh3u0.jar,/usr/lib/hive/lib/zookeeper-3.3.1.jar
I already tried different configs like file:///... or file:///${HIVE_HOME}.
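One observation, a guess rather than a verified fix: ${HIVE_HOME} is an environment variable, and Hadoop-style configuration files only expand system properties and other configuration properties, so a value like file:///${HIVE_HOME}/... is probably never expanded at all. I would try absolute paths passed through the hive script's --auxpath option (or the HIVE_AUX_JARS_PATH environment variable) instead, roughly like this, assuming the handler jar also lives in /usr/lib/hive/lib:

    # comma-separated absolute jar paths, no file:// scheme
    hive --auxpath /usr/lib/hive/lib/hive-hbase-handler-0.7.0-cdh3u0.jar,/usr/lib/hive/lib/hbase-0.90.1-cdh3u0.jar,/usr/lib/hive/lib/zookeeper-3.3.1.jar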
Any clues?
Thanks in advance
labtrax
Cheers
labtrax
Hello,
I have a Hadoop cluster running with the hadoop-append jar
(hadoop-core-0.20-append-r1056497-core.jar) for HBase reasons.
I tried Hive 0.6.0 and 0.7.0, and for both, when I start it, I get
Exception in thread "main" java.lang.RuntimeException: Could not load shims
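If I remember right, that error comes from Hive's ShimLoader: it chooses the shim class from the version string reported by the Hadoop jar on the classpath, and the append builds can report a version string the loader does not recognise. A quick diagnostic (not a fix) is to print that string with the append jar on the classpath; org.apache.hadoop.util.VersionInfo is the standard Hadoop class involved:

    import org.apache.hadoop.util.VersionInfo;

    public class PrintHadoopVersion {
        public static void main(String[] args) {
            // Hive derives the shim to load from this version string
            System.out.println("version  = " + VersionInfo.getVersion());
            System.out.println("revision = " + VersionInfo.getRevision());
            System.out.println("branch   = " + VersionInfo.getBranch());
        }
    }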
Thanks John, I will give it a try. Hopefully it will work; I'll report back
here later.
> Original message
> Date: Mon, 11 Apr 2011 18:14:28 +
> From: John Sichi
> To: ""
> Subject: Re: hive hbase and hadoop versions
>