Hi Ankit,
I got this in my Java mapper code:
String oldSeparator = "þ"; // the thorn as Java sees it
String newSeparator = "~";
In Eclipse it shows as �, which is the standard Java way of saying "I don't
know this multibyte character".
When you copy-paste this � to the Linux shell it depicts as
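One way to dodge the source-file encoding problem entirely is to write the thorn as the Unicode escape \u00FE instead of pasting the literal character into the .java file. A minimal sketch (the class and method names here are mine for illustration, not from the original mapper):

```java
public class SeparatorFix {
    // \u00FE is þ (LATIN SMALL LETTER THORN). Using the escape keeps the
    // source file pure ASCII, so neither Eclipse nor javac can mangle it.
    private static final String OLD_SEPARATOR = "\u00FE";
    private static final String NEW_SEPARATOR = "~";

    // Replace every thorn separator in a record with a tilde.
    public static String normalize(String line) {
        return line.replace(OLD_SEPARATOR, NEW_SEPARATOR);
    }

    public static void main(String[] args) {
        System.out.println(normalize("a\u00FEb\u00FEc")); // a~b~c
    }
}
```

The same escape works directly inside a mapper's map() method, since it is resolved at compile time.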
Hi Jasper,
could you please share your MR program.
I am not able to grab this character
Ankit
Hi Ankit,
It all depends on your environment, locale, and encoding. This proved to
work in my case, and I believe I have seen your characters as well, but
after all it is not your browser that has to do the work and interpret the
multibyte character. That is the main problem with the thorn; ever
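The locale dependence is easy to demonstrate: the single byte 0xFE is þ in ISO-8859-1 but is not a valid UTF-8 sequence, so the very same input bytes decode differently depending on the charset in effect. A small sketch (assumes Java 7+ for StandardCharsets):

```java
import java.nio.charset.StandardCharsets;

public class CharsetDemo {
    public static void main(String[] args) {
        byte[] raw = { (byte) 0xFE }; // þ in ISO-8859-1 / Latin-1

        // Decoding the same byte under two charsets gives different results.
        String latin1 = new String(raw, StandardCharsets.ISO_8859_1);
        String utf8   = new String(raw, StandardCharsets.UTF_8);

        System.out.println(latin1); // þ
        System.out.println(utf8);   // � (U+FFFD: 0xFE alone is invalid UTF-8)
    }
}
```

This is why the thorn survives on one machine and turns into � on another: whichever tool decodes the bytes applies its own default charset.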
Hi Jasper,
How did you find 'þ'?
My browser shows this - �
Ankit
Try one of these suggestions:
(1) run HBase and Hive in separate clusters (downside is that map/reduce tasks
will have to issue remote request to region servers whereas normally they could
run on the same nodes)
(2) debug the shim exception and see if you can contribute a patch that makes
Hive
Hello,
it seems that Hive 0.6 and 0.7 are incompatible with the hadoop-append jar from
HBase 0.90.2. But without the append jar you cannot use HBase in production...
Any advice for the hadoop/hbase/hive version jungle? I already asked this last
month but I didn't get a reasonable answer.
Cheers
l
Anyone have an idea on this?
Anyone using compression with sequence files successfully?
The wiki and Hadoop: The Definitive Guide suggest that the below is
correct, so I am at a loss to explain what we are seeing.
Tom
On Fri, May 6, 2011 at 5:39 PM, Tom Hall wrote:
> I have read http://wiki.a
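For what it's worth, output compression for sequence files is usually switched on through the job configuration. A sketch of the 0.20-era property names (worth double-checking against the wiki page, since I am quoting these from memory):

```xml
<!-- mapred-site.xml or per-job configuration -->
<property>
  <name>mapred.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapred.output.compression.type</name>
  <!-- BLOCK compresses groups of records and usually gives the best ratio -->
  <value>BLOCK</value>
</property>
<property>
  <name>mapred.output.compression.codec</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec</value>
</property>
```

If the files come out uncompressed despite these settings, it is worth verifying that the output format is actually SequenceFileOutputFormat and that the codec's native libraries are on the task nodes.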
I have been seeing this a lot on my cluster as well.
This typically happens for me if there are many maps (more than 5000) in a job.
Here is my cluster summary:
342316 files and directories, 94294 blocks = 436610 total. Heap Size is 258.12 MB / 528 MB (48%)
Configured Capacity : 26.57 TB
DFS
Hello,
I have a hadoop cluster running with the hadoop_append-jar
(hadoop-core-0.20-append-r1056497-core.jar) for hbase reason.
I tried Hive 0.6.0 and 0.7.0, and for both, when I start it I get:
Exception in thread "main" java.lang.RuntimeException: Could not load shims in
class null
Does Hive support the SELECT ... LIMIT N,M statement and HAVING in Hive 0.7?
Thanks,
LiuLei