Re: Custom Serde with thorn

2011-05-09 Thread Jasper Knulst
Hi Ankit, this is what I have in my Java mapper code:

String oldSeperator = "�"; // the thorn as Java sees it
String newSeperator = "~";

In Eclipse it shows as �, which is the standard Java way of saying "I don't know this multibyte character". When you copy-paste this � to the Linux shell it depicts as
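A minimal sketch of the kind of mapper-side delimiter replacement Jasper describes. The class and method names here are illustrative, not his actual code; the point is that writing the thorn as the Unicode escape \u00FE keeps the source file pure ASCII, so neither the editor nor the compiler's source encoding can mangle the character:

```java
public class DelimiterFix {

    /** Replace every thorn (U+00FE) delimiter in a record with '~'. */
    public static String fixDelimiters(String line) {
        // \u00FE is LATIN SMALL LETTER THORN; using the escape avoids
        // any dependence on the editor's or platform's encoding.
        String oldSeparator = "\u00FE";
        String newSeparator = "~";
        return line.replace(oldSeparator, newSeparator);
    }

    public static void main(String[] args) {
        String input = "field1\u00FEfield2\u00FEfield3";
        System.out.println(fixDelimiters(input)); // prints field1~field2~field3
    }
}
```

Inside a real Mapper, fixDelimiters would be applied to each input value before splitting on the new delimiter.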

Re: Custom Serde with thorn

2011-05-09 Thread ankit bhatnagar
Hi Jasper, could you please share your MR program? I am not able to grab this character. Ankit

Re: Custom Serde with thorn

2011-05-09 Thread Jasper Knulst
Hi Ankit, it all depends on your environment, locale, and encoding. This proved to work in my case, and I believe I have seen your characters as well, but in the end it is not your browser that has to do the work and interpret the multibyte character. That is the main problem with the thorn; ever
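Jasper's point about environment and locale can be made concrete: decoding bytes with the platform-default charset gives different results on different machines, so the charset should be named explicitly. A small sketch (assumed, not from the thread) decoding the UTF-8 bytes of the thorn:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStreamReader;

public class ExplicitCharset {
    public static void main(String[] args) throws Exception {
        // The thorn (U+00FE) encoded as UTF-8 is the two bytes 0xC3 0xBE.
        byte[] data = {(byte) 0xC3, (byte) 0xBE};

        // Name the charset explicitly instead of relying on the platform
        // default, so the result does not depend on the machine's locale.
        BufferedReader r = new BufferedReader(new InputStreamReader(
                new ByteArrayInputStream(data), "UTF-8"));
        String s = r.readLine();
        System.out.println((int) s.charAt(0)); // prints 254
    }
}
```

Had the same two bytes been decoded as ISO-8859-1, they would come out as the two characters 'Ã' and '¾' instead of a single thorn, which is exactly the kind of silent corruption being debugged in this thread.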

Re: Custom Serde with thorn

2011-05-09 Thread ankit bhatnagar
Hi Jasper, how did you find 'þ'? My browser shows this: � Ankit
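Since the two correspondents literally cannot see the same glyph, the reliable move is to print the character's numeric code point rather than trust any browser or terminal rendering. A hedged sketch (not from the thread):

```java
public class CharInspect {
    public static void main(String[] args) {
        char thorn = '\u00FE'; // þ, LATIN SMALL LETTER THORN

        // Print the numeric code point; a number survives any display
        // encoding, while the glyph itself may render as � or worse.
        System.out.printf("U+%04X (%d)%n", (int) thorn, (int) thorn);
        // prints: U+00FE (254)
    }
}
```

Comparing code points (U+00FE vs. U+FFFD, the replacement character) would tell immediately whether the data holds a real thorn or an already-corrupted byte.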

Re: hadoop, hive and hbase problem

2011-05-09 Thread John Sichi
Try one of these suggestions:
(1) Run HBase and Hive in separate clusters (downside: map/reduce tasks will have to issue remote requests to region servers, whereas normally they could run on the same nodes).
(2) Debug the shim exception and see if you can contribute a patch that makes Hive

Re: hadoop, hive and hbase problem

2011-05-09 Thread labtrax
Hello, it seems that Hive 0.6 and 0.7 are incompatible with the hadoop-append jar from HBase 0.90.2. But without the append jar you cannot use HBase in production... Any advice for the Hadoop/HBase/Hive version jungle? I already asked this last month but didn't get a reasonable answer. Cheers l

Re: Sequence File Compression

2011-05-09 Thread Tom Hall
Anyone have an idea on this? Is anyone using compression with sequence files successfully? The wiki and "Hadoop: The Definitive Guide" suggest that the below is correct, so I am at a loss to explain what we are seeing. Tom On Fri, May 6, 2011 at 5:39 PM, Tom Hall wrote: > I have read http://wiki.a

Re: So many unexpected "Lost task tracker" errors making the job to be killed Options

2011-05-09 Thread Shantian Purkad
I have been seeing this a lot on my cluster as well. This typically happens for me when a job has many maps (more than 5000). Here is my cluster summary: 342316 files and directories, 94294 blocks = 436610 total. Heap Size is 258.12 MB / 528 MB (48%). Configured Capacity: 26.57 TB DFS

hadoop, hive and hbase problem

2011-05-09 Thread hive1
Hello, I have a Hadoop cluster running with the hadoop-append jar (hadoop-core-0.20-append-r1056497-core.jar) for HBase reasons. I tried Hive 0.6.0 and 0.7.0, and for both, when I start it I get: Exception in thread "main" java.lang.RuntimeException: Could not load shims in class null

SELECT ... LIMIT N,M and Having statement

2011-05-09 Thread lei liu
Does Hive support the SELECT ... LIMIT N,M statement and the HAVING clause in Hive 0.7? Thanks, LiuLei