Your HBase team is pleased to announce the release of HBase 0.92.1.
Download it from your favorite Apache mirror [1]. It should also be available
in Maven.
HBase 0.92.1 is a bug fix release. For a complete list of changes, see the
release notes [2].
If upgrading from 0.90.x to 0.92.1, see t
On Sat, Mar 17, 2012 at 1:06 AM, yonghu wrote:
> Hello,
>
> I have used the command ./hbase
> org.apache.hadoop.hbase.mapreduce.Export 'test'
> http://localhost:8020/test to export the data content from the test
> table. And I can see the exported content from my hdfs folder,
> hdfs://localhost/te
On Sat, Mar 17, 2012 at 3:19 AM, Jiajun Chen wrote:
> 2012-03-17 18:17:22,793 ERROR
> org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler: Failed open
> of region=UrlMapDocRubbish,,1315295663867.920f6b6878106d1c673f898a4aeb9df8.
> java.io.IOException: Compression algorithm 'lzo' previou
Hello,
A couple of days ago, I asked about strange behavior in my
"Scan.addFamily reduces results" thread.
I want to confirm whether I have in fact found a bug and, if so, how to submit
a bug report.
The basic strangeness is that changing the amount of caching changes
the number of results. In the o
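A minimal sketch of the kind of comparison being described (the table name 'test' and column family 'cf' below are placeholders, not taken from the original thread): scan the same table twice with different caching values and compare the row counts, which should of course be identical.

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.hbase.HBaseConfiguration;
  import org.apache.hadoop.hbase.client.HTable;
  import org.apache.hadoop.hbase.client.Result;
  import org.apache.hadoop.hbase.client.ResultScanner;
  import org.apache.hadoop.hbase.client.Scan;
  import org.apache.hadoop.hbase.util.Bytes;

  public class CachingCountCheck {

    // Count the rows a full scan returns with the given scanner caching value.
    static long countRows(HTable table, int caching) throws Exception {
      Scan scan = new Scan();
      scan.addFamily(Bytes.toBytes("cf"));   // placeholder family name
      scan.setCaching(caching);
      ResultScanner scanner = table.getScanner(scan);
      long rows = 0;
      for (Result r : scanner) {
        rows++;
      }
      scanner.close();
      return rows;
    }

    public static void main(String[] args) throws Exception {
      Configuration conf = HBaseConfiguration.create();
      HTable table = new HTable(conf, "test");   // placeholder table name
      // Both counts should match; if they differ, caching is changing the results.
      System.out.println("caching=1:   " + countRows(table, 1));
      System.out.println("caching=100: " + countRows(table, 100));
      table.close();
    }
  }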
Hi, Harsh.
Thank you for your reply and I understand.
I don't think it's that important right now, but if I decide it is, I'll file a
JIRA.
Anyway, thank you very much!
2012/3/17 Harsh J
> No, "Über" jars are not supported. The jar is merely injected onto the
> classpath directly.
>
> Please file a
No, "Über" jars are not supported. The jar is merely injected onto the
classpath directly.
Please file a JIRA if you consider it important that HBase coprocessor
hosts support Über jar loading. In the meantime, just create
exploded library jars, with all class files residing in the root, and
not
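One way to build such a flat ("exploded") jar with the stock unzip/jar tools; the file names below are hypothetical:

  mkdir exploded && cd exploded
  unzip -o ../my-coprocessor.jar      # your coprocessor classes
  unzip -o ../some-dependency.jar     # each dependency, unpacked into the same tree
  jar cf ../my-coprocessor-with-deps.jar .

All class files then sit at the root of the resulting jar rather than inside nested jars, which is what the advice above is asking for.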
2012-03-17 18:17:22,793 ERROR
org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler: Failed open
of region=UrlMapDocRubbish,,1315295663867.920f6b6878106d1c673f898a4aeb9df8.
java.io.IOException: Compression algorithm 'lzo' previously failed test.
at
org.apache.hadoop.hbase.util.Comp
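The "previously failed test" wording means HBase remembers that the LZO codec failed its check earlier on this region server. One way to verify whether LZO actually works on the node is the bundled compression test tool; the HDFS path below is just a scratch location:

  ./hbase org.apache.hadoop.hbase.util.CompressionTest hdfs://localhost:8020/tmp/lzo-check lzo

If this throws an exception, the hadoop-lzo jar or the LZO native libraries are likely missing from the region server's classpath.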
Hi, Harsh.
Thank you for your reply.
> - Your single jar should contain all required dependencies.
You mean I can use a 'lib' directory in the coprocessor jar file?
If not, do you have any plan to make it possible?
Thanks.
2012/3/17 Harsh J
> You need to follow the second approach ("Load from ta
You need to follow the second approach ("Load from table attribute")
to specify a jar path along with your coprocessor classes:
http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/coprocessor/package-summary.html#load
- Your single jar should contain all required dependencies.
Otherwise, just
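For reference, the "load from table attribute" form described on that page looks roughly like this in the HBase shell (jar path, class name and priority below are placeholders, and depending on the version you may need to disable the table before altering it):

  hbase> alter 'mytable', METHOD => 'table_att',
    'coprocessor' => 'hdfs:///user/me/coprocessor.jar|com.example.MyRegionObserver|1001|'

The attribute value is pipe-separated: jar path, coprocessor class, priority, and optional key=value arguments. The jar named there is fetched from HDFS and its classes are put on the region server's classpath for that table.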
Hello,
I have used the command ./hbase
org.apache.hadoop.hbase.mapreduce.Export 'test'
http://localhost:8020/test to export the data content from the test
table. And I can see the exported content from my hdfs folder,
hdfs://localhost/test/part-m-0. I have tried 2 commands to read
the content
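Without the rest of that message the two commands are a guess, but the usual ways to check an Export output are to list the output directory and to load it back into a second table with the companion Import job (the target table, here the placeholder 'test_copy', must already exist with the same column families):

  hadoop fs -ls hdfs://localhost:8020/test
  ./hbase org.apache.hadoop.hbase.mapreduce.Import 'test_copy' hdfs://localhost:8020/test

The part-m-* files are Hadoop SequenceFiles of serialized Result objects, so a plain 'hadoop fs -cat' mostly shows binary data; Import (or a MapReduce job reading the SequenceFiles) is the intended way to read them back.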
Hi all,
Is there any way for Coprocessors to use external libraries?
Hadoop MapReduce can specify them by using the -libjars command line
argument or by including a 'lib' directory in the job jar file.
I also want to use external libraries in Coprocessors.
Thanks.
--
Takuya UESHIN
Tokyo, Japan
http://tw
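For comparison, the MapReduce mechanism mentioned above looks like this (job jar, class name and paths are placeholders); it only applies when the main class goes through ToolRunner/GenericOptionsParser:

  hadoop jar my-job.jar com.example.MyJob -libjars /path/to/dep1.jar,/path/to/dep2.jar <job args>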