Re: Problem reading HDFS block size > 1GB

2009-10-01 Thread Raghu Angadi
Vinay, this issue came up before (http://www.mail-archive.com/core-...@hadoop.apache.org/msg35620.html). I think we should fix this soon. Dhruba filed a jira (https://issues.apache.org/jira/browse/HDFS-96). Not all of the errors reported here are fixed by the patch attached there. Could we discuss this ...
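The thread is truncated before the root cause is named, so the following is only an illustrative sketch of one common failure mode with very large blocks: a long block offset narrowed to a 32-bit int wraps and goes negative. This is an assumption for illustration, not the HDFS code path or the HDFS-96 patch.

    /**
     * Illustrative sketch only -- not HDFS code and not the HDFS-96 patch.
     * Shows how narrowing a long offset inside a multi-GB block to int
     * silently corrupts it, one plausible way reads past a size boundary
     * could start to misbehave.
     */
    public class BlockOffsetNarrowing {
        public static void main(String[] args) {
            long blockSize = 5L * 1024 * 1024 * 1024;  // a 5 GB block, representable as a long
            long offset = 3L * 1024 * 1024 * 1024;     // read position 3 GB into the block

            // A careless cast to int wraps the value and turns it negative.
            int narrowed = (int) offset;

            System.out.println("offset as long: " + offset);     // 3221225472
            System.out.println("offset as int : " + narrowed);   // -1073741824
            System.out.println("block size fits in an int? " + (blockSize <= Integer.MAX_VALUE)); // false
        }
    }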

[jira] Created: (HADOOP-6294) log4j settings are service/node specific

2009-10-01 Thread Allen Wittenauer (JIRA)
log4j settings are service/node specific. Key: HADOOP-6294. URL: https://issues.apache.org/jira/browse/HADOOP-6294. Project: Hadoop Common. Issue Type: Bug. Reporter: Allen Wittenauer. Hadoop ...

Problem reading HDFS block size > 1GB

2009-10-01 Thread Vinay Setty
Hi all, we are running the Yahoo! distribution of Hadoop, based on Hadoop 0.20.0-2787265, on a 10-node cluster with the openSUSE Linux operating system. We have HDFS configured with a block size of 5 GB (this is for our experiments). But we are facing the following problems when we try reading the data beyond 1 GB from ...
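Since the preview is cut off before the error details, here is only a minimal client-side sketch of how a 5 GB block size is typically requested on a 0.20-era cluster (the dfs.block.size property; the output path and the write loop are hypothetical):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class FiveGigBlockWrite {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Request a 5 GB block size. The property takes a long, so the
            // literal needs the L suffix to avoid 32-bit overflow in the constant.
            conf.setLong("dfs.block.size", 5L * 1024 * 1024 * 1024);

            FileSystem fs = FileSystem.get(conf);
            Path out = new Path("/experiments/large-block-file");  // hypothetical path
            FSDataOutputStream stream = fs.create(out);
            // ... write well over 1 GB of data here, then read it back ...
            stream.close();
            fs.close();
        }
    }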

[jira] Resolved: (HADOOP-5141) Resolving json.jar through ivy

2009-10-01 Thread Giridharan Kesavan (JIRA)
[ https://issues.apache.org/jira/browse/HADOOP-5141?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Giridharan Kesavan resolved HADOOP-5141. Resolution: Won't Fix. Closing this as Won't Fix since Chukwa is no longer a Hadoop contrib ...