Hi all!
I'm trying to use IBM BigInsights (essentially Hadoop 0.20.2) as HDFS
file storage. While I can access all my files perfectly well through the web
console and JAQL on the box itself, the Java application I've created, which
runs *externally* on another machine outside the Hadoop environment, cannot.
It fails with the well-known and dreaded:

INFO: Could not obtain block blk_.................._.... from any node:
java.io.IOException: No live nodes contain current block
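
For context, the client does nothing exotic; it boils down to the standard
FileSystem API, roughly like the minimal sketch below (the namenode URI and
file path are placeholders, not my real values):

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsRead {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the remote namenode (placeholder host/port).
        conf.set("fs.default.name", "hdfs://namenode-host:9000");

        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("/user/biadmin/sample.txt"); // placeholder path

        // Opening the file works (namenode metadata is reachable); it is the
        // read itself that dies with "No live nodes contain current block".
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(fs.open(path)));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
        reader.close();
        fs.close();
    }
}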
I've raised dfs.datanode.max.xcievers to 4096 (and even to 8192) and the max
open file limit to 32K, to no avail. I'm getting rather frustrated, and
Google doesn't seem to help anymore. There are no exceptions logged in either
the namenode or the datanode logs, and hadoop fsck says the filesystem is
healthy.
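
For completeness, the xcievers change went into hdfs-site.xml on the
datanode side (shown here with the 4096 value):

<property>
  <name>dfs.datanode.max.xcievers</name>
  <value>4096</value>
</property>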
My setup is a single datanode running on the same machine as the namenode.

Does anyone have an idea what could be causing this?
Thanks.