Is it my imagination, or is org.apache.hadoop.fs.Hdfs in the hadoop-hdfs JAR and not the hadoop-hdfs-client JAR?
Because someone has been complaining on the spark-dev list that if they try to instantiate FileContext.getFileContext("hdfs://namenode:8020"), they get a stack trace starting with "java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.Hdfs not found" (a minimal reproduction is sketched at the end of this message).

1. Is this really the case, that the HDFS and webhdfs AbstractFileSystem implementations are not in hadoop-client?
2. If so, why not?
3. Can it be corrected before 2.8.0 goes into beta release?
4. Otherwise, hadoop-client's dependencies are going to have to reinstate hadoop-hdfs.

-Steve
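For reference, here is a rough sketch of the reported failure mode, assuming a classpath with hadoop-common and hadoop-hdfs-client but not hadoop-hdfs. The class name FileContextRepro is mine, the URI/Configuration overload of getFileContext is just one way to make the call, and "namenode:8020" is the placeholder from the report:

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileContext;

    public class FileContextRepro {
      public static void main(String[] args) throws Exception {
        // Resolving the "hdfs" scheme goes through the AbstractFileSystem
        // loader, which needs org.apache.hadoop.fs.Hdfs on the classpath.
        // With only hadoop-hdfs-client present (no hadoop-hdfs), this is
        // reported to fail with RuntimeException/ClassNotFoundException.
        FileContext fc = FileContext.getFileContext(
            URI.create("hdfs://namenode:8020"), new Configuration());
        System.out.println(fc.getDefaultFileSystem().getUri());
      }
    }

If that sketch is accurate, any FileContext-based client would hit the same wall unless hadoop-hdfs is pulled back onto the classpath.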