Thanks Sasha, my hdfs-site.xml now looks like this (hadoop-site.xml appears to be deprecated):
<configuration>
  <property>
    <name>dfs.support.append</name>
    <value>true</value>
  </property>
</configuration>

But I am still getting the exception. Checking the Hadoop source code, I found:

public FSDataOutputStream append(Path f, int bufferSize,
    Progressable progress) throws IOException {
  throw new IOException("Not supported");
}

in the class org.apache.hadoop.fs.ChecksumFileSystem, and this is where the exception is thrown. My fileSystem is a LocalFileSystem instance, which extends ChecksumFileSystem, and the append method has not been overridden. In DistributedFileSystem, by contrast, append is defined like this:

/** This optional operation is not yet supported. */
public FSDataOutputStream append(Path f, int bufferSize,
    Progressable progress) throws IOException {
  DFSOutputStream op = (DFSOutputStream)dfs.append(getPathName(f), bufferSize, progress);
  return new FSDataOutputStream(op, statistics, op.getInitialLen());
}

Even though the comment still says the operation is not supported, it does seem to do something. So this makes me think that append is not supported on Hadoop's LocalFileSystem. Is that correct?

Thanks,
Olivier

On Thu, May 28, 2009 at 11:06 AM, Sasha Dolgy <sdo...@gmail.com> wrote:
> http://www.mail-archive.com/core-user@hadoop.apache.org/msg10002.html
>
> On Thu, May 28, 2009 at 3:03 PM, Olivier Smadja <osma...@gmail.com> wrote:
> > Hi Sasha!
> >
> > Thanks for the quick answer. Is there a simple way to search the mailing
> > list, by text or by author?
> >
> > At http://mail-archives.apache.org/mod_mbox/hadoop-core-user/ I only see
> > a browse per month...
> >
> > Thanks,
> > Olivier
> >
> > On Thu, May 28, 2009 at 10:57 AM, Sasha Dolgy <sdo...@gmail.com> wrote:
> >
> >> append isn't supported without modifying the configuration file for
> >> hadoop. check out the mailing list threads ... i've sent a post in
> >> the past explaining how to enable it.
> >>
> >> On Thu, May 28, 2009 at 2:46 PM, Olivier Smadja <osma...@gmail.com> wrote:
> >> > Hello,
> >> >
> >> > I'm trying hadoop for the first time, and I'm just trying to create a
> >> > file and append some text to it with the following code:
> >> >
> >> > import java.io.IOException;
> >> >
> >> > import org.apache.hadoop.conf.Configuration;
> >> > import org.apache.hadoop.fs.FSDataOutputStream;
> >> > import org.apache.hadoop.fs.FileSystem;
> >> > import org.apache.hadoop.fs.Path;
> >> >
> >> > /**
> >> >  * @author olivier
> >> >  */
> >> > public class HadoopIO {
> >> >
> >> >     public static void main(String[] args) throws IOException {
> >> >
> >> >         String directory = "/Users/olivier/tmp/hadoop-data";
> >> >         Configuration conf = new Configuration(true);
> >> >         Path path = new Path(directory);
> >> >         // Create the file system
> >> >         FileSystem fs = path.getFileSystem(conf);
> >> >         // Set the working directory
> >> >         fs.setWorkingDirectory(path);
> >> >
> >> >         System.out.println(fs.getWorkingDirectory());
> >> >
> >> >         // Create a file
> >> >         FSDataOutputStream out = fs.create(new Path("test.txt"));
> >> >         out.writeBytes("Testing hadoop - first line");
> >> >         out.close();
> >> >         // Then try to append something
> >> >         out = fs.append(new Path("test.txt"));
> >> >         out.writeBytes("Testing hadoop - second line");
> >> >         out.close();
> >> >
> >> >         fs.close();
> >> >     }
> >> > }
> >> >
> >> > but I receive the following exception:
> >> >
> >> > Exception in thread "main" java.io.IOException: Not supported
> >> >     at org.apache.hadoop.fs.ChecksumFileSystem.append(ChecksumFileSystem.java:290)
> >> >     at org.apache.hadoop.fs.FileSystem.append(FileSystem.java:525)
> >> >     at com.neodatis.odb.hadoop.HadoopIO.main(HadoopIO.java:38)
> >> >
> >> > 1) Can someone tell me what I am doing wrong?
> >> >
> >> > 2) How can I update the file (for example, just update the first 10
> >> > bytes of the file)?
> >> >
> >> > Thanks,
> >> > Olivier
> >>
> >> --
> >> Sasha Dolgy
> >> sasha.do...@gmail.com
>
> --
> Sasha Dolgy
> sasha.do...@gmail.com
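For completeness: since ChecksumFileSystem.append throws on the local filesystem, one workaround when the data really is local is to bypass Hadoop's LocalFileSystem and use plain java.nio / java.io directly. That also addresses question 2 above, since random writes are not supported by HDFS itself but do work on an ordinary local file. The following is only a sketch under that assumption (class name, file, and contents are illustrative, not from the thread):

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class LocalFileWorkarounds {
    public static void main(String[] args) throws IOException {
        Path f = Files.createTempFile("test", ".txt");

        // 1) Append: open the local file in APPEND mode via java.nio,
        //    bypassing Hadoop's ChecksumFileSystem entirely.
        Files.write(f, "Testing hadoop - first line\n".getBytes(),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        Files.write(f, "Testing hadoop - second line\n".getBytes(),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);

        // 2) Update the first 10 bytes in place with RandomAccessFile.
        //    (HDFS has no random-write support; this only works on a
        //    plain local file.)
        try (RandomAccessFile raf = new RandomAccessFile(f.toFile(), "rw")) {
            raf.seek(0);
            raf.writeBytes("UPDATED!!!"); // exactly 10 bytes
        }

        // First line is now the original text with its first 10 bytes replaced.
        System.out.println(Files.readAllLines(f).get(0));
    }
}
```

This only makes sense for data that never needs to live in HDFS; on a real cluster the dfs.support.append route discussed above is the relevant one.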