Thanks for the help, Philip. Yep, I am using the proxy Python script, which talks to the *HadoopThriftServer*. The *HadoopThriftServer* uses an *org.apache.hadoop.fs.FileSystem* object to interact with some file system. As you suggested, since *HadoopThriftServer* doesn't explicitly ask for an hdfs:// URI, it keeps using whatever fs.default.name is set to.
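In case it helps others hitting the same thing: the usual fix is to set fs.default.name in the Hadoop configuration that the server loads from its classpath (core-site.xml, or hadoop-site.xml on older releases), so that FileSystem.get(conf) resolves to HDFS rather than the local file:/// default. A minimal sketch — the namenode hostname here is a placeholder for your own setup, and 8020 is only the conventional namenode port:

```xml
<!-- core-site.xml (or hadoop-site.xml on older Hadoop releases) -->
<!-- "namenode.example.com" is a placeholder; substitute your namenode host. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:8020/</value>
  </property>
</configuration>
```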
Does anyone know the right way to get *HadoopThriftServer* to interact with HDFS?

regards,
--
Ahmad Humayun
Graduate Student
Computer Science Dpt., UCL
http://www.cs.ucl.ac.uk/students/A.Humayun/
+44 (0)79 5536 6637

On Sun, Jul 11, 2010 at 4:36 PM, Philip Zeyliger <phi...@cloudera.com> wrote:
> You're using the server that serves as a proxy to HDFS via Thrift? If
> I had to guess, the server is configured with fs.default.name set to
> the default (file:///, which is the local fs), instead of
> (hdfs://<namenode>:8020/).
>
> Cheers,
>
> -- Philip
>
> On Sun, Jul 11, 2010 at 8:15 AM, Ahmad Humayun <ahmad.hu...@gmail.com> wrote:
> > Hi there,
> >
> > I am trying to use ThriftFS to interact with HDFS, and I am running into
> > quite a strange problem. Whenever I run
> > src/contrib/thriftfs/scripts/hdfs.py, the Thrift server seems to be
> > interacting with the local file system rather than HDFS (I have checked
> > that HDFS is running). For example, when I invoke "hdfs>> ls /", it
> > returns the root listing of the local Linux file system and not the
> > listing of the HDFS root.
> >
> > I have a feeling I am doing something wrong that is pretty basic. Does
> > anyone have an idea?
> >
> > Oh, by the way, I run hdfs.py without any arguments.
> >
> > regards,
> > --
> > Ahmad Humayun
> > Graduate Student
> > Computer Science Dpt., UCL
> > http://www.cs.ucl.ac.uk/students/A.Humayun/
> > +44 (0)79 5536 6637