Hi Jarek,

Below are my configurations:

1) Env Parameters:
export HADOOP_HOME=/opt/hadoop-2.0.3-alpha
export PATH=$HADOOP_HOME/bin:$PATH
export PATH=$HADOOP_HOME/sbin:$PATH
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export YARN_HOME=${HADOOP_HOME}
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export HDFS_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export YARN_CONF_DIR=${HADOOP_HOME}/etc/hadoop
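
To rule out the version conflict Abraham mentioned, a quick classpath check
along these lines should show whether a second Hadoop build is leaking into
the runtime (just a sketch; the grep patterns are examples and $SQOOP_HOME
is assumed to point at the sqoop-1.4.3.bin__hadoop-2.0.0-alpha install):

# which Hadoop binary and version are actually picked up from PATH
which hadoop
hadoop version

# list the Hadoop jars on the effective classpath; a stray hadoop-core-1.x
# or a second hadoop-common jar would be a red flag
hadoop classpath | tr ':' '\n' | grep -i hadoop

# check whether the Sqoop tarball ships its own Hadoop jars
ls $SQOOP_HOME/lib | grep -i hadoop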

2) hdfs-site.xml:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>

  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>

  <property>
    <name>dfs.name.dir</name>
    <value>/home/temp/hadoop/dfs_name_dir</value>
  </property>

  <property>
    <name>dfs.data.dir</name>
    <value>/home/temp/hadoop/dfs_data_dir</value>
  </property>

  <property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
  </property>
</configuration>
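
Since the exception points at the default FS, I also want to confirm that
plain HDFS access works outside of Sqoop and that the expected config files
are the ones being read (a sketch; it assumes core-site.xml in
$HADOOP_CONF_DIR sets fs.defaultFS to the NameNode address):

# confirm the shell can resolve the default FS and reach the NameNode
hadoop fs -ls /

# confirm which configuration directory and files are actually in effect
echo $HADOOP_CONF_DIR
ls $HADOOP_CONF_DIR/core-site.xml $HADOOP_CONF_DIR/hdfs-site.xml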




2013/7/17 Jarek Jarcec Cecho <jar...@apache.org>

> Hi sir,
> the exception suggests that the FileSystem implementation for your
> default FS can't be found. I would check the HDFS configuration to ensure
> that it's configured properly and that Sqoop is properly picking up the
> configuration and all the HDFS libraries.
>
> Jarcec
>
> On Tue, Jul 16, 2013 at 11:29:07AM +0800, sam liu wrote:
> > I also tried another Sqoop build, sqoop-1.4.2.bin__hadoop-2.0.0-alpha, on
> > Hadoop 2.0.3-alpha, but it failed as well. The exception is the same as
> > above: 'java.lang.UnsupportedOperationException: Not implemented by the
> > DistributedFileSystem FileSystem implementation'.
> >
> > This issue has been blocking me for quite a while...
> >
> >
> > 2013/6/21 Abraham Elmahrek <a...@cloudera.com>
> >
> > > Hey Sam,
> > >
> > > My understanding is that Sqoop 1.4.3 should work with Hadoop 2.0.x
> > > (which would include Hadoop 2.0.4 alpha). Anyway, there seems to be some
> > > version conflict going on here. Do you have any other builds of Sqoop
> > > installed?
> > >
> > > -Abe
> > >
> > >
> > > On Thu, Jun 20, 2013 at 6:39 PM, sam liu <liuqiyun2...@gmail.com> wrote:
> > >
> > >> Could anyone provide an answer? We are deciding whether to leverage
> > >> Sqoop 1.4.3 on YARN or not.
> > >>
> > >> Thanks!
> > >>
> > >>
> > >> 2013/6/20 sam liu <liuqiyun2...@gmail.com>
> > >>
> > >>> Hi,
> > >>>
> > >>> The Sqoop website says Sqoop 1.4.3 supports Hadoop 2.0, but I failed
> > >>> to run the import tool against hadoop-2.0.4-alpha using
> > >>> sqoop-1.4.3.bin__hadoop-2.0.0-alpha. Can anyone help with
> > >>> triage/suggestions? Thanks in advance!
> > >>>
> > >>> - Command:
> > >>> sqoop import --connect jdbc:db2://host:50000/SAMPLE --table
> > >>> DB2ADMIN.DB2TEST_TBL001 --username user --password pwd -m 1
> > >>> --target-dir /tmp/DB2TEST_TBL001
> > >>>
> > >>> - Exception:
> > >>> 13/06/19 23:28:28 INFO manager.SqlManager: Executing SQL statement:
> > >>> SELECT t.* FROM DB2ADMIN.DB2TEST_TBL001 AS t WHERE 1=0
> > >>> 13/06/19 23:28:28 ERROR sqoop.Sqoop: Got exception running Sqoop:
> > >>> java.lang.UnsupportedOperationException: Not implemented by the
> > >>> DistributedFileSystem FileSystem implementation
> > >>> java.lang.UnsupportedOperationException: Not implemented by the
> > >>> DistributedFileSystem FileSystem implementation
> > >>>         at org.apache.hadoop.fs.FileSystem.getScheme(FileSystem.java:207)
> > >>>         at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2245)
> > >>>         at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2255)
> > >>>         at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2272)
> > >>>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:86)
> > >>>         at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2311)
> > >>>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2293)
> > >>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:317)
> > >>>         at org.apache.hadoop.fs.FileSystem.getLocal(FileSystem.java:288)
> > >>>         at org.apache.sqoop.mapreduce.JobBase.cacheJars(JobBase.java:134)
> > >>>         at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:197)
> > >>>         at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:413)
> > >>>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:380)
> > >>>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
> > >>>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> > >>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> > >>>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> > >>>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> > >>>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> > >>>         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> > >>>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> > >>>
> > >>
> > >>
> > >>
> > >> --
> > >>
> > >> Sam Liu
> > >>
> > >
> > >
> >
> >
> > --
> >
> > Sam Liu
>



-- 

Sam Liu
