Put your .so file in every tasktracker's Hadoop-install/lib/native/Linux-xxx-xx/ directory.

Or
 
In your code, try:
  
  String oldPath = System.getProperty("java.library.path");
  System.setProperty("java.library.path", oldPath == null ?
      local_path_of_lib_file : oldPath + pathSeparator + local_path_of_lib_file);
  System.loadLibrary("XXX");

However, you also need to fetch the library to the local node first, either
through the DistributedCache (as Jason said) or by putting it on HDFS and
retrieving it yourself.
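One caveat with the setProperty() snippet above: the JVM captures java.library.path at startup, so changing the property afterwards does not, by itself, change where System.loadLibrary() searches (which is why the reflection workaround on ClassLoader.usr_paths appears further down this thread). A minimal, self-contained sketch of the symptom; the library name "doesnotexist" and the path "/usr/local/lib" are made up for illustration:

```java
import java.io.File;

public class LibPathDemo {
    public static void main(String[] args) {
        // Updating the property succeeds as a plain string change...
        String oldPath = System.getProperty("java.library.path");
        System.setProperty("java.library.path",
                oldPath == null ? "/usr/local/lib"
                                : oldPath + File.pathSeparator + "/usr/local/lib");
        System.out.println("property updated: "
                + System.getProperty("java.library.path").contains("/usr/local/lib"));

        // ...but the JVM resolved its native search path at startup, so
        // loadLibrary of a library that is not on that original path still fails.
        try {
            System.loadLibrary("doesnotexist"); // made-up name for illustration
            System.out.println("loaded");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("UnsatisfiedLinkError as expected");
        }
    }
}
```

This is why installing the .so where the tasktracker JVM already looks, or shipping it with the job, is more reliable than mutating the property at runtime.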

On 09-4-30 5:14 PM, "Ian jonhson" <[email protected]> wrote:

> You mean that the current Hadoop does not support JNI calls, right?
> Are there any solutions to achieve the calls from C interfaces?
> 
> 2009/4/30 He Yongqiang <[email protected]>:
>> Does Hadoop now support JNI calls in Mappers or Reducers? If yes, how? If
>> not, I think we should create a JIRA issue for supporting that.
>> 
>> 
>> On 09-4-30 4:02 PM, "Ian jonhson" <[email protected]> wrote:
>> 
>>> Thanks for answering.
>>> 
>>> I run my Hadoop in single node, not cluster mode.
>>> 
>>> 
>>> 
>>> On Thu, Apr 30, 2009 at 11:21 AM, jason hadoop <[email protected]>
>>> wrote:
>>>> You need to make sure that the shared library is available on the
>>>> tasktracker nodes, either by installing it, or by pushing it around via the
>>>> distributed cache
>>>> 
>>>> 
>>>> 
>>>> On Wed, Apr 29, 2009 at 8:19 PM, Ian jonhson <[email protected]> wrote:
>>>> 
>>>>> Dear all,
>>>>> 
>>>>> I wrote plugin code for Hadoop, which calls the interfaces
>>>>> in a C++-built .so library. The plugin code is written in Java,
>>>>> so I prepared a JNI class to encapsulate the C interfaces.
>>>>> 
>>>>> The Java code runs successfully when I compile it and run it
>>>>> standalone. However, it does not work when embedded in Hadoop.
>>>>> The exception shown (found in the Hadoop logs) is:
>>>>> 
>>>>> 
>>>>> ------------  screen dump  ---------------------
>>>>> 
>>>>> # grep myClass logs/* -r
>>>>> logs/hadoop-hadoop-tasktracker-testbed0.container.org.out:Exception in
>>>>> thread "JVM Runner jvm_200904261632_0001_m_-1217897050 spawned."
>>>>> java.lang.UnsatisfiedLinkError:
>>>>> org.apache.hadoop.mapred.myClass.myClassfsMount(Ljava/lang/String;)I
>>>>> logs/hadoop-hadoop-tasktracker-testbed0.container.org.out:      at
>>>>> org.apache.hadoop.mapred.myClass.myClassfsMount(Native Method)
>>>>> logs/hadoop-hadoop-tasktracker-testbed0.container.org.out:Exception in
>>>>> thread "JVM Runner jvm_200904261632_0001_m_-1887898624 spawned."
>>>>> java.lang.UnsatisfiedLinkError:
>>>>> org.apache.hadoop.mapred.myClass.myClassfsMount(Ljava/lang/String;)I
>>>>> logs/hadoop-hadoop-tasktracker-testbed0.container.org.out:      at
>>>>> org.apache.hadoop.mapred.myClass.myClassfsMount(Native Method)
>>>>> ...
>>>>> 
>>>>> --------------------------------------------------------
>>>>> 
>>>>> It seems the library cannot be loaded in Hadoop. My code
>>>>> (myClass.java) looks like:
>>>>> 
>>>>> 
>>>>> ---------------  myClass.java  ------------------
>>>>> 
>>>>> import java.io.IOException;
>>>>> import java.lang.reflect.Field;
>>>>>
>>>>> import org.apache.commons.logging.Log;
>>>>> import org.apache.commons.logging.LogFactory;
>>>>>
>>>>> public class myClass
>>>>> {
>>>>> 
>>>>>        public static final Log LOG =
>>>>>                    LogFactory.getLog("org.apache.hadoop.mapred.myClass");
>>>>> 
>>>>> 
>>>>>        public myClass()   {
>>>>> 
>>>>>                try {
>>>>>                        //System.setProperty("java.library.path", "/usr/local/lib");
>>>>>
>>>>>                        /* The above line does not work, so I have to do
>>>>>                         * something like the following line instead.
>>>>>                         */
>>>>>                        addDir("/usr/local/lib");
>>>>>                        System.loadLibrary("myclass");
>>>>>                }
>>>>>                catch(UnsatisfiedLinkError e) {
>>>>>                        LOG.info( "Cannot load library:\n " +
>>>>>                                e.toString() );
>>>>>                }
>>>>>                catch(IOException ioe) {
>>>>>                        LOG.info( "IO error:\n " +
>>>>>                                ioe.toString() );
>>>>>                }
>>>>> 
>>>>>        }
>>>>> 
>>>>>        /* Since System.setProperty() does not work, I have to add the
>>>>>         * following function to force the path into java.library.path.
>>>>>         */
>>>>>        public static void addDir(String s) throws IOException {
>>>>> 
>>>>>            try {
>>>>>                        Field field = ClassLoader.class.getDeclaredField("usr_paths");
>>>>>                        field.setAccessible(true);
>>>>>                        String[] paths = (String[]) field.get(null);
>>>>>                        for (int i = 0; i < paths.length; i++) {
>>>>>                            if (s.equals(paths[i])) {
>>>>>                                return;
>>>>>                            }
>>>>>                        }
>>>>>                        String[] tmp = new String[paths.length+1];
>>>>>                        System.arraycopy(paths,0,tmp,0,paths.length);
>>>>>                        tmp[paths.length] = s;
>>>>> 
>>>>>                        field.set(null,tmp);
>>>>>                } catch (IllegalAccessException e) {
>>>>>                        throw new IOException("Failed to get permissions to set library path");
>>>>>                } catch (NoSuchFieldException e) {
>>>>>                        throw new IOException("Failed to get field handle to set library path");
>>>>>                }
>>>>>        }
>>>>> 
>>>>>        public native int myClassfsMount(String subsys);
>>>>>        public native int myClassfsUmount(String subsys);
>>>>> 
>>>>> 
>>>>> }
>>>>> 
>>>>> --------------------------------------------------------
>>>>> 
>>>>> 
>>>>> I don't know what is missing in my code, and I wonder whether there are
>>>>> any rules in Hadoop I should follow to achieve my goal.
>>>>>
>>>>> FYI, myClassfsMount() and myClassfsUmount() will open a socket to call
>>>>> services from a daemon. I hope this design is not the cause of the
>>>>> failure in my code.
>>>>> 
>>>>> 
>>>>> Any comments?
>>>>> 
>>>>> 
>>>>> Thanks in advance,
>>>>> 
>>>>> Ian
>>>>> 
>>>> 
>>>> 
>>>> 
>>>> --
>>>> Alpha Chapters of my book on Hadoop are available
>>>> http://www.apress.com/book/view/9781430219422
>>>> 
>>> 
>>> 
>> 
>> 
>> 
> 
> 

