On Wed, Aug 21, 2013 at 1:25 PM, Colin McCabe <cmcc...@alumni.cmu.edu> wrote:
> St.Ack wrote:
> >
> > + Once I figured out where the logs were, I found that JAVA_HOME was not
> > being exported (I don't need this in hadoop-2.0.5 for instance). Adding an
> > exported JAVA_HOME to my running shell, which doesn't seem right, took
> > care of it (I gave up pretty quickly on messing w/
> > yarn.nodemanager.env-whitelist and yarn.nodemanager.admin-env -- I wasn't
> > getting anywhere).
>
> I thought that we were always supposed to have JAVA_HOME set when
> running any of these commands. At least, I do. How else can the
> system disambiguate between different Java installs? I need 2
> installs to test with JDK7.
>

That is fair enough, but I did not need to define this explicitly previously
(for hadoop-2.0.5-alpha, for instance), or else the JAVA_HOME figured out by
the start scripts was being propagated and now is not (I have not dug in).

> > + This did not seem to work for me:
> > <name>hadoop.security.group.mapping</name>
> > <value>org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback</value>
>
> We've seen this before. I think your problem is that you have
> java.library.path set correctly (what System.loadLibrary checks), but
> your system library path does not include a necessary dependency of
> libhadoop.so -- most likely, libjvm.so. Probably, we should fix
> NativeCodeLoader to actually make a function call in libhadoop.so
> before it declares everything OK.
>

My expectation was that if native group lookup fails, as it does here, then
the 'Fallback' would kick in and we'd do the Shell query. This mechanism
does not seem to be working (see the sketch below for the probe-then-fallback
behavior I had in mind).

St.Ack
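For concreteness, here is a minimal, standalone sketch of the probe-then-fallback pattern being discussed. It is not Hadoop's actual JniBasedUnixGroupsMappingWithFallback or NativeCodeLoader code: the class name, the use of `id -Gn`, and the load-time probe are illustrative assumptions, and it only models the decision of whether the native library looks usable before falling back to a shell query.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

/**
 * Illustrative sketch only -- not Hadoop's JniBasedUnixGroupsMappingWithFallback.
 * It models the decision under discussion: check whether the native library is
 * usable, and if not, fall back to a shell-based group lookup (`id -Gn`).
 */
public class GroupsFallbackSketch {

    private final boolean nativeUsable;

    public GroupsFallbackSketch() {
        boolean loaded;
        try {
            // Resolves libhadoop.so against java.library.path. A successful
            // load does not guarantee that later native calls will work (e.g.
            // if a transitive dependency such as libjvm.so cannot be resolved),
            // which is the motivation for probing with a real native call
            // before declaring the native path usable.
            System.loadLibrary("hadoop");
            loaded = true;
        } catch (UnsatisfiedLinkError e) {
            loaded = false;
        }
        this.nativeUsable = loaded;
    }

    /** Shell fallback: `id -Gn <user>` prints the user's group names. */
    static List<String> shellGroups(String user) throws Exception {
        Process p = new ProcessBuilder("id", "-Gn", user).start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream(), StandardCharsets.UTF_8))) {
            String line = r.readLine();
            p.waitFor();
            return line == null ? Collections.emptyList()
                                : Arrays.asList(line.trim().split("\\s+"));
        }
    }

    public static void main(String[] args) throws Exception {
        GroupsFallbackSketch sketch = new GroupsFallbackSketch();
        String user = args.length > 0 ? args[0] : System.getProperty("user.name");
        // Report which path would be chosen, then always demonstrate the fallback.
        System.out.println("native group lookup usable: " + sketch.nativeUsable);
        System.out.println("shell fallback groups:      " + shellGroups(user));
    }
}
```

If the expectation described above held, a failed probe would silently route group lookups through the shell path; the behavior reported here suggests the decision is being made on the load check alone.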