Noah Watkins created HADOOP-11244:
-
Summary: The HCFS contract test testRenameFileBeingAppended
doesn't do a rename
Key: HADOOP-11244
URL: https://issues.apache.org/jira/browse/HADOOP-11244
Pr
watkins nwatkins 4096 2012-02-22 09:49 pkgconfig
On Feb 22, 2012, at 10:25 AM, Ralph Castain wrote:
> Hi Noah
>
> Your LD_LIBRARY_PATH has to include the path to the OMPI libraries so we can
> find libmpi. Did you include your $prefix/lib[64] in it?
>
> On Feb 22, 2012
Just gave the nightly release a try and I'm getting an error:
nwatkins@kyoto:~/projects/openmpi_java/openmpi-1.7a1r25994/examples$ mpirun -np
1 java Hello
JAVA BINDINGS FAILED TO LOAD REQUIRED LIBRARIES
My setup is on the latest Ubuntu with:
1. Built with contrib/platform/hadoop/linux
2. Compil
Could you try mvn -X compile? That should show the exact commands that are
being executed which seem to be coming from hadoop-project-dist/pom.xml.
On Lion the following worked for me:
git clone git://git.apache.org/hadoop-common.git
cd hadoop-common
mvn compile
I had a very similar error:
[ERR
On Jan 30, 2012, at 3:58 PM, Ralph Castain wrote:
> I have tried setting -Djava.library.path and LD_LIBRARY_PATH to the correct
> locations. In both cases, I get errors from the JNI code indicating that it
> was unable to open the specified dynamic library.
Do the paths to OpenMPI libraries, a
[ https://issues.apache.org/jira/browse/HADOOP-6779?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Noah Watkins resolved HADOOP-6779.
--
Resolution: Won't Fix
Resolved as "Won't Fix". If there are any users
Hi,
Is it possible to use IP addresses for the host list of BlockLocation? The Ceph
file system uses IPs to describe block locations, and Hadoop is not achieving
any data locality. I used a quick reverse DNS hack to verify that the IPs were
causing the problem, but this isn't robust, and the Ce
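The "quick reverse DNS hack" mentioned here can be sketched with plain java.net; the helper name hostFor is my own, not a Hadoop API. It maps an IP address to a hostname so that locality matching, which compares hostnames, has a chance of succeeding:

```java
import java.net.InetAddress;

public class ReverseDns {
    // Resolve an IP address back to a hostname. If no reverse record
    // exists, getCanonicalHostName() returns the IP string itself.
    static String hostFor(String ip) throws Exception {
        return InetAddress.getByName(ip).getCanonicalHostName();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(hostFor("127.0.0.1"));
    }
}
```

As the message notes, this lookup is not robust: it adds a DNS round-trip per block and silently degrades when reverse records are missing.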
----- Original Message -----
> From: "Eli Collins"
>
> RawLocalFileSystem uses Java's File#list which has "no guarantee that
> the name strings in the resulting array will appear in any specific
> order; they are not, in particular, guaranteed to appear in
> alphabetical order.", however the FSCon
I have a question about the FileSystem contract in 0.20.
In FileSystemContractBaseTest:testFileStatus() there
are several files created, and afterwards the test confirms
that they are present. Here is the relevant code:
FileStatus[] paths = fs.listStatus(path("/test"));
paths = fs.li
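Since File#list gives no ordering guarantee, a test that compares listings by position has to sort first. A minimal sketch in plain java.io (not the Hadoop FileSystem API), using a scratch directory of my own choosing:

```java
import java.io.File;
import java.util.Arrays;

public class SortedListing {
    // File#list makes no ordering guarantee, so sort the names before
    // any position-based comparison, as a contract test would need to.
    static String[] sortedList(File dir) {
        String[] names = dir.list();
        Arrays.sort(names);
        return names;
    }

    public static void main(String[] args) throws Exception {
        File dir = new File(System.getProperty("java.io.tmpdir"), "sortdemo");
        dir.mkdirs();
        new File(dir, "b.txt").createNewFile();
        new File(dir, "a.txt").createNewFile();
        System.out.println(Arrays.toString(sortedList(dir)));
    }
}
```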
pattern.
>
> Cheers,
>
> Joep
> ________
> From: Noah Watkins [jayh...@soe.ucsc.edu]
> Sent: Sunday, July 31, 2011 5:02 PM
> To: common-dev@hadoop.apache.org
> Subject: Trouble resolving external jar dependency
>
> I'm experimen
I'm experimenting with a new file system that depends on an external jar that
is not available right now via maven. I added the jar to the lib/ directory and
hadoop-common builds fine. However, when running 'ant mvn-install' I get the
following error. It seems as though a reference to the extern
On Mar 23, 2011, at 1:51 PM, Keren Ouaknine wrote:
> Hello,
>
> I have a specific RecordReader, named XRR. I would like to iterate on a split and
> create records. Thus, I need to position its start attribute. I don't have
> enough info from InputSplit. The interface for InputSplit has only getLength and
> ge
Components: fs
Affects Versions: 0.23.0
Reporter: Noah Watkins
Fix For: 0.23.0
This patch does 2 things that makes sub-classing RawLocalFileSystem easier.
First, it adds a constructor that allows a sub-class to avoid calling
getInitialWorkingDirectory(). This is
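The hazard that such a constructor avoids can be shown in plain Java (class names here are illustrative, not the Hadoop ones): a base-class constructor that calls an overridable method runs before the sub-class's fields are initialized.

```java
public class CtorOverride {
    static class Base {
        String workingDir;
        Base() { workingDir = initialDir(); }   // calls overridable method during construction
        Base(String dir) { workingDir = dir; }  // lets a sub-class opt out of that call
        String initialDir() { return "/base"; }
    }

    static class Sub extends Base {
        final String root = "/sub"; // assigned only AFTER Base's constructor runs
        Sub() { super(); }          // initialDir() observes root == null here
        Sub(boolean safe) { super("/sub"); } // skips the overridable call entirely
        @Override String initialDir() { return root; }
    }

    public static void main(String[] args) {
        System.out.println(new Sub().workingDir);      // prints "null"
        System.out.println(new Sub(true).workingDir);  // prints "/sub"
    }
}
```

This is the same shape of problem: an overridden getInitialWorkingDirectory() running from the superclass constructor sees an incompletely constructed sub-class, so a constructor that bypasses it makes sub-classing safer.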
: 0.22.0
Environment: Ubuntu 10.10
Reporter: Noah Watkins
Priority: Blocker
Fix For: 0.22.0
My Hadoop installation is having trouble loading the native code library. It
appears from the log below that java.library.path is missing the basedir in its
path
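A quick way to see what the JVM will actually search is to print java.library.path (and LD_LIBRARY_PATH) before any loadLibrary call; a minimal diagnostic sketch:

```java
public class LibPath {
    public static void main(String[] args) {
        // The JVM's native-library search path, settable with -Djava.library.path.
        System.out.println(System.getProperty("java.library.path"));
        // The dynamic linker's path; prints "null" if the variable is unset.
        System.out.println(System.getenv("LD_LIBRARY_PATH"));
    }
}
```

If the build's basedir is absent from the first line, System.loadLibrary will fail with an UnsatisfiedLinkError even though the .so files exist on disk.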
What is the relation between the current trunk and branch-0.22? Is trunk the
current dev for 0.23 or 0.22?
Thanks,
Noah
Hi,
My Hadoop installation is having trouble loading the native code library. It
appears from the log below that java.library.path is missing the basedir in its
path. The libraries are built, and present in the directory shown below
(relative to hadoop-common directory). Instead of seeing:
> Most of the Hadoop code base (MapReduce, FsShell etc) has not yet
> switched to the new API.
I have a FS implementation that is nearly complete, and designed against the
0.21 code base, using the old FileSystem interface. Do you have any
recommendations on how to move to the new API in trunk?
Affects Versions: 0.21.0
Reporter: Noah Watkins
The file:
src/native/src/org_apache_hadoop.h
declares the static function "do_dlsym" in the header as non-inline. Files
including the header (e.g. for the THROW macro) receive a "defined but unused"
warning duri
Hi,
This question pertains to the 0.21 release.
When adding code to the Hadoop native code library what is the preferred method
for regenerating Makefile.in? I've tried to use autoreconf, but it seems as
though my newer version of autotools is causing compilation problems for the
other native