Moving to common-dev@.
I'm able to run all hadoop-hdfs-httpfs tests without failure if I do a
"mvn clean test" under that directory.
Can you try to clean and then re-run? Might have been a transient
failure? Do you have the test logs?
If you can reliably reproduce this test failure, please repor
Ah, oops. I forgot there was a dev list. Sorry.
I ran "mvn clean install -Pdist -Dtar -Ptest-patch" as directed in the
document. I'm not sure what the options do, though.
I'll give "mvn clean test" a try.
On Thu, Jan 26, 2012 at 7:07 AM, Harsh J wrote:
> Moving to common-dev@.
>
> I'm able
What are you looking to do exactly?
Are you looking to compile and get the project setup ready for Eclipse?
On Thu, Jan 26, 2012 at 7:26 PM, Bai Shen wrote:
> Ah, oops. I forgot there was a dev list. Sorry.
>
> I ran "mvn clean install -Pdist -Dtar -Ptest-patch" as directed in the
> document.
I put Cygwin's bin on the Windows PATH, but I got the same error.
Anyway, I decided to move to Ubuntu.
So I got the Hadoop source on Ubuntu and tried to build it with
"mvn clean package", but I got another error complaining about POM
validation:
Failed to validate pom for project hadoo
Yes. I need to do some debugging, so I'm trying to get hadoop compiled so
I can try some changes.
Also, I'm still getting the error even when running "mvn clean test".
I'm using OpenJDK 1.6.0_22 on F16. Is there any other information
needed?
On Thu, Jan 26, 2012 at 9:16 AM, Harsh J wrote:
Bai,
In that case, to get started quickly, try:
mvn install -DskipTests && mvn eclipse:eclipse
Then import all your required projects in.
On Thu, Jan 26, 2012 at 10:12 PM, Bai Shen wrote:
> Yes. I need to do some debugging, so I'm trying to get hadoop compiled so
> I can try some changes.
>
> A
Hi Samaneh,
Have you checked your repository location vs. what is in the POM?
http://maven.apache.org/maven-1.x/reference/properties.html
Your local repo is probably in ~/.m2
Hope that helps.
Thanks,
GenericOptionsParser ought to have better options parsing, and not pick only
the options in the front
-
Key: HADOOP-7995
URL: https://issues.apache.org/jira/browse
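The behavior being reported can be sketched with a small, hypothetical parser: one that stops consuming options at the first positional argument will silently treat any later options as plain data. The class below is illustrative only, and is not Hadoop's actual GenericOptionsParser.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative only: mimics a parser that, like the behavior described
// above, only recognizes "-key value" options that appear *before* the
// first positional argument. NOT Hadoop's GenericOptionsParser.
public class FrontOnlyParser {
    final Map<String, String> options = new LinkedHashMap<>();
    final List<String> positional = new ArrayList<>();

    FrontOnlyParser(String[] args) {
        int i = 0;
        // Consume "-key value" pairs only while they lead the argument list.
        while (i + 1 < args.length && args[i].startsWith("-")) {
            options.put(args[i].substring(1), args[i + 1]);
            i += 2;
        }
        // Everything from the first positional onward is taken verbatim,
        // so a trailing "-D x=1" is treated as data, not as an option.
        for (; i < args.length; i++) {
            positional.add(args[i]);
        }
    }

    public static void main(String[] argv) {
        FrontOnlyParser p = new FrontOnlyParser(
            new String[] {"-fs", "file:///", "input", "-D", "x=1"});
        System.out.println(p.options);     // only the leading option is seen
        System.out.println(p.positional);  // "-D" and "x=1" fall through
    }
}
```

Scanning the full argument list for recognized options (or requiring options strictly before positionals and erroring otherwise) would avoid this silent drop.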
change location of the native libraries to lib instead of lib/native
Key: HADOOP-7996
URL: https://issues.apache.org/jira/browse/HADOOP-7996
Project: Hadoop Common
Issue Ty
I'm confused about the disparity of block sizes between BlockCompressorStream
and SnappyCompressor.
BlockCompressorStream has default MAX_INPUT_SIZE on the order of 512 bytes,
whereas SnappyCompressor has IO_COMPRESSION_CODEC_SNAPPY_BUFFERSIZE_DEFAULT of
256kB.
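For context, the block framing that both classes share can be sketched as follows: input larger than the block size is split into chunks, and each chunk is written as a length-prefixed compressed block. This is a simplified, hypothetical illustration using java.util.zip.Deflater in place of Snappy; the 512-byte figure comes from the discussion above, not from reading Hadoop's source.

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.zip.Deflater;

// Simplified sketch of block-compression framing: each chunk is emitted as
// [original length][compressed length][compressed bytes]. Uses Deflater
// instead of Snappy; not Hadoop's actual BlockCompressorStream.
public class BlockFraming {
    static byte[] compressInBlocks(byte[] input, int maxInputSize) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        DataOutputStream dos = new DataOutputStream(out);
        for (int off = 0; off < input.length; off += maxInputSize) {
            int len = Math.min(maxInputSize, input.length - off);
            Deflater d = new Deflater();
            d.setInput(input, off, len);
            d.finish();
            // Scratch buffer sized generously so one pass always finishes.
            byte[] buf = new byte[len * 2 + 64];
            int clen = 0;
            while (!d.finished()) {
                clen += d.deflate(buf, clen, buf.length - clen);
            }
            d.end();
            dos.writeInt(len);   // uncompressed block length
            dos.writeInt(clen);  // compressed block length
            dos.write(buf, 0, clen);
        }
        dos.flush();
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // 1300 bytes with a 512-byte block size splits into blocks of
        // 512, 512, and 276 bytes, each independently compressed.
        byte[] framed = compressInBlocks(new byte[1300], 512);
        System.out.println(framed.length);
    }
}
```

With a small block size like 512 bytes, the per-block length headers and per-block compressor state dominate, which is why the gap between the two defaults matters for throughput and compression ratio.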
In BlockCompressorStream.write()
[
https://issues.apache.org/jira/browse/HADOOP-6844?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Eli Collins resolved HADOOP-6844.
-
Resolution: Duplicate
HADOOP-7247 duped this.
> Update docs to reflect new jar