See http://hudson.zones.apache.org/hudson/job/Hadoop-Hdfs-trunk/45/
--
[...truncated 317708 lines...]
[junit] 2009-08-10 15:21:36,540 INFO common.Storage (DataStorage.java:recoverTransitionRead(123)) - Formatting ...
[junit] 2009-08-10 15:21:36,943
Required avro classes are missing
-
Key: HDFS-534
URL: https://issues.apache.org/jira/browse/HDFS-534
Project: Hadoop HDFS
Issue Type: Bug
Reporter: Tsz Wo (Nicholas), SZE
Some tests like TestDFS
TestFileCreation occasionally fails because of an exception in DataStreamer
--
Key: HDFS-535
URL: https://issues.apache.org/jira/browse/HDFS-535
Project: Hadoop HDFS
Support hflush at DFSClient
---
Key: HDFS-536
URL: https://issues.apache.org/jira/browse/HDFS-536
Project: Hadoop HDFS
Issue Type: Sub-task
Reporter: Hairong Kuang
Assignee: Hairong Kuang
-
I have to admit that I don't know the official answer. The hack below seems
to work:
- compile all 3 sub-projects;
- copy everything in hdfs/build and mapreduce/build to common/build;
- then run hadoop by the scripts in common/bin as before.
Any better idea?
Nicholas Sze
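The three steps above can be sketched as a small script. This is only a hedged sketch: the sibling directory names (common, hdfs, mapreduce) and the ant target are assumptions based on this thread, not a verified recipe.

```shell
#!/bin/bash
# Sketch of the hack described above; dir names and ant target are assumptions.
set -o errexit

merge_builds() {
  local root="$1"
  # 1. compile all 3 sub-projects (uncomment if ant is on the PATH)
  # for proj in common hdfs mapreduce; do (cd "$root/$proj" && ant compile); done
  # 2. copy everything in hdfs/build and mapreduce/build into common/build
  cp -R "$root/hdfs/build/." "$root/common/build/"
  cp -R "$root/mapreduce/build/." "$root/common/build/"
  # 3. then run hadoop by the scripts in common/bin as before
}
```

After merging, one would launch the daemons with the scripts under common/bin, as the message above suggests.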
I'd hazard a guess and say we should hitch our wagon to https://issues.apache.org/jira/browse/HADOOP-5107
.
Arun
On Aug 10, 2009, at 5:25 PM, Tsz Wo (Nicholas), Sze wrote:
I have to admit that I don't know the official answer. The hack
below seems to work:
- compile all 3 sub-projects;
- c
DataNode uses ReplicaBeingWritten to support dfs writes/hflush
--
Key: HDFS-537
URL: https://issues.apache.org/jira/browse/HDFS-537
Project: Hadoop HDFS
Issue Type: Sub-task
Hi Todd,
Two problems:
- The patch in HADOOP-6152 cannot be applied.
- I have tried an approach similar to the one described in the slides, but it
did not work since Jetty cannot find the webapps directory. See below:
2009-08-10 17:54:41,671 WARN org.mortbay.log: Web application not found
file:
[
https://issues.apache.org/jira/browse/HDFS-525?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Tsz Wo (Nicholas), SZE reopened HDFS-525:
-
Reopen for committing this to 0.20.
> ListPathsServlet.java uses static SimpleDateFormat
[
https://issues.apache.org/jira/browse/HDFS-525?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Tsz Wo (Nicholas), SZE resolved HDFS-525.
-
Resolution: Fixed
Fix Version/s: 0.20.1
I have also committed this to 0.20.1
Yeah, I'm hitting the same issues. The patch problems weren't really an
issue (same-line-for-same-line conflict on my checkout), but not having the
webapps is sort of a pain.
Looks like ant bin-package puts the webapps dir in
HDFS_HOME/build/hadoop-hdfs-0.21.0-dev/webapps, while the daemon's expecting it elsewhere.
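One workaround for this kind of path mismatch (my guess, not something proposed in the thread; both paths are assumptions about the build layout) is to symlink the packaged webapps dir to wherever the daemon resolves it:

```shell
#!/bin/bash
# Hypothetical workaround: link the built webapps dir to the location the
# daemon searches. Both paths are assumptions, not verified against trunk.
set -o errexit

link_webapps() {
  local hdfs_home="$1"
  # point $hdfs_home/build/webapps at the packaged webapps directory
  ln -sfn "$hdfs_home/build/hadoop-hdfs-0.21.0-dev/webapps" \
          "$hdfs_home/build/webapps"
}
```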
FWIW, I've been using the following simple shell script:
[0]doorstop:hadoop(149128)$ cat damnit.sh
#!/bin/bash
set -o errexit
set -x
cd hadoop-common
ant binary
cd ..
cd hadoop-hdfs
ant binary
cd ..
cd hadoop-mapreduce
ant binary
cd ..
mkdir -p all/bin all/lib all/contrib
cp hadoop-common/bin/*
Hi Philip,
Tried the script. It seems that the script could start a cluster, but the web
page did not work. I got the following error from the web interface:
HTTP ERROR: 404
/dfshealth.jsp
RequestURI=/dfshealth.jsp
Powered by Jetty://
Thanks,
Nicholas
----- Original Message ----
> From: Philip
Hey Nicholas,
Aaron gave a presentation with his best guess at the HUG last month. His
slides are here: http://www.cloudera.com/blog/2009/07/17/the-project-split/
(starting at slide 16)
(I'd let him reply himself, but he's out of the office this afternoon ;-) )
Hopefully we'll get towards something