Exclude Private elements from generated Javadoc
---
Key: HADOOP-6658
URL: https://issues.apache.org/jira/browse/HADOOP-6658
Project: Hadoop Common
Issue Type: Sub-task
Components: documen
Oleg, if you specify the directory, MapReduce will take all the files.
Regards,
Zacarias
On Tue, Mar 23, 2010 at 3:08 AM, Oleg Ruchovets wrote:
> Hi ,
> All the examples that I found execute a MapReduce job on a single file, but in my
> situation I have more than one.
>
> Suppose I have such folder on
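Zacarias's point can be sketched with a small stand-in: when a directory is given as the input path, Hadoop's FileInputFormat enumerates the regular files inside it, skipping "hidden" entries whose names start with `.` or `_` (e.g. `_SUCCESS`, `.crc` files). The Python function below mimics that listing over a local directory — local paths stand in for HDFS ones, and this is an illustration of the behavior, not Hadoop's actual code:

```python
import os

def list_input_files(input_dir):
    """Mimic FileInputFormat's default behavior: given a directory,
    collect every regular file inside it, skipping 'hidden' entries
    whose names start with '.' or '_'."""
    files = []
    for name in sorted(os.listdir(input_dir)):
        if name.startswith(('.', '_')):
            continue  # Hadoop's default input filter skips these
        path = os.path.join(input_dir, name)
        if os.path.isfile(path):
            files.append(path)
    return files
```

So with a directory such as `/my_hadoop_hdfs/my_folder` as the job's input path, each `file1.txt`, `file2.txt`, ... is picked up automatically; no per-file configuration is needed.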
Common portion of MAPREDUCE-1545
Key: HADOOP-6657
URL: https://issues.apache.org/jira/browse/HADOOP-6657
Project: Hadoop Common
Issue Type: Improvement
Reporter: Luke Lu
Assignee: Luke Lu
[ https://issues.apache.org/jira/browse/HADOOP-6645?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Rodrigo Schmidt reopened HADOOP-6645:
-
I didn't cover the case in which the parent directory is "/"
> Bugs on listStatus for HarFi
We recommend that people use Amazon S3 as the durable store when using Elastic
MapReduce. We consider the HDFS on Elastic MapReduce clusters to be transient.
With that said, you need some way to get your data into S3 from HDFS. We
recommend storing the files directly in S3 (with S3N) and not usi
[ https://issues.apache.org/jira/browse/HADOOP-6655?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Konstantin Boudnik resolved HADOOP-6655.
Resolution: Invalid
I'm proposing to close this JIRA because I've misread the test
On Mar 23, 2010, at 7:34 AM, Paolo Castagna wrote:
Should they continue to be called '-core*' even if they belong to the
Hadoop Common project?
One of the blockers on 0.21 was to rename the artifacts to common. The
artifacts for 0.20, which are unified, should stay core.
https://issues.apa
On Tue, Mar 23, 2010 at 07:34AM, Paolo Castagna wrote:
> Konstantin Boudnik wrote:
> > I believe they are commonly called '-core*' :)
>
> Should they continue to be called '-core*' even if they belong to the
> Hadoop Common project?
Last I checked, they are called *-core in the current trunk. Rena
Konstantin Boudnik wrote:
I believe they are commonly called '-core*' :)
Should they continue to be called '-core*' even if they belong to the
Hadoop Common project?
Or, should... with some (distributed) pain be named '-common*'?
Paolo
On Tue, Mar 23, 2010 at 07:07AM, Paolo Castagna wrote:
I believe they are commonly called '-core*' :)
On Tue, Mar 23, 2010 at 07:07AM, Paolo Castagna wrote:
> Hi,
> how should the Hadoop Common artifacts be called:
>
> - hadoop-core-{version}.jar
> - hadoop-core-{version}-sources.jar
> - hadoop-core-{version}-javadoc.jar
>
> or:
>
> - hadoo
Hi,
how should the Hadoop Common artifacts be called:
- hadoop-core-{version}.jar
- hadoop-core-{version}-sources.jar
- hadoop-core-{version}-javadoc.jar
or:
- hadoop-common-{version}.jar
- hadoop-common-{version}-sources.jar
- hadoop-common-{version}-javadoc.jar
?
Paolo
Hi ,
All the examples that I found execute a MapReduce job on a single file, but in my
situation I have more than one.
Suppose I have such folder on HDFS which contains some files:
/my_hadoop_hdfs/my_folder:
/my_hadoop_hdfs/my_folder/file1.txt
/my_hadoop_hdfs/my_fold