+1
Thanks Wangda for the proposal.
I would like to participate in this project. Please also add me to the
project.
Regards
Devaraj K
On Mon, Sep 2, 2019 at 8:50 PM zac yuan wrote:
> +1
>
> Submarine will be a complete solution for AI service development. It can
> take advantage
Devaraj K created HADOOP-16046:
--
Summary: [JDK 11]
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/webapp/hamlet
classes make compilation fail
Key: HADOOP-16046
URL: https
Devaraj K created HADOOP-15938:
--
Summary: [JDK 11] hadoop-annotations build fails with 'Failed to
check signatures'
Key: HADOOP-15938
URL: https://issues.apache.org/jira/browse/HADOOP-15938
Devaraj K created HADOOP-15937:
--
Summary: [JDK 11] Update maven-shade-plugin.version to 3.2.1
Key: HADOOP-15937
URL: https://issues.apache.org/jira/browse/HADOOP-15937
Project: Hadoop Common
Devaraj K created HADOOP-15936:
--
Summary: [JDK 11] MiniDFSClusterManager & MiniHadoopClusterManager
compilation fails due to the usage of '_' as identifier
Key: HADOOP-15936
URL: https://issues.a
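For context on the '_' failures above: starting with Java 9, '_' on its own is a
reserved keyword and can no longer be used as an identifier, which is why code that
compiled on JDK 8 breaks on JDK 11. A minimal illustration (not taken from the
Hadoop sources):
{code:title=UnderscoreIdentifier.java|borderStyle=solid}
public class UnderscoreIdentifier {
    public static void main(String[] args) {
        // Compiles (with a warning) on JDK 8, but fails on JDK 9 and later with:
        //   error: as of release 9, '_' is a keyword, and may not be used as an identifier
        String _ = "no longer a legal identifier";
        System.out.println(_);
    }
}
{code}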
Devaraj K created HADOOP-15935:
--
Summary: [JDK 11] Update maven.plugin-tools.version to 3.6.0
Key: HADOOP-15935
URL: https://issues.apache.org/jira/browse/HADOOP-15935
Project: Hadoop Common
>> > >>
> >> > >> On Thursday, April 2, 2015 12:57 PM, Allen Wittenauer <
> >> > >> a...@altiscale.com > wrote:
> >> > >>
> >> > >>
> >> > >>
> >> > >>
> >> > >>
> >> > >> On Apr 2, 2015, at 12:40 PM, Vinod Kumar Vavilapalli <
> >> > >> vino...@hortonworks.com > wrote:
> >> > >>
> >> > >> >
> >> > >> > We'd then be doing two commits for every patch. Let's simply not
> remove
> >> > >> CHANGES.txt from trunk, keep the existing dev workflow, but doc the
> >> > >>release
> >> > >> process to remove CHANGES.txt in trunk at the time of a release
> going
> >> > >>out
> >> > >> of trunk.
> >> > >>
> >> > >>
> >> > >>
> >> > >> Might as well copy branch-2's changes.txt into trunk then. (or
> 2.7's.
> >> > >> Last I looked, people updated branch-2 and not 2.7's or vice versa
> for
> >> > >>some
> >> > >> patches that went into both branches.) So that folks who are
> >> > >>committing to
> >> > >> both branches and want to cherry pick all changes can.
> >> > >>
> >> > >> I mean, trunk's is very very very wrong. Right now. Today.
> Borderline
> >> > >> useless. See HADOOP-11718 (which I will now close out as won't
> fix)…
> >> and
> >> > >> that jira is only what is miscategorized, not what is missing.
> >> > >>
> >> > >>
> >> > >>
> >> > >>
> >> >
> >> >
> >>
> >> --
> >> Mobile
> >>
>
>
--
Thanks
Devaraj K
> >>>>>> *http://people.apache.org/~vinodkv/hadoop-2.7.1-RC0/
> >>>>>> <http://people.apache.org/~vinodkv/hadoop-2.7.1-RC0/>*
> >>>>>>
> >>>>>> The RC tag in git is: release-2.7.1-RC0
> >>>>>>
> >>>>>> The maven artifacts are available via repository.apache.org at
> >>>>>> *https://repository.apache.org/content/repositories/orgapachehadoop-1019/
> >>>>>> <https://repository.apache.org/content/repositories/orgapachehadoop-1019/>*
> >>>>>>
> >>>>>> Please try the release and vote; the vote will run for the usual 5
> >>>> days.
> >>>>>>
> >>>>>> Thanks,
> >>>>>> Vinod
> >>>>>>
> >>>>>> PS: It took 2 months instead of the planned [1] 2 weeks in getting
> >>>>>>this
> >>>>>> release out: post-mortem in a separate thread.
> >>>>>>
> >>>>>> [1]: A 2.7.1 release to follow up 2.7.0
> >>>>>> http://markmail.org/thread/zwzze6cqqgwq4rmw
> >>>>>
> >>>>>
> >>>>
> >>
> >>
> >>
> >> --
> >> Lei (Eddy) Xu
> >> Software Engineer, Cloudera
> >>
> >
> >
>
>
--
Thanks
Devaraj K
ould be a "Adoption of New Codebase" kind of
> vote and will be Lazy 2/3 majority of PMC members.
>
--
Thanks
Devaraj K
http://svn.apache.org/viewvc/hadoop/common/tags/release-0.23.11-rc0/
>
> The maven artifacts are available via repository.apache.org.
>
> Please try the release and vote; the vote will run for the usual 7 days
> til June 26th.
>
> I am +1 (binding).
>
> thanks,
> Tom Graves
>
>
>
>
>
--
Thanks
Devaraj K
+1
Thanks
Devaraj K
On Tue, Jun 24, 2014 at 2:23 PM, Arun C Murthy wrote:
> Folks,
>
> As discussed, I'd like to call a vote on changing our by-laws to change
> release votes from 7 days to 5.
>
> I've attached the change to by-laws I'm proposing.
>
>
+1 (non-binding)
I verified by running some MapReduce examples; they work fine.
Thanks
Devaraj k
-Original Message-
From: Arun C Murthy [mailto:a...@hortonworks.com]
Sent: 07 October 2013 12:31
To: common-dev@hadoop.apache.org; hdfs-...@hadoop.apache.org;
yarn-...@hadoop.apache.org
You could also refer to this for the build steps.
http://svn.apache.org/repos/asf/hadoop/common/trunk/BUILDING.txt
Thanks
Devaraj k
From: Matt Fellows [mailto:matt.fell...@bespokesoftware.com]
Sent: 03 September 2013 12:40
To: common-dev@hadoop.apache.org
Subject: Re: New dev. environment issue
+1
I downloaded the release and ran some examples; they work fine.
Thanks
Devaraj k
-Original Message-
From: Konstantin Boudnik [mailto:c...@apache.org]
Sent: 16 August 2013 11:00
To: common-dev@hadoop.apache.org; hdfs-...@hadoop.apache.org;
mapreduce-...@hadoop.apache.org; yarn
[
https://issues.apache.org/jira/browse/HADOOP-9794?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Devaraj K resolved HADOOP-9794.
---
Resolution: Invalid
If you find any bug in Hadoop, please raise a JIRA.
> dsd
+1, downloaded the release, verified the signatures, ran examples and they succeeded.
Thanks
Devaraj k
-Original Message-
From: Thomas Graves [mailto:tgra...@yahoo-inc.com]
Sent: 01 July 2013 22:50
To: common-dev@hadoop.apache.org
Cc: hdfs-...@hadoop.apache.org; mapreduce-...@hadoop.apache.org
Hi Arun,
Is there any possibility of including YARN-41 in this release?
Thanks
Devaraj K
-Original Message-
From: Arun C Murthy [mailto:a...@hortonworks.com]
Sent: 19 June 2013 12:29
To: mapreduce-...@hadoop.apache.org; common-dev@hadoop.apache.org;
hdfs-...@hadoop.apache.org
You also need to run the command: mvn eclipse:eclipse
You can go through this page http://wiki.apache.org/hadoop/EclipseEnvironment
Thanks
Devaraj K
On 6/17/13, Chandrashekhar Kotekar wrote:
> Hi,
>
> Just now I checked out latest hadoop code from SVN and imported all the
&
Devaraj K created HADOOP-8578:
-
Summary: Provide a mechanism for cleaning config items from
LocalDirAllocator which will not be used anymore
Key: HADOOP-8578
URL: https://issues.apache.org/jira/browse/HADOOP-8578
Good to hear, Hasan.
Welcome to the Hadoop community. You can find more details on how to contribute
at the link below.
http://wiki.apache.org/hadoop/HowToContribute
Thanks
Devaraj
From: Hasan Gürcan [hasan.guer...@googlemail.com]
Sent: Tuesday, May 2
Devaraj K created HADOOP-8439:
-
Summary: Update hadoop-setup-conf.sh to support yarn configurations
Key: HADOOP-8439
URL: https://issues.apache.org/jira/browse/HADOOP-8439
Project: Hadoop Common
Devaraj K created HADOOP-8438:
-
Summary: hadoop-validate-setup.sh refers to examples jar file
which doesn't exist
Key: HADOOP-8438
URL: https://issues.apache.org/jira/browse/HADOOP-8438
Project: H
This can be due to any of these reasons (see the sketch after this list for a
quick check of reasons 1 and 3):
1. No DataNode instances are up and running.
2. The DataNode instances cannot talk to the server, due to networking or
Hadoop configuration problems.
3. Your DataNode instances have no hard disk space left in their configured data
directories.
4. Your Da
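As a quick programmatic check for reasons 1 and 3, the sketch below lists the live
DataNodes the NameNode reports and their remaining space. It assumes the HDFS client
libraries are on the classpath and that fs.defaultFS points at your cluster; the
class name is only illustrative.
{code:title=DataNodeCheck.java|borderStyle=solid}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

public class DataNodeCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        if (!(fs instanceof DistributedFileSystem)) {
            System.err.println("fs.defaultFS does not point at HDFS: " + fs.getUri());
            return;
        }
        // Reason 1: zero live DataNodes means none are up or none can reach the NameNode.
        DatanodeInfo[] nodes = ((DistributedFileSystem) fs).getDataNodeStats();
        System.out.println("Live DataNodes reported by the NameNode: " + nodes.length);
        for (DatanodeInfo node : nodes) {
            // Reason 3: remaining space near zero points at full data directories.
            System.out.println(node.getHostName() + " remaining bytes: " + node.getRemaining());
        }
    }
}
{code}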
Hi Lopez,
You need to have a default constructor for TaggedWritable because, while
deserializing, Hadoop creates an instance using the default constructor and then
calls readFields() on it.
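The shape of the fix is just an explicit no-arg constructor that leaves the fields
in a state readFields() can populate. A minimal sketch of such a wrapper (the field
names are illustrative, not necessarily those of your TaggedWritable):
{code:title=TaggedWritable.java|borderStyle=solid}
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;

public class TaggedWritable implements Writable {
    private Text tag;
    private Text data;

    // Required: Hadoop instantiates the class reflectively with this
    // constructor during deserialization, then calls readFields() on it.
    public TaggedWritable() {
        this.tag = new Text();
        this.data = new Text();
    }

    public TaggedWritable(Text tag, Text data) {
        this.tag = tag;
        this.data = data;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        tag.write(out);
        data.write(out);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        tag.readFields(in);
        data.readFields(in);
    }
}
{code}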
Thanks
Devaraj
From: LopezGG [lopezgilsi...@gmail.com]
Sent: Wednesday, Apr
Project: Hadoop Common
Issue Type: Bug
Components: scripts
Affects Versions: 2.0.0, 3.0.0
Reporter: Devaraj K
Assignee: Devaraj K
Priority: Minor
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please
I tried applying your patches in my environment; they work fine.
The issue with Hadoop QA may be because the patch file contains ^M (carriage
return) characters. Can you try removing the ^M characters from the patch?
Thanks
Devaraj
From: Mostafa Elhemali [mosta
Type: Bug
Components: metrics
Affects Versions: 0.24.0
Reporter: Devaraj K
Assignee: Devaraj K
{code:title=MBeans.java|borderStyle=solid}
static public ObjectName register(String serviceName, String nameName,
Object the
Components: build
Affects Versions: 0.24.0, 0.23.1
Reporter: Devaraj K
{code:xml}
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/home/hadoop/install/hadoop-0.23.1-SNAPSHOT/share/hadoop/common/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl
[
https://issues.apache.org/jira/browse/HADOOP-4160?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Devaraj K resolved HADOOP-4160.
---
Resolution: Duplicate
Duplicate of MAPREDUCE-2903.
> Job tracker browser log sh
Can you check the MultipleOutputs class?
http://hadoop.apache.org/common/docs/r0.20.205.0/api/org/apache/hadoop/mapred/lib/MultipleOutputs.html
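Roughly, with the old mapred API you declare a named output on the JobConf in the
driver and then write to it from the task through a collector. A minimal sketch,
with placeholder names such as "summary" (adapt to your job):
{code:title=SummaryReducer.java|borderStyle=solid}
import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.TextOutputFormat;
import org.apache.hadoop.mapred.lib.MultipleOutputs;

public class SummaryReducer extends MapReduceBase
    implements Reducer<Text, Text, Text, Text> {

  // In the driver, before submitting the job:
  //   MultipleOutputs.addNamedOutput(conf, "summary",
  //       TextOutputFormat.class, Text.class, Text.class);

  private MultipleOutputs mos;

  @Override
  public void configure(JobConf job) {
    mos = new MultipleOutputs(job);
  }

  @Override
  public void reduce(Text key, Iterator<Text> values,
      OutputCollector<Text, Text> output, Reporter reporter) throws IOException {
    while (values.hasNext()) {
      Text value = values.next();
      output.collect(key, value);                                  // regular job output
      mos.getCollector("summary", reporter).collect(key, value);   // named output
    }
  }

  @Override
  public void close() throws IOException {
    mos.close();  // flushes the extra named outputs
  }
}
{code}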
Devaraj K
-Original Message-
From: Bhavesh Shah [mailto:bhavesh25s...@gmail.com]
Sent: Tuesday, December 27, 2011 3:29 PM
To: hadoop
Project: Hadoop Common
Issue Type: Bug
Components: scripts
Affects Versions: 0.24.0
Reporter: Devaraj K
Assignee: Devaraj K
hadoop-common-project\hadoop-common\src\main\packages\hadoop-setup-conf.sh has
following issues
1. check_permission
: scripts
Affects Versions: 0.24.0
Reporter: Devaraj K
Assignee: Devaraj K
Fix For: 0.24.0
When we execute start-dfs.sh, it gives the below error.
{code:xml}
linux124:/home/devaraj/NextGenMR/Hadoop-0.24-09082011/hadoop-hdfs-0.24.0-SNAPSHOT/sbin
: 0.23.0
Reporter: Devaraj K
{code:title=IOUtils.java|borderStyle=solid}
try {
copyBytes(in, out, buffSize);
} finally {
if(close) {
out.close();
in.close();
}
}
{code}
In the above code, if any exception is thrown from the out.close() statement,
in.close() will not be executed and the input stream will be left open.
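One way to make this robust, shown here only as a sketch (not necessarily how
IOUtils itself was changed), is to guard each close() so that a failure in one
stream cannot skip the other:
{code:title=CloseBoth.java|borderStyle=solid}
import java.io.Closeable;
import java.io.IOException;

public final class CloseBoth {
  private CloseBoth() {}

  /** Close both streams, never letting a failed close() on one skip the other. */
  public static void closeBoth(Closeable out, Closeable in) throws IOException {
    try {
      if (out != null) {
        out.close();
      }
    } finally {
      if (in != null) {
        in.close();
      }
    }
  }
}
{code}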
URL: https://issues.apache.org/jira/browse/HADOOP-7130
Project: Hadoop Common
Issue Type: Bug
Components: fs
Affects Versions: 0.20.2
Reporter: Devaraj K
1. Pull out one hard disk from the TaskTracker node (out of the 10 disks, pull one).
Now it is noted that some jobs
Type: Improvement
Components: conf
Affects Versions: 0.21.0
Environment: NA
Reporter: Devaraj K
Priority: Minor
Fix For: 0.22.0
The retry count for socket connection failures is hard coded as 45, and it gives
the retrying message for 45
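A common way to address this kind of hardcoded limit is to read it from the
Configuration with the old value as the default. A minimal sketch; the key name
below is an assumption used for illustration, not necessarily the property Hadoop
settled on:
{code:title=ConnectRetrySketch.java|borderStyle=solid}
import org.apache.hadoop.conf.Configuration;

public class ConnectRetrySketch {
  /** Mirrors the previously hardcoded retry count mentioned above. */
  private static final int DEFAULT_CONNECT_MAX_RETRIES = 45;
  /** Hypothetical key name, used here for illustration only. */
  private static final String CONNECT_MAX_RETRIES_KEY =
      "ipc.client.connect.max.retries.on.timeouts";

  public static int getConnectMaxRetries(Configuration conf) {
    return conf.getInt(CONNECT_MAX_RETRIES_KEY, DEFAULT_CONNECT_MAX_RETRIES);
  }

  public static void main(String[] args) {
    System.out.println("max retries = " + getConnectMaxRetries(new Configuration()));
  }
}
{code}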