Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Jacek Laskowski
Hi,

That's even more interesting. How so, since the profile was added a week
ago or later and RC2 was cut two/three days ago? Anyone know?

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Sep 25, 2016 at 5:09 AM, Marcelo Vanzin  wrote:
> There is no "mesos" profile in 2.0.1.
>
> On Sat, Sep 24, 2016 at 2:19 PM, Jacek Laskowski  wrote:
>> Hi,
>>
>> I keep asking myself why are you guys not including -Pmesos in your
>> builds? Is this on purpose or have you overlooked it?
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> 
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sat, Sep 24, 2016 at 9:25 PM, Dongjoon Hyun  wrote:
>>> +1 (non binding)
>>>
>>> I compiled and tested on the following two systems.
>>>
>>> - CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1 with -Pyarn -Phadoop-2.7
>>> -Pkinesis-asl -Phive -Phive-thriftserver -Dsparkr
>>> - CentOS 7.2 / Open JDK 1.8.0_102 with -Pyarn -Phadoop-2.7 -Pkinesis-asl
>>> -Phive -Phive-thriftserver
>>>
>>> Bests,
>>> Dongjoon.
>>>
>>>
>>> On Fri, Sep 23, 2016 at 3:32 PM, Jacek Laskowski  wrote:

 Hi,

 Not that it could fix the issue but no -Pmesos?

 Jacek


 On 24 Sep 2016 12:08 a.m., "Sean Owen"  wrote:
>
> +1 Signatures and hashes check out. I checked that the Kinesis
> assembly artifacts are not present.
>
> I compiled and tested on Java 8 / Ubuntu 16 with -Pyarn -Phive
> -Phive-thriftserver -Phadoop-2.7 -Psparkr and only saw one test
> problem. This test never completed. If nobody else sees it, +1,
> assuming it's a bad test or env issue.
>
> - should clone and clean line object in ClosureCleaner *** FAILED ***
>   isContain was true Interpreter output contained 'Exception':
>   Welcome to
>   __
>/ __/__  ___ _/ /__
>   _\ \/ _ \/ _ `/ __/  '_/
>  /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
> /_/
>
>   Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_91)
>   Type in expressions to have them evaluated.
>   Type :help for more information.
>
>   scala> // Entering paste mode (ctrl-D to finish)
>
>
>   // Exiting paste mode, now interpreting.
>
>   org.apache.spark.SparkException: Job 0 cancelled because
> SparkContext was shut down
> at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:818)
> ...
>
>
> On Fri, Sep 23, 2016 at 7:01 AM, Reynold Xin  wrote:
> > Please vote on releasing the following candidate as Apache Spark
> > version
> > 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and
> > passes
> > if a majority of at least 3+1 PMC votes are cast.
> >
> > [ ] +1 Release this package as Apache Spark 2.0.1
> > [ ] -1 Do not release this package because ...
> >
> >
> > The tag to be voted on is v2.0.1-rc2
> > (04141ad49806a48afccc236b699827997142bd57)
> >
> > This release candidate resolves 284 issues:
> > https://s.apache.org/spark-2.0.1-jira
> >
> > The release files, including signatures, digests, etc. can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
> >
> > Release artifacts are signed with the following key:
> > https://people.apache.org/keys/committer/pwendell.asc
> >
> > The staging repository for this release can be found at:
> > https://repository.apache.org/content/repositories/orgapachespark-1199
> >
> > The documentation corresponding to this release can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
> >
> >
> > Q: How can I help test this release?
> > A: If you are a Spark user, you can help us test this release by taking
> > an
> > existing Spark workload and running on this release candidate, then
> > reporting any regressions from 2.0.0.
> >
> > Q: What justifies a -1 vote for this release?
> > A: This is a maintenance release in the 2.0.x series.  Bugs already
> > present
> > in 2.0.0, missing features, or bugs related to new features will not
> > necessarily block this release.
> >
> > Q: What happened to 2.0.1 RC1?
> > A: There was an issue with RC1 R documentation during release candidate
> > preparation. As a result, rc1 was canceled before a vote was called.
> >
>
>
>>>
>>

Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Sean Owen
It was added to the master branch, and this is a release from the 2.0.x branch.

On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
> Hi,
>
> That's even more interesting. How's so since the profile got added a
> week ago or later and RC2 was cut two/three days ago? Anyone know?
>




Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Jacek Laskowski
Hi Sean,

Sure, but then the question is why it's not a part of 2.0.1? I thought
it was considered ready for prime time and so should be shipped in
2.0.1.

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
> It was added to the master branch, and this is a release from the 2.0.x 
> branch.
>
> On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
>> Hi,
>>
>> That's even more interesting. How's so since the profile got added a
>> week ago or later and RC2 was cut two/three days ago? Anyone know?
>>




Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Sean Owen
It's a change to the structure of the project, and probably not
appropriate for a maintenance release. 2.0.1 core would then no longer
contain Mesos code while 2.0.0 did.

On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski  wrote:
> Hi Sean,
>
> Sure, but then the question is why it's not a part of 2.0.1? I thought
> it was considered ready for prime time and so should be shipped in
> 2.0.1.
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
>> It was added to the master branch, and this is a release from the 2.0.x 
>> branch.
>>
>> On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
>>> Hi,
>>>
>>> That's even more interesting. How's so since the profile got added a
>>> week ago or later and RC2 was cut two/three days ago? Anyone know?
>>>




Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Jacek Laskowski
Hi Sean,

So, another question would be: when is the change going to be released,
then? What's the version for master? The next release is 2.0.2, so it's
not for the mesos profile either :(

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Sep 25, 2016 at 1:27 PM, Sean Owen  wrote:
> It's a change to the structure of the project, and probably not
> appropriate for a maintenance release. 2.0.1 core would then no longer
> contain Mesos code while 2.0.0 did.
>
> On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski  wrote:
>> Hi Sean,
>>
>> Sure, but then the question is why it's not a part of 2.0.1? I thought
>> it was considered ready for prime time and so should be shipped in
>> 2.0.1.
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> 
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
>>> It was added to the master branch, and this is a release from the 2.0.x 
>>> branch.
>>>
>>> On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
 Hi,

 That's even more interesting. How's so since the profile got added a
 week ago or later and RC2 was cut two/three days ago? Anyone know?





Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Sean Owen
Master is implicitly 2.1.x right now. When branch-2.1 is cut, master
becomes the de facto 2.2.x branch. It's not true that the next release
is 2.0.2. You can see the master version:
https://github.com/apache/spark/blob/master/pom.xml#L29
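(At the time of this thread that line should read something like
<version>2.1.0-SNAPSHOT</version> -- i.e. master is already versioned as
the next minor release.)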

On Sun, Sep 25, 2016 at 12:30 PM, Jacek Laskowski  wrote:
> Hi Sean,
>
> So, another question would be when is the change going to be released
> then? What's the version for the master? The next release's 2.0.2 so
> it's not for mesos profile either :(
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Sep 25, 2016 at 1:27 PM, Sean Owen  wrote:
>> It's a change to the structure of the project, and probably not
>> appropriate for a maintenance release. 2.0.1 core would then no longer
>> contain Mesos code while 2.0.0 did.
>>
>> On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski  wrote:
>>> Hi Sean,
>>>
>>> Sure, but then the question is why it's not a part of 2.0.1? I thought
>>> it was considered ready for prime time and so should be shipped in
>>> 2.0.1.
>>>
>>> Pozdrawiam,
>>> Jacek Laskowski
>>> 
>>> https://medium.com/@jaceklaskowski/
>>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>>> Follow me at https://twitter.com/jaceklaskowski
>>>
>>>
>>> On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
 It was added to the master branch, and this is a release from the 2.0.x 
 branch.

 On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
> Hi,
>
> That's even more interesting. How's so since the profile got added a
> week ago or later and RC2 was cut two/three days ago? Anyone know?
>




Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Sean Owen
+1 binding. Same as for RC2 -- it all checks out, from license to sigs
to compile and test. We have no issues of any kind targeted for 2.0.1.

I do have one test failure that I have seen with some regularity in
Ubuntu but not on any Jenkins machines. One of the YARN tests will
just kill all my ssh sessions (!)

YarnClusterSuite:
...
- run Spark in yarn-cluster mode with additional jar
- run Spark in yarn-cluster mode unsuccessfully
Connection to ... closed by remote host.

Every time. I wonder if anyone's seen that while testing? But it's no blocker.
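
In case anyone wants to reproduce it in isolation, a rough sketch of running
just that suite with the same profiles (module and suite names from memory,
so treat the exact flags as approximate):

  ./build/mvn -pl :spark-yarn_2.11 -Pyarn -Phadoop-2.7 \
    -Dtest=none -DwildcardSuites=org.apache.spark.deploy.yarn.YarnClusterSuite test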

On Sat, Sep 24, 2016 at 11:08 PM, Reynold Xin  wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and passes if a
> majority of at least 3+1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.0.1
> [ ] -1 Do not release this package because ...
>
>
> The tag to be voted on is v2.0.1-rc3
> (9d28cc10357a8afcfb2fa2e6eecb5c2cc2730d17)
>
> This release candidate resolves 290 issues:
> https://s.apache.org/spark-2.0.1-jira
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1201/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-docs/
>
>
> Q: How can I help test this release?
> A: If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running on this release candidate, then
> reporting any regressions from 2.0.0.
>
> Q: What justifies a -1 vote for this release?
> A: This is a maintenance release in the 2.0.x series.  Bugs already present
> in 2.0.0, missing features, or bugs related to new features will not
> necessarily block this release.
>
> Q: What fix version should I use for patches merging into branch-2.0 from
> now on?
> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new RC
> (i.e. RC4) is cut, I will change the fix version of those patches to 2.0.1.
>
>




Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Jacek Laskowski
Hi Sean,

I remember a similar discussion about releases in Spark and I must
admit it again -- I simply don't get it. I seem not to have paid
enough attention to the details to appreciate it. I apologize for asking
the very same questions again and again. Sorry.

Re the next release, I was referring to JIRA, where 2.0.2 came up quite
recently for issues not included in 2.0.1. This disconnect between
releases and JIRA versions causes even more frustration whenever I'm
asked what the next release is going to be and when. It's not as
simple as I think it should be (for me).

(I really hope it's only me with this mental issue)

Unless I'm mistaken, -Pmesos won't get included in 2.0.x releases
unless someone adds it to branch-2.0. Correct?

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Sep 25, 2016 at 1:35 PM, Sean Owen  wrote:
> Master is implicitly 2.1.x right now. When branch-2.1 is cut, master
> becomes the de facto 2.2.x branch. It's not true that the next release
> is 2.0.2. You can see the master version:
> https://github.com/apache/spark/blob/master/pom.xml#L29
>
> On Sun, Sep 25, 2016 at 12:30 PM, Jacek Laskowski  wrote:
>> Hi Sean,
>>
>> So, another question would be when is the change going to be released
>> then? What's the version for the master? The next release's 2.0.2 so
>> it's not for mesos profile either :(
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> 
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sun, Sep 25, 2016 at 1:27 PM, Sean Owen  wrote:
>>> It's a change to the structure of the project, and probably not
>>> appropriate for a maintenance release. 2.0.1 core would then no longer
>>> contain Mesos code while 2.0.0 did.
>>>
>>> On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski  wrote:
 Hi Sean,

 Sure, but then the question is why it's not a part of 2.0.1? I thought
 it was considered ready for prime time and so should be shipped in
 2.0.1.

 Pozdrawiam,
 Jacek Laskowski
 
 https://medium.com/@jaceklaskowski/
 Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
 Follow me at https://twitter.com/jaceklaskowski


 On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
> It was added to the master branch, and this is a release from the 2.0.x 
> branch.
>
> On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
>> Hi,
>>
>> That's even more interesting. How's so since the profile got added a
>> week ago or later and RC2 was cut two/three days ago? Anyone know?
>>




Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Jacek Laskowski
+1

Ship it!

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Sep 25, 2016 at 12:08 AM, Reynold Xin  wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and passes if a
> majority of at least 3+1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.0.1
> [ ] -1 Do not release this package because ...
>
>
> The tag to be voted on is v2.0.1-rc3
> (9d28cc10357a8afcfb2fa2e6eecb5c2cc2730d17)
>
> This release candidate resolves 290 issues:
> https://s.apache.org/spark-2.0.1-jira
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1201/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-docs/
>
>
> Q: How can I help test this release?
> A: If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running on this release candidate, then
> reporting any regressions from 2.0.0.
>
> Q: What justifies a -1 vote for this release?
> A: This is a maintenance release in the 2.0.x series.  Bugs already present
> in 2.0.0, missing features, or bugs related to new features will not
> necessarily block this release.
>
> Q: What fix version should I use for patches merging into branch-2.0 from
> now on?
> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new RC
> (i.e. RC4) is cut, I will change the fix version of those patches to 2.0.1.
>
>




Master, branches and versioning

2016-09-25 Thread Sean Owen
(Renaming thread to keep it separate from RC vote)

If you're asking why there's a version 2.0.2 in JIRA, it's because we
have to have that entity in order to target anything to version 2.0.2.
2.2.0 exists as a version label in JIRA as well. None of these are marked
as 'released' because there is no such release yet, but it makes perfect
sense to have a noun to talk about in JIRA.

There may never be a 2.0.2 software release. But anything committed to
2.0.x after this point will be released in 2.0.2 _if it does get
released_.
If 2.0.2 happens it may happen before or after 2.1.0; that's normal. I
suspect it would happen after, if ever, and I expect the next actual
software release chronologically will be 2.1.0.

I think this is all standard software procedure; what's the confusion?

There is no formal plan for which releases happen when, so I don't
think anyone can answer that definitively. It happens when it happens,
by loose consensus.

The -Pmesos change is not in branch-2.0 and therefore would not be in
any 2.0.x release.
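
For anyone who wants to try the new module before then, a rough sketch of
building it from master (the exact flags are approximate; the point is just
that -Pmesos only exists there):

  git checkout master
  ./build/mvn -Pmesos -Pyarn -Phadoop-2.7 -DskipTests clean package

On branch-2.0 the Mesos support still lives in core, so no extra profile is
needed there.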


On Sun, Sep 25, 2016 at 4:31 PM, Jacek Laskowski  wrote:
> Hi Sean,
>
> I remember a similar discussion about the releases in Spark and I must
> admit it again -- I simply don't get it. I seem to not have paid
> enough attention to details to appreciate it. I apologize for asking
> the very same questions again and again. Sorry.
>
> Re the next release, I was referring to JIRA where 2.0.2 came up quite
> recently for issues not included in 2.0.1. This disjoint between
> releases and JIRA versions causes even more frustration whenever I'm
> asked what and when the next release is going to be. It's not as
> simple as I think it should be (for me).
>
> (I really hope it's only me with this mental issue)
>
> Unless I'm mistaken, -Pmesos won't get included in 2.0.x releases
> unless someone adds it to branch-2.0. Correct?
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Sep 25, 2016 at 1:35 PM, Sean Owen  wrote:
>> Master is implicitly 2.1.x right now. When branch-2.1 is cut, master
>> becomes the de facto 2.2.x branch. It's not true that the next release
>> is 2.0.2. You can see the master version:
>> https://github.com/apache/spark/blob/master/pom.xml#L29




Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Dongjoon Hyun
+1 (non binding)

RC3 is compiled and tested on the following two systems, too. All tests
passed.

* CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
   with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
-Dsparkr
* CentOS 7.2 / Open JDK 1.8.0_102
   with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
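
(Roughly speaking, each run boils down to something like the following,
assuming the bundled Maven wrapper; treat it as a sketch rather than the
exact command line:

  ./build/mvn -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver -DskipTests clean package
  ./build/mvn -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver test

i.e. build once with tests skipped, then run the test phase with the same
profiles.)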

Cheers,
Dongjoon



On Saturday, September 24, 2016, Reynold Xin  wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and passes if
> a majority of at least 3+1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.0.1
> [ ] -1 Do not release this package because ...
>
>
> The tag to be voted on is v2.0.1-rc3 (9d28cc10357a8afcfb2fa2e6eecb5c
> 2cc2730d17)
>
> This release candidate resolves 290 issues: https://s.apache.org/spark-2.
> 0.1-jira
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1201/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-docs/
>
>
> Q: How can I help test this release?
> A: If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running on this release candidate, then
> reporting any regressions from 2.0.0.
>
> Q: What justifies a -1 vote for this release?
> A: This is a maintenance release in the 2.0.x series.  Bugs already
> present in 2.0.0, missing features, or bugs related to new features will
> not necessarily block this release.
>
> Q: What fix version should I use for patches merging into branch-2.0 from
> now on?
> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new RC
> (i.e. RC4) is cut, I will change the fix version of those patches to 2.0.1.
>
>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Yin Huai
+1

On Sun, Sep 25, 2016 at 11:40 AM, Dongjoon Hyun  wrote:

> +1 (non binding)
>
> RC3 is compiled and tested on the following two systems, too. All tests
> passed.
>
> * CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
> -Dsparkr
> * CentOS 7.2 / Open JDK 1.8.0_102
>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>
> Cheers,
> Dongjoon
>
>
>
> On Saturday, September 24, 2016, Reynold Xin  wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and passes if
>> a majority of at least 3+1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.0.1
>> [ ] -1 Do not release this package because ...
>>
>>
>> The tag to be voted on is v2.0.1-rc3 (9d28cc10357a8afcfb2fa2e6eecb5
>> c2cc2730d17)
>>
>> This release candidate resolves 290 issues:
>> https://s.apache.org/spark-2.0.1-jira
>>
>> The release files, including signatures, digests, etc. can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-bin/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1201/
>>
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-docs/
>>
>>
>> Q: How can I help test this release?
>> A: If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions from 2.0.0.
>>
>> Q: What justifies a -1 vote for this release?
>> A: This is a maintenance release in the 2.0.x series.  Bugs already
>> present in 2.0.0, missing features, or bugs related to new features will
>> not necessarily block this release.
>>
>> Q: What fix version should I use for patches merging into branch-2.0 from
>> now on?
>> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new RC
>> (i.e. RC4) is cut, I will change the fix version of those patches to 2.0.1.
>>
>>
>>


Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Josh Rosen
+1

On Sun, Sep 25, 2016 at 1:16 PM Yin Huai  wrote:

> +1
>
> On Sun, Sep 25, 2016 at 11:40 AM, Dongjoon Hyun 
> wrote:
>
>> +1 (non binding)
>>
>> RC3 is compiled and tested on the following two systems, too. All tests
>> passed.
>>
>> * CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
>>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>> -Dsparkr
>> * CentOS 7.2 / Open JDK 1.8.0_102
>>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>>
>> Cheers,
>> Dongjoon
>>
>>
>>
>> On Saturday, September 24, 2016, Reynold Xin  wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and passes if
>>> a majority of at least 3+1 PMC votes are cast.
>>>
>>> [ ] +1 Release this package as Apache Spark 2.0.1
>>> [ ] -1 Do not release this package because ...
>>>
>>>
>>> The tag to be voted on is v2.0.1-rc3
>>> (9d28cc10357a8afcfb2fa2e6eecb5c2cc2730d17)
>>>
>>> This release candidate resolves 290 issues:
>>> https://s.apache.org/spark-2.0.1-jira
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-bin/
>>>
>>> Release artifacts are signed with the following key:
>>> https://people.apache.org/keys/committer/pwendell.asc
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1201/
>>>
>>> The documentation corresponding to this release can be found at:
>>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-docs/
>>>
>>>
>>> Q: How can I help test this release?
>>> A: If you are a Spark user, you can help us test this release by taking
>>> an existing Spark workload and running on this release candidate, then
>>> reporting any regressions from 2.0.0.
>>>
>>> Q: What justifies a -1 vote for this release?
>>> A: This is a maintenance release in the 2.0.x series.  Bugs already
>>> present in 2.0.0, missing features, or bugs related to new features will
>>> not necessarily block this release.
>>>
>>> Q: What fix version should I use for patches merging into branch-2.0
>>> from now on?
>>> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new RC
>>> (i.e. RC4) is cut, I will change the fix version of those patches to 2.0.1.
>>>
>>>
>>>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Matei Zaharia
+1

Matei

> On Sep 25, 2016, at 1:25 PM, Josh Rosen  wrote:
> 
> +1
> 
> On Sun, Sep 25, 2016 at 1:16 PM Yin Huai  wrote:
> +1
> 
> On Sun, Sep 25, 2016 at 11:40 AM, Dongjoon Hyun  wrote:
> +1 (non binding)
> 
> RC3 is compiled and tested on the following two systems, too. All tests 
> passed.
> 
> * CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver -Dsparkr
> * CentOS 7.2 / Open JDK 1.8.0_102
>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
> 
> Cheers,
> Dongjoon
> 
> 
> 
> On Saturday, September 24, 2016, Reynold Xin  wrote:
> Please vote on releasing the following candidate as Apache Spark version 
> 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and passes if a 
> majority of at least 3+1 PMC votes are cast.
> 
> [ ] +1 Release this package as Apache Spark 2.0.1
> [ ] -1 Do not release this package because ...
> 
> 
> The tag to be voted on is v2.0.1-rc3 
> (9d28cc10357a8afcfb2fa2e6eecb5c2cc2730d17)
> 
> This release candidate resolves 290 issues: 
> https://s.apache.org/spark-2.0.1-jira 
> 
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-bin/ 
> 
> 
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc 
> 
> 
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1201/ 
> 
> 
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-docs/ 
> 
> 
> 
> Q: How can I help test this release?
> A: If you are a Spark user, you can help us test this release by taking an 
> existing Spark workload and running on this release candidate, then reporting 
> any regressions from 2.0.0.
> 
> Q: What justifies a -1 vote for this release?
> A: This is a maintenance release in the 2.0.x series.  Bugs already present 
> in 2.0.0, missing features, or bugs related to new features will not 
> necessarily block this release.
> 
> Q: What fix version should I use for patches merging into branch-2.0 from now 
> on?
> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new RC (i.e. 
> RC4) is cut, I will change the fix version of those patches to 2.0.1.
> 
> 
> 



Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Ricardo Almeida
+1 (non-binding)

Built and tested on
- Ubuntu 16.04 / OpenJDK 1.8.0_91
- CentOS / Oracle Java 1.7.0_55
(-Phadoop-2.7 -Dhadoop.version=2.7.3 -Phive -Phive-thriftserver -Pyarn)


On 25 September 2016 at 22:35, Matei Zaharia 
wrote:

> +1
>
> Matei
>
> On Sep 25, 2016, at 1:25 PM, Josh Rosen  wrote:
>
> +1
>
> On Sun, Sep 25, 2016 at 1:16 PM Yin Huai  wrote:
>
>> +1
>>
>> On Sun, Sep 25, 2016 at 11:40 AM, Dongjoon Hyun 
>> wrote:
>>
>>> +1 (non binding)
>>>
>>> RC3 is compiled and tested on the following two systems, too. All tests
>>> passed.
>>>
>>> * CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
>>>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>>> -Dsparkr
>>> * CentOS 7.2 / Open JDK 1.8.0_102
>>>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>>>
>>> Cheers,
>>> Dongjoon
>>>
>>>
>>>
>>> On Saturday, September 24, 2016, Reynold Xin 
>>> wrote:
>>>
 Please vote on releasing the following candidate as Apache Spark
 version 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and
 passes if a majority of at least 3+1 PMC votes are cast.

 [ ] +1 Release this package as Apache Spark 2.0.1
 [ ] -1 Do not release this package because ...


 The tag to be voted on is v2.0.1-rc3 (9d28cc10357a8afcfb2fa2e6eecb5
 c2cc2730d17)

 This release candidate resolves 290 issues:
 https://s.apache.org/spark-2.0.1-jira

 The release files, including signatures, digests, etc. can be found at:
 http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-bin/

 Release artifacts are signed with the following key:
 https://people.apache.org/keys/committer/pwendell.asc

 The staging repository for this release can be found at:
 https://repository.apache.org/content/repositories/orgapachespark-1201/

 The documentation corresponding to this release can be found at:
 http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-docs/


 Q: How can I help test this release?
 A: If you are a Spark user, you can help us test this release by taking
 an existing Spark workload and running on this release candidate, then
 reporting any regressions from 2.0.0.

 Q: What justifies a -1 vote for this release?
 A: This is a maintenance release in the 2.0.x series.  Bugs already
 present in 2.0.0, missing features, or bugs related to new features will
 not necessarily block this release.

 Q: What fix version should I use for patches merging into branch-2.0
 from now on?
 A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new RC
 (i.e. RC4) is cut, I will change the fix version of those patches to 2.0.1.



>>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Mark Hamstra
Spark's branch-2.0 is a maintenance branch, effectively meaning that only
bug-fixes will be added to it.  There are other maintenance branches (such
as branch-1.6) that are also receiving bug-fixes in theory, but not so much
in fact as maintenance branches get older.  The major and minor version
numbers of maintenance branches stay fixed, with only the patch-level
version number changing as new releases are made from a maintenance
branch.  Thus, the next release from branch-2.0 will be 2.0.1; the set of
bug-fixes going into the following branch-2.0 release will become 2.0.2; and
so on.

New work, both bug-fixes and non-bug-fixes, is contributed to the master
branch.  New releases from the master branch increment the minor version
number (unless they include API-breaking changes, in which case the major
version number changes -- e.g. Spark 1.x.y to Spark 2.0.0).  Thus the first
release from the current master branch will be 2.1.0, the next will be
2.2.0, etc.

There should be active "next JIRA numbers" for whatever will be the next
release from the master as well as each of the maintenance branches.

This is all just basic SemVer (http://semver.org/), so it surprises me some
that you are finding the concepts to be new, difficult or frustrating.
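
Schematically (a rough sketch, not a release plan):

  master (2.1.0-SNAPSHOT today)  ->  branch-2.1  ->  2.1.0, 2.1.1, ...
  branch-2.0                     ->  2.0.1, 2.0.2, ...
  branch-1.6                     ->  1.6.x maintenance releases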

On Sun, Sep 25, 2016 at 8:31 AM, Jacek Laskowski  wrote:

> Hi Sean,
>
> I remember a similar discussion about the releases in Spark and I must
> admit it again -- I simply don't get it. I seem to not have paid
> enough attention to details to appreciate it. I apologize for asking
> the very same questions again and again. Sorry.
>
> Re the next release, I was referring to JIRA where 2.0.2 came up quite
> recently for issues not included in 2.0.1. This disjoint between
> releases and JIRA versions causes even more frustration whenever I'm
> asked what and when the next release is going to be. It's not as
> simple as I think it should be (for me).
>
> (I really hope it's only me with this mental issue)
>
> Unless I'm mistaken, -Pmesos won't get included in 2.0.x releases
> unless someone adds it to branch-2.0. Correct?
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Sep 25, 2016 at 1:35 PM, Sean Owen  wrote:
> > Master is implicitly 2.1.x right now. When branch-2.1 is cut, master
> > becomes the de facto 2.2.x branch. It's not true that the next release
> > is 2.0.2. You can see the master version:
> > https://github.com/apache/spark/blob/master/pom.xml#L29
> >
> > On Sun, Sep 25, 2016 at 12:30 PM, Jacek Laskowski 
> wrote:
> >> Hi Sean,
> >>
> >> So, another question would be when is the change going to be released
> >> then? What's the version for the master? The next release's 2.0.2 so
> >> it's not for mesos profile either :(
> >>
> >> Pozdrawiam,
> >> Jacek Laskowski
> >> 
> >> https://medium.com/@jaceklaskowski/
> >> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> >> Follow me at https://twitter.com/jaceklaskowski
> >>
> >>
> >> On Sun, Sep 25, 2016 at 1:27 PM, Sean Owen  wrote:
> >>> It's a change to the structure of the project, and probably not
> >>> appropriate for a maintenance release. 2.0.1 core would then no longer
> >>> contain Mesos code while 2.0.0 did.
> >>>
> >>> On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski 
> wrote:
>  Hi Sean,
> 
>  Sure, but then the question is why it's not a part of 2.0.1? I thought
>  it was considered ready for prime time and so should be shipped in
>  2.0.1.
> 
>  Pozdrawiam,
>  Jacek Laskowski
>  
>  https://medium.com/@jaceklaskowski/
>  Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>  Follow me at https://twitter.com/jaceklaskowski
> 
> 
>  On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen 
> wrote:
> > It was added to the master branch, and this is a release from the
> 2.0.x branch.
> >
> > On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski 
> wrote:
> >> Hi,
> >>
> >> That's even more interesting. How's so since the profile got added a
> >> week ago or later and RC2 was cut two/three days ago? Anyone know?
> >>
>
>
>


ArrayType support in Spark SQL

2016-09-25 Thread Jason White
It seems that `functions.lit` doesn't support ArrayTypes. To reproduce:

org.apache.spark.sql.functions.lit(2 :: 1 :: Nil)

java.lang.RuntimeException: Unsupported literal type class
scala.collection.immutable.$colon$colon List(2, 1)
  at
org.apache.spark.sql.catalyst.expressions.Literal$.apply(literals.scala:59)
  at org.apache.spark.sql.functions$.lit(functions.scala:101)
  ... 48 elided

This is about the first thing I tried to do with ArrayTypes in Spark SQL. Is
this usage supported, or on the roadmap?






Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Herman van Hövell tot Westerflier
+1 (non-binding)

On Sun, Sep 25, 2016 at 2:05 PM, Ricardo Almeida <
ricardo.alme...@actnowib.com> wrote:

> +1 (non-binding)
>
> Built and tested on
> - Ubuntu 16.04 / OpenJDK 1.8.0_91
> - CentOS / Oracle Java 1.7.0_55
> (-Phadoop-2.7 -Dhadoop.version=2.7.3 -Phive -Phive-thriftserver -Pyarn)
>
>
> On 25 September 2016 at 22:35, Matei Zaharia 
> wrote:
>
>> +1
>>
>> Matei
>>
>> On Sep 25, 2016, at 1:25 PM, Josh Rosen  wrote:
>>
>> +1
>>
>> On Sun, Sep 25, 2016 at 1:16 PM Yin Huai  wrote:
>>
>>> +1
>>>
>>> On Sun, Sep 25, 2016 at 11:40 AM, Dongjoon Hyun 
>>> wrote:
>>>
 +1 (non binding)

 RC3 is compiled and tested on the following two systems, too. All tests
 passed.

 * CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
 -Dsparkr
 * CentOS 7.2 / Open JDK 1.8.0_102
with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver

 Cheers,
 Dongjoon



 On Saturday, September 24, 2016, Reynold Xin 
 wrote:

> Please vote on releasing the following candidate as Apache Spark
> version 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and
> passes if a majority of at least 3+1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.0.1
> [ ] -1 Do not release this package because ...
>
>
> The tag to be voted on is v2.0.1-rc3 (9d28cc10357a8afcfb2fa2e6eecb5
> c2cc2730d17)
>
> This release candidate resolves 290 issues:
> https://s.apache.org/spark-2.0.1-jira
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapache
> spark-1201/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.
> 1-rc3-docs/
>
>
> Q: How can I help test this release?
> A: If you are a Spark user, you can help us test this release by
> taking an existing Spark workload and running on this release candidate,
> then reporting any regressions from 2.0.0.
>
> Q: What justifies a -1 vote for this release?
> A: This is a maintenance release in the 2.0.x series.  Bugs already
> present in 2.0.0, missing features, or bugs related to new features will
> not necessarily block this release.
>
> Q: What fix version should I use for patches merging into branch-2.0
> from now on?
> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new
> RC (i.e. RC4) is cut, I will change the fix version of those patches to
> 2.0.1.
>
>
>
>>>
>>
>


Re: ArrayType support in Spark SQL

2016-09-25 Thread Jason White
Continuing to dig, I encountered:
https://github.com/apache/spark/blob/master/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/LiteralExpressionSuite.scala#L125

  // TODO(davies): add tests for ArrayType, MapType and StructType

I guess others have thought of this already, just not implemented yet. :)

For others reading this thread, someone suggested using a SQL UDF to return
the constant - this works as a hack for now.







Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Kousuke Saruta

+1 (non-binding)


On 2016-09-26 07:26, Herman van Hövell tot Westerflier wrote:

+1 (non-binding)

On Sun, Sep 25, 2016 at 2:05 PM, Ricardo Almeida
<ricardo.alme...@actnowib.com> wrote:


+1 (non-binding)

Built and tested on
- Ubuntu 16.04 / OpenJDK 1.8.0_91
- CentOS / Oracle Java 1.7.0_55
(-Phadoop-2.7 -Dhadoop.version=2.7.3 -Phive -Phive-thriftserver
-Pyarn)


On 25 September 2016 at 22:35, Matei Zaharia
<matei.zaha...@gmail.com> wrote:

+1

Matei


On Sep 25, 2016, at 1:25 PM, Josh Rosen
<joshro...@databricks.com> wrote:

+1

On Sun, Sep 25, 2016 at 1:16 PM Yin Huai
<yh...@databricks.com> wrote:

+1

On Sun, Sep 25, 2016 at 11:40 AM, Dongjoon Hyun
<dongj...@apache.org> wrote:

+1 (non binding)

RC3 is compiled and tested on the following two
systems, too. All tests passed.

* CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
   with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive
-Phive-thriftserver -Dsparkr
* CentOS 7.2 / Open JDK 1.8.0_102
   with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive
-Phive-thriftserver

Cheers,
Dongjoon



On Saturday, September 24, 2016, Reynold Xin
<r...@databricks.com> wrote:

Please vote on releasing the following candidate
as Apache Spark version 2.0.1. The vote is open
until Tue, Sep 27, 2016 at 15:30 PDT and passes
if a majority of at least 3+1 PMC votes are cast.

[ ] +1 Release this package as Apache Spark 2.0.1
[ ] -1 Do not release this package because ...


The tag to be voted on is v2.0.1-rc3
(9d28cc10357a8afcfb2fa2e6eecb5c2cc2730d17)

This release candidate resolves 290 issues:
https://s.apache.org/spark-2.0.1-jira


The release files, including signatures, digests,
etc. can be found at:

http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-bin/



Release artifacts are signed with the following key:
https://people.apache.org/keys/committer/pwendell.asc


The staging repository for this release can be
found at:

https://repository.apache.org/content/repositories/orgapachespark-1201/



The documentation corresponding to this release
can be found at:

http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-docs/




Q: How can I help test this release?
A: If you are a Spark user, you can help us test
this release by taking an existing Spark workload
and running on this release candidate, then
reporting any regressions from 2.0.0.

Q: What justifies a -1 vote for this release?
A: This is a maintenance release in the 2.0.x
series. Bugs already present in 2.0.0, missing
features, or bugs related to new features will
not necessarily block this release.

Q: What fix version should I use for patches
merging into branch-2.0 from now on?
A: Please mark the fix version as 2.0.2, rather
than 2.0.1. If a new RC (i.e. RC4) is cut, I will
change the fix version of those patches to 2.0.1.











Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread vaquar khan
+1 (non-binding)

Regards,
Vaquar khan

On 25 Sep 2016 20:41, "Kousuke Saruta"  wrote:

> +1 (non-binding)
>
> On 2016年09月26日 07:26, Herman van Hövell tot Westerflier wrote:
>
> +1 (non-binding)
>
> On Sun, Sep 25, 2016 at 2:05 PM, Ricardo Almeida <
> ricardo.alme...@actnowib.com> wrote:
>
>> +1 (non-binding)
>>
>> Built and tested on
>> - Ubuntu 16.04 / OpenJDK 1.8.0_91
>> - CentOS / Oracle Java 1.7.0_55
>> (-Phadoop-2.7 -Dhadoop.version=2.7.3 -Phive -Phive-thriftserver -Pyarn)
>>
>>
>> On 25 September 2016 at 22:35, Matei Zaharia 
>> wrote:
>>
>>> +1
>>>
>>> Matei
>>>
>>> On Sep 25, 2016, at 1:25 PM, Josh Rosen 
>>> wrote:
>>>
>>> +1
>>>
>>> On Sun, Sep 25, 2016 at 1:16 PM Yin Huai  wrote:
>>>
 +1

 On Sun, Sep 25, 2016 at 11:40 AM, Dongjoon Hyun 
 wrote:

> +1 (non binding)
>
> RC3 is compiled and tested on the following two systems, too. All
> tests passed.
>
> * CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
> -Dsparkr
> * CentOS 7.2 / Open JDK 1.8.0_102
>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>
> Cheers,
> Dongjoon
>
>
>
> On Saturday, September 24, 2016, Reynold Xin 
> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark
>> version 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and
>> passes if a majority of at least 3+1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.0.1
>> [ ] -1 Do not release this package because ...
>>
>>
>> The tag to be voted on is v2.0.1-rc3 (9d28cc10357a8afcfb2fa2e6eecb5
>> c2cc2730d17)
>>
>> This release candidate resolves 290 issues:
>> https://s.apache.org/spark-2.0.1-jira
>>
>> The release files, including signatures, digests, etc. can be found
>> at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.
>> 1-rc3-bin/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapache
>> spark-1201/
>>
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.
>> 1-rc3-docs/
>>
>>
>> Q: How can I help test this release?
>> A: If you are a Spark user, you can help us test this release by
>> taking an existing Spark workload and running on this release candidate,
>> then reporting any regressions from 2.0.0.
>>
>> Q: What justifies a -1 vote for this release?
>> A: This is a maintenance release in the 2.0.x series.  Bugs already
>> present in 2.0.0, missing features, or bugs related to new features will
>> not necessarily block this release.
>>
>> Q: What fix version should I use for patches merging into branch-2.0
>> from now on?
>> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new
>> RC (i.e. RC4) is cut, I will change the fix version of those patches to
>> 2.0.1.
>>
>>
>>

>>>
>>
>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Luciano Resende
+1 (non-binding)

On Sat, Sep 24, 2016 at 3:08 PM, Reynold Xin  wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and passes if
> a majority of at least 3+1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.0.1
> [ ] -1 Do not release this package because ...
>
>
> The tag to be voted on is v2.0.1-rc3 (9d28cc10357a8afcfb2fa2e6eecb5c
> 2cc2730d17)
>
> This release candidate resolves 290 issues: https://s.apache.org/spark-2.
> 0.1-jira
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1201/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc3-docs/
>
>
> Q: How can I help test this release?
> A: If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running on this release candidate, then
> reporting any regressions from 2.0.0.
>
> Q: What justifies a -1 vote for this release?
> A: This is a maintenance release in the 2.0.x series.  Bugs already
> present in 2.0.0, missing features, or bugs related to new features will
> not necessarily block this release.
>
> Q: What fix version should I use for patches merging into branch-2.0 from
> now on?
> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new RC
> (i.e. RC4) is cut, I will change the fix version of those patches to 2.0.1.
>
>
>


-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Pete Lee
+1


On Sun, Sep 25, 2016 at 3:26 PM, Herman van Hövell tot Westerflier <
hvanhov...@databricks.com> wrote:

> +1 (non-binding)
>
> On Sun, Sep 25, 2016 at 2:05 PM, Ricardo Almeida <
> ricardo.alme...@actnowib.com> wrote:
>
>> +1 (non-binding)
>>
>> Built and tested on
>> - Ubuntu 16.04 / OpenJDK 1.8.0_91
>> - CentOS / Oracle Java 1.7.0_55
>> (-Phadoop-2.7 -Dhadoop.version=2.7.3 -Phive -Phive-thriftserver -Pyarn)
>>
>>
>> On 25 September 2016 at 22:35, Matei Zaharia 
>> wrote:
>>
>>> +1
>>>
>>> Matei
>>>
>>> On Sep 25, 2016, at 1:25 PM, Josh Rosen 
>>> wrote:
>>>
>>> +1
>>>
>>> On Sun, Sep 25, 2016 at 1:16 PM Yin Huai  wrote:
>>>
 +1

 On Sun, Sep 25, 2016 at 11:40 AM, Dongjoon Hyun 
 wrote:

> +1 (non binding)
>
> RC3 is compiled and tested on the following two systems, too. All
> tests passed.
>
> * CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
> -Dsparkr
> * CentOS 7.2 / Open JDK 1.8.0_102
>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>
> Cheers,
> Dongjoon
>
>
>
> On Saturday, September 24, 2016, Reynold Xin 
> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark
>> version 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and
>> passes if a majority of at least 3+1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.0.1
>> [ ] -1 Do not release this package because ...
>>
>>
>> The tag to be voted on is v2.0.1-rc3 (9d28cc10357a8afcfb2fa2e6eecb5
>> c2cc2730d17)
>>
>> This release candidate resolves 290 issues:
>> https://s.apache.org/spark-2.0.1-jira
>>
>> The release files, including signatures, digests, etc. can be found
>> at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.
>> 1-rc3-bin/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapache
>> spark-1201/
>>
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.
>> 1-rc3-docs/
>>
>>
>> Q: How can I help test this release?
>> A: If you are a Spark user, you can help us test this release by
>> taking an existing Spark workload and running on this release candidate,
>> then reporting any regressions from 2.0.0.
>>
>> Q: What justifies a -1 vote for this release?
>> A: This is a maintenance release in the 2.0.x series.  Bugs already
>> present in 2.0.0, missing features, or bugs related to new features will
>> not necessarily block this release.
>>
>> Q: What fix version should I use for patches merging into branch-2.0
>> from now on?
>> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new
>> RC (i.e. RC4) is cut, I will change the fix version of those patches to
>> 2.0.1.
>>
>>
>>

>>>
>>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Shixiong(Ryan) Zhu
+1

On Sun, Sep 25, 2016 at 10:43 PM, Pete Lee  wrote:

> +1
>
>
> On Sun, Sep 25, 2016 at 3:26 PM, Herman van Hövell tot Westerflier <
> hvanhov...@databricks.com> wrote:
>
>> +1 (non-binding)
>>
>> On Sun, Sep 25, 2016 at 2:05 PM, Ricardo Almeida <
>> ricardo.alme...@actnowib.com> wrote:
>>
>>> +1 (non-binding)
>>>
>>> Built and tested on
>>> - Ubuntu 16.04 / OpenJDK 1.8.0_91
>>> - CentOS / Oracle Java 1.7.0_55
>>> (-Phadoop-2.7 -Dhadoop.version=2.7.3 -Phive -Phive-thriftserver -Pyarn)
>>>
>>>
>>> On 25 September 2016 at 22:35, Matei Zaharia 
>>> wrote:
>>>
 +1

 Matei

 On Sep 25, 2016, at 1:25 PM, Josh Rosen 
 wrote:

 +1

 On Sun, Sep 25, 2016 at 1:16 PM Yin Huai  wrote:

> +1
>
> On Sun, Sep 25, 2016 at 11:40 AM, Dongjoon Hyun 
> wrote:
>
>> +1 (non binding)
>>
>> RC3 is compiled and tested on the following two systems, too. All
>> tests passed.
>>
>> * CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
>>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>> -Dsparkr
>> * CentOS 7.2 / Open JDK 1.8.0_102
>>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>>
>> Cheers,
>> Dongjoon
>>
>>
>>
>> On Saturday, September 24, 2016, Reynold Xin 
>> wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark
>>> version 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT and
>>> passes if a majority of at least 3+1 PMC votes are cast.
>>>
>>> [ ] +1 Release this package as Apache Spark 2.0.1
>>> [ ] -1 Do not release this package because ...
>>>
>>>
>>> The tag to be voted on is v2.0.1-rc3 (9d28cc10357a8afcfb2fa2e6eecb5
>>> c2cc2730d17)
>>>
>>> This release candidate resolves 290 issues:
>>> https://s.apache.org/spark-2.0.1-jira
>>>
>>> The release files, including signatures, digests, etc. can be found
>>> at:
>>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.
>>> 1-rc3-bin/
>>>
>>> Release artifacts are signed with the following key:
>>> https://people.apache.org/keys/committer/pwendell.asc
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapache
>>> spark-1201/
>>>
>>> The documentation corresponding to this release can be found at:
>>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.
>>> 1-rc3-docs/
>>>
>>>
>>> Q: How can I help test this release?
>>> A: If you are a Spark user, you can help us test this release by
>>> taking an existing Spark workload and running on this release candidate,
>>> then reporting any regressions from 2.0.0.
>>>
>>> Q: What justifies a -1 vote for this release?
>>> A: This is a maintenance release in the 2.0.x series.  Bugs already
>>> present in 2.0.0, missing features, or bugs related to new features will
>>> not necessarily block this release.
>>>
>>> Q: What fix version should I use for patches merging into branch-2.0
>>> from now on?
>>> A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a new
>>> RC (i.e. RC4) is cut, I will change the fix version of those patches to
>>> 2.0.1.
>>>
>>>
>>>
>

>>>
>>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-25 Thread Jeff Zhang
+1

On Mon, Sep 26, 2016 at 2:03 PM, Shixiong(Ryan) Zhu  wrote:

> +1
>
> On Sun, Sep 25, 2016 at 10:43 PM, Pete Lee  wrote:
>
>> +1
>>
>>
>> On Sun, Sep 25, 2016 at 3:26 PM, Herman van Hövell tot Westerflier <
>> hvanhov...@databricks.com> wrote:
>>
>>> +1 (non-binding)
>>>
>>> On Sun, Sep 25, 2016 at 2:05 PM, Ricardo Almeida <
>>> ricardo.alme...@actnowib.com> wrote:
>>>
 +1 (non-binding)

 Built and tested on
 - Ubuntu 16.04 / OpenJDK 1.8.0_91
 - CentOS / Oracle Java 1.7.0_55
 (-Phadoop-2.7 -Dhadoop.version=2.7.3 -Phive -Phive-thriftserver -Pyarn)


 On 25 September 2016 at 22:35, Matei Zaharia 
 wrote:

> +1
>
> Matei
>
> On Sep 25, 2016, at 1:25 PM, Josh Rosen 
> wrote:
>
> +1
>
> On Sun, Sep 25, 2016 at 1:16 PM Yin Huai  wrote:
>
>> +1
>>
>> On Sun, Sep 25, 2016 at 11:40 AM, Dongjoon Hyun 
>> wrote:
>>
>>> +1 (non binding)
>>>
>>> RC3 is compiled and tested on the following two systems, too. All
>>> tests passed.
>>>
>>> * CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1
>>>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>>> -Dsparkr
>>> * CentOS 7.2 / Open JDK 1.8.0_102
>>>with -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver
>>>
>>> Cheers,
>>> Dongjoon
>>>
>>>
>>>
>>> On Saturday, September 24, 2016, Reynold Xin 
>>> wrote:
>>>
 Please vote on releasing the following candidate as Apache Spark
 version 2.0.1. The vote is open until Tue, Sep 27, 2016 at 15:30 PDT 
 and
 passes if a majority of at least 3+1 PMC votes are cast.

 [ ] +1 Release this package as Apache Spark 2.0.1
 [ ] -1 Do not release this package because ...


 The tag to be voted on is v2.0.1-rc3 (9d28cc10357a8afcfb2fa2e6eecb5
 c2cc2730d17)

 This release candidate resolves 290 issues:
 https://s.apache.org/spark-2.0.1-jira

 The release files, including signatures, digests, etc. can be found
 at:
 http://people.apache.org/~pwendell/spark-releases/spark-2.0.
 1-rc3-bin/

 Release artifacts are signed with the following key:
 https://people.apache.org/keys/committer/pwendell.asc

 The staging repository for this release can be found at:
 https://repository.apache.org/content/repositories/orgapache
 spark-1201/

 The documentation corresponding to this release can be found at:
 http://people.apache.org/~pwendell/spark-releases/spark-2.0.
 1-rc3-docs/


 Q: How can I help test this release?
 A: If you are a Spark user, you can help us test this release by
 taking an existing Spark workload and running on this release 
 candidate,
 then reporting any regressions from 2.0.0.

 Q: What justifies a -1 vote for this release?
 A: This is a maintenance release in the 2.0.x series.  Bugs already
 present in 2.0.0, missing features, or bugs related to new features 
 will
 not necessarily block this release.

 Q: What fix version should I use for patches merging into
 branch-2.0 from now on?
 A: Please mark the fix version as 2.0.2, rather than 2.0.1. If a
 new RC (i.e. RC4) is cut, I will change the fix version of those 
 patches to
 2.0.1.



>>
>

>>>
>>
>


-- 
Best Regards

Jeff Zhang