Hi all,
Apache Spark 2.0.0 is the first release in the 2.x line. It includes
2500+ patches from 300+ contributors.
To download Spark 2.0, head over to the download page:
http://spark.apache.org/downloads.html
To view the release notes:
http://spark.apache.org/releases/spark-release-2-0-0.html
Hi Burak,
Yes, you're right.
Thanks.
> El 27 jul 2016, a las 0:19, Burak Yavuz escribió:
>
> Hi,
>
> It's bad practice to change jars for the same version and is prohibited in
> Spark Packages. Please bump your version number and make a new release.
>
> Best regards,
> Burak
>
>> On Tue, J
Hi,
It's bad practice to change jars for the same version and is prohibited in
Spark Packages. Please bump your version number and make a new release.
Best regards,
Burak
On Tue, Jul 26, 2016 at 3:51 AM, Julio Antonio Soto de Vicente <
ju...@esbet.es> wrote:
> Hi all,
>
> Maybe I am missing som
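Burak's advice above (bump the version rather than replace the jar) would typically look like a one-line change in an sbt-spark-package build. A sketch only; the package coordinates and version numbers below are illustrative assumptions, not taken from this thread:

```scala
// build.sbt (sketch; assumes the sbt-spark-package plugin is enabled)
// All names and versions below are illustrative.
spName := "myorg/my_package"   // hypothetical spark-packages coordinates
version := "1.1.3"             // bump from the broken 1.1.2 instead of re-uploading it
sparkVersion := "2.0.0"
// then publish the new release rather than overwriting 1.1.2, e.g. with: sbt spPublish
```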
Yeah, I thought the vote was closed... but I couldn't think of a better
thread to remark upon!
That's a useful comment on Derby's role - thanks. For what it's worth, we'd
just run a build-and-test pass with Derby updated to the current
10.12.1.1 and hadn't observed any issues... a PR
The release vote has already closed and passed. Derby is only used in
tests AFAIK, so I don't think this is even critical let alone a
blocker. Updating is fine though, open a PR.
On Tue, Jul 26, 2016 at 3:37 PM, Stephen Hellberg wrote:
> -1 Sorry, I've just noted that the RC5 proposal includes
-1 Sorry, I've just noted that the RC5 proposal includes shipping Derby @
10.11.1.1, which is vulnerable to CVE-2015-1832.
It would be ideal if we could instead ship 10.12.1.1 real soon.
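For context, a dependency bump like the one proposed is usually a one-line version change in the build. A sketch against a generic Maven build (the property name is an assumption, not necessarily Spark's actual pom):

```xml
<!-- sketch: pin Derby to the patched release; the property name is illustrative -->
<properties>
  <derby.version>10.12.1.1</derby.version>
</properties>

<dependency>
  <groupId>org.apache.derby</groupId>
  <artifactId>derby</artifactId>
  <version>${derby.version}</version>
  <scope>test</scope>  <!-- per the note above, Derby is only used in tests -->
</dependency>
```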
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-A
-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
The reason for the lack of response is that this feature is not available yet.
You can vote for and follow this JIRA:
https://issues.apache.org/jira/browse/SPARK-13721, if you really need this
feature.
Yong
From: Don Drake
Sent: Monday, July 25, 2016 9:12 PM
To: dev@sp
Hi,
Since spark.driver.appUIAddress is only used in Spark on YARN to
"announce" the web UI's address, I think the setting should rather be
called spark.yarn.driver.appUIAddress (for consistency with the other
YARN-specific settings).
What do you think? I'd like to hear your thoughts before fillin
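The consistency argument here is that the other YARN-specific settings already carry the spark.yarn. prefix. A sketch of that existing convention (the application jar name is a placeholder):

```shell
# Existing YARN-specific settings all use the spark.yarn. prefix, e.g.:
spark-submit --master yarn \
  --conf spark.yarn.am.memory=1g \
  --conf spark.yarn.queue=default \
  myapp.jar
```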
Hi all,
Maybe I am missing something, but... Is there a way to update a package
uploaded to spark-packages.org under the same version?
Given a release called my_package 1.1.2, I would like to re-upload it due to
a build failure, but I want to do it also as version 1.1.2...
Thank you.