For now, I'll just put this as critical. We can discuss the
documentation stuff offline or in another thread.
On Fri, Mar 6, 2015 at 1:36 PM, Sean Owen wrote:
> Although the problem is small, especially if indeed the essential docs
> changes are following just a couple days behind the final release
Is there a plan to implement SSL support for the Block Transfer Service
(specifically, the NettyBlockTransferService implementation)? I can
volunteer if needed...
Jeff
Although the problem is small, especially if indeed the essential docs
changes are following just a couple days behind the final release, I
mean, why the rush if they're essential? wait a couple days, finish
them, make the release.
Answer is, I think these changes aren't actually essential given t
To add to what Patrick said, the only reason those JIRAs are marked as
Blockers (at least, I can speak for myself) is so that they appear at the top of
the JIRA list, signifying that these are more *immediate* issues than all
the Critical issues. To make it less confusing for the community voting, we
Sean,
The docs are distributed and consumed in a fundamentally different way
than Spark code itself. So we've always considered the "deadline" for
doc changes to be when the release is finally posted.
If there are small inconsistencies with the docs present in the source
code for that release tag
Hi,
I've implemented a class MyClass in MLlib that does some operation on
LabeledPoint. MyClass extends Serializable, so I can map this operation over data
of type RDD[LabeledPoint], such as data.map(lp => MyClass.operate(lp)). I write
this class to a file with ObjectOutputStream.writeObject. Then I stop
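The setup described above can be sketched in plain Scala. The names (MyClass, operate, the LabeledPoint stand-in) are illustrative, and the Spark RDD part is omitted so the snippet stays self-contained; the point is only the Java-serialization round trip the post relies on:

```scala
import java.io._

// Stand-in for MLlib's LabeledPoint, just for this sketch.
case class LabeledPoint(label: Double, features: Array[Double])

// Must be Serializable so Spark could ship an instance to executors
// inside a map closure, and so writeObject can persist it.
class MyClass extends Serializable {
  // Placeholder operation: negate the label.
  def operate(lp: LabeledPoint): LabeledPoint = lp.copy(label = -lp.label)
}

object SerDemo {
  // Round-trip an object through Java serialization, as the post
  // describes doing with ObjectOutputStream.writeObject.
  def roundTrip[T <: Serializable](obj: T): T = {
    val bytes = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(bytes)
    out.writeObject(obj)
    out.close()
    val in = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
    in.readObject().asInstanceOf[T]
  }
}
```

With an RDD in hand, the mapped call would look like `val op = new MyClass; data.map(lp => op.operate(lp))`, with Spark serializing `op` into the closure the same way.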
+1 (non-binding, of course)
1. Compiled on OS X 10.10 (Yosemite) OK. Total time: 13:55 min
mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4
-Dhadoop.version=2.6.0 -Phive -DskipTests -Dscala-2.11
2. Tested pyspark, mllib - ran them as well as compared results with 1.1.x &
1.2.x
pyspark wo
+1 (non-binding, doc issues aside)
Ran batch of tests against yarn and standalone, including tests for
rc2 blockers, all looks fine.
On Thu, Mar 5, 2015 at 6:52 PM, Patrick Wendell wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 1.3.0!
>
> The tag to be voted
Given the title and tagging, it sounds like there could be some
must-have doc changes to go with what is being released as 1.3. It can
be finished later, and published later, but then the docs source
shipped with the release doesn't match the site, and until then, 1.3
is released without some "must-have" docs.
Hey Sean,
> SPARK-5310 Update SQL programming guide for 1.3
> SPARK-5183 Document data source API
> SPARK-6128 Update Spark Streaming Guide for Spark 1.3
For these, the issue is that they are documentation JIRAs, which
don't need to be timed exactly with the release vote, since we can
update the docs after the release.
Congrats!
On Thu, Mar 5, 2015 at 1:34 PM, shane knapp wrote:
> WOOT!
>
> On Thu, Mar 5, 2015 at 1:26 PM, Reynold Xin wrote:
>
> > We reached a new milestone today.
> >
> > https://github.com/apache/spark
> >
> >
> > 10,001 commits now. Congratulations to Xiangrui for making the 10,001st
> > commit!
There are still three JIRAs marked as blockers for 1.3.0:
SPARK-5310 Update SQL programming guide for 1.3
SPARK-5183 Document data source API
SPARK-6128 Update Spark Streaming Guide for Spark 1.3
As a matter of hygiene, let's either mark them resolved if they're
resolved, or push them / deprioritize them.
Hi all,
I never heard from anyone on this and have received emails in private
that people would like to add terasort to their spark-perf installs so
it becomes part of their cluster validation checks.
Yours,
Ewan
Forwarded Message
Subject: Spark-perf terasort WIP
This has some disadvantage for Java, I think. You can't switch on an
object defined like this, but you can with an enum. And although the
Scala compiler understands that the set of values is fixed because of
'sealed', and so can warn about missing cases, the JVM won't know this
and can't do the same.
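The trade-off described above can be sketched with a small sealed hierarchy (the names here are made up for illustration). The compiler will warn if a case is omitted from the match, but on the JVM these are ordinary classes, so Java code can't `switch` over them the way it can over an enum:

```scala
// A sealed hierarchy: all subtypes must live in this file, so the
// Scala compiler knows the full set of values and can check matches
// for exhaustiveness. Java sees only ordinary classes.
sealed trait Color
case object Red extends Color
case object Green extends Color
case object Blue extends Color

object ColorDemo {
  // Pattern match instead of a switch; dropping one of these cases
  // would produce a "match may not be exhaustive" compiler warning.
  def describe(c: Color): String = c match {
    case Red   => "warm"
    case Green => "natural"
    case Blue  => "cool"
  }
}
```

By contrast, a Java `enum Color { RED, GREEN, BLUE }` would let `switch` dispatch on the constants directly, which is the disadvantage being pointed out.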
I'll kick it off with a +1.
On Thu, Mar 5, 2015 at 6:52 PM, Patrick Wendell wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 1.3.0!
>
> The tag to be voted on is v1.3.0-rc2 (commit 4aaf48d4):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=4aaf