+1 (non-binding)
Tested the without-hadoop binaries (so Hive-related tests were not run)
with a test batch covering standalone / client and yarn / client and
cluster modes, including core, mllib, and streaming (flume and kafka).
On Wed, Dec 16, 2015 at 1:32 PM, Michael Armbrust
wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 1.6.0!
Hi all,
I have two questions about selecting columns of Dataset.
First, could you tell me if there is any way to select TypedColumn
columns other than the combination of `expr` and `as`?
Second, how can we alias such a `expr("name").as[String]` Column?
I tried to select a column of Data
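For context, a minimal sketch of the `expr(...).as[T]` combination the question refers to, including one way to alias such a column. This is an assumption-laden illustration, not an answer from the thread: it uses the modern `SparkSession` entry point (the 1.6-era API used `SQLContext`), a hypothetical `Person` case class, and `TypedColumn.name(...)` for aliasing, which later Spark versions provide.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.expr

// Hypothetical case class for illustration; not from the original thread.
case class Person(name: String, age: Int)

object TypedColumnExample {
  // Selects a typed column via the expr(...).as[T] combination the
  // question mentions, and aliases it with .name(...).
  def selectNames(): Array[String] = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("typed-column-sketch")
      .getOrCreate()
    import spark.implicits._

    val ds = Seq(Person("alice", 30), Person("bob", 25)).toDS()

    // expr(...) builds an untyped Column; .as[T] turns it into a
    // TypedColumn; .name(...) gives it an alias.
    val nameCol = expr("name").as[String].name("personName")

    // Selecting a single TypedColumn yields a Dataset[String].
    val names = ds.select(nameCol).collect()
    spark.stop()
    names
  }

  def main(args: Array[String]): Unit =
    println(selectNames().mkString(","))
}
```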
+1 (non-binding)
Ran a number of tests covering DataFrames, Datasets, and ML.
On Wed, Dec 16, 2015 at 1:32 PM Michael Armbrust
wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 1.6.0!
>
> The vote is open until Saturday, December 19, 2015 at 18:00 UTC an
Thanks Sean for sending me the logs offline.
Turns out the tests are failing again, for reasons unrelated to Spark. I
have filed https://issues.apache.org/jira/browse/SPARK-12426 for that with
some details. In the meantime, I agree with Sean that these tests should be
disabled. And, again, I don't th
Yes, that's what I mean. If they're not quite working, let's disable
them, but first, we have to rule out that I'm just missing some
requirement.
Functionally, it's not worth blocking the release. It seems like bad
form to release with tests that always fail for a non-trivial number
of users, b
Sean,
Are you referring to the docker integration tests? If so, they were disabled
for the majority of the release cycle; I recently worked on them (SPARK-11796),
and once that got committed, the tests were re-enabled in the Spark builds. I am
not sure what OSs the test builds use, but they should be passing there too.
For me, mostly the same as before: tests are mostly passing, but I can
never get the docker tests to pass. If anyone knows a special profile
or package that needs to be enabled, I can try that and/or
fix/document it. Just wondering if it's me.
I'm on Java 7 + Ubuntu 15.10, with -Pyarn -Phive -Phiv
+1. Ran some regression tests on Spark on Yarn (hadoop 2.6 and 2.7).
Tom
On Wednesday, December 16, 2015 3:32 PM, Michael Armbrust
wrote:
Please vote on releasing the following candidate as Apache Spark version 1.6.0!
The vote is open until Saturday, December 19, 2015 at 18:00 UTC and
+1 (non-binding)
It passes our tests after we registered 6 new classes with Kryo:
kryo.register(classOf[org.apache.spark.sql.catalyst.expressions.UnsafeRow])
kryo.register(classOf[Array[org.apache.spark.mllib.tree.model.Split]])
kryo.register(Class.forName("org.apache.spark.mllib.tree.m
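A hedged sketch of how registrations like the ones quoted above are typically wired into Spark. The registrator class name and the `registrationRequired` setting are assumptions for illustration (that setting is what usually surfaces unregistered classes as errors), and the truncated `Class.forName` entry from the original message is deliberately left out.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoRegistrator
import com.esotericsoftware.kryo.Kryo

// Hypothetical registrator collecting the registrations quoted in the thread.
class MyKryoRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[org.apache.spark.sql.catalyst.expressions.UnsafeRow])
    kryo.register(classOf[Array[org.apache.spark.mllib.tree.model.Split]])
    // Non-public classes can be registered by name, e.g.:
    // kryo.register(Class.forName("..."))
  }
}

// Wiring the registrator into a SparkConf.
object KryoConfExample {
  def conf(): SparkConf =
    new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrator", classOf[MyKryoRegistrator].getName)
      // Fail fast on any class that is serialized without being registered.
      .set("spark.kryo.registrationRequired", "true")
}
```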
You can pretty much measure it from the event timeline in the driver
UI: click on jobs/tasks and get the time that each of them took from
there.
Thanks
Best Regards
On Thu, Dec 17, 2015 at 7:27 AM, sara mustafa
wrote:
> Hi,
>
> The class org.apache.spark.sql.execution.basicO
If port 7077 is open to the public on your cluster, that's all you need to
take over the cluster. You can read a bit about it here:
https://www.sigmoid.com/securing-apache-spark-cluster/
You can also look at this small exploit I wrote
https://www.exploit-db.com/exploits/36562/
Thanks
Best Regards