Test Failures on macOS 10.15.4

2020-09-23 Thread EveLiao
Hi, I am new to Spark development. When I tried to run the unit tests locally on macOS 10.15.4, everything went smoothly except a single test case - the SPARK-6330 regression test. After a few hours of struggling with it, I moved to Linux, where it magically passed. My OS is Ubuntu 18.04. Digging into

[build system] downtime due to SSL cert errors

2020-09-23 Thread shane knapp ☠
jenkins is up and building, but not reachable via https at the moment. i'm working on getting this sorted ASAP. shane -- Shane Knapp Computer Guy / Voice of Reason UC Berkeley EECS Research / RISELab Staff Technical Lead https://rise.cs.berkeley.edu

[FYI] Removing `spark-3.1.0-bin-hadoop2.7-hive1.2.tgz` from Apache Spark 3.1 distribution

2020-09-23 Thread Dongjoon Hyun
Hi, All. Since Apache Spark 3.0.0, Apache Hive 2.3.7 has been the default Hive execution library. The forked Hive 1.2.1 library is not recommended because it is not properly maintained. In Apache Spark 3.1, in December 2020, we are going to remove it from our official distribution. https://github.co
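
For context, removing the forked Hive 1.2 execution jars from the distribution is separate from metastore connectivity: Spark can still talk to an older Hive metastore through its metastore-client configs. A minimal sketch, assuming a Hive 1.2.1 metastore and a hypothetical path to its client jars:

    import org.apache.spark.sql.SparkSession

    // Sketch: use the built-in (Hive 2.3) execution library while pointing the
    // metastore client at an existing Hive 1.2.1 deployment.
    // The jar path below is hypothetical.
    val spark = SparkSession.builder()
      .appName("hive-metastore-sketch")
      .enableHiveSupport()
      .config("spark.sql.hive.metastore.version", "1.2.1")
      .config("spark.sql.hive.metastore.jars", "/path/to/hive-1.2.1/lib/*")
      .getOrCreate()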

Re: A common naming policy for third-party packages/modules under org.apache.spark?

2020-09-23 Thread Sean Owen
Sure, it is a good idea, but I am not sure Spark can enforce it? Even a documented suggestion probably isn't going to be noticed. FooBar can put code under org.apache.spark.foobar, ideally, I guess. On Wed, Sep 23, 2020 at 8:01 AM Steve Loughran wrote: > > the issue is that sometimes people explicitly

Re: A common naming policy for third-party packages/modules under org.apache.spark?

2020-09-23 Thread Steve Loughran
The issue is that sometimes people explicitly want to put stuff into the Spark package tree just to get at things which Spark scoped to org.apache.spark. Unless/until the relevant APIs/classes are rescoped to be public, putting your classes under the package hierarchy lets your own code at them. It j
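
To illustrate the mechanism being described (all names below are hypothetical): Scala's scoped modifier private[spark] makes a member visible to everything under the org.apache.spark package, so third-party code placed in that hierarchy can reach internals that are hidden from ordinary user code. A minimal sketch:

    // Sketch of the Spark side: visible only within the org.apache.spark tree.
    package org.apache.spark {
      private[spark] object InternalUtil {
        def internalHelper(): String = "not part of the public API"
      }
    }

    // Third-party code placed under the same hierarchy can call the internal
    // member; code outside org.apache.spark cannot.
    package org.apache.spark.foobar {
      object FooBarExtension {
        def use(): String = org.apache.spark.InternalUtil.internalHelper()
      }
    }

This nesting rule is why the naming-policy question arises at all: anything under org.apache.spark, first-party or not, gets the same access to private[spark] members.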