Hi,
I am new to Spark development. When I tried to run the unit tests locally
on macOS 10.15.4, everything went smoothly except for a single test case,
the SPARK-6330 regression test. After a few hours of struggling with it, I
moved to Linux and it passed magically. My OS is Ubuntu 18.04.
Digging into
Jenkins is up and building, but not reachable via HTTPS at the moment. I'm
working on getting this sorted ASAP.
shane
--
Shane Knapp
Computer Guy / Voice of Reason
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu
Hi, All.
Since Apache Spark 3.0.0, Apache Hive 2.3.7 has been the default
Hive execution library. The forked Hive 1.2.1 library is not
recommended because it is no longer properly maintained.
In Apache Spark 3.1, due in December 2020, we are going to
remove it from our official distribution.
https://github.co
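For anyone whose deployment still talks to a Hive 1.2.x metastore, here is a
minimal sketch of staying connected after the switch, using the standard
spark.sql.hive.metastore.* configs; the app name and the exact version string
are illustrative, not prescriptive:

  import org.apache.spark.sql.SparkSession

  // Spark 3.0+ executes with Hive 2.3.7 built in, but it can still talk
  // to an older metastore if told which client version to load.
  val spark = SparkSession.builder()
    .appName("hive-metastore-compat")  // illustrative name
    .config("spark.sql.hive.metastore.version", "1.2.1")
    // "maven" downloads the matching Hive client jars at startup;
    // a classpath of pre-fetched jars also works here.
    .config("spark.sql.hive.metastore.jars", "maven")
    .enableHiveSupport()
    .getOrCreate()

  spark.sql("SHOW DATABASES").show()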
Sure, it's a good idea, but I'm not sure Spark can enforce it? Even a
documented suggestion probably isn't going to be noticed.
FooBar can put code under org.apache.spark.foobar, ideally, I guess.
On Wed, Sep 23, 2020 at 8:01 AM Steve Loughran wrote:
The issue is that sometimes people explicitly want to put stuff into the
Spark package tree just to get at things which Spark scoped to
org.apache.spark (i.e., private[spark]). Unless/until the relevant
APIs/classes are rescoped to be public, putting your classes under the
package hierarchy lets your own code get at them. It j
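Concretely, the trick looks something like the sketch below. The
org.apache.spark.util.Utils object is a real private[spark] internal; the
shim's name and the specific method call are made up for illustration:

  package org.apache.spark.foobar

  import org.apache.spark.util.Utils  // private[spark] object inside Spark

  // Hypothetical shim: this compiles only because the file sits under the
  // org.apache.spark package tree, which satisfies private[spark] scoping.
  object FooBarShim {
    // From any package outside org.apache.spark, Utils is not visible
    // and this call would fail to compile.
    def scratchDir(): java.io.File = Utils.createTempDir()
  }

Of course, nothing under private[spark] is a stable API, so a shim like this
can break on any Spark upgrade.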