I am currently using a third-party library (Lucene) with Spark that is not
serializable. Because of that, it generates the following exception:
Job aborted due to stage failure: Task 144.0 in stage 25.0 (TID 2122) had a not
serializable result: org.apache.lucene.facet.FacetsConfig Serializa
Did you try this? https://stackoverflow.com/a/2114387/375670
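In case the linked answer doesn't map cleanly onto your code, here is a minimal
sketch of the usual pattern: create the non-serializable object inside
`mapPartitions` so it only ever exists on the executors, and make sure the task
returns only serializable values (the exception says the task *result*
contained a `FacetsConfig`). The session setup, the input RDD, and the
placeholder work below are made up for illustration:

import org.apache.lucene.facet.FacetsConfig
import org.apache.spark.sql.SparkSession

object FacetsPerPartition {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("lucene-facets-sketch").getOrCreate()

    // Hypothetical input: one piece of text per record.
    val records = spark.sparkContext.parallelize(Seq("spark", "lucene", "facets"))

    val lengths = records.mapPartitions { iter =>
      // Created here, on the executor, so it is never shipped from the driver
      // and never returned as part of a task result.
      val config = new FacetsConfig()
      iter.map { text =>
        // ... use `config` with the Lucene faceting/indexing APIs here ...
        text.length // return only serializable values from the task
      }
    }

    lengths.collect().foreach(println)
    spark.stop()
  }
}

If building the object per partition is too expensive, the same idea works with
a lazily initialized holder object (or a `@transient lazy val`) so each executor
JVM constructs its own instance instead of receiving one over the wire.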
On Tue, Feb 25, 2020 at 10:23 AM yeikel valdes wrote:
> I am currently using a third-party library (Lucene) with Spark that is not
> serializable. Because of that, it generates the following exception:
>
> Job aborted due to stag
Hi all,
I just noticed that we apparently don't build the documentation on Jenkins
anymore.
I remember we used to have this job:
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-docs
Does anybody know what happened to this job?
Thanks.
It's been gone for quite a long time. These docs were being built but not
published.
Relevant discussion:
http://apache-spark-developers-list.1001551.n3.nabble.com/Re-moving-the-spark-jenkins-job-builder-repo-from-dbricks-spark-tp25325p26222.html
shane
On Tue, Feb 25, 2020 at 6:18 PM Hyukjin Kwon wrote:
+1
Xiao
Michael Armbrust wrote on Monday, February 24, 2020 at 3:03 PM:
> Hello Everyone,
>
> As more users have started upgrading to Spark 3.0 preview (including
> myself), there have been many discussions around APIs that have been broken
> compared with Spark 2.x. In many of these discussions, one of the
> rationa
Hm, I believe we should still run this. PR builders do not run the doc build
(more specifically, `cd docs && jekyll build`).
Fortunately, Javadoc, Scaladoc, the SparkR documentation, and the PySpark API
documentation are tested in the PR builder.
However, for the MD files themselves under `docs` and the SQL Built-in
Functi