It's still dying. Back to this error (it used to be spark-2.2.0 before):
java.io.IOException: Cannot run program "./bin/spark-submit" (in directory
"/tmp/test-spark/spark-2.1.2"): error=2, No such file or directory
So, a mirror is missing that Spark version... I don't understand why
nobody el
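For anyone hitting the same failure: before retriggering, it is worth checking whether the distribution the suite tries to exec actually got downloaded and unpacked. A minimal sketch (plain Python; the path is copied from the error message above, and the helper name is made up):

```python
import os

def launcher_ready(dist="/tmp/test-spark/spark-2.1.2"):
    """Return True if the unpacked distribution contains an executable
    bin/spark-submit (the path the failing test tries to run)."""
    return os.access(os.path.join(dist, "bin", "spark-submit"), os.X_OK)

# False here means the download/unpack step failed (e.g. the mirror
# no longer carries that release), not that spark-submit itself is broken.
print(launcher_ready())
```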
Quick update for everybody: I was trying to deal with the release
scripts to get them to work with 2.1; there were some fixes needed,
and on top of that Maven Central changed something over the weekend
which made Java 7 unhappy.
I actually was able to create an RC1 after many tries and tweaking,
b
Sorry for the delay here; waiting on a couple of blockers and working through
setting up for the release.
Tom
On Sunday, June 10, 2018, 10:41:00 PM CDT, Xiao Li wrote:
+1
Tom, thanks for helping with this!
Xiao
2018-06-07 9:40 GMT-07:00 Marcelo Vanzin:
Took a look at our branch and most of
Spark will schedule all the jobs you have and add them to a common task queue.
The difference between FIFO and FAIR is how this queue is handled: FIFO
prefers to run jobs in submission order, while FAIR tries to divide resources
equally among all jobs.
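The difference can be sketched with a toy model (plain Python, not Spark's actual scheduler code; the job names and slot count are invented for illustration):

```python
# Toy illustration of draining a shared task queue under FIFO vs FAIR.
def assign_slots(jobs, slots, mode):
    """jobs: {name: pending_tasks} in submission order;
    returns {name: slots_granted} for one scheduling round."""
    granted = {name: 0 for name in jobs}
    if mode == "FIFO":
        # Earlier jobs grab as many slots as they need; later jobs wait.
        for name, pending in jobs.items():  # dicts keep insertion order
            take = min(pending, slots)
            granted[name] = take
            slots -= take
    elif mode == "FAIR":
        # Hand out slots one at a time, round-robin across all jobs.
        while slots > 0 and any(p > granted[n] for n, p in jobs.items()):
            for name, pending in jobs.items():
                if slots > 0 and granted[name] < pending:
                    granted[name] += 1
                    slots -= 1
    return granted

jobs = {"etl": 4, "adhoc_query": 2}
print(assign_slots(jobs, 4, "FIFO"))  # {'etl': 4, 'adhoc_query': 0}
print(assign_slots(jobs, 4, "FAIR"))  # {'etl': 2, 'adhoc_query': 2}
```

Under FIFO the first job monopolizes the executors until it finishes; under FAIR the short ad-hoc query gets slots immediately, which is why FAIR is usually chosen for shared interactive clusters.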
The problem you have is different. The driver (actually the Spark API) b
Those still appear to be env problems; I don't know why they are so
persistent. Does it all pass locally? Retrigger the tests again and see what
happens.
On Tue, Jun 19, 2018, 2:53 AM Petar Zecevic wrote:
>
> Thanks, but unfortunately, it died again. Now at pyspark tests:
Hi Spark devs,
I have a couple of pull requests for Structured Streaming which are getting
older and fading off the first pages of the PR list.
https://github.com/apache/spark/pull/21469
https://github.com/apache/spark/pull/21357
https://github.com/apache/spark/pull/21222
Two of them are in a kind