Re: [Spark][Core] Resource Allocation

2022-07-15 Thread Sungwoo Park
For 1), this is a recurring question on this mailing list, and the answer is: no, Spark does not coordinate resources between multiple Spark applications. Spark relies on an external resource manager, such as YARN or Kubernetes, to allocate resources across applications. For example…
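To make that concrete: arbitration between applications happens in the resource manager's configuration, not in Spark itself. A minimal sketch of submitting two independent applications to the same YARN queue (the queue name "analytics", the class names, and the executor caps are assumptions for illustration):

# First application; the queue and dynamic-allocation settings tell
# YARN, not Spark, how to share the cluster.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --queue analytics \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.maxExecutors=20 \
  --class com.example.JobA jobA.jar

# A second, unrelated application submitted to the same queue; YARN
# decides how resources are split between the two.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --queue analytics \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.maxExecutors=20 \
  --class com.example.JobB jobB.jar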

[Building] Building with JDK11

2022-07-15 Thread Szymon Kuryło
Hello, I'm trying to build a Java 11 Spark distro using the dev/make-distribution.sh script. I have set JAVA_HOME to point to the JDK 11 location, and I've also set the `java.version` property in pom.xml to 11, which effectively also sets `maven.compiler.source` and `maven.compiler.target`. When inspecting the classes…
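For reference, a sketch of the build invocation being described (the JDK install path is an assumption, and Maven profile names vary by Spark version):

# Point the build at JDK 11 explicitly rather than relying on PATH.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk   # adjust to your install
export PATH="$JAVA_HOME/bin:$PATH"

# dev/make-distribution.sh wraps the Maven build; -Djava.version=11
# overrides the pom property without editing pom.xml by hand.
./dev/make-distribution.sh --name java11 --tgz -Pyarn -Djava.version=11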

Re: [Building] Building with JDK11

2022-07-15 Thread Sean Owen
Java 8 binaries are probably on your PATH. On Fri, Jul 15, 2022, 5:01 PM Szymon Kuryło wrote: > …
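A quick way to check this hypothesis, i.e. which Java the shell and the build actually resolve (output will differ per machine):

# Which java binary wins on PATH, and what version is it?
which java
java -version

# Spark's Maven wrapper reports the JDK it will compile with; if this
# prints 1.8 while JAVA_HOME points at 11, PATH is shadowing JAVA_HOME.
./build/mvn -version
echo "$JAVA_HOME"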

Re: [Building] Building with JDK11

2022-07-15 Thread Tufan Rakshit
Maybe try IntelliJ or some other IDE with SBT; Maven has always been magical for me. Best, Tufan. On Sat, 16 Jul 2022 at 00:11, Sean Owen wrote: > Java 8 binaries are probably on your PATH > …
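If you do try the SBT route suggested here, Spark ships its own launcher under build/; a minimal sketch (the JDK path is again an assumption, and the "core" module name is used as an illustrative smoke test):

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk   # adjust to your install

# Spark's bundled SBT launcher honors JAVA_HOME; compiling one module
# first is a quicker check than packaging the whole distribution.
./build/sbt core/compile
./build/sbt package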