I successfully submitted and ran org.apache.spark.examples.SparkPi on YARN
using 4.0.0-preview1. However, I got it to work only after fixing an issue
with the YARN nodemanagers (Hadoop v3.3.6 and v3.4.0). Namely, the issue
was:
1. If the nodemanagers used Java 11, YARN threw an error about not finding
I'm a big +1 on this proposal. We should keep improving the quality of the
programming guides, and this change would make that process easier.
> Move the programming guide to the spark-website repo, to allow faster
iterations and releases
This is a great idea. It should work for structured
Thanks for sharing! Yes, Spark 4.0 is built using Java 17.
On Tue, Jun 18, 2024 at 5:07 AM George Magiros wrote:
> I successfully submitted and ran org.apache.spark.examples.SparkPi on Yarn
> using 4.0.0-preview1. However I got it to work only after fixing an issue
> with the Yarn nodemanagers (
You don’t need to upgrade Java for HDFS and YARN. Just keep using Java 8 for
Hadoop and set JAVA_HOME to Java 17 for Spark applications[1].
0. Install Java 17 on all nodes, for example, under /opt/openjdk-17
1. Modify $SPARK_CONF_DIR/spark-env.sh
export JAVA_HOME=/opt/openjdk-17
2. Modify $SPAR
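Putting the recipe together, here is a minimal sketch of the overrides. The
spark-env.sh line is step 1 above verbatim; the two spark-defaults.conf
property names (spark.yarn.appMasterEnv.JAVA_HOME and
spark.executorEnv.JAVA_HOME) are the standard Spark-on-YARN settings for
pointing the application master and executors at a different JDK, and
/opt/openjdk-17 is just the example install location from step 0:

```shell
# $SPARK_CONF_DIR/spark-env.sh -- JDK used when launching Spark (step 1)
export JAVA_HOME=/opt/openjdk-17

# $SPARK_CONF_DIR/spark-defaults.conf -- JDK for the YARN application
# master and the executors, set per application
spark.yarn.appMasterEnv.JAVA_HOME  /opt/openjdk-17
spark.executorEnv.JAVA_HOME        /opt/openjdk-17
```

With something like this in place, HDFS and the YARN nodemanagers can stay
on Java 8 while each Spark 4.x application runs under Java 17.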