[...] and CSP takes care of the rest. FYI, I am running Java 11 and Spark 3.4.1 on the host submitting spark-submit. The Docker file is also built on Java 11, Spark 3.4.1 and PySpark.
The tag explains it:
spark-py:3.4.1-scala_2.12-11-jre-slim-buster-java11PlusPackages
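For reference, an image tagged along those lines can be produced with the docker-image-tool.sh script that ships in the Spark distribution; the repo name and tag below are placeholders, and -p selects the PySpark Dockerfile:

    # run from the root of an unpacked Spark distribution
    # (repo name and tag are placeholders)
    ./bin/docker-image-tool.sh \
        -r docker.io/myrepo \
        -t 3.4.1-scala_2.12-11-jre-slim-buster \
        -p ./kubernetes/dockerfiles/spark/bindings/python/Dockerfile \
        build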
The problem I notice is on the cluster for Spark on K8s, on Spark32 running on a Hadoop version < 3.2 (since the default Java value in the Docker file for Spark32 is Java 11).
Please let me know if it makes sense to you.
Regards
Pralabh Kumar
On Tue, Jun 14, 2022 at 4:21 PM Steve Loughran wrote:
> hadoop 3.2.x is the oldest of the [...]
Hi Steve / Dev team
Thx for the help. Have a quick question: how can we fix the above error in Hadoop 3.1?
- The Spark Docker file has Java 11:
https://github.com/apache/spark/blob/branch-3.2/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile
- Now if we build Spark32, the Spark image will have Java 11. If we run on a Hadoop version less than [...]
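If the cluster's Hadoop can only run on Java 8, one option is to rebuild the image on a Java 8 base. A sketch, assuming the branch-3.2 Dockerfile linked above (where java_image_tag defaults to 11-jre-slim); the repo name and tag are placeholders:

    # run from the root of a Spark 3.2.x distribution
    ./bin/docker-image-tool.sh \
        -r docker.io/myrepo \
        -t 3.2.0-java8 \
        -b java_image_tag=8-jre-slim \
        build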
Steve, thx for your help; please ignore the last comment.
Regards
Pralabh Kumar
On Mon, 13 Jun 2022, 15:43 Pralabh Kumar wrote:
Hi Steve
Thx for the help. We are on Hadoop 3.2; however, we are building Hadoop 3.2 with Java 8.
Do you suggest building Hadoop with Java 11?
Regards
Pralabh Kumar
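Whichever way you go, it helps to confirm which JVM the image actually ships; a quick check, with the image name as a placeholder:

    # print the Java version baked into the image
    docker run --rm docker.io/myrepo/spark-py:3.2.0 java -version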
On Mon, 13 Jun 2022, 15:25 Steve Loughran wrote: [...]
On Mon, 13 Jun 2022 at 08:52, Pralabh Kumar wrote:
Hi Dev team
I have a spark32 image with Java 11 (running Spark on K8s). While reading a huge parquet file via spark.read.parquet(""), I am getting the following error. The same error is mentioned in the Spark docs,
https://spark.apache.org/docs/latest/#downloading, but w.r.t. Apache Arrow [...]
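If it is the sun.misc.Unsafe / DirectByteBuffer error that the linked section describes for Java 11 with Apache Arrow, the documented workaround is to pass -Dio.netty.tryReflectionSetAccessible=true to both driver and executors. A minimal sketch; the API server URL, image and application file are placeholders:

    spark-submit \
        --master k8s://https://<k8s-apiserver>:6443 \
        --deploy-mode cluster \
        --conf spark.kubernetes.container.image=<your spark-py image> \
        --conf spark.driver.extraJavaOptions=-Dio.netty.tryReflectionSetAccessible=true \
        --conf spark.executor.extraJavaOptions=-Dio.netty.tryReflectionSetAccessible=true \
        local:///opt/spark/work-dir/my_app.py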
On [...] at 13:50, Mich Talebzadeh wrote:
I have loaded docker files into my docker repository on docker hub and it
is public.
These are built on Spark 3.1.2 OR 3.1.1, with Scala 2.12 and with Java 11
OR Java 8 on OS jre-slim-buster. The ones built on 3.1.1 with Java 8
should work with GCP
No additional packages are added to PySpark
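To try one of them, pull by the tag convention shown earlier; the repository name here is a placeholder:

    docker pull docker.io/<dockerhub-user>/spark-py:3.1.1-scala_2.12-8-jre-slim-buster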
Java 11 support will be released as part of Spark 3.0.
Preview:
https://spark.apache.org/docs/3.0.0-preview2/#downloading
Refer:
https://issues.apache.org/jira/browse/SPARK-24417
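To confirm which JVM a given Spark build is running under, spark-submit prints it alongside the Spark and Scala versions:

    spark-submit --version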
Hi all,
Is there any plan for upgrading to Java 11?
Kind regards
Ehsan.
On Tue, Nov 6, 2018 at 9:16 AM Felix Cheung wrote:
+1 for Spark 3, definitely
Thanks for the updates

From: Sean Owen
Sent: Tuesday, November 6, 2018 9:11 AM
To: Felix Cheung
Cc: dev
Subject: Re: Java 11 support

I think that Java 9 support basically gets Java 10, 11 support. But
the jump from 8 to 9 is unfortunately more breaking than usual because [...]
important for Spark 3. Here's the ticket I know of:
https://issues.apache.org/jira/browse/SPARK-24417 . DB is already
working on some of it, I see.
On Tue, Nov 6, 2018 at 10:59 AM Felix Cheung wrote:
Speaking of, can we work to support Java 11?
That will fix all the problems below.
From: Felix Cheung
Sent: Tuesday, November 6, 2018 8:57 AM
To: Wenchen Fan
Cc: Matei Zaharia; Sean Owen; Spark dev list; Shivaram Venkataraman
Subject: Re: [CRAN-pretest-archived] [...]