> 1. Using your favorite build tool, declare a dependency on your required
> packages.
> 2. Write your Dockerfile, with or without the Spark binaries inside it.
> 3. Use your build tool to copy the dependencies to a location that the
> Docker daemon can access.
> 4. Copy the dependencies into the correct directory.
> 5. Ensure those files have the correct permissions.
>
> In my opinion, it is pretty easy to do this with Gradle
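As a rough sketch of the steps above (the task name, output directory, and
dependency coordinate below are placeholders, not from this thread), a
build.gradle.kts along these lines could gather the extra jars into a
directory the Docker build context can see:

    // build.gradle.kts (Kotlin DSL) -- hypothetical sketch
    plugins {
        `java-library`
    }

    repositories {
        mavenCentral()
    }

    // Step 1: declare the packages you want baked into the image
    // (placeholder coordinate; replace with what you actually need).
    dependencies {
        runtimeOnly("com.google.cloud.bigdataoss:gcs-connector:hadoop3-2.2.21")
    }

    // Step 3: copy the resolved jars (including transitive dependencies)
    // into a directory inside the Docker build context.
    val collectSparkJars by tasks.registering(Copy::class) {
        from(configurations.runtimeClasspath)
        into(layout.buildDirectory.dir("docker-context/jars"))
    }

The Dockerfile then covers steps 2, 4 and 5: COPY the collected jars into the
directory the image expects (for example /opt/spark/jars, or a directory you
reference via spark.jars) and chown/chmod them for the Spark user.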
Hi all,
I am creating a base Spark image that we are using internally.
We need to add some packages to the base image:
spark:3.5.1-scala2.12-java17-python3-r-ubuntu
Of course I do not want to start Spark with --packages "..." - as it is not
efficient at all - I would like to add the needed jars to the image.
"
> name: spark-gcs-creds
> namespace: so350
> resourceVersion: "180991552"
> uid: ac30c575-9abf-4a77-ba90-15576607c97f
> type: Opaque
>
> any feedback on this
> tia!
>
>
> On Sun, Oct 6, 2024 at 1:16 AM Nimrod Ofek wrote:
rg.slf4j_slf4j-api-1.7.30.jar,file:///opt/spark/other-jars/org.mongodb_mongodb-driver-sync-4.0.5.jar,file:///opt/spark/other-jars/org.mongodb_bson-4.0.5.jar,file:///opt/spark/other-jars/org.mongodb_mongodb-driver-core-4.0.5.jar"
> "spark.submit.pyFiles":
> "file:///opt
Where is the checkpoint location? Not in GCS?
Probably the checkpoint location is there, and you don't have permissions
for it...
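For reference, a minimal Kotlin sketch (the source and bucket path are
placeholders, not taken from this job) of where a Structured Streaming query
sets its checkpoint location:

    import org.apache.spark.sql.SparkSession

    fun main() {
        val spark = SparkSession.builder()
            .appName("checkpoint-location-example")
            .getOrCreate()

        // Placeholder streaming source, just to keep the sketch self-contained.
        val stream = spark.readStream().format("rate").load()

        // If checkpointLocation points at a GCS bucket, the credentials
        // mounted from the secret must allow writing there; otherwise the
        // query fails with a permission error.
        val query = stream.writeStream()
            .format("console")
            .option("checkpointLocation", "gs://some-bucket/checkpoints/this-job")
            .start()

        query.awaitTermination()
    }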
On Thu, Oct 3, 2024, 02:43 karan alang wrote:
> This seems to be the cause of this ->
> github.com/kubeflow/spark-operator/issues/1619 .. the secret