Please try these two corrections:
1. --packages isn't the right command-line argument for
spark-submit. Please use --conf spark.jars.packages=your-package to
specify Maven packages, or define your configuration parameters in
the spark-defaults.conf file.
2. Please check the version number.
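A rough sketch of the two suggestions above, reusing the package coordinates Anna already passed to spark-submit (the application jar and main class below are placeholders):

```shell
# Option A: pass the Maven coordinates as a configuration property.
# The coordinates are the spark-avro package from this thread.
spark-submit \
  --conf spark.jars.packages=org.apache.spark:spark-avro_2.12:3.2.0 \
  --class com.example.MyApp \
  my-app.jar

# Option B: set it once in $SPARK_HOME/conf/spark-defaults.conf:
#   spark.jars.packages  org.apache.spark:spark-avro_2.12:3.2.0
```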
Hi Steve,
You’re correct about the '--packages' option; it seems my memory does not
serve me well :)
On 2022/02/15 07:04:27 Stephen Coy wrote:
> Hi Morven,
>
> We use --packages for all of our spark jobs. Spark downloads the specified jar
> and all of its dependencies from a Maven repository.
>
Hi Morven,
We use --packages for all of our spark jobs. Spark downloads the specified jar
and all of its dependencies from a Maven repository.
This means we never have to build fat or uber jars.
It does mean that the Apache Ivy configuration has to be set up correctly
though.
Cheers,
Steve C
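A minimal sketch of the --packages approach Steve describes (jar name and main class are illustrative; the coordinates are the ones from this thread):

```shell
# --packages makes Spark resolve the artifact and all of its
# transitive dependencies through Apache Ivy from Maven Central
# (or the repositories configured via spark.jars.repositories),
# so no fat/uber jar is required.
spark-submit \
  --packages org.apache.spark:spark-avro_2.12:3.2.0 \
  --class com.example.MyApp \
  my-app.jar
```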
I wrote a toy Spark job and ran it within my IDE; I get the same error if I
don’t add spark-avro to my pom.xml. After adding the spark-avro dependency to
my pom.xml, everything works fine.
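For reference, the pom.xml dependency in question would look roughly like this (version taken from the spark-submit coordinates used in this thread; it should match your Spark version):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-avro_2.12</artifactId>
    <version>3.2.0</version>
</dependency>
```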
Another thing is, if my memory serves me right, the spark-submit options for
extra jars is ‘--jars’ , not ‘--packages’.
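For completeness, a sketch of how the two options differ (jar path and application jar are illustrative):

```shell
# --jars ships the listed jar files only; transitive
# dependencies are NOT resolved for you.
spark-submit --jars /path/to/spark-avro_2.12-3.2.0.jar my-app.jar

# --packages takes Maven coordinates and resolves the
# artifact plus its transitive dependencies via Ivy.
spark-submit --packages org.apache.spark:spark-avro_2.12:3.2.0 my-app.jar
```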
Hi Anna,
The Avro libraries should be built into SPARK, if I am not wrong. Any
particular reason why you are using a deprecated or soon-to-be-deprecated
version of SPARK?
SPARK 3.2.1 is fantastic.
Please do let us know about your set up if possible.
Regards,
Gourav Sengupta
On Thu, Feb 10, 20
Have you added the dependency in the build.sbt?
Can you 'sbt package' the source successfully?
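If the build is sbt-based, the dependency frakass asks about would look roughly like this (versions are assumptions, matching the coordinates used elsewhere in the thread):

```scala
// build.sbt -- the spark-avro version should match the Spark version in use
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"  % "3.2.0" % "provided",
  "org.apache.spark" %% "spark-avro" % "3.2.0"
)
```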
regards
frakass
On 2022/2/10 11:25, Karanika, Anna wrote:
Hello,
For context, I am invoking spark-submit and adding the arguments --packages
org.apache.spark:spark-avro_2.12:3.2.0.
I have been trying to use Spark SQL operations related to the Avro file
format (e.g., stored as, save, load) in a Java class, but they keep failing
with the following stack trace:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Failed to
find data source: avro