@Alfie Davidson: Awesome, it worked with "org.elasticsearch.spark.sql".
But as soon as I switched to elasticsearch-spark-20_2.12, "es" also worked.
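For reference, the only change in the write call quoted further down the thread is the format string; a sketch in Scala (assumes the `df` DataFrame and `elasticOptions` map from that code, a live SparkSession, and the connector jar on the classpath):

```scala
// The short alias "es" only resolves when the connector registers it;
// the fully-qualified DataSource name resolves whenever the jar is present.
df.write
  .format("org.elasticsearch.spark.sql")
  .mode("overwrite")
  .options(elasticOptions)
  .save("index_name")
```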
On Fri, Sep 8, 2023 at 12:45 PM Dipayan Dev wrote:
Let me try that and get back. Just wondering, is there a change in the way
we pass the format to the connector from Spark 2 to 3?
On Fri, 8 Sep 2023 at 12:35 PM, Alfie Davidson wrote:
> I am pretty certain you need to change the write.format from “es” to
> “org.elasticsearch.spark.sql”
I mean, have you checked if this is in your jar? Are you building an
assembly? Where do you expect elastic classes to be and are they there?
Need some basic debugging here
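A quick way to do that check (a sketch, not from the thread; the jar path below is a hypothetical placeholder): a jar is just a zip archive, so you can look the connector's DataSource class up directly:

```scala
// Returns true if the named entry exists inside the jar.
// Uses only the JDK's zip support, so it works without Spark on the classpath.
def jarContains(jarPath: String, entry: String): Boolean = {
  val jar = new java.util.zip.ZipFile(jarPath)
  try jar.getEntry(entry) != null
  finally jar.close()
}

// Hypothetical usage against an assembly jar:
// jarContains("target/myapp-assembly.jar",
//             "org/elasticsearch/spark/sql/DefaultSource.class")
```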
On Thu, Sep 7, 2023, 8:49 PM Dipayan Dev wrote:
Hi Sean,
Removed the provided thing, but still the same issue.
<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch-spark-30_${scala.compat.version}</artifactId>
    <version>7.12.1</version>
</dependency>
On Fri, Sep 8, 2023 at 4:41 AM Sean Owen wrote:
By marking it provided, you are not including this dependency with your
app. If it is also not somehow already provided by your Spark cluster (that
is what "provided" means), then yeah, it is not anywhere on the classpath at
runtime. Remove the provided scope.
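Concretely, that means dropping the scope element from the dependency quoted in this thread, so the connector gets bundled with the application (versions as quoted in the thread; a sketch):

```xml
<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch-spark-30_2.12</artifactId>
    <version>7.12.0</version>
    <!-- no <scope>provided</scope>: default compile scope bundles the jar -->
</dependency>
```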
On Thu, Sep 7, 2023, 4:09 PM Dipayan Dev wrote:
Hi,
Can you please elaborate your last response? I don’t have any external
dependencies added, and just updated the Spark version as mentioned below.
Can someone help me with this?
On Fri, 1 Sep 2023 at 5:58 PM, Koert Kuipers wrote:
++ Dev
On Thu, 7 Sep 2023 at 10:22 PM, Dipayan Dev wrote:
could the provided scope be the issue?
On Sun, Aug 27, 2023 at 2:58 PM Dipayan Dev wrote:
Using the following dependency for Spark 3 in the POM file (my Scala version
is 2.12.14):

<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch-spark-30_2.12</artifactId>
    <version>7.12.0</version>
    <scope>provided</scope>
</dependency>

The code throws an error at this line:
df.write.format("es").mode("overwrite").options(elasticOptions).save("index_name")
The same code i
What’s the version of the ES connector you are using?
On Sat, Aug 26, 2023 at 10:17 AM Dipayan Dev wrote:
Hi All,
We're using Spark 2.4.x to write a dataframe into the Elasticsearch index.
As we're upgrading to Spark 3.3.0, it throws an error:
Caused by: java.lang.ClassNotFoundException: es.DefaultSource
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:476)
at java.base/java.lang.Cla