Although, on second thought, no: an additional constructor won't work...
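A minimal sketch of why the auxiliary-constructor idea falls short. The class name and fields below are hypothetical stand-ins, not Spark's actual case class: when a case class gains a parameter, adding back an old-arity constructor only fixes direct `new` calls, while the companion's synthesized `apply`, `copy`, and `unapply` keep the new shape, so connectors compiled against the old `apply` still fail with NoSuchMethodError.

```scala
// Hypothetical stand-in for a Spark logical-plan case class whose
// signature changed between 3.0 and 3.1 (names are illustrative only).
case class AppendDataLike(
    table: String,
    query: String,
    writeOptions: Map[String, String],
    isByName: Boolean,
    write: Option[String]) { // parameter added in the newer version

  // Backward-compatible auxiliary constructor with the old arity.
  def this(table: String,
           query: String,
           writeOptions: Map[String, String],
           isByName: Boolean) =
    this(table, query, writeOptions, isByName, None)
}

object Demo extends App {
  // Old-style `new` calls compile and run thanks to the auxiliary
  // constructor:
  val viaNew = new AppendDataLike("t", "q", Map.empty, isByName = true)
  assert(viaNew.write.isEmpty)

  // But the compiler-generated companion `apply` now takes five
  // parameters, so the old four-argument call below no longer exists
  // in bytecode; code compiled against it throws NoSuchMethodError:
  //   AppendDataLike("t", "q", Map.empty, true)   // does not link
}
```

The same applies to `copy` and `unapply`, which is why restoring binary compatibility for a changed case class generally needs more than an extra constructor.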

On Mon, Feb 8, 2021 at 7:01 PM Alex Ott <alex...@gmail.com> wrote:

> Hi all
>
> I've noticed the following SO question about Spark 3.1.1 not working with
> Delta 0.7.0:
> https://stackoverflow.com/questions/66106096/delta-lake-insert-into-sql-in-pyspark-is-failing-with-java-lang-nosuchmethoder/66106800#66106800
> - I checked with Delta 0.8.0 and it has the same problem. It breaks here:
> https://github.com/delta-io/delta/blob/v0.8.0/src/main/scala/org/apache/spark/sql/delta/DeltaAnalysis.scala#L204
>
> I think it's caused by the following change:
> https://github.com/apache/spark/commit/a082f4600b1cb814442beed1b578bc3430a257a7#diff-cf96171d13fd77e670764766ae22afafbc4a396316bd758a89b60a6fe70d5b0dL150,
> but I'm not 100% sure. If it is, then maybe we could add a
> backward-compatible constructor for this case class?
>
> P.S. This isn't the first time 3.1 has broken compatibility with existing
> connectors; for example, the Spark Cassandra Connector doesn't work on 3.1
> without changes:
> https://github.com/datastax/spark-cassandra-connector/pull/1280
>
>
> --
> With best wishes,                    Alex Ott
> http://alexott.net/
> Twitter: alexott_en (English), alexott (Russian)
>


-- 
With best wishes,                    Alex Ott
http://alexott.net/
Twitter: alexott_en (English), alexott (Russian)
