alekjarmov commented on code in PR #50591:
URL: https://github.com/apache/spark/pull/50591#discussion_r2045216463
########## connector/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/V2JDBCTest.scala: ##########

@@ -152,7 +178,11 @@ private[v2] trait V2JDBCTest extends SharedSparkSession with DockerIntegrationFu
     val t = spark.table(s"$catalogName.alt_table")
     val expectedSchema = new StructType()
       .add("C2", StringType, true, defaultMetadata())
-    assert(t.schema === expectedSchema)
+    val expectedSchemaWithoutRemoteTypeName =
+      removeMetadataFromAllFields(expectedSchema, "remoteTypeName")
+    val schemaWithoutRemoteTypeName =
+      removeMetadataFromAllFields(t.schema, "remoteTypeName")
+    assert(schemaWithoutRemoteTypeName === expectedSchemaWithoutRemoteTypeName)

Review Comment:
   I didn't update this one, since I couldn't manage to run the tests locally (and I don't want to fail another pipeline in case it doesn't work), but we can file an issue for this test cleanup.
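A minimal sketch of what a helper like removeMetadataFromAllFields could look like is shown below; the SchemaTestUtils wrapper and the exact signature are assumptions for illustration, not the PR's actual code. The idea is to rebuild each field's metadata with one key (here "remoteTypeName") removed, so the schema assertion ignores connector-specific metadata.

    import org.apache.spark.sql.types.{MetadataBuilder, StructType}

    // Hypothetical helper, not necessarily the PR's implementation: return a copy
    // of the schema in which every field has the given metadata key removed.
    object SchemaTestUtils {
      def removeMetadataFromAllFields(schema: StructType, key: String): StructType = {
        StructType(schema.fields.map { field =>
          val cleanedMetadata = new MetadataBuilder()
            .withMetadata(field.metadata) // start from the field's existing metadata
            .remove(key)                  // drop only the requested key
            .build()
          field.copy(metadata = cleanedMetadata)
        })
      }
    }

With a helper along these lines, t.schema and expectedSchema can be compared without the test being sensitive to per-connector metadata entries such as remoteTypeName.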