KnightChess created HUDI-3781:
---------------------------------

             Summary: spark delete sql can't delete record
                 Key: HUDI-3781
                 URL: https://issues.apache.org/jira/browse/HUDI-3781
             Project: Apache Hudi
          Issue Type: Bug
          Components: spark, spark-sql
            Reporter: KnightChess
I created a table with *hoodie.datasource.write.operation* set to upsert. When I delete through _*SQL*_, the delete *operation key* is overwritten by *hoodie.datasource.write.operation* coming from the table properties or the environment:

{code:java}
withSparkConf(sparkSession, hoodieCatalogTable.catalogProperties) {
  Map(
    "path" -> path,
    RECORDKEY_FIELD.key -> hoodieCatalogTable.primaryKeys.mkString(","),
    TBL_NAME.key -> tableConfig.getTableName,
    HIVE_STYLE_PARTITIONING.key -> tableConfig.getHiveStylePartitioningEnable,
    URL_ENCODE_PARTITIONING.key -> tableConfig.getUrlEncodePartitioning,
    KEYGENERATOR_CLASS_NAME.key -> classOf[SqlKeyGenerator].getCanonicalName,
    SqlKeyGenerator.ORIGIN_KEYGEN_CLASS_NAME -> tableConfig.getKeyGeneratorClassName,
    // the delete command sets OPERATION to DELETE here, but the value can be
    // overridden by the catalog/table properties passed to withSparkConf
    OPERATION.key -> DataSourceWriteOptions.DELETE_OPERATION_OPT_VAL,
    PARTITIONPATH_FIELD.key -> tableConfig.getPartitionFieldProp,
    HiveSyncConfig.HIVE_SYNC_MODE.key -> HiveSyncMode.HMS.name(),
    HiveSyncConfig.HIVE_SUPPORT_TIMESTAMP_TYPE.key -> "true",
    HoodieWriteConfig.DELETE_PARALLELISM_VALUE.key -> "200",
    SqlKeyGenerator.PARTITION_SCHEMA -> partitionSchema.toDDL
  )
}
{code}
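For illustration, the override can be reduced to a plain map merge. This is a minimal sketch under the assumption that the table/env properties are merged over the per-command options; `commandOptions` and `tableProperties` are hypothetical names, not Hudi internals:

{code:java}
// Per-command options ask for a delete, while the table/env config says upsert.
val commandOptions  = Map("hoodie.datasource.write.operation" -> "delete")
val tableProperties = Map("hoodie.datasource.write.operation" -> "upsert")

// Scala's ++ keeps the right-hand value on a key collision, so merging the
// table/env properties last silently drops the explicit delete:
val merged = commandOptions ++ tableProperties
assert(merged("hoodie.datasource.write.operation") == "upsert")
{code}

Merging in the opposite order, or re-applying the OPERATION key after the merge, would let the explicit delete win.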