Copilot commented on code in PR #4330:
URL: https://github.com/apache/flink-cdc/pull/4330#discussion_r2981281667
##########
flink-cdc-connect/flink-cdc-source-connectors/flink-connector-mysql-cdc/src/main/java/org/apache/flink/cdc/connectors/mysql/source/utils/OnlineSchemaChangeUtils.java:
##########
@@ -106,10 +108,10 @@ public static boolean isOnLineSchemaChangeEvent(SourceRecord record) {
return false;
}
Struct value = (Struct) record.value();
- ObjectMapper mapper = new ObjectMapper();
try {
String ddl =
- mapper.readTree(value.getString(HISTORY_RECORD_FIELD))
+ OBJECT_MAPPER
+ .readTree(value.getString(HISTORY_RECORD_FIELD))
.get(HistoryRecord.Fields.DDL_STATEMENTS)
.asText()
.toLowerCase();
Review Comment:
`toLowerCase()` uses the default JVM locale; for parsing SQL keywords
deterministically, prefer `toLowerCase(Locale.ROOT)` (and add the corresponding
import) to avoid locale-specific case mapping surprises.
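A minimal sketch of the pitfall the comment describes (the DDL string and class name below are illustrative, not taken from the PR):

```java
import java.util.Locale;

public class LocaleLowerCaseDemo {
    public static void main(String[] args) {
        String ddl = "ALTER TABLE t ADD INDEX idx (c)";
        // Under a Turkish locale, toLowerCase() maps 'I' to dotless U+0131,
        // so "INDEX" becomes "\u0131ndex" and keyword matching silently fails.
        String turkish = ddl.toLowerCase(new Locale("tr", "TR"));
        // Locale.ROOT guarantees the plain ASCII mapping on every JVM.
        String stable = ddl.toLowerCase(Locale.ROOT);
        System.out.println(turkish.contains("index")); // false under tr-TR
        System.out.println(stable.contains("index"));  // true
    }
}
```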
##########
flink-cdc-connect/flink-cdc-source-connectors/flink-connector-mysql-cdc/src/main/java/org/apache/flink/cdc/connectors/mysql/source/offset/BinlogOffsetSerializer.java:
##########
@@ -30,15 +30,15 @@ public class BinlogOffsetSerializer {
public static final BinlogOffsetSerializer INSTANCE = new BinlogOffsetSerializer();
+ private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
+
public byte[] serialize(BinlogOffset binlogOffset) throws IOException {
// use JSON serialization
- ObjectMapper objectMapper = new ObjectMapper();
- return objectMapper.writeValueAsBytes(binlogOffset.getOffset());
+ return OBJECT_MAPPER.writeValueAsBytes(binlogOffset.getOffset());
}
public BinlogOffset deserialize(byte[] bytes) throws IOException {
- ObjectMapper objectMapper = new ObjectMapper();
- Map<String, String> offset = objectMapper.readValue(bytes, Map.class);
+ Map<String, String> offset = OBJECT_MAPPER.readValue(bytes, Map.class);
return new BinlogOffset(offset);
Review Comment:
`readValue(bytes, Map.class)` returns a raw `Map` and relies on an unchecked
assignment to `Map<String, String>`. Consider using a
`TypeReference<Map<String, String>>` (or deserializing to `Map<String, Object>`
and normalizing) to make the expected value types explicit and avoid hidden
type issues.
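A sketch of the `TypeReference` variant the comment suggests (the class name and sample offset JSON are illustrative; if the stored offset map can contain non-string values, deserializing to `Map<String, Object>` and normalizing, as the comment also notes, would be the safer route):

```java
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.nio.charset.StandardCharsets;
import java.util.Map;

public class OffsetDeserializeDemo {
    private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();

    // The anonymous TypeReference subclass preserves the full generic type,
    // so Jackson returns Map<String, String> with no unchecked assignment.
    static Map<String, String> deserializeOffset(byte[] bytes) throws Exception {
        return OBJECT_MAPPER.readValue(
                bytes, new TypeReference<Map<String, String>>() {});
    }

    public static void main(String[] args) throws Exception {
        byte[] json = "{\"file\":\"mysql-bin.000003\",\"pos\":\"154\"}"
                .getBytes(StandardCharsets.UTF_8);
        Map<String, String> offset = deserializeOffset(json);
        System.out.println(offset.get("file")); // mysql-bin.000003
    }
}
```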
##########
flink-cdc-connect/flink-cdc-source-connectors/flink-connector-mysql-cdc/src/main/java/org/apache/flink/cdc/connectors/mysql/source/utils/OnlineSchemaChangeUtils.java:
##########
@@ -136,11 +138,11 @@ public static Optional<String> parseOnLineSchemaRenameEvent(SourceRecord record)
return Optional.empty();
}
Struct value = (Struct) record.value();
- ObjectMapper mapper = new ObjectMapper();
try {
String ddl =
- mapper.readTree(value.getString(HISTORY_RECORD_FIELD))
+ OBJECT_MAPPER
+ .readTree(value.getString(HISTORY_RECORD_FIELD))
.get(HistoryRecord.Fields.DDL_STATEMENTS)
.asText()
.toLowerCase();
Review Comment:
`toLowerCase()` here is locale-dependent; use `toLowerCase(Locale.ROOT)` to
keep OSC DDL parsing stable regardless of the process locale.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]