wuchong commented on a change in pull request #11766: URL: https://github.com/apache/flink/pull/11766#discussion_r412677524
########## File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/source/row/converter/PostgresRowConverter.java ##########

@@ -28,4 +34,39 @@
 	public PostgresRowConverter(RowType rowType) {
 		super(rowType);
 	}
+
+	@Override
+	public JDBCFieldConverter createConverter(LogicalType type) {
+		LogicalTypeRoot root = type.getTypeRoot();
+
+		if (root == LogicalTypeRoot.ARRAY) {
+			ArrayType arrayType = (ArrayType) type;
+			LogicalTypeRoot elemType = arrayType.getElementType().getTypeRoot();
+
+			if (elemType == LogicalTypeRoot.VARBINARY) {
+
+				return v -> {
+					PgArray pgArray = (PgArray) v;
+					Object[] in = (Object[]) pgArray.getArray();
+
+					Object[] out = new Object[in.length];
+					for (int i = 0; i < in.length; i++) {
+						out[i] = ((PGobject) in[i]).getValue().getBytes();
+					}
+
+					return out;
+				};
+			} else {
+				return v -> ((PgArray) v).getArray();

Review comment:

   I mean the current default implementation of `AbstractJDBCRowConverter` uses `v -> v` for the array conversion, which puts a `java.sql.Array` in the Row.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
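The distinction the reviewer is drawing can be sketched with plain Java. This is a minimal, self-contained illustration, not the actual Flink or PostgreSQL driver code: `FieldConverter` stands in for Flink's `JDBCFieldConverter`, and `FakeDriverArray` stands in for a driver wrapper such as `org.postgresql.jdbc.PgArray`. It shows why an identity default (`v -> v`) leaks the driver wrapper into the Row, while a Postgres-style override materializes a plain `Object[]`.

```java
import java.util.Arrays;

public class ConverterSketch {

	// Stand-in for Flink's JDBCFieldConverter: maps a JDBC value to a Row field.
	@FunctionalInterface
	interface FieldConverter {
		Object convert(Object jdbcValue) throws Exception;
	}

	// Stand-in for a driver-specific array wrapper (e.g. PgArray). Hypothetical.
	static class FakeDriverArray {
		private final Object[] elements;
		FakeDriverArray(Object[] elements) { this.elements = elements; }
		Object[] getArray() { return elements; }
	}

	// The base-class default described in the comment: identity, v -> v.
	static FieldConverter defaultConverter() {
		return v -> v;
	}

	// Postgres-style override: unwrap the driver array into an Object[].
	static FieldConverter postgresArrayConverter() {
		return v -> ((FakeDriverArray) v).getArray();
	}

	public static void main(String[] args) throws Exception {
		FakeDriverArray driverValue = new FakeDriverArray(new Object[]{1, 2, 3});

		// The identity default puts the wrapper itself into the Row...
		Object viaDefault = defaultConverter().convert(driverValue);
		System.out.println(viaDefault instanceof FakeDriverArray); // true

		// ...while the override yields the element array directly.
		Object viaOverride = postgresArrayConverter().convert(driverValue);
		System.out.println(Arrays.toString((Object[]) viaOverride)); // [1, 2, 3]
	}
}
```

Downstream operators expect Flink-internal array values, not a live `java.sql.Array` handle tied to the connection, which is why the override matters.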