[ https://issues.apache.org/jira/browse/KUDU-3198?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Alexey Serbin updated KUDU-3198:
--------------------------------
    Fix Version/s: 1.14.0

> Unable to delete a full row from a table with 64 columns when using java client
> --------------------------------------------------------------------------------
>
>                 Key: KUDU-3198
>                 URL: https://issues.apache.org/jira/browse/KUDU-3198
>             Project: Kudu
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.10.0, 1.10.1, 1.11.0, 1.12.0, 1.11.1, 1.13.0
>            Reporter: YifanZhang
>            Priority: Major
>             Fix For: 1.14.0
>
>
> We recently got an error when deleting full rows from a table with 64 columns using Spark SQL; however, if we delete a column from the table, the error does not appear. The error is:
> {code:java}
> Failed to write at least 1000 rows to Kudu; Sample errors: Not implemented: Unknown row operation type (error 0){code}
> I tested this by deleting a full row from a table with 64 columns using Java client 1.12.0/1.13.0. If the row has some columns set to NULL, I got the error:
> {code:java}
> Row error for primary key=[-128, 0, 0, 1], tablet=null, server=d584b3407ea444519e91b32f2744b162, status=Invalid argument: DELETE should not have a value for column: c63 STRING NULLABLE (error 0)
> {code}
> If the row has values set for all columns, I got an error like:
> {code:java}
> Row error for primary key=[-128, 0, 0, 1], tablet=null, server=null, status=Corruption: Not enough data for column: c63 STRING NULLABLE (error 0)
> {code}
> I also tested this with tables with different numbers of columns. The weird thing is that I could delete full rows from a table with 8/16/32/63/65 columns, but could not do so if the table has 64/128 columns.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
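
For anyone trying to reproduce this against the Java client, a minimal sketch might look like the following. The master address, table name, and exact schema (c0 as an INT32 primary key and c1..c63 as nullable STRING columns) are assumptions for illustration, not details confirmed by the report; the point is that the DELETE carries a value for every column ("full row") rather than just the key, which appears to be what the Spark SQL path generates.

{code:java}
import org.apache.kudu.ColumnSchema;
import org.apache.kudu.Schema;
import org.apache.kudu.client.Delete;
import org.apache.kudu.client.KuduClient;
import org.apache.kudu.client.KuduSession;
import org.apache.kudu.client.KuduTable;
import org.apache.kudu.client.OperationResponse;
import org.apache.kudu.client.PartialRow;

public class FullRowDeleteRepro {
  public static void main(String[] args) throws Exception {
    // Hypothetical master address and table name; the table is assumed to have
    // 64 columns named c0..c63 (c0 INT32 primary key, the rest STRING).
    KuduClient client = new KuduClient.KuduClientBuilder("kudu-master:7051").build();
    try {
      KuduTable table = client.openTable("test_64_cols");
      Schema schema = table.getSchema();
      KuduSession session = client.newSession();

      // Build a DELETE that carries a value for every column, not just the
      // primary key -- the "full row" case described in the report.
      Delete delete = table.newDelete();
      PartialRow row = delete.getRow();
      for (ColumnSchema col : schema.getColumns()) {
        switch (col.getType()) {
          case INT32:
            row.addInt(col.getName(), 1);
            break;
          case STRING:
            row.addString(col.getName(), "x");
            break;
          default:
            throw new IllegalStateException("unhandled type: " + col.getType());
        }
      }

      // Default flush mode is AUTO_FLUSH_SYNC, so apply() returns the response.
      OperationResponse resp = session.apply(delete);
      if (resp.hasRowError()) {
        // On affected versions with a 64-column table this is where the
        // "Not enough data for column: c63 ..." row error shows up.
        System.err.println(resp.getRowError());
      }
      session.close();
    } finally {
      client.close();
    }
  }
}
{code}

On the affected versions (1.10.0 through 1.13.0) the response's row error is expected to resemble the "Corruption: Not enough data for column: c63 STRING NULLABLE" message quoted above when all 64 columns carry values; with the fix targeted for 1.14.0 the same full-row DELETE should succeed.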