Jonathan Vexler created HUDI-5452:
-------------------------------------

             Summary: Spark SQL long datatype conversion to bigint in Hive 
causes issues with ALTER TABLE
                 Key: HUDI-5452
                 URL: https://issues.apache.org/jira/browse/HUDI-5452
             Project: Apache Hudi
          Issue Type: Bug
          Components: spark-sql
            Reporter: Jonathan Vexler
         Attachments: AlterTableIssue.txt

The commands run to reproduce this error are attached: [^AlterTableIssue.txt]. When trying to alter a table that has a {{long}} column in its schema, the ALTER TABLE statement fails.

Calling describe on the table gives:
{code}
spark-sql> describe test_table;
_hoodie_commit_time     string                                      
_hoodie_commit_seqno    string                                      
_hoodie_record_key      string                                      
_hoodie_partition_path  string                                      
_hoodie_file_name       string                                      
id                      int                                         
name                    string                                      
price                   double                                      
ts                      bigint                                      
dt                      string
{code}
We suspect the problem is the {{long}} type being converted to {{bigint}} in Hive. When we created an otherwise identical table where {{ts}} had type {{int}} instead of {{long}}, the ALTER TABLE succeeded.
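A minimal sketch of the suspected reproduction, assuming a setup like the attached commands (the added column name {{extra_col}} is illustrative, not taken from the attachment; in Spark SQL, {{long}} is an alias for {{bigint}}, which is what the Hive sync records):
{code:sql}
-- Table with a long column; Hive sync stores ts as bigint
CREATE TABLE test_table (
  id int,
  name string,
  price double,
  ts long,
  dt string
) USING hudi
PARTITIONED BY (dt);

-- This fails when ts is declared as long,
-- but succeeds if ts is declared as int instead
ALTER TABLE test_table ADD COLUMNS (extra_col string);
{code}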



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
