During the migration from Hive to Spark, we ran into a problem with the SQL used to create views in Hive: SQL that legally creates a view in Hive raises an error when it is executed in Spark SQL.
The SQL is as follows:

CREATE VIEW myView AS
SELECT CASE WHEN age > 12 THEN CAST(gender * 0.3 - 0.1 AS double) END AS TT,
       gender,
       age
FROM myTable;

The error message is as follows:

Cannot up cast TT from decimal(13, 1) to double. The type path of the target object is:
You can either add an explicit cast to the input data or choose a higher precision type of the field in the target object

How should we solve this problem?
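One direction we have considered, following the error message's own suggestion of adding an explicit cast to the input data, is to cast the integer operand to DOUBLE before the arithmetic so the intermediate result is never typed as decimal(13, 1). This is only a sketch (it assumes gender is an integer column and we have not yet verified it against our Spark version):

-- Assumed rewrite: force the arithmetic into double before the CASE result is typed
CREATE VIEW myView AS
SELECT CASE WHEN age > 12 THEN CAST(gender AS DOUBLE) * 0.3 - 0.1 END AS TT,
       gender,
       age
FROM myTable;

Is this the right approach, or is there a configuration or type-coercion setting that would let the original Hive view definition work in Spark SQL unchanged?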