sunxiaoguang commented on code in PR #49453:
URL: https://github.com/apache/spark/pull/49453#discussion_r1919505376


##########
sql/core/src/main/scala/org/apache/spark/sql/jdbc/MySQLDialect.scala:
##########
@@ -112,6 +112,21 @@ private case class MySQLDialect() extends JdbcDialect with SQLConfHelper with No
       } else {
         super.visitAggregateFunction(funcName, isDistinct, inputs)
       }
+
+    override def visitCast(expr: String, exprDataType: DataType, dataType: DataType): String = {
+      val databaseTypeDefinition = dataType match {
+        // MySQL uses CHAR in the cast function for the type LONGTEXT
+        case StringType => "CHAR"
+        // MySQL uses SIGNED INTEGER in the cast function for SMALLINT, INTEGER and BIGINT.
+        // To avoid breaking code relying on ResultSet metadata, we support BIGINT only at
+        // this time.
+        case LongType => "SIGNED INTEGER"
+        // MySQL uses BINARY in the cast function for the type BLOB
+        case BinaryType => "BINARY"
+        case _ => getJDBCType(dataType).map(_.databaseTypeDefinition).getOrElse(dataType.typeName)

Review Comment:
   Yes, they are valid statements. The problem comes from different set of 
target type identifiers. The original type is less important, for example we 
can do this for string type as well: 
   ```sql
   SELECT cast('123' AS SIGNED INTEGER)
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

