sunxiaoguang commented on code in PR #49453:
URL: https://github.com/apache/spark/pull/49453#discussion_r1916265920


##########
sql/core/src/main/scala/org/apache/spark/sql/jdbc/MySQLDialect.scala:
##########
@@ -112,6 +112,21 @@ private case class MySQLDialect() extends JdbcDialect with SQLConfHelper with No
       } else {
         super.visitAggregateFunction(funcName, isDistinct, inputs)
       }
+
+    override def visitCast(expr: String, exprDataType: DataType, dataType: DataType): String = {
+      val databaseTypeDefinition = dataType match {
+        // MySQL uses CHAR in the cast function for the type LONGTEXT
+        case StringType => "CHAR"
+        // MySQL uses SIGNED INTEGER in the cast function for SMALLINT, INTEGER and BIGINT.
+        // To avoid breaking code relying on ResultSet metadata, we support BIGINT only at
+        // this time.
+        case LongType => "SIGNED INTEGER"
+        // MySQL uses BINARY in the cast function for the type BLOB
+        case BinaryType => "BINARY"
+        case _ => getJDBCType(dataType).map(_.databaseTypeDefinition).getOrElse(dataType.typeName)
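As an illustration of the type mapping in the diff, here is a minimal self-contained sketch. The `MySqlCastSketch` object, the `mysqlCastType` helper, and the string-based type names are invented for demonstration only; the actual `MySQLDialect.visitCast` works on Spark `DataType` instances and falls back to `getJDBCType` in the default case.

```scala
// Self-contained sketch of the MySQL CAST type mapping described in the diff.
// The real implementation pattern-matches on Spark's DataType objects; plain
// strings are used here so the example runs without Spark on the classpath.
object MySqlCastSketch {
  // Map a Catalyst type name to the type keyword MySQL accepts inside CAST(...).
  def mysqlCastType(catalystType: String): String = catalystType match {
    case "StringType" => "CHAR"           // MySQL casts strings via CHAR (LONGTEXT)
    case "LongType"   => "SIGNED INTEGER" // MySQL casts integral types via SIGNED INTEGER
    case "BinaryType" => "BINARY"         // MySQL casts binary data via BINARY (BLOB)
    case other        => other            // fallback: pass the type name through
  }

  // Build the CAST expression the dialect would push down to MySQL.
  def visitCast(expr: String, catalystType: String): String =
    s"CAST($expr AS ${mysqlCastType(catalystType)})"

  def main(args: Array[String]): Unit = {
    println(visitCast("col1", "StringType")) // CAST(col1 AS CHAR)
    println(visitCast("col2", "LongType"))   // CAST(col2 AS SIGNED INTEGER)
  }
}
```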

Review Comment:
   Sure, I saw that other dialects use both the predefined functions in QueryExecutionErrors and classifyException with structured metadata. I'm trying to use QueryExecutionErrors.notSupportTypeError here, though I'm not sure it's appropriate. If not, please let me know and I'll change it to classifyException instead. PTAL
    



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

