cloud-fan commented on code in PR #50101:
URL: https://github.com/apache/spark/pull/50101#discussion_r1974701969


##########
sql/core/src/main/scala/org/apache/spark/sql/jdbc/PostgresDialect.scala:
##########
@@ -303,12 +303,27 @@ private case class PostgresDialect()
 
   class PostgresSQLBuilder extends JDBCSQLBuilder {
     override def visitExtract(field: String, source: String): String = {
-      field match {
-        case "DAY_OF_YEAR" => s"EXTRACT(DOY FROM $source)"
-        case "YEAR_OF_WEEK" => s"EXTRACT(YEAR FROM $source)"
-        case "DAY_OF_WEEK" => s"EXTRACT(DOW FROM $source)"
-        case _ => super.visitExtract(field, source)
+      // SECOND, MINUTE, HOUR, QUARTER, YEAR, DAY are identical on postgres and spark.
+      // MONTH is different: postgres returns 0-11, spark returns 1-12.

Review Comment:
   You are right, sorry, I misread the doc.
   
   Now, looking into it again, Spark DS v2 pushdown does not yet support extracting fields from interval types (expressions like `ExtractANSIIntervalMonths` are not included in `V2ExpressionBuilder`). But even if we support it later, its semantics are the same as pgsql's.
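   To make the discussion concrete, here is a minimal, self-contained sketch of the kind of field-name translation `visitExtract` performs, following the mapping visible in the removed lines of the diff above. `postgresExtract` is a hypothetical helper for illustration only, not the actual `PostgresDialect` code:
   
   ```scala
   // Sketch: translate Spark's extract-field names to Postgres EXTRACT fields.
   // Mapping mirrors the diff above; pass-through covers fields that are
   // identical on both systems (SECOND, MINUTE, HOUR, QUARTER, YEAR, DAY, ...).
   object ExtractMapping {
     def postgresExtract(field: String, source: String): String = {
       val pgField = field match {
         case "DAY_OF_YEAR"  => "DOY"  // Postgres calls day-of-year DOY
         case "YEAR_OF_WEEK" => "YEAR" // approximation used by the old code
         case "DAY_OF_WEEK"  => "DOW"  // Postgres calls day-of-week DOW
         case other          => other  // identical names pass through unchanged
       }
       s"EXTRACT($pgField FROM $source)"
     }
   
     def main(args: Array[String]): Unit = {
       assert(postgresExtract("DAY_OF_YEAR", "col") == "EXTRACT(DOY FROM col)")
       assert(postgresExtract("HOUR", "ts") == "EXTRACT(HOUR FROM ts)")
     }
   }
   ```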



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

