ueshin commented on code in PR #49386:
URL: https://github.com/apache/spark/pull/49386#discussion_r1905993989


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/unresolved.scala:
##########
@@ -1071,3 +1072,29 @@ trait UnresolvedPlanId extends LeafExpression with Unevaluable {
   // Subclasses can override this function to provide more TreePatterns.
   def nodePatternsInternal(): Seq[TreePattern] = Seq()
 }
+
+case class UnresolvedWithColumns(
+     colNames: Seq[String],
+     exprs: Seq[Expression],
+     metadata: Option[Seq[Metadata]],
+     child: LogicalPlan)
+  extends UnresolvedUnaryNode {
+
+  final override val nodePatterns: Seq[TreePattern] = Seq(UNRESOLVED_WITH_COLUMNS)
+
+  override protected def withNewChildInternal(
+      newChild: LogicalPlan): UnresolvedWithColumns = copy(child = newChild)
+}
+
+case class UnresolvedWithColumnsRenamed(

Review Comment:
   There seem to be some semantic differences even with `withColumnsRenamed`:
   
   - If any of the specified column names are missing from the `df`, `df.withColumnsRenamed` silently ignores them, whereas `UnresolvedStarExceptOrReplace` throws an exception.
   - `df.withColumnsRenamed` respects the argument order, e.g., 
       ```
       test("SPARK-46260: withColumnsRenamed should respect the Map ordering") {
         val df = spark.range(10).toDF()
         assert(df.withColumnsRenamed(ListMap("id" -> "a", "a" -> "b")).columns === Array("b"))
         assert(df.withColumnsRenamed(ListMap("a" -> "b", "id" -> "a")).columns === Array("a"))
       }
       ```
       whereas `UnresolvedStarExceptOrReplace` throws an exception.
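   
   To make the ordering point concrete, here is a minimal, Spark-free sketch of the sequential-rename semantics described above (the `renameInOrder` helper is hypothetical, not a Spark API): renames are applied one at a time in map order, and an entry whose source column is absent is a no-op.
   
   ```scala
   import scala.collection.immutable.ListMap
   
   // Simplified model of `df.withColumnsRenamed`'s column handling:
   // fold over the rename map in order, renaming matching columns and
   // silently skipping entries whose source column does not exist.
   def renameInOrder(columns: Seq[String], renames: ListMap[String, String]): Seq[String] =
     renames.foldLeft(columns) { case (cols, (from, to)) =>
       cols.map(c => if (c == from) to else c) // missing `from` => no-op
     }
   
   // Mirrors the SPARK-46260 test: entry order changes the result.
   renameInOrder(Seq("id"), ListMap("id" -> "a", "a" -> "b")) // Seq("b")
   renameInOrder(Seq("id"), ListMap("a" -> "b", "id" -> "a")) // Seq("a")
   ```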
   
   I guess we should keep the new plans, to be safe?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

