cloud-fan commented on code in PR #49351:
URL: https://github.com/apache/spark/pull/49351#discussion_r1914427606
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##########
@@ -41,19 +43,102 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
       plan: LogicalPlan,
       cteDefMap: mutable.HashMap[Long, CTERelationDef]): LogicalPlan = {
     plan.resolveOperatorsDownWithPruning(_.containsAllPatterns(CTE)) {
-      case w @ WithCTE(_, cteDefs) =>
-        cteDefs.foreach { cteDef =>
-          if (cteDef.resolved) {
-            cteDefMap.put(cteDef.id, cteDef)
-          }
+      case withCTE @ WithCTE(_, cteDefs) =>
+        val newCTEDefs = cteDefs.map {
+          case cteDef if !cteDef.recursive =>
+            val newCTEDef = cteDef
+            if (newCTEDef.resolved) {
+              cteDefMap.put(newCTEDef.id, newCTEDef)
+            }
+            newCTEDef
+          case cteDef =>

Review Comment:
   Let's add some comments to explain the general workflow for resolving recursive CTEs. My expectation is:
   1. wait for other analyzer rules to resolve the anchor plan in the `Union`
   2. replace the CTE relation definition plan with `UnionLoop`
   3. replace the CTE relation references with `UnionLoopRef` and set their output to the anchor plan's output
   4. at the end, the recursive CTE is resolved
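To make the expected workflow concrete, below is a rough sketch of how the recursive branch of the `cteDefs.map` above could follow those four steps. It is an illustration under assumptions, not the PR's actual code: the `UnionLoop(id, anchor, recursion)` and `UnionLoopRef(loopId, output, accumulated)` constructor shapes, and detecting self-references via `CTERelationRef.cteId`, are assumed here for the sake of the example.

```scala
case cteDef if cteDef.recursive =>
  cteDef.child match {
    // Step 1: only act once other analyzer rules have resolved the anchor,
    // i.e. the first branch of the Union that defines the recursive CTE.
    case Union(Seq(anchor, recursion), false, false) if anchor.resolved =>
      // Step 3: rewrite self-references inside the recursive branch to
      // UnionLoopRef, giving them the anchor plan's output.
      val rewrittenRecursion = recursion.transform {
        case ref: CTERelationRef if ref.cteId == cteDef.id =>
          UnionLoopRef(cteDef.id, anchor.output, accumulated = false)
      }
      // Step 2: replace the Union that defines the CTE with a UnionLoop.
      val newCTEDef = cteDef.copy(
        child = UnionLoop(cteDef.id, anchor, rewrittenRecursion))
      // Step 4: the recursive CTE definition is now resolved and can be
      // registered like any non-recursive one.
      cteDefMap.put(newCTEDef.id, newCTEDef)
      newCTEDef
    // Anchor not resolved yet: leave the definition untouched and let a
    // later run of the rule handle it.
    case _ => cteDef
  }
```

The ordering of steps 2 and 3 in this sketch is just a convenience; the important property is that by the time the definition lands in `cteDefMap` it no longer contains unresolved self-references.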