aokolnychyi commented on code in PR #50561:
URL: https://github.com/apache/spark/pull/50561#discussion_r2040392099


##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableChange.java:
##########
@@ -297,6 +298,24 @@ public int hashCode() {
     }
   }
 
+  /**
+   * Create a TableChange for adding a new Table Constraint
+   */
+  static TableChange addConstraint(Constraint constraint, Boolean validate) {
+    return new AddConstraint(constraint, validate);
+  }
+
+  /**
+   * Create a TableChange for dropping a Table Constraint
+   */
+  static TableChange dropConstraint(String name, Boolean ifExists, Boolean cascade) {

Review Comment:
   Same question about `Boolean`.



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableChange.java:
##########
@@ -297,6 +298,24 @@ public int hashCode() {
     }
   }
 
+  /**
+   * Create a TableChange for adding a new Table Constraint
+   */
+  static TableChange addConstraint(Constraint constraint, Boolean validate) {

Review Comment:
   Why `Boolean` and not `boolean`? I think `AddConstraint` below actually accepts a primitive, potentially leading to an NPE during unboxing.
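   
   For illustration, a minimal standalone sketch of the unboxing hazard (the class and method names here are made up, not part of this PR):
   
   ```
   public class UnboxingNpeDemo {
     // Same shape as the factory: a boxed parameter flows into a primitive.
     static void addConstraint(Boolean validate) {
       boolean primitive = validate; // auto-unboxing: throws NullPointerException when validate is null
       System.out.println(primitive);
     }
   
     public static void main(String[] args) {
       addConstraint(true); // fine
       addConstraint(null); // NPE at the unboxing point above
     }
   }
   ```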



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableChange.java:
##########
@@ -787,4 +806,81 @@ public int hashCode() {
       return Arrays.hashCode(clusteringColumns);
     }
   }
+
+  /** A TableChange to alter table and add a constraint. */
+  final class AddConstraint implements TableChange {
+    private final Constraint constraint;
+    private final boolean validate;

Review Comment:
   During the discussion, the SPIP design evolved to be like this:
   
   ```
   class AddConstraint implements TableChange {
     private final Constraint constraint;
     private final String validatedTableVersion; // if known and was validated 
     …
   }
   ```
   
   If Spark validated the constraint, it should set the validation status in 
the constraint to `VALID`. The table version will be there, if known.
   
   ```
   interface Table {
     …
   
     // reports the current table version, if supported
     default String currentVersion() { return null; }
   
     …
   }
   ```
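   
   To make the proposed flow concrete, a rough self-contained sketch (all types here are simplified stand-ins for the real connector API, not Spark code):
   
   ```
   public class AddConstraintProposalSketch {
     enum ValidationStatus { UNVALIDATED, VALID }
   
     record Constraint(String name, ValidationStatus status) {}
   
     record AddConstraint(Constraint constraint, String validatedTableVersion) {}
   
     interface Table {
       // reports the current table version, if supported
       default String currentVersion() { return null; }
     }
   
     // If Spark validated the constraint, mark it VALID and record the table
     // version the validation ran against (null when the table is not versioned).
     static AddConstraint buildChange(Table table, Constraint c, boolean sparkValidated) {
       if (sparkValidated) {
         Constraint valid = new Constraint(c.name(), ValidationStatus.VALID);
         return new AddConstraint(valid, table.currentVersion());
       }
       return new AddConstraint(c, null);
     }
   
     public static void main(String[] args) {
       Table versionedTable = new Table() {
         @Override
         public String currentVersion() { return "42"; }
       };
       Constraint check = new Constraint("positive_id", ValidationStatus.UNVALIDATED);
       System.out.println(buildChange(versionedTable, check, true));
       // -> AddConstraint[constraint=Constraint[name=positive_id, status=VALID], validatedTableVersion=42]
     }
   }
   ```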



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableChange.java:
##########
@@ -297,6 +298,24 @@ public int hashCode() {
     }
   }
 
+  /**
+   * Create a TableChange for adding a new Table Constraint
+   */
+  static TableChange addConstraint(Constraint constraint, Boolean validate) {
+    return new AddConstraint(constraint, validate);
+  }
+
+  /**
+   * Create a TableChange for dropping a Table Constraint
+   */
+  static TableChange dropConstraint(String name, Boolean ifExists, Boolean cascade) {
+    DropConstraint.Mode mode = DropConstraint.Mode.RESTRICT;
+    if (cascade) {

Review Comment:
   Would this fit on a single line?
   
   ```
   DropConstraint.Mode mode = cascade ? DropConstraint.Mode.CASCADE : DropConstraint.Mode.RESTRICT;
   ```



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableChange.java:
##########
@@ -297,6 +298,24 @@ public int hashCode() {
     }
   }
 
+  /**
+   * Create a TableChange for adding a new Table Constraint
+   */
+  static TableChange addConstraint(Constraint constraint, Boolean validate) {
+    return new AddConstraint(constraint, validate);
+  }
+
+  /**
+   * Create a TableChange for dropping a Table Constraint

Review Comment:
   Minor: `a Table Constraint` -> `a table constraint`?



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/connector/catalog/CatalogV2Util.scala:
##########
@@ -296,6 +297,49 @@ private[sql] object CatalogV2Util {
     }
   }
 
+  /**
+   * Apply Table Constraints changes to an existing table and return the result.
+   */
+  def applyConstraintChanges(
+      table: Table,
+      changes: Seq[TableChange]): Array[Constraint] = {
+    val constraints = table.constraints()
+
+    def findExistingConstraint(name: String): Option[Constraint] = {
+      constraints.find(_.name.toLowerCase(Locale.ROOT) == name.toLowerCase(Locale.ROOT))
+    }
+
+    changes.foldLeft(constraints) { (constraints, change) =>
+      change match {
+        case add: AddConstraint =>
+          val newConstraint = add.getConstraint
+          val existingConstraint = findExistingConstraint(newConstraint.name)
+          if (existingConstraint.isDefined) {
+            throw new AnalysisException(

Review Comment:
   When do we define dedicated error methods in classes like `QueryCompilationErrors`? Do we encourage external connectors to use built-in error classes in Spark?



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableChange.java:
##########
@@ -297,6 +298,24 @@ public int hashCode() {
     }
   }
 
+  /**
+   * Create a TableChange for adding a new Table Constraint

Review Comment:
   Minor: `a new Table Constraint` -> `a new table constraint.`?



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableChange.java:
##########
@@ -787,4 +806,81 @@ public int hashCode() {
       return Arrays.hashCode(clusteringColumns);
     }
   }
+
+  /** A TableChange to alter table and add a constraint. */
+  final class AddConstraint implements TableChange {
+    private final Constraint constraint;
+    private final boolean validate;
+
+    private AddConstraint(Constraint constraint, boolean validate) {
+      this.constraint = constraint;
+      this.validate = validate;
+    }
+
+    public Constraint getConstraint() {
+      return constraint;
+    }
+
+    public boolean isValidate() {
+      return validate;
+    }
+
+    @Override
+    public boolean equals(Object o) {
+      if (this == o) return true;
+      if (o == null || getClass() != o.getClass()) return false;
+      AddConstraint that = (AddConstraint) o;
+      return constraint.equals(that.constraint) && validate == that.validate;
+    }
+
+    @Override
+    public int hashCode() {
+      return Objects.hash(constraint, validate);
+    }
+  }
+
+  /** A TableChange to alter table and drop a constraint. */
+  final class DropConstraint implements TableChange {
+    private final String name;
+    private final boolean ifExists;
+    private final Mode mode;
+
+    /**
+     * Defines modes for dropping a constraint.
+     * <p>
+     * RESTRICT - Prevents dropping a constraint if it is referenced by other objects.
+     * CASCADE - Automatically drops objects that depend on the constraint.
+     */
+    public enum Mode { RESTRICT, CASCADE }
+
+    private DropConstraint(String name, boolean ifExists, Mode mode) {
+      this.name = name;
+      this.ifExists = ifExists;
+      this.mode = mode;
+    }
+
+    public String getName() {

Review Comment:
   Same feedback about `getXXX` prefixes.



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableChange.java:
##########
@@ -787,4 +806,81 @@ public int hashCode() {
       return Arrays.hashCode(clusteringColumns);
     }
   }
+
+  /** A TableChange to alter table and add a constraint. */
+  final class AddConstraint implements TableChange {
+    private final Constraint constraint;
+    private final boolean validate;
+
+    private AddConstraint(Constraint constraint, boolean validate) {
+      this.constraint = constraint;
+      this.validate = validate;
+    }
+
+    public Constraint getConstraint() {

Review Comment:
   Minor: I believe we rarely use `get` prefix in this file/set of APIs. I'd 
call it `constraint()`.


