szehon-ho commented on code in PR #50137:
URL: https://github.com/apache/spark/pull/50137#discussion_r1978292403


##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableCatalog.java:
##########
@@ -311,4 +311,49 @@ default boolean purgeTable(Identifier ident) throws UnsupportedOperationException
    */
   void renameTable(Identifier oldIdent, Identifier newIdent)
       throws NoSuchTableException, TableAlreadyExistsException;
+
+  /**
+   * Instantiate a builder to create a table in the catalog.
+   *
+   * @param ident  a table identifier.

Review Comment:
   Note: I guess you were trying to align the params? But I think the other method javadocs in this file don't align them when they are on the same line.
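   
   For illustration, a minimal sketch of the unaligned single-space style (assuming the other javadocs in this file follow it):
   
   ```java
   /**
    * @param ident a table identifier.
    * @param columns the columns of the new table.
    */
   ```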



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableCatalog.java:
##########
@@ -311,4 +311,49 @@ default boolean purgeTable(Identifier ident) throws UnsupportedOperationException
    */
   void renameTable(Identifier oldIdent, Identifier newIdent)
       throws NoSuchTableException, TableAlreadyExistsException;
+
+  /**
+   * Instantiate a builder to create a table in the catalog.
+   *
+   * @param ident  a table identifier.
+   * @param columns the columns of the new table.
+   * @return the TableBuilder to create a table.
+   */
+  default TableBuilder buildTable(Identifier ident, Column[] columns) {
+    return new TableBuilderImpl(this, ident, columns);
+  }
+
+  /**
+   * Builder used to create tables.
+   *
+   * <p>Call {@link #buildTable(Identifier, Column[])} to create a new builder.
+   */
+  interface TableBuilder {
+    /**
+     * Sets the partitions for the table.
+     *
+     * @param partitions Partitions for the table.
+     * @return this for method chaining
+     */
+    TableBuilder withPartitions(Transform[] partitions);
+
+    /**
+     * Adds key/value properties to the table.
+     *
+     * @param properties key/value properties
+     * @return this for method chaining
+     */
+    TableBuilder withProperties(Map<String, String> properties);

Review Comment:
   The name 'withProperties' sounds like it should set/replace the existing map rather than add to it, based on how the other method is defined.
   
   Elsewhere in Spark, builder-style conf methods take a single key/value pair. I wonder if we should follow that pattern if the goal is an additive API; otherwise it seems wasteful (callers have to build a temporary java Map just to call this).
   
   Maybe we can support both APIs?
   
   Just my thought.
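   
   For example, a rough sketch of what supporting both could look like (the single-pair method name `withProperty` is illustrative, not part of the PR):
   
   ```java
   // Additive single-pair variant, similar in spirit to
   // SparkSession.Builder#config(String, String):
   TableBuilder withProperty(String key, String value);
   
   // Bulk variant; the javadoc should spell out whether it adds to or
   // replaces properties supplied earlier in the chain.
   TableBuilder withProperties(Map<String, String> properties);
   ```
   
   With the single-pair variant, a caller can chain e.g. `.withProperty("provider", "parquet")` without building a temporary Map first.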


