aokolnychyi commented on code in PR #50137:
URL: https://github.com/apache/spark/pull/50137#discussion_r1980125783


##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableBuilderImpl.java:
##########
@@ -0,0 +1,74 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.connector.catalog;
+
+import com.google.common.collect.Maps;
+import org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException;
+import org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException;
+import org.apache.spark.sql.connector.expressions.Transform;
+
+import java.util.Map;
+
+/**
+ * Default implementation of {@link TableCatalog.TableBuilder}.
+ */
+public class TableBuilderImpl implements TableCatalog.TableBuilder {

Review Comment:
   Do we need to make this class public? I think we want external connectors to provide their own builders and keep this one purely as a backward-compatibility default.
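   For illustration, a rough sketch of the package-private variant this would mean (fragment only; the body stays exactly as in the PR, and the class remains reachable from the default `buildTable` because both types share the package):
   
   ```
   package org.apache.spark.sql.connector.catalog;
   
   // Package-private (no "public" modifier): TableCatalog's default buildTable()
   // can still instantiate it from the same package, while external connectors
   // cannot reference it and are expected to provide their own builders.
   class TableBuilderImpl implements TableCatalog.TableBuilder {
     // fields, constructor, and methods unchanged from the PR
   }
   ```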



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableBuilderImpl.java:
##########
@@ -0,0 +1,74 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.connector.catalog;
+
+import com.google.common.collect.Maps;
+import org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException;
+import org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException;
+import org.apache.spark.sql.connector.expressions.Transform;
+
+import java.util.Map;
+
+/**
+ * Default implementation of {@link TableCatalog.TableBuilder}.
+ */
+public class TableBuilderImpl implements TableCatalog.TableBuilder {

Review Comment:
   Optional: I personally prefer direct imports for nested classes/interfaces to shorten lines, as long as the context of the imported class is obvious.
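   For example (fragment only, just to show the effect on the declaration):
   
   ```
   import org.apache.spark.sql.connector.catalog.TableCatalog.TableBuilder;
   
   // With the nested interface imported directly, declarations and return types
   // shorten from TableCatalog.TableBuilder to TableBuilder.
   public class TableBuilderImpl implements TableBuilder {
     // body unchanged from the PR
   }
   ```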



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableCatalog.java:
##########
@@ -311,4 +311,49 @@ default boolean purgeTable(Identifier ident) throws UnsupportedOperationExceptio
    */
   void renameTable(Identifier oldIdent, Identifier newIdent)
       throws NoSuchTableException, TableAlreadyExistsException;
+
+  /**
+   * Instantiate a builder to create a table in the catalog.
+   *
+   * @param ident  a table identifier.
+   * @param columns the columns of the new table.
+   * @return the TableBuilder to create a table.
+   */
+  default TableBuilder buildTable(Identifier ident, Column[] columns) {

Review Comment:
   +1 to reusing it for replace.
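   Purely as a hypothetical sketch (none of these members come from the PR; the terminal method names are my assumptions), reusing the builder for replace could mean a second terminal operation next to the create path:
   
   ```
   import java.util.Map;
   
   import org.apache.spark.sql.catalyst.analysis.NoSuchTableException;
   import org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException;
   import org.apache.spark.sql.connector.catalog.Table;
   import org.apache.spark.sql.connector.expressions.Transform;
   
   // Hypothetical shape, for discussion only: the same partition/property state
   // collected by the builder could feed either a CREATE or a REPLACE operation.
   interface TableBuilderSketch {
     TableBuilderSketch withPartitions(Transform[] partitions);
   
     TableBuilderSketch withProperties(Map<String, String> properties);
   
     // Terminal operation for CREATE TABLE.
     Table create() throws TableAlreadyExistsException;
   
     // Hypothetical terminal operation for REPLACE TABLE, reusing the same state.
     Table replace() throws NoSuchTableException;
   }
   ```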



##########
sql/catalyst/src/test/scala/org/apache/spark/sql/connector/catalog/CatalogSuite.scala:
##########
@@ -152,6 +152,66 @@ class CatalogSuite extends SparkFunSuite {
     assert(catalog.tableExists(testIdent))
   }
 
+  test("createTable: using builder non-partitioned table") {
+    val catalog = newCatalog()
+
+    assert(!catalog.tableExists(testIdent))
+
+    val table = catalog.buildTable(testIdent, columns)
+                       .withPartitions(emptyTrans)

Review Comment:
   Nit: Formatting is off here; the chained `.withPartitions(...)` call shouldn't be vertically aligned under the receiver.



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableBuilderImpl.java:
##########
@@ -0,0 +1,74 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.connector.catalog;
+
+import com.google.common.collect.Maps;
+import org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException;
+import org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException;
+import org.apache.spark.sql.connector.expressions.Transform;
+
+import java.util.Map;
+
+/**
+ * Default implementation of {@link TableCatalog.TableBuilder}.
+ */
+public class TableBuilderImpl implements TableCatalog.TableBuilder {
+  /** Catalog where table needs to be created. */
+  private final TableCatalog catalog;
+  /** Table identifier. */

Review Comment:
   Optional: I think the variables are well-named and descriptive enough; I am not sure there is much value in the extra comments.



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableBuilderImpl.java:
##########
@@ -0,0 +1,73 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.connector.catalog;
+
+import com.google.common.collect.Maps;
+import org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException;
+import org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException;
+import org.apache.spark.sql.connector.expressions.Transform;
+
+import java.util.Map;
+
+/**
+ * Default implementation of {@link TableCatalog.TableBuilder}.
+ */
+public class TableBuilderImpl implements TableCatalog.TableBuilder {
+  /** Catalog where table needs to be created. */
+  private final TableCatalog catalog;
+  /** Table identifier. */
+  private final Identifier identifier;
+  /** Columns of the new table. */
+  private final Column[] columns;
+  /** Table properties. */
+  private final Map<String, String> properties = Maps.newHashMap();
+  /** Transforms to use for partitioning data in the table. */
+  private Transform[] partitions = new Transform[0];

Review Comment:
   I would say we have to keep the existing behavior, which I believe passes an 
empty array for unpartitioned tables.
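   To make that concrete, a sketch of what building without partitions is expected to reduce to (the delegation to `createTable` is my reading of the intent, not code quoted from the PR):
   
   ```
   import java.util.Collections;
   
   import org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException;
   import org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException;
   import org.apache.spark.sql.connector.catalog.Column;
   import org.apache.spark.sql.connector.catalog.Identifier;
   import org.apache.spark.sql.connector.catalog.Table;
   import org.apache.spark.sql.connector.catalog.TableCatalog;
   import org.apache.spark.sql.connector.expressions.Transform;
   
   final class UnpartitionedDefaultSketch {
     // Default kept when withPartitions is never called.
     private static final Transform[] NO_PARTITIONS = new Transform[0];
   
     // Building an unpartitioned table should stay equivalent to today's direct
     // createTable call with an empty Transform[].
     static Table createUnpartitioned(TableCatalog catalog, Identifier ident, Column[] columns)
         throws TableAlreadyExistsException, NoSuchNamespaceException {
       return catalog.createTable(ident, columns, NO_PARTITIONS, Collections.emptyMap());
     }
   }
   ```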



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableCatalog.java:
##########
@@ -311,4 +311,49 @@ default boolean purgeTable(Identifier ident) throws UnsupportedOperationExceptio
    */
   void renameTable(Identifier oldIdent, Identifier newIdent)
       throws NoSuchTableException, TableAlreadyExistsException;
+
+  /**
+   * Instantiate a builder to create a table in the catalog.
+   *
+   * @param ident  a table identifier.
+   * @param columns the columns of the new table.
+   * @return the TableBuilder to create a table.
+   */
+  default TableBuilder buildTable(Identifier ident, Column[] columns) {
+    return new TableBuilderImpl(this, ident, columns);
+  }
+
+  /**
+   * Builder used to create tables.
+   *
+   * <p>Call {@link #buildTable(Identifier, Column[])} to create a new builder.
+   */
+  interface TableBuilder {
+    /**
+     * Sets the partitions for the table.
+     *
+     * @param partitions Partitions for the table.
+     * @return this for method chaining
+     */
+    TableBuilder withPartitions(Transform[] partitions);
+
+    /**
+     * Adds key/value properties to the table.
+     *
+     * @param properties key/value properties
+     * @return this for method chaining
+     */
+    TableBuilder withProperties(Map<String, String> properties);

Review Comment:
   We can add new methods like `addProperty` once there is a use case for them. I do agree with Szehon that `withProperties` should replace the original map, as the PR currently does.
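   A small sketch of the two semantics being discussed (illustration only; `addProperty` is the hypothetical variant, not something the PR adds):
   
   ```
   import java.util.HashMap;
   import java.util.Map;
   
   // Illustration only, not the PR's code.
   final class PropertiesSemanticsSketch {
     private final Map<String, String> properties = new HashMap<>();
   
     // Replace semantics, roughly what the comment above describes: the supplied
     // map becomes the entire property set, discarding anything set earlier.
     PropertiesSemanticsSketch withProperties(Map<String, String> newProperties) {
       properties.clear();
       properties.putAll(newProperties);
       return this;
     }
   
     // Hypothetical additive variant, deferred until there is a use case:
     // merges a single entry into whatever was set before.
     PropertiesSemanticsSketch addProperty(String key, String value) {
       properties.put(key, value);
       return this;
     }
   }
   ```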



##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableBuilderImpl.java:
##########
@@ -0,0 +1,74 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.connector.catalog;
+
+import com.google.common.collect.Maps;
+import org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException;
+import org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException;
+import org.apache.spark.sql.connector.expressions.Transform;
+
+import java.util.Map;
+
+/**
+ * Default implementation of {@link TableCatalog.TableBuilder}.
+ */
+public class TableBuilderImpl implements TableCatalog.TableBuilder {
+  /** Catalog where table needs to be created. */
+  private final TableCatalog catalog;
+  /** Table identifier. */
+  private final Identifier identifier;
+  /** Columns of the new table. */
+  private final Column[] columns;
+  /** Table properties. */
+  private final Map<String, String> properties = Maps.newHashMap();
+  /** Transforms to use for partitioning data in the table. */
+  private Transform[] partitions = new Transform[0];
+
+  /**
+   * Constructor for TableBuilderImpl.
+   *
+   * @param catalog catalog where table needs to be created.
+   * @param identifier identifier for the table.
+   * @param columns the columns of the new table.
+   */
+  public TableBuilderImpl(TableCatalog catalog,

Review Comment:
   Question: What style do we follow for formatting constructor parameters in Java? It is not very consistent in the codebase, but it mostly matches what we do in Scala, where the first parameter goes on a new line.
   
   ```
   public TableBuilderImpl(
       TableCatalog catalog,
       Identifier identifier,
       Column[] columns) {
     this.catalog = catalog;
     this.identifier = identifier;
     this.columns = columns;
   }
   ``` 
   
   @cloud-fan @gengliangwang @szehon-ho, any input?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
