cloud-fan commented on PR #50137:
URL: https://github.com/apache/spark/pull/50137#issuecomment-2721388429

   OK, so the requirements are:
   - adding a new item to table metadata should not require a new overload of `def createTable`
   - existing `def createTable` implementations should fail if a new table feature is requested, but should keep working when the new feature is not used
   
   I think the builder should not be an interface, but a class, so that it can hold member variables and provide concrete method implementations:
   ```
   class CreateTableBuilder {
     protected Column[] columns;
     ...
     CreateTableBuilder withColumns(Column[] columns) ...
     ...
     Table create() ...
   }
   ```
   
   Then we add a new method to `TableCatalog` that creates a builder, with a default implementation that returns the built-in builder, so existing catalog implementations are not forced to change.
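   
   A minimal sketch of what that could look like, assuming a factory method named `newCreateTableBuilder` and a hypothetical `BuiltinCreateTableBuilder` class (both names are placeholders, not a final API):
   ```
   public interface TableCatalog {
     // ... existing methods such as loadTable, createTable, etc.
   
     // New factory method with a default implementation, so existing catalog
     // implementations keep compiling and working without any change.
     default CreateTableBuilder newCreateTableBuilder(Identifier ident) {
       return new BuiltinCreateTableBuilder(this, ident);
     }
   }
   ```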
   
   The workflow is: Spark gets the builder from `TableCatalog`, calls the `withXXX` methods to set up the fields, and calls `create` at the end.
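   
   For example (the exact `withXXX` method names below are just illustrative):
   ```
   CreateTableBuilder builder = catalog.newCreateTableBuilder(ident);
   Table table = builder
     .withColumns(columns)
     .withPartitions(partitions)
     .withProperties(properties)
     .create();
   ```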
   
   When we add a new item (e.g. constraints), we add a new `withXXX` method and a matching `supportsXXX` method to the builder. By default `supportsXXX` returns false, and the new `withXXX` method throws an exception when `supportsXXX` returns false. Catalog implementations need to explicitly override `supportsXXX` to return true in order to adopt the new feature.
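   
   A sketch of that pattern, using constraints as the example (the `Constraint` type and the `supportsConstraints`/`withConstraints` names are assumptions for illustration):
   ```
   public abstract class CreateTableBuilder {
     protected Column[] columns;
     protected Constraint[] constraints;
   
     public CreateTableBuilder withColumns(Column[] columns) {
       this.columns = columns;
       return this;
     }
   
     // New capability flag: false by default, so builders that predate the
     // feature reject it unless they explicitly opt in.
     public boolean supportsConstraints() {
       return false;
     }
   
     // New setter: fails unless the builder has opted in via supportsConstraints.
     public CreateTableBuilder withConstraints(Constraint[] constraints) {
       if (!supportsConstraints()) {
         throw new UnsupportedOperationException(
           "This catalog does not support table constraints");
       }
       this.constraints = constraints;
       return this;
     }
   
     // Each catalog supplies the actual table creation logic.
     public abstract Table create();
   }
   ```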

