If a catalog implements backup/restore, it can easily expose client
APIs (e.g. a REST API) to end users; I don't see a strong reason to
expose these APIs through Spark. Do you plan to add new SQL commands in
Spark to back up/restore a catalog?
On Tue, May 4, 2021 at 2:39 AM Tianchen Zhang wrote:
For now we are thinking about adding two methods to the Catalog API, not
SQL commands:
1. spark.catalog.backup, which backs up the current catalog to a DFS file.
2. spark.catalog.restore(file), which reads the DFS file and recreates the
entities described in that file.
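A minimal sketch of the round-trip semantics those two proposed methods describe, using a plain-Python stand-in for the catalog (the class, method names, and JSON format here are illustrative only; neither spark.catalog.backup nor spark.catalog.restore exists in Spark):

```python
# Toy catalog illustrating the proposed backup/restore round trip:
# serialize all entities to a file, then recreate them from that file.
# This is a sketch of the idea, not Spark's actual Catalog API.
import json
import os
import tempfile


class Catalog:
    """In-memory stand-in for a catalog holding table definitions."""

    def __init__(self):
        self.tables = {}  # table name -> schema description

    def create_table(self, name, schema):
        self.tables[name] = schema

    def backup(self, path):
        # Proposed spark.catalog.backup: dump all entities to a file.
        with open(path, "w") as f:
            json.dump({"tables": self.tables}, f)

    def restore(self, path):
        # Proposed spark.catalog.restore(file): read the file and
        # recreate the entities described in it.
        with open(path) as f:
            snapshot = json.load(f)
        for name, schema in snapshot["tables"].items():
            self.create_table(name, schema)


# Back up one catalog, then restore its contents into a fresh one.
src = Catalog()
src.create_table("events", "id INT, ts TIMESTAMP")
path = os.path.join(tempfile.mkdtemp(), "catalog.json")
src.backup(path)

dst = Catalog()
dst.restore(path)
print(dst.tables["events"])  # -> id INT, ts TIMESTAMP
```

The restored catalog ends up with the same entities as the original, which is the behavior the proposal implies for a DFS-backed snapshot.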
Can you please give an example of exposing client APIs?