>>> [...]le enough and persistent enough, why not just use an external
>>> metastore? It is a fairly straightforward process. Also, regardless of
>>> whether you are in the cloud or not, database backup is a routine and
>>> established pattern in most organizations. You can also enhance HA and
>>> DR [...]
> [...] don't see a strong reason to
> expose the APIs to Spark. Do you plan to add new SQL commands in Spark to
> backup/restore a catalog?
>
> On Tue, May 4, 2021 at 2:39 AM Tianchen Zhang wrote:
>
>> Hi all,
>>
>> Currently the user-facing Catalog API doesn't support backup/restore of
>> metadata. [...]
Hi all,

Currently the user-facing Catalog API doesn't support backup/restore of
metadata. Our customers are asking for such functionality. Here is a
usage example:
1. Read all metadata of one Spark cluster.
2. Save it into a Parquet file on DFS.
3. Read the Parquet file and restore all metadata in another Spark cluster.
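
For concreteness, here is a minimal sketch of that three-step flow using only
today's user-facing spark.catalog API. The object name and the DFS path
/backups/catalog are illustrative assumptions, and the current API does not
expose table schemas, formats, or properties, which is exactly why a richer
backup/restore API is being proposed; only database names are replayed here:

  import org.apache.spark.sql.SparkSession

  object CatalogBackupSketch {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("catalog-backup-sketch")
        .enableHiveSupport()
        .getOrCreate()
      import spark.implicits._

      // 1. Read catalog metadata through the existing user-facing API.
      val databases = spark.catalog.listDatabases()      // Dataset[Database]
      val tables = databases.collect().toSeq
        .map(db => spark.catalog.listTables(db.name))    // Dataset[Table] per database
        .reduce(_ union _)

      // 2. Save the metadata snapshot as Parquet files on DFS.
      databases.write.mode("overwrite").parquet("/backups/catalog/databases")
      tables.write.mode("overwrite").parquet("/backups/catalog/tables")

      // 3. On another cluster, read the snapshot back and replay it.
      //    Only database names can be recreated from this information;
      //    tables would need schema/format details the API doesn't return.
      val savedDbs = spark.read.parquet("/backups/catalog/databases")
      savedDbs.select("name").as[String].collect().foreach { name =>
        spark.sql(s"CREATE DATABASE IF NOT EXISTS `$name`")
      }

      spark.stop()
    }
  }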