dataageek opened a new issue, #6310:
URL: https://github.com/apache/gravitino/issues/6310

   ### Version
   
   main branch
   
   ### Describe what's wrong
   
   I have multiple catalogs created using the Iceberg JDBC catalog. When I consume these catalogs through the Gravitino REST server (using the REST catalog via Spark), the implementation always falls back to `jdbc` as the catalog name instead of the catalog name specified in the Spark configuration.
   
   For example, when running the following command:
   
   ```
   ./spark-sql \
       --conf spark.sql.catalog.iceberg_jdbc_catalog=org.apache.iceberg.spark.SparkCatalog \
       --conf spark.sql.catalog.iceberg_jdbc_catalog.warehouse="/mnt/c/iceberg-warehouse" \
       --conf spark.sql.catalog.iceberg_jdbc_catalog.type=rest \
       --conf spark.sql.catalog.iceberg_jdbc_catalog.uri=http://127.0.0.1:9991/iceberg \
       --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
   ```
   
   I expect Spark to use the `iceberg_jdbc_catalog` catalog, as specified in the configuration. However, the Gravitino REST server implementation instead looks for a catalog named `jdbc`.
   
   Although I can override this behavior with the `catalog-backend-name` property in `gravitino.conf`, I would prefer a dynamic mechanism that fetches the catalog name from the Spark configuration directly.
   
   
   ### Error message and/or stacktrace
   
    No error message; it simply does not work as expected.
   
   ### How to reproduce
   
   0.7.0
   
   ### Additional context
   
   _No response_

