dyrnq commented on issue #7442:
URL: https://github.com/apache/gravitino/issues/7442#issuecomment-3026500984

   @FANNG1 
   I just created a POC demo here <https://github.com/dyrnq/iceberg-rest-guide> to try out multi-catalog support for Spark and Flink.
   
   But the same error occurred; I'm not sure whether it is related to the `:9001/iceberg/v1/config` API.
   
   ```bash
   Exception in thread "main" org.apache.iceberg.exceptions.ServiceFailureException: Server error: RuntimeException: Couldn't find Iceberg configuration for foo
           at org.apache.iceberg.rest.ErrorHandlers$DefaultErrorHandler.accept(ErrorHandlers.java:217)
           at org.apache.iceberg.rest.ErrorHandlers$NamespaceErrorHandler.accept(ErrorHandlers.java:180)
           at org.apache.iceberg.rest.ErrorHandlers$NamespaceErrorHandler.accept(ErrorHandlers.java:166)
           at org.apache.iceberg.rest.HTTPClient.throwFailure(HTTPClient.java:224)
           at org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:308)
           at org.apache.iceberg.rest.BaseHTTPClient.get(BaseHTTPClient.java:77)
           at org.apache.iceberg.rest.RESTClient.get(RESTClient.java:97)
           at org.apache.iceberg.rest.RESTSessionCatalog.listNamespaces(RESTSessionCatalog.java:657)
           at org.apache.iceberg.catalog.BaseSessionCatalog$AsCatalog.listNamespaces(BaseSessionCatalog.java:133)
           at org.apache.iceberg.rest.RESTCatalog.listNamespaces(RESTCatalog.java:228)
           at org.apache.iceberg.catalog.SupportsNamespaces.listNamespaces(SupportsNamespaces.java:74)
   ```
   
   See <https://github.com/dyrnq/flink-coding/issues/11> for the Flink details.
   
   See <https://github.com/dyrnq/spark-scala-example/issues/35> for the Spark details.
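   For reference, this is roughly how the Spark side is pointed at the REST server in the POC. This is a sketch: the endpoint and the catalog name `foo` come from the error above, and passing the Gravitino catalog name through `warehouse` is my assumption about how the server is expected to pick a catalog.
   
   ```properties
   # Register a Spark catalog named "foo" backed by the Iceberg REST catalog.
   spark.sql.catalog.foo=org.apache.iceberg.spark.SparkCatalog
   spark.sql.catalog.foo.type=rest
   # Gravitino's Iceberg REST endpoint from the error message (assumed host).
   spark.sql.catalog.foo.uri=http://localhost:9001/iceberg/
   # Assumption: the Gravitino catalog name is selected via the warehouse property.
   spark.sql.catalog.foo.warehouse=foo
   ```
   
   If `GET /iceberg/v1/config?warehouse=foo` fails with the same `Couldn't find Iceberg configuration for foo`, then the `foo` mapping seems to be missing on the server side rather than in the client config.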
   