XJDKC commented on code in PR #3729:
URL: https://github.com/apache/polaris/pull/3729#discussion_r3010985756


##########
polaris-core/src/main/java/org/apache/polaris/core/connection/iceberg/IcebergRestConnectionConfigInfoDpo.java:
##########
@@ -72,6 +87,12 @@ public String getRemoteCatalogName() {
     }
     // Add authentication-specific metadata (non-credential properties)
     properties.putAll(getAuthenticationParameters().asIcebergCatalogProperties(credentialManager));
+    if (getConfigs().containsKey(GOOGLE_USER_PROJECT_HEADER_KEY)) {
+      properties.put(
+          "header." + GOOGLE_USER_PROJECT_HEADER_KEY,

Review Comment:
   If we choose to allow customers to provide an additional config map rather than headers, I think it's better to comply with the spec: end users should explicitly specify `header.x-goog-user-project` instead of relying on us to infer it. This is also the configuration recommended by BigLake's public docs; when using Spark, end users must provide the project ID with the property key `header.x-goog-user-project`.
   
   
   ```py
   from pyspark.sql import SparkSession

   catalog_name = "CATALOG_NAME"
   spark = SparkSession.builder.appName("APP_NAME") \
     .config(f'spark.sql.catalog.{catalog_name}', 'org.apache.iceberg.spark.SparkCatalog') \
     .config(f'spark.sql.catalog.{catalog_name}.type', 'rest') \
     .config(f'spark.sql.catalog.{catalog_name}.uri', 'https://biglake.googleapis.com/iceberg/v1/restcatalog') \
     .config(f'spark.sql.catalog.{catalog_name}.warehouse', 'WAREHOUSE_PATH') \
     .config(f'spark.sql.catalog.{catalog_name}.header.x-goog-user-project', 'PROJECT_ID') \
     .config(f'spark.sql.catalog.{catalog_name}.rest.auth.type', 'org.apache.iceberg.gcp.auth.GoogleAuthManager') \
     .config(f'spark.sql.catalog.{catalog_name}.io-impl', 'org.apache.iceberg.gcp.gcs.GCSFileIO') \
     .config(f'spark.sql.catalog.{catalog_name}.rest-metrics-reporting-enabled', 'false') \
     .config('spark.sql.extensions', 'org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions') \
     .config('spark.sql.defaultCatalog', catalog_name) \
     .getOrCreate()
   ```
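
   For context on why the explicit key works end to end: REST catalog clients conventionally treat any catalog property under the `header.` prefix as an HTTP header to attach to every catalog request, with the prefix stripped. A minimal sketch of that convention (illustrative only; the names here are not Iceberg's actual classes or code):

```py
# Sketch of the "header."-prefix convention used by Iceberg REST catalog
# clients: properties whose keys start with "header." are stripped of the
# prefix and sent as HTTP headers on catalog requests. Illustrative only.

HEADER_PREFIX = "header."

def extract_headers(catalog_properties: dict) -> dict:
    """Collect 'header.'-prefixed catalog properties as HTTP headers."""
    return {
        key[len(HEADER_PREFIX):]: value
        for key, value in catalog_properties.items()
        if key.startswith(HEADER_PREFIX)
    }

props = {
    "uri": "https://biglake.googleapis.com/iceberg/v1/restcatalog",
    "header.x-goog-user-project": "PROJECT_ID",
}
print(extract_headers(props))  # {'x-goog-user-project': 'PROJECT_ID'}
```

   So if the user sets `header.x-goog-user-project` explicitly, the header reaches BigLake without Polaris having to derive it from other config.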



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
