yuqi1129 commented on issue #6373: URL: https://github.com/apache/gravitino/issues/6373#issuecomment-2614289409
It seems that when we run the Hive command line, the code checks the permissions of `rootHDFSDirPath`. However, the permission models of HDFS and GCS are not the same: GCS does not use `rwx` bits to represent read, write, and execute, so I'm afraid we can only use GCS in the `CREATE TABLE` syntax, NOT in the configuration file. Below is the detailed stack trace:

```
Exception in thread "main" java.lang.RuntimeException: The dir: gs://xiaoyu123/tmp on HDFS should be writable. Current permissions are: rwx------
	at org.apache.hadoop.hive.ql.exec.Utilities.ensurePathIsWritable(Utilities.java:4501)
	at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:760)
	at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:701)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:627)
	at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:591)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:747)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
```

The code:
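For reference, here is a minimal sketch of the check that fails, reconstructed from the stack trace (`SessionState.createRootHDFSDir` -> `Utilities.ensurePathIsWritable`). It is not the verbatim Hive source; in particular, the `733` permission mask below is my assumption about what Hive requires for the root scratch dir:

```java
// Minimal sketch (not the verbatim Hive source) of the permission check that
// fails above. Assumption: Hive requires the root scratch dir to carry at
// least the bits in a 733 mask.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class EnsurePathIsWritableSketch {

  static void ensurePathIsWritable(Configuration conf, Path rootHDFSDirPath) throws IOException {
    // Assumed required mask for the root scratch dir (write/execute for group and other).
    FsPermission writableDirPermission = new FsPermission((short) 00733);

    FileSystem fs = rootHDFSDirPath.getFileSystem(conf);
    FsPermission currentPermission = fs.getFileStatus(rootHDFSDirPath).getPermission();

    // GCS has no real POSIX permission bits; the connector reports a synthetic
    // value (rwx------ in the trace above), so this bitmask test fails even
    // when the bucket is actually writable.
    if ((currentPermission.toShort() & writableDirPermission.toShort())
        != writableDirPermission.toShort()) {
      throw new RuntimeException("The dir: " + rootHDFSDirPath
          + " on HDFS should be writable. Current permissions are: " + currentPermission);
    }
  }
}
```

Because the `gs://` filesystem only exposes synthetic permission bits, a bitmask check like this can never pass regardless of the bucket's actual ACLs, which is why pointing the Hive root scratch dir (presumably `hive.exec.scratchdir`) at GCS in the configuration file breaks the Hive CLI at startup.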