maheshguptags commented on issue #7589:
URL: https://github.com/apache/hudi/issues/7589#issuecomment-1480617541

   Hi @yihua,
   Thank you for looking into this request. I tried the same configuration as above and am still getting the same error.
   
   Please have a look at the stack trace and code below.
   
   **CODE**
   ```
   hsc.sql("use default")
   
   table_path = "s3a://test-spark-hudi/clustering_mor/"
   
   df = spark.read.format('org.apache.hudi').load(table_path)
   df.createOrReplaceTempView("clustering_mor")
   
   print('2', spark.sql("show tables from default").show())
   
   print('=========================')
   print('=========================', spark.catalog.listTables(), '=========================')
   print('========================')
   
   df1 = spark.sql("select * from clustering_mor")
   print("dddd", df1.printSchema())
   print(df1.show())
   
   # print(spark.sql("""call show_savepoints(table => 'clustering_mor')""").show())
   
   spark.sql("""call create_savepoint(path => table_path, commit_time => '20221228054602665'""")
   print("done................")
   ```
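   Two things stand out on re-reading the script above: the `create_savepoint` CALL is missing its closing parenthesis, and `table_path` inside the triple-quoted string is sent to Spark as the literal token `table_path` rather than the S3 path, since Python does not substitute variables into plain strings. A minimal sketch of how the statement could be assembled instead (the parameter names `path` and `commit_time` are taken from the snippet above; whether a given Hudi version expects `path` or `table` is worth checking against the procedure docs):

   ```python
   # Build the CALL statement with the Python variable interpolated via an
   # f-string, and with the closing parenthesis in place.
   table_path = "s3a://test-spark-hudi/clustering_mor/"
   commit_time = "20221228054602665"

   sql = f"call create_savepoint(path => '{table_path}', commit_time => '{commit_time}')"
   print(sql)
   # call create_savepoint(path => 's3a://test-spark-hudi/clustering_mor/', commit_time => '20221228054602665')

   # spark.sql(sql)  # run against a session with the Hudi SQL extension enabled
   ```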
   **Stacktrace** 
   ```
   import of spark session is done !!!!
   ============================================
   23/03/23 10:44:13 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
   23/03/23 10:44:18 WARN DFSPropertiesConfiguration: Cannot find HUDI_CONF_DIR, please set it as the dir of hudi-defaults.conf
   23/03/23 10:44:18 WARN DFSPropertiesConfiguration: Properties file file:/etc/hudi/conf/hudi-defaults.conf not found. Ignoring to load props file
   +---------+--------------+-----------+
   |namespace|     tableName|isTemporary|
   +---------+--------------+-----------+
   |         |clustering_mor|      false|
   +---------+--------------+-----------+
   
   2 None
   =========================
   ========================= [Table(name='clustering_mor', database=None, description=None, tableType='TEMPORARY', isTemporary=True)] =========================
   ========================
   root
    |-- _hoodie_commit_time: string (nullable = true)
    |-- _hoodie_commit_seqno: string (nullable = true)
    |-- _hoodie_record_key: string (nullable = true)
    |-- _hoodie_partition_path: string (nullable = true)
    |-- _hoodie_file_name: string (nullable = true)
    |-- campaign_id: string (nullable = true)
    |-- client_id: string (nullable = true)
    |-- created_by: string (nullable = true)
    |-- created_date: string (nullable = true)
    |-- event_count: string (nullable = true)
    |-- event_id: string (nullable = true)
    |-- updated_by: string (nullable = true)
    |-- updated_date: string (nullable = true)
   
   dddd None
   # WARNING: Unable to get Instrumentation. Dynamic Attach failed. You may add this JAR as -javaagent manually, or supply -Djdk.attach.allowAttachSelf
   # WARNING: Unable to attach Serviceability Agent. Unable to attach even with module exceptions: [org.apache.hudi.org.openjdk.jol.vm.sa.SASupportException: Sense failed., org.apache.hudi.org.openjdk.jol.vm.sa.SASupportException: Sense failed., org.apache.hudi.org.openjdk.jol.vm.sa.SASupportException: Sense failed.]
   
   +-------------------+--------------------+--------------------+----------------------+--------------------+-----------+----------+--------------------+--------------------+-----------+--------+--------------------+-------------------+
   |_hoodie_commit_time|_hoodie_commit_seqno|  _hoodie_record_key|_hoodie_partition_path|   _hoodie_file_name|campaign_id| client_id|          created_by|        created_date|event_count|event_id|          updated_by|       updated_date|
   +-------------------+--------------------+--------------------+----------------------+--------------------+-----------+----------+--------------------+--------------------+-----------+--------+--------------------+-------------------+
   |  20230109092536279|20230109092536279...|campaign_id:350,e...|       campaign_id=350|fe1ae9e1-f3b1-463...|        350|cl-WJxiIuA|Campaign_Event_Su...|2022-09-12T13:54:...|         79|       2|Campaign_Event_Su...|2023-01-09T09:25:24|
   +-------------------+--------------------+--------------------+----------------------+--------------------+-----------+----------+--------------------+--------------------+-----------+--------+--------------------+-------------------+
   
   None
   Traceback (most recent call last):
     File "/Users/maheshgupta/PycharmProjects/aws_hudi_connection/code/aws_hudi_spark_cluster_savepoint.py", line 52, in <module>
       spark.sql("""call create_savepoint(path => table_path, commit_time => '20221228054602665'""")
     File "/Library/Python/3.9/site-packages/pyspark/sql/session.py", line 1034, in sql
       return DataFrame(self._jsparkSession.sql(sqlQuery), self)
     File "/Library/Python/3.9/site-packages/py4j/java_gateway.py", line 1321, in __call__
       return_value = get_return_value(
     File "/Library/Python/3.9/site-packages/pyspark/sql/utils.py", line 196, in deco
       raise converted from None
   pyspark.sql.utils.ParseException: 
   Syntax error at or near 'call'(line 1, pos 0)
   pyspark.sql.utils.ParseException: 
   Syntax error at or near 'call'(line 1, pos 0)
   
   == SQL ==
   call create_savepoint(path => table_path, commit_time => '20221228054602665'
   ^^^
   
   
   Process finished with exit code 1
   
   ```
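   A parse error at position 0 of `call` suggests the parser never recognized the CALL syntax at all, which as far as I understand usually means the session was created without the Hudi SQL extension registered. A sketch of the session setup, assuming the Hudi Spark bundle is already on the classpath (a config fragment, untested here):

   ```python
   # Sketch: Hudi's CALL procedure syntax is only parsed when the Hudi
   # session extension is registered at session creation time.
   from pyspark.sql import SparkSession

   spark = (
       SparkSession.builder
       .appName("hudi-savepoint")
       .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
       .config("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
       .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.hudi.catalog.HoodieCatalog")
       .getOrCreate()
   )
   ```

   If I remember the docs correctly, the `HoodieCatalog` setting applies to Spark 3.2+; on earlier Spark versions only the extension and serializer lines are needed.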
   Please let me know if I am not using it correctly.
   
   Thanks,
   Mahesh


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
