Madhukar525722 commented on code in PR #50022:
URL: https://github.com/apache/spark/pull/50022#discussion_r1967534275


##########
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala:
##########
@@ -1407,13 +1408,74 @@ private[hive] object HiveClientImpl extends Logging {
       case _ =>
         new HiveConf(conf, classOf[HiveConf])
     }
-    try {
+    val hive = try {
       Hive.getWithoutRegisterFns(hiveConf)
     } catch {
      // SPARK-37069: not all Hive versions have the above method (e.g., Hive 2.3.9 has it but
-      // 2.3.8 don't), therefore here we fallback when encountering the exception.
+      // 2.3.8 doesn't), therefore here we fallback when encountering the exception.
       case _: NoSuchMethodError =>
         Hive.get(hiveConf)
     }
+
+    // Follow behavior of HIVE-26633 (4.0.0), only apply the max message size when
+    // `hive.thrift.client.max.message.size` is set and the value is positive
+    Option(hiveConf.get("hive.thrift.client.max.message.size"))
+      .map(HiveConf.toSizeBytes(_).toInt).filter(_ > 0)
+      .foreach { maxMessageSize =>
+        logDebug(s"Trying to set metastore client thrift max message to $maxMessageSize")

Review Comment:
   @pan3793 , I have built using the latest patch only.
   When I enabled debug logging and ran it, I can see
   ```
   25/02/24 12:11:30 DEBUG HiveClientImpl: Trying to set metastore client thrift max message to 1073741824
   ```
   But the follow-up log line, "Change the current metastore client thrift max message size from ...", is not there.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

