[ https://issues.apache.org/jira/browse/HIVE-27087?focusedWorklogId=851308&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-851308 ]
ASF GitHub Bot logged work on HIVE-27087:
-----------------------------------------

            Author: ASF GitHub Bot
        Created on: 16/Mar/23 10:21
        Start Date: 16/Mar/23 10:21
Worklog Time Spent: 10m

Work Description: amanraj2520 commented on PR #4067:
URL: https://github.com/apache/hive/pull/4067#issuecomment-1471679122

@vihangk1 Thanks for working on this. We have fixed 7-8 tests on branch-3 since this branch was cut. Just one request: could you merge your branch with the current branch-3 so that we have a combined list of the remaining test failures after this fix? That would give us concrete next action items.

Issue Time Tracking
-------------------

    Worklog Id: (was: 851308)
    Time Spent: 3h 40m (was: 3.5h)

> Fix TestMiniSparkOnYarnCliDriver test failures on branch-3
> ----------------------------------------------------------
>
>             Key: HIVE-27087
>             URL: https://issues.apache.org/jira/browse/HIVE-27087
>         Project: Hive
>      Issue Type: Sub-task
>        Reporter: Vihang Karajgaonkar
>        Assignee: Vihang Karajgaonkar
>        Priority: Major
>          Labels: pull-request-available
>      Time Spent: 3h 40m
> Remaining Estimate: 0h
>
> The TestMiniSparkOnYarnCliDriver tests are failing with the error below:
>
> [ERROR] 2023-02-16 14:13:08.991 [Driver] SparkContext - Error initializing SparkContext.
> java.lang.RuntimeException: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE
> at org.apache.spark.network.util.NettyUtils.getPrivateStaticField(NettyUtils.java:131) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:118) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.server.TransportServer.init(TransportServer.java:94) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.server.TransportServer.<init>(TransportServer.java:73) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.network.TransportContext.createServer(TransportContext.java:114) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnv.startServer(NettyRpcEnv.scala:119) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:465) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:464) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2271) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160) ~[scala-library-2.11.8.jar:?]
> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2263) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:469) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175) ~[spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256) [spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:423) [spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58) [spark-core_2.11-2.3.0.jar:2.3.0]
> at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:161) [hive-exec-3.2.0-SNAPSHOT.jar:3.2.0-SNAPSHOT]
> at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:536) [hive-exec-3.2.0-SNAPSHOT.jar:3.2.0-SNAPSHOT]
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_322]
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_322]
>
> The root cause of the problem is that we upgraded the netty library from 4.1.17.Final to 4.1.69.Final. The upgraded library no longer has the `DEFAULT_TINY_CACHE_SIZE` field (still present in [4.1.51.Final|https://github.com/netty/netty/blob/netty-4.1.51.Final/buffer/src/main/java/io/netty/buffer/PooledByteBufAllocator.java#L46]); it was removed in 4.1.52.Final.
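For context on where the stack trace originates: org.apache.spark.network.util.NettyUtils reads this Netty constant via reflection. The sketch below is a minimal, self-contained reproduction of that reflection pattern (it is not Spark's actual source); run against Netty 4.1.51.Final or older it prints the value, while against 4.1.52.Final and later (including 4.1.69.Final) it throws the same NoSuchFieldException seen above.

{code:java}
import java.lang.reflect.Field;

import io.netty.buffer.PooledByteBufAllocator;

// Minimal sketch of the reflection pattern used by Spark's
// NettyUtils#getPrivateStaticField (not Spark's actual source).
public class TinyCacheSizeLookup {
    public static void main(String[] args) {
        try {
            // DEFAULT_TINY_CACHE_SIZE is a private static int field in Netty's
            // PooledByteBufAllocator up to 4.1.51.Final; it no longer exists
            // as of 4.1.52.Final.
            Field f = PooledByteBufAllocator.class
                    .getDeclaredField("DEFAULT_TINY_CACHE_SIZE");
            f.setAccessible(true);
            System.out.println("DEFAULT_TINY_CACHE_SIZE = " + f.getInt(null));
        } catch (NoSuchFieldException | IllegalAccessException e) {
            // With Netty 4.1.52.Final+, getDeclaredField throws
            // NoSuchFieldException, which gets wrapped in a RuntimeException
            // just as in the stack trace above.
            throw new RuntimeException(e);
        }
    }
}
{code}

This also frames the possible fixes: either keep a Netty version that still has the field on the test classpath, or use a Spark build whose NettyUtils no longer reflects on it.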