[ https://issues.apache.org/jira/browse/SPARK-50620?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Fei Wang updated SPARK-50620:
-----------------------------
Description: 

"Executor 92 task launch worker for task 2910, task 69.0 in stage 7.0 (TID 2910)" #152 daemon prio=5 os_prio=0 cpu=616.25ms elapsed=258408.34s tid=0x00007f77d67cc330 nid=0x22c9e in Object.wait() [0x00007f77755fb000]
   java.lang.Thread.State: RUNNABLE
	at org.apache.spark.sql.internal.SQLConf$.<init>(SQLConf.scala:184)
	- waiting on the Class initialization monitor for org.apache.spark.sql.internal.SqlApiConf$
	at org.apache.spark.sql.internal.SQLConf$.<clinit>(SQLConf.scala)

"Executor 92 task launch worker for task 5362, task 521.0 in stage 10.0 (TID 5362)" #123 daemon prio=5 os_prio=0 cpu=2443.78ms elapsed=258409.29s tid=0x00007f77d60ecad0 nid=0x22c7c in Object.wait() [0x00007f777e591000]
   java.lang.Thread.State: RUNNABLE
	at java.lang.Class.forName0(java.base@17.0.6/Native Method)
	- waiting on the Class initialization monitor for org.apache.spark.sql.internal.SQLConf$
	at java.lang.Class.forName(java.base@17.0.6/Class.java:467)
	at org.apache.spark.util.SparkClassUtils.classForName(SparkClassUtils.scala:41)
	at org.apache.spark.util.SparkClassUtils.classForName$(SparkClassUtils.scala:36)
	at org.apache.spark.util.SparkClassUtils$.classForName(SparkClassUtils.scala:83)
	at org.apache.spark.sql.internal.SqlApiConf$.$anonfun$new$1(SqlApiConf.scala:73)

The first thread is running the static initializer of SQLConf$ and is blocked waiting on the class initialization monitor for SqlApiConf$:

{{at org.apache.spark.sql.internal.SQLConf$.<clinit>(SQLConf.scala)}}
{{- waiting on the Class initialization monitor for org.apache.spark.sql.internal.SqlApiConf$}}

The second thread is running the static initializer of SqlApiConf$ and is blocked waiting on the class initialization monitor for SQLConf$:

{{at org.apache.spark.sql.internal.SqlApiConf$.$anonfun$new$1(SqlApiConf.scala:73)}}
{{- waiting on the Class initialization monitor for org.apache.spark.sql.internal.SQLConf$}}

Each thread holds the class initialization lock the other one needs, so neither can ever make progress: a class-initialization deadlock between SQLConf$ and SqlApiConf$.
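To make the failure mode concrete, here is a minimal, self-contained Scala sketch of the same class-initialization deadlock pattern. The names InitA, InitB, and DeadlockDemo are illustrative only, not Spark's code: each object's initializer references the other, so two threads that concurrently trigger first use can each take one class-initialization lock and then block forever on the other's, exactly as in the thread dumps above.

{code:scala}
object InitA {
  // Referencing InitB here makes InitA's static initializer (<clinit> of
  // InitA$) depend on InitB's initialization completing first.
  val fromB: Int = InitB.marker
  val marker: Int = 1
}

object InitB {
  // ...and this reference makes InitB's initializer depend on InitA's,
  // closing the cycle (compare SQLConf$ <-> SqlApiConf$ in the dumps).
  val fromA: Int = InitA.marker
  val marker: Int = 2
}

object DeadlockDemo {
  def main(args: Array[String]): Unit = {
    // Two threads race to initialize the two objects, mirroring two executor
    // task threads touching SQLConf$ and SqlApiConf$ for the first time.
    val t1 = new Thread(() => println(InitA.marker))
    val t2 = new Thread(() => println(InitB.marker))
    t1.start(); t2.start()
    // If t1 wins InitA's class-init lock while t2 wins InitB's, each thread
    // then blocks on the other's lock and these joins never return. It is a
    // race, so the hang may take a few runs to reproduce.
    t1.join(); t2.join()
  }
}
{code}

The usual remedy for this pattern is to break the cycle, for example by having one side stop forcing the other's initialization, or by initializing both objects in a fixed order on a single thread before concurrent access begins; the dumps alone do not show which approach the fix takes.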
> Dead lock caused by SQLConf$ and SqlApiConf
> -------------------------------------------
>
>                 Key: SPARK-50620
>                 URL: https://issues.apache.org/jira/browse/SPARK-50620
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.5.0
>            Reporter: Fei Wang
>            Priority: Major
>

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org