Are you sure it's a deadlock? Post the thread dump (from kill -QUIT) of
the thread(s) that are deadlocked, to show where the issue is. It seems
unlikely that a logging thread would be holding locks that the app uses.
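
If sending QUIT to the container JVM is awkward, jstack <pid> produces the
same dump. You can also ask the JVM's own deadlock detector from inside
WildFly; a rough sketch in plain Java using the standard ThreadMXBean API
(the class and method names below are just illustrative, not an existing
helper):

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadInfo;
    import java.lang.management.ThreadMXBean;

    // Hypothetical helper: invoke it from inside the WildFly JVM, e.g. from
    // a management servlet or a scheduled task, when the hang is observed.
    public final class DeadlockCheck {
        public static void dumpDeadlockedThreads() {
            ThreadMXBean mx = ManagementFactory.getThreadMXBean();
            // Ids of threads deadlocked on monitors or ownable synchronizers,
            // or null if the JVM does not detect any deadlock.
            long[] ids = mx.findDeadlockedThreads();
            if (ids == null) {
                System.out.println("JVM reports no deadlocked threads");
                return;
            }
            // Include lock owners and full stack traces for each thread.
            for (ThreadInfo info : mx.getThreadInfo(ids, true, true)) {
                System.out.printf("%s blocked on %s held by %s%n",
                        info.getThreadName(), info.getLockName(),
                        info.getLockOwnerName());
                for (StackTraceElement frame : info.getStackTrace()) {
                    System.out.println("    at " + frame);
                }
            }
        }
    }

If that reports the Spark logging thread and a WildFly logging thread each
waiting on a lock the other holds, the stacks will show exactly which locks
are involved. (On the logging questions, see the note below the quoted
message.)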

On Fri, Nov 28, 2014 at 4:01 PM, Charles <charles...@cenx.com> wrote:
> We create a Spark context in an application running inside a WildFly
> container. When the Spark context is created, we see the following entries
> in the WildFly log. After log4j-defaults.properties is loaded, every entry
> from Spark is printed out twice, and after running for a while we start to
> see a deadlock between the Spark logging thread and the WildFly logging
> thread.
>
> Can I control Spark's logging from the driver application? How can I turn
> it off, and how can I control the level of the Spark logs, in the driver
> application?
>
> 2014-11-27 14:39:26,719 INFO  [akka.event.slf4j.Slf4jLogger]
> (spark-akka.actor.default-dispatcher-4) Slf4jLogger started
> 2014-11-27 14:39:26,917 INFO  [Remoting]
> (spark-akka.actor.default-dispatcher-2) Starting remoting
> 2014-11-27 14:39:27,719 INFO  [Remoting]
> (spark-akka.actor.default-dispatcher-2) Remoting started; listening on
> addresses :[akka.tcp://spark@172.32.1.12:43918]
> 2014-11-27 14:39:27,733 INFO  [Remoting]
> (spark-akka.actor.default-dispatcher-2) Remoting now listens on addresses:
> [akka.tcp://spark@172.32.1.12:43918]
> 2014-11-27 14:39:27,892 INFO  [org.apache.spark.SparkEnv] (MSC service
> thread 1-16) Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> 2014-11-27 14:39:27,895 ERROR [stderr] (MSC service thread 1-16) 14/11/27
> 14:39:27 INFO SparkEnv: Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> 2014-11-27 14:39:27,896 INFO  [org.apache.spark.SparkEnv] (MSC service
> thread 1-16) Registering BlockManagerMaster
> 2014-11-27 14:39:27,896 ERROR [stderr] (MSC service thread 1-16) 14/11/27
> 14:39:27 INFO SparkEnv: Registering BlockManagerMaster
> 2014-11-27 14:39:28,041 INFO  [org.apache.spark.storage.DiskBlockManager]
> (MSC service thread 1-16) Created local directory at
> /tmp/spark-local-20141127143928-d33c
> 2014-11-27 14:39:28,041 ERROR [stderr] (MSC service thread 1-16) 14/11/27
> 14:39:28 INFO DiskBlockManager: Created local directory at
> /tmp/spark-local-20141127143928-d33c
> 2014-11-27 14:39:28,055 INFO  [org.apache.spark.storage.MemoryStore] (MSC
> service thread 1-16) MemoryStore started with capacity 4.3 GB.
> 2014-11-27 14:39:28,055 ERROR [stderr] (MSC service thread 1-16) 14/11/27
> 14:39:28 INFO MemoryStore: MemoryStore started with capacity 4.3 GB.
> 2014-11-27 14:39:28,117 INFO  [org.apache.spark.network.ConnectionManager]
> (MSC service thread 1-16) Bound socket to port 34018 with id =
> ConnectionManagerId(ip-172-32-1-12,34018)
> 2014-11-27 14:39:28,118 ERROR [stderr] (MSC service thread 1-16) 14/11/27
> 14:39:28 INFO ConnectionManager: Bound socket to port 34018 with id =
> ConnectionManagerId(ip-172-32-1-12,34018)
> 2014-11-27 14:39:28,162 INFO  [org.apache.spark.storage.BlockManagerMaster]
> (MSC service thread 1-16) Trying to register BlockManager
> 2014-11-27 14:39:28,163 ERROR [stderr] (MSC service thread 1-16) 14/11/27
> 14:39:28 INFO BlockManagerMaster: Trying to register BlockManager
> 2014-11-27 14:39:28,181 INFO
> [org.apache.spark.storage.BlockManagerMasterActor$BlockManagerInfo]
> (spark-akka.actor.default-dispatcher-3) Registering block manager
> ip-172-32-1-12:34018 with 4.3 GB RAM
> 2014-11-27 14:39:28,185 ERROR [stderr]
> (spark-akka.actor.default-dispatcher-3) 14/11/27 14:39:28 INFO
> BlockManagerMasterActor$BlockManagerInfo: Registering block manager
> ip-172-32-1-12:34018 with 4.3 GB RAM
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Deadlock-between-spark-logging-and-wildfly-logging-tp20009.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
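
On the questions in the quoted message: Spark (1.x) logs through log4j 1.2
and only falls back to org/apache/spark/log4j-defaults.properties when it
finds no other log4j configuration, so the driver can adjust Spark's loggers
like any other log4j loggers. The doubled entries look like WildFly's
[stderr] capture re-logging what Spark's default console appender writes to
System.err. A rough sketch (the class and method names are just illustrative,
and the chosen levels are only an example):

    import org.apache.log4j.Level;
    import org.apache.log4j.Logger;

    // Hypothetical helper: call early in the driver, before or right after
    // creating the SparkContext, to quiet Spark's log4j output in that JVM.
    public final class SparkDriverLogging {
        public static void quietSpark() {
            // Raise the threshold for Spark's loggers
            // (use Level.OFF to silence them entirely).
            Logger.getLogger("org.apache.spark").setLevel(Level.WARN);
            Logger.getLogger("akka").setLevel(Level.WARN);
        }
    }

Alternatively, shipping your own log4j.properties on the classpath (or
pointing -Dlog4j.configuration at one) keeps Spark from loading its default
profile at all, which may also be the cleaner way to stop the duplicate
[stderr] lines.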

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
