[ https://issues.apache.org/jira/browse/SPARK-15401?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15290792#comment-15290792 ]
Christophe Préaud commented on SPARK-15401:
-------------------------------------------

The Thrift server above is launched on a YARN cluster (hence in yarn-client mode). The tmp directory on the driver has been set to "/opt/kookel/data/spark-tmp" using the two properties below:

spark.local.dir=/opt/kookel/data/spark-tmp
spark.driver.extraJavaOptions=-Djava.io.tmpdir=/opt/kookel/data/spark-tmp

But the problem was the same when it was set to the default /tmp.

> Spark Thrift server creates empty directories in tmp directory
> --------------------------------------------------------------
>
>                 Key: SPARK-15401
>                 URL: https://issues.apache.org/jira/browse/SPARK-15401
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1
>            Reporter: Christophe Préaud
>            Priority: Minor
>
> Each connection to the Spark Thrift server (e.g. using beeline) creates two
> empty directories in the tmp directory which are never removed:
>
> cd <tmp directory>
> ls -ltd *_resources | wc -l && /opt/spark/bin/beeline -u jdbc:hive2://dc1-kdp-prod-hadoop-00.prod.dc1.kelkoo.net:10000 -n kookel -e '!quit' && ls -ltd *_resources | wc -l
> 9080
> Connecting to jdbc:hive2://dc1-kdp-prod-hadoop-00.prod.dc1.kelkoo.net:10000
> Connected to: Spark SQL (version 1.6.1)
> Driver: Spark Project Core (version 1.6.1)
> Transaction isolation: TRANSACTION_REPEATABLE_READ
> Closing: 0: jdbc:hive2://dc1-kdp-prod-hadoop-00.prod.dc1.kelkoo.net:10000
> Beeline version 1.6.1 by Apache Hive
> 9082
>
> These directories accumulate over time and are never removed:
> ls -ld *_resources | wc -l
> 9064
> And they are indeed empty:
> find *_resources -type f | wc -l
> 0

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
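Until the leak itself is fixed, the accumulated inodes can be reclaimed with a periodic cleanup. A minimal sketch, assuming the *_resources directories sit directly under the configured tmp directory (the default path below is the reporter's spark.local.dir; adjust for your deployment):

```shell
#!/bin/sh
# Hypothetical cleanup sketch, not part of Spark: remove the empty
# "*_resources" directories that each Thrift-server connection leaves behind.
SPARK_TMP="${1:-/opt/kookel/data/spark-tmp}"

# -maxdepth 1 restricts the search to the tmp directory itself;
# -empty matches the observation above that the directories contain no
# files (find *_resources -type f | wc -l == 0), so nothing with content
# is ever deleted.
find "$SPARK_TMP" -maxdepth 1 -type d -name '*_resources' -empty -delete
```

Running this from cron (or a systemd timer) keeps the directory count bounded; the `-empty` test makes it safe to run while the server is up, since any directory a live session is actually using non-trivially would contain files and be skipped.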