Hello everyone!

As the title says: I started the Spark SQL 1.2.0 Thrift server and connected
to it with beeline to execute SQL.
I want to kill a single SQL job running in the Thrift server without killing
the Thrift server itself.
I set the property spark.ui.killEnabled=true in spark-defaults.conf.
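That is, my conf file (assuming the stock location, $SPARK_HOME/conf/spark-defaults.conf) contains the line:

    spark.ui.killEnabled    true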
But in the web UI only stages can be killed; the job itself cannot!
Is there any way to kill a SQL job in the Thrift server?
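For comparison, in a standalone application I could cancel jobs through the
SparkContext with job groups, something like the minimal Scala sketch below
(the group id "my-statement" is made up for illustration). But inside the
Thrift server I have no handle on the SparkContext:

    import org.apache.spark.{SparkConf, SparkContext}

    object CancelSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("cancel-sketch"))

        // Tag all jobs submitted from this thread with a group id;
        // interruptOnCancel = true asks Spark to interrupt the running
        // tasks when the group is cancelled.
        sc.setJobGroup("my-statement", "a long-running SQL statement",
          interruptOnCancel = true)

        // ... submit the long-running work from another thread ...

        // Cancel every job in the group without stopping the
        // SparkContext, i.e. without killing the server itself.
        sc.cancelJobGroup("my-statement")

        sc.stop()
      }
    }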


Xiaoyu Wang