Spark SQL: Exception on concurrent insert due to lease over _SUCCESS

2018-01-07 Thread Ajith shetty
Hi all, I am using Spark 2.1 and I encounter an exception when doing concurrent inserts on a table. Here is my scenario and some analysis. The table is created with: create table sample using csv options('path' '/tmp/f/'). When concurrent inserts are executed, we see an exception like the one below: 2017-12-29 13:41:11,117 | ERROR | main | A
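The failure above comes down to two jobs committing output into the same directory: each commit writes a `_SUCCESS` marker, and on HDFS the second writer cannot take the lease on a file the first still holds. As a minimal illustration (not Spark itself; the `commit_job` helper is hypothetical), the sketch below uses exclusive file creation (`O_EXCL`) to stand in for the HDFS lease, so exactly one of two concurrent committers succeeds:

```python
import os
import tempfile
import threading

def commit_job(output_dir, results):
    """Simulate a job commit: exclusively create the _SUCCESS marker.

    O_EXCL stands in for the HDFS lease: only one writer may hold it.
    """
    marker = os.path.join(output_dir, "_SUCCESS")
    try:
        fd = os.open(marker, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        results.append("committed")
    except FileExistsError:
        # The other writer already created the marker -> this commit fails,
        # analogous to the lease exception in the report above.
        results.append("lease conflict")

shared_dir = tempfile.mkdtemp()
results = []
threads = [threading.Thread(target=commit_job, args=(shared_dir, results))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))  # → ['committed', 'lease conflict']
```

One common way to avoid the conflict is to give each concurrent insert a distinct output path (for example, separate partition directories) so the committers never share a marker file.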

Please review the pull request #20177

2018-01-07 Thread Suchith J N
Hi, I'm new to Apache Spark and I'm interested in contributing. I worked on SPARK-22954 and have submitted a pull request. Could someone please review it? https://github.com/apache/spark/pull/20177 (I'm sorry if this is not the norm.) Thanks, Suchith