The solution is to strip it out in a hook on your thread pool by overriding
beforeExecute. See:
https://docs.oracle.com/javase/8/docs/api/java/util/concurrent/ThreadPoolExecutor.html
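A minimal, runnable sketch of that approach outside Spark (SparkContext.localProperties is spark-private, so the `localProps` holder below is a stand-in for it; the object and method names are illustrative, not Spark API):

```scala
import java.util.Properties
import java.util.concurrent.{LinkedBlockingQueue, ThreadPoolExecutor, TimeUnit}

object ExecutionIdStrippingPool {
  // Stand-in for SparkContext.localProperties: each child thread
  // gets a clone of the creating thread's properties.
  val localProps = new InheritableThreadLocal[Properties] {
    override protected def childValue(parent: Properties): Properties = {
      val cloned = new Properties()
      cloned.putAll(parent)
      cloned
    }
    override protected def initialValue(): Properties = new Properties()
  }

  // A pool whose worker threads drop the inherited execution id before
  // each task runs, so a task submitted from inside a query never trips
  // "spark.sql.execution.id is already set".
  def newPool(threads: Int): ThreadPoolExecutor =
    new ThreadPoolExecutor(threads, threads, 0L, TimeUnit.MILLISECONDS,
        new LinkedBlockingQueue[Runnable]) {
      override def beforeExecute(t: Thread, r: Runnable): Unit = {
        // beforeExecute runs on the worker thread, so this clears the
        // worker's own (inherited) copy of the property.
        localProps.get().remove("spark.sql.execution.id")
        super.beforeExecute(t, r)
      }
    }
}
```

Since beforeExecute is invoked on the worker thread itself, removing the key there only touches the worker's inherited clone, not the submitting thread's properties.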

On Fri, Sep 30, 2016 at 7:08 AM, Grant Digby <dig...@gmail.com> wrote:

> Thanks for the link. Yeah, if there's no need to copy execution.id from
> parent to child, then I agree: you could strip it out, presumably in this
> part of the code, using some kind of configuration for which properties
> shouldn't go across to child threads
>
> SparkContext:
>
>   protected[spark] val localProperties = new InheritableThreadLocal[Properties] {
>     override protected def childValue(parent: Properties): Properties = {
>       // Note: make a clone such that changes in the parent properties aren't
>       // reflected in those of the children threads, which has confusing
>       // semantics (SPARK-10563).
>       SerializationUtils.clone(parent).asInstanceOf[Properties]
>     }
>     override protected def initialValue(): Properties = new Properties()
>   }
>
>
>
> --
> View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/IllegalArgumentException-spark-sql-execution-id-is-already-set-tp19124p19190.html
> Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
>
>
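The configurable filtering the quoted message hints at could be sketched like this (hedged: `nonInheritableKeys` and the cloning helper are hypothetical names, not anything Spark actually exposes; Spark's real code uses SerializationUtils.clone):

```scala
import java.util.Properties

object FilteredLocalProperties {
  // Hypothetical configuration: keys that should not cross to child threads.
  val nonInheritableKeys: Set[String] = Set("spark.sql.execution.id")

  val localProperties = new InheritableThreadLocal[Properties] {
    override protected def childValue(parent: Properties): Properties = {
      // Clone so mutations in the parent aren't visible to the child
      // (same motivation as SPARK-10563), then drop the filtered keys.
      val cloned = new Properties()
      cloned.putAll(parent)
      nonInheritableKeys.foreach(cloned.remove)
      cloned
    }
    override protected def initialValue(): Properties = new Properties()
  }
}
```

Filtering in childValue catches the property at thread-creation time, whereas the beforeExecute approach catches it per task; for a thread pool that reuses workers, the hook approach is the more reliable of the two.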

