To: Michael Armbrust
Cc: Haopu Wang; dev@spark.apache.org
Subject: Re: HiveContext cannot be serialized
I submitted a patch
https://github.com/apache/spark/pull/4628
On Mon, Feb 16, 2015 at 10:59 AM, Michael Armbrust wrote:
I was suggesting you mark the variable that is holding the HiveContext as
'@transient', since the Scala compiler is not correctly propagating this
through the tuple extraction. This is only a workaround. We can also
remove the tuple extraction.
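
A minimal sketch of the pattern under discussion (hypothetical class names,
not the actual Spark code): with a destructuring binding, the compiler keeps
the matched tuple in a synthetic field and derives the two vals from it, so
the annotation may not reach the field that actually holds the
non-serializable object.

class Resource {                        // stands in for HiveContext; not Serializable
  def lookup(key: String): String = key.toUpperCase
}

class Job extends Serializable {
  // Destructuring binding: the whole tuple lives in a synthetic field that
  // @transient may not reach, so the Resource can still be pulled in when a
  // Job instance is serialized (typically surfacing as NotSerializableException).
  @transient private val (resource, name) = (new Resource, "job-1")

  def run(): Unit = println(resource.lookup(name))
}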
On Mon, Feb 16, 2015 at 10:47 AM, Reynold Xin wrote:
Michael - it is already transient. This should probably be considered a bug in
the Scala compiler, but we can easily work around it by removing the use of
destructuring binding.
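
A minimal sketch of that workaround (hypothetical names, not the actual Spark
patch): assign the two values separately so @transient lands directly on the
field holding the non-serializable object.

class Resource { def lookup(key: String): String = key.toUpperCase }  // stands in for HiveContext

class JobFixed extends Serializable {
  // Marked transient, so it is skipped during serialization; after
  // deserialization on an executor it is null, which is fine because it is
  // only meant to be used on the driver anyway.
  @transient private val resource = new Resource
  private val name = "job-1"              // ordinary serializable field

  def run(): Unit = println(resource.lookup(name))
}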
On Mon, Feb 16, 2015 at 10:41 AM, Michael Armbrust wrote:
I'd suggest marking the HiveContext as @transient since it's not valid to
use it on the slaves anyway.
On Mon, Feb 16, 2015 at 4:27 AM, Haopu Wang wrote:
> While investigating this issue (described at the end of this email), I took a
> look at HiveContext's code and found this change
> (https://github.co