Yes, it works fine if I switch to using the implicits on the SparkSession
(which is a val).
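
For reference, a minimal sketch of the two patterns (the session and dataset
names here are just illustrative):

    // Works: `spark` is a val, so spark.implicits is a stable path.
    val spark = org.apache.spark.sql.SparkSession.builder()
      .master("local[*]")
      .appName("implicits-example")
      .getOrCreate()
    import spark.implicits._

    val someDataset = Seq(1, 2, 3).toDS()

    // With sqlContext now being a def, this no longer compiles:
    //   stable identifier required, but someDataset.sqlContext.implicits found.
    // import someDataset.sqlContext.implicits._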

But do we want to break the old way of doing the import?

On Tue, May 3, 2016 at 12:56 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> Have you tried the following ?
>
> scala> import spark.implicits._
> import spark.implicits._
>
> scala> spark
> res0: org.apache.spark.sql.SparkSession =
> org.apache.spark.sql.SparkSession@323d1fa2
>
> Cheers
>
> On Tue, May 3, 2016 at 9:16 AM, Koert Kuipers <ko...@tresata.com> wrote:
>
>> With the introduction of SparkSession, SQLContext changed from being a
>> lazy val to a def.
>> However, this is troublesome if you want to do:
>>
>> import someDataset.sqlContext.implicits._
>>
>> because it is no longer a stable identifier, I think? I get:
>> stable identifier required, but someDataset.sqlContext.implicits found.
>>
>> Has anyone else seen this?
>>
>>
>
