Probably not. Want to submit a pull request?
On Tuesday, May 3, 2016, Koert Kuipers wrote:
> yes it works fine if i switch to using the implicits on the SparkSession
> (which is a val)
>
> but do we want to break the old way of doing the import?
sure i can do that
On Tue, May 3, 2016 at 1:21 PM, Reynold Xin wrote:
> Probably not. Want to submit a pull request?
>
>
> On Tuesday, May 3, 2016, Koert Kuipers wrote:
>
>> yes it works fine if i switch to using the implicits on the SparkSession
>> (which is a val)
>>
>> but do we want to break the old way of doing the import?
yes it works fine if i switch to using the implicits on the SparkSession
(which is a val)
but do we want to break the old way of doing the import?
On Tue, May 3, 2016 at 12:56 PM, Ted Yu wrote:
> Have you tried the following ?
>
> scala> import spark.implicits._
> import spark.implicits._
Have you tried the following ?
scala> import spark.implicits._
import spark.implicits._
scala> spark
res0: org.apache.spark.sql.SparkSession =
org.apache.spark.sql.SparkSession@323d1fa2
Cheers
On Tue, May 3, 2016 at 9:16 AM, Koert Kuipers wrote:
with the introduction of SparkSession, SQLContext changed from being a lazy
val to a def.
however this is troublesome if you want to do:
import someDataset.sqlContext.implicits._
because it is no longer a stable identifier, i think? i get:
stable identifier required, but someDataset.sqlContext.implicits found
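
The language rule behind this can be reproduced without Spark at all. A minimal
sketch with made-up names (not Spark's API): the prefix of an import must be a
stable identifier, which a val is and a def is not.

```scala
object StableIdDemo {
  class Implicits {
    implicit val greeting: String = "hello"
  }

  class Holder {
    // A def is NOT a stable identifier; `import h.viaDef.greeting`
    // would fail to compile with a "stable identifier required" error.
    def viaDef: Implicits = new Implicits
    // A val IS a stable identifier, so importing through it works.
    val viaVal: Implicits = new Implicits
  }

  def resolved: String = {
    val h = new Holder
    import h.viaVal.greeting // fine: h and viaVal are both stable
    implicitly[String]       // resolves to the imported implicit val
  }

  def main(args: Array[String]): Unit =
    println(resolved)
}
```

This mirrors why `import spark.implicits._` works (spark is a val) while
`import someDataset.sqlContext.implicits._` stopped compiling once sqlContext
became a def.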