>
> SQLContext is exposed through `sql_ctx`
> (
> https://github.com/apache/spark/blob/8bfaa62f2fcc942dd99a63b20366167277bce2a1/python/pyspark/sql/dataframe.py#L80
> )
>
> On 3/17/20 5:53 PM, Ben Roling wrote:
> > I tried this on the users mailing list but didn't get traction. It's
> > probably more appropriate here anyway.
> >
> > I've noticed that DataSet.sqlContext is public in Scala but the
> > equivalent (DataFrame._sc) in PySpark is named as if it should be
> > treated as private.
> >
> > Is this intentional? If so, what's th