The old one is deprecated, but it should still work.
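
For anyone hitting this, here is a minimal sketch of the replacement Arun points to below (untested against the preview build, so treat the exact builder calls as an assumption):

    import org.apache.spark.sql.SparkSession

    // Replaces `new org.apache.spark.sql.hive.HiveContext(sc)`
    val spark = SparkSession.builder()
      .appName("HiveExample")   // example name only
      .enableHiveSupport()      // needs the Hive classes on the classpath
      .getOrCreate()

    // Rough equivalent of hiveContext.table(...)
    val df = spark.table("db.table")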

On Thu, May 19, 2016 at 3:51 PM, Arun Allamsetty <arun.allamse...@gmail.com>
wrote:

> Hi Doug,
>
> If you look at the API docs here:
> http://home.apache.org/~pwendell/spark-releases/spark-2.0.0-preview-docs/api/scala/index.html#org.apache.spark.sql.hive.HiveContext,
> you'll see
> Deprecated (Since version 2.0.0): Use
> SparkSession.builder.enableHiveSupport instead.
> So you probably need to use that.
>
> Arun
>
> On Thu, May 19, 2016 at 3:44 PM, Michael Armbrust <mich...@databricks.com>
> wrote:
>
>>> 1. “val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)”
>>> doesn’t work because “HiveContext not a member of
>>> org.apache.spark.sql.hive”. I checked the documentation, and it looks like
>>> it should still work for spark-2.0.0-preview-bin-hadoop2.7.tgz.
>>>
>>
>> HiveContext has been deprecated and moved to a 1.x compatibility package,
>> which you'll need to include explicitly.  Docs have not been updated yet.
>>
>>
>>> 2. I also tried the new SparkSession, ‘spark.table(“db.table”)’; it
>>> fails with an HDFS permission-denied error: it can’t write to “/user/hive/warehouse”.
>>>
>>
>> Where are the HDFS configurations located?  We might not be propagating
>> them correctly any more.
>>
>
>
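
Regarding the permission error on “/user/hive/warehouse” quoted above: in the 2.0 line the warehouse location is governed by the spark.sql.warehouse.dir property, so pointing it at a directory the job can write to may work around the failure. I haven't verified this against the preview build, so take the property name and the path below as assumptions:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .enableHiveSupport()
      // hypothetical writable location; substitute a path your user owns
      .config("spark.sql.warehouse.dir", "/tmp/spark-warehouse")
      .getOrCreate()

    spark.table("db.table").show()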
