Some more info; I'm still digging.

I'm just trying to do `spark.table("db.table").count` from a spark-shell.
"db.table" is just a Hive table.

At commit b67668b this worked just fine and returned the number of rows in
db.table. Starting at commit ca99171 ("[SPARK-15073][SQL] Hide SparkSession
constructor...") it no longer works. The old one is deprecated, but it
should still work.
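For reference, this is the whole repro, a minimal sketch run in the
2.0.0-preview spark-shell (I'm assuming the shell's pre-built `spark`
session; "db.table" stands in for our real table):

    // Inside spark-shell from the 2.0.0-preview build.
    // `spark` is the SparkSession the shell creates on startup;
    // "db.table" is a placeholder for any existing Hive table.
    val n = spark.table("db.table").count()
    println(n)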
On Thu, May 19, 2016 at 3:51 PM, Arun Allamsetty wrote:
> Hi Doug,
>
> If you look at the API docs here:
> http://home.apache.org/~pwendell/spark-releases/spark-2.0.0-preview-docs/api/scala/index.html#org.apache.spark.sql.hive.HiveContext,
> ...
Hi Doug,

HiveContext has been deprecated. If you look at the API docs here:
http://home.apache.org/~pwendell/spark-releases/spark-2.0.0-preview-docs/api/scala/index.html#org.apache.spark.sql.hive.HiveContext,
you'll see "Deprecated (Since version 2.0.0): Use
SparkSession.builder.enableHiveSupport instead". So you probably need to
use that.
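Something like the following (an untested sketch; the app name is
arbitrary, and in spark-shell you can skip the builder entirely because a
session is already created for you as `spark`):

    import org.apache.spark.sql.SparkSession

    // Build a Hive-enabled session; this replaces `new HiveContext(sc)`.
    val spark = SparkSession.builder()
      .appName("hive-example")   // arbitrary name for this sketch
      .enableHiveSupport()       // enables Hive support, incl. the metastore
      .getOrCreate()

    // Table reads that used to go through HiveContext work on the session:
    println(spark.table("db.table").count())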
>
> 1. "val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)"
> doesn't work because "HiveContext not a member of
> org.apache.spark.sql.hive". I checked the documentation, and it looks like
> it should still work for spark-2.0.0-preview-bin-hadoop2.7.tgz.
>
I haven't had time to really look into this problem, but I want to mention it.
I downloaded
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-preview-bin/spark-2.0.0-preview-bin-hadoop2.7.tgz
and tried to run it against our secure Hadoop cluster and access a Hive table.

1. "val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)" doesn't
work because "HiveContext not a member of org.apache.spark.sql.hive". I
checked the documentation, and it looks like it should still work for
spark-2.0.0-preview-bin-hadoop2.7.tgz.
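For completeness, this is the pre-2.0 pattern I was expecting to keep
working (a minimal sketch, run from spark-shell where `sc` is the shell's
SparkContext):

    // Pre-2.0 style: wrap the shell's SparkContext in a HiveContext.
    // On the 2.0.0-preview binary this fails to compile with
    // "HiveContext not a member of org.apache.spark.sql.hive".
    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
    println(sqlContext.table("db.table").count())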