Some more info; I'm still digging.
I'm just trying to run `spark.table("db.table").count` from a spark-shell.
"db.table" is just a Hive table.
At commit b67668b this worked just fine and it returned the number of rows in
db.table.
Starting at commit ca99171 ("[SPARK-15073][SQL] Hide SparkSession constructor"), the count no longer works.
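For anyone who wants to try it, the repro is basically just this in spark-shell, against a build with Hive support ("db.table" is a placeholder for any existing Hive table):

  // works at b67668b, breaks starting at ca99171
  val df = spark.table("db.table")
  println(df.count)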
I haven’t had time to really look into this problem, but I want to mention it.
I downloaded
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-preview-bin/spark-2.0.0-preview-bin-hadoop2.7.tgz
and tried to run it against our Secure Hadoop cluster and access a Hive table.
1. “val sqlCo
r ( which is in the
> Hadoop cluster). The job submission actually failed on the client side.
>
> Currently we get around this by replacing Spark's hive-exec with the Apache
> hive-exec.
>
Why are you using the Spark YARN Client.scala directly instead of the
SparkLauncher that Spark provides?
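In other words, something along these lines instead of calling Client directly (the jar path, main class, and memory setting below are just placeholders):

  import org.apache.spark.launcher.SparkLauncher

  // Launch the application through the public launcher API; placeholder values only.
  val process = new SparkLauncher()
    .setAppResource("/path/to/your-app.jar")
    .setMainClass("com.example.YourApp")
    .setMaster("yarn-cluster")
    .setConf(SparkLauncher.EXECUTOR_MEMORY, "2g")
    .launch()

  process.waitFor()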
+1 (non-binding)
Tested on secure YARN cluster with HIVE.
Notes: SPARK-10422, SPARK-10737 were causing us problems with 1.5.0. We see
1.5.1 as a big improvement.
Cheers,
Doug
> On Sep 24, 2015, at 3:27 AM, Reynold Xin wrote:
>
> Please vote on releasing the following candidate as Apache Spark version 1.5.1.
It works for me in cluster mode.
I’m running on Hortonworks 2.2.4.12 in secure mode with Hive 0.14
I built with
./make-distribution.sh --tgz -Phive -Phive-thriftserver -Phbase-provided -Pyarn
-Phadoop-2.6
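For what it's worth, we submit in cluster mode against the secure cluster with something roughly like this (the principal, keytab, class, and jar below are placeholders):

  ./bin/spark-submit --master yarn-cluster \
    --principal doug@EXAMPLE.COM \
    --keytab /path/to/doug.keytab \
    --class com.example.HiveCount \
    hive-count.jar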
Doug
> On Aug 25, 2015, at 4:56 PM, Tom Graves wrote:
>
> Anyone using HiveContext with
> This adaptation layer allows Spark
> SQL to connect to arbitrary Hive versions greater than or equal to 0.12.0 (or
> maybe 0.13.1, not decided yet).
>
> However, it's not a promise yet, since this requires major refactoring of the
> current Spark SQL Hive support.
>
> Chen
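For reference, the sort of configuration such a layer implies would presumably look something like the sketch below when starting a shell; the property names and values here are illustrative, not a committed interface:

  ./bin/spark-shell \
    --conf spark.sql.hive.metastore.version=0.13.1 \
    --conf spark.sql.hive.metastore.jars=maven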
Hi,
I'm just wondering if anybody is working on supporting Hive 0.14 in secure
mode on Hadoop 2.6.0?
I see one Jira referring to it,
https://issues.apache.org/jira/browse/SPARK-5111
but it mentions no effort to move to 0.14.
Thanks,
Doug