Thanks Andrew. What version of HS2 is the SparkSQL Thrift Server using?
What would be involved in updating it? Is it simply a case of bumping the
dependency version in one of the project POMs?
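For example, would bumping something like this in the parent pom.xml be
enough? (I'm guessing at the property name and version here, I haven't
looked at the actual build files.)

    <properties>
      <!-- hypothetical property name; the real one may differ -->
      <hive.version>0.14.0</hive.version>
    </properties>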

Cheers,
~N

On Sat, May 2, 2015 at 11:38 AM, Andrew Lee <alee...@hotmail.com> wrote:

> Hi N,
>
> See: https://issues.apache.org/jira/browse/SPARK-5159
>
> I don't think it is supported yet, and it won't be until the HS2 code base
> in the Spark hive-thriftserver project is updated.
>
> ------------------------------
> Date: Fri, 1 May 2015 15:56:30 +1000
> Subject: Spark SQL ThriftServer Impersonation Support
> From: nightwolf...@gmail.com
> To: user@spark.apache.org
>
>
> Hi guys,
>
>
> I'm trying to use the SparkSQL Thrift Server with the Hive metastore.
> Metastore impersonation works fine when running Hive tasks. However, when I
> spin up the SparkSQL Thrift Server, impersonation doesn't seem to work...
>
> What settings do I need to enable impersonation?
>
> I've copied the same config as in my hive-site.xml. These are the relevant
> impersonation flags from my launch command for the Spark Thrift Server:
>
> --hiveconf hive.server2.enable.impersonation=true
> --hiveconf hive.server2.enable.doAs=true
> --hiveconf hive.metastore.execute.setugi=true
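>
> For reference, the corresponding entries in my hive-site.xml look like this
> (as far as I can tell, hive.server2.enable.doAs is the standard HS2
> property; the others are listed only to mirror the flags above):
>
>   <!-- doAs is the usual HiveServer2 impersonation switch -->
>   <property>
>     <name>hive.server2.enable.doAs</name>
>     <value>true</value>
>   </property>
>   <property>
>     <name>hive.server2.enable.impersonation</name>
>     <value>true</value>
>   </property>
>   <property>
>     <name>hive.metastore.execute.setugi</name>
>     <value>true</value>
>   </property>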
>
> Here is my full run script:
>
> export HIVE_SERVER2_THRIFT_BIND_HOST=0.0.0.0
> export HIVE_SERVER2_THRIFT_PORT=10000
>
> export HIVE_CONF_DIR=/opt/mapr/hive/hive-0.13/conf/
> export HIVE_HOME=/opt/mapr/hive/hive-0.13/
> export HADOOP_HOME=/opt/mapr/hadoop/hadoop-2.5.1/
> export HADOOP_CONF_DIR=/opt/mapr/hadoop/hadoop-2.5.1/etc/hadoop
>
> export EXECUTOR_MEMORY=30g
> export DRIVER_MEMORY=4g
> export EXECUTOR_CORES=15
> export NUM_EXECUTORS=20
> export KRYO_BUFFER=512
> export SPARK_DRIVER_MAXRESULTSIZE=4096
>
> export HIVE_METASTORE_URIS=thrift://localhost:9083
> export HIVE_METASTORE_WAREHOUSE_DIR=/user/hive/warehouse
>
> export SPARK_DIST_CLASSPATH=/opt/mapr/lib/*:/opt/mapr/hadoop/hadoop-2.5.1/share/hadoop/yarn/*:/opt/mapr/hadoop/hadoop-2.5.1/share/hadoop/common/lib/*:/opt/mapr/hive/hive-current/lib/*
> export SPARK_LOG_DIR=/tmp/spark-log
> export SPARK_HISTORY_OPTS=-Dspark.history.fs.logDirectory=hdfs:///log/spark-events
> export SPARK_CONF_DIR=/apps/spark/global-conf/
>
> export SPARK_HOME=/apps/spark/spark-1.3.1-bin-mapr4.0.2_yarn_j6_2.10
>
> export SPARK_LIBRARY_PATH=/opt/mapr/lib/*
> export SPARK_JAVA_OPTS=-Djava.library.path=/opt/mapr/lib
>
>
> $SPARK_HOME/sbin/start-thriftserver.sh --master yarn-client \
>   --jars /opt/mapr/lib/libjpam.so \
>   --executor-memory $EXECUTOR_MEMORY \
>   --driver-memory $DRIVER_MEMORY \
>   --executor-cores $EXECUTOR_CORES \
>   --num-executors $NUM_EXECUTORS \
>   --conf spark.scheduler.mode=FAIR \
>   --conf spark.kryoserializer.buffer.mb=$KRYO_BUFFER \
>   --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
>   --conf spark.files.useFetchCache=false \
>   --conf spark.driver.maxResultSize=$SPARK_DRIVER_MAXRESULTSIZE \
>   --hiveconf hive.metastore.uris=$HIVE_METASTORE_URIS \
>   --hiveconf hive.metastore.warehouse.dir=$HIVE_METASTORE_WAREHOUSE_DIR \
>   --hiveconf hive.server2.enable.impersonation=true \
>   --hiveconf hive.server2.enable.doAs=true \
>   --hiveconf hive.metastore.execute.setugi=true
>
>
> Cheers,
> N
>
