[
https://issues.apache.org/jira/browse/HIVE-8795?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14208465#comment-14208465
]
Xuefu Zhang commented on HIVE-8795:
-----------------------------------
Hi [~szehon], You're right. As stated in the description, SPARK_HOME needs to
be set so that the scripts can be found. I think it's okay for the build
machine to have the scripts and/or the env variable, or even a full Spark
installation, as long as we can make this configurable in such a way that on
the build machine spark.master=local-cluster. The default can still be
spark.master=local, so that devs are not required to have such settings in
their environments for unit tests, unless one wants to test with local-cluster
specifically. I assume this can be achieved.
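For illustration, the build-machine override could be a property along these
lines (the property name spark.master is what Hive on Spark reads, but the
exact worker/core/memory values are hypothetical and would be tuned for the
precommit environment):

```xml
<!-- hive-site.xml on the build machine only: point Hive on Spark at a
     local-cluster master (here: 2 workers, 2 cores each, 1024 MB per
     worker). Developer machines would omit this property and fall back
     to the default spark.master=local. -->
<property>
  <name>spark.master</name>
  <value>local-cluster[2,2,1024]</value>
</property>
```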
> Switch precommit test from local to local-cluster [Spark Branch]
> ----------------------------------------------------------------
>
> Key: HIVE-8795
> URL: https://issues.apache.org/jira/browse/HIVE-8795
> Project: Hive
> Issue Type: Sub-task
> Components: Spark
> Reporter: Xuefu Zhang
> Assignee: Szehon Ho
>
> It seems unlikely that the Spark community will provide an MRMiniCluster
> equivalent (SPARK-3691); Spark local-cluster was the recommendation. The
> latest research shows that Spark local-cluster works with Hive. Therefore,
> for now, we use Spark local-cluster (instead of the current local) for our
> precommit tests.
> It was previously believed (HIVE-7382) that a Spark installation is required
> and that the SPARK_HOME env variable needs to be set. Since Hive pulls in
> Spark's assembly jar, it's now believed that we only need a few scripts from
> the Spark installation instead.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)