[ https://issues.apache.org/jira/browse/HIVE-15302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15707353#comment-15707353 ]
Rui Li commented on HIVE-15302:
-------------------------------

Yeah, my plan is to put the jars on HDFS. For example, if the user doesn't specify spark.yarn.archive or spark.yarn.jars, we can find the needed jars in spark.home and upload them to HDFS, under our session's tmp dir. I'm actually not very clear on the difference between spark.yarn.archive and spark.yarn.jars. In my test I just put all the jars in a folder on HDFS, pointed spark.yarn.archive to that folder, and it worked. I guess spark.yarn.jars is used in a similar way.

> Relax the requirement that HoS needs Spark built w/o Hive
> ----------------------------------------------------------
>
>          Key: HIVE-15302
>          URL: https://issues.apache.org/jira/browse/HIVE-15302
>      Project: Hive
>   Issue Type: Improvement
>     Reporter: Rui Li
>     Assignee: Rui Li
>
> This requirement becomes more and more unacceptable as SparkSQL becomes
> widely adopted. Let's use this JIRA to find out how we can relax the
> limitation.
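For illustration, here is a minimal sketch of the approach described in the comment above: upload the jars shipped under spark.home to HDFS, then point spark.yarn.jars (or spark.yarn.archive) at them. The HDFS paths, the archive name, and the Spark 2.x $SPARK_HOME/jars layout are assumptions for the example, not something taken from this issue.

    # upload the Spark jars to a directory on HDFS (path is hypothetical)
    hdfs dfs -mkdir -p /tmp/spark-jars
    hdfs dfs -put $SPARK_HOME/jars/*.jar /tmp/spark-jars/

    # option 1: point spark.yarn.jars at the uploaded jars (globs are allowed)
    set spark.yarn.jars=hdfs:///tmp/spark-jars/*.jar;

    # option 2: bundle the jars into a single archive and use spark.yarn.archive
    zip -j spark-libs.zip $SPARK_HOME/jars/*.jar
    hdfs dfs -put spark-libs.zip /tmp/
    set spark.yarn.archive=hdfs:///tmp/spark-libs.zip;

Either property could also be set once in spark-defaults.conf before the Spark session starts rather than per Hive session; when neither is set, the idea in the comment is that Hive would perform the upload automatically under the session's tmp dir.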