Yeah, in that case you cannot really cache anything through Hive on Spark.
Could you give more detail on what you want to achieve?

When needed, Hive on Spark uses MEMORY_AND_DISK as the storage level.
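
For reference, here is a minimal Scala sketch of that persist call using
Spark's RDD API (the app name and input path are illustrative, not taken
from this thread):

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.storage.StorageLevel

  object PersistSketch {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("persist-sketch"))
      // persist(newLevel: StorageLevel) is the programmatic API mentioned
      // below; there is no properties-file equivalent for picking a level.
      val rdd = sc.textFile("hdfs:///tmp/input")
        .persist(StorageLevel.MEMORY_AND_DISK)
      println(rdd.count())
      sc.stop()
    }
  }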

On Fri, Oct 23, 2015 at 4:29 AM, Jone Zhang <joyoungzh...@gmail.com> wrote:

> 1. But there is no way to set the storage level through a properties file
> in Spark; Spark only provides the "def persist(newLevel: StorageLevel)"
> API...
>
> 2015-10-23 19:03 GMT+08:00 Xuefu Zhang <xzh...@cloudera.com>:
>
>> Quick answers:
>> 1. You can set pretty much any Spark configuration from Hive using the
>> set command, for example as sketched below.
>> 2. No, you have to make that call yourself.
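>>
>> For example (the property values here are illustrative):
>>
>>   set hive.execution.engine=spark;
>>   set spark.executor.memory=4g;
>>   set spark.executor.cores=2;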
>>
>>
>>
>> On Thu, Oct 22, 2015 at 10:32 PM, Jone Zhang <joyoungzh...@gmail.com>
>> wrote:
>>
>>> 1. How can I set the storage level when I use Hive on Spark?
>>> 2. Does Hive have any plans to dynamically choose between Hive on
>>> MapReduce and Hive on Spark, based on SQL features?
>>>
>>> Thanks in advance
>>> Best regards
>>>
>>
>>
>
