On Fri, Oct 23, 2015 at 9:52 AM, JoneZhang wrote:

> 1. Will Spark use disk when memory is not enough at the MEMORY_ONLY
> storage level?
> 2. If not, how can I set the storage level when I use Hive on Spark?
> 3. Does Spark have any plan to choose dynamically between Hive on
> MapReduce and Hive on Spark, based on SQL features?
>
> Thanks in advance
> Best
No; with MEMORY_ONLY, partitions that do not fit in memory are recomputed
when needed rather than spilled. You can set the storage level to
MEMORY_AND_DISK; in that case, data falls back to disk when it exceeds
the available memory.
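For reference, a minimal sketch in the Scala API (the app name and input
path below are just placeholders, and this shows plain Spark rather than a
Hive on Spark setting):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object MemoryAndDiskDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("memory-and-disk-demo"))

    // With MEMORY_AND_DISK, partitions that do not fit in memory are
    // written to local disk and read back later, instead of being
    // recomputed as they would be under MEMORY_ONLY.
    val lines = sc.textFile("hdfs:///tmp/input")  // hypothetical path
    lines.persist(StorageLevel.MEMORY_AND_DISK)

    println(lines.count())  // first action materializes and caches the RDD
    sc.stop()
  }
}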
Thanks
Best Regards