---------- Forwarded message ---------
From: Hao Ren
Date: Thu, Aug 8, 2019 at 4:15 PM
Subject: Re: Spark SQL reads all leaf directories on a partitioned Hive table
To: Gourav Sengupta


Hi Gourav,

I am using enableHiveSupport.
The table was not created by Spark.
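
For what it's worth, one way to double-check how a table was created is to
inspect its metadata in the metastore. A minimal sketch, assuming a
Hive-enabled session and a placeholder table name:

import org.apache.spark.sql.SparkSession

// Sketch only: build a Hive-enabled session and look at the table metadata.
// "my_db.my_table" is a placeholder, not the actual table from this thread.
val spark = SparkSession.builder()
  .enableHiveSupport()
  .getOrCreate()

// Fields such as "Provider" / "Created By" in the output hint at whether the
// table was created through Hive or through Spark.
spark.sql("DESCRIBE FORMATTED my_db.my_table").show(100, truncate = false)
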
Hi,
Just out of curiosity, did you start the SPARK session using
enableHiveSupport()?
Or are you creating the table using SPARK?
Regards,
Gourav
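
For reference, a rough sketch of the two options asked about above; the app
name, input path, and table name are placeholders:

import org.apache.spark.sql.SparkSession

// Option 1: start the session with Hive support so that Spark resolves the
// table (and its partitions) through the Hive metastore.
val spark = SparkSession.builder()
  .appName("hive-read")                        // placeholder app name
  .enableHiveSupport()
  .getOrCreate()

// Option 2: create the table from Spark itself, partitioned the same way.
val df = spark.read.parquet("/path/to/data")   // placeholder input path
df.write
  .partitionBy("day", "hour", "platform")
  .saveAsTable("my_db.my_table")               // placeholder table name
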
On Wed, Aug 7, 2019 at 3:28 PM Hao Ren wrote:
> Hi,
> I am using Spark SQL 2.3.3 to read a Hive table which is partitioned by
> day, hour, platform
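
To make the setup concrete, a minimal sketch of this kind of read, with a
placeholder table name and partition values; filtering on the partition
columns (day, hour, platform) is what should let Spark prune partitions
instead of listing every leaf directory:

import org.apache.spark.sql.SparkSession

// Sketch only: Hive-enabled session reading a single (day, hour, platform)
// slice of the partitioned table. Names and literal values are placeholders.
val spark = SparkSession.builder()
  .enableHiveSupport()
  .getOrCreate()

val slice = spark.sql(
  """SELECT *
    |FROM my_db.my_table
    |WHERE day = '2019-08-01' AND hour = '12' AND platform = 'ios'
    |""".stripMargin)

slice.show()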