I don't believe so. Spark's partitions correspond to the number of input splits of the underlying files, not to the Hive table's partitioning scheme.
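As a rough sketch of what "number of splits" means (the helper and the 128 MB split size here are illustrative assumptions, not Spark API calls): each file contributes roughly ceil(file_size / split_size) splits, and each split becomes one partition, regardless of how many date partitions Hive has.

```python
import math

# Hypothetical illustration: Spark's partition count is driven by input
# splits, roughly one partition per split, i.e. ceil(size / split_size)
# per data file. This is a simplification, not the actual Spark logic.
def estimated_partitions(file_sizes_bytes, split_size_bytes=128 * 1024 * 1024):
    """Estimate partition count from file sizes (assumed 128 MB splits)."""
    return sum(max(1, math.ceil(size / split_size_bytes))
               for size in file_sizes_bytes)

# Three data files living under different Hive date partitions:
files = [300 * 1024 * 1024, 50 * 1024 * 1024, 200 * 1024 * 1024]
print(estimated_partitions(files))  # 3 + 1 + 2 = 6 splits -> 6 partitions
```

So a table with, say, 30 date partitions could still produce 6 (or 600) DataFrame partitions, depending on file sizes and split size.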
On 23 May 2015 00:02, "Cesar Flores" <ces...@gmail.com> wrote:

>
> I have a table in a Hive database partitioned by date. I notice that when
> I query this table using HiveContext, the resulting data frame has a specific
> number of partitions.
>
>
> Does this partitioning correspond to my original table's partitioning in Hive?
>
>
> Thanks
> --
> Cesar Flores
>
