I think it will be the same, but let me try that.
FYR - https://issues.apache.org/jira/browse/SPARK-19881
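For reference, a minimal sketch of the two suggested ways of setting a session-level conf (assuming a `SparkSession` named `spark` built with Hive support, as in spark-shell); both write to the same session `SQLConf`, though the linked JIRA tracks cases in 2.0.x where dynamic-partition settings applied via `SET` were not honored:

```scala
import org.apache.spark.sql.SparkSession

// Assumption: running against a Hive-enabled session, e.g. spark-shell.
val spark = SparkSession.builder()
  .appName("conf-set-sketch")
  .enableHiveSupport()
  .getOrCreate()

// Route 1: the SQL SET command (ayan's suggestion).
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")

// Route 2: the programmatic runtime conf (Jörn's suggestion).
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")
```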
On Fri, Jul 28, 2017 at 4:44 PM, ayan guha wrote:
> Try running spark.sql("set yourconf=val")
>
> On Fri, 28 Jul 2017 at 8:51 pm, Chetan Khatri wrote:
>
>> Jörn, both are the same.
>>
On Fri, Jul 28, 2017 at 4:18 PM, Jörn Franke wrote:
Try sparksession.conf().set
> On 28. Jul 2017, at 12:19, Chetan Khatri wrote:
>
> Hey Dev / User,
>
> I am working with Spark 2.0.1 and dynamic partitioning with Hive, and I am
> facing the issue below:
>
> org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 1344
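This HiveException typically means the insert produced more dynamic partitions (1344 here) than Hive's cap, `hive.exec.max.dynamic.partitions` (1000 by default). A hedged sketch of raising the caps before the insert; the table and column names are hypothetical, and whether `SET` takes effect on 2.0.1 is exactly what the SPARK-19881 discussion above is about:

```scala
import org.apache.spark.sql.SparkSession

// Assumption: a Hive-enabled session; target_tbl/source_tbl/dt are made up.
val spark = SparkSession.builder()
  .appName("dyn-partition-insert")
  .enableHiveSupport()
  .getOrCreate()

spark.sql("SET hive.exec.dynamic.partition=true")
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
// Raise the caps above the 1344 partitions this job creates
// (defaults: 1000 total, 100 per node).
spark.sql("SET hive.exec.max.dynamic.partitions=2000")
spark.sql("SET hive.exec.max.dynamic.partitions.pernode=2000")

spark.sql(
  """INSERT OVERWRITE TABLE target_tbl PARTITION (dt)
    |SELECT col1, col2, dt FROM source_tbl""".stripMargin)
```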