I asked a similar question here:

http://stackoverflow.com/questions/40701518/spark-2-0-redefining-sparksession-params-through-getorcreate-and-not-seeing-cha

Please see the answer there, which basically states that it's impossible to
change a SparkSession's config once the session has been initiated.
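
To illustrate (a minimal sketch, assuming the default session the PySpark
shell creates; the config key here is just an arbitrary example):

    from pyspark.sql import SparkSession

    # The PySpark shell has already created a session for us.
    spark = SparkSession.builder.getOrCreate()

    # Try to "change" a SparkContext-level setting through the builder.
    spark2 = SparkSession.builder \
        .config("spark.executor.memory", "4g") \
        .getOrCreate()

    print(spark is spark2)  # True: the existing session is returned
    print(spark2.sparkContext.getConf().get("spark.executor.memory", "<unset>"))
    # Still the old value: the running SparkContext keeps its original config.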

On Mon, Dec 19, 2016 at 9:01 PM, Venkata Naidu <naidu1...@gmail.com> wrote:

> We can create a link in the Spark conf directory pointing to the hive.conf
> file of the Hive installation, I believe.
>
> Thanks,
> Venkat.
>
> On Mon, Dec 19, 2016, 10:58 AM apu <apumishra...@gmail.com> wrote:
>
>> This is for Spark 2.0:
>>
>> If I wanted Hive support on a new SparkSession, I would build it with:
>>
>> spark = SparkSession \
>>     .builder \
>>     .enableHiveSupport() \
>>     .getOrCreate()
>>
>> However, PySpark already creates a SparkSession for me, which appears to
>> lack Hive support. How can I either:
>>
>> (a) Add Hive support to an existing SparkSession,
>>
>> or
>>
>> (b) Configure PySpark so that the SparkSession it creates at startup has
>> Hive support enabled?
>>
>> Thanks!
>>
>> Apu
>>
>
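
Since the config cannot be changed in place, one workaround for (a)/(b) is
to stop the session the shell created and build a replacement with Hive
support (a hedged sketch; as far as I can tell, enableHiveSupport() just
sets spark.sql.catalogImplementation to "hive", which has to be in place
when the session is constructed):

    from pyspark.sql import SparkSession

    # Grab the session the shell created (or create one), then stop it.
    spark = SparkSession.builder.getOrCreate()
    spark.stop()

    # Build a replacement with Hive support enabled.
    spark = SparkSession \
        .builder \
        .enableHiveSupport() \
        .getOrCreate()

    # Sanity check: "hive" means the Hive catalog is in use.
    print(spark.conf.get("spark.sql.catalogImplementation"))

Alternatively, starting the shell with that setting already applied (e.g.
pyspark --conf spark.sql.catalogImplementation=hive) should give (b)
directly, assuming your Spark build includes Hive.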
