We can create a symlink in the Spark conf directory pointing to the
hive-site.xml file of the Hive installation, I believe.
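
For example, something like this (the paths are assumptions; adjust them
to your installation):

    ln -s /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/hive-site.xml

With hive-site.xml visible in Spark's conf directory, a session built with
enableHiveSupport() should be able to find the Hive metastore. For (b), I
believe you can also ask for a Hive-backed session when starting the
PySpark shell, since enableHiveSupport() just sets the
spark.sql.catalogImplementation property to "hive":

    pyspark --conf spark.sql.catalogImplementation=hive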

Thanks,
Venkat.

On Mon, Dec 19, 2016, 10:58 AM apu <apumishra...@gmail.com> wrote:

> This is for Spark 2.0:
>
> If I wanted Hive support on a new SparkSession, I would build it with:
>
> spark = SparkSession \
>     .builder \
>     .enableHiveSupport() \
>     .getOrCreate()
>
> However, PySpark already creates a SparkSession for me, which appears to
> lack Hive support. How can I either:
>
> (a) Add Hive support to an existing SparkSession,
>
> or
>
> (b) Configure PySpark so that the SparkSession it creates at startup has
> Hive support enabled?
>
> Thanks!
>
> Apu
>
