From: Mendelson, Assaf
Sent: Tuesday, November 15, 2016 10:11 AM
To: 'rxin [via Apache Spark Developers List]'
Subject: RE: separate spark and hive

Spark shell (and pyspark) by default create the spark session with hive support (also true when the ... documentation at http://spark.apache.org/docs/latest/configuration.html has no mention of hive at all).

Assaf.
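Assaf's observation can be checked from the shell itself. A minimal sketch, assuming a stock Spark 2.x build with Hive classes on the classpath; `spark.sql.catalogImplementation` is an internal setting, so the values shown are an assumption based on common usage rather than the linked configuration page:

```scala
// Inside spark-shell, the pre-built `spark` session reports which catalog it uses:
spark.conf.get("spark.sql.catalogImplementation")
// expected "hive" on a default build with Hive classes present

// Hive support can be turned off at launch time, e.g.:
//   spark-shell --conf spark.sql.catalogImplementation=in-memory
// after which the same query reports "in-memory".
```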
From: rxin [via Apache Spark Developers List] [mailto:ml-node+s1001551n19882...@n3.nabble.com]
Sent: Tuesday, November 15, 2016 9:46 AM
To: Mendelson, Assaf
Subject: Re: separate spark and hive

If you just start a SparkSession without calling enableHiveSupport it actually won't use the Hive catalog support.
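Reynold's point can be sketched with the public SparkSession builder API; the app names and the local master below are illustrative, not from the thread:

```scala
import org.apache.spark.sql.SparkSession

// Option A: plain session, backed by Spark's built-in in-memory catalog.
// No Hive metastore is involved.
val spark = SparkSession.builder()
  .appName("no-hive")       // illustrative name
  .master("local[*]")
  .getOrCreate()

// Option B: Hive catalog support is strictly opt-in.
// (Shown as an alternative, not in addition -- build one session per JVM.)
// val spark = SparkSession.builder()
//   .appName("with-hive")
//   .master("local[*]")
//   .enableHiveSupport()    // fails if Hive classes are not on the classpath
//   .getOrCreate()
```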
On Mon, Nov 14, 2016 at 11:..., Mendelson, Assaf wrote:
> ...either a simple configuration or even the default, and that if there is any missing functionality it should be documented.
>
> Assaf.
>
> From: Reynold Xin [mailto:r...@databricks.com]
> Sent: Tuesday, November 15, 2016 9:31 AM
> To: Mendelson, Assaf
> ...@spark.apache.org
> Subject: Re: separate spark and hive
>
> I agree with the high level idea, and thus SPARK-15691 <https://issues.apache.org/jira/browse/SPARK-15691>.
>
> In reality, it's a huge amount of work to create & maintain a custom catalog. It might actually make sense to do, but it j...
> ...ing file based to avoid the need for locking). Hive should be reserved for those who actually use it (probably for backward compatibility).
>
> Am I missing something here?
>
> Assaf.
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/separate-spark-and-hive-tp19879.html