>
> Spark SQL experts on the forum can confirm this, though.
>
> *From:* Cheng Lian [mailto:lian.cs@gmail.com]
> *Sent:* Tuesday, December 9, 2014 6:42 AM
> *To:* Anas Mosaad
> *Cc:* Judy Nash; user@spark.apache.org
> *Subject:* Re: Spark-SQL JDBC driver
>
In that case, what should be the behavior of `saveAsTable`?
On Dec 10, 2014 4:03 AM, "Michael Armbrust" wrote:
> That is correct. The Hive context will create an embedded metastore in
> the current directory if you have not configured Hive.
>
> On Tue, Dec 9, 2014 at 5:51 PM, Manoj Samel wrote:
>
> 1. To interact with Hive, HiveContext instead of SQLContext must be used.
> 2. `registerTempTable` doesn't persist the table into Hive metastore, and
> the table is lost after quitting spark-shell. Instead, you must use
> `saveAsTable`.
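[Editor's note: to make the distinction above concrete, here is a minimal sketch against the Spark 1.3-era Scala API. The table and column names are illustrative, it assumes a Spark build with Hive support, and it only runs inside a Spark application or spark-shell, not standalone.]

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("metastore-demo"))
val hiveContext = new HiveContext(sc)
import hiveContext.implicits._

// A small illustrative DataFrame.
val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "name")

// Temporary: registered only in this context's catalog,
// gone once the spark-shell session exits.
df.registerTempTable("people_tmp")

// Persistent: written through the Hive metastore,
// so it survives a restart of the shell.
df.saveAsTable("people")
```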
>
>
> On 12/9/14 5:27 PM, Anas Mosaad wrote:
>
>
> … you do need a working
> metastore.
>
>
> On 12/9/14 3:59 PM, Anas Mosaad wrote:
>
> Thanks Judy, this is exactly what I'm looking for. However, and please
> forgive me if it's a dumb question: it seems to me that Thrift is the
> same as the hive2 JDBC driver, does this m
>
> Judy Nash wrote:
> You can use the Thrift server for this purpose, then test it with Beeline.
>
> See doc:
> https://spark.apache.org/docs/latest/sql-programming-guide.html#running-the-thrift-jdbc-server
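[Editor's note: the linked doc boils down to two commands. A sketch, assuming a Spark 1.x binary distribution and the default HiveServer2 port; the master URL is a placeholder for your own cluster.]

```shell
# Start Spark's Thrift JDBC/ODBC server (wraps HiveServer2).
cd "$SPARK_HOME"
./sbin/start-thriftserver.sh --master spark://your-master-host:7077

# In another terminal, connect with the bundled Beeline client
# (default port 10000), then issue SQL, e.g. SHOW TABLES;
./bin/beeline -u jdbc:hive2://localhost:10000
```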
>
> *From:* Anas Mosaad [mailto:anas.mos...@inco
Hello Everyone,
I'm brand new to Spark and was wondering if there's a JDBC driver to access
Spark SQL directly. I'm running Spark in standalone mode and don't have
Hadoop in this environment.
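[Editor's note: as the replies below explain, the usual route is the Thrift server plus the Hive JDBC driver, which works without a Hadoop cluster. A minimal Scala sketch, assuming the hive-jdbc driver jar (and its dependencies) is on the classpath, a Thrift server on the default port, and a hypothetical table name `my_table`.]

```scala
import java.sql.DriverManager

object SparkSqlJdbcDemo extends App {
  // Register the Hive JDBC driver used to talk to the Thrift server.
  Class.forName("org.apache.hive.jdbc.HiveDriver")
  val conn = DriverManager.getConnection(
    "jdbc:hive2://localhost:10000/default", "", "")
  try {
    val stmt = conn.createStatement()
    val rs   = stmt.executeQuery("SELECT * FROM my_table LIMIT 10")
    while (rs.next()) println(rs.getString(1))
  } finally conn.close()
}
```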
--
*Best Regards/أطيب المنى,*
*Anas Mosaad*