Hi again, and thanks for your reply!

On Fri, Jul 4, 2014 at 8:45 PM, Michael Armbrust <mich...@databricks.com> wrote:
>
>> Sweet. Any idea about when this will be merged into master?
>
>
> It is probably going to be a couple of weeks.  There is a fair amount of
> cleanup that needs to be done.  It works, though, and we used it in most of
> the demos at the Spark Summit.  Mostly I just need to add tests and move it
> out of HiveContext (there is no good reason for that code to depend on
> HiveContext). So you could also just try working with that branch.
>
>>
>> This is probably a stupid question, but can you query Spark SQL tables
>> from a (local?) hive context? In which case using that could be a
>> workaround until the PR is merged.
>
>
> Yeah, this is kind of subtle.  In a HiveContext, SQL Tables are just an
> additional catalog that sits on top of the metastore.  All the query
> execution occurs in the same code path, including the use of the Hive
> Function Registry, independent of where the table comes from.  So for your
> use case you can just create a hive context, which will create a local
> metastore automatically if no hive-site.xml is present.

Nice, that sounds like it'll solve my problems. Just for clarity: are
LocalHiveContext and HiveContext equivalent if no hive-site.xml is
present, or are there still differences?
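For reference, the setup Michael describes might look roughly like this (a sketch against the Spark 1.0-era API; the app name and table definition are just placeholders):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

// With no hive-site.xml on the classpath, HiveContext falls back to a
// local (Derby-backed) metastore created automatically in the working
// directory, so no external Hive installation is needed.
val sc = new SparkContext("local", "hive-test")
val hiveContext = new HiveContext(sc)
import hiveContext._

// Tables registered through this context and metastore tables share the
// same catalog and query-execution path, as described above.
hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
hql("SELECT key, value FROM src").collect().foreach(println)
```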

-- 
Best regards,
Martin Gammelsæter
