The names of the directories created for the metastore differ ("metastore"
vs. "metastore_db"), but that should be the only difference.  Really, we
should get rid of LocalHiveContext, since it is mostly redundant and the
current state of things is confusing.  I've created a JIRA to figure this
out before the 1.1 release.
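For anyone following along, here is a minimal sketch of the pattern discussed below: creating a HiveContext with no hive-site.xml on the classpath, so a local Derby-backed metastore gets created automatically. This assumes a Spark 1.0-era API; the app name and table name are just placeholders.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object LocalMetastoreDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("local-metastore-demo").setMaster("local"))

    // With no hive-site.xml present, this creates a local metastore
    // (a "metastore_db" directory) in the current working directory.
    val hiveContext = new HiveContext(sc)
    import hiveContext._

    // Registered SQL tables and Hive metastore tables share one catalog,
    // so both are queryable through the same context and code path.
    val rdd = sc.parallelize(Seq(("a", 1), ("b", 2)))
    rdd.registerAsTable("pairs") // hypothetical table name
    hql("SELECT * FROM pairs").collect().foreach(println)
  }
}
```

Note that this requires the Spark and Spark SQL (with Hive support) jars on the classpath; it is a sketch of the idea, not a drop-in snippet.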


On Mon, Jul 7, 2014 at 12:25 AM, Martin Gammelsæter <
martingammelsae...@gmail.com> wrote:

> Hi again, and thanks for your reply!
>
> On Fri, Jul 4, 2014 at 8:45 PM, Michael Armbrust <mich...@databricks.com>
> wrote:
> >
> >> Sweet. Any idea about when this will be merged into master?
> >
> >
> > It is probably going to be a couple of weeks.  There is a fair amount of
> > cleanup that needs to be done.  It works though and we used it in most of
> > the demos at the spark summit.  Mostly I just need to add tests and move
> it
> > out of HiveContext (there is no good reason for that code to depend on
> > HiveContext). So you could also just try working with that branch.
> >
> >>
> >> This is probably a stupid question, but can you query Spark SQL tables
> >> from a (local?) hive context? In which case using that could be a
> >> workaround until the PR is merged.
> >
> >
> > Yeah, this is kind of subtle.  In a HiveContext, SQL Tables are just an
> > additional catalog that sits on top of the metastore.  All the query
> > execution occurs in the same code path, including the use of the Hive
> > Function Registry, independent of where the table comes from.  So for
> your
> > use case you can just create a hive context, which will create a local
> > metastore automatically if no hive-site.xml is present.
>
> Nice, that sounds like it'll solve my problems. Just for clarity, are
> LocalHiveContext and HiveContext equivalent if no hive-site.xml is
> present, or are there still differences?
>
> --
> Best regards,
> Martin Gammelsæter
>
