Thanks for the pointers, I will give it a try; a rough sketch of the approach is below the quoted thread.

On Wed, Jun 17, 2015 at 12:06 PM, Fulin Sun <su...@certusnet.com.cn> wrote:

> Hi there
> AFAIK, Spark can smoothly read Hive data using HiveContext, or use the
> DataFrame and Data Source APIs to read any external data source and
> transform it into a DataFrame. So I would recommend using the
> phoenix-spark module to achieve this goal; you can then simply choose
> which kind of storage to write the Spark DataFrame to.
>
> Best,
> Sun.
>
> ------------------------------
>
> CertusNet
>
>
> *From:* Buntu Dev <buntu...@gmail.com>
> *Date:* 2015-06-17 14:56
> *To:* user <user@phoenix.apache.org>
> *Subject:* Phoenix and Hive
> I've got quite a bit of data in Hive managed tables and I'm looking for ways
> to join those tables with the ones I create in Phoenix. I'm aware of the HBase
> and Hive integration, but I'm not sure whether there is any current support for
> Phoenix and Hive integration. Please let me know.
>
> Thanks!
>
>
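For anyone who finds this thread later, here is a minimal sketch of the phoenix-spark approach Sun describes above. It assumes Spark 1.4+ with the phoenix-spark module on the classpath; the table names, the join key, and the ZooKeeper URL are placeholders, not values from this thread.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// HiveContext gives DataFrame access to Hive managed tables.
val sc = new SparkContext(new SparkConf().setAppName("phoenix-hive-join"))
val sqlContext = new HiveContext(sc)

// Load a Phoenix table as a DataFrame via the phoenix-spark data source.
// "MY_PHOENIX_TABLE" and the zkUrl value are placeholders.
val phoenixDf = sqlContext.read
  .format("org.apache.phoenix.spark")
  .options(Map("table" -> "MY_PHOENIX_TABLE", "zkUrl" -> "zkhost:2181"))
  .load()

// Read the Hive managed table (name assumed) and join on a shared key column.
val hiveDf = sqlContext.table("my_hive_table")
val joined = hiveDf.join(phoenixDf, hiveDf("id") === phoenixDf("ID"))

joined.show()

From there the joined DataFrame can be written back out to Phoenix, Hive, or any other supported data source, which is the flexibility Sun mentions.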
