Have you considered using the spark-hbase-connector for this:

 https://github.com/nerdammer/spark-hbase-connector
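
For reference, here's a minimal sketch of how that connector is used, going from memory of its README, so treat the exact API as an assumption; the table name, column family, and column names below are placeholders:

import it.nerdammer.spark.hbase._

// The connector picks up the HBase/ZooKeeper host from the Spark config, e.g.
// sparkConf.set("spark.hbase.host", "zookeeper-host")

// First tuple element is the row key, the rest map to the selected columns
val hbaseRdd = sc.hbaseTable[(String, String, String)]("mytable")
      .select("col1", "col2")
      .inColumnFamily("cf")

hbaseRdd.take(5).foreach(println)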

On Thu, Mar 12, 2015 at 5:19 AM, Udbhav Agarwal <udbhav.agar...@syncoms.com>
wrote:

> Thanks, Akhil.
>
> Additionally, if we want to run a SQL query, we need to create a JavaPairRDD, then
> a JavaRDD, then a JavaSchemaRDD, and then call sqlContext.sql(sql query). Right?
>
> *Thanks,*
>
> *Udbhav Agarwal*
>
>
>
> *From:* Akhil Das [mailto:ak...@sigmoidanalytics.com]
> *Sent:* 12 March, 2015 11:43 AM
> *To:* Udbhav Agarwal
> *Cc:* user@spark.apache.org
> *Subject:* Re: hbase sql query
>
>
>
> Like this?
>
>
>
> import org.apache.hadoop.hbase.HBaseConfiguration
> import org.apache.hadoop.hbase.mapreduce.TableInputFormat
>
> val conf = HBaseConfiguration.create()
> conf.set(TableInputFormat.INPUT_TABLE, "tableName")   // your HBase table name
>
> val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
>       classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
>       classOf[org.apache.hadoop.hbase.client.Result]).cache()
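>
> To get from there to SQL, you don't have to build the JavaPairRDD -> JavaRDD ->
> JavaSchemaRDD chain by hand if you're in Scala: map each Result into a case class,
> register it as a temp table, and query it with sqlContext.sql. The Java API goes
> through JavaSQLContext/JavaSchemaRDD in the same way. Below is a minimal sketch
> against the Spark 1.2-era API; the column family/qualifier, table name, and the
> Record case class are placeholders I've made up for illustration:
>
> import org.apache.hadoop.hbase.util.Bytes
> import org.apache.spark.sql.SQLContext
>
> case class Record(rowKey: String, value: String)
>
> val sqlContext = new SQLContext(sc)
> import sqlContext.createSchemaRDD   // implicitly turns RDD[Record] into a SchemaRDD
>
> // Pull the row key and one column ("cf:col1" is a placeholder) out of each Result
> val records = hBaseRDD.map { case (_, result) =>
>   Record(Bytes.toString(result.getRow),
>          Bytes.toString(result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col1"))))
> }
>
> records.registerTempTable("hbase_records")
> sqlContext.sql("SELECT rowKey, value FROM hbase_records").collect().foreach(println)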
>
> Here's a complete example:
> https://www.mapr.com/developercentral/code/loading-hbase-tables-spark#.VQEtqFR515Q
>
>
> Thanks
>
> Best Regards
>
>
>
> On Wed, Mar 11, 2015 at 4:46 PM, Udbhav Agarwal <
> udbhav.agar...@syncoms.com> wrote:
>
> Hi,
>
> How can we simply cache an HBase table and run SQL queries on it via the Java API in Spark?
>
> *Thanks,*
>
> *Udbhav Agarwal*
