Ah, missed that Java was a requirement. What distribution of Hadoop are you using? Here is an example that may help, along with a few links to the JavaHBaseContext and a basic example.
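For the JavaPairRDD -> JavaRDD -> JavaSchemaRDD -> sql() route you described, a minimal sketch might look like the following. This assumes the Spark 1.2-era JavaSQLContext/JavaSchemaRDD API and Java 8 lambdas; the table name "myTable", column family "cf", and qualifier "col" are hypothetical, and it obviously needs a running HBase instance, so treat it as untested:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.api.java.JavaSQLContext;
import org.apache.spark.sql.api.java.JavaSchemaRDD;

public class HBaseSqlExample {

  // Plain bean; applySchema infers the SQL schema from its getters/setters.
  public static class RowBean implements java.io.Serializable {
    private String rowKey;
    private String value;
    public String getRowKey() { return rowKey; }
    public void setRowKey(String k) { rowKey = k; }
    public String getValue() { return value; }
    public void setValue(String v) { value = v; }
  }

  public static void main(String[] args) {
    JavaSparkContext jsc = new JavaSparkContext("local", "hbase-sql");
    Configuration conf = HBaseConfiguration.create();
    conf.set(TableInputFormat.INPUT_TABLE, "myTable"); // hypothetical table

    // 1. JavaPairRDD over the HBase table, cached in Spark's memory.
    JavaPairRDD<ImmutableBytesWritable, Result> hBaseRDD =
        jsc.newAPIHadoopRDD(conf, TableInputFormat.class,
            ImmutableBytesWritable.class, Result.class).cache();

    // 2. JavaRDD of beans, pulling one cell out of each Result.
    JavaRDD<RowBean> rows = hBaseRDD.map(tuple -> {
      RowBean r = new RowBean();
      r.setRowKey(Bytes.toString(tuple._1().get()));
      r.setValue(Bytes.toString(
          tuple._2().getValue(Bytes.toBytes("cf"), Bytes.toBytes("col"))));
      return r;
    });

    // 3. JavaSchemaRDD registered as a temp table, then plain SQL.
    JavaSQLContext sqlContext = new JavaSQLContext(jsc);
    JavaSchemaRDD schemaRDD = sqlContext.applySchema(rows, RowBean.class);
    schemaRDD.registerTempTable("myTable");
    sqlContext.sql("SELECT rowKey, value FROM myTable WHERE value IS NOT NULL")
        .collect();
  }
}
```

Since the JavaPairRDD is cached, subsequent SQL queries against the temp table avoid re-scanning HBase. The JavaHBaseContext links below are a better fit if you also need bulk gets/puts back into HBase.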
https://github.com/tmalaska/SparkOnHBase
https://github.com/tmalaska/SparkOnHBase/blob/master/src/main/java/com/cloudera/spark/hbase/example/JavaHBaseMapGetPutExample.java
https://github.com/tmalaska/SparkOnHBase/blob/master/src/main/scala/com/cloudera/spark/hbase/JavaHBaseContext.scala

On Thu, Mar 12, 2015 at 8:34 AM, Udbhav Agarwal <udbhav.agar...@syncoms.com> wrote:

> Thanks Todd,
>
> But this link is also based on Scala; I was looking for some help with
> the Java APIs.
>
> Thanks,
> Udbhav Agarwal
>
> From: Todd Nist [mailto:tsind...@gmail.com]
> Sent: 12 March, 2015 5:28 PM
> To: Udbhav Agarwal
> Cc: Akhil Das; user@spark.apache.org
> Subject: Re: hbase sql query
>
> Have you considered using the spark-hbase-connector for this:
>
> https://github.com/nerdammer/spark-hbase-connector
>
> On Thu, Mar 12, 2015 at 5:19 AM, Udbhav Agarwal <udbhav.agar...@syncoms.com> wrote:
>
> Thanks Akhil.
>
> Additionally, if we want to do a SQL query we need to create a JavaPairRDD,
> then a JavaRDD, then a JavaSchemaRDD, and then call sqlContext.sql(sql query).
> Right?
>
> Thanks,
> Udbhav Agarwal
>
> From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
> Sent: 12 March, 2015 11:43 AM
> To: Udbhav Agarwal
> Cc: user@spark.apache.org
> Subject: Re: hbase sql query
>
> Like this?
>
> val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
>   classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
>   classOf[org.apache.hadoop.hbase.client.Result]).cache()
>
> Here's a complete example
> <https://www.mapr.com/developercentral/code/loading-hbase-tables-spark#.VQEtqFR515Q>.
>
> Thanks
> Best Regards
>
> On Wed, Mar 11, 2015 at 4:46 PM, Udbhav Agarwal <udbhav.agar...@syncoms.com> wrote:
>
> Hi,
>
> How can we simply cache an HBase table and do a SQL query via the Java API
> in Spark?
>
> Thanks,
> Udbhav Agarwal