It's still in HBase's trunk, scheduled for the 2.0.0 release according to
the JIRA ticket.

-Deng

On Tue, Oct 27, 2015 at 6:35 PM, Fengdong Yu <fengdo...@everstring.com>
wrote:

> Was this released with Spark 1.x, or is it still only in trunk?
>
>
>
>
> On Oct 27, 2015, at 6:22 PM, Adrian Tanase <atan...@adobe.com> wrote:
>
> Also, I just remembered Cloudera's contribution:
>
> http://blog.cloudera.com/blog/2015/08/apache-spark-comes-to-apache-hbase-with-hbase-spark-module/
>
> From: Deng Ching-Mallete
> Date: Tuesday, October 27, 2015 at 12:03 PM
> To: avivb
> Cc: user
> Subject: Re: There is any way to write from spark to HBase CDH4?
>
> Hi,
>
> We are using phoenix-spark (http://phoenix.apache.org/phoenix_spark.html)
> to write data to HBase, but it requires Spark 1.3.1+ and Phoenix 4.4+.
> Previously, when we were still on Spark 1.2, we used the HBase client API
> to write directly to HBase.
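> For reference, a minimal phoenix-spark write is sketched below. The table
> name, column names, and ZooKeeper URL are placeholders, not from an actual
> job; it assumes a Phoenix table already created with a matching schema:

```scala
import org.apache.spark.{SparkConf, SparkContext}
// phoenix-spark adds saveToPhoenix to RDDs of product types (tuples)
import org.apache.phoenix.spark._

val sc = new SparkContext(new SparkConf().setAppName("phoenix-write"))

// assumes: CREATE TABLE OUTPUT_TABLE (ID BIGINT NOT NULL PRIMARY KEY, COL1 VARCHAR)
val rdd = sc.parallelize(Seq((1L, "foo"), (2L, "bar")))

rdd.saveToPhoenix(
  "OUTPUT_TABLE",          // Phoenix table to write to
  Seq("ID", "COL1"),       // columns, in tuple-field order
  zkUrl = Some("zk-host:2181")  // ZooKeeper quorum of the HBase cluster
)
```

> Each tuple maps positionally onto the listed columns, and Phoenix handles
> the row-key encoding from the primary key.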
>
> For HBase 0.98, it's something like this (the row key and column values
> below are illustrative; it uses the 0.98 client API, including
> org.apache.hadoop.hbase.util.Bytes):
>
> rdd.foreachPartition(partition => {
>    // create one HBase config and table handle per partition, not per record
>    val hConf = HBaseConfiguration.create()
>    val hTable = new HTable(hConf, "TABLE_1")
>    // buffer puts client-side; they are sent in one batch by flushCommits()
>    hTable.setAutoFlush(false)
>
>    partition.foreach(r => {
>      // generate a row key from the record (field names here are examples)
>      val rowKey = Bytes.toBytes(r.id)
>      val hRow = new Put(rowKey)
>
>      // add columns: (family, qualifier, value)
>      hRow.add(Bytes.toBytes("cf"), Bytes.toBytes("col1"), Bytes.toBytes(r.value))
>
>      hTable.put(hRow)
>    })
>    hTable.flushCommits()
>    hTable.close()
> })
>
> HTH,
> Deng
>
> On Tue, Oct 27, 2015 at 5:36 PM, avivb <a...@taykey.com> wrote:
>
>> I have already tried https://github.com/unicredit/hbase-rdd and
>> https://github.com/nerdammer/spark-hbase-connector, and in both cases I
>> get a timeout.
>>
>> So I would like to know about other options for writing from Spark to
>> HBase on CDH4.
>>
>> Thanks!
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/There-is-any-way-to-write-from-spark-to-HBase-CDH4-tp25209.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>>
>
