Which HBase version are you using?

Can you show the full stack trace?
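
One thing worth checking in the meantime: $colon$colon is the class behind
Scala's List, and the pickler PySpark uses on the JVM side (Pyrolite) cannot
introspect it, hence the fallback to toString. Returning a java.util.List (or
an Array) from the converter instead of a Scala List might get around that.
A rough, untested sketch along the lines of your converter (same HBase client
API, just a different return type; the class name is only illustrative):

import java.util.{List => JList}

import scala.collection.JavaConverters._

import org.apache.hadoop.hbase.CellUtil
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.api.python.Converter

// Each cell becomes one "family:qualifier:value" string; the whole result is
// returned as a java.util.List, which the pickler should know how to handle.
class HBaseResultToJavaListConverter extends Converter[Any, JList[String]] {
  override def convert(obj: Any): JList[String] = {
    val result = obj.asInstanceOf[Result]
    result.rawCells().map { cell =>
      List(Bytes.toString(CellUtil.cloneFamily(cell)),
        Bytes.toString(CellUtil.cloneQualifier(cell)),
        Bytes.toString(CellUtil.cloneValue(cell))).mkString(":")
    }.toList.asJava
  }
}

You would then point the valueConverter argument of sc.newAPIHadoopRDD at this
class on the Python side. The full stack trace would still help confirm that
pickling the Scala List is really what is failing.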

Cheers

On Mon, Dec 22, 2014 at 11:02 AM, Antony Mayi <antonym...@yahoo.com.invalid>
wrote:

> Hi,
>
> can anyone please give me some help with writing a custom converter of HBase
> data to (for example) tuples of ((family, qualifier, value), ) for pyspark?
>
> I was trying something like this (here producing tuples of
> ("family:qualifier:value", )):
>
>
> import org.apache.hadoop.hbase.CellUtil
> import org.apache.hadoop.hbase.client.Result
> import org.apache.hadoop.hbase.util.Bytes
> import org.apache.spark.api.python.Converter
>
> // Turns each cell of an HBase Result into a "family:qualifier:value" string.
> class HBaseResultToTupleConverter extends Converter[Any, List[String]] {
>   override def convert(obj: Any): List[String] = {
>     val result = obj.asInstanceOf[Result]
>     result.rawCells().map(cell =>
>       List(Bytes.toString(CellUtil.cloneFamily(cell)),
>         Bytes.toString(CellUtil.cloneQualifier(cell)),
>         Bytes.toString(CellUtil.cloneValue(cell))).mkString(":")
>     ).toList
>   }
> }
>
>
> but then I get an error:
>
> 14/12/22 16:27:40 WARN python.SerDeUtil:
> Failed to pickle Java object as value: $colon$colon, falling back
> to 'toString'. Error: couldn't introspect javabean:
> java.lang.IllegalArgumentException: wrong number of arguments
>
>
> does anyone have a hint?
>
> Thanks,
> Antony.
>
