Hi,

I have a Spark Streaming application which uses the SparkOnHBase lib to do streamBulkPut().
Without checkpointing everything works fine, but recently, upon enabling checkpointing, I got the following exception:
16/01/22 01:32:35 ERROR executor.Executor: Exception in task 0.0 in stage 39.0 (TID 134)
java.lang.ClassCastException: [B cannot be cast to org.apache.spark.SerializableWritable
        at com.cloudera.spark.hbase.HBaseContext.applyCreds(HBaseContext.scala:225)
        at com.cloudera.spark.hbase.HBaseContext.com$cloudera$spark$hbase$HBaseContext$$hbaseForeachPartition(HBaseContext.scala:633)
        at com.cloudera.spark.hbase.HBaseContext$$anonfun$com$cloudera$spark$hbase$HBaseContext$$bulkMutation$1.apply(HBaseContext.scala:460)
        at com.cloudera.spark.hbase.HBaseContext$$anonfun$com$cloudera$spark$hbase$HBaseContext$$bulkMutation$1.apply(HBaseContext.scala:460)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:806)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:806)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1498)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1498)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:64)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
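
For reference, the setup is roughly the following (a minimal sketch, not my actual code; the checkpoint path, table name, column family, and input source are placeholders):

import com.cloudera.spark.hbase.HBaseContext
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamBulkPutJob {

  val checkpointDir = "/tmp/stream-bulkput-checkpoint"  // placeholder path

  def createContext(): StreamingContext = {
    val sparkConf = new SparkConf().setAppName("stream-bulkput")
    val sc = new SparkContext(sparkConf)
    val ssc = new StreamingContext(sc, Seconds(10))
    ssc.checkpoint(checkpointDir)

    val hbaseConf = HBaseConfiguration.create()
    val hbaseContext = new HBaseContext(sc, hbaseConf)

    // placeholder input; the real job reads from a receiver
    val lines = ssc.socketTextStream("localhost", 9999)

    // one Put per record; table/column names are placeholders
    hbaseContext.streamBulkPut[String](
      lines,
      "myTable",
      (record: String) => {
        val put = new Put(Bytes.toBytes(record))
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(record))
        put
      },
      false) // autoFlush

    ssc
  }

  def main(args: Array[String]): Unit = {
    // recreate the streaming context from the checkpoint if one exists
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}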
Any pointers from previous users of the SparkOnHBase lib?

Thanks,
-Vinay
