Hi everyone,

I used the latest 4.1 release to run some tests on local indexing. When I try to load data into a Phoenix table that has a local index, I get the error below. I am not sure whether this is related to the HBase local index table, since the HBase table backing the local index is uniformly named with the '_LOCAL_IDX_' prefix plus the table name (TableRef). Any hints would be appreciated, and please correct me if I have misunderstood something.

org.apache.phoenix.execute.CommitException: java.sql.SQLException: ERROR 2008 (INT10): Unable to find cached index metadata. ERROR 2008 (INT10): Unable to find cached index metadata. key=-8614688887238479432 region=RANAPSIGNAL,\x0D\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00,1409566437551.9e47a9f579f7cf3865d1148480a3b1b9. Index update failed
	at org.apache.phoenix.execute.MutationState.commit(MutationState.java:433)
	at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:384)
	at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:381)
	at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
	at org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:381)
	at com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:113)
	at com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:104)
	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
	at com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:104)
	at com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:89)
	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
	at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
	at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
	at org.apache.spark.scheduler.Task.run(Task.scala:54)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:744)

Best Regards,
Sun
CertusNet
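
P.S. For reference, the write path in the Spark job looks roughly like the sketch below (the ZooKeeper quorum, the record type, and the column names are placeholders I made up for illustration; the real code lives in RanapSignalJdbcPhoenix.scala). The conn.commit() at the end is the call that raises the CommitException shown above.

import java.sql.DriverManager
import org.apache.spark.{SparkConf, SparkContext}

object RanapSignalLoadSketch {
  // Placeholder record type; the real schema has more columns.
  case class Signal(pk: String, col1: String, col2: String)

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("phoenix-local-index-load"))
    val rdd = sc.parallelize(Seq(Signal("k1", "a", "b"), Signal("k2", "c", "d")))

    rdd.foreachPartition { rows =>
      // One Phoenix JDBC connection per partition; "zk-host" is a placeholder quorum.
      val conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")
      conn.setAutoCommit(false)
      try {
        // RANAPSIGNAL is the data table from the error message; PK/COL1/COL2 are placeholders.
        val stmt = conn.prepareStatement(
          "UPSERT INTO RANAPSIGNAL (PK, COL1, COL2) VALUES (?, ?, ?)")
        rows.foreach { r =>
          stmt.setString(1, r.pk)
          stmt.setString(2, r.col1)
          stmt.setString(3, r.col2)
          stmt.executeUpdate()
        }
        // The local index rows are maintained server-side as part of this commit;
        // this is where "Unable to find cached index metadata" is thrown.
        conn.commit()
      } finally {
        conn.close()
      }
    }
    sc.stop()
  }
}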