Yes Ted, spark.executor.extraClassPath will work if the hbase client jars are
present on all Spark Worker / NodeManager machines.
spark.yarn.dist.files is the easier way, as the hbase client jars can be copied
from the driver machine or HDFS into the container / spark-executor classpath
automatically. No need to manually place the jars on every node beforehand.
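
To make the two options concrete, here is a rough sketch of how they might be
set through a SparkConf; the jar paths and file names below are placeholders I
made up, not a tested setup:

import org.apache.spark.SparkConf

object HBaseJarsOnYarnConf {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("hbase-on-yarn")
      // Option 1: only works when the hbase client jars already sit at this
      // (placeholder) path on every Spark Worker / NodeManager machine.
      .set("spark.executor.extraClassPath", "/opt/hbase/lib/*")
      // Option 2: let YARN localize the jars from the driver machine or HDFS
      // into each executor container at launch time (file names are placeholders).
      .set("spark.yarn.dist.files",
        "hdfs:///libs/hbase-client.jar,hdfs:///libs/hbase-common.jar")
    // Print the effective settings just to show what would be submitted.
    conf.getAll.foreach { case (k, v) => println(s"$k = $v") }
  }
}

The same two settings can of course be passed as --conf options to spark-submit
instead of being set in code.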
+ Spark-Dev
For a Spark job on YARN that accesses an hbase table, I added all the hbase
client jars to spark.yarn.dist.files. When the NodeManager launches the
container (i.e. the executor), it does localization and brings all the
hbase-client jars into the executor's CWD, but the executor tasks still fail
with ClassNotFoundException.
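
For reference, the executor-side HBase access that triggers this looks roughly
like the sketch below; the table name and the use of TableInputFormat are my
assumptions, since the original job's code is not shown:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object HBaseScanOnYarn {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("hbase-scan"))
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "some_table") // placeholder table name
    // The executor tasks instantiate TableInputFormat and the hbase client
    // classes; this is where the ClassNotFoundException surfaces when those
    // jars are missing from the executor classpath.
    val rdd = sc.newAPIHadoopRDD(
      hbaseConf,
      classOf[TableInputFormat],
      classOf[ImmutableBytesWritable],
      classOf[Result])
    println(s"rows = ${rdd.count()}")
    sc.stop()
  }
}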