After I set spark.driver.userClassPathFirst=true, my spark-submit --master yarn-client job fails with the error below; it works fine if I remove the userClassPathFirst setting. I need this setting to avoid class conflicts in another job, so I am first trying to make it work in a simple job before trying it on the job with the conflicts.
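For reference, the submit command looks roughly like this (the jar name and config directory paths are placeholders, not the real ones from my environment; the main class is the one from the stack trace):

```shell
# Hypothetical invocation; jar name and conf paths are placeholders.
export SPARK_CONF_DIR=/etc/spark/conf
export HADOOP_CONF_DIR=/etc/hadoop/conf

spark-submit \
  --master yarn-client \
  --conf spark.driver.userClassPathFirst=true \
  --class com.citi.ripcurl.timeseriesbatch.example.EqDataQualityExample \
  timeseriesbatch-example.jar
```

Removing the --conf spark.driver.userClassPathFirst=true line makes the same command succeed.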
From a quick search, it looks like this error occurs when the driver cannot find the YARN and Hadoop related config, so I exported SPARK_CONF_DIR and HADOOP_CONF_DIR and also added the config files via the --jars option, but I still get the same error. Any ideas on how to fix this?

org.apache.spark.SparkException: Unable to load YARN support
	at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:399)
	at org.apache.spark.deploy.SparkHadoopUtil$.yarn$lzycompute(SparkHadoopUtil.scala:394)
	at org.apache.spark.deploy.SparkHadoopUtil$.yarn(SparkHadoopUtil.scala:394)
	at org.apache.spark.deploy.SparkHadoopUtil$.get(SparkHadoopUtil.scala:411)
	at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:2119)
	at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:365)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:289)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:462)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
	at com.citi.ripcurl.timeseriesbatch.BatchContext.<init>(BatchContext.java:27)
	at com.citi.ripcurl.timeseriesbatch.example.EqDataQualityExample.runReportQuery(EqDataQualityExample.java:28)
	at com.citi.ripcurl.timeseriesbatch.example.EqDataQualityExample.main(EqDataQualityExample.java:70)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: class org.apache.hadoop.security.ShellBasedUnixGroupsMapping not org.apache.hadoop.security.GroupMappingServiceProvider