Looks like a classpath issue:

    Caused by: java.lang.ClassNotFoundException: com.amazonaws.services.s3.AmazonS3

Are you using S3 somewhere? Are the required jars in place?
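The missing class lives in the AWS SDK jar, which the S3A filesystem (hadoop-aws) needs on both the driver and executor classpath. If the jars are not already provided by the cluster, one way to pull them in at submit time is --packages (or --jars pointing at the jars installed on the cluster). This is only a sketch: the hadoop-aws version is an assumption and should match your Hadoop build, and my.properties / com.example.MyApp / my-app.jar are placeholders.

    # hadoop-aws pulls in a matching aws-java-sdk transitively; the version
    # here is an example and should match the Hadoop version on the cluster
    spark-submit \
      --master yarn \
      --packages org.apache.hadoop:hadoop-aws:2.7.2 \
      --properties-file my.properties \
      --class com.example.MyApp my-app.jar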
Best Regards,
Sonal
Founder, Nube Technologies <http://www.nubetech.co>
Reifier at Strata Hadoop World <https://www.youtube.com/watch?v=eD3LkpPQIgM>
Reifier at Spark Summit 2015 <https://spark-summit.org/2015/events/real-time-fuzzy-matching-with-spark-and-elastic-search/>
<http://in.linkedin.com/in/sonalgoyal>

On Tue, Sep 6, 2016 at 4:45 PM, Divya Gehlot <divya.htco...@gmail.com> wrote:

> Hi,
> I am getting the below error if I try to use the properties file parameter in
> spark-submit.
>
> Exception in thread "main" java.util.ServiceConfigurationError:
> org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.s3a.S3AFileSystem
> could not be instantiated
> at java.util.ServiceLoader.fail(ServiceLoader.java:224)
> at java.util.ServiceLoader.access$100(ServiceLoader.java:181)
> at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:377)
> at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
> at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2673)
> at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2684)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2701)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2737)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2719)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:375)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:174)
> at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:142)
> at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:653)
> at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:69)
> at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
> at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
> at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:651)
> at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
> Caused by: java.lang.NoClassDefFoundError: com/amazonaws/services/s3/AmazonS3
> at java.lang.Class.getDeclaredConstructors0(Native Method)
> at java.lang.Class.privateGetDeclaredConstructors(Class.java:2595)
> at java.lang.Class.getConstructor0(Class.java:2895)
> at java.lang.Class.newInstance(Class.java:354)
> at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:373)
> ... 19 more
> Caused by: java.lang.ClassNotFoundException: com.amazonaws.services.s3.AmazonS3
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> ... 24 more
> End of LogType:stderr
>
> If I remove the --properties-file parameter, the error is gone.
>
> Would really appreciate the help.
>
> Thanks,
> Divya
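One more thing worth checking, since the exception only shows up when --properties-file is passed: spark-submit reads that file in place of conf/spark-defaults.conf, so any classpath entries defined in the defaults (for example spark.driver.extraClassPath pointing at the AWS jars) may no longer be applied. A hypothetical example of the entries the custom properties file would then need to carry itself (the jar paths below are placeholders and must point at the real jars on the cluster):

    # hypothetical entries for the file passed via --properties-file
    spark.driver.extraClassPath     /path/to/aws-java-sdk.jar:/path/to/hadoop-aws.jar
    spark.executor.extraClassPath   /path/to/aws-java-sdk.jar:/path/to/hadoop-aws.jar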