So, are you using spark-submit or spark-shell? Either way you will need to launch it with the --packages option (just as you would for spark-csv). You will need to know the Scala version of your Spark build and the spark-xml version you want, i.e.

--packages com.databricks:spark-xml_<scala.version>:<package version>
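Something along these lines should then work (untested sketch; the jar name in the comment is just a placeholder and sc is whatever SparkContext you already create). The key points are that the package has to be on the classpath via --packages and that the format string has to name the databricks source, not org.apache.spark.xml:

// Launch with the package on the classpath, e.g.:
//   spark-submit --packages com.databricks:spark-xml_<scala.version>:<package version> your-app.jar
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

SQLContext sqlContext = new SQLContext(sc);
DataFrame df = sqlContext.read()
    .format("com.databricks.spark.xml")   // the source registered by spark-xml, not org.apache.spark.xml
    .option("rowTag", "row")              // XML element that marks one record
    .load("A.xml");
df.printSchema();                         // quick sanity check that the rows were parsed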
hth

On Fri, Jun 17, 2016 at 10:20 AM, VG <vlin...@gmail.com> wrote:

> Apologies for that.
> I am trying to use spark-xml to load data of a xml file.
>
> here is the exception
>
> 16/06/17 14:49:04 INFO BlockManagerMaster: Registered BlockManager
> Exception in thread "main" java.lang.ClassNotFoundException: Failed to
> find data source: org.apache.spark.xml. Please find packages at
> http://spark-packages.org
>   at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
>   at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:102)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:109)
>   at org.ariba.spark.PostsProcessing.main(PostsProcessing.java:19)
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.xml.DefaultSource
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>   at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
>   at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
>   at scala.util.Try$.apply(Try.scala:192)
>   at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
>   at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
>   at scala.util.Try.orElse(Try.scala:84)
>   at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:62)
>   ... 4 more
>
> Code:
> SQLContext sqlContext = new SQLContext(sc);
> DataFrame df = sqlContext.read()
>     .format("org.apache.spark.xml")
>     .option("rowTag", "row")
>     .load("A.xml");
>
> Any suggestions please ..
>
> On Fri, Jun 17, 2016 at 2:42 PM, Marco Mistroni <mmistr...@gmail.com> wrote:
>
>> too little info
>> it'll help if you can post the exception and show your sbt file (if you
>> are using sbt), and provide minimal details on what you are doing
>> kr
>>
>> On Fri, Jun 17, 2016 at 10:08 AM, VG <vlin...@gmail.com> wrote:
>>
>>> Failed to find data source: com.databricks.spark.xml
>>>
>>> Any suggestions to resolve this