That was exactly the problem, Michael. More details in this post:
http://stackoverflow.com/questions/34184079/cannot-run-queries-in-sqlcontext-from-apache-spark-sql-1-5-2-getting-java-lang
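For anyone hitting the same NoSuchMethodError, a quick way to confirm which jar the conflicting class is coming from is to ask the JVM where it loaded the class. This is a minimal sketch (the `WhichJar` class name and command-line usage are illustrative, not from the thread):

```java
public class WhichJar {
    // Return a human-readable description of where `cls` was loaded from:
    // the jar/classpath URL, or a marker for bootstrap classes (which have
    // no CodeSource).
    public static String locationOf(Class<?> cls) {
        java.security.CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src != null ? String.valueOf(src.getLocation()) : "bootstrap classpath";
    }

    public static void main(String[] args) throws Exception {
        // Pass the suspect class name on the command line, e.g.:
        //   java WhichJar com.fasterxml.jackson.databind.ObjectMapper
        String name = args.length > 0 ? args[0] : WhichJar.class.getName();
        System.out.println(name + " loaded from: " + locationOf(Class.forName(name)));
    }
}
```

Running this inside the Tomcat webapp against the Jackson classes should show whether an older Jackson jar on the webapp's classpath is shadowing the 2.4.x version that Spark 1.5.x expects (see the JIRA issue linked in the Stack Overflow post); aligning the Jackson versions in the pom, or excluding the stale jar, resolves the conflict.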

*Matheus*

On Wed, Dec 9, 2015 at 4:43 PM, Michael Armbrust <mich...@databricks.com>
wrote:

> java.lang.NoSuchMethodError almost always means you have the wrong version
> of some library (different from what Spark was compiled with) on your
> classpath; in this case, the Jackson parser.
>
> On Wed, Dec 9, 2015 at 10:38 AM, Matheus Ramos <
> matheusedsonra...@gmail.com> wrote:
>
>> I have a Java application using *Spark SQL* (*Spark 1.5.2* using *local
>> mode*), but I cannot execute any SQL commands without getting errors.
>>
>> This is the code I am executing:
>>
>>     //confs
>>     SparkConf sparkConf = new SparkConf();
>>     sparkConf.set("spark.master","local");
>>     sparkConf.set("spark.app.name","application01");
>>     sparkConf.set("spark.driver.host","10.1.1.36");
>>     sparkConf.set("spark.driver.port", "51810");
>>     sparkConf.set("spark.executor.port", "51815");
>>     sparkConf.set("spark.repl.class.uri","http://10.1.1.36:46146");
>>     sparkConf.set("spark.executor.instances","2");
>>     sparkConf.set("spark.jars","");
>>     sparkConf.set("spark.executor.id","driver");
>>     sparkConf.set("spark.submit.deployMode","client");
>>     sparkConf.set("spark.fileserver.uri","http://10.1.1.36:47314");
>>     sparkConf.set("spark.localProperties.clone","true");
>>     sparkConf.set("spark.app.id","app-45631207172715-0002");
>>
>>     //Initialize contexts
>>     JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
>>     SQLContext sqlContext = new SQLContext(sparkContext);
>>
>>     //execute command
>>     sqlContext.sql("show tables").show();
>>
>> Spark dependencies in *pom.xml* look like this:
>>
>>     <dependency>
>>       <groupId>org.apache.spark</groupId>
>>       <artifactId>spark-core_2.10</artifactId>
>>       <version>1.5.2</version>
>>     </dependency>
>>
>>     <dependency>
>>       <groupId>org.apache.spark</groupId>
>>       <artifactId>spark-sql_2.10</artifactId>
>>       <version>1.5.2</version>
>>     </dependency>
>>
>>     <dependency>
>>       <groupId>org.apache.spark</groupId>
>>       <artifactId>spark-hive_2.10</artifactId>
>>       <version>1.5.2</version>
>>     </dependency>
>>
>>     <dependency>
>>       <groupId>org.apache.spark</groupId>
>>       <artifactId>spark-repl_2.10</artifactId>
>>       <version>1.5.2</version>
>>     </dependency>
>>
>> Here is the error I am getting:
>>
>> java.lang.NoSuchMethodError: 
>> com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;
>>
>> The *stack trace* is here <http://pastebin.com/YtnpDLgs>.
>>
>> My application is a web application running on Tomcat 7. I don’t have any
>> other configuration files. What could I be doing wrong? Could it be some
>> dependency conflict, since I am able to run the same code in a clean
>> project?
>> I found an issue <https://issues.apache.org/jira/browse/SPARK-8332> that
>> gives some more information about the problem.
>>
>> Regards,
>>
>> Matheus
>>
>
>
