I am just trying to read a JSON file with SQLContext and print the
DataFrame, as follows:

     import org.apache.spark.SparkConf;
     import org.apache.spark.api.java.JavaSparkContext;
     import org.apache.spark.sql.DataFrame;
     import org.apache.spark.sql.SQLContext;

     SparkConf conf = new SparkConf().setMaster("local").setAppName("AppName");
     JavaSparkContext sc = new JavaSparkContext(conf);
     SQLContext sqlContext = new SQLContext(sc);

     // pathToJSONFile is the path to the JSON input (one JSON object per line)
     DataFrame df = sqlContext.read().json(pathToJSONFile);
     df.show();
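A NoSuchMethodError like the one in the quoted message usually means the version of Spark on the runtime classpath differs from the version the code was compiled against (the SparkSubmit$.launch frame in the stack trace may also point at an older Spark runtime than the 1.5.1 artifacts). As a sketch, not a confirmed fix: the pom entries should all share the same version and Scala suffix, and marking them "provided" avoids shading a second Spark copy into the jar when running via spark-submit — the versions below are simply the ones listed in the quoted message:

    <!-- All Spark artifacts must use the same version and Scala suffix,
         and must match the Spark distribution that runs the job. -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>1.5.1</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>1.5.1</version>
      <scope>provided</scope>
    </dependency>

It is also worth checking which Scala version the installed spark-submit distribution was built with; mixing _2.10 and _2.11 artifacts produces the same class of error.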

On Mon, Nov 16, 2015 at 12:48 PM, Fengdong Yu <fengdo...@everstring.com> wrote:
> what’s your SQL?
>
>
>
>
>> On Nov 16, 2015, at 3:02 PM, Yogesh Vyas <informy...@gmail.com> wrote:
>>
>> Hi,
>>
>> While I am trying to read a JSON file using SQLContext, I get the
>> following error:
>>
>> Exception in thread "main" java.lang.NoSuchMethodError:
>> org.apache.spark.sql.SQLContext.<init>(Lorg/apache/spark/api/java/JavaSparkContext;)V
>>        at com.honeywell.test.testhive.HiveSpark.main(HiveSpark.java:15)
>>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>        at 
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>        at 
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>        at java.lang.reflect.Method.invoke(Method.java:597)
>>        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>>
>> I am using pom.xml with following dependencies and versions:
>> spark-core_2.11 with version 1.5.1
>> spark-streaming_2.11 with version 1.5.1
>> spark-sql_2.11 with version 1.5.1
>>
>> Can anyone please help me out in resolving this ?
>>
>> Regards,
>> Yogesh
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>
