Re: Adding hive context gives error

2016-03-07 Thread Suniti Singh
Yeah, I realized it and changed the version to 1.6.0, as mentioned in http://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10/1.6.0. I added the spark-sql dependency back to the pom.xml and the Scala code works just fine.

On Mon, Mar 7, 2016 at 5:00 PM, Tristan Nixon wrote: > Hi
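For reference, keeping the spark-sql artifact at the same version as the other Spark dependencies in the POM might look like the following (a minimal sketch: the artifact IDs and the 1.6.0 version come from the Maven Central link above; the property name and surrounding layout are illustrative):

```xml
<!-- Sketch of the relevant pom.xml fragment; all Spark artifacts should
     share one version to avoid binary incompatibilities at runtime. -->
<properties>
  <spark.version>1.6.0</spark.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```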

Re: Adding hive context gives error

2016-03-07 Thread Suniti Singh
We do not need to add the external jars to Eclipse if Maven is used as the build tool, since the Spark dependencies in the POM file will take care of it.

On Mon, Mar 7, 2016 at 4:50 PM, Mich Talebzadeh wrote: > Hi Kabeer, > > I have not used eclipse for Spark/Scala although I have played with it. > > A

Re: Adding hive context gives error

2016-03-07 Thread Mich Talebzadeh
Sorry, that should have been addressed to Suniti.

Dr Mich Talebzadeh
LinkedIn * https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw *
http://talebzadehmich.wordpress.com

On 8 Marc

Re: Adding hive context gives error

2016-03-07 Thread Mich Talebzadeh
Hi Kabeer, I have not used Eclipse for Spark/Scala, although I have played with it. As a matter of interest, when you set up an Eclipse project, do you add external jars to Eclipse from $SPARK_HOME/lib only? Thanks

Re: Adding hive context gives error

2016-03-07 Thread Suniti Singh
Thanks Mich and Kabeer for the quick reply. @Kabeer - I removed the spark-sql dependency and all the errors are gone. But I am surprised to see this behaviour. Why are the spark-sql libs an issue for including the Hive context? Regards, Suniti

On Mon, Mar 7, 2016 at 4:34 PM, Kabeer Ahmed wrote: > I

Re: Adding hive context gives error

2016-03-07 Thread Kabeer Ahmed
I use SBT and I have never included spark-sql. The simple 2 lines in SBT are as below:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.0",
  "org.apache.spark" %% "spark-hive" % "1.5.0"
)

However, I do note that you are using the spark-sql include and the Spark version
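For comparison, a complete build.sbt along those lines might look like the sketch below (project name and versions are illustrative, using Scala 2.10 and the 1.6.0 release mentioned earlier in the thread). The key point is that spark-hive already depends on spark-sql transitively, so an explicit spark-sql entry at a different version can pull in incompatible classes — which would explain the errors discussed above:

```scala
// Hypothetical build.sbt sketch; spark-hive transitively brings in spark-sql,
// so it does not need to be listed separately.
name := "spark-hive-example"

scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0",
  "org.apache.spark" %% "spark-hive" % "1.6.0"
)
```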

Re: Adding hive context gives error

2016-03-07 Thread Mich Talebzadeh
I tend to use SBT to build Spark programs. This works for me:

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.Row
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.types._
import org.apache.spark.sql.SQLContext
import org.ap
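Putting those imports to use, a minimal Spark 1.x program that creates a HiveContext might look like the following sketch (the object name, app name, and query are illustrative; the `new HiveContext(sc)` constructor matches the Spark 1.6 API):

```scala
// Minimal sketch of creating a HiveContext (Spark 1.x API).
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveContextExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("HiveContextExample")
    val sc = new SparkContext(conf)

    // HiveContext extends SQLContext; it picks up hive-site.xml from the
    // classpath if present, or falls back to a local metastore.
    val hiveContext = new HiveContext(sc)

    hiveContext.sql("SHOW TABLES").collect().foreach(println)

    sc.stop()
  }
}
```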