Spark SQL ships as a separate artifact from spark-core, so you need to add it to build.sbt explicitly:

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.4.1"
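
With that in place, a minimal standalone program along these lines should compile (a sketch only; the app name and input path are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object TestMain {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("test"))
        val sqlContext = new SQLContext(sc)
        // read.json is the 1.4 DataFrameReader API; "people.json" is a placeholder
        val df = sqlContext.read.json("people.json")
        df.show()
        sc.stop()
      }
    }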

That said, I would consider using spark-hive and HiveContext instead, as
its query parser is more powerful and you'll have access to window
functions and other features.
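
For example, something like this (a sketch against the 1.4.1 API; note that on
Spark 1.x window functions require HiveContext, which lives in the spark-hive
artifact; the data and column names below are made up for illustration):

    libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.4.1"

    import org.apache.spark.sql.hive.HiveContext
    import org.apache.spark.sql.expressions.Window
    import org.apache.spark.sql.functions.rowNumber

    val hiveContext = new HiveContext(sc)
    import hiveContext.implicits._

    // toy DataFrame for illustration
    val df = sc.parallelize(Seq(("a", 1), ("a", 2), ("b", 3))).toDF("key", "value")

    // number rows within each key -- this needs HiveContext on 1.4
    val w = Window.partitionBy("key").orderBy("value")
    df.withColumn("rn", rowNumber().over(w)).show()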


On Thu, Sep 17, 2015 at 10:59 AM, Cui Lin <icecreamlc...@gmail.com> wrote:

> Hello,
>
> I got stuck in adding spark sql into my standalone application.
> The build.sbt is defined as:
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
>
>
> I got the following error when building the package:
>
> [error] /data/workspace/test/src/main/scala/TestMain.scala:6: object sql is 
> not a member of package org.apache.spark
> [error] import org.apache.spark.sql.SQLContext;
> [error]                         ^
> [error] /data/workspace/test/src/main/scala/TestMain.scala:19: object sql is 
> not a member of package org.apache.spark
> [error]     val sqlContext = new org.apache.spark.sql.SQLContext(sc)
> [error]                                           ^
> [error] two errors found
> [error] (compile:compile) Compilation failed
>
>
> So Spark SQL is not part of the spark-core package? I have no issues when
> testing my code in spark-shell. Thanks for the help!
>
>
>
> --
> Best regards!
>
> Lin,Cui
>
