Hi all, 

I have several Hive queries that work in spark-shell but fail when the same
code is run through spark-submit. In fact, I can't even run a simple "show
databases". The following works in spark-shell:


import org.apache.spark._
import org.apache.spark.sql._

object ViewabilityFetchInsertDailyHive {
  def main() {
    // sqlContext here is the instance that spark-shell pre-creates
    val x = sqlContext.sql("show databases")
    val z = x.collect
    for (i <- z) println(i.toString)
  }
}

But the following doesn't work in spark-submit:


object PrintAllDatabases {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName(this.getClass.getName))
    val sqlContext = new SQLContext(sc)
    val x = sqlContext.sql("show databases")
    val z = x.collect
    for (i <- z) println(i.toString)
  }
}
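
For completeness, I build a jar and submit it roughly like this (the jar
name is illustrative, not my exact setup; the master matches the local mode
shown in the log below):

spark-submit --class PrintAllDatabases --master local[*] myapp.jar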


And I get this error:

16/03/14 22:27:55 INFO BlockManagerMaster: Registered BlockManager
16/03/14 22:27:56 INFO EventLoggingListener: Logging events to hdfs://nameservice1/user/spark/applicationHistory/local-1457994475020
Exception in thread "main" java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier show found

show databases
^
        at scala.sys.package$.error(package.scala:27)
        at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
        at org.apache.spark.sql.catalyst.DefaultParserDialect.parse(ParserDialect.scala:67)
        at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
        at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
        at org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:114)
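
For what it's worth, the trace mentions DefaultParserDialect, so I wonder
whether the difference is that spark-shell pre-creates its sqlContext as a
HiveContext, while my job builds a plain SQLContext, whose parser doesn't
accept SHOW DATABASES. A minimal sketch of what I mean, assuming Spark 1.x
with the spark-hive dependency on the classpath (the object name is just
illustrative):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object PrintAllDatabasesHive {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName(this.getClass.getName))
    // HiveContext parses HiveQL, which supports SHOW DATABASES; the plain
    // SQLContext parser in Spark 1.x does not.
    val sqlContext = new HiveContext(sc)
    sqlContext.sql("show databases").collect().foreach(println)
  }
}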


Any suggestions are appreciated!


