Hi All,

package com.spark.test

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object Step1 {
  def main(args: Array[String]): Unit = {

    val sparkConf = new SparkConf().setAppName("my-app")
    val sc = new SparkContext(sparkConf)

    val hiveSqlContext: HiveContext = new HiveContext(sc)

    // args(0) is the path to the Hive script; read it and run it
    hiveSqlContext.sql(scala.io.Source.fromFile(args(0)).mkString)

    System.out.println("Okay")
  }
}



This is my Spark program, and the path to my Hive script is passed in as args(0):

$SPARK_HOME/bin/spark-submit --class com.spark.test.Step1 --master yarn \
  --deploy-mode cluster com.spark.test-0.1-SNAPSHOT.jar \
  hdfs://spirui-d86-f03-06:9229/samples/testsubquery.hql

but I get a FileNotFoundException. Why?

Where does it expect the file to be: on the local filesystem or on HDFS?
If on HDFS, how should I specify its path?
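If it is supposed to come from HDFS, would something like this be the right way to read the script there? This is just a rough sketch using the Hadoop FileSystem API instead of scala.io.Source, and I am not sure it is correct:

import java.net.URI
import org.apache.hadoop.fs.{FileSystem, Path}

// rough sketch: open the script directly from HDFS using the fully qualified URI in args(0)
val scriptPath = args(0)  // e.g. hdfs://spirui-d86-f03-06:9229/samples/testsubquery.hql
val fs = FileSystem.get(new URI(scriptPath), sc.hadoopConfiguration)
val in = fs.open(new Path(scriptPath))
val query = try scala.io.Source.fromInputStream(in).mkString finally in.close()
hiveSqlContext.sql(query)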

And is there a better way to feed HiveContext a query stored in a file on HDFS than reading the file in myself like this?
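For example, would it be cleaner to ship the script with --files and keep reading it locally? Something like the following, which is only a guess on my part; I am assuming --files copies the file into the driver container's working directory in cluster mode:

$SPARK_HOME/bin/spark-submit --class com.spark.test.Step1 --master yarn \
  --deploy-mode cluster \
  --files hdfs://spirui-d86-f03-06:9229/samples/testsubquery.hql \
  com.spark.test-0.1-SNAPSHOT.jar testsubquery.hql

with the program still calling scala.io.Source.fromFile(args(0)) as it does now.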
