Hi, I verified and I can see hive-site.xml in the Spark conf directory.
Regards,
Anand.C

From: fightf...@163.com
Sent: Friday, November 27, 2015 12:53 PM
To: Chandra Mohan, Ananda Vel Murugan <ananda.muru...@honeywell.com>; user <user@spark.apache.org>
Subject: Re: error while creating HiveContext

Hi,

I think you just need to put hive-site.xml in the spark/conf directory and it will be loaded onto the Spark classpath.

Best,
Sun.

________________________________
From: Chandra Mohan, Ananda Vel Murugan <ananda.muru...@honeywell.com>
Date: 2015-11-27 15:04
To: user <user@spark.apache.org>
Subject: error while creating HiveContext

Hi,

I am building a spark-sql application in Java. I created a Maven project in Eclipse and added all the dependencies, including spark-core and spark-sql. I create a HiveContext in my Spark program and then try to run SQL queries against my Hive table. When I submit this job to Spark, for some reason it tries to create a Derby metastore, even though my hive-site.xml clearly specifies the JDBC URL of my MySQL metastore. So I think my hive-site.xml is not getting picked up by the Spark program. I specified the hive-site.xml path using the "--files" argument of spark-submit. I also tried placing the hive-site.xml file in my jar. I even tried creating a Configuration object with the hive-site.xml path and updating my HiveContext by calling addResource(). I want to know where I should put the Hive config files (in my jar, in my Eclipse project, or on my cluster) for them to be picked up correctly by my Spark program. Thanks for any help.

Regards,
Anand.C
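
[For readers finding this thread later: Sun's suggestion can be sketched as the shell steps below. This is a minimal sketch, not a definitive fix; it assumes Spark 1.x as used in this thread, that $SPARK_HOME points at the Spark installation on the submitting machine, and that the Hive client config lives at /etc/hive/conf/hive-site.xml. The main class and jar names are hypothetical placeholders.]

```shell
# Hedged sketch: make hive-site.xml visible on the driver's classpath.
# HiveContext looks for hive-site.xml on the classpath at startup; if it
# is not found, Spark falls back to an embedded Derby metastore, which
# matches the symptom described above.

# Assumed location of the Hive client config; adjust for your cluster.
cp /etc/hive/conf/hive-site.xml "$SPARK_HOME/conf/hive-site.xml"

# Sanity check: the file should now sit next to spark-defaults.conf etc.
ls "$SPARK_HOME/conf/hive-site.xml"

# Then submit normally; the config is picked up from spark/conf, so it
# does not need to be bundled into the application jar.
# (com.example.MySparkSqlApp and my-spark-sql-app.jar are hypothetical.)
spark-submit \
  --class com.example.MySparkSqlApp \
  --master yarn-client \
  my-spark-sql-app.jar
```

Note that --files distributes a file to executor working directories, but the driver-side HiveContext is the component that contacts the metastore, so placing the file in spark/conf (or otherwise on the driver classpath) is the usual remedy.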