You must be getting a warning at the start of the application like: Warning:
Ignoring non-spark config property: runtime.environment=passInValue

Configs in Spark must start with the *spark* prefix. So try something like
--conf spark.runtime.environment=passInValue instead.
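
As a minimal sketch of the fix (keeping your original class name and assuming the same Spark 1.x-style SparkContext from your snippet), the only changes are the `spark.` prefix on both the submit command and the `getConf.get` key:

```scala
import org.apache.spark.SparkContext

object myCode {
  def main(args: Array[String]): Unit = {
    val sparkContext = new SparkContext()
    // Keys passed via --conf survive into SparkConf only when they
    // start with the "spark." prefix; others are dropped with a warning.
    val runtimeEnvironment =
      sparkContext.getConf.get("spark.runtime.environment", "default")
    Console.println("load properties from runtimeEnvironment: " + runtimeEnvironment)
    sparkContext.stop()
  }
}
```

submitted as:

/opt/spark/bin/spark-submit --class myCode --conf spark.runtime.environment=passInValue my.jar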

Regards
Varun

On Thu, Nov 12, 2015 at 9:51 PM, java8964 <java8...@hotmail.com> wrote:

> In my Spark application, I want to access the passed-in configuration, but
> it doesn't work. How should I do that?
>
> object myCode extends Logging {
>   // starting point of the application
>   def main(args: Array[String]): Unit = {
>     val sparkContext = new SparkContext()
>     val runtimeEnvironment = sparkContext.getConf.get("runtime.environment", "default")
>     Console.println("load properties from runtimeEnvironment: " + runtimeEnvironment)
>     logInfo("load properties from runtimeEnvironment: " + runtimeEnvironment)
>     sparkContext.stop()
>   }
> }
>
>
> /opt/spark/bin/spark-submit --class myCode --conf runtime.environment=passInValue my.jar
>
> load properties from runtimeEnvironment: default
>
>
> It looks like I cannot access a dynamically passed-in value from the
> command line this way.
>
> In Hadoop, the Configuration object includes all the key/value pairs
> passed in to the application. How can I achieve that in Spark?
>
> Thanks
>
> Yong
>
>


-- 
*VARUN SHARMA*
*Flipkart*
*Bangalore*
