Hi Ted
Thanks for the information.
Is there any way for two different Spark applications to share their data?
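For example, would it be reasonable for one application to write its results to
a shared location that a second application then reads? A rough sketch of what
I have in mind (the HDFS path below is just an example):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Application A: writes its output to a shared path
val scA = new SparkContext(new SparkConf().setAppName("app-a"))
val sqlA = new SQLContext(scA)
import sqlA.implicits._
val df = scA.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "value")
df.write.parquet("hdfs:///shared/app_a_output")
scA.stop()

// Application B (submitted separately): reads the same path
val scB = new SparkContext(new SparkConf().setAppName("app-b"))
val sqlB = new SQLContext(scB)
val shared = sqlB.read.parquet("hdfs:///shared/app_a_output")
shared.show()
scB.stop()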

Regards
Prateek

On Fri, Dec 4, 2015 at 9:54 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> See Josh's response in this thread:
>
>
> http://search-hadoop.com/m/q3RTt1z1hUw4TiG1&subj=Re+Question+about+yarn+cluster+mode+and+spark+driver+allowMultipleContexts
>
> Cheers
>
> On Fri, Dec 4, 2015 at 9:46 AM, prateek arora <prateek.arora...@gmail.com>
> wrote:
>
>> Hi
>>
>> I want to create multiple SparkContexts in my application.
>> Many of the articles I have read suggest that "usage of multiple contexts is
>> discouraged, since SPARK-2243 is still not resolved."
>> Does Spark 1.5.0 support creating multiple contexts without error?
>> And if it does, do we need to set the
>> "spark.driver.allowMultipleContexts" configuration parameter?
>>
>> Regards
>> Prateek
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/is-Multiple-Spark-Contexts-is-supported-in-spark-1-5-0-tp25568.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>
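P.S. Regarding the quoted question about "spark.driver.allowMultipleContexts",
this is only a sketch of what setting that parameter would look like, not
something I have verified to be safe, since SPARK-2243 is still open:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("multi-context-test")
  .set("spark.driver.allowMultipleContexts", "true")

val sc1 = new SparkContext(conf)
// Without the flag above, constructing a second context in the same JVM
// throws an exception; with it set, Spark should only log a warning.
val sc2 = new SparkContext(conf)

// ... use sc1 / sc2 ...

sc2.stop()
sc1.stop()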
