Hi Pawel,

That doc dates back to the early days of Spark (0.8.0); you can of course create multiple SparkContexts in the same driver program now.
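For what it's worth, here is a minimal sketch of what that looks like. This assumes a Spark release recent enough to honor the experimental `spark.driver.allowMultipleContexts` setting (without it, constructing a second active SparkContext in the same JVM throws an exception); the master URL and app names are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: two contexts pointing at the same (hypothetical) master.
// The allowMultipleContexts flag is experimental and off by default.
val conf1 = new SparkConf()
  .setMaster("spark://master:7077")
  .setAppName("app-one")
  .set("spark.driver.allowMultipleContexts", "true")
val sc1 = new SparkContext(conf1)

val conf2 = new SparkConf()
  .setMaster("spark://master:7077")
  .setAppName("app-two")
  .set("spark.driver.allowMultipleContexts", "true")
val sc2 = new SparkContext(conf2)

// Each context submits its own jobs independently.
println(sc1.parallelize(1 to 10).count())
println(sc2.parallelize(1 to 100).count())

sc1.stop()
sc2.stop()
```

Note that even with the flag set, keeping a single SparkContext per driver is generally the safer pattern.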
Thanks
Best Regards

On Thu, Nov 6, 2014 at 9:30 PM, Paweł Szulc <paul.sz...@gmail.com> wrote:
> Hi,
>
> Quick question: I found this:
> http://docs.sigmoidanalytics.com/index.php/Problems_and_their_Solutions#Multiple_SparkContext:Failed_to_bind_to:.2F127.0.1.1:45916
>
> My main question: is this constraint still valid? Am I not allowed to have
> two SparkContexts pointing to the same Spark Master in one driver program?
>
> Regards,
> Pawel Szulc