My bad, I just fired up a spark-shell and created a new SparkContext, and it
was working fine. I basically did a parallelize and collect with both
SparkContexts.
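A minimal sketch of that test, assuming spark-shell's built-in sc plus a
second context created by hand (the app name and local[*] master below are
assumptions, not taken from this thread):

import org.apache.spark.{SparkConf, SparkContext}

// second context alongside the shell's existing sc
// note: later Spark versions may require spark.driver.allowMultipleContexts=true
val conf = new SparkConf().setAppName("second-context").setMaster("local[*]")
val sc2 = new SparkContext(conf)

// parallelize and collect with both contexts
println(sc.parallelize(1 to 10).collect().mkString(","))
println(sc2.parallelize(1 to 10).collect().mkString(","))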
Thanks
Best Regards
On Fri, Nov 7, 2014 at 3:17 PM, Tobias Pfeiffer wrote:
Hi,
On Fri, Nov 7, 2014 at 4:58 PM, Akhil Das wrote:
>
> That doc was created during the initial days (Spark 0.8.0); you can of
> course create multiple SparkContexts in the same driver program now.
>
You sure about that? According to
http://apache-spark-user-list.1001560.n3.nabble.com/Is-spark-
On Fri, Nov 7, 2014 at 4:58 PM, Akhil Das wrote:
Hi Pawel,
That doc was created during the initial days (Spark 0.8.0); you can of
course create multiple SparkContexts in the same driver program now.
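For illustration, a hypothetical sketch of that claim with two contexts
pointing at the same master (the master URL and app names are invented, and
whether a second active context really works depends on the Spark version):

import org.apache.spark.{SparkConf, SparkContext}

val master = "spark://master-host:7077"  // assumed standalone master URL
val sc1 = new SparkContext(new SparkConf().setAppName("app-1").setMaster(master))
val sc2 = new SparkContext(new SparkConf().setAppName("app-2").setMaster(master))

// each context submits work independently
println(sc1.parallelize(1 to 100).count())
println(sc2.parallelize(1 to 100).count())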
Thanks
Best Regards
On Thu, Nov 6, 2014 at 9:30 PM, Paweł Szulc wrote:
Hi,
Quick question: I found this:
http://docs.sigmoidanalytics.com/index.php/Problems_and_their_Solutions#Multiple_SparkContext:Failed_to_bind_to:.2F127.0.1.1:45916
My main question: is this constraint still valid? Am I not allowed to have
two SparkContexts pointing to the same Spark Master in one driver program?