Thanks. I'll try that. Hopefully it works.
On Mon, Jul 4, 2016 at 9:12 PM, Mathieu Longtin wrote:
> I started with a download of 1.6.0. These days, we use a self-compiled 1.6.2.
>
> On Mon, Jul 4, 2016 at 11:39 AM Ashwin Raaghav wrote:
>
>> I am thinki[...]
[...] Mathieu Longtin wrote:
> 1.6.1.
>
> I have no idea. SPARK_WORKER_CORES should do the same.
>
> On Mon, Jul 4, 2016 at 11:24 AM Ashwin Raaghav wrote:
>
>> Which version of Spark are you using? 1.6.1?
>>
>> Any ideas as to why it is not working in ours?
>>
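(Side note for the archive: SPARK_WORKER_CORES is a standalone-worker setting, normally exported from conf/spark-env.sh on each worker host. A minimal sketch, assuming a standalone 1.6.x install under $SPARK_HOME with the default script layout:)

    # conf/spark-env.sh on each worker host (standalone mode assumed).
    # Caps the cores this worker advertises, which limits how many tasks
    # (and therefore forked Python workers) can run on it at once.
    export SPARK_WORKER_CORES=1

    # Restart the workers (run from the master) so the value is picked up.
    $SPARK_HOME/sbin/stop-slaves.sh
    $SPARK_HOME/sbin/start-slaves.sh
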
Which version of Spark are you using? 1.6.1?
Any ideas as to why it is not working in ours?
On Mon, Jul 4, 2016 at 8:51 PM, Mathieu Longtin wrote:
> 16.
>
> On Mon, Jul 4, 2016 at 11:16 AM Ashwin Raaghav wrote:
>
>> Hi,
>>
>> I tried what you suggested[...]
> [...]e per server. However, it seems it will start as many pyspark processes as there are cores, but maybe not use them.
>
> On Mon, Jul 4, 2016 at 10:44 AM Ashwin Raaghav wrote:
>
>> Hi Mathieu,
>>
>> Isn't that the same as setting "spark.executor.cores" to 1? An[...]
>> [...]aemons process is still not coming down. It looks like initially
>> there is one pyspark.daemons process and this in turn spawns as many
>> pyspark.daemons processes as the number of cores in the machine.
>>
>> Any help is appreciated :)
>>
>> Thanks,
>> Ashwin Raaghav
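(A note on the behaviour Ashwin describes above: as far as I understand it, each executor launches a single pyspark.daemon, and that daemon forks one Python worker per task slot, so the number of forks tracks the cores the executor is allowed to use. A rough sketch of capping that from the submit side; the master URL and app.py are placeholders, and standalone mode is assumed:)

    # Cap each executor at a single task slot so its daemon forks at most
    # one Python worker per executor.
    spark-submit \
      --master spark://master-host:7077 \
      --conf spark.executor.cores=1 \
      app.py

    # Rough check on a worker host: count the daemon plus forked workers
    # (forked children keep "pyspark.daemon" in their command line).
    ps aux | grep -c '[p]yspark.daemon'
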
--
Regards,
Ashwin Raaghav