On 28 June 2016 at 15:27, adaman79 <felix.mas...@codecentric.de> wrote:

Hey guys,

I have a problem with memory because over 90% of the time my Spark driver gets started on one node. I am looking for the ability to define the node the Spark driver will be started on when using spark-submit, or to set it somewhere in the code.

Is this possible? Does anyone else have this kind of problem?

thx and best regards
Felix
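A minimal sketch of one way to influence driver placement, assuming a standalone cluster; the host names, master URL, jar path, class name and memory size below are placeholders, not details from this thread. In client deploy mode the driver runs inside the spark-submit process itself, so whichever machine you launch spark-submit on becomes the driver node:

  # Hypothetical example: log into the machine that should host the driver
  # and launch from there. With --deploy-mode client the driver runs in the
  # spark-submit process on that machine, not on a scheduler-chosen node.
  ssh driver-node-01 \
    spark-submit \
      --master spark://master-host:7077 \
      --deploy-mode client \
      --driver-memory 8g \
      --class com.example.MyApp \
      /path/to/my-app.jar

In cluster deploy mode the scheduler chooses the driver node for you; on YARN there is, if I recall correctly, a node-label based route (spark.yarn.am.nodeLabelExpression) to constrain where the application master, and with it the driver, is placed, but whether that fits this setup is an assumption on my part.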
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Set-the-node-the-spark-driver-will-be-started-tp27244.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.