Hi,
I don’t see how that would help. We already use the Maven Shade plugin, and this behavior
currently happens in local unit tests.
Pascal
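If shading does not apply to local unit tests, one alternative worth trying is to pin a single Netty version for the whole build via Maven's `dependencyManagement`, so every module and test resolves the same artifact. A minimal sketch, assuming 4.1.8.Final (the version the Neo4j dependency pulls in, per the earlier mail) is the one you want to standardize on:

```xml
<!-- Sketch: force one Netty version across all transitive dependencies.
     4.1.8.Final is an assumption here; pick whichever version both
     Spark and Neo4j can actually run against. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
      <version>4.1.8.Final</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Note this only pins the `netty-all` coordinate; if some dependency pulls in the individual `netty-*` artifacts instead, those would need entries of their own.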
> On 21.08.2017 at 12:58, 周康 wrote:
>
> Use maven shade plugin may help
>
> 2017-08-21 18:43 GMT+08:00 Pascal Stammer <mailto:stam...@deichbr
(DefaultThreadFactory.java:144)
at java.lang.Thread.run(Thread.java:745)
We think this is caused by incompatible versions of Netty. We also have a
transitive Netty dependency via the Neo4j dependencies, which use
io.netty:netty-all:4.1.8.Final … Can anybody provide some help?
Regards,
Pascal Stammer
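To confirm which Netty versions actually end up on the classpath, Maven's dependency tree can be filtered to the `io.netty` group. This is a standard `maven-dependency-plugin` invocation with no project-specific assumptions:

```
# List every io.netty artifact pulled in transitively; verbose mode also
# shows duplicates that Maven's "nearest wins" resolution omitted.
mvn dependency:tree -Dincludes=io.netty -Dverbose=true
```

Any output showing two different Netty versions (e.g. one from Spark, one from Neo4j) confirms the conflict described above.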
logging? I appreciate any help!
Regards,
Pascal Stammer
> e.g. spark-submit --executor-memory 2g
>
>
> Regards,
>
> Takashi
>
>> 2017-07-18 5:18 GMT+09:00 Pascal Stammer :
>>> Hi,
>>>
>>> I am running a Spark 2.1.x application on AWS EMR with YARN and get the
>>> following error, which kills the container:
6.9 GB virtual memory used. Killing container.
I have already changed spark.yarn.executor.memoryOverhead, but the error still occurs.
Does anybody have a hint as to which parameter or configuration I have to
adapt?
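For reference, the overhead can be raised either on the command line or in spark-defaults.conf. A hedged sketch, where the sizes are placeholders (not recommendations) and `your-app.jar` is a hypothetical artifact name:

```
# spark.yarn.executor.memoryOverhead is the Spark 2.1 property name,
# specified in megabytes; it is allocated on top of --executor-memory.
spark-submit \
  --executor-memory 2g \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  your-app.jar
```

Since the quoted error mentions *virtual* memory, it may also be worth checking YARN's `yarn.nodemanager.vmem-check-enabled` setting, which controls whether containers are killed on the virtual-memory limit at all.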
Thank you very much.
Regards,
Pascal Stammer
use reflection at a few
locations.
We now have a few approaches in mind:
1. Ask our domain for simple objects without cyclic references
2. Implement our own Encoders
Are we missing something? We appreciate every hint to resolve this.
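The first approach above (handing Spark a cycle-free view of the domain) can be sketched in plain Java. `Person` and `PersonRow` are hypothetical names; the idea is simply to replace object references with keys so a reflection-based encoder never has to walk a cycle:

```java
public class FlattenExample {
    // Hypothetical cyclic domain object: two persons referencing each other.
    static class Person {
        String name;
        Person friend;
        Person(String name) { this.name = name; }
    }

    // Flat, cycle-free representation: the object reference is replaced
    // by a key (here simply the name), so reflection never sees a cycle.
    static class PersonRow {
        String name;
        String friendName;
        PersonRow(String name, String friendName) {
            this.name = name;
            this.friendName = friendName;
        }
    }

    // Mapping step performed before the data is handed to Spark.
    static PersonRow toRow(Person p) {
        return new PersonRow(p.name, p.friend == null ? null : p.friend.name);
    }

    public static void main(String[] args) {
        Person a = new Person("Alice");
        Person b = new Person("Bob");
        a.friend = b;
        b.friend = a; // cycle in the domain model
        PersonRow row = toRow(a);
        System.out.println(row.name + " -> " + row.friendName); // prints "Alice -> Bob"
    }
}
```

A getter/setter-style bean version of `PersonRow` (with a no-arg constructor) could then be used with Spark's `Encoders.bean`, since it contains no back-references.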
Kind Regards,
Pascal Stammer