To add more information:

These are the settings in my spark-env.sh:
[root@ES01 conf]# grep -v "#" spark-env.sh
SPARK_EXECUTOR_CORES=1
SPARK_EXECUTOR_INSTANCES=1
SPARK_DAEMON_MEMORY=4G

So I did not configure the executor to use more memory here.
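(For reference, none of the lines above caps the executor heap, so the executor falls back to the default for spark.executor.memory, which is 1g. A sketch of setting it explicitly in spark-env.sh; the 2g value is just an example, not from my setup:)

```shell
# spark-env.sh: cap the executor JVM heap explicitly.
# If unset, spark.executor.memory defaults to 1g; 2g is an example value.
SPARK_EXECUTOR_MEMORY=2g
```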

Also, here is the top output:

KiB Mem : 16268156 total,   161116 free, 15213076 used,   893964 buff/cache
KiB Swap:  6291452 total,  3332460 free,  2958992 used.   238788 avail Mem 

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
 6629 root      20   0 49.990g 0.011t   5324 S   0.0 72.1  78:28.99 java


As you can see, process 6629, which is the executor, is using 72% of memory (%MEM).
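(As a quick sanity check, using only the numbers from the top snippet above, the rounded RES value is consistent with that %MEM figure:)

```python
# Cross-check the top output: RES of 0.011t (TiB) against
# "KiB Mem : 16268156 total" and the reported %MEM of 72.1.
total_kib = 16268156           # total physical memory in KiB, from top
res_kib = 0.011 * 1024 ** 3    # RES of 0.011 TiB expressed in KiB
pct = 100 * res_kib / total_kib
print(f"{pct:.1f}% of RAM")    # ~72.6%; matches top's 72.1 given RES rounding
```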


So I wonder why it is causing such high memory usage.

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Why-I-have-memory-leaking-for-such-simple-spark-stream-code-tp26904p26910.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

