Hi all:

My Spark version is 2.3.2. I start the Thrift Server with the default Spark configuration, and on the other side a Java application runs queries against it over JDBC. The application has a large number of statements to execute. The earlier statements run very quickly, but the later ones get slower and slower.

I tried to watch the 'Thrift JDBC Server' application on the web UI. I kept refreshing the page, but the page responded more and more slowly, and eventually it failed with "GC overhead limit exceeded". I then raised the executor memory in spark-env.sh, and the executors' memory did increase, but the problem is still there.

What puzzles me is that the Thrift JDBC Server application acts as the driver and should only handle code distribution and RPC connections. Does it really need that much memory? If so, how do I increase its memory?
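
For reference, this is roughly what my query application does (the host, credentials, table name, and loop count below are only placeholders, not my real code):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ThriftQueryLoop {
        public static void main(String[] args) throws Exception {
            // The Thrift Server speaks the HiveServer2 protocol, so the Hive JDBC driver is used.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://localhost:10000/default", "user", "")) {
                // Many statements in a row; the later iterations are the slow ones.
                for (int i = 0; i < 1000; i++) {
                    try (Statement stmt = conn.createStatement();
                         ResultSet rs = stmt.executeQuery("SELECT count(*) FROM some_table")) {
                        while (rs.next()) {
                            System.out.println("query " + i + " -> " + rs.getLong(1));
                        }
                    }
                }
            }
        }
    }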
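
And this is the kind of change I made in conf/spark-env.sh before restarting the Thrift Server (the 4g value is just an example of what I tried):

    # conf/spark-env.sh -- memory per executor (example value)
    export SPARK_EXECUTOR_MEMORY=4g

    # restart the Thrift Server so the setting takes effect
    sbin/stop-thriftserver.sh
    sbin/start-thriftserver.sh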
shicheng31...@gmail.com