Dear all,
We've tried to use Spark SQL to insert from table A into table B.
Using the exact same SQL script, Hive is able to finish the job,
but Spark 1.3.1 always ends with an OOM error.
We tried several configurations, including:
--executor-cores 2
--num-executors 300
--executor-memory 7g
sconf.set("spark.storage.memoryFraction", "0")
but none of them changes the error:
java.lang.OutOfMemoryError: GC overhead limit exceeded
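
For reference, here is a minimal sketch of how the job and the settings above
are wired up. This assumes a HiveContext-based job submitted to YARN; the
table names (A, B) and the INSERT ... SELECT statement are simplified
placeholders for our actual script.

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.hive.HiveContext

  object InsertAIntoB {
    def main(args: Array[String]): Unit = {
      // Mirrors the spark-submit flags listed above; with spark-submit these
      // would normally be passed on the command line rather than set here.
      val conf = new SparkConf()
        .setAppName("insert-A-into-B")
        .set("spark.executor.cores", "2")
        .set("spark.executor.instances", "300") // same as --num-executors on YARN
        .set("spark.executor.memory", "7g")
        .set("spark.storage.memoryFraction", "0")

      val sc = new SparkContext(conf)
      val sqlContext = new HiveContext(sc)

      // Placeholder for the actual script: an insert-select from table A into table B
      sqlContext.sql("INSERT INTO TABLE B SELECT * FROM A")

      sc.stop()
    }
  }
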
Is there any other configuration we can try? Thanks!