Hi. I'm running into an OutOfMemoryError when broadcasting a large array. What is the best way to handle this?

Should I split the array into smaller arrays, broadcast each one, and then combine them locally on each node? (A rough sketch of what I mean is below.)
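
For reference, here is roughly what I had in mind, just a sketch in Scala; the sizes, names, and the "sum" work are placeholders, not my real job:

import org.apache.spark.{SparkConf, SparkContext}

object BroadcastChunks {
  def main(args: Array[String]): Unit = {
    // local master just so the sketch runs standalone
    val conf = new SparkConf().setAppName("broadcast-chunks").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // stand-in for the large array that currently OOMs when broadcast in one piece
    val bigArray: Array[Double] = Array.tabulate(10000000)(_.toDouble)

    // split into fixed-size chunks and broadcast each chunk separately
    val chunkSize = 1000000
    val chunkBroadcasts = bigArray.grouped(chunkSize).toSeq.map(sc.broadcast(_))

    val rdd = sc.parallelize(1 to 100)
    val result = rdd.map { i =>
      // use the chunks locally on each executor; summing here is just a
      // placeholder for whatever per-record lookup/combination is needed
      chunkBroadcasts.map(_.value.sum).sum + i
    }
    println(result.take(5).mkString(", "))

    sc.stop()
  }
}

I'm not sure this actually buys anything memory-wise if every chunk still ends up resident on every executor, which is basically my question.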

Thanks!


