Hi,

I used the createDataFrame API of SQLContext in Python and I am getting an OutOfMemoryException. I am wondering whether it builds the whole DataFrame in memory? I did not find any documentation describing the memory usage of the Spark APIs. The existing documentation is nice, but a little more detail (especially on memory usage, data distribution, etc.) would really help.
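For reference, here is a minimal sketch of the kind of call I mean (the data, column names, and sizes are only illustrative, assuming the input is a local Python list):

    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="createDataFrame-oom")
    sqlContext = SQLContext(sc)

    # A large local collection -- this already lives entirely in the
    # driver's memory before createDataFrame is called.
    rows = [(i, "value_%d" % i) for i in range(10 * 1000 * 1000)]

    # Does this materialize the whole DataFrame in driver memory,
    # or is the data distributed to the executors?
    df = sqlContext.createDataFrame(rows, ["id", "value"])
    df.count()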
--
Regards,
Harit Vishwakarma