Hi, I have a question about which object, a list or a data frame, consumes more memory in R. Let me illustrate:
For example, I have a data frame of 6 rows and 12 columns, say 'test'. object.size() reports that it uses 3.3 KB of memory. In a loop I then build a list, say 'testlist', of 6 elements, each element being the data frame 'test' above. The size of this list is 19.9 KB, understandably. Now I combine this list into a single data frame using rbind. The resulting data frame has 12 columns and 36 rows, and its size is just 5.8 KB, almost a 75% reduction in memory.

I had to work with a much larger list and thought of using the same method to convert it into a single data frame. The big list, say LIST A, contains 62 data frames, each with 4 columns and close to 200,000 rows, and has a size of 571 MB. But when I convert it into a data frame, say DF A, using rbind, the object size increases to 1.35 GB. This contradicts the earlier result.

What am I missing? Why is there a 75% reduction in size in one case but a doubling in the other? Any explanation would be appreciated. Sorry for the verbose email; I just wanted to make my case clear.

Thanks in advance,
Regards,
Shivam
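For reference, the small case above can be reproduced along these lines. This is a minimal sketch: the original post does not state the column types of 'test', so the random numeric fill here is an assumption, and the sizes object.size() reports will vary with the actual column types (character columns in particular behave differently because of R's string cache).

```r
## Hypothetical 6 x 12 data frame standing in for 'test'
## (numeric columns assumed; the original types are not given)
test <- as.data.frame(matrix(rnorm(6 * 12), nrow = 6, ncol = 12))
object.size(test)

## List of 6 elements, each the data frame 'test'
testlist <- lapply(1:6, function(i) test)
object.size(testlist)

## Combine the list into one 36 x 12 data frame with rbind
combined <- do.call(rbind, testlist)
object.size(combined)
```

Note that object.size() counts the 6 list elements separately even if they share memory internally, which is one reason the list and the combined data frame can report very different sizes.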