On Jul 6, 2012, at 1:39 PM, C W wrote:
Quick question: what is the memory limit in R?
I converted the data to CSV, but only 53,300 of the 1,000,000 rows were read in. Did R run out of memory? If so, is there a workaround?
You probably have mismatched quotes. Consider using quote = "". Also consider doing this:

table(count.fields("file-name.csv"))  # with a valid file name

The count.fields function is very useful since it accepts the same arguments as the read.table family of functions, with defaults of:

quote = "\"'", skip = 0, blank.lines.skip = TRUE, comment.char = "#"
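To make the diagnostic concrete, here is a minimal, self-contained sketch (the file is a temporary one created just for illustration): a CSV where one line carries an extra field, which table(count.fields(...)) exposes immediately.

```r
# Build a small CSV in a temporary file; one line has an extra field.
tmp <- tempfile(fileext = ".csv")
writeLines(c("a,b,c",
             "1,2,3",
             "4,5,6,7",   # malformed row: four fields instead of three
             "8,9,10"),
           tmp)

# count.fields() reports the number of fields on each line; tabulating
# the counts makes any inconsistent lines stand out.
tbl <- table(count.fields(tmp, sep = ","))
print(tbl)  # three lines with 3 fields, one line with 4
```

If all rows are well-formed, the table has a single entry; a second entry (or NA counts from quotes spanning lines) points at exactly the kind of malformed input that makes read.table stop short.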
--
David.
Thanks,
Mike
On Fri, Jul 6, 2012 at 1:24 PM, Duncan Murdoch <[email protected]> wrote:
On 06/07/2012 1:11 PM, C W wrote:
Hi all,
I have a large SAS data set; how do I get it read into R? The data is too big (about 400,000 rows by 100 columns) to be saved as an Excel file. How should I read it into R? Any packages? I don't seem to find any.
You could write it out in some plain delimited format, e.g. CSV or tab-delimited. Watch out for special characters in strings that confuse R when it reads the file back in (e.g. commas in unquoted CSV strings, quotes within strings, etc.).
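A minimal sketch of the kind of defensive read this suggests, using a throwaway temporary file standing in for the SAS export: quote = "" stops stray quotes from silently merging records, and comment.char = "" keeps a literal '#' inside a field from truncating the line.

```r
# Simulate a CSV export containing the troublesome characters mentioned
# above: a stray double quote and a literal '#' inside fields.
tmp <- tempfile(fileext = ".csv")
writeLines(c("id,name,score",
             '1,O"Brien,10',   # stray double quote inside a field
             "2,item #7,20"),  # literal '#' inside a field
           tmp)

# Defensive settings: disable quote processing and comment handling so
# every physical line becomes exactly one record.
dat <- read.table(tmp, header = TRUE, sep = ",",
                  quote = "", comment.char = "",
                  stringsAsFactors = FALSE)
nrow(dat)  # both data rows survive
```

For a real export, you would replace tmp with the path to the delimited file written from SAS, and check nrow() against the row count SAS reports.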
Duncan Murdoch
______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
David Winsemius, MD
West Hartford, CT