Hi,
It may be that R is simply reading your file and taking time, which depends on the
size of the file you are reading.
Please explore the 'data.table' package to read big files in a few seconds.
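For example, a minimal sketch of data.table's fread() (the temporary file here is a small stand-in for a real, much larger csv):

```r
library(data.table)

# Small stand-in for the real large csv file
tmp <- tempfile(fileext = ".csv")
write.csv(data.frame(x = 1:5, y = letters[1:5]), tmp, row.names = FALSE)

# fread() auto-detects the separator and column types and is usually
# much faster than read.csv() on files with millions of rows
dt <- fread(tmp)
```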
If you attempt to close the application while execution has been in progress
for some time, it will also take a while to stop. Without more details it is
hard to give a definite answer.
Petr
> -Original Message-
> From: R-help [mailto:r-help-boun...@r-project.org] On Behalf Of SHIVI
> BHATIA
> Sent: Wednesday, February 17, 2016 10:16 AM
> To: r-help@r-project.org
> Subject: [R] R Memory Issue
>
> Dear Team,
>
>
Dear Team,
Every now and then I face some weird issues with R. For instance, it will not
read my csv file (or respond to any other read.table command), but once I close
the session and reopen it, everything works fine.
I have tried using rm(list=ls()) and gc() to free some memory and restart R.
Also
Hi -
I also posted this on r-sig-ecology to little fanfare, so I'm trying
here. I've recently hit an apparent R issue that I cannot resolve (or
understand, actually).
I am using the quantreg package (quantile regression) to fit a vector
of quantiles to a dataset, approx 200-400 observation
What are you going to do with the table after you write it out? Are
you just going to read it back into R? If so, have you tried using
'save'?
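If so, a rough sketch of save()/load() (the object and file names here are illustrative):

```r
# 'all' stands in for the large data frame from the original post
all <- data.frame(x = 1:10, y = (1:10) / 2)

# save() writes a compact binary image, which is usually much faster
# to write and to read back than a text file from write.table()
f <- tempfile(fileext = ".RData")
save(all, file = f)

# load() restores the object under its original name
rm(all)
load(f)
```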
On Tue, Apr 15, 2008 at 12:12 PM, Xiaojing Wang <[EMAIL PROTECTED]> wrote:
> Hello, all,
>
> First thanks in advance for helping me.
>
> I am now handlin
Try to write the data.frame to file in blocks of rows by calling
write.table() multiple times - see argument 'append' for
write.table(). That will probably require less memory.
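A minimal sketch of the blocked approach (the data frame and chunk size here are stand-ins; tune 'chunk' to available memory):

```r
# 'all' stands in for the large data frame
all <- data.frame(x = 1:100, y = (1:100) * 2)
f <- tempfile()
chunk <- 25  # rows per block

for (start in seq(1, nrow(all), by = chunk)) {
  rows <- start:min(start + chunk - 1, nrow(all))
  write.table(all[rows, , drop = FALSE], file = f,
              append    = start > 1,   # append all blocks after the first
              col.names = start == 1,  # write the header only once
              row.names = FALSE, sep = "\t", quote = FALSE)
}
```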
/Henrik
On Tue, Apr 15, 2008 at 6:12 PM, Xiaojing Wang <[EMAIL PROTECTED]> wrote:
> Hello, all,
>
> First thanks in ad
Hi Xiaojing,
That's a big table!
You might try 'write' (you'll have to work harder to get your data into
an appropriate format).
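Roughly, since write() only handles atomic vectors, the data would first need to become a numeric matrix (the names here are illustrative):

```r
# write() outputs an atomic vector; transposing with t() makes the
# values come out in the file in row order
m <- matrix(1:12, nrow = 4)   # stand-in for the numeric data
f <- tempfile()
write(t(m), file = f, ncolumns = ncol(m), sep = "\t")
```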
You might also try the R-2.7 release candidate, which I think is
available here
http://r.research.att.com/
for the mac. There was a change in R-2.7 that will make
Hello, all,
First thanks in advance for helping me.
I am now handling a data frame with dimensions 11095400 rows by 4 columns. It
seems to work perfectly in R on my Mac (Mac Pro, Intel chip with 4 GB RAM),
until I try to write this file out using the command:
write.table(all,file="~/Desktop/alex.lgen",s