Re: [R] memory issue

2017-05-02 Thread Jeff Newmiller
Suggestions...
- Post plain text (you reduce your own chances of getting feedback by failing to do this in your email program)
- Provide sample data and code
- Buy more RAM
- Use the data.table package and fread
- Load and analyze subsets of the data
- Put the data into a database (e.g. sqlite?)
If these sugge…
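A minimal sketch of the data.table suggestion above, assuming a hypothetical file name "big.csv" and column names that are not from the thread:

    library(data.table)
    dt  <- fread("big.csv")                            # much faster and leaner than read.table
    sub <- fread("big.csv", select = c("id", "value")) # or pull in only the columns you need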

[R] memory issue

2017-05-02 Thread Amit Sengupta via R-help
Hi, I am unable to read a 2.4 GB file into a table (using read.table) in a 64-bit R environment. Do you have any suggestions? Amit

Re: [R] R Memory Issue

2016-02-17 Thread Sandeep Rana
Hi, maybe it is still reading your file, which takes time depending on the size of the file you are reading. Please explore the 'data.table' package to read big files in a few seconds. If you attempt to close the application while execution has been in progress for some time, it will take time most of the…

Re: [R] R Memory Issue

2016-02-17 Thread PIKAL Petr
…ring definite answer. Petr > -----Original Message----- > From: R-help [mailto:r-help-boun...@r-project.org] On Behalf Of SHIVI BHATIA > Sent: Wednesday, February 17, 2016 10:16 AM > To: r-help@r-project.org > Subject: [R] R Memory Issue > Dear Team, …

[R] R Memory Issue

2016-02-17 Thread SHIVI BHATIA
Dear Team, Every now and then I face some weird issues with R. For instance, it will not read my csv file or any other read.table command, and once I close the session and reopen it again it works fine. I have tried using rm(list=ls()) & gc() to free some memory and restarting R. Also…

Re: [R] Memory issue with svm modeling in R

2012-10-23 Thread Jessica Streicher
Well, I'm no expert on these topics, but if it's 2.7 GB and R can use at most 2 GB, then the easiest solution would be giving R more memory. Did you read through help(memory.size) as the error suggested? Try calling memory.size(T) or memory.limit(3000) and see if it works. I don't have any ex…
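For reference, the two calls named above look like this (both are Windows-only, and were removed in R 4.2; the values are illustrative):

    memory.size(max = TRUE)    # largest amount of memory obtained from the OS so far
    memory.limit(size = 3000)  # raise the limit to roughly 3 GB, if the OS allows it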

Re: [R] Memory issue with svm modeling in R

2012-10-22 Thread Jessica Streicher
Hello Vignesh, we did not get any attachments; maybe you could upload them somewhere? On 19.10.2012, at 09:46, Vignesh Prajapati wrote: > As I found the memory problem with local machine/micro instance (Amazon) for > building an SVM model in R on a large dataset (201,478 rows with 11 variables), > the…

[R] Memory issue with svm modeling in R

2012-10-19 Thread Vignesh Prajapati
As I found a memory problem with my local machine/micro instance (Amazon) for building an SVM model in R on a large dataset (201,478 rows with 11 variables), I migrated our micro instance to a large instance at Amazon. I still have a memory issue with the large Amazon instance while developing the R model f…
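A common workaround (mine, not from the thread) is to gauge the fit on a subsample first, since SVM training costs grow steeply with the number of rows; 'dat' and 'y' below are hypothetical stand-ins for the poster's data frame and response column:

    library(e1071)
    idx <- sample(nrow(dat), 20000)      # fit on a 20k-row subsample first
    fit <- svm(y ~ ., data = dat[idx, ]) # scale up only if memory allows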

Re: [R] Memory issue. XXXX

2012-03-02 Thread Prof Brian Ripley
On 02/03/2012 23:36, steven mosher wrote: > 1. How much RAM do you have (looks like 2GB). If you have more than 2GB then you can allocate more memory with memory.size() Actually, this looks like 32-bit Windows (unstated), so you cannot. See the rw-FAQ for things your sysadmin can do even…

Re: [R] Memory issue. XXXX

2012-03-02 Thread steven mosher
1. How much RAM do you have (looks like 2GB)? If you have more than 2GB then you can allocate more memory with memory.size(). 2. If you have 2GB or less then you have a couple of options: a) make sure your session is clean of unnecessary objects; b) don't read in all the data if you don't…
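Option (b) can be done by reading the file in blocks; a minimal sketch, assuming the comma-separated "arrears.csv" from the original post:

    con <- file("arrears.csv", open = "r")
    hdr <- strsplit(readLines(con, n = 1), ",")[[1]]  # grab the header line
    repeat {
      chunk <- tryCatch(
        read.csv(con, header = FALSE, nrows = 1e5, col.names = hdr),
        error = function(e) NULL)                     # no rows left to read
      if (is.null(chunk)) break
      ## ... summarise or filter the chunk here, keeping only what you need ...
      if (nrow(chunk) < 1e5) break
    }
    close(con)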

Re: [R] Memory issue. XXXX

2012-03-02 Thread Sarah Goslee
Let's see... You could delete objects from your R session. You could buy more RAM. You could see help(memory.size). You could try googling to see how others have dealt with memory management in R, a process which turns up useful information like this: http://www.r-bloggers.com/memory-management-in…

[R] Memory issue. XXXX

2012-03-02 Thread Dan Abner
Hi everyone, Any ideas on troubleshooting this memory issue?

> d1 <- read.csv("arrears.csv")
Error: cannot allocate vector of size 77.3 Mb
In addition: Warning messages:
1: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In class(data) <- "data.frame" …

Re: [R] Memory Issue

2010-08-24 Thread Cuckovic Paik
Thanks for the constructive comments. I was very careful when I wrote the code. I wrote many functions and then wrapped them up into a single function. Originally, I used optim() to get the MLE; it was at least 10 times slower than the code based on Newton's method. I also vectorized all objects whenever possib…

Re: [R] Memory Issue

2010-08-23 Thread Dennis Murphy
Hi: Are you running 32-bit or 64-bit R? For memory-intensive processes like these, 64-bit R is almost a necessity. You might also look into more efficient ways to invert the matrix, especially if it has special properties that can be exploited (e.g., symmetry). More to the point, you want to compu…
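A small illustration of that point (my example, not the poster's code): solving the linear system directly beats forming an explicit inverse, and symmetry can be exploited through a Cholesky factor:

    A <- crossprod(matrix(rnorm(1000 * 1000), 1000))  # symmetric positive definite
    b <- rnorm(1000)
    x_bad  <- solve(A) %*% b  # forms the full inverse: avoid
    x_good <- solve(A, b)     # solves A x = b directly: preferred
    R <- chol(A)              # A = R'R, so two triangular solves suffice
    x_chol <- backsolve(R, forwardsolve(t(R), b))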

[R] Memory Issue

2010-08-23 Thread Cuckovic Paik
Dear All, I have an issue with memory use in R programming. Here is the brief story: I want to simulate the power of a nonparametric test and compare it with the existing tests. The basic steps are: 1. I need to use Newton's method to obtain the nonparametric MLE, which involves the inversion of a l…

Re: [R] Memory issue

2010-05-05 Thread kMan
Dear Alex, Has manual garbage collection had any effect? Sincerely, KeithC. -----Original Message----- From: Alex van der Spek [mailto:do...@xs4all.nl] Sent: Wednesday, May 05, 2010 3:48 AM To: r-help@r-project.org Subject: [R] Memory issue Reading a flat text file 138 Mbyte large into R with…

Re: [R] Memory issue

2010-05-05 Thread Alex van der Spek
Thank you all, No offense meant. I like R tremendously but I admit I am only a beginner. I did not know about gc(), but it explains my confusion about rm() not doing what I expected it to do. I suspected that .RData was a compressed file. Thanks for the confirmation. As for Windows, unfortun…

Re: [R] Memory issue

2010-05-05 Thread Prof Brian Ripley
On Wed, 5 May 2010, Alex van der Spek wrote: > Reading a flat text file 138 Mbyte large into R with a combination of scan (to get the header) and read.table. After conversion of text time stamps to POSIXct and conversion of integer codes to factors I convert everything into one data frame and re…

[R] Memory issue

2010-05-05 Thread Alex van der Spek
Reading a flat text file 138 MB large into R with a combination of scan (to get the header) and read.table. After conversion of text time stamps to POSIXct and conversion of integer codes to factors, I convert everything into one data frame and release the old structures containing the data b…
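One way to trim the peak memory of such a load (my sketch; the file name and column layout are made up) is to declare column types up front so read.table skips its type-guessing copies, then convert in place:

    d <- read.table("flat.txt", skip = 1,  # header already consumed via scan()
                    colClasses = c("character", "integer", "numeric"),
                    col.names  = c("stamp", "code", "value"))
    d$stamp <- as.POSIXct(d$stamp, format = "%Y-%m-%d %H:%M:%S")
    d$code  <- factor(d$code)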

[R] R memory issue / quantreg

2010-01-28 Thread Dan Rabosky
Hi - I also posted this on r-sig-ecology to little fanfare, so I'm trying here. I've recently hit an apparent R issue that I cannot resolve (or understand, actually). I am using the quantreg package (quantile regression) to fit a vector of quantiles to a dataset, approx 200-400 observations…
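For what it's worth, quantreg::rq accepts a whole vector of quantiles in one call; a self-contained sketch using the engel data shipped with the package:

    library(quantreg)
    data(engel)
    taus <- seq(0.05, 0.95, by = 0.05)
    fit  <- rq(foodexp ~ income, tau = taus, data = engel)  # one fit per tau
    coef(fit)                                               # coefficient matrix, one column per tau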

Re: [R] Memory issue?

2009-01-28 Thread Ubuntu Diego
I had similar issues with memory occupancy. You should explicitly call gc() to call the garbage collector (free memory routine) after you do rm() of the big objects. D.
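The sequence D. describes, in two lines (the object name is illustrative):

    big <- matrix(0, 1e4, 1e3)  # roughly 76 MB of doubles
    rm(big); gc()               # drop the object, then reclaim the heap space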

Re: [R] Memory issue?

2009-01-27 Thread Paul Hiemstra
Daniel Brewer wrote: > I have a script that sometimes produces the following error: Error in assign(".target", met...@target, envir = envir) : formal argument "envir" matched by multiple actual arguments. Do you think this is a memory issue? I don't know what else it could be as it doesn't alwa…

[R] Memory issue?

2009-01-27 Thread Daniel Brewer
I have a script that sometimes produces the following error: Error in assign(".target", met...@target, envir = envir) : formal argument "envir" matched by multiple actual arguments. Do you think this is a memory issue? I don't know what else it could be as it doesn't always occur even if the sc…

Re: [R] R memory issue for writing out the file

2008-04-15 Thread jim holtman
What are you going to do with the table after you write it out? Are you just going to read it back into R? If so, have you tried using 'save'? On Tue, Apr 15, 2008 at 12:12 PM, Xiaojing Wang <[EMAIL PROTECTED]> wrote: > Hello, all, > > First thanks in advance for helping me. > > I am now handlin…
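Jim's suggestion in code form (file name hypothetical): save() writes a compact binary image that round-trips into R far faster than write.table/read.table:

    save(all, file = "alex.RData")  # 'all' is the poster's data frame
    ## later, in a fresh session:
    load("alex.RData")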

Re: [R] R memory issue for writing out the file

2008-04-15 Thread Henrik Bengtsson
Try to write the data.frame to file in blocks of rows by calling write.table() multiple times - see argument 'append' for write.table(). That will probably require less memory. /Henrik On Tue, Apr 15, 2008 at 6:12 PM, Xiaojing Wang <[EMAIL PROTECTED]> wrote: > Hello, all, > > First thanks in ad…
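A minimal sketch of that block-wise approach, reusing the poster's object 'all' and output path (the block size is illustrative):

    blk    <- 1e6
    starts <- seq(1, nrow(all), by = blk)
    for (i in seq_along(starts)) {
      rows <- starts[i]:min(starts[i] + blk - 1, nrow(all))
      write.table(all[rows, ], file = "~/Desktop/alex.lgen",
                  append    = i > 1,   # append every block after the first
                  col.names = i == 1,  # write the header only once
                  row.names = FALSE, quote = FALSE)
    }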

Re: [R] R memory issue for writing out the file

2008-04-15 Thread Martin Morgan
Hi Xiaojing, That's a big table! You might try 'write' (you'll have to work harder to get your data into an appropriate format). You might also try the R-2.7 release candidate, which I think is available here http://r.research.att.com/ for the Mac. There was a change in R-2.7 that will make…

[R] R memory issue for writing out the file

2008-04-15 Thread Xiaojing Wang
Hello, all, First, thanks in advance for helping me. I am now handling a data frame of dimension 11095400 rows and 4 columns. It seems to work perfectly in my Mac R (Mac Pro, Intel chip with 4 GB RAM) until I tried to write this file out using the command: write.table(all, file="~/Desktop/alex.lgen", s…

[R] Memory Issue

2007-09-19 Thread Yoni Stoffman
Hi, I'm new to R and there is something I'm missing about how it uses memory. I'm doing a simple query (using the RODBC package), then immediately set the data.frame to NULL, close the connection/channel, and explicitly call the garbage collector (gc()). However, when I look in the task monitor I see bo…
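The sequence described, roughly (the DSN and query are placeholders); note that R rarely hands freed heap back to the OS right away, so the task monitor tends to show the high-water mark rather than current usage:

    library(RODBC)
    ch <- odbcConnect("mydsn")                    # placeholder DSN
    d  <- sqlQuery(ch, "select * from big_table") # placeholder query
    odbcClose(ch)
    rm(d); gc()                                   # frees memory inside R, even if the OS total stays high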