You may take a look at the bigmemory package, or at the others that deal with
large-memory and out-of-memory data, listed in
https://cran.r-project.org/web/views/HighPerformanceComputing.html#large-memory-and-out-of-memory-data
Some extra explanation is in https://stackoverflow.com/a/11127229/997979
Iago
___
One possibility might be to use Rcpp.
An R matrix is stored in contiguous memory, so it can be treated as a
vector.
Define a C++ function which operates on a vector in place, as in the
following:
library(Rcpp)
cppFunction(
'void subtractConst(NumericVector x, double c) {
  for ( int i = 0; i < x.size(); ++i )
    x[i] = c - x[i];   // in place: computes c - x, matching `100 - mat`
}')
subtractConst(mat, 100)  # the NumericVector aliases mat, so no copy is made
___
I doubt that R's basic matrix capabilities can handle this, but have a look
at the Matrix package, especially if your matrix is of some special form.
Bert
On Tue, Apr 11, 2023, 19:21 Shunran Zhang wrote:
> Hi all,
>
> I am currently working with a quite large matrix that takes 300G of
> memory. My computer only has 512G of memory. [...]
___
Thanks for the info.
For the data type, my matrix as of now is indeed a matrix in a perfect
square shape, filled in a previous segment of code, but I believe I could
extract one row/column at a time to do some processing. I can also change
that previous part of the code to change its data type.
___
The example given does not leave room for even a single copy of your
matrix, so yes, you need alternatives.
Your example was fairly trivial, as all you wanted to do is subtract each
value from 100 and replace it. Obviously something like squaring a matrix
has no trivial way to be done without multiple copies.
___
Hi all,
I am currently working with a quite large matrix that takes 300G of
memory. My computer only has 512G of memory. I would need to do a few
arithmetic operations on it with a scalar value. My current code looks
like this:
mat <- 100 - mat
However, such code quickly uses up all of the remaining memory.