Hi,
On Fri, Oct 5, 2012 at 1:41 PM, Ista Zahn wrote:
> On Fri, Oct 5, 2012 at 12:09 PM, PIKAL Petr wrote:
[snip]
>> If I compute correctly, such a big matrix (20e6*1000) needs about 160 GB
>> just to be in memory. Are you prepared for this?
>
> This is not as outrageous as one might think -- yo
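(Editorial aside, not part of the original thread: the 160 GB figure follows from rows × cols × 8 bytes per double for a dense numeric matrix. A quick sanity check in R, assuming plain `numeric` storage with no overhead:)

```r
# Rough memory footprint of a dense numeric matrix in R:
# each double occupies 8 bytes, so bytes = rows * cols * 8.
rows <- 20e6
cols <- 1000
bytes <- rows * cols * 8

bytes / 1e9     # 160  (GB, decimal units, as quoted above)
bytes / 1024^3  # ~149 (GiB, binary units)
```

The same arithmetic gives the lower end of the poster's range (2e6 rows × 100 cols) as only about 1.6 GB, which is why the answer depends so heavily on which end of the stated range is realistic.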
Hi
> -----Original Message-----
> From: r-help-boun...@r-project.org [mailto:r-help-bounces@r-project.org]
> On Behalf Of Skála, Zdeněk (INCOMA GfK)
> Sent: Friday, October 05, 2012 3:38 PM
> To: r-help@r-project.org
> Subject: [R] R: machine for moderately large data
>
>
> Dear all,
> I would like to ask your advice about a suitable computer for the
> following usage.
> I (am starting to) work with moderately big data in R:
> - approx. 2 - 20 million rows * 100 - 1000 columns (market basket data)
> - mainly clustering, classification trees, association analysis (e.g. libraries