dear R experts.  about once a year, I wonder where GPU computing stands.
this is of course a quickly moving field, so I am hoping to elicit some
advice from those who have followed the developments more actively.
fortunately, unlike many of my other cries for help, this post is not
critical.  it is mere curiosity.


Integration: for my purposes, I want to substitute GPU versions for the
basic statistical functions:

   mean, var, sd, lm, summary.lm, sort, order, quantile  (and maybe
lapply over plain arithmetic functions).

one of these functions already has a GPU counterpart in gputools (gpuLm).
I did not see the others, but some functions (like mean) may well be used
internally.  is there an R GPU library that can swap in GPU versions of
these eight functions "seamlessly and transparently"?  a sketch of what I
mean follows.


Benchmarks: I wonder what the typical speedups for these functions are on
mainstream systems---say, a $300 CPU vs. a $200 GPU.  if it is a factor of
20, it is well worth it *for me*.  if it is only a factor of 5, I may as
well use mclapply for most of *my* problems (which split nicely across a
quad-core CPU).  a rough timing sketch is below.
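
for anyone who wants to produce numbers on their own box, this is roughly
the micro-benchmark I have in mind (it assumes gputools is installed and a
CUDA device is visible, and that gpuLm accepts the same formula/data call
as lm):

    ## rough timing sketch: CPU lm() vs. gputools::gpuLm() on a large
    ## synthetic regression (n = 10^6 observations, k = 20 regressors)
    library(gputools)
    n <- 1e6; k <- 20
    X <- matrix(rnorm(n * k), n, k)
    y <- drop(X %*% rnorm(k)) + rnorm(n)
    d <- data.frame(y = y, X)
    system.time(fit.cpu <- lm(y ~ ., data = d))      # CPU baseline
    system.time(fit.gpu <- gpuLm(y ~ ., data = d))   # GPU candidate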

is there still a fixed overhead cost for shuttling data between CPU and
GPU calculations?  I have read that some modern GPUs offer a unified
memory space, so I am hoping the data-copy problem is largely gone; the
small probe below is how I would try to measure it.
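
to isolate the fixed overhead, one could time the same operation at
several sizes; at small n the transfer and setup cost should dominate.
this assumes gputools::gpuMatMult(), which I understand to be the GPU
analogue of %*%:

    ## crude overhead probe: if GPU times at small n stay flat while CPU
    ## times are near zero, the gap is mostly transfer/setup cost
    library(gputools)
    for (n in c(100, 1000, 4000)) {
        A <- matrix(rnorm(n * n), n, n)
        cat("n =", n, "\n")
        print(system.time(A %*% A))           # CPU multiply
        print(system.time(gpuMatMult(A, A)))  # GPU multiply + copy in/out
    }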

/iaw
----
Ivo Welch (ivo.we...@gmail.com)
