I did find this:
http://www.andrew.cmu.edu/user/ngm/15-823/project/Draft.pdf
But there are several reasons this seems to be a dead-end route for Postgres:
1) It's limited to in-memory sorts. Speeding up in-memory sorts by a linear
factor seems uninteresting. Anything large enough for a small …
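The point the cut-off sentence is driving at is worth spelling out: once a sort no longer fits in working memory, Postgres (like most databases) falls back to an external merge sort, where disk I/O rather than comparison throughput dominates, so a device that only accelerates the in-memory case buys little. A minimal sketch of the external-merge pattern in Python (illustrative only, not Postgres's actual tuplesort; `chunk_size` and the run-file handling are invented for the example):

```python
import heapq
import tempfile

def _spill(sorted_chunk):
    """Write one sorted run to a temp file, one integer per line."""
    f = tempfile.NamedTemporaryFile('w', delete=False, suffix='.run')
    f.write('\n'.join(str(x) for x in sorted_chunk))
    f.close()
    return f.name

def external_sort(items, chunk_size=4):
    """Classic external merge sort: sort fixed-size runs in memory,
    spill each run to disk, then stream a k-way merge over the runs.
    (Run files are left behind; a real implementation would clean up.)"""
    runs, chunk = [], []
    for item in items:
        chunk.append(item)
        if len(chunk) == chunk_size:
            runs.append(_spill(sorted(chunk)))
            chunk = []
    if chunk:
        runs.append(_spill(sorted(chunk)))
    files = [open(path) for path in runs]
    try:
        # heapq.merge lazily merges the already-sorted runs.
        streams = ((int(line) for line in f) for f in files)
        yield from heapq.merge(*streams)
    finally:
        for f in files:
            f.close()
```

For example, `list(external_sort([5, 2, 9, 1, 7, 3, 8, 6]))` merges two spilled runs back into one sorted stream; only the merge step touches all the data at once, and it is sequential I/O, not a GPU-friendly parallel kernel.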
At 01:26 AM 6/9/2007, Billings, John wrote:
Does anyone think that PostgreSQL could benefit from using the video
card as a parallel computing device? I'm working on a project using
Nvidia's CUDA with an 8800 series video card to handle non-graphical
algorithms. I'm curious if anyone thinks that this technology could be used to speed up a …
If you're absolutely, positively dying for some excuse to do this (i.e.
I don't currently have the budget to pay you anything to do it), I
work in a manufacturing environment where we are using a PostgreSQL
database to store bills of materials for parts. One of the things we
also have to do is …
On Jun 8, 2007, at 3:33 PM, Guy Rouillier wrote:
Well, I'm not one of the developers, and one of them may have this
particular itch, but in my opinion just about any available fish
has to be bigger than this one. Until someone comes out with a
standardized approach for utilizing whatever …
Billings, John wrote:
Does anyone think that PostgreSQL could benefit from using the video
card as a parallel computing device?
Well, I'm not one of the developers, and one of them may have this
particular itch, but in my opinion just about any available fish has
to be bigger than this one.
Does anyone think that PostgreSQL could benefit from using the video
card as a parallel computing device? I'm working on a project using
Nvidia's CUDA with an 8800 series video card to handle non-graphical
algorithms. I'm curious if anyone thinks that this technology could be
used to speed up a database …
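Restating the question: sorting is data-parallel work, which is exactly what CUDA exposes. CUDA itself is C-based, so as a language-neutral sketch of the split/sort/merge pattern a GPU sort would follow, here is the same idea with CPU worker processes standing in for thread blocks (`parallel_sort` and `workers` are names invented for this illustration, not any Postgres or CUDA API):

```python
from multiprocessing import Pool
import heapq

def parallel_sort(data, workers=4):
    """Illustrative data-parallel sort: partition the input into runs,
    sort each run on its own worker (standing in for a GPU thread
    block), then merge the sorted runs back together on the host."""
    if not data:
        return []
    step = max(1, len(data) // workers)
    runs = [data[i:i + step] for i in range(0, len(data), step)]
    with Pool(min(workers, len(runs))) as pool:
        sorted_runs = pool.map(sorted, runs)
    # The final merge is sequential on the host, which is also where a
    # real GPU sort pays its transfer-back cost.
    return list(heapq.merge(*sorted_runs))
```

The design point the thread keeps circling: the parallel phase is the easy part; the host-side merge and the device-to-host copy are where the speedup erodes, especially once the data is too large to live on the card at all.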