Re: [HACKERS] Using the GPU

2007-06-09 Thread Jeroen T. Vermeulen
On Sat, June 9, 2007 07:36, Gregory Stark wrote: "Billings, John" <[EMAIL PROTECTED]> writes: Does anyone think that PostgreSQL could benefit from using the video card as a parallel computing device? I'm working on a project using Nvidia's CUDA with an 8800 series video card to handle non-graphical algorithms. …

Re: [HACKERS] Using the GPU

2007-06-09 Thread Lukas Kahwe Smith
Gregory Stark wrote: "Billings, John" <[EMAIL PROTECTED]> writes: Does anyone think that PostgreSQL could benefit from using the video card as a parallel computing device? I'm working on a project using Nvidia's CUDA with an 8800 series video card to handle non-graphical algorithms. I'm curious …

Re: [HACKERS] Using the GPU

2007-06-09 Thread Nicolas Barbier
2007/6/9, Gregory Stark <[EMAIL PROTECTED]>: There has been some interesting research on sorting using the GPU which could be very interesting for databases. However I think Postgres would be unlikely to go the route of having compiled driver code for every possible video card. It's unlikely to …
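The GPU-sorting research alluded to here offloads just the sort phase to the card and copies the results back. A minimal sketch of that idea using NVIDIA's Thrust library (which ships with modern CUDA toolkits and postdates this 2007 thread) follows; it is an illustration of the technique, not code proposed for PostgreSQL:

// Sort a batch of keys on the GPU with Thrust -- sketch only.
// Compile with: nvcc -o gpusort gpusort.cu
#include <thrust/host_vector.h>
#include <thrust/device_vector.h>
#include <thrust/sort.h>
#include <thrust/copy.h>
#include <cstdio>
#include <cstdlib>

int main() {
    // Hypothetical sort keys, e.g. what a sort node might hand off.
    thrust::host_vector<int> keys(1 << 20);
    for (size_t i = 0; i < keys.size(); ++i)
        keys[i] = rand();

    thrust::device_vector<int> d_keys = keys;                  // copy to the card
    thrust::sort(d_keys.begin(), d_keys.end());                // sort on the GPU
    thrust::copy(d_keys.begin(), d_keys.end(), keys.begin());  // copy back

    printf("smallest key after sort: %d\n", (int)keys[0]);
    return 0;
}

Thrust hides the per-card details behind one vendor toolkit, which is a partial answer to the "compiled driver code for every possible video card" objection, though it covers only NVIDIA hardware.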

Re: [HACKERS] Using the GPU

2007-06-08 Thread Gregory Stark
"Billings, John" <[EMAIL PROTECTED]> writes: > Does anyone think that PostgreSQL could benefit from using the video > card as a parallel computing device? I'm working on a project using > Nvidia's CUDA with an 8800 series video card to handle non-graphical > algorithms. I'm curious if anyone thi

Re: [HACKERS] Using the GPU

2007-06-08 Thread Vincent Janelle
Subject: [HACKERS] Using the GPU Does anyone think that PostgreSQL could benefit from using the video card as a parallel computing device? I'm working on a project using Nvidia's CUDA with an 8800 series video card to handle non-graphical algorithms. I'm curious if anyone thinks …

[HACKERS] Using the GPU

2007-06-08 Thread Billings, John
Does anyone think that PostgreSQL could benefit from using the video card as a parallel computing device? I'm working on a project using Nvidia's CUDA with an 8800 series video card to handle non-graphical algorithms. I'm curious if anyone thinks that this technology could be used to speed up …
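For a concrete picture of the "non-graphical algorithms" in question, a minimal hand-written CUDA sketch might evaluate a WHERE-clause-style predicate (value > threshold) over a column, one thread per row. Everything here, the kernel, the names, and the data, is a hypothetical illustration, not PostgreSQL code:

// Mark the rows of a column that pass a simple predicate, in parallel.
// Compile with: nvcc -o filter filter.cu
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// One thread per row: match[i] = 1 if col[i] > threshold, else 0.
__global__ void filter_gt(const int *col, int threshold, int *match, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        match[i] = (col[i] > threshold);
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(int);

    // Host-side "column" with made-up values.
    int *h_col = (int *)malloc(bytes);
    int *h_match = (int *)malloc(bytes);
    for (int i = 0; i < n; ++i) h_col[i] = i % 1000;

    // Ship the column across the bus, run the kernel, fetch the results.
    int *d_col, *d_match;
    cudaMalloc(&d_col, bytes);
    cudaMalloc(&d_match, bytes);
    cudaMemcpy(d_col, h_col, bytes, cudaMemcpyHostToDevice);

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    filter_gt<<<blocks, threads>>>(d_col, 500, d_match, n);
    cudaMemcpy(h_match, d_match, bytes, cudaMemcpyDeviceToHost);

    long hits = 0;
    for (int i = 0; i < n; ++i) hits += h_match[i];
    printf("%ld of %d rows matched\n", hits, n);

    cudaFree(d_col); cudaFree(d_match);
    free(h_col); free(h_match);
    return 0;
}

The kernel itself is trivially parallel; the catch visible even in this sketch is the pair of PCIe copies bracketing it, which is why GPU offload tends to pay only when large batches of work amortize the transfer cost.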