On Saturday, November 5, 2016 at 6:39:52 PM UTC-7, Steve D'Aprano wrote:
> On Sun, 6 Nov 2016 09:17 am, Mr. Wrobel wrote:
> 
> 
> I don't have any experience with GPU processing. I expect that it will be
> useful for some things, but for number-crunching and numeric work, I am
> concerned that GPUs rarely provide correctly rounded IEEE-754 maths. That
> means that they are accurate enough for games where a few visual glitches
> don't matter, but they risk being inaccurate for serious work.
> 
> I fear that doing numeric work on GPUs will be a return to the 1970s, when
> every computer was incompatible with every other computer, and it was
> almost impossible to write cross-platform, correct, accurate numeric code.

Hi Steve,

You, Jason Swails, I, and several others had a discussion about the state of 
GPU arithmetic and IEEE-754 compliance just over a year ago:

https://groups.google.com/forum/#!msg/comp.lang.python/Gt_FzFlES8A/r_3dbW5XzfkJ;context-place=forum/comp.lang.python

It has been very important for the field of computational molecular dynamics 
(and probably several other fields) to get floating-point arithmetic working 
right on GPU architectures.  I don't know anything about other GPU 
manufacturers, but NVidia announced IEEE-754 double-precision arithmetic for 
their GPUs in 2008, and support has been part of the CUDA toolkit since 
version 2.0.
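
For what it's worth, here is a minimal sketch of the kind of sanity check 
people run (my illustration, not code from that thread; it assumes the numba 
package and a CUDA-capable card).  A single float64 addition must be 
correctly rounded on both CPU and GPU, so the two results should be 
bit-identical on compliant hardware:

import numpy as np
from numba import cuda

@cuda.jit
def add_kernel(x, y, out):
    # One IEEE-754 double-precision addition per element.
    i = cuda.grid(1)
    if i < x.size:
        out[i] = x[i] + y[i]

n = 1000000
x = np.random.rand(n)   # numpy defaults to float64
y = np.random.rand(n)
out = np.empty_like(x)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_kernel[blocks, threads_per_block](x, y, out)

# Each add is a single correctly rounded operation on both sides, so
# the arrays should match exactly.  (Compound expressions like a*b + c
# can legitimately differ when the compiler contracts them into a fused
# multiply-add, which is itself a correctly rounded IEEE-754 operation.)
print(np.array_equal(out, x + y))

Note that bitwise agreement is only expected for individual correctly 
rounded operations; once reductions, transcendental functions, or FMA 
contraction enter the picture, small differences are normal and don't 
indicate broken hardware.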

If floating-point math weren't working on GPUs, I suspect that a lot of people 
in the scientific community would be complaining.

Do you have any new information that would lead you to doubt what we said in 
the discussion we had last year?