John Ladasky wrote:

> What I would REALLY like to do is to take advantage of my GPU.

I can't help you with that, but I would like to point out that GPUs
typically don't support IEEE-754 maths, which means that while they are
likely to be significantly faster, they are also likely to be
significantly less accurate. And any two different brands/models of GPU
are likely to give different results. (Possibly not *very* different,
but considering the mess that floating point maths was in prior to
IEEE-754, possibly *very* different.)

Personally, I wouldn't trust GPU floating point for serious work. Maybe
for quick and dirty exploration of the data, but I'd then want to repeat
any calculations using the main CPU before using the numbers anywhere :-)
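
To give a feel for the size of the effect, here's a quick sketch that
runs entirely on the CPU, using NumPy's float32 as a stand-in for the
single precision that many GPUs use. The exact figures will vary a
little by platform, but the gap between the two is the point:

    import numpy as np

    # Naively accumulate 0.1 a million times; the exact answer is 100000.0.
    total32 = np.float32(0.0)
    total64 = 0.0
    for _ in range(1_000_000):
        total32 += np.float32(0.1)  # single precision, like many GPUs
        total64 += 0.1              # ordinary double precision on the CPU

    print(total64)  # roughly 100000.000001 -- error in the 6th decimal place
    print(total32)  # roughly 100958.34     -- off by nearly 1%

A smarter summation (pairwise, or Kahan compensated) shrinks the
single-precision error considerably, but you don't always get to choose
how a GPU library accumulates.

--
Steve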
> What I would REALLY like to do is to take advantage of my GPU. I can't help you with that, but I would like to point out that GPUs typically don't support IEE-754 maths, which means that while they are likely significantly faster, they're also likely significantly less accurate. Any any two different brands/models of GPU are likely to give different results. (Possibly not *very* different, but considering the mess that floating point maths was prior to IEEE-754, possibly *very* different.) Personally, I wouldn't trust GPU floating point for serious work. Maybe for quick and dirty exploration of the data, but I'd then want to repeat any calculations using the main CPU before using the numbers anywhere :-) -- Steve -- https://mail.python.org/mailman/listinfo/python-list