[Theory] How to speed up python code execution / pypy vs GPU

Steve D'Aprano steve+python at pearwood.info
Mon Nov 7 20:23:07 EST 2016


On Tue, 8 Nov 2016 05:47 am, jladasky at itu.edu wrote:

> On Saturday, November 5, 2016 at 6:39:52 PM UTC-7, Steve D'Aprano wrote:
>> On Sun, 6 Nov 2016 09:17 am, Mr. Wrobel wrote:
>> 
>> 
>> I don't have any experience with GPU processing. I expect that it will be
>> useful for some things, but for number-crunching and numeric work, I am
>> concerned that GPUs rarely provide correctly rounded IEEE-754 maths. That
>> means that they are accurate enough for games, where a few visual glitches
>> don't matter, but they risk being inaccurate for serious work.
>> 
>> I fear that doing numeric work on GPUs will mean returning to the 1970s,
>> when every computer was incompatible with every other computer, and it
>> was almost impossible to write cross-platform, correct, accurate numeric
>> code.
> 
> Hi Steve,
> 
> You, Jason Swails, myself, and several others had a discussion about the
> state of GPU arithmetic and IEEE-754 compliance just over a year ago.

I don't know why you think I was part of this discussion -- I made one
comment early in the thread, and took no part in what followed.

If I read any of the subsequent comments in the thread, I don't remember
them.

> https://groups.google.com/forum/#!msg/comp.lang.python/Gt_FzFlES8A/r_3dbW5XzfkJ;context-place=forum/comp.lang.python


For those who dislike GoogleGroups, here's the official archive:

https://mail.python.org/pipermail/python-list/2015-February/686683.html

 
> It has been very important for the field of computational molecular
> dynamics (and probably several other fields) to get floating-point
> arithmetic working right on GPU architecture.  I don't know anything about
> other manufacturers of GPUs, but NVidia announced IEEE-754
> double-precision arithmetic for their GPUs in 2008, and it's been
> included in the standard since CUDA 2.0.

That's excellent news, and well-done to NVidia.

But they're not the only manufacturer of GPUs, and as far as I know they
are the only one that supports IEEE 754. So this is *exactly* the situation
I feared: incompatible GPUs with varying support for IEEE 754, making it
difficult or impossible to write correct numeric code across GPU platforms.

Perhaps it doesn't matter? Maybe people simply don't bother to use anything
but Nvidia GPUs for numeric computation, and treat the other GPUs as toys
only suitable for games.
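
To make that concern concrete, here is a minimal sketch of the kind of
check I have in mind. It assumes CuPy (a NumPy work-alike for CUDA GPUs)
purely for illustration; any GPU array library would do. IEEE 754 requires
+, -, *, / and sqrt to be correctly rounded, so on a compliant GPU the
element-wise results ought to match the CPU bit for bit, while reductions
such as sum() may still differ legitimately because the summation order
differs:

import numpy as np
import cupy as cp  # assumption: CuPy is installed and a CUDA GPU is present

rng = np.random.default_rng(0)
a = rng.random(1_000_000)  # float64 values on the CPU
b = rng.random(1_000_000)

# The same element-wise expression, computed on CPU and on GPU.
cpu = np.sqrt(a * b + a / b)
ga, gb = cp.asarray(a), cp.asarray(b)
gpu = cp.asnumpy(cp.sqrt(ga * gb + ga / gb))

# Correct rounding means each operation has exactly one right answer,
# so "close enough" isn't the test -- bit-for-bit equality is.
print("identical:", np.array_equal(cpu, gpu))

Of course a million random values prove nothing about conformance; the
point is only that when a check like this comes back False, you know what
sort of problem you are dealing with.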


> If floating-point math weren't working on GPUs, I suspect that a lot of
> people in the scientific community would be complaining.

I don't.

These are scientists, not specialists in computational mathematics. In the
1980s, the authors of the "Numerical Recipes in ..." books, William H.
Press et al, commented on the large number of scientific papers and
simulations which should be invalidated due to the poor numeric properties
of the default pseudo-random number generators available at the time.

I see no reason to think that the numeric programming sophistication of the
average working scientist or Ph.D. student has improved since then.

The average scientist cannot even be trusted to write an Excel spreadsheet
without errors that invalidate their conclusions:

https://www.washingtonpost.com/news/wonk/wp/2016/08/26/an-alarming-number-of-scientific-papers-contain-excel-errors/

let alone complex floating-point numeric code. Sometimes those errors can
change history: the best, some might say *only*, evidence for the austerity
policies which have been destroying the economies of Europe for almost a
decade now rests on a simple programming error.

http://www.bloomberg.com/news/articles/2013-04-18/faq-reinhart-rogoff-and-the-excel-error-that-changed-history

These are not new problems: dubious numeric computations have plagued
scientists and engineers for decades; there is still a huge publication
bias against negative results; most papers are written but never read; and
of those that are read, most are wrong.

http://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.0020124

This is especially true in fast-moving fields of science where there is
money to be made, like medicine and genetics; there the problems are much,
much worse.

Bottom line: I'm very glad that Nvidia now support IEEE 754 maths, and that
reduces my concerns. At least users of one common brand of GPU can expect
correctly rounded results from the basic arithmetic operations.
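
(By "correctly rounded" I mean that each individual operation must return
the representable double closest to the exact mathematical result. Here is
a tiny pure-Python illustration, using fractions.Fraction to compute the
exact value:)

from fractions import Fraction

x, y = 0.1, 0.2
exact = Fraction(x) + Fraction(y)  # the exact sum of the two doubles
nearest = float(exact)             # the double closest to that exact sum
print((x + y) == nearest)          # True: IEEE 754 addition rounds correctly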


-- 
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.



