[Theory] How to speed up python code execution / pypy vs GPU

John Ladasky john_ladasky at sbcglobal.net
Wed Nov 9 02:35:04 EST 2016


On Monday, November 7, 2016 at 5:23:25 PM UTC-8, Steve D'Aprano wrote:
> On Tue, 8 Nov 2016 05:47 am, j... at i...edu wrote:
> > It has been very important for the field of computational molecular
> > dynamics (and probably several other fields) to get floating-point
> > arithmetic working right on GPU architecture.  I don't know anything about
> > other manufacturers of GPUs, but NVidia announced IEEE-754
> > double-precision arithmetic for their GPUs in 2008, and it has been
> > a standard feature since CUDA 2.0.
> 
> That's excellent news, and well-done to NVidia.
> 
> But they're not the only manufacturer of GPUs, and as far as I know they
> are the only one that supports IEEE 754. So this is *exactly* the situation
> I feared: incompatible GPUs with varying support for IEEE 754, making it
> difficult or impossible to write correct numeric code across GPU platforms.
> 
> Perhaps it doesn't matter? Maybe people simply don't bother to use anything
> but Nvidia GPUs for numeric computation, and treat the other GPUs as toys
> only suitable for games.

Maybe so.  I only know for certain that recent NVidia devices comply with IEEE-754.  Others might work too.
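For anyone who wants to sanity-check their own card, here is a minimal sketch. It assumes numba with CUDA support is installed, and it exercises only a single correctly-rounded operation, so it is nowhere near a full IEEE-754 compliance test:

    import numpy as np
    from numba import cuda  # assumes a CUDA-capable NVidia card

    @cuda.jit
    def gpu_multiply(x, y, out):
        # Elementwise product, computed on the GPU.
        i = cuda.grid(1)
        if i < out.size:
            out[i] = x[i] * y[i]

    n = 1_000_000
    x = np.random.rand(n)        # float64 operands
    y = np.random.rand(n)
    out = np.zeros_like(x)

    threads = 256
    blocks = (n + threads - 1) // threads
    gpu_multiply[blocks, threads](x, y, out)

    # IEEE-754 requires a single multiplication to be correctly rounded,
    # so a compliant GPU should match the CPU bit for bit.
    print("bit-identical to CPU:", np.array_equal(out, x * y))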

> > If floating-point math wasn't working on GPUs, I suspect that a lot of
> > people in the scientific community would be complaining.
> 
> I don't.
> 
> These are scientists, not specialists in numerical computing. In
> the 1980s, the authors of the "Numerical Recipes in ..." books, William H.
> Press et al., remarked on the large number of scientific papers and
> simulations that should be invalidated due to the poor numeric properties
> of the default pseudo-random number generators available at the time.
> 
> I see no reason to think that the numeric programming sophistication of the
> average working scientist or Ph.D. student has improved since then.

I work a lot with a package called GROMACS, which performs highly iterative calculations to simulate the motions of atoms in complex molecules.  GROMACS can be built to run on a pure-CPU platform (taking advantage of multiple cores, if you want), a pure-GPU platform (leaving your CPU cores free), or a blended platform, where certain parts of the algorithm run on CPUs and other parts on GPUs.  This last configuration is the most powerful, because only some parts of the simulation algorithm are well suited to GPUs.  For GPU acceleration, GROMACS supports only NVidia hardware with CUDA 2.0 or later.

Because of the iterative nature of these calculations, small discrepancies in the arithmetic can rapidly snowball into a completely different-looking result.  To verify the integrity of GROMACS, the developers run simulations under all three supported hardware configurations and check that the results are identical.  Now, I don't know that every last operation and corner case in IEEE-754 gets exercised by GROMACS, but that's a strong vote of confidence.
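To get a feel for how fast a one-bit discrepancy snowballs in an iterative calculation, here is a toy illustration in plain Python. It has nothing to do with GROMACS's actual algorithms; it just iterates the chaotic logistic map from two starting points one ulp apart:

    import numpy as np

    # Two trajectories that differ by the smallest representable amount.
    r = 3.9
    x1 = 0.5
    x2 = np.nextafter(x1, 1.0)   # one ulp above x1

    for step in range(1, 101):
        x1 = r * x1 * (1.0 - x1)
        x2 = r * x2 * (1.0 - x2)
        if step % 10 == 0:
            print("step %3d: |x1 - x2| = %.3e" % (step, abs(x1 - x2)))

The gap grows from about 1e-16 to order 1 within a hundred steps, which is exactly why bit-identical results across hardware configurations matter for this kind of validation.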

> The average scientist cannot even be trusted to write an Excel spreadsheet
> without errors that invalidate their conclusions:
> 
> https://www.washingtonpost.com/news/wonk/wp/2016/08/26/an-alarming-number-of-scientific-papers-contain-excel-errors/
>
>
> let alone complex floating-point numeric code. Sometimes those errors can
> change history: the best (some might say *only*) evidence for the austerity
> policies which have been destroying the economies of Europe for almost a
> decade now rests on a simple programming error.
> 
> http://www.bloomberg.com/news/articles/2013-04-18/faq-reinhart-rogoff-and-the-excel-error-that-changed-history

I know this story.  It's embarrassing.

> These are not new problems: dubious numeric computations have plagued
> scientists and engineers for decades, there is still a huge publication
> bias against negative results, most papers are written but never read,
> and of those that are read, most are wrong.
> 
> http://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.0020124
> 
> Especially in fast-moving fields of science where there is money to be
> made, like medicine and genetics, the problems are much, much worse.
> 
> Bottom line: I'm very glad that Nvidia now support IEEE 754 maths; that
> reduces my concerns: at least users of one common GPU can be expected to
> get correctly rounded results for basic arithmetic operations.
> 
> 
> -- 
> Steve
> “Cheer up,” they said, “things could be worse.” So I cheered up, and sure
> enough, things got worse.

You're right, Steve; the election results are rolling in.




