[Chicago] Using Graphics Processing Cards for Math Calculations

Peter Fein pfein at pobox.com
Mon Oct 9 23:55:32 CEST 2006


On Monday 09 October 2006 16:11, Joe Baker wrote:
> I've been reading quite a lot lately about using GPUs for performing
> high-end mathematics computations.
>
> GPUs are about 10 times faster at moving memory and can perform 10 times
> more math problems in parallel than CPUs can. So for the right
> application this could mean a 100-fold increase in computational
> throughput.
>
> Does anybody know of any Python modules which take advantage of GPU
> devices from Nvidia or ATI?

I doubt it, though I'm certainly interested. ;)

This stuff is academic-beta quality ATM - the C libraries are pretty unstable.  
Slashdot ran an article not too long ago...  Apparently, the chipsets aren't 
really designed to be used this way - you have to do things like maintain 
screen state, even if you don't really care about the screen.
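To make that texture trick concrete (my own illustration, not something from the 
thread): in the pre-CUDA days you packed your array into a texture, drew a 
screen-sized quad, and a fragment shader computed one output element per 
"pixel", which you then read back. Here's a plain-NumPy sketch of that 
data-parallel model - NumPy stands in for the shader, and all the names are 
mine:

```python
import numpy as np

def saxpy_cpu(a, x, y):
    """Scalar-style CPU loop: one element at a time."""
    out = np.empty_like(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

def saxpy_shader_style(a, x, y):
    """What a fragment shader does conceptually: evaluate a*x + y
    once per 'pixel' (array element), all elements in parallel."""
    return a * x + y

# Data "uploaded to a texture":
x = np.arange(4, dtype=np.float32)   # [0, 1, 2, 3]
y = np.ones(4, dtype=np.float32)     # [1, 1, 1, 1]
print(saxpy_shader_style(2.0, x, y))
```

The per-element function is identical in both versions; the whole GPGPU win was 
that the hardware evaluated it for thousands of fragments at once.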

Apparently ATI/Nvidia have finally woken up to the fact that people will SPEND 
MONEY ON HARDWARE to use it for things the original manufacturers didn't 
think of, and are therefore starting to assist/develop for this market, 
instead of trying to squash it.  Pure craziness, I tell you.

Certainly seems like specialized hardware's gaining some momentum, whether 
repurposed GPUs, Cell processors or algorithm-specific silicon (for 
encryption algos & the like).  Though it's probably gonna be at least a few 
years before these things are really price/performance competitive with 
general-purpose CPUs.

-- 
Peter Fein                                                     pfein at pobox.com
773-575-0694                                      Jabber: peter.fein at gmail.com
http://www.pobox.com/~pfein/                     irc://irc.freenode.net/#chipy
