Parallelization of Python on GPU?

Ethan Furman ethan at stoneleaf.us
Wed Feb 25 22:03:29 EST 2015


On 02/25/2015 06:35 PM, John Ladasky wrote:
> What I would REALLY like to do is to take advantage of my GPU.  My NVidia graphics
> card has 1152 cores and a 1.0 GHz clock.  I wouldn't mind borrowing a few hundred
> of those GPU cores at a time, and see what they can do.  In theory, I calculate
> that I can speed up the job by another five-fold.

Free for academic use only:

  https://developer.nvidia.com/how-to-cuda-python


Unsure, but it looks like it's free to use:

  http://mathema.tician.de/software/pycuda/


and, of course, the StackOverflow question:

  http://stackoverflow.com/q/5957554/208880
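
For a taste of what the PyCUDA route looks like, here's a minimal sketch
(assuming PyCUDA and a CUDA-capable card are available; it falls back to
plain NumPy otherwise, so the shape of the code is the point, not the speed):

```python
import numpy as np

# Hypothetical elementwise example: double an array on the GPU.
a = np.arange(16, dtype=np.float32)

try:
    import pycuda.autoinit  # noqa: F401 -- creates a CUDA context on import
    import pycuda.gpuarray as gpuarray

    a_gpu = gpuarray.to_gpu(a)    # copy the host array to device memory
    result = (2 * a_gpu).get()    # elementwise op on the GPU, copy back
except ImportError:
    result = 2 * a                # CPU fallback when PyCUDA isn't installed
```

Real speedups only show up for much larger arrays and heavier kernels, of
course -- the host/device copies dominate at this size.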

--
~Ethan~

