[Numpy-discussion] ANN: MyGrad 2.0 - Drop-in autodiff for NumPy

Ryan Soklaski rsoklaski at gmail.com
Sun Apr 18 12:10:13 EDT 2021


All,

I am excited to announce the release of MyGrad 2.0.

MyGrad's primary goal is to make automatic differentiation accessible and
easy to use across the NumPy ecosystem (see [1] for more detailed comments).

Source: https://github.com/rsokl/MyGrad
Docs: https://mygrad.readthedocs.io/en/latest/

MyGrad's only dependency is NumPy, and (as of version 2.0) it makes keen use
of NumPy's excellent protocols for overriding functions and ufuncs. Thus you
can drop a MyGrad tensor into your pure-NumPy code and compute derivatives
through it.
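
A minimal sketch of what that drop-in usage looks like (the names mg.tensor,
Tensor.backward, and Tensor.grad follow MyGrad's documented API; treat the
exact calls as illustrative rather than definitive):

    import numpy as np
    import mygrad as mg

    x = mg.tensor([1.0, 2.0, 3.0])   # MyGrad tensor wrapping a NumPy array

    # pure-NumPy code: np.square (a ufunc) and np.sum dispatch back to
    # MyGrad via __array_ufunc__ / __array_function__
    y = np.sum(np.square(x))

    y.backward()     # back-propagate through the NumPy expression
    print(x.grad)    # array([2., 4., 6.]) -- i.e. dy/dx = 2*x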

Ultimately, MyGrad could be extended to bring autodiff to other array-based
libraries like CuPy, Sparse, and Dask.

For full release notes see [2]. Feedback, critiques, and ideas are welcome!

Cheers,
Ryan Soklaski

[1] MyGrad is not meant to "compete" with the likes of PyTorch and JAX,
which are fantastically fast and powerful autodiff libraries. Rather, its
emphasis is on being lightweight and seamless to use in NumPy-centric
workflows.
[2] https://mygrad.readthedocs.io/en/latest/changes.html#v2-0-0