[Numpy-discussion] ENH: compute many inner products quickly

Mark Daoust daoust.mj at gmail.com
Sun Jun 5 20:08:32 EDT 2016


Here's the einsum version:

`es = np.einsum('Na,ab,Nb->N', X, A, X)`

But that's running ~45x slower than your version.

OT: anyone know why einsum is so bad for this one?

Mark Daoust

On Sat, May 28, 2016 at 11:53 PM, Scott Sievert <sievert.scott at gmail.com>
wrote:

> I recently ran into an application where I had to compute many inner
> products quickly (roughly 50k inner products in less than a second). I
> wanted a vector of inner products over the 50k vectors, or `[x1.T @ A @ x1,
> …, xn.T @ A @ xn]` with A.shape = (1k, 1k).
>
> My first instinct was to look for a NumPy function to quickly compute
> this, such as np.inner. However, it looks like np.inner has some other
> behavior and I couldn’t get tensordot/einsum to work for me.
>
> Then a labmate pointed out that I can just do some slick matrix
> multiplication to compute the same quantity, `(X.T * (A @ X.T)).sum(axis=0)`.
> I opened [a PR] with this, and proposed that we define a new function
> called `inner_prods` for this.
>
> However, in the PR, @shoyer pointed out
>
> > The main challenge is to figure out how to transition the behavior of
> all these operations, while preserving backwards compatibility. Quite
> likely, we need to pick new names for these functions, though we should try
> to pick something that doesn't suggest that they are second class
> alternatives.
>
> Do we choose new function names? Do we add a keyword arg that changes what
> np.inner returns?
>
> [a PR]: https://github.com/numpy/numpy/pull/7690
>
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
>