[Numpy-discussion] Dot/inner products with broadcasting?

Jaakko Luttinen jaakko.luttinen at aalto.fi
Thu Mar 14 07:54:06 EDT 2013


Answering my own question: this pull request seems to implement an
inner product with broadcasting (inner1d) and many other useful functions:
https://github.com/numpy/numpy/pull/2954/
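
For reference, a minimal sketch of that broadcasting inner product,
assuming the inner1d gufunc that currently lives in the undocumented
numpy.core.umath_tests module (the pull request would give it and
friends a supported home):

import numpy as np
from numpy.core.umath_tests import inner1d  # undocumented test gufunc

A = np.random.randn(2, 3, 4)
B = np.random.randn(3, 4)
C = inner1d(A, B)  # broadcasts leading axes, contracts the last axis
# C.shape -> (2, 3)
# np.allclose(C, np.sum(A * B, axis=-1)) -> True
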
-J

On 03/13/2013 04:21 PM, Jaakko Luttinen wrote:
> Hi!
> 
> How can I compute a dot product (or similar multiply-and-sum
> operations) efficiently so that broadcasting is used?
> For multi-dimensional arrays, NumPy's inner and dot do not broadcast
> over the leading axes; instead, the result gets first the leading
> axes of the first input array and then the leading axes of the
> second input array.
> 
> For instance, I would like to compute the following inner product:
> np.sum(A*B, axis=-1)
> 
> But numpy.inner gives:
> A = np.random.randn(2,3,4)
> B = np.random.randn(3,4)
> np.inner(A,B).shape
> # -> (2, 3, 3) instead of (2, 3)
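> 
> (For comparison, einsum's ellipsis syntax can express the result I
> want; a sketch, though I don't know whether it is BLAS-backed:)
> np.einsum('...i,...i->...', A, B).shape
> # -> (2, 3), matching np.sum(A*B, axis=-1)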
> 
> Similarly, for the dot product, I would like to compute for instance:
> np.sum(A[...,:,:,np.newaxis]*B[...,np.newaxis,:,:], axis=-2)
> 
> But numpy.dot gives:
> A = np.random.randn(2,3,4); B = np.random.randn(2,4,5)
> np.dot(A,B).shape
> # -> (2, 3, 2, 5) instead of (2, 3, 5)
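> 
> (Again, an einsum sketch of the result I want:)
> np.einsum('...ij,...jk->...ik', A, B).shape
> # -> (2, 3, 5)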
> 
> I could use einsum for these operations (as sketched above), but I'm
> not sure whether it is as efficient as the presumably BLAS-backed
> dot products.
> 
> I couldn't find any function that performs this kind of operation.
> NumPy's functions seem to either flatten the input arrays (vdot,
> outer) or treat the axes of the input arrays separately (dot, inner,
> tensordot).
> 
> Any help?
> 
> Best regards,
> Jaakko