[Numpy-discussion] Optimized sum of squares

Anne Archibald peridot.faceted at gmail.com
Tue Oct 20 15:09:36 EDT 2009


2009/10/20  <josef.pktd at gmail.com>:
> On Sun, Oct 18, 2009 at 6:06 AM, Gary Ruben <gruben at bigpond.net.au> wrote:
>> Hi Gaël,
>>
>> If you've got a 1D array/vector called "a", I think the normal idiom is
>>
>> np.dot(a,a)
>>
>> For the more general case, I think
>> np.tensordot(a, a, axes=something_else)
>> should do it, where you should be able to figure out something_else for
>> your particular case.
>
> Is it really possible to get the same result as np.sum(a*a, axis) with
> tensordot if a.ndim == 2?
> Whatever I try for "something_else", I get extra cross terms, as in
> np.dot(a.T, a).
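
Right -- tensordot always contracts a whole axis at a time, so with a
2-d array you inevitably pick up those cross terms. A quick
illustration (the array here is made up just for demonstration):

    import numpy as np

    a = np.arange(6.0).reshape(3, 2)

    # tensordot contracts all of axis 0 against axis 0, giving the
    # full 2x2 Gram matrix -- the same thing as np.dot(a.T, a):
    print(np.tensordot(a, a, axes=(0, 0)))

    # the desired sums of squares are only the diagonal of that matrix:
    print(np.sum(a*a, axis=0))    # == np.diag(np.dot(a.T, a))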

This seems like a good place to apply numpy's generalized ufuncs:
what you want is just the vector inner product, broadcast over all
the other dimensions. In fact, I believe this is implemented in numpy
as a demo: numpy.core.umath_tests.inner1d should do the job.
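
A minimal sketch, assuming the demo module is importable as
numpy.core.umath_tests (its exact location may vary between releases):

    import numpy as np
    from numpy.core.umath_tests import inner1d

    a = np.random.randn(4, 3)

    # inner1d is a generalized ufunc with signature (i),(i)->(): it
    # takes the inner product along the last axis of its inputs and
    # broadcasts over all the leading dimensions.
    print(inner1d(a, a))            # shape (4,)
    print(np.sum(a*a, axis=-1))     # same values

    # to reduce along axis 0 instead, pass in the transpose:
    print(inner1d(a.T, a.T))        # == np.sum(a*a, axis=0)

For the 1-d case this reduces to the np.dot(a, a) idiom above.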


Anne


