[Matrix-SIG] Sparse matrices (again?)

Travis Oliphant Oliphant.Travis@mayo.edu
Wed, 1 Sep 1999 18:34:33 -0500 (CDT)


Well, it's been a few months since I first talked about sparse matrices,
and like many of you I've been busy cleaning up some other issues on my
plate (and taking a vacation).  Now I really need sparse matrix support,
so I'm spending some time on it.

There were some excellent comments back in May and quite a bit of
interest, which is why I'm mailing this list a progress report of sorts:
to avoid unnecessary duplication of effort if possible, to benefit from
the more knowledgeable people out there, and to solicit help from anyone
interested.

I've wrapped up most of SPARSKIT2 and some of the sparse BLAS (from NIST)
after modifying them to support single and double precision for both real
and complex matrices.  These packages will provide the core (fast)
functionality for sparse matrix operations and for conversions from one
storage format to another.

SPARSKIT contains routines to convert between 15 different formats but
only provides basic matrix operations for the compressed sparse row
format.
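For anyone unfamiliar with compressed sparse row (CSR) storage, here is a
small illustration in plain Python.  The array names (values, colind,
rowptr) are my own choices for this sketch, not SPARSKIT's, and I use
0-based indices here even though the Fortran routines are 1-based:

```python
#     A = | 1 0 2 |
#         | 0 0 3 |
#         | 4 5 0 |
values = [1, 2, 3, 4, 5]   # nonzero entries, stored row by row
colind = [0, 2, 2, 0, 1]   # column index of each nonzero
rowptr = [0, 2, 3, 5]      # row i's nonzeros sit in values[rowptr[i]:rowptr[i+1]]

def csr_to_dense(values, colind, rowptr, ncols):
    """Expand a CSR triple back into a dense list-of-lists."""
    nrows = len(rowptr) - 1
    dense = [[0] * ncols for _ in range(nrows)]
    for i in range(nrows):
        for k in range(rowptr[i], rowptr[i + 1]):
            dense[i][colind[k]] = values[k]
    return dense
```

Only the nonzeros and their positions are stored, which is what makes the
format-conversion routines worthwhile for large, mostly-empty matrices.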

The sparse BLAS also contains matrix-vector operations for many different
formats, but so far I have only wrapped compressed sparse row and
compressed sparse column (as these were the only ones I converted from
double-precision-only to all precisions as a start).
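The matrix-vector product these wrapped routines compute amounts to the
following pure-Python sketch (again with 0-based indices and my own array
names; the real work happens in the compiled Fortran, this is just to show
the access pattern):

```python
def csr_matvec(values, colind, rowptr, x):
    """Compute y = A*x for a matrix A stored in CSR form."""
    y = [0.0] * (len(rowptr) - 1)
    for i in range(len(y)):
        # walk the nonzeros of row i only; zero entries cost nothing
        for k in range(rowptr[i], rowptr[i + 1]):
            y[i] += values[k] * x[colind[k]]
    return y

# the 3x3 example matrix [[1,0,2],[0,0,3],[4,5,0]] in CSR form
values = [1, 2, 3, 4, 5]
colind = [0, 2, 2, 0, 1]
rowptr = [0, 2, 3, 5]
y = csr_matvec(values, colind, rowptr, [1.0, 1.0, 1.0])
```

Note that CSR makes row access cheap, which is why matvec is the natural
operation for it; the compressed sparse column wrappers give the analogous
column-oriented access.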

Now I'm setting up the Python class which will call these underlying
routines to do the operations and this is where I need some advice.

I'd thought to name the class spMatrix and subclass it from the Array base
class (thanks to the ExtensionClass foundation for NumPy 1.12), but I
wonder what the benefit of doing so is, since I will need to redefine most
operations anyway.  Any ideas?
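To make the design question concrete, here is a hypothetical sketch of a
standalone spMatrix (no Array base class).  Everything below is my own
invention for illustration, not the actual class; the point is that every
operation ends up defined explicitly either way, since the dense Array
methods don't apply to compressed storage:

```python
class spMatrix:
    """Minimal CSR-backed sparse matrix sketch (0-based indices)."""

    def __init__(self, values, colind, rowptr, shape):
        self.values, self.colind, self.rowptr = values, colind, rowptr
        self.shape = shape  # (nrows, ncols)

    def matvec(self, x):
        """y = A*x; a real class would call the wrapped Fortran routine."""
        nrows = self.shape[0]
        y = [0.0] * nrows
        for i in range(nrows):
            for k in range(self.rowptr[i], self.rowptr[i + 1]):
                y[i] += self.values[k] * x[self.colind[k]]
        return y

    def __mul__(self, other):
        # dispatch: a vector goes to matvec; matrix-matrix products and
        # format conversions would route to the SPARSKIT wrappers
        return self.matvec(other)
```

Subclassing Array would mainly buy a shared interface (shape, typecode,
the arithmetic protocol names), at the cost of inheriting dense methods
that must all be overridden anyway.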

When I get a fleshed-out sparse class I'll post what I've got.
Alternatively, if someone has a server where CVS development could take
place and would be willing to donate it, I would host the code there.

Just letting those interested know that work is progressing.

Travis