[SciPy-Dev] Randomized linear algebra functionality

Pauli Virtanen pav at iki.fi
Sat Sep 15 20:38:27 EDT 2018


On Sat, 2018-09-15 at 16:45 -0700, Matt Haberland wrote:
> Yes, well mostly (2). I am particularly interested in calculating matrix
> rank, and especially in finding the linearly dependent columns. I don't
> necessarily need an orthonormal basis for the nullspace.

As Andreas notes, the former exists in

https://docs.scipy.org/doc/scipy/reference/linalg.interpolative.html

The latter can probably also be expressed in terms of this
decomposition.

	Pauli
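A minimal sketch of that usage (the example matrix and tolerance here are
illustrative, not from the thread): interp_decomp estimates the numerical
rank to a given precision and returns a column permutation whose first k
entries are the skeleton (approximately independent) columns, so the
remaining indices identify the linearly dependent columns.

    import numpy as np
    import scipy.linalg.interpolative as sli

    rng = np.random.default_rng(0)
    # 100 x 20 matrix of rank 5: the last 15 columns are linear
    # combinations of the first 5, hence linearly dependent.
    basis = rng.standard_normal((100, 5))
    A = np.hstack([basis, basis @ rng.standard_normal((5, 15))])

    # Randomized interpolative decomposition to relative precision 1e-8.
    k, idx, proj = sli.interp_decomp(A, 1e-8)
    print("estimated rank:", k)                              # 5
    print("skeleton (independent) columns:", sorted(idx[:k]))
    print("redundant (dependent) columns:", sorted(idx[k:]))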
 

> 
> Matt
> 
> On Sat, Sep 15, 2018 at 3:47 PM, Touqir Sajed <touqir at ualberta.ca>
> wrote:
> 
> > I assume you are referring to the linear algebra algorithms in section
> > 4.1 that use their basis construction algorithm, and not the graph
> > matching and matroid optimization algorithms? It seems their results
> > are still the state of the art. I am fine with implementing them.
> > I am thinking of the following API changes:
> > 
> > (1) Implement scipy.linalg.dot_lowrank or scipy.linalg.mult_lowrank for
> > multiplying a low-rank matrix A with another matrix B, or just implement
> > this additional functionality within numpy's dot function and add a
> > boolean parameter.
> > 
> > (2) Add a boolean parameter "rand" to scipy.linalg.null_space to use the
> > more sophisticated randomized algorithm for null space computation.
> > 
> > (3) Implement scipy.linalg.rankone_matrix_decomposition for decomposing
> > a matrix into a sum of r rank-1 matrices.
> > 
> > All of these algorithms should give performance improvements for
> > low-rank matrices.
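For context, an illustrative sketch of the saving a dot_lowrank-style routine
as proposed in (1) could exploit (the factored form and shapes below are
hypothetical, not an existing SciPy API): if A is available as A = U @ V with
U of shape (m, r) and V of shape (r, n), then A @ B can be evaluated as
U @ (V @ B) in O(r * p * (m + n)) operations instead of O(m * n * p) for the
dense product.

    import numpy as np

    rng = np.random.default_rng(0)
    m, n, p, r = 2000, 2000, 300, 10
    U = rng.standard_normal((m, r))      # low-rank factors of A = U @ V
    V = rng.standard_normal((r, n))
    B = rng.standard_normal((n, p))

    fast = U @ (V @ B)           # exploits the low-rank structure
    dense = (U @ V) @ B          # forms A explicitly first
    print(np.allclose(fast, dense))   # True, up to floating-point error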
> > 
> > On Sat, Sep 15, 2018 at 2:07 PM Matt Haberland <haberland at ucla.edu>
> > wrote:
> > 
> > > I'd really love to see some of the techniques in:
> > > https://arxiv.org/pdf/1203.6705.pdf
> > > implemented!
> > > Matt
> > > 
> > > On Sat, Sep 15, 2018 at 12:44 PM, Touqir Sajed <touqir at ualberta.ca> wrote:
> > > 
> > > > Aha, this implements the exact functionality that I had in mind!
> > > > Thanks for bringing this up, Andreas.
> > > > Regarding the fast JL transform, there is an existing Python package:
> > > > https://github.com/michaelmathen/FJLT
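As background, a minimal sketch of the plain Johnson-Lindenstrauss idea (the
dimensions and data below are illustrative): project points through a scaled
Gaussian matrix so that pairwise distances are approximately preserved; the
fast JL transform replaces the dense Gaussian matrix with a structured
transform that can be applied more quickly.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, k = 200, 10000, 512            # points, original dim, sketch dim
    X = rng.standard_normal((n, d))

    # Dense JL map with N(0, 1/k) entries, so E[||x @ S||^2] = ||x||^2.
    S = rng.standard_normal((d, k)) / np.sqrt(k)
    Y = X @ S                            # sketched points, shape (n, k)

    orig = np.linalg.norm(X[0] - X[1])
    sketched = np.linalg.norm(Y[0] - Y[1])
    print("relative distortion:", abs(sketched - orig) / orig)  # a few percent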
> > > > 
> > > > Are there functionalities that the core developers have in mind and
> > > > could use some help implementing?
> > > > 
> > > > On Sat, Sep 15, 2018 at 1:04 PM Andreas Kloeckner <lists at informa.tiker.net> wrote:
> > > > 
> > > > > Touqir,
> > > > > 
> > > > > Touqir Sajed <touqir at ualberta.ca> writes:
> > > > > > There seems to be some interest in implementing randomized
> > > > > > linear algebra algorithms such as the JL transform, low-rank
> > > > > > approximation, etc., as written in
> > > > > > https://github.com/scipy/scipy/wiki/GSoC-2018-project-ideas .
> > > > > > To the best of my knowledge, they are not implemented yet. If
> > > > > > there is still interest, I would like to implement the fast JL
> > > > > > transform and the randomized SVD for low-rank approximation.
> > > > > > The pseudocode of the fast JL transform is given here:
> > > > > > https://pdfs.semanticscholar.org/ccf0/1e0375d11df5335b645161b4833e32380d89.pdf .
> > > > > > The randomized SVD is based on
> > > > > > https://arxiv.org/pdf/0909.4061.pdf
> > > > > 
> > > > > scipy.linalg.interpolative is in fact based on randomized linear
> > > > > algebra. In particular, this function:
> > > > > 
> > > > > https://docs.scipy.org/doc/scipy/reference/generated/scipy.linalg.interpolative.svd.html#scipy.linalg.interpolative.svd
> > > > > 
> > > > > already implements a randomized SVD with a number of acceleration
> > > > > bells and whistles.
> > > > > 
> > > > > Andreas
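A minimal sketch of that routine in use (the test matrix and rank below are
illustrative, not from the thread): scipy.linalg.interpolative.svd computes a
rank-k approximation A ≈ U @ diag(S) @ V.conj().T using randomized sampling.

    import numpy as np
    import scipy.linalg.interpolative as sli

    rng = np.random.default_rng(0)
    # A 500 x 400 matrix of rank at most 80.
    A = rng.standard_normal((500, 80)) @ rng.standard_normal((80, 400))

    k = 80
    U, S, V = sli.svd(A, k)              # randomized SVD truncated to rank k
    print(U.shape, S.shape, V.shape)     # (500, 80) (80,) (400, 80)

    err = np.linalg.norm(A - U @ np.diag(S) @ V.conj().T) / np.linalg.norm(A)
    print("relative reconstruction error:", err)   # small, near machine precision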
> > > > > 
> > > > 
> > > > 
> > > > --
> > > > Computing Science Master's student at University of Alberta, Canada,
> > > > specializing in Machine Learning. Website:
> > > > https://ca.linkedin.com/in/touqir-sajed-6a95b1126
> > > > 
> > > > 
> > > > 
> > > 
> > > 
> > > --
> > > Matt Haberland
> > > Assistant Adjunct Professor in the Program in Computing
> > > Department of Mathematics
> > > 6617A Math Sciences Building, UCLA
> > > 
> > 
> > 
> > 
> > 
> 
> 
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev



