Determinant of Large Matrix

Peter Otten __peter__ at web.de
Wed Jun 6 16:39:03 EDT 2007


James Stroud wrote:

> I'm using numpy to calculate determinants of matrices that look like
> this (14x14):
> 
> [[ 0.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.]
>   [ 1.  0.  1.  4.  1.  9.  4.  4.  1.  1.  4.  9.  4.  9.]
>   [ 1.  1.  0.  1.  4.  4.  9.  9.  4.  4.  1.  4.  1.  4.]
>   [ 1.  4.  1.  0.  9.  1.  4.  4.  9.  1.  4.  1.  4.  1.]
>   [ 1.  1.  4.  9.  0.  4.  4.  4.  1.  4.  1.  9.  4.  9.]
>   [ 1.  9.  4.  1.  4.  0.  4.  4.  9.  4.  1.  1.  4.  1.]
>   [ 1.  4.  9.  4.  4.  4.  0.  1.  1.  1.  9.  1.  9.  4.]
>   [ 1.  4.  9.  4.  4.  4.  1.  0.  4.  1.  9.  4.  4.  1.]
>   [ 1.  1.  4.  9.  1.  9.  1.  4.  0.  4.  4.  4.  4.  9.]
>   [ 1.  1.  4.  1.  4.  4.  1.  1.  4.  0.  9.  4.  9.  4.]
>   [ 1.  4.  1.  4.  1.  1.  9.  9.  4.  9.  0.  4.  1.  4.]
>   [ 1.  9.  4.  1.  9.  1.  1.  4.  4.  4.  4.  0.  4.  1.]
>   [ 1.  4.  1.  4.  4.  4.  9.  4.  4.  9.  1.  4.  0.  1.]
>   [ 1.  9.  4.  1.  9.  1.  4.  1.  9.  4.  4.  1.  1.  0.]]
> 
> For this matrix, I'm getting this with numpy:
> 
>   2774532095.9999971
> 
> But I have a feeling I'm exceeding the capacity of floats here. Does
> anyone have an idea for how to treat this? Is it absurd to think I could
> get a determinant of this matrix? Is there a python package that could
> help me?
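
For what it's worth, the sheer size of the number is not the problem:
2774532096 is far below 2**53, so a double can represent that integer
exactly; the .9999971 tail is presumably rounding error from the
floating-point factorization underneath numpy.linalg.det, not overflow.
A quick check, using nothing beyond the value quoted above:

print 2774532096 < 2**53               # True -- plenty of headroom in a double
print float(2774532096) == 2774532096  # True -- the integer survives conversion
print round(2774532095.9999971)        # 2774532096.0 -- it rounds cleanly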

Here's some anecdotal evidence that your result may be correct:

import operator

# Re-use the pasted numpy output: the replace() calls insert a comma after
# every float and after every closing bracket, so eval() sees a valid Python
# literal. The comma after the outer "]]" makes it a 1-tuple, hence the [0].
m = eval("""[[ 0.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.]
  [ 1.  0.  1.  4.  1.  9.  4.  4.  1.  1.  4.  9.  4.  9.]
  [ 1.  1.  0.  1.  4.  4.  9.  9.  4.  4.  1.  4.  1.  4.]
  [ 1.  4.  1.  0.  9.  1.  4.  4.  9.  1.  4.  1.  4.  1.]
  [ 1.  1.  4.  9.  0.  4.  4.  4.  1.  4.  1.  9.  4.  9.]
  [ 1.  9.  4.  1.  4.  0.  4.  4.  9.  4.  1.  1.  4.  1.]
  [ 1.  4.  9.  4.  4.  4.  0.  1.  1.  1.  9.  1.  9.  4.]
  [ 1.  4.  9.  4.  4.  4.  1.  0.  4.  1.  9.  4.  4.  1.]
  [ 1.  1.  4.  9.  1.  9.  1.  4.  0.  4.  4.  4.  4.  9.]
  [ 1.  1.  4.  1.  4.  4.  1.  1.  4.  0.  9.  4.  9.  4.]
  [ 1.  4.  1.  4.  1.  1.  9.  9.  4.  9.  0.  4.  1.  4.]
  [ 1.  9.  4.  1.  9.  1.  1.  4.  4.  4.  4.  0.  4.  1.]
  [ 1.  4.  1.  4.  4.  4.  9.  4.  4.  9.  1.  4.  0.  1.]
  [ 1.  9.  4.  1.  9.  1.  4.  1.  9.  4.  4.  1.  1.  0.]]""".replace(".",
".,").replace("]", "],"))[0]

# Convert to plain integers so that all further arithmetic is exact.
M = [[int(x) for x in row] for row in m]

def subdet(m, rowindex):
    # Minor: drop row `rowindex` and the first column.
    return [row[1:] for index, row in enumerate(m) if index != rowindex]

def det(m):
    # Laplace expansion along the first column, skipping zero entries.
    if len(m) == 1:
        return m[0][0]
    sign = 1
    sigma = 0
    for index, row in enumerate(m):
        x = row[0]
        if x:
            sigma += sign * x * det(subdet(m, index))
        sign = -sign
    return sigma

def common_multiple(items):
    # Product of the distinct nonzero values -- a common multiple (not
    # necessarily the least) of the leading entries, or 0 if all are zero.
    items = set(items)
    items.discard(0)
    if items:
        return reduce(operator.mul, items)
    else:
        return 0

def det3(m, switch_algo=8):
    # Integer-only elimination: scale the rows so that every nonzero leading
    # entry becomes cm, subtract the pivot row, and drop the first column.
    # p collects the sign and pivot value of each eliminated column, q the
    # row scale factors to divide out at the end. Once the matrix has shrunk
    # to switch_algo rows, finish with the Laplace expansion above.
    p = 1
    q = 1
    while 1:
        if len(m) == switch_algo:
            a, b = divmod(p*det(m), q)
            assert b == 0
            return a
        cm = common_multiple(row[0] for row in m)
        if cm == 0: return 0

        sign = 1
        e = enumerate(m)
        for first_index, first_row in e:
            if first_row[0]:
                f = cm // first_row[0]
                assert (cm % first_row[0]) == 0
                p *= sign * cm
                q *= f
                first_row[:] = [f*x for x in first_row[1:]]
                break
            first_row[:] = first_row[1:]
            sign = -sign
        for index, row in e:
            if row[0]:
                f = cm // row[0]
                assert (cm % row[0]) == 0
                q *= f
                row[:] = [f*x - fx for x, fx in zip(row[1:], first_row)]
            else:
                row[:] = row[1:]
        del m[first_index]

if __name__ == "__main__":
    import pprint
    pprint.pprint(M)
    result = det3(M)  # note: det3() consumes M, modifying it in place
    # The exact value matches the rounded numpy result quoted above.
    assert result == 2774532096
    print "det(M) =", result

As I use only integers, any errors should be algorithmic rather than caused
by rounding.
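
For an independent cross-check that also stays entirely in integers, the
Bareiss fraction-free elimination is a compact alternative; every division
in it is exact by construction. A minimal sketch (it expects the integer
matrix M as built above -- note that det3() consumes its argument, so run
this on a fresh copy of the data):

def det_bareiss(m):
    # Fraction-free Gaussian elimination (Bareiss): intermediate values
    # stay integral and every // below divides evenly.
    m = [list(row) for row in m]
    n = len(m)
    sign = 1
    prev = 1
    for k in range(n - 1):
        if m[k][k] == 0:
            # Swap in a row with a nonzero pivot; if there is none,
            # the determinant is zero.
            for r in range(k + 1, n):
                if m[r][k]:
                    m[k], m[r] = m[r], m[k]
                    sign = -sign
                    break
            else:
                return 0
        for i in range(k + 1, n):
            for j in range(k + 1, n):
                m[i][j] = (m[i][j] * m[k][k] - m[i][k] * m[k][j]) // prev
            m[i][k] = 0
        prev = m[k][k]
    return sign * m[-1][-1]

print det_bareiss(M)  # expected to print 2774532096 again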

Peter


