[SciPy-User] my own hist_equal (Vectorization gods required! ;)

Michael Aye kmichael.aye at googlemail.com
Thu Apr 8 12:11:50 EDT 2010


Hi all,

I am implementing my own histogram equalization, as the few lines of
code that were posted on this list before didn't really work for me,
and I also wanted to understand what is actually going on there
anyway.

So here is what I have so far. The main effort was wrapping my brain
around the double lookup table I needed: finding the values to
replace and the new values to use, based on the transformed cdf.
It seems to work fine, but the remaining loop of course bothers me in
terms of efficiency, and I wondered whether one of you vectorization
gods could have a look to see if something can be done there.
Especially with 16-bit images there are a lot of different luminance
values to loop through, so it is not the fastest.
Apart from that loop, I don't think it can be optimized much further
(or can it?)

import numpy

def hist_equal(data):
    # range is +2 so that the highest luminance value falls into the correct bin
    bins = numpy.arange(data.min(), data.max() + 2)
    # first the histogram of luminances
    h, bins = numpy.histogram(data, bins=bins)
    # now get the cdf
    cdf = h.cumsum()
    # now get the unique luminance values
    uniques = numpy.unique(data)
    # cache these so they are not looked up again on every loop iteration
    nPixel = data.size
    cdfmin = cdf.min()
    # don't change the incoming data
    nData = data.copy()
    for lum in uniques:
        # algorithm taken from the Wikipedia entry on histogram equalization,
        # always scaling to 256 luminance values
        nData[numpy.where(data == lum)] = \
            numpy.round((cdf[numpy.where(bins == lum)] - cdfmin) * 255 / nPixel)
    return nData
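
Just to show roughly how I meant it to be used (the array shape and
value range below are made up for illustration, any 2-D integer image
should do):

img = numpy.random.randint(0, 4096, size=(512, 512)).astype(numpy.uint16)
equalized = hist_equal(img)
# after equalization the values should span roughly 0 to 255
print(equalized.min(), equalized.max())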

I am also thinking that this might be useful to include somewhere
(unless it already exists and I missed it?)
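
For what it is worth, one direction I was wondering about for the
vectorization myself, as a completely untested sketch: since the bins
have step 1 and the luminances are integers, the shifted pixel values
could maybe index the cdf directly instead of looping over
numpy.where. (The function name here is just for illustration, and I
have not checked that it gives exactly the same result.)

def hist_equal_sketch(data):
    # same histogram/cdf setup as above
    bins = numpy.arange(data.min(), data.max() + 2)
    h, bins = numpy.histogram(data, bins=bins)
    cdf = h.cumsum()
    cdfmin = cdf.min()
    # bin k corresponds to luminance data.min() + k, so the shifted pixel
    # values can index the cdf directly, replacing the per-luminance loop
    idx = data - data.min()
    # note: this returns floats, unlike the loop version above, which
    # writes back into an integer copy of the input
    return numpy.round((cdf[idx] - cdfmin) * 255.0 / data.size)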

Best regards,
Michael


