[SciPy-User] orthogonal polynomials ?

josef.pktd at gmail.com
Mon May 16 12:47:21 EDT 2011


On Mon, May 16, 2011 at 12:29 PM, Charles R Harris
<charlesr.harris at gmail.com> wrote:
>
>
> On Mon, May 16, 2011 at 10:22 AM, <josef.pktd at gmail.com> wrote:
>>
>> On Mon, May 16, 2011 at 11:27 AM, Charles R Harris
>> <charlesr.harris at gmail.com> wrote:
>> >
>> >
>> > On Mon, May 16, 2011 at 9:17 AM, Charles R Harris
>> > <charlesr.harris at gmail.com> wrote:
>> >>
>> >>
>> >> On Mon, May 16, 2011 at 9:11 AM, <josef.pktd at gmail.com> wrote:
>> >>>
>> >>> On Sat, May 14, 2011 at 6:04 PM, Charles R Harris
>> >>> <charlesr.harris at gmail.com> wrote:
>> >>> >
>> >>> >
>> >>> > On Sat, May 14, 2011 at 2:26 PM, <josef.pktd at gmail.com> wrote:
>> >>> >>
>> >>> >> On Sat, May 14, 2011 at 4:11 PM, nicky van foreest
>> >>> >> <vanforeest at gmail.com>
>> >>> >> wrote:
>> >>> >> > Hi,
>> >>> >> >
>> >>> >> > Might this be what you want:
>> >>> >> >
>> >>> >> > The first eleven probabilists' Hermite polynomials are:
>> >>> >> >
>> >>> >> > ...
>> >>> >> >
>> >>> >> > My Chromium browser does not seem to paste PNGs. Anyway, check
>> >>> >> >
>> >>> >> >
>> >>> >> > http://en.wikipedia.org/wiki/Hermite_polynomials
>> >>> >> >
>> >>> >> > and you'll see that the first polynomial is 1, the second x, and
>> >>> >> > so
>> >>> >> > forth. From my courses on quantum mechanics I recall that these
>> >>> >> > polynomials are, with respect to some weight function,
>> >>> >> > orthogonal.
>> >>> >>
>> >>> >> Thanks, I haven't looked at that yet; we should add Wikipedia
>> >>> >> links to the scipy.special docs.
>> >>> >>
>> >>> >> However, I would like to change the last part, "with respect to
>> >>> >> some weight function":
>> >>> >> http://en.wikipedia.org/wiki/Hermite_polynomials#Orthogonality
>> >>> >>
>> >>> >> Instead of Gaussian weights I would like uniform weights on bounded
>> >>> >> support, and I have never seen anything about changing the weight
>> >>> >> function for the orthogonal basis of these kinds of polynomials.
>> >>> >>
>> >>> >
>> >>> > In numpy 1.6 you can use the Legendre polynomials. They are
>> >>> > orthogonal on [-1, 1], as has been mentioned, but can be mapped to
>> >>> > other domains. For example,
>> >>> >
>> >>> > In [1]: from numpy.polynomial import Legendre as L
>> >>> >
>> >>> > In [2]: for i in range(5): plot(*L([0]*i + [1], domain=[0,1]).linspace())
>> >>> >    ...:
>> >>> >
>> >>> > produces the attached plots.
>> >>>
>> >>> I'm still on numpy 1.5 so this will have to wait a bit.
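
(In the meantime, something like this with scipy.special should be a
rough equivalent on 1.5 -- an untested sketch, mapping [0, 1] onto the
natural domain [-1, 1] by hand:)

import numpy as np
import matplotlib.pyplot as plt
from scipy.special import eval_legendre

x = np.linspace(0, 1, 200)
t = 2.0 * x - 1.0  # map [0, 1] onto [-1, 1], where the polynomials live
for i in range(5):
    plt.plot(x, eval_legendre(i, t))  # degree-i Legendre on the mapped points
plt.show()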
>> >>>
>> >>> > <snip>
>> >>> >
>> >>> > Chuck
>> >>> >
>> >>>
>> >>>
>> >>> As a first application of orthogonal polynomials, I was trying to get
>> >>> an estimate of a density, but I haven't figured out the weighting yet.
>> >>>
>> >>> Fourier polynomials work better for this.
>> >>>
>> >>
>> >> You might want to try Chebyshev then; the Chebyshev polynomials are
>> >> essentially cosines and will handle the ends better. Weighting might
>> >> also help, as I expect the distribution of the errors is somewhat
>> >> Poisson.
>> >>
>> >
>> > I should mention that all the polynomial fits will give you the same
>> > results, but the Chebyshev fits are more numerically stable. The general
>> > approach is to overfit, i.e., use more polynomials than needed, and then
>> > truncate the series, resulting in a faux min/max approximation. Unlike
>> > power series, the coefficients of the Chebyshev series will tend to
>> > decrease rapidly at some point.
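
(That overfit-then-truncate recipe, using the numpy 1.6 Chebyshev class,
would look roughly like the sketch below; the data and the degrees are
made up purely for illustration:)

import numpy as np
from numpy.polynomial import Chebyshev as T

x = np.linspace(-1, 1, 201)
y = np.exp(-4 * x**2) + 0.01 * np.random.randn(x.size)  # fake noisy data

fit = T.fit(x, y, 30)     # deliberately overfit with degree 30
short = fit.truncate(11)  # then keep only the first 11 coefficients
                          # (degree 10); the discarded ones should
                          # already be tiny if the series has converged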
>>
>> I think I might still have something wrong with the way I use the
>> scipy.special polynomials.
>>
>> For a large sample with 10000 observations, the nice graph is Fourier
>> with 20 elements; the second (not so nice) one is scipy.special.chebyt
>> with 500 polynomials. The graph for 20 Chebyshev polynomials looks
>> very similar.
>>
>> Chebyshev doesn't want to get low enough to adjust to the low part.
>>
>> (Note: I rescale to [0, 1] for Fourier, and [-1, 1] for chebyt.)
>>
>
> That certainly doesn't look right. Could you mail me the data offline? Also,
> what are you fitting, the histogram, the cdf, or...?

The numbers are generated (by a function in scikits.statsmodels). It's
all in the script that I posted, except that I keep changing things.

I'm not fitting anything directly.
There is supposed to be a closed-form expression: the estimated
coefficient of each polynomial is just the sample mean of that
polynomial evaluated at the data. The theory and the descriptions
sounded easy, but unfortunately it didn't work out.
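
Stripped down, what I am computing is something like the sketch below
(the function name is just for illustration; the data and the grid are
assumed to be rescaled into (-1, 1), and the normalization against the
Chebyshev weight is exactly the part I suspect is wrong):

import numpy as np
from scipy.special import eval_chebyt

def density_chebyt(data, x, order=20):
    # orthogonal series density estimate: coefficient c_k is the
    # sample mean of the (orthonormalized) polynomial at the data
    fhat = np.zeros_like(x)
    for k in range(order):
        # norm of T_k under the weight 1/sqrt(1 - x**2):
        # pi for k == 0, pi/2 for k >= 1
        norm = np.pi if k == 0 else np.pi / 2.0
        c_k = (eval_chebyt(k, data) / np.sqrt(norm)).mean()
        fhat += c_k * eval_chebyt(k, x) / np.sqrt(norm)
    # fold the orthogonality weight back in; this blows up at the
    # endpoints, so x has to stay strictly inside (-1, 1)
    return fhat / np.sqrt(1.0 - x**2)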

I was just hoping to get lucky and be able to skip the small print.
http://onlinelibrary.wiley.com/doi/10.1002/wics.97/abstract
got me started, and it has the Fourier case that works.
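
The Fourier version that works is, modulo the rescaling, roughly the
following (cosine basis, orthonormal on [0, 1] under a uniform weight;
again an illustrative sketch, not the exact script):

import numpy as np

def density_fourier(data, x, order=20):
    # phi_0 = 1, so c_0 = 1 and the estimate starts at the uniform density
    fhat = np.ones_like(x)
    for k in range(1, order):
        c_k = (np.sqrt(2.0) * np.cos(k * np.pi * data)).mean()
        fhat += c_k * np.sqrt(2.0) * np.cos(k * np.pi * x)
    return fhat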

There are lots of older papers that I have only skimmed, but I should
be able to work out the Hermite case before going back to the general
case with arbitrary orthogonal polynomial bases. (Initially I wanted
to ignore the Hermite basis because gaussian_kde works well in that
case.)

Thanks,

Josef

>
> Chuck
>