[SciPy-User] [OT] Transform (i.e., Fourier, Laplace, etc.) methods in Prob. & Stats.

josef.pktd at gmail.com josef.pktd at gmail.com
Wed Nov 25 21:48:56 EST 2009


On Wed, Nov 25, 2009 at 9:04 PM,  <josef.pktd at gmail.com> wrote:
> On Wed, Nov 25, 2009 at 7:31 PM, David Goldsmith
> <d.l.goldsmith at gmail.com> wrote:
>> On Wed, Nov 25, 2009 at 3:45 PM, <josef.pktd at gmail.com> wrote:
>>>
>>> On Wed, Nov 25, 2009 at 6:24 PM, David Goldsmith
>>> <d.l.goldsmith at gmail.com> wrote:
>>> > Good info, thanks; I'll look up "your" thread, Josef, on the archive and
>>> > run
>>> > down what look like relevant references.  (FWIW, my interest is that I'm
>>> > helping out (nominally, "tutoring," but at this level it's more akin to
>>> > being
>>> > a sounding board, checking his derivations, and "reminding" him of
>>> > various
>>> > subtleties that are emphasized in math, but not necessarily in EE, etc.)
>>> > this guy working on his dissertation on air traffic control automation
>>> > using
>>> > wireless communication protocols, very probability heavy stuff, and for
>>> > the
>>> > second time yesterday, he presented me with a transform application - in
>>> > this instance, the "Z" transform - in this probability-heavy stuff, and
>>> > this
>>> > is outside of my training in probability, so I want to "bone-up.")
>>> > Thanks
>>> > again,
>>>
>>> I always have to look for your reply because you don't follow our
>>> bottom-posting
>>> policy.
>>
>> Sorry, I tend to "follow" when I'm saying something in direct response to
>> something I'm replying to and/or when I think that I'm likely _not_
>> terminating the thread, but when I'm responding generally and/or think that
>> I am likely terminating the thread, then I tend to just reply at the top.
>> I'll try to remember that we have a policy. :-)
>>
>>> I have seen the z-transform only in the context of time series analysis
>>> http://en.wikipedia.org/wiki/Z-transform
>>> especially this
>>>
>>> http://en.wikipedia.org/wiki/Z-transform#Linear_constant-coefficient_difference_equation
>>> covered to some extent in scipy.signal (lfilter and lti)
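>>>
>>> For a concrete (made-up) example of that connection, assuming numpy and
>>> scipy are available: the difference equation y[n] = 0.5*y[n-1] + x[n]
>>> has transfer function H(z) = 1/(1 - 0.5*z**-1), and lfilter evaluates it:
>>>
>>>     import numpy as np
>>>     from scipy import signal
>>>
>>>     b = [1.0]        # numerator (input) coefficients
>>>     a = [1.0, -0.5]  # denominator (output) coefficients
>>>
>>>     x = np.zeros(10)
>>>     x[0] = 1.0                   # unit impulse
>>>     y = signal.lfilter(b, a, x)  # impulse response, equal to 0.5**n
>>>     print(y)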
>>
>> Part of the problem was that it wasn't clear to either of us - myself or my
>> "student" - why the authors of this particular paper were using the
>> z-transform at all where they were - it seemed their result was easily
>> derivable w/out it, so we were both baffled.
>>
>>> so the other literature on Laplace transforms and characteristic functions
>>> might not be very closely related.
>>
>> Perhaps not directly (in any event, presently, I'm interested in
>> theoretical/"analytical," i.e., not numerical, applications anyway), but my
>> philosophy has always been, if I can be directed to something that is closer
>> to on target than what I've been able to find on my own, then even if it's
>> not a bulls-eye, I can often find a bulls-eye in the reference's
>> references.  For example, "Chung (or any other book on graduate
>> probability)" sounds like a good starting point.  So thanks for reminding me
>> about the thread.  (I knew it sounded familiar: I contributed to it!  And on
>> that note, I "let it lie" at the time, but now feel I should say, admittedly
>> a little defensively, that of course Anne's comments were on the mark; the
>> only reasons I felt it necessary to add what I did about complex integration
>> over a closed path were: A) you had indicated that you were a bit of a
>> novice in the field, and the result I was giving is, perhaps arguably, the
>> subject's most fundamental result, and B) I felt that it was important that
>> you were aware of it because, if any of your functions _were_ analytic and
>> your paths closed, then you shouldn't be doing any (explicit) numerical (or
>> symbolic, for that matter) integration at all - you should just be
>> "hard-wiring" those integrals to zero!  And for what it's worth: every time
>> you integrate with respect to one (continuous) real variable, you're doing a
>> path integration - one so comparatively trivial that we don't call it that,
>> but a path integration nevertheless.) :-)
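>>
>> (A quick numerical illustration of that closed-path point, with a toy
>> integrand of my choosing and plain numpy, nothing from the paper:
>> exp(z) is entire, so its integral around the unit circle is ~0, while
>> 1/z has a pole inside and gives 2*pi*i.)
>>
>>     import numpy as np
>>
>>     # unit circle z(t) = exp(i*t), so dz/dt = i*exp(i*t)
>>     n = 2000
>>     t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
>>     dt = 2.0 * np.pi / n
>>     z = np.exp(1j * t)
>>     dz = 1j * z                          # dz/dt along the circle
>>
>>     print(np.sum(np.exp(z) * dz) * dt)   # analytic integrand: ~0 (Cauchy)
>>     print(np.sum(dz / z) * dt)           # pole at 0 inside: ~2*pi*i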
>
> I was just reading up a bit on contour integrals on Wikipedia, and the
> treatment looks too applied for probability and measure theory. It just
> tells you how to use some tricks to calculate specific Riemann integrals
> in the complex plane. I didn't see any hints about Lebesgue integrals.
> All the real analysis and measure theory I have seen is based on Lebesgue
> integration, or Lebesgue-Stieltjes as in Chung's book. So for me contour
> integrals just fall in between the measure theory and the applied (real)
> calculations, and I never had to figure out what they do.
>
> I'm not doing path integration when I integrate with respect to a
> (probability) measure that has both continuous intervals and mass points
> (Lebesgue, not Riemann, if you want to be picky).

Maybe that last statement is wrong; it's been too long since I struggled
with this. Maybe I'm mixing up the Lebesgue integral, Lebesgue
measurability, and measures that are absolutely continuous with respect
to Lebesgue measure.
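
To make the kind of object I mean concrete, here is a toy example of my
own (not from Chung), using scipy.integrate.quad for the continuous part:
a distribution with a point mass of 0.3 at zero and, with probability
0.7, an Exp(1) density on (0, inf). The expectation is a weighted sum
over the atoms plus an ordinary integral over the density part.

    import numpy as np
    from scipy import integrate

    p_atom, x_atom = 0.3, 0.0   # mass point: P(X = 0) = 0.3
    w_cont = 0.7                # weight of the continuous Exp(1) part

    atom_part = p_atom * x_atom
    cont_part, _ = integrate.quad(lambda x: x * np.exp(-x), 0, np.inf)
    print(atom_part + w_cont * cont_part)   # 0.3*0 + 0.7*1 = 0.7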

Josef

>
> Josef
>
>>
>> DG
>>
>>>
>>> Josef
>>>
>>>
>>> >
>>> > DG
>>> >
>>> > On Wed, Nov 25, 2009 at 2:41 PM, nicky van foreest
>>> > <vanforeest at gmail.com>
>>> > wrote:
>>> >>
>>> >> Hi,
>>> >>
>>> >> 2009/11/25  <josef.pktd at gmail.com>:
>>> >> > On Wed, Nov 25, 2009 at 2:53 PM, David Goldsmith
>>> >> > <d.l.goldsmith at gmail.com> wrote:
>>> >> >> Are there enough applications of transform methods (by which I mean,
>>> >> >> Fourier, Laplace, Z, etc.) in probability & statistics for this to
>>> >> >> be
>>> >> >> considered its own specialty therein?  Any text recommendations on
>>> >> >> it
>>> >> >> (even
>>> >> >> if it's only a chapter dedicated to it)?  Thanks,
>>> >> >>
>>> >> >
>>> >> > Some information is in the thread on my recent question
>>> >> > "characteristic functions of probability distributions"
>>> >> >
>>> >> > There is a large literature in econometrics and statistics about
>>> >> > using
>>> >> > the characteristic function for estimation and testing.
>>> >> > Nicky's reference on queueing theory mostly uses the Laplace
>>> >> > transform (for discrete distributions),
>>> >>
>>> >> It has been some time (more than 5 years...), but I recall that
>>> >> Whitt, in his articles on the numerical inversion of Laplace
>>> >> transforms, discretized Laplace transforms to facilitate the
>>> >> inversion. The distributions themselves are not necessarily discrete.
>>> >> One example would be the waiting-time distribution of customers in a
>>> >> queue, which is continuous for most service and arrival processes.
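>>> >>
>>> >> As a toy illustration of numerical inversion (not Whitt's own
>>> >> algorithm; this just uses mpmath's invertlaplace routine, assuming a
>>> >> recent mpmath is installed): invert F(s) = 1/(s + 1), the transform
>>> >> of the Exp(1) density, and compare with exp(-t).
>>> >>
>>> >>     import mpmath
>>> >>
>>> >>     F = lambda s: 1 / (s + 1)   # Laplace transform of exp(-t)
>>> >>
>>> >>     for t in [0.5, 1.0, 2.0]:
>>> >>         approx = mpmath.invertlaplace(F, t, method='talbot')
>>> >>         print(t, approx, mpmath.exp(-t))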
>>> >>
>>> >> There is certainly potential for dedicated numerical inversion
>>> >> algorithms for the Laplace transforms of density and distribution
>>> >> functions. The latter form a somewhat specialized class of functions:
>>> >> distribution functions are 0 at -\infty, 1 at \infty, and are
>>> >> non-decreasing. They may also have discontinuities, but not too many.
>>> >> These properties may affect the inversion. Besides that, the
>>> >> transforms are used to obtain insight into the behavior of sums of
>>> >> independent random variables. The transform of such a sum is the
>>> >> product of the transforms of the individual distributions, and this
>>> >> product in turn requires inversion to, as some people call it, take
>>> >> away the Laplacian curtain.
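>>> >>
>>> >> A small sketch of that last point with numpy only (a toy example of
>>> >> mine, not from the queueing literature): the density of the sum of
>>> >> two independent Exp(1) variables is the convolution of exp(-t) with
>>> >> itself; in transform space that convolution is just the product
>>> >> (1/(s+1))**2, and the result is the Gamma(2, 1) density t*exp(-t).
>>> >>
>>> >>     import numpy as np
>>> >>
>>> >>     dt = 0.001
>>> >>     t = np.arange(0.0, 20.0, dt)
>>> >>     f = np.exp(-t)                          # Exp(1) density
>>> >>
>>> >>     conv = np.convolve(f, f)[:len(t)] * dt  # density of X1 + X2
>>> >>     gamma21 = t * np.exp(-t)                # known closed form
>>> >>     print(np.max(np.abs(conv - gamma21)))   # small discretization error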
>>> >>
>>> >> Nicky