[SciPy-User] Error estimation of integration via numpy.trapz
Jonas V
jonas-v at gmx.net
Sun Jan 19 19:25:08 EST 2014
Dear all,
I integrate samples f(x), where f and x are both 1-dimensional NumPy
arrays.
A = numpy.trapz(f, x=x)
Is there a way to get an error estimation of this integration?
Usually, for the composite trapezoidal rule on n uniform subintervals of
width h, the error can be bounded like this:
|E| <= (xmax-xmin) * h^2 / 12 * max(|f''(x)|)
So do I need to estimate max(|f''(x)|) myself, or is there already
something in NumPy that can be used for this?
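(For context, a minimal sketch of what I mean, assuming a uniform grid
and using two applications of numpy.gradient to estimate f'' — the
sample function sin(x) is just a placeholder, and the numerically
estimated second derivative makes the bound a rough guide rather than a
rigorous one:)

```python
import numpy as np

# Placeholder data for illustration: f(x) = sin(x) on a uniform grid.
x = np.linspace(0.0, np.pi, 101)
f = np.sin(x)

# The integral itself.
A = np.trapz(f, x=x)

# Estimate f''(x) by applying np.gradient twice.  This estimate is
# itself approximate, so the resulting bound is only indicative.
d2f = np.gradient(np.gradient(f, x), x)
max_d2f = np.max(np.abs(d2f))

# Composite trapezoidal error bound for n subintervals of width h:
#   |E| <= (xmax - xmin) * h**2 / 12 * max|f''|
n = len(x) - 1
h = (x[-1] - x[0]) / n
err_bound = (x[-1] - x[0]) * h**2 / 12.0 * max_d2f
```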
Thank you!
Kind regards,
Jonas