[SciPy-dev] Re: scipy / SGI MIPSpro compiler (part 3)

Pearu Peterson pearu at scipy.org
Mon Aug 26 14:37:07 EDT 2002


On 26 Aug 2002, Travis Oliphant wrote:

> > ======================================================================
> > FAIL: check_normal (test_morestats.test_anderson)
> > ----------------------------------------------------------------------
> > Traceback (most recent call last):
> >   File "/usr/local/unstable/lib/python2.1/site-packages/scipy/stats/tests/test_morestats.py", line 46, in check_normal
> >     assert(scipy.all(A < crit[-2:]))
> > AssertionError
> > ======================================================================
> > 
> 
> These "errors" (and a similar one in the shapiro test) come about
> because of real statistical tests, which can sometimes fail even though
> the code is working correctly.  I'm not sure what to do about this.
> Perhaps a warning could be printed showing the results if the test
> fails, rather than raising an error.
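
The warn-instead-of-raise idea above could be sketched with Python's
standard warnings module (the helper name warn_on_failure is hypothetical,
not part of scipy):

```python
import warnings

def warn_on_failure(condition, details):
    # Hypothetical helper: emit a warning naming the failed condition
    # instead of raising an AssertionError, so occasional statistical
    # misses do not abort the test run.
    if not condition:
        warnings.warn("statistical check failed: %s" % details)

# Example: report a failed Anderson-Darling check without stopping the suite.
warn_on_failure(False, "scipy.all(A < crit[-2:])")
```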

When I ran

>>> for i in range(200): scipy.stats.test(10)

and counted the number of "errors", I got 19. So roughly every 10th run 
one of these tests fails. I would therefore suggest the following fix:

def check_normal(self):
    msg = ''
    for i in range(5):
        x1 = scipy.stats.expon(size=50)
        x2 = scipy.stats.norm(size=50)
        A,crit,sig = scipy.stats.anderson(x1)
        if not (scipy.all(A > crit[:-1])):
            msg = 'scipy.all(A > crit[:-1])'
            continue  # try again
        A,crit,sig = scipy.stats.anderson(x2)
        if not (scipy.all(A < crit[-2:])):
            msg = 'scipy.all(A < crit[-2:])'
            continue
        else:
            return # success
    raise AssertionError,msg+' failed'
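
For reference, with the observed per-run failure rate of 19/200, five
independent retries make a spurious failure of the whole check very
unlikely (assuming the runs are independent):

```python
p_fail = 19 / 200.0       # observed per-run failure rate, about 0.095
p_spurious = p_fail ** 5  # probability that all 5 retries fail by chance
print(p_spurious)         # well under 1e-4
```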

Pearu



