From alun at griffinpc.co.uk Mon Jan 5 07:27:51 2015
From: alun at griffinpc.co.uk (Alun (Griffin PC))
Date: Mon, 05 Jan 2015 12:27:51 +0000
Subject: [SciPy-User] Problems with fitting Weibull distribution
Message-ID: <54AA8347.8090303@griffinpc.co.uk>

Hi,

I am trying to fit a Weibull distribution to some data with the following code:

# Python script to fit metocean data
import scipy.stats as s
import numpy as np
import matplotlib.pyplot as plt

# Load data (wind-speed column of the attached meteo.prn)
data = np.loadtxt("meteo.prn", skiprows=1, usecols=(4,))

# Fit the data, holding the location parameter at zero
p0, p1, p2, p3 = s.exponweib.fit(data, floc=0)

# Plot the fitted PDF over a normalised histogram of the data
x = np.linspace(data.min(), data.max(), 1000)
y = s.exponweib(p0, p1, p2, p3).pdf(x)
plt.plot(x, y)
plt.hist(data, bins=int(data.max()), normed=True)
plt.show()

Unfortunately, I don't get a distribution that looks anything like the input data. I have searched the web, and the code above is based on a couple of other posts, but I am confused about which arguments the fit function returns and which arguments the exponweib distribution needs.

All help greatly appreciated!

Thanks,
Alun Griffiths
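For reference, the parameter order can be checked with a minimal, self-contained sketch. This uses synthetic Weibull-distributed data as a stand-in for the Ws column of meteo.prn (the attachment is not loaded here); it shows that `exponweib.fit` returns `(a, c, loc, scale)` and that the frozen distribution accepts them positionally in the same order:

```python
import numpy as np
import scipy.stats as s

# Synthetic stand-in for the wind-speed data (hypothetical shape/scale values).
data = s.weibull_min(2.0, scale=8.0).rvs(size=2000, random_state=0)

# exponweib.fit returns four values, in this order:
#   a (exponentiation shape), c (Weibull shape), loc, scale.
# floc=0 pins the location parameter at zero during the fit.
a, c, loc, scale = s.exponweib.fit(data, floc=0)

# The frozen distribution takes the same positional order,
# so the fitted PDF can be evaluated directly.
x = np.linspace(data.min(), data.max(), 1000)
pdf_vals = s.exponweib(a, c, loc, scale).pdf(x)
```

With `a` fitted close to 1, `exponweib` reduces to an ordinary Weibull, so comparing the fitted `c` and `scale` against the generating values is one quick sanity check.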
-------------- next part --------------
[Attachment: meteo.prn — 3-hourly metocean records from 1992. Columns are
YYYY MM DD HH Ws Hs (wind speed and significant wave height). First rows:]

YYYY MM DD HH Ws Hs
1992 1 11 0 8.12 1.42
1992 1 11 3 8.01 1.5
1992 1 11 6 7.35 1.46
1992 1 11 9 7.67 1.42
1992 1 11 12 5.3 1.39
1992 1 11 15 6.86 1.37
1992 1 11 18 5.08 1.51
1992 1 11 21 3.5 2.06
...
12 7.73 2.04 1992 7 19 15 7.99 2.12 1992 7 19 18 5.6 1.98 1992 7 19 21 6.2 1.76 1992 7 20 0 7.59 1.65 1992 7 20 3 8.56 1.76 1992 7 20 6 7.2 1.92 1992 7 20 9 9.3 2.06 1992 7 20 12 10.98 2.37 1992 7 20 15 10.18 2.76 1992 7 20 18 9.55 2.98 1992 7 20 21 10.02 3.03 1992 7 21 0 8.27 2.83 1992 7 21 3 10.47 2.64 1992 7 21 6 9.89 2.66 1992 7 21 9 10.9 2.64 1992 7 21 12 10.54 2.68 1992 7 21 15 8.04 2.44 1992 7 21 18 7.34 2.06 1992 7 21 21 9.17 1.92 1992 7 22 0 9.73 2.01 1992 7 22 3 9.08 2 1992 7 22 6 8.48 1.83 1992 7 22 9 8.55 1.63 1992 7 22 12 7.51 1.51 1992 7 22 15 4.94 1.29 1992 7 22 18 4.42 1.08 1992 7 22 21 5.58 0.92 1992 7 23 0 5.81 0.88 1992 7 23 3 6.16 0.96 1992 7 23 6 8.11 1.19 1992 7 23 9 9.09 1.57 1992 7 23 12 9.27 1.9 1992 7 23 15 9.06 2.11 1992 7 23 18 7.6 2.08 1992 7 23 21 7.44 1.89 1992 7 24 0 6.88 1.84 1992 7 24 3 7.96 2.06 1992 7 24 6 7.78 2.14 1992 7 24 9 8.67 2.12 1992 7 24 12 10.68 2.15 1992 7 24 15 10.64 2.33 1992 7 24 18 12.61 2.59 1992 7 24 21 12.04 2.97 1992 7 25 0 10.04 3.09 1992 7 25 3 11.76 3.03 1992 7 25 6 13.16 3.12 1992 7 25 9 10.69 3.13 1992 7 25 12 10.99 3.08 1992 7 25 15 12.05 3.24 1992 7 25 18 13.49 3.42 1992 7 25 21 12.08 3.55 1992 7 26 0 10.49 3.44 1992 7 26 3 9.94 3.15 1992 7 26 6 11.66 3 1992 7 26 9 9.62 2.91 1992 7 26 12 10.98 2.74 1992 7 26 15 9.7 2.65 1992 7 26 18 8.72 2.42 1992 7 26 21 7.25 2.21 1992 7 27 0 7.99 2.03 1992 7 27 3 8.1 1.94 1992 7 27 6 5.98 1.86 1992 7 27 9 6.59 1.81 1992 7 27 12 14 1.92 1992 7 27 15 10.21 2.18 1992 7 27 18 10.52 2.32 1992 7 27 21 9.74 2.49 1992 7 28 0 9.37 2.62 1992 7 28 3 8.8 2.59 1992 7 28 6 9.82 2.55 1992 7 28 9 9.5 2.53 1992 7 28 12 10.8 2.47 1992 7 28 15 9.51 2.6 1992 7 28 18 10.42 2.74 1992 7 28 21 9.56 2.75 1992 7 29 0 9.3 2.79 1992 7 29 3 9.21 2.8 1992 7 29 6 7.04 3 1992 7 29 9 9.33 3.09 1992 7 29 12 9.8 3.03 1992 7 29 15 9.86 2.87 1992 7 29 18 8.46 2.72 1992 7 29 21 13.07 2.52 1992 7 30 0 12.49 2.83 1992 7 30 3 12.14 3.13 1992 7 30 6 8.65 3.21 1992 7 30 9 10.34 2.91 1992 7 30 12 10.68 2.8 1992 
7 30 15 13.3 3.04 1992 7 30 18 13.46 3.2 1992 7 30 21 11.66 3.14 1992 7 31 0 8.46 2.82 1992 7 31 3 8.69 2.37 1992 7 31 6 8.14 2.08 1992 7 31 9 7.5 1.87 1992 7 31 12 9.46 1.7 1992 7 31 15 7.77 1.67 1992 7 31 18 8.33 1.61 1992 7 31 21 7.7 1.63 1992 8 1 0 7.05 1.69 1992 8 1 3 7.75 1.76 1992 8 1 6 8.39 1.77 1992 8 1 9 8.9 1.86 1992 8 1 12 7.45 1.94 1992 8 1 15 6.94 1.91 1992 8 1 18 10.36 1.87 1992 8 1 21 9.92 2.05 1992 8 2 0 8.93 2.09 1992 8 2 3 6.76 1.92 1992 8 2 6 5.73 1.68 1992 8 2 9 4.33 1.47 1992 8 2 12 2.92 1.28 1992 8 2 15 2.26 1.1 1992 8 2 18 4.03 0.94 1992 8 2 21 3.51 0.81 1992 8 3 0 2.09 0.72 1992 8 3 3 3.27 0.67 1992 8 3 6 4.74 0.65 1992 8 3 9 4.88 0.65 1992 8 3 12 4.85 0.63 1992 8 3 15 5.42 0.66 1992 8 3 18 5.13 0.69 1992 8 3 21 4.5 0.71 1992 8 4 0 5.5 0.76 1992 8 4 3 5.36 0.86 1992 8 4 6 4.37 0.91 1992 8 4 9 4.49 0.96 1992 8 4 12 4.84 1.05 1992 8 4 15 4.05 1.1 1992 8 4 18 3.74 1.08 1992 8 4 21 3.88 1.05 1992 8 5 0 4.6 1.01 1992 8 5 3 4.92 1 1992 8 5 6 4.57 0.98 1992 8 5 9 5.23 0.96 1992 8 5 12 5.5 0.97 1992 8 5 15 6.42 0.99 1992 8 5 18 7.68 1.06 1992 8 5 21 6 1.09 1992 8 6 0 7.26 1.04 1992 8 6 3 8.31 1.06 1992 8 6 6 8.43 1.41 1992 8 6 9 7.85 1.57 1992 8 6 12 8.31 1.64 1992 8 6 15 8.07 1.73 1992 8 6 18 6.29 1.63 1992 8 6 21 5.33 1.48 1992 8 7 0 6.17 1.34 1992 8 7 3 6.11 1.24 1992 8 7 6 5.46 1.21 1992 8 7 9 6.71 1.23 1992 8 7 12 8.56 1.33 1992 8 7 15 9.18 1.5 1992 8 7 18 10.32 1.72 1992 8 7 21 11.06 2.06 1992 8 8 0 9.93 2.24 1992 8 8 3 10.33 2.55 1992 8 8 6 9.66 2.74 1992 8 8 9 10.74 2.85 1992 8 8 12 13.96 2.67 1992 8 8 15 13.81 2.74 1992 8 8 18 13.48 2.95 1992 8 8 21 12.07 3.27 1992 8 9 0 10.7 3.37 1992 8 9 3 11.12 3.3 1992 8 9 6 7.92 3.06 1992 8 9 9 9.81 3.09 1992 8 9 12 9.86 3.55 1992 8 9 15 11.44 3.63 1992 8 9 18 14.72 3.61 1992 8 9 21 12.96 3.71 1992 8 10 0 13.39 3.86 1992 8 10 3 15.13 4.33 1992 8 10 6 14.17 4.94 1992 8 10 9 11.39 4.6 1992 8 10 12 8.98 4.13 1992 8 10 15 6.59 3.61 1992 8 10 18 7.81 3.04 1992 8 10 21 9.71 2.56 1992 8 11 0 9.01 2.29 1992 8 
11 3 10.06 2.19 1992 8 11 6 10.73 2.19 1992 8 11 9 9.89 2.31 1992 8 11 12 11.2 2.35 1992 8 11 15 10.35 2.53 1992 8 11 18 9.98 2.57 1992 8 11 21 10.52 2.64 1992 8 12 0 11.34 2.99 1992 8 12 3 11.55 3.39 1992 8 12 6 11.84 3.31 1992 8 12 9 8.35 3.06 1992 8 12 12 6.53 2.8 1992 8 12 15 6.32 2.37 1992 8 12 18 7.12 1.98 1992 8 12 21 6.59 1.71 1992 8 13 0 3.56 1.53 1992 8 13 3 4.39 1.35 1992 8 13 6 4.05 1.18 1992 8 13 9 5.34 1.06 1992 8 13 12 3.75 0.97 1992 8 13 15 4.57 0.9 1992 8 13 18 7.65 0.91 1992 8 13 21 9.63 1.06 1992 8 14 0 11.2 1.51 1992 8 14 3 12.62 2.07 1992 8 14 6 13.45 2.6 1992 8 14 9 14.44 3.12 1992 8 14 12 15.46 3.64 1992 8 14 15 13.49 3.95 1992 8 14 18 12.07 3.96 1992 8 14 21 13.36 3.93 1992 8 15 0 13.05 3.88 1992 8 15 3 11.18 3.69 1992 8 15 6 9.87 3.46 1992 8 15 9 12.91 3.49 1992 8 15 12 11.67 3.54 1992 8 15 15 12.92 3.37 1992 8 15 18 12.34 3.49 1992 8 15 21 10.17 3.49 1992 8 16 0 7.72 3.28 1992 8 16 3 7.61 3.06 1992 8 16 6 8.41 2.87 1992 8 16 9 8.06 2.74 1992 8 16 12 7.74 2.59 1992 8 16 15 8.78 2.49 1992 8 16 18 7.08 2.38 1992 8 16 21 5.1 2.2 1992 8 17 0 2.77 2.01 1992 8 17 3 2.39 1.86 1992 8 17 6 4.09 1.74 1992 8 17 9 5.6 1.64 1992 8 17 12 2.21 1.58 1992 8 17 15 0.59 1.53 1992 8 17 18 1.95 1.46 1992 8 17 21 4.28 1.38 1992 8 18 0 5.92 1.3 1992 8 18 3 8.09 1.3 1992 8 18 6 8.09 1.42 1992 8 18 9 8.77 1.54 1992 8 18 12 9.43 1.65 1992 8 18 15 8.98 1.75 1992 8 18 18 9.59 1.87 1992 8 18 21 8.66 1.95 1992 8 19 0 10.56 2.03 1992 8 19 3 9.23 2.18 1992 8 19 6 7.6 2.15 1992 8 19 9 6.73 1.98 1992 8 19 12 5.93 1.82 1992 8 19 15 4.23 1.74 1992 8 19 18 3.47 1.69 1992 8 19 21 2.44 1.58 1992 8 20 0 7.03 1.4 1992 8 20 3 8.8 1.38 1992 8 20 6 11.5 1.59 1992 8 20 9 10.11 1.83 1992 8 20 12 9.76 1.92 1992 8 20 15 8.68 1.92 1992 8 20 18 9.99 1.96 1992 8 20 21 10.93 2.09 1992 8 21 0 11.58 2.36 1992 8 21 3 13.26 2.72 1992 8 21 6 11.42 3.04 1992 8 21 9 12.88 3.32 1992 8 21 12 14.96 3.69 1992 8 21 15 14.71 4.04 1992 8 21 18 18.4 4.38 1992 8 21 21 15.36 4.86 1992 8 22 0 15.07 5.19 1992 
8 22 3 15.52 5.59 1992 8 22 6 17.03 5.75 1992 8 22 9 13.84 5.85 1992 8 22 12 8.66 5.53 1992 8 22 15 8.33 5.03 1992 8 22 18 7.83 4.48 1992 8 22 21 6.7 4 1992 8 23 0 5.75 3.68 1992 8 23 3 6.65 3.56 1992 8 23 6 6.49 3.39 1992 8 23 9 5.47 3.16 1992 8 23 12 3.89 3.03 1992 8 23 15 5.6 3.01 1992 8 23 18 7.5 3.01 1992 8 23 21 10.11 3.01 1992 8 24 0 9.03 3.1 1992 8 24 3 10.92 3.19 1992 8 24 6 11.86 3.35 1992 8 24 9 11.17 3.73 1992 8 24 12 3.55 3.86 1992 8 24 15 1.67 3.64 1992 8 24 18 0.61 3.36 1992 8 24 21 5.44 3.09 1992 8 25 0 10.57 2.9 1992 8 25 3 11.08 2.95 1992 8 25 6 12.54 3.11 1992 8 25 9 9.33 3.23 1992 8 25 12 11.51 3.14 1992 8 25 15 10.77 2.99 1992 8 25 18 9.01 2.8 1992 8 25 21 8.7 2.58 1992 8 26 0 8.14 2.34 1992 8 26 3 8.77 2.15 1992 8 26 6 7.42 2.1 1992 8 26 9 9.25 2.1 1992 8 26 12 9.84 2.23 1992 8 26 15 10.29 2.38 1992 8 26 18 8.4 2.44 1992 8 26 21 6.66 2.36 1992 8 27 0 5.3 2.35 1992 8 27 3 6.26 2.34 1992 8 27 6 7.11 2.17 1992 8 27 9 9.7 2.11 1992 8 27 12 6.37 2.12 1992 8 27 15 10 2.01 1992 8 27 18 9.99 1.94 1992 8 27 21 10.61 2.14 1992 8 28 0 9.5 2.49 1992 8 28 3 7.66 2.42 1992 8 28 6 8.84 2.32 1992 8 28 9 7.63 2.15 1992 8 28 12 5.64 1.81 1992 8 28 15 4.95 1.47 1992 8 28 18 5.24 1.21 1992 8 28 21 6.4 1.09 1992 8 29 0 6.95 1.03 1992 8 29 3 7.02 1.04 1992 8 29 6 8.77 1.06 1992 8 29 9 8.12 1.12 1992 8 29 12 11.53 1.29 1992 8 29 15 13.12 1.86 1992 8 29 18 14.19 2.46 1992 8 29 21 11.15 2.88 1992 8 30 0 8.59 2.98 1992 8 30 3 5.84 2.84 1992 8 30 6 5.26 2.56 1992 8 30 9 10.79 2.22 1992 8 30 12 7.64 2.06 1992 8 30 15 6.91 1.91 1992 8 30 18 6.34 1.77 1992 8 30 21 8.34 1.69 1992 8 31 0 8.43 1.68 1992 8 31 3 7.68 1.72 1992 8 31 6 4.02 1.65 1992 8 31 9 7.66 1.59 1992 8 31 12 5.18 1.6 1992 8 31 15 4.85 1.52 1992 8 31 18 4.9 1.41 1992 8 31 21 4.9 1.28 1992 9 1 0 0.82 1.12 1992 9 1 3 2.11 0.97 1992 9 1 6 2.86 0.87 1992 9 1 9 4.29 0.81 1992 9 1 12 7.43 0.92 1992 9 1 15 9.58 1.09 1992 9 1 18 12.19 1.46 1992 9 1 21 11.62 1.9 1992 9 2 0 12.64 2.2 1992 9 2 3 13.22 2.62 1992 9 2 6 
12.91 3.01 1992 9 2 9 13.79 3.32 1992 9 2 12 14.91 3.6 1992 9 2 15 14.51 3.89 1992 9 2 18 17.42 4.19 1992 9 2 21 15.18 4.57 1992 9 3 0 13 4.51 1992 9 3 3 9.76 3.99 1992 9 3 6 10.43 3.5 1992 9 3 9 10.82 3.24 1992 9 3 12 12.61 3.25 1992 9 3 15 13.94 3.63 1992 9 3 18 12.09 3.97 1992 9 3 21 11.39 3.76 1992 9 4 0 8.73 3.41 1992 9 4 3 8.99 3.13 1992 9 4 6 8.64 3 1992 9 4 9 8.47 2.97 1992 9 4 12 8.85 2.83 1992 9 4 15 5.74 2.59 1992 9 4 18 5.52 2.38 1992 9 4 21 7.1 2.3 1992 9 5 0 9.93 2.45 1992 9 5 3 9.76 2.7 1992 9 5 6 10.49 2.75 1992 9 5 9 10.73 2.9 1992 9 5 12 8.4 3.02 1992 9 5 15 7.98 3.01 1992 9 5 18 10.83 2.89 1992 9 5 21 12.8 2.86 1992 9 6 0 11.04 2.92 1992 9 6 3 11.5 2.93 1992 9 6 6 10.07 2.97 1992 9 6 9 9.22 3.57 1992 9 6 12 5.25 3.65 1992 9 6 15 5.32 3.31 1992 9 6 18 11.34 3.02 1992 9 6 21 9.47 2.96 1992 9 7 0 7.62 2.76 1992 9 7 3 7.75 2.5 1992 9 7 6 6.7 2.37 1992 9 7 9 6.37 2.29 1992 9 7 12 3.94 2.13 1992 9 7 15 3.44 1.9 1992 9 7 18 4.86 1.65 1992 9 7 21 5.54 1.42 1992 9 8 0 5.43 1.23 1992 9 8 3 7.08 1.11 1992 9 8 6 8.61 1.11 1992 9 8 9 10.31 1.36 1992 9 8 12 9.31 1.64 1992 9 8 15 11.11 1.88 1992 9 8 18 12.89 2.33 1992 9 8 21 12.39 2.82 1992 9 9 0 13.34 3.25 1992 9 9 3 11.62 3.55 1992 9 9 6 9.9 3.37 1992 9 9 9 6.83 3 1992 9 9 12 3.8 2.67 1992 9 9 15 2.32 2.44 1992 9 9 18 4.44 2.25 1992 9 9 21 5.96 2.07 1992 9 10 0 8.28 1.98 1992 9 10 3 9.9 2.13 1992 9 10 6 11.03 2.52 1992 9 10 9 11.92 3.02 1992 9 10 12 11.21 3.14 1992 9 10 15 11.14 3 1992 9 10 18 11.52 2.98 1992 9 10 21 10.27 2.99 1992 9 11 0 8.99 2.92 1992 9 11 3 6.03 2.76 1992 9 11 6 3.57 2.58 1992 9 11 9 2.98 2.42 1992 9 11 12 4.38 2.27 1992 9 11 15 7.28 2.09 1992 9 11 18 7.75 1.97 1992 9 11 21 7.89 1.9 1992 9 12 0 8.88 1.83 1992 9 12 3 8.75 1.82 1992 9 12 6 8.75 1.77 1992 9 12 9 7.94 1.67 1992 9 12 12 8.06 1.6 1992 9 12 15 8.39 1.68 1992 9 12 18 7.24 1.78 1992 9 12 21 6.47 1.79 1992 9 13 0 6.34 1.76 1992 9 13 3 5.44 1.77 1992 9 13 6 3.71 1.79 1992 9 13 9 2.35 1.85 1992 9 13 12 2.85 1.97 1992 9 13 15 7.1 2.1 
1992 9 13 18 8.66 2.19 1992 9 13 21 10.87 2.34 1992 9 14 0 15.56 2.78 1992 9 14 3 14.02 3.29 1992 9 14 6 13.66 3.54 1992 9 14 9 10.52 3.16 1992 9 14 12 12.17 2.79 1992 9 14 15 9.7 2.82 1992 9 14 18 7.12 2.86 1992 9 14 21 8.8 2.8 1992 9 15 0 8.03 2.81 1992 9 15 3 9.4 2.88 1992 9 15 6 5.07 2.74 1992 9 15 9 4.39 2.42 1992 9 15 12 2.51 2.14 1992 9 15 15 6.7 1.9 1992 9 15 18 8.89 1.76 1992 9 15 21 11.82 1.83 1992 9 16 0 11.24 2.26 1992 9 16 3 12.25 2.73 1992 9 16 6 9.82 2.97 1992 9 16 9 9.39 3.14 1992 9 16 12 10.58 3.2 1992 9 16 15 9.36 3.17 1992 9 16 18 9.46 3.15 1992 9 16 21 9 3.06 1992 9 17 0 9.1 2.85 1992 9 17 3 7.13 2.53 1992 9 17 6 4.96 2.25 1992 9 17 9 4.5 2.07 1992 9 17 12 2.98 1.97 1992 9 17 15 3.08 1.81 1992 9 17 18 3.35 1.63 1992 9 17 21 2.56 1.45 1992 9 18 0 2.21 1.31 1992 9 18 3 5.24 1.2 1992 9 18 6 6.72 1.15 1992 9 18 9 7.71 1.18 1992 9 18 12 5.5 1.23 1992 9 18 15 4.35 1.25 1992 9 18 18 3.84 1.22 1992 9 18 21 4.45 1.19 1992 9 19 0 6.04 1.2 1992 9 19 3 4.11 1.17 1992 9 19 6 3.38 1.08 1992 9 19 9 6.51 1.01 1992 9 19 12 5.82 1 1992 9 19 15 5.78 1.02 1992 9 19 18 7.64 1.14 1992 9 19 21 8.81 1.31 1992 9 20 0 7.76 1.52 1992 9 20 3 8.93 1.75 1992 9 20 6 7.72 1.77 1992 9 20 9 6.93 1.65 1992 9 20 12 8.53 1.6 1992 9 20 15 7.9 1.62 1992 9 20 18 9.43 1.74 1992 9 20 21 10.15 2 1992 9 21 0 7.62 2.17 1992 9 21 3 7.29 2.32 1992 9 21 6 7.29 2.3 1992 9 21 9 10.01 2.17 1992 9 21 12 12.26 2.22 1992 9 21 15 12.99 2.81 1992 9 21 18 17.57 3.27 1992 9 21 21 15.51 3.94 1992 9 22 0 12.56 4.27 1992 9 22 3 14.1 4.41 1992 9 22 6 14.12 4.15 1992 9 22 9 11.45 4.28 1992 9 22 12 7.73 3.86 1992 9 22 15 7.92 3.53 1992 9 22 18 13.61 3.41 1992 9 22 21 15.59 3.84 1992 9 23 0 14.88 4.52 1992 9 23 3 13.05 4.82 1992 9 23 6 13.05 4.55 1992 9 23 9 13.71 4.65 1992 9 23 12 14.04 4.73 1992 9 23 15 18.82 5.17 1992 9 23 18 18.74 6.32 1992 9 23 21 15.75 6.24 1992 9 24 0 14.95 5.49 1992 9 24 3 14.46 5.18 1992 9 24 6 15.91 5.07 1992 9 24 9 10.98 4.66 1992 9 24 12 9.47 3.78 1992 9 24 15 8.21 3.27 1992 9 24 
18 8.61 2.89 1992 9 24 21 9.71 2.62 1992 9 25 0 7.12 2.56 1992 9 25 3 6.75 2.46 1992 9 25 6 5.92 2.38 1992 9 25 9 5.16 2.26 1992 9 25 12 4.27 2.17 1992 9 25 15 4.23 2.1 1992 9 25 18 4 2.03 1992 9 25 21 3.41 1.92 1992 9 26 0 5.22 1.76 1992 9 26 3 6.03 1.61 1992 9 26 6 6.82 1.53 1992 9 26 9 7.57 1.48 1992 9 26 12 6.52 1.41 1992 9 26 15 6.84 1.33 1992 9 26 18 5.94 1.29 1992 9 26 21 5.94 1.22 1992 9 27 0 5.47 1.12 1992 9 27 3 5.93 1.02 1992 9 27 6 8.12 1.03 1992 9 27 9 8.36 1.22 1992 9 27 12 8.98 1.23 1992 9 27 15 9.33 1.26 1992 9 27 18 10.84 1.61 1992 9 27 21 12.75 2.26 1992 9 28 0 10.39 2.95 1992 9 28 3 8.72 2.97 1992 9 28 6 9.28 2.82 1992 9 28 9 9.2 2.76 1992 9 28 12 12.03 2.88 1992 9 28 15 11.5 3 1992 9 28 18 8.8 2.72 1992 9 28 21 6.54 2.47 1992 9 29 0 10.83 2.37 1992 9 29 3 11.99 2.58 1992 9 29 6 9.97 2.75 1992 9 29 9 9.08 2.64 1992 9 29 12 8.29 2.32 1992 9 29 15 7.54 2.17 1992 9 29 18 7.82 2.43 1992 9 29 21 6.54 2.55 1992 9 30 0 7.02 2.42 1992 9 30 3 8.29 2.27 1992 9 30 6 7.63 2.13 1992 9 30 9 5.71 1.95 1992 9 30 12 3.17 1.74 1992 9 30 15 2.25 1.54 1992 9 30 18 4.45 1.36 1992 9 30 21 7.54 1.21 1992 10 1 0 7.93 1.18 1992 10 1 3 9.94 1.34 1992 10 1 6 9.74 1.63 1992 10 1 9 10.42 1.93 1992 10 1 12 5.85 2.11 1992 10 1 15 6.45 2.02 1992 10 1 18 7.86 1.89 1992 10 1 21 8.35 1.87 1992 10 2 0 7.74 1.89 1992 10 2 3 7.64 1.91 1992 10 2 6 8.07 1.96 1992 10 2 9 7.24 1.97 1992 10 2 12 5.91 1.91 1992 10 2 15 7.11 1.9 1992 10 2 18 7.09 1.89 1992 10 2 21 7.08 1.86 1992 10 3 0 5.16 1.87 1992 10 3 3 5.9 1.97 1992 10 3 6 5.84 1.95 1992 10 3 9 5.56 1.8 1992 10 3 12 8.42 1.64 1992 10 3 15 11.08 1.63 1992 10 3 18 11.77 1.88 1992 10 3 21 13.09 2.22 1992 10 4 0 13.35 2.65 1992 10 4 3 10.93 2.91 1992 10 4 6 10.6 2.96 1992 10 4 9 9.34 2.86 1992 10 4 12 8.12 2.76 1992 10 4 15 8.4 2.62 1992 10 4 18 5.11 2.48 1992 10 4 21 3.32 2.39 1992 10 5 0 2.02 2.32 1992 10 5 3 0.98 2.24 1992 10 5 6 3.41 2.11 1992 10 5 9 6.25 1.98 1992 10 5 12 5.78 1.88 1992 10 5 15 5.39 1.83 1992 10 5 18 5.83 1.74 1992 10 
5 21 4.61 1.61 1992 10 6 0 4.5 1.57 1992 10 6 3 3.8 1.58 1992 10 6 6 6.06 1.58 1992 10 6 9 5.34 1.58 1992 10 6 12 6.64 1.55 1992 10 6 15 9.57 1.59 1992 10 6 18 7.71 1.74 1992 10 6 21 10.75 1.88 1992 10 7 0 9.45 2.28 1992 10 7 3 8.93 2.45 1992 10 7 6 8.36 2.47 1992 10 7 9 11.08 2.51 1992 10 7 12 10.51 2.67 1992 10 7 15 11.19 2.61 1992 10 7 18 13.68 2.67 1992 10 7 21 11.99 2.91 1992 10 8 0 12.42 2.94 1992 10 8 3 10.53 2.97 1992 10 8 6 7.11 2.75 1992 10 8 9 7.63 2.38 1992 10 8 12 6.92 2.06 1992 10 8 15 6.07 1.82 1992 10 8 18 5.53 1.64 1992 10 8 21 4.8 1.6 1992 10 9 0 1.87 1.65 1992 10 9 3 3.71 1.65 1992 10 9 6 5.71 1.65 1992 10 9 9 9.12 1.59 1992 10 9 12 10.28 1.67 1992 10 9 15 12.62 1.96 1992 10 9 18 13.77 2.37 1992 10 9 21 14.51 2.8 1992 10 10 0 15.84 3.24 1992 10 10 3 14.98 3.67 1992 10 10 6 13.57 3.84 1992 10 10 9 13.07 3.98 1992 10 10 12 11.05 3.92 1992 10 10 15 7.28 3.73 1992 10 10 18 7.75 3.57 1992 10 10 21 7.42 3.35 1992 10 11 0 5.22 3.09 1992 10 11 3 4.53 2.83 1992 10 11 6 3.99 2.6 1992 10 11 9 6.16 2.4 1992 10 11 12 9.47 2.26 1992 10 11 15 8.78 2.2 1992 10 11 18 7.55 2.12 1992 10 11 21 6.22 2 1992 10 12 0 6.15 1.84 1992 10 12 3 13.08 1.77 1992 10 12 6 11.09 2.16 1992 10 12 9 12.28 2.56 1992 10 12 12 9.69 2.91 1992 10 12 15 10.47 2.9 1992 10 12 18 11.35 2.98 1992 10 12 21 11.66 3.08 1992 10 13 0 10.84 2.91 1992 10 13 3 7.39 2.49 1992 10 13 6 7.21 2.05 1992 10 13 9 10.68 1.71 1992 10 13 12 6.45 1.75 1992 10 13 15 15.61 1.91 1992 10 13 18 21.28 3.61 1992 10 13 21 20.84 5.53 1992 10 14 0 18.19 6.81 1992 10 14 3 16.51 7.06 1992 10 14 6 12.8 6.17 1992 10 14 9 12.71 5.15 1992 10 14 12 9.7 4.55 1992 10 14 15 8.91 3.99 1992 10 14 18 4.78 3.44 1992 10 14 21 4.95 2.89 1992 10 15 0 10.92 2.33 1992 10 15 3 11.37 2.01 1992 10 15 6 2.47 1.96 1992 10 15 9 7.12 1.89 1992 10 15 12 16.15 1.9 1992 10 15 15 5.78 2.14 1992 10 15 18 8.5 1.95 1992 10 15 21 8.81 1.78 1992 10 16 0 8.67 1.83 1992 10 16 3 9.58 1.91 1992 10 16 6 7.79 1.85 1992 10 16 9 8.19 1.75 1992 10 16 12 7.7 1.77 
1992 10 16 15 7.35 1.75 1992 10 16 18 4.31 1.69 1992 10 16 21 3.49 1.57 1992 10 17 0 2.91 1.42 1992 10 17 3 0 1.27 1992 10 17 6 3.61 1.14 1992 10 17 9 3.14 1.08 1992 10 17 12 3.14 1.04 1992 10 17 15 3.5 0.99 1992 10 17 18 5.19 0.93 1992 10 17 21 5.3 0.88 1992 10 18 0 6.84 0.86 1992 10 18 3 7.05 0.89 1992 10 18 6 9.5 1.01 1992 10 18 9 8.97 1.23 1992 10 18 12 9.31 1.48 1992 10 18 15 11.09 1.77 1992 10 18 18 12.79 2.18 1992 10 18 21 13.59 2.65 1992 10 19 0 12.88 3.18 1992 10 19 3 13.27 3.56 1992 10 19 6 15.2 3.73 1992 10 19 9 13.64 4 1992 10 19 12 12.97 4.17 1992 10 19 15 8.02 3.83 1992 10 19 18 4.82 3.36 1992 10 19 21 7.39 2.98 1992 10 20 0 6.74 2.7 1992 10 20 3 8.55 2.4 1992 10 20 6 9.94 2.25 1992 10 20 9 10.04 2.3 1992 10 20 12 7.1 2.33 1992 10 20 15 4.66 2.2 1992 10 20 18 5.27 1.98 1992 10 20 21 5.04 1.8 1992 10 21 0 6.98 1.62 1992 10 21 3 9.88 1.53 1992 10 21 6 12.28 1.72 1992 10 21 9 9.51 1.98 1992 10 21 12 8.92 2.04 1992 10 21 15 9.22 2.15 1992 10 21 18 8.09 2.1 1992 10 21 21 6.38 1.96 1992 10 22 0 6.32 1.81 1992 10 22 3 6.11 1.7 1992 10 22 6 5.17 1.56 1992 10 22 9 4.23 1.46 1992 10 22 12 2.7 1.35 1992 10 22 15 2.94 1.26 1992 10 22 18 3.6 1.2 1992 10 22 21 3.07 1.19 1992 10 23 0 3.09 1.2 1992 10 23 3 4.71 1.19 1992 10 23 6 6.05 1.18 1992 10 23 9 7.07 1.21 1992 10 23 12 9.28 1.32 1992 10 23 15 10.28 1.65 1992 10 23 18 8.94 2.13 1992 10 23 21 8.57 2.2 1992 10 24 0 7.82 1.94 1992 10 24 3 10.98 1.81 1992 10 24 6 12.79 2.11 1992 10 24 9 12.54 2.58 1992 10 24 12 11.8 2.66 1992 10 24 15 9.99 2.64 1992 10 24 18 10.42 2.44 1992 10 24 21 10.13 2.33 1992 10 25 0 10.83 2.4 1992 10 25 3 14.39 2.74 1992 10 25 6 9.94 3.31 1992 10 25 9 6.35 3.28 1992 10 25 12 8.79 3.14 1992 10 25 15 10.54 2.8 1992 10 25 18 10.67 2.69 1992 10 25 21 12.21 2.89 1992 10 26 0 13.11 3.3 1992 10 26 3 12.82 3.57 1992 10 26 6 8.37 3.33 1992 10 26 9 9.76 2.92 1992 10 26 12 10.9 3.05 1992 10 26 15 10.58 2.96 1992 10 26 18 12.05 2.74 1992 10 26 21 9.27 2.94 1992 10 27 0 4.82 2.89 1992 10 27 3 4.98 2.86 
1992 10 27 6 6.79 2.69 1992 10 27 9 8.72 2.47 1992 10 27 12 9.91 2.46 1992 10 27 15 11.32 2.65 1992 10 27 18 9.88 2.79 1992 10 27 21 12.4 2.94 1992 10 28 0 10.93 3.06 1992 10 28 3 10.15 3.03 1992 10 28 6 13.33 3.38 1992 10 28 9 14.65 4.08 1992 10 28 12 10.85 4.24 1992 10 28 15 10.39 4 1992 10 28 18 10.4 3.66 1992 10 28 21 9.56 3.21 1992 10 29 0 10.65 2.79 1992 10 29 3 10.45 2.49 1992 10 29 6 13.95 2.27 1992 10 29 9 12.09 2.02 1992 10 29 12 6.28 1.94 1992 10 29 15 13.67 1.77 1992 10 29 18 17.88 2.32 1992 10 29 21 17.56 3.59 1992 10 30 0 15.07 3.83 1992 10 30 3 12.93 4.01 1992 10 30 6 12.43 3.65 1992 10 30 9 15.54 3.55 1992 10 30 12 13.35 3.9 1992 10 30 15 13.9 4.12 1992 10 30 18 13.62 4.53 1992 10 30 21 12.6 4.89 1992 10 31 0 6.97 4.67 1992 10 31 3 5.96 4.01 1992 10 31 6 9.34 3.35 1992 10 31 9 11.53 2.87 1992 10 31 12 7.85 2.54 1992 10 31 15 5.64 2.18 1992 10 31 18 6.14 1.84 1992 10 31 21 6.35 1.59 1992 11 1 0 10.09 1.57 1992 11 1 3 8.67 2.01 1992 11 1 6 7.36 2.08 1992 11 1 9 7.16 1.9 1992 11 1 12 8.31 1.95 1992 11 1 15 8.18 2.24 1992 11 1 18 10.73 2.49 1992 11 1 21 10.25 2.72 1992 11 2 0 10.48 2.87 1992 11 2 3 11.09 3.15 1992 11 2 6 9.85 3.09 1992 11 2 9 11.22 3.17 1992 11 2 12 8.19 3.05 1992 11 2 15 8.87 2.79 1992 11 2 18 6.44 2.59 1992 11 2 21 8.38 2.42 1992 11 3 0 6.96 2.33 1992 11 3 3 9.26 2.43 1992 11 3 6 10.15 2.7 1992 11 3 9 10.31 2.8 1992 11 3 12 9.13 2.83 1992 11 3 15 9.58 2.92 1992 11 3 18 8.43 3.02 1992 11 3 21 9.99 3.16 1992 11 4 0 11.2 3.35 1992 11 4 3 10.64 3.24 1992 11 4 6 6.44 2.78 1992 11 4 9 6.14 2.27 1992 11 4 12 9.9 1.9 1992 11 4 15 8.97 1.82 1992 11 4 18 6.72 1.76 1992 11 4 21 5.67 1.67 1992 11 5 0 6.39 1.66 1992 11 5 3 6.99 1.68 1992 11 5 6 6.51 1.75 1992 11 5 9 6.43 1.76 1992 11 5 12 9.42 1.75 1992 11 5 15 9.26 1.93 1992 11 5 18 9.27 2.06 1992 11 5 21 7.78 2.01 1992 11 6 0 5.37 1.82 1992 11 6 3 6.89 1.63 1992 11 6 6 8.48 1.55 1992 11 6 9 9.88 1.67 1992 11 6 12 10.46 1.96 1992 11 6 15 12.05 2.36 1992 11 6 18 13.51 2.89 1992 11 6 21 13.8 3.15 
1992 11 7 0 11.7 3.37 1992 11 7 3 9.06 3.41 1992 11 7 6 8.92 3.31 1992 11 7 9 10.76 3.38 1992 11 7 12 9.17 3.52 1992 11 7 15 8.02 3.33 1992 11 7 18 9.44 3.02 1992 11 7 21 9.09 2.81 1992 11 8 0 9.02 2.62 1992 11 8 3 8.71 2.51 1992 11 8 6 7.52 2.49 1992 11 8 9 6.73 2.49 1992 11 8 12 4.8 2.36 1992 11 8 15 2.59 2.2 1992 11 8 18 2.55 2.15 1992 11 8 21 6.05 2.13 1992 11 9 0 7.72 2.09 1992 11 9 3 9.91 2.21 1992 11 9 6 10.24 2.27 1992 11 9 9 12.87 2.47 1992 11 9 12 12.83 3.23 1992 11 9 15 9.52 3.39 1992 11 9 18 6.93 2.91 1992 11 9 21 8.19 2.6 1992 11 10 0 5.26 2.33 1992 11 10 3 6.74 2.04 1992 11 10 6 4.52 1.81 1992 11 10 9 6.28 1.59 1992 11 10 12 7.74 1.45 1992 11 10 15 7.19 1.4 1992 11 10 18 6.09 1.38 1992 11 10 21 6.04 1.34 1992 11 11 0 3.77 1.24 1992 11 11 3 3.17 1.09 1992 11 11 6 2.17 0.97 1992 11 11 9 0.49 0.86 1992 11 11 12 1.51 0.77 1992 11 11 15 4.77 0.69 1992 11 11 18 1.37 0.74 1992 11 11 21 3.75 0.72 1992 11 12 0 9.41 0.82 1992 11 12 3 9.96 1.22 1992 11 12 6 11 1.65 1992 11 12 9 9.65 2.19 1992 11 12 12 11.52 2.48 1992 11 12 15 9.01 2.46 1992 11 12 18 6.87 2.15 1992 11 12 21 5.36 1.83 1992 11 13 0 1.41 1.58 1992 11 13 3 0.75 1.36 1992 11 13 6 1.34 1.19 1992 11 13 9 4.08 1.05 1992 11 13 12 7.12 0.96 1992 11 13 15 8.4 1.03 1992 11 13 18 10.58 1.46 1992 11 13 21 10.53 1.97 1992 11 14 0 7.74 2.2 1992 11 14 3 10.09 2.3 1992 11 14 6 13.26 2.75 1992 11 14 9 11.92 3.11 1992 11 14 12 10.24 3.1 1992 11 14 15 9.38 3.04 1992 11 14 18 9.7 2.89 1992 11 14 21 8.55 2.58 1992 11 15 0 8.77 2.32 1992 11 15 3 9.59 2.2 1992 11 15 6 10.77 2.2 1992 11 15 9 8.4 2.17 1992 11 15 12 6.16 2.05 1992 11 15 15 6.27 1.87 1992 11 15 18 7.56 1.68 1992 11 15 21 9.13 1.67 1992 11 16 0 7.67 1.68 1992 11 16 3 8.88 1.66 1992 11 16 6 9.21 1.74 1992 11 16 9 8.21 1.78 1992 11 16 12 7.73 1.7 1992 11 16 15 6.83 1.58 1992 11 16 18 5.6 1.49 1992 11 16 21 5.44 1.43 1992 11 17 0 3.37 1.29 1992 11 17 3 1.47 1.13 1992 11 17 6 0.75 0.98 1992 11 17 9 0.19 0.84 1992 11 17 12 4.39 0.73 1992 11 17 15 6.31 0.89 1992 11 
17 18 9.12 0.92 1992 11 17 21 8.98 1.1 1992 11 18 0 10.53 1.36 1992 11 18 3 11.46 1.76 1992 11 18 6 12.03 2.19 1992 11 18 9 10.18 2.52 1992 11 18 12 6.83 2.38 1992 11 18 15 5.63 2.07 1992 11 18 18 3.95 1.8 1992 11 18 21 2.07 1.61 1992 11 19 0 1.5 1.48 1992 11 19 3 3.52 1.39 1992 11 19 6 4 1.39 1992 11 19 9 5.19 1.46 1992 11 19 12 7.71 1.57 1992 11 19 15 5.45 1.7 1992 11 19 18 5.1 1.72 1992 11 19 21 5.2 1.66 1992 11 20 0 4.68 1.6 1992 11 20 3 3.73 1.56 1992 11 20 6 3.52 1.58 1992 11 20 9 7.09 1.61 1992 11 20 12 9.25 1.72 1992 11 20 15 8.64 1.83 1992 11 20 18 7.01 1.83 1992 11 20 21 6.81 1.62 1992 11 21 0 4.44 1.45 1992 11 21 3 2.79 1.31 1992 11 21 6 5.33 1.24 1992 11 21 9 6.36 1.2 1992 11 21 12 6.23 1.18 1992 11 21 15 5.83 1.14 1992 11 21 18 4.87 1.07 1992 11 21 21 5.15 1.03 1992 11 22 0 5.17 1.01 1992 11 22 3 3.8 0.97 1992 11 22 6 4.02 0.95 1992 11 22 9 5.3 0.95 1992 11 22 12 5.09 0.94 1992 11 22 15 4.61 0.91 1992 11 22 18 3.72 0.86 1992 11 22 21 1.43 0.78 1992 11 23 0 2.7 0.69 1992 11 23 3 5.72 0.81 1992 11 23 6 6.49 0.78 1992 11 23 9 7.8 0.92 1992 11 23 12 9.04 1.08 1992 11 23 15 9.01 1.27 1992 11 23 18 9.07 1.54 1992 11 23 21 8.32 1.73 1992 11 24 0 6.64 1.59 1992 11 24 3 7.85 1.49 1992 11 24 6 8.43 1.52 1992 11 24 9 9.47 1.69 1992 11 24 12 10.97 2.01 1992 11 24 15 9.56 2.16 1992 11 24 18 4.26 2.08 1992 11 24 21 4.69 1.91 1992 11 25 0 5.12 1.7 1992 11 25 3 2.91 1.43 1992 11 25 6 4.43 1.16 1992 11 25 9 8.25 1 1992 11 25 12 7.43 1.03 1992 11 25 15 8.15 1.15 1992 11 25 18 7.13 1.26 1992 11 25 21 6.81 1.25 1992 11 26 0 5.46 1.17 1992 11 26 3 5.15 1.08 1992 11 26 6 5.39 1.01 1992 11 26 9 5.1 0.96 1992 11 26 12 5.11 0.91 1992 11 26 15 6.32 0.86 1992 11 26 18 8.61 0.89 1992 11 26 21 7.98 0.95 1992 11 27 0 9.81 1.06 1992 11 27 3 9.5 1.3 1992 11 27 6 11.19 1.71 1992 11 27 9 11.35 2.11 1992 11 27 12 13.48 2.45 1992 11 27 15 12.9 3.06 1992 11 27 18 11.6 3.28 1992 11 27 21 11.76 3.24 1992 11 28 0 14.97 3.3 1992 11 28 3 13.88 3.87 1992 11 28 6 9.52 4.22 1992 11 28 9 11.18 
3.83 1992 11 28 12 5.02 3.46 1992 11 28 15 5.87 3.04 1992 11 28 18 7.98 2.67 1992 11 28 21 6.18 2.43 1992 11 29 0 4.03 2.21 1992 11 29 3 1.89 2.02 1992 11 29 6 1.69 1.9 1992 11 29 9 2.89 1.78 1992 11 29 12 1.66 1.64 1992 11 29 15 5.74 1.47 1992 11 29 18 8.7 1.37 1992 11 29 21 8.44 1.29 1992 11 30 0 9.05 1.35 1992 11 30 3 7.25 1.54 1992 11 30 6 8.36 1.67 1992 11 30 9 9.41 1.7 1992 11 30 12 10.89 1.73 1992 11 30 15 11.38 1.9 1992 11 30 18 12.89 2.14 1992 11 30 21 12.93 2.46 1992 12 1 0 11.87 2.78 1992 12 1 3 9.71 2.83 1992 12 1 6 10.38 2.72 1992 12 1 9 9.27 2.63 1992 12 1 12 6.91 2.5 1992 12 1 15 5.64 2.28 1992 12 1 18 3.42 2.12 1992 12 1 21 2.78 2.04 1992 12 2 0 3.31 1.9 1992 12 2 3 3.72 1.84 1992 12 2 6 4.98 1.85 1992 12 2 9 7.28 1.91 1992 12 2 12 7.46 1.94 1992 12 2 15 7.14 2.06 1992 12 2 18 6.09 2.2 1992 12 2 21 7.1 2.35 1992 12 3 0 7.04 2.51 1992 12 3 3 9.25 2.67 1992 12 3 6 8.34 2.76 1992 12 3 9 8.91 2.8 1992 12 3 12 6.71 2.8 1992 12 3 15 6.23 2.72 1992 12 3 18 4.04 2.63 1992 12 3 21 4.05 2.5 1992 12 4 0 4.39 2.36 1992 12 4 3 4.06 2.23 1992 12 4 6 6.17 2.13 1992 12 4 9 4.24 2.04 1992 12 4 12 3.08 1.96 1992 12 4 15 2.04 1.9 1992 12 4 18 2.53 1.86 1992 12 4 21 2.16 1.85 1992 12 5 0 2.81 1.87 1992 12 5 3 2.12 1.91 1992 12 5 6 1.1 1.95 1992 12 5 9 2.43 1.97 1992 12 5 12 6.05 1.98 1992 12 5 15 6.95 2 1992 12 5 18 7.55 2.03 1992 12 5 21 10.75 2.14 1992 12 6 0 11.15 2.32 1992 12 6 3 8.45 2.47 1992 12 6 6 3.11 2.45 1992 12 6 9 2.48 2.34 1992 12 6 12 2.79 2.17 1992 12 6 15 3.12 2.02 1992 12 6 18 5.79 1.89 1992 12 6 21 6.33 1.84 1992 12 7 0 6.21 1.82 1992 12 7 3 6.58 1.8 1992 12 7 6 5.83 1.76 1992 12 7 9 4.24 1.7 1992 12 7 12 5.94 1.67 1992 12 7 15 6.92 1.67 1992 12 7 18 5.98 1.7 1992 12 7 21 5.3 1.73 1992 12 8 0 7.8 1.68 1992 12 8 3 7.66 1.64 1992 12 8 6 6.63 1.6 1992 12 8 9 6.41 1.58 1992 12 8 12 9.54 1.72 1992 12 8 15 10.36 2 1992 12 8 18 8.4 2.08 1992 12 8 21 10.02 2.17 1992 12 9 0 10.43 2.24 1992 12 9 3 11.06 2.35 1992 12 9 6 11.7 2.72 1992 12 9 9 12.48 2.94 1992 12 
9 12 10.65 2.93 1992 12 9 15 9.31 2.77 1992 12 9 18 5.82 2.45 1992 12 9 21 5.14 2.12 1992 12 10 0 5.96 1.83 1992 12 10 3 3.66 1.61 1992 12 10 6 1.8 1.42 1992 12 10 9 1.63 1.26 1992 12 10 12 2.57 1.15 1992 12 10 15 4.64 1.06 1992 12 10 18 4.21 1.05 1992 12 10 21 4.61 0.96 1992 12 11 0 5.65 0.93 1992 12 11 3 7.5 0.99 1992 12 11 6 7.99 1.12 1992 12 11 9 6.93 1.24 1992 12 11 12 8.73 1.34 1992 12 11 15 7.6 1.46 1992 12 11 18 9.14 1.52 1992 12 11 21 9.91 1.67 1992 12 12 0 9.71 1.86 1992 12 12 3 11.02 2.15 1992 12 12 6 13 2.59 1992 12 12 9 12.18 3 1992 12 12 12 13.71 3.26 1992 12 12 15 14.2 3.68 1992 12 12 18 13.16 4.21 1992 12 12 21 10.57 4.04 1992 12 13 0 10.86 3.6 1992 12 13 3 12.88 3.43 1992 12 13 6 13.09 3.58 1992 12 13 9 9.95 3.33 1992 12 13 12 8.5 2.92 1992 12 13 15 6.56 2.63 1992 12 13 18 6.21 2.41 1992 12 13 21 4.07 2.26 1992 12 14 0 3.95 2.18 1992 12 14 3 4.37 2.13 1992 12 14 6 3.73 2.06 1992 12 14 9 3.32 1.94 1992 12 14 12 3.61 1.83 1992 12 14 15 4.84 1.71 1992 12 14 18 8.26 1.61 1992 12 14 21 7.9 1.61 1992 12 15 0 7.95 1.62 1992 12 15 3 7.13 1.62 1992 12 15 6 5.19 1.54 1992 12 15 9 2.7 1.48 1992 12 15 12 2.26 1.41 1992 12 15 15 5.56 1.35 1992 12 15 18 5.9 1.39 1992 12 15 21 5.23 1.55 1992 12 16 0 1.38 1.46 1992 12 16 3 3.76 1.31 1992 12 16 6 3.15 1.2 1992 12 16 9 5.1 1.11 1992 12 16 12 5.34 1.04 1992 12 16 15 5.17 1.04 1992 12 16 18 3.67 1.06 1992 12 16 21 3.03 1.08 1992 12 17 0 4.16 1.06 1992 12 17 3 6.34 1.05 1992 12 17 6 3.53 1.15 1992 12 17 9 2.86 1.46 1992 12 17 12 4.21 1.66 1992 12 17 15 1.39 1.71 1992 12 17 18 1.86 1.69 1992 12 17 21 5.17 1.59 1992 12 18 0 5.87 1.55 1992 12 18 3 8.15 1.64 1992 12 18 6 8.14 1.59 1992 12 18 9 7.93 1.53 1992 12 18 12 7.01 1.53 1992 12 18 15 8.04 1.47 1992 12 18 18 7.56 1.5 1992 12 18 21 7.98 1.84 1992 12 19 0 5.16 1.96 1992 12 19 3 6.27 1.8 1992 12 19 6 5.5 1.83 1992 12 19 9 5.67 1.89 1992 12 19 12 9.37 1.82 1992 12 19 15 13.43 1.99 1992 12 19 18 13.4 2.69 1992 12 19 21 14.55 3.3 1992 12 20 0 14.96 3.88 1992 12 20 3 16.28 
4.66 1992 12 20 6 16.69 5.51 1992 12 20 9 14.62 5.88 1992 12 20 12 12.48 5.33 1992 12 20 15 16.74 4.88 1992 12 20 18 17.6 4.94 1992 12 20 21 17.4 5.31 1992 12 21 0 12.48 4.96 1992 12 21 3 13.36 4.38 1992 12 21 6 12.77 3.93 1992 12 21 9 10.35 3.35 1992 12 21 12 7.11 2.73 1992 12 21 15 3.15 2.28 1992 12 21 18 0 1.93 1992 12 21 21 2.84 1.6 1992 12 22 0 6.11 1.29 1992 12 22 3 8.54 1.07 1992 12 22 6 8.16 1.13 1992 12 22 9 11.25 1.37 1992 12 22 12 12.38 1.81 1992 12 22 15 13.09 2.24 1992 12 22 18 13.08 2.67 1992 12 22 21 9.62 2.81 1992 12 23 0 5.69 2.59 1992 12 23 3 7.16 2.31 1992 12 23 6 8.25 2.1 1992 12 23 9 6.98 1.96 1992 12 23 12 6.02 1.8 1992 12 23 15 7.53 1.77 1992 12 23 18 6.07 1.76 1992 12 23 21 4.74 1.65 1992 12 24 0 3.93 1.56 1992 12 24 3 4.03 1.48 1992 12 24 6 1.68 1.4 1992 12 24 9 3.65 1.36 1992 12 24 12 4.47 1.31 1992 12 24 15 6.55 1.21 1992 12 24 18 7.28 1.13 1992 12 24 21 6.94 1.17 1992 12 25 0 8.08 1.28 1992 12 25 3 7.88 1.36 1992 12 25 6 9.2 1.42 1992 12 25 9 7.84 1.55 1992 12 25 12 9.84 1.71 1992 12 25 15 9.15 1.94 1992 12 25 18 10.65 2.02 1992 12 25 21 10.45 2.08 1992 12 26 0 10.94 2.1 1992 12 26 3 10.73 2.41 1992 12 26 6 12.73 3.31 1992 12 26 9 13.85 3.77 1992 12 26 12 15.82 4.09 1992 12 26 15 16.89 4.7 1992 12 26 18 15.75 5.4 1992 12 26 21 15.32 5.82 1992 12 27 0 11.78 5.68 1992 12 27 3 8.68 4.92 1992 12 27 6 10.29 4.3 1992 12 27 9 10.09 3.89 1992 12 27 12 10.73 3.62 1992 12 27 15 6.8 3.48 1992 12 27 18 5.81 3.22 1992 12 27 21 7.64 2.96 1992 12 28 0 5.6 2.75 1992 12 28 3 4.45 2.54 1992 12 28 6 7.41 2.39 1992 12 28 9 10.92 2.35 1992 12 28 12 6.96 2.27 1992 12 28 15 7.44 2.14 1992 12 28 18 11.71 2.05 1992 12 28 21 5.71 2.03 1992 12 29 0 2.38 1.83 1992 12 29 3 6.42 1.65 1992 12 29 6 11.14 1.53 1992 12 29 9 10.47 1.61 1992 12 29 12 2.14 1.74 1992 12 29 15 3.99 1.54 1992 12 29 18 7.22 1.39 1992 12 29 21 5.67 1.44 1992 12 30 0 3.18 1.44 1992 12 30 3 6.89 1.49 1992 12 30 6 7 1.94 1992 12 30 9 11.14 2.3 1992 12 30 12 14.42 2.53 1992 12 30 15 13.48 2.86 1992 
12 30 18 14.17 3.49 1992 12 30 21 16.52 4.15 1992 12 31 0 14.49 4.49 1992 12 31 3 14.37 4.47 1992 12 31 6 11.36 4.02 1992 12 31 9 13.06 3.65 1992 12 31 12 7.99 3.24 1992 12 31 15 7.92 2.59 1992 12 31 18 5.84 2.1 1992 12 31 21 5.27 1.72 1993 1 1 0 7.29 1.44 1993 1 1 3 8.31 1.31 1993 1 1 6 4.79 1.24 1993 1 1 9 5.57 1.1 1993 1 1 12 6.37 1.03 1993 1 1 15 7.41 1.07 1993 1 1 18 6.06 1.29 1993 1 1 21 5.47 1.28 1993 1 2 0 3.44 1.17 1993 1 2 3 2.2 1.11 1993 1 2 6 3.78 1.25 1993 1 2 9 3.4 1.35 1993 1 2 12 5.14 1.38 1993 1 2 15 4.48 1.32 1993 1 2 18 6.26 1.23 1993 1 2 21 5.64 1.2 1993 1 3 0 6.49 1.18 1993 1 3 3 7.19 1.2 1993 1 3 6 9.43 1.28 1993 1 3 9 6.25 1.33 1993 1 3 12 3.07 1.26 1993 1 3 15 2.25 1.17 1993 1 3 18 2.99 1.1 1993 1 3 21 4.79 1.04 1993 1 4 0 9.13 1.06 1993 1 4 3 10.1 1.26 1993 1 4 6 7.32 1.46 1993 1 4 9 4.21 1.33 1993 1 4 12 8.15 1.17 1993 1 4 15 8.93 1.22 1993 1 4 18 9.32 1.35 1993 1 4 21 9.06 1.55 1993 1 5 0 6.76 1.68 1993 1 5 3 4.88 1.64 1993 1 5 6 3.57 1.54 1993 1 5 9 3.27 1.39 1993 1 5 12 3.48 1.29 1993 1 5 15 6.87 1.22 1993 1 5 18 9.32 1.18 1993 1 5 21 8.16 1.34 1993 1 6 0 2.24 1.39 1993 1 6 3 4.94 1.37 1993 1 6 6 3.05 1.24 1993 1 6 9 3.41 1.11 1993 1 6 12 4.97 1.07 1993 1 6 15 11.71 1.28 1993 1 6 18 14.17 2.04 1993 1 6 21 14.97 2.89 1993 1 7 0 12.05 3.11 1993 1 7 3 11.41 3.17 1993 1 7 6 11.18 3.08 1993 1 7 9 10.43 2.93 1993 1 7 12 7.7 2.65 1993 1 7 15 6.59 2.28 1993 1 7 18 5.3 1.99 1993 1 7 21 7.38 1.78 1993 1 8 0 7.87 1.72 1993 1 8 3 7.04 1.72 1993 1 8 6 5.88 1.59 1993 1 8 9 4.84 1.44 1993 1 8 12 2.77 1.33 1993 1 8 15 2.59 1.26 1993 1 8 18 3.92 1.2 1993 1 8 21 5.82 1.13 1993 1 9 0 9.66 1.21 1993 1 9 3 9.67 1.6 1993 1 9 6 8.45 1.88 1993 1 9 9 9.29 2.02 1993 1 9 12 9.08 2.1 1993 1 9 15 8.35 2.12 1993 1 9 18 6.65 2.01 1993 1 9 21 5.22 1.79 1993 1 10 0 4.95 1.62 1993 1 10 3 3.54 1.62 1993 1 10 6 3.21 1.67 1993 1 10 9 4.84 1.52 1993 1 10 12 5.78 1.34 1993 1 10 15 8.61 1.22 1993 1 10 18 11.83 1.46 1993 1 10 21 10.92 1.73 1993 1 11 0 9.17 1.77 1993 1 11 3 
4.48 1.69 1993 1 11 6 9.47 1.67 1993 1 11 9 7.95 1.77 1993 1 11 12 11.3 1.79 1993 1 11 15 9.59 1.95 1993 1 11 18 11.98 2.29 1993 1 11 21 8.83 2.32 1993 1 12 0 9.6 2.03 1993 1 12 3 7.84 2.46 1993 1 12 6 5.34 2.4 1993 1 12 9 7.34 2.14 1993 1 12 12 7.96 2.03 1993 1 12 15 9.24 1.93 1993 1 12 18 8.98 1.87 1993 1 12 21 10.3 2.03 1993 1 13 0 9.43 2.2 1993 1 13 3 9.07 2.22 1993 1 13 6 7.79 2.19 1993 1 13 9 5.05 2.17 1993 1 13 12 5.96 2.02 1993 1 13 15 10.51 1.92 1993 1 13 18 10.87 2.11 1993 1 13 21 10.58 2.34 1993 1 14 0 9.35 2.41 1993 1 14 3 9.4 2.43 1993 1 14 6 11.68 2.52 1993 1 14 9 9.09 2.64 1993 1 14 12 9.57 2.62 1993 1 14 15 11.09 2.76 1993 1 14 18 12.1 2.9 1993 1 14 21 12.61 3.26 1993 1 15 0 11.42 3.42 1993 1 15 3 11.94 3.41 1993 1 15 6 11.63 3.35 1993 1 15 9 11.04 3.29 1993 1 15 12 8.52 2.93 1993 1 15 15 9.17 2.49 1993 1 15 18 8.75 2.16 1993 1 15 21 9.09 1.94 1993 1 16 0 8.19 1.8 1993 1 16 3 8.25 1.67 1993 1 16 6 10.09 1.7 1993 1 16 9 10.52 2.01 1993 1 16 12 9.03 2.09 1993 1 16 15 7.24 1.87 1993 1 16 18 5.19 1.68 1993 1 16 21 5.39 1.59 1993 1 17 0 7 1.58 1993 1 17 3 8.22 1.67 1993 1 17 6 7.32 1.81 1993 1 17 9 8.35 1.9 1993 1 17 12 8.59 1.98 1993 1 17 15 9.27 2.04 1993 1 17 18 8.25 2.05 1993 1 17 21 6.42 1.95 1993 1 18 0 8.24 1.87 1993 1 18 3 10.45 2.07 1993 1 18 6 7.48 2.19 1993 1 18 9 6.03 2.07 1993 1 18 12 3.75 1.9 1993 1 18 15 2.3 1.76 1993 1 18 18 4.33 1.65 1993 1 18 21 4.71 1.56 1993 1 19 0 3.63 1.47 1993 1 19 3 6.45 1.4 1993 1 19 6 9.66 1.43 1993 1 19 9 10.65 1.74 1993 1 19 12 7 2.07 1993 1 19 15 4.98 1.91 1993 1 19 18 5.98 1.72 1993 1 19 21 5.78 1.64 1993 1 20 0 4.22 1.56 1993 1 20 3 5.43 1.48 1993 1 20 6 1.8 1.46 1993 1 20 9 4.9 1.6 1993 1 20 12 7.74 1.62 1993 1 20 15 8.66 1.66 1993 1 20 18 13.32 1.93 1993 1 20 21 15.9 2.81 1993 1 21 0 11.5 3.19 1993 1 21 3 9.75 3.03 1993 1 21 6 4.84 2.77 1993 1 21 9 6.45 2.47 1993 1 21 12 7.92 2.25 1993 1 21 15 5.2 2.11 1993 1 21 18 7.84 1.97 1993 1 21 21 7.66 1.93 1993 1 22 0 10.2 2.04 1993 1 22 3 12.85 2.39 1993 1 22 6 
9.14 2.36 1993 1 22 9 5.77 2.05 1993 1 22 12 4.51 1.86 1993 1 22 15 7.86 1.69 1993 1 22 18 8.98 1.52 1993 1 22 21 4.96 1.34 1993 1 23 0 6.27 1.18 1993 1 23 3 6.75 1.15 1993 1 23 6 7.9 1.16 1993 1 23 9 8.44 1.26 1993 1 23 12 8.4 1.4 1993 1 23 15 6.22 1.52 1993 1 23 18 4.9 1.57 1993 1 23 21 3.36 1.51 1993 1 24 0 5.73 1.36 1993 1 24 3 8.75 1.27 1993 1 24 6 8.89 1.46 1993 1 24 9 7.11 1.49 1993 1 24 12 8.7 1.49 1993 1 24 15 10.29 1.74 1993 1 24 18 8.59 1.92 1993 1 24 21 7.42 1.74 1993 1 25 0 7.66 1.53 1993 1 25 3 9.12 1.46 1993 1 25 6 9.34 1.67 1993 1 25 9 9.53 1.95 1993 1 25 12 7 1.89 1993 1 25 15 5.78 1.7 1993 1 25 18 5.56 1.52 1993 1 25 21 5.64 1.37 1993 1 26 0 8.97 1.2 1993 1 26 3 8.79 1.26 1993 1 26 6 8.83 1.39 1993 1 26 9 8.66 1.57 1993 1 26 12 6.61 1.63 1993 1 26 15 4.73 1.54 1993 1 26 18 10.35 1.54 1993 1 26 21 9.02 1.94 1993 1 27 0 4.94 1.91 1993 1 27 3 5.2 1.75 1993 1 27 6 5.19 1.67 1993 1 27 9 7.73 1.66 1993 1 27 12 4.94 1.78 1993 1 27 15 4.8 1.7 1993 1 27 18 7.56 1.6 1993 1 27 21 7.55 1.66 1993 1 28 0 6.6 1.73 1993 1 28 3 8.86 1.71 1993 1 28 6 5.55 1.96 1993 1 28 9 3.28 1.91 1993 1 28 12 0.43 1.84 1993 1 28 15 1.39 1.99 1993 1 28 18 1.21 1.93 1993 1 28 21 3.25 1.73 1993 1 29 0 4.77 1.49 1993 1 29 3 6.13 1.34 1993 1 29 6 6.51 1.41 1993 1 29 9 5.8 1.47 1993 1 29 12 4.58 1.41 1993 1 29 15 4.94 1.31 1993 1 29 18 7.3 1.27 1993 1 29 21 5.6 1.27 1993 1 30 0 5.03 1.24 1993 1 30 3 6.83 1.22 1993 1 30 6 6.51 1.24 1993 1 30 9 7.91 1.3 1993 1 30 12 9.34 1.36 1993 1 30 15 7.79 1.43 1993 1 30 18 8.51 1.55 1993 1 30 21 8.07 1.78 1993 1 31 0 9.37 2.09 1993 1 31 3 10.58 2.51 1993 1 31 6 14.25 3.02 1993 1 31 9 11.56 3.47 1993 1 31 12 14.4 4.14 1993 1 31 15 16.59 4.4 1993 1 31 18 21.18 4.42 1993 1 31 21 17.66 5.48 1993 2 1 0 13.01 6.68 1993 2 1 3 7.96 6.17 1993 2 1 6 7.54 4.79 1993 2 1 9 6.16 3.92 1993 2 1 12 4.18 3.47 1993 2 1 15 6.64 3.07 1993 2 1 18 5.75 2.66 1993 2 1 21 2.78 2.31 1993 2 2 0 6.02 2.08 1993 2 2 3 8.75 1.98 1993 2 2 6 10.37 2.16 1993 2 2 9 9.2 2.39 1993 2 2 
12 10.42 2.47 1993 2 2 15 9.74 2.59 1993 2 2 18 8.6 2.48 1993 2 2 21 6.31 2.29 1993 2 3 0 3.53 2.17 1993 2 3 3 3.98 2.08 1993 2 3 6 5.19 1.95 1993 2 3 9 8 1.8 1993 2 3 12 8.46 1.75 1993 2 3 15 8.7 1.83 1993 2 3 18 8.56 1.89 1993 2 3 21 6.99 1.78 1993 2 4 0 5.71 1.6 1993 2 4 3 6.47 1.49 1993 2 4 6 12.13 1.59 1993 2 4 9 11.03 2.06 1993 2 4 12 11.59 2.55 1993 2 4 15 8.12 2.48 1993 2 4 18 6.46 2.2 1993 2 4 21 3.98 1.87 1993 2 5 0 3.4 1.63 1993 2 5 3 4.26 1.63 1993 2 5 6 7.32 1.69 1993 2 5 9 7.3 1.53 1993 2 5 12 5.91 1.38 1993 2 5 15 5.73 1.3 1993 2 5 18 5.21 1.27 1993 2 5 21 6.15 1.26 1993 2 6 0 4.17 1.28 1993 2 6 3 5.2 1.35 1993 2 6 6 7 1.45 1993 2 6 9 7.24 1.58 1993 2 6 12 6.07 1.64 1993 2 6 15 6.72 1.73 1993 2 6 18 9.05 2.02 1993 2 6 21 8.48 2.29 1993 2 7 0 9.52 2.25 1993 2 7 3 11.24 2.41 1993 2 7 6 8.79 2.45 1993 2 7 9 9.09 2.35 1993 2 7 12 5.39 2.13 1993 2 7 15 1.37 1.84 1993 2 7 18 2.29 1.59 1993 2 7 21 4.71 1.3 1993 2 8 0 8.44 1.08 1993 2 8 3 11.5 1.13 1993 2 8 6 15.12 1.73 1993 2 8 9 13.6 2.4 1993 2 8 12 7.32 2.78 1993 2 8 15 10.36 2.87 1993 2 8 18 7.82 2.97 1993 2 8 21 5.55 2.69 1993 2 9 0 4.8 2.42 1993 2 9 3 5.31 2.16 1993 2 9 6 3.33 1.91 1993 2 9 9 6.82 1.77 1993 2 9 12 8.27 1.75 1993 2 9 15 9.02 1.79 1993 2 9 18 6.97 1.91 1993 2 9 21 5.95 1.92 1993 2 10 0 4.13 1.8 1993 2 10 3 3.7 1.69 1993 2 10 6 3.86 1.61 1993 2 10 9 3.78 1.54 1993 2 10 12 2.8 1.49 1993 2 10 15 3.43 1.41 1993 2 10 18 4.97 1.29 1993 2 10 21 5.96 1.19 1993 2 11 0 4.7 1.1 1993 2 11 3 4.47 1 1993 2 11 6 5.28 0.93 1993 2 11 9 5.48 0.89 1993 2 11 12 6.49 0.89 1993 2 11 15 5.43 0.91 1993 2 11 18 6.06 0.93 1993 2 11 21 5.66 0.97 1993 2 12 0 8.24 0.98 1993 2 12 3 8.73 1.11 1993 2 12 6 9.35 1.35 1993 2 12 9 9.64 1.64 1993 2 12 12 9.57 1.88 1993 2 12 15 13.28 2.36 1993 2 12 18 9.51 2.63 1993 2 12 21 7.58 2.38 1993 2 13 0 4.88 1.97 1993 2 13 3 3.62 1.7 1993 2 13 6 3.45 1.55 1993 2 13 9 4.05 1.5 1993 2 13 12 2.51 1.42 1993 2 13 15 4.25 1.32 1993 2 13 18 3.88 1.24 1993 2 13 21 2.77 1.19 1993 2 14 0 0.45 
1.16 1993 2 14 3 2.97 1.14 1993 2 14 6 2.95 1.12 1993 2 14 9 2.16 1.06 1993 2 14 12 8.15 0.98 1993 2 14 15 11.07 1.23 1993 2 14 18 10.95 1.45 1993 2 14 21 13.22 1.9 1993 2 15 0 12.3 2.46 1993 2 15 3 8.41 2.75 1993 2 15 6 6.79 2.87 1993 2 15 9 7.82 2.85 1993 2 15 12 12.21 2.72 1993 2 15 15 12.02 2.6 1993 2 15 18 9.05 2.6 1993 2 15 21 6.78 2.37 1993 2 16 0 7 2.07 1993 2 16 3 8.06 1.97 1993 2 16 6 8.88 1.98 1993 2 16 9 7.36 1.86 1993 2 16 12 8.3 1.66 1993 2 16 15 8.15 1.58 1993 2 16 18 8.49 1.55 1993 2 16 21 5.21 1.5 1993 2 17 0 2.27 1.43 1993 2 17 3 4.65 1.31 1993 2 17 6 9.67 1.43 1993 2 17 9 12.19 1.88 1993 2 17 12 13.13 2.36 1993 2 17 15 10.83 2.66 1993 2 17 18 9.91 2.66 1993 2 17 21 9.01 2.21 1993 2 18 0 7.62 1.75 1993 2 18 3 7.44 1.67 1993 2 18 6 6.12 1.69 1993 2 18 9 7.4 1.71 1993 2 18 12 6.56 1.68 1993 2 18 15 6.21 1.85 1993 2 18 18 5.78 2.13 1993 2 18 21 6.65 2.03 1993 2 19 0 6.76 1.99 1993 2 19 3 5.47 2 1993 2 19 6 3.96 1.92 1993 2 19 9 6.25 1.77 1993 2 19 12 10.55 1.71 1993 2 19 15 11.99 2.2 1993 2 19 18 11.21 2.75 1993 2 19 21 10.81 2.86 1993 2 20 0 8.58 2.92 1993 2 20 3 11.31 3 1993 2 20 6 10.76 3.06 1993 2 20 9 12.73 3.18 1993 2 20 12 7.59 2.93 1993 2 20 15 6.38 2.44 1993 2 20 18 6.25 2.04 1993 2 20 21 6.97 1.78 1993 2 21 0 4.99 1.63 1993 2 21 3 5.67 1.47 1993 2 21 6 6.2 1.36 1993 2 21 9 7.87 1.28 1993 2 21 12 11.54 1.37 1993 2 21 15 10.89 1.78 1993 2 21 18 11.24 2.06 1993 2 21 21 9.68 2.32 1993 2 22 0 10.42 2.32 1993 2 22 3 9.58 2.25 1993 2 22 6 9.26 2.31 1993 2 22 9 8.77 2.44 1993 2 22 12 9.3 2.56 1993 2 22 15 6.62 2.59 1993 2 22 18 4.15 2.47 1993 2 22 21 4.03 2.36 1993 2 23 0 8.12 2.32 1993 2 23 3 10.88 2.34 1993 2 23 6 11.22 2.6 1993 2 23 9 11.17 2.75 1993 2 23 12 12.27 2.8 1993 2 23 15 12.02 2.84 1993 2 23 18 11.82 2.88 1993 2 23 21 11.29 2.93 1993 2 24 0 10.54 2.87 1993 2 24 3 9.86 2.71 1993 2 24 6 8.88 2.49 1993 2 24 9 8.58 2.29 1993 2 24 12 8.28 2.13 1993 2 24 15 9.1 2.02 1993 2 24 18 10.11 2 1993 2 24 21 9.93 2.11 1993 2 25 0 11.13 2.25 1993 2 25 
3 12.38 2.56 1993 2 25 6 12.46 2.73 1993 2 25 9 12.16 2.81 1993 2 25 12 10.04 2.67 1993 2 25 15 9.24 2.45 1993 2 25 18 8.25 2.23 1993 2 25 21 7.65 2.01 1993 2 26 0 6.95 1.81 1993 2 26 3 9.36 1.66 1993 2 26 6 7.45 1.63 1993 2 26 9 8.71 1.7 1993 2 26 12 9.79 1.82 1993 2 26 15 7.4 1.87 1993 2 26 18 6.7 1.85 1993 2 26 21 4.89 1.75 1993 2 27 0 6.56 1.59 1993 2 27 3 7.84 1.48 1993 2 27 6 4.53 1.37 1993 2 27 9 6.14 1.24 1993 2 27 12 7.7 1.2 1993 2 27 15 6.68 1.26 1993 2 27 18 9 1.21 1993 2 27 21 9.54 1.41 1993 2 28 0 10.39 1.86 1993 2 28 3 11.65 2.43 1993 2 28 6 12 2.92 1993 2 28 9 14.28 3.75 1993 2 28 12 14.93 4.54 1993 2 28 15 13.7 4.85 1993 2 28 18 12.47 4.7 1993 2 28 21 14.49 4.57 1993 3 1 0 12.5 4.52 1993 3 1 3 12.8 4.14 1993 3 1 6 12.18 3.9 1993 3 1 9 12.52 3.84 1993 3 1 12 11.71 3.7 1993 3 1 15 11.3 3.51 1993 3 1 18 10.55 3.37 1993 3 1 21 8.95 3.1 1993 3 2 0 9.29 2.75 1993 3 2 3 7.38 2.45 1993 3 2 6 6.53 2.13 1993 3 2 9 5.18 1.86 1993 3 2 12 5.94 1.64 1993 3 2 15 5.66 1.5 1993 3 2 18 5.16 1.44 1993 3 2 21 4.04 1.41 1993 3 3 0 5.11 1.38 1993 3 3 3 5.6 1.36 1993 3 3 6 6.38 1.37 1993 3 3 9 5.56 1.34 1993 3 3 12 5.91 1.26 1993 3 3 15 5.78 1.26 1993 3 3 18 5.5 1.29 1993 3 3 21 4.91 1.26 1993 3 4 0 2.56 1.2 1993 3 4 3 3.54 1.13 1993 3 4 6 3.22 1.05 1993 3 4 9 4.78 0.96 1993 3 4 12 4.79 0.89 1993 3 4 15 4.44 0.84 1993 3 4 18 4.8 0.82 1993 3 4 21 5.46 0.83 1993 3 5 0 5.2 0.84 1993 3 5 3 6.24 0.86 1993 3 5 6 8.32 0.94 1993 3 5 9 8.23 1.15 1993 3 5 12 8.91 1.32 1993 3 5 15 9.96 1.52 1993 3 5 18 9.85 1.75 1993 3 5 21 7.93 1.85 1993 3 6 0 10.4 1.86 1993 3 6 3 10.81 1.99 1993 3 6 6 11.27 2.06 1993 3 6 9 11.72 2.22 1993 3 6 12 10.78 2.4 1993 3 6 15 13.82 2.62 1993 3 6 18 13.67 2.9 1993 3 6 21 13.55 3.19 1993 3 7 0 9.55 3.32 1993 3 7 3 10.32 3.2 1993 3 7 6 9.89 2.97 1993 3 7 9 7.49 2.72 1993 3 7 12 5.48 2.46 1993 3 7 15 8.26 2.28 1993 3 7 18 8.12 2.23 1993 3 7 21 8.53 2.21 1993 3 8 0 10.06 2.32 1993 3 8 3 10.39 2.48 1993 3 8 6 10.97 2.64 1993 3 8 9 8.22 2.75 1993 3 8 12 5.34 2.58 
1993 3 8 15 2.06 2.3 1993 3 8 18 3.22 2.08 1993 3 8 21 4.4 1.87 1993 3 9 0 5.08 1.7 1993 3 9 3 7.16 1.61 1993 3 9 6 7.91 1.65 1993 3 9 9 7.16 1.6 1993 3 9 12 10.04 1.64 1993 3 9 15 12.49 1.96 1993 3 9 18 11.6 2.38 1993 3 9 21 8.11 2.5 1993 3 10 0 9.02 2.26 1993 3 10 3 8.56 2.14 1993 3 10 6 12.63 2.47 1993 3 10 9 10.21 2.97 1993 3 10 12 6.76 2.87 1993 3 10 15 4.03 2.58 1993 3 10 18 2.61 2.32 1993 3 10 21 3.19 2.12 1993 3 11 0 5.82 1.96 1993 3 11 3 7.52 1.86 1993 3 11 6 7.82 1.88 1993 3 11 9 7.13 1.82 1993 3 11 12 6.39 1.73 1993 3 11 15 6.75 1.7 1993 3 11 18 6.9 1.6 1993 3 11 21 5.56 1.49 1993 3 12 0 5.85 1.39 1993 3 12 3 6.33 1.31 1993 3 12 6 6.02 1.21 1993 3 12 9 9.45 1.34 1993 3 12 12 7.69 1.45 1993 3 12 15 10.38 1.53 1993 3 12 18 10.6 1.92 1993 3 12 21 9.41 2.13 1993 3 13 0 12.47 2.75 1993 3 13 3 14.26 3.69 1993 3 13 6 12.31 4.25 1993 3 13 9 11.6 4.04 1993 3 13 12 11.49 3.78 1993 3 13 15 11.39 3.69 1993 3 13 18 11.25 3.5 1993 3 13 21 12.36 3.52 1993 3 14 0 11.29 3.66 1993 3 14 3 11.89 3.69 1993 3 14 6 12.7 3.72 1993 3 14 9 10.03 3.62 1993 3 14 12 9.28 3.31 1993 3 14 15 6.32 2.99 1993 3 14 18 4.27 2.61 1993 3 14 21 4.64 2.23 1993 3 15 0 6.96 1.92 1993 3 15 3 8.12 1.76 1993 3 15 6 5.34 1.66 1993 3 15 9 5.94 1.49 1993 3 15 12 2.09 1.36 1993 3 15 15 0.88 1.23 1993 3 15 18 1.74 1.12 1993 3 15 21 5.92 1.08 1993 3 16 0 6.23 1.23 1993 3 16 3 7.66 1.36 1993 3 16 6 5.29 1.35 1993 3 16 9 6.25 1.28 1993 3 16 12 4.8 1.24 1993 3 16 15 3.56 1.18 1993 3 16 18 2.91 1.14 1993 3 16 21 4.42 1.05 1993 3 17 0 4.39 0.92 1993 3 17 3 6.44 0.79 1993 3 17 6 8.54 0.8 1993 3 17 9 5.65 0.92 1993 3 17 12 3.46 0.97 1993 3 17 15 5.18 1.06 1993 3 17 18 6.71 1.21 1993 3 17 21 6.9 1.27 1993 3 18 0 10.43 1.4 1993 3 18 3 8.18 1.63 1993 3 18 6 7.31 1.7 1993 3 18 9 5.33 1.83 1993 3 18 12 7.07 1.93 1993 3 18 15 6.2 1.87 1993 3 18 18 5.97 1.71 1993 3 18 21 8.88 1.58 1993 3 19 0 8.31 1.65 1993 3 19 3 10.17 1.78 1993 3 19 6 5.2 2 1993 3 19 9 14.34 2.41 1993 3 19 12 9.13 2.78 1993 3 19 15 9 2.45 1993 3 19 
18 7.94 2.16 1993 3 19 21 10.33 2.05 1993 3 20 0 10.09 2.17 1993 3 20 3 10.66 2.36 1993 3 20 6 7.4 2.32 1993 3 20 9 7.2 1.98 1993 3 20 12 4.45 1.77 1993 3 20 15 3.81 1.6 1993 3 20 18 0.66 1.41 1993 3 20 21 3.7 1.22 1993 3 21 0 4.35 1.08 1993 3 21 3 4.42 0.97 1993 3 21 6 8.81 0.92 1993 3 21 9 7.07 1.08 1993 3 21 12 5.97 1.11 1993 3 21 15 5.51 1.3 1993 3 21 18 4.5 1.36 1993 3 21 21 5.31 1.32 1993 3 22 0 6.95 1.4 1993 3 22 3 8.58 1.54 1993 3 22 6 11.68 1.82 1993 3 22 9 9.28 2.08 1993 3 22 12 12.26 2.19 1993 3 22 15 11.36 2.31 1993 3 22 18 8.44 2.34 1993 3 22 21 9.42 2.33 1993 3 23 0 7.9 2.41 1993 3 23 3 6.64 2.28 1993 3 23 6 5.65 2.12 1993 3 23 9 5.98 2.02 1993 3 23 12 3.98 1.92 1993 3 23 15 4.58 1.81 1993 3 23 18 4.71 1.7 1993 3 23 21 10.87 1.5 1993 3 24 0 13.17 1.85 1993 3 24 3 13.36 2.43 1993 3 24 6 13.7 2.76 1993 3 24 9 9.92 3.11 1993 3 24 12 12.15 3.23 1993 3 24 15 10.9 2.99 1993 3 24 18 6.39 2.86 1993 3 24 21 4.29 2.87 1993 3 25 0 7.47 2.87 1993 3 25 3 11.35 2.88 1993 3 25 6 13.03 2.88 1993 3 25 9 8.69 2.9 1993 3 25 12 7.37 2.56 1993 3 25 15 8.56 2.29 1993 3 25 18 14.59 2.53 1993 3 25 21 16.34 3.6 1993 3 26 0 13.88 4.54 1993 3 26 3 9.48 4.87 1993 3 26 6 4.52 4.36 1993 3 26 9 3.72 3.64 1993 3 26 12 3.31 3.04 1993 3 26 15 4.55 2.52 1993 3 26 18 4.24 2.07 1993 3 26 21 8.65 1.66 1993 3 27 0 10.62 1.48 1993 3 27 3 11.61 1.69 1993 3 27 6 11.97 1.99 1993 3 27 9 9.82 2.21 1993 3 27 12 9.63 1.98 1993 3 27 15 8.3 1.87 1993 3 27 18 5.68 1.73 1993 3 27 21 4.73 1.55 1993 3 28 0 4.02 1.44 1993 3 28 3 6.43 1.41 1993 3 28 6 6.2 1.41 1993 3 28 9 6.94 1.5 1993 3 28 12 6.65 1.72 1993 3 28 15 7.62 1.88 1993 3 28 18 10.34 2.14 1993 3 28 21 9.58 2.35 1993 3 29 0 8.21 2.33 1993 3 29 3 9.55 2.42 1993 3 29 6 8.03 2.49 1993 3 29 9 9.31 2.41 1993 3 29 12 4.85 2.27 1993 3 29 15 6.85 2.03 1993 3 29 18 8.15 1.85 1993 3 29 21 8.49 1.72 1993 3 30 0 10.51 1.64 1993 3 30 3 10.45 1.7 1993 3 30 6 11.72 1.92 1993 3 30 9 11.73 2.23 1993 3 30 12 10.78 2.6 1993 3 30 15 12.26 2.82 1993 3 30 18 13.45 
3.14 1993 3 30 21 11.4 3.59 1993 3 31 0 8.9 3.45 1993 3 31 3 8.3 3.19 1993 3 31 6 7.27 2.91 1993 3 31 9 7.76 2.65 1993 3 31 12 5.32 2.42 1993 3 31 15 7.84 2.2 1993 3 31 18 11.94 2.15 1993 3 31 21 12.08 2.43 1993 4 1 0 8.16 2.64 1993 4 1 3 7.32 2.56 1993 4 1 6 8.23 2.45 1993 4 1 9 9.31 2.37 1993 4 1 12 12.32 2.39 1993 4 1 15 9.9 2.51 1993 4 1 18 11.31 2.56 1993 4 1 21 14.24 2.83 1993 4 2 0 13.36 2.95 1993 4 2 3 14.59 3.34 1993 4 2 6 15.18 3.76 1993 4 2 9 17.67 4.79 1993 4 2 12 19.45 5.79 1993 4 2 15 18.07 6.5 1993 4 2 18 14.62 6.96 1993 4 2 21 17.38 7.18 1993 4 3 0 14.96 6.81 1993 4 3 3 16.43 6.45 1993 4 3 6 13.87 6.03 1993 4 3 9 13.45 5.2 1993 4 3 12 9.19 4.22 1993 4 3 15 8.03 3.33 1993 4 3 18 6.41 2.68 1993 4 3 21 7.4 2.21 1993 4 4 0 4.71 1.89 1993 4 4 3 5.2 1.6 1993 4 4 6 7.22 1.39 1993 4 4 9 7.62 1.33 1993 4 4 12 8.47 1.36 1993 4 4 15 8.65 1.42 1993 4 4 18 8.21 1.49 1993 4 4 21 8.87 1.56 1993 4 5 0 9.81 1.68 1993 4 5 3 9.42 1.74 1993 4 5 6 9.01 1.77 1993 4 5 9 9.02 1.83 1993 4 5 12 7.27 1.85 1993 4 5 15 7.92 1.8 1993 4 5 18 8.62 1.77 1993 4 5 21 9.72 1.85 1993 4 6 0 10.94 2.01 1993 4 6 3 11.53 2.23 1993 4 6 6 12.58 2.47 1993 4 6 9 11.43 2.73 1993 4 6 12 13.57 2.8 1993 4 6 15 12.22 2.86 1993 4 6 18 13.04 3 1993 4 6 21 13.03 3.13 1993 4 7 0 13.31 3.12 1993 4 7 3 10.94 3.32 1993 4 7 6 12.93 3.33 1993 4 7 9 12.21 3.42 1993 4 7 12 10.35 3.45 1993 4 7 15 11.04 3.5 1993 4 7 18 5.67 3.44 1993 4 7 21 11.87 3.24 1993 4 8 0 7.45 3.21 1993 4 8 3 6.29 3.18 1993 4 8 6 3.64 3.02 1993 4 8 9 6.47 2.91 1993 4 8 12 7 2.9 1993 4 8 15 7.57 2.93 1993 4 8 18 2.47 2.85 1993 4 8 21 5.18 2.77 1993 4 9 0 8.14 2.73 1993 4 9 3 10.07 2.81 1993 4 9 6 8.35 2.88 1993 4 9 9 3.46 2.8 1993 4 9 12 7.42 2.75 1993 4 9 15 9.57 2.86 1993 4 9 18 8.16 3.09 1993 4 9 21 8.79 3.1 1993 4 10 0 7.99 3.05 1993 4 10 3 6.92 2.89 1993 4 10 6 7.87 2.81 1993 4 10 9 13.72 3.05 1993 4 10 12 15.5 3.9 1993 4 10 15 12.51 4.25 1993 4 10 18 12.67 4.09 1993 4 10 21 9.23 3.81 1993 4 11 0 5.93 3.39 1993 4 11 3 6.32 2.95 1993 
4 11 6 7.65 2.54 1993 4 11 9 7.48 2.25 1993 4 11 12 9.73 2.04 1993 4 11 15 9.57 1.93 1993 4 11 18 9.83 1.95 1993 4 11 21 11.21 2.09 1993 4 12 0 14.09 2.51 1993 4 12 3 11.39 3.02 1993 4 12 6 11.72 3.26 1993 4 12 9 8.68 3.15 1993 4 12 12 9.58 2.63 1993 4 12 15 8.91 2.34 1993 4 12 18 5.44 2.2 1993 4 12 21 2.7 2.04 1993 4 13 0 5.33 1.92 1993 4 13 3 7.62 1.92 1993 4 13 6 7.25 1.94 1993 4 13 9 7.5 1.78 1993 4 13 12 7.75 1.62 1993 4 13 15 6.96 1.55 1993 4 13 18 5.62 1.56 1993 4 13 21 1.91 1.55 1993 4 14 0 0 1.48 1993 4 14 3 4.4 1.37 1993 4 14 6 10 1.33 1993 4 14 9 13.32 1.59 1993 4 14 12 19.8 2.81 1993 4 14 15 21.06 4.07 1993 4 14 18 23.13 5.31 1993 4 14 21 19.35 5.85 1993 4 15 0 18.18 5.82 1993 4 15 3 14.98 5.47 1993 4 15 6 8.42 4.91 1993 4 15 9 6.89 4.24 1993 4 15 12 3.83 3.62 1993 4 15 15 2.33 3.14 1993 4 15 18 1.63 2.86 1993 4 15 21 0.04 2.68 1993 4 16 0 3.46 2.5 1993 4 16 3 4.18 2.27 1993 4 16 6 5.48 2.06 1993 4 16 9 7.58 1.9 1993 4 16 12 9.64 1.87 1993 4 16 15 8.59 1.87 1993 4 16 18 11.71 2 1993 4 16 21 10.63 2.47 1993 4 17 0 8.49 2.71 1993 4 17 3 8.84 2.64 1993 4 17 6 7.8 2.66 1993 4 17 9 7.18 2.54 1993 4 17 12 7.73 2.49 1993 4 17 15 8.9 2.5 1993 4 17 18 10.41 2.5 1993 4 17 21 10.41 2.58 1993 4 18 0 10.63 2.79 1993 4 18 3 10.14 2.91 1993 4 18 6 7.6 2.82 1993 4 18 9 7.01 2.7 1993 4 18 12 7.65 2.79 1993 4 18 15 8.04 2.65 1993 4 18 18 8.42 2.4 1993 4 18 21 8.5 2.31 1993 4 19 0 7.85 2.36 1993 4 19 3 8.49 2.44 1993 4 19 6 8.84 2.66 1993 4 19 9 8.03 2.71 1993 4 19 12 6.38 2.57 1993 4 19 15 3.76 2.46 1993 4 19 18 5 2.41 1993 4 19 21 5.35 2.27 1993 4 20 0 6.86 2.1 1993 4 20 3 9.29 2.02 1993 4 20 6 8.07 2.1 1993 4 20 9 7.95 2.04 1993 4 20 12 6.97 1.95 1993 4 20 15 9.4 1.87 1993 4 20 18 9.46 2.01 1993 4 20 21 9.7 2.27 1993 4 21 0 6.72 2.39 1993 4 21 3 6.32 2.27 1993 4 21 6 5.99 2.14 1993 4 21 9 7.05 1.99 1993 4 21 12 6.67 1.88 1993 4 21 15 7.55 1.87 1993 4 21 18 7.26 1.86 1993 4 21 21 7.13 1.77 1993 4 22 0 4.03 1.64 1993 4 22 3 5.89 1.47 1993 4 22 6 4.93 1.33 1993 4 22 9 
6.39 1.21 1993 4 22 12 6.41 1.13 1993 4 22 15 7.01 1.08 1993 4 22 18 9.25 1.1 1993 4 22 21 10.83 1.32 1993 4 23 0 10.18 1.69 1993 4 23 3 9.1 1.9 1993 4 23 6 8.57 1.94 1993 4 23 9 6.03 1.9 1993 4 23 12 3.38 1.85 1993 4 23 15 1.4 1.87 1993 4 23 18 2.61 1.94 1993 4 23 21 7.13 1.9 1993 4 24 0 6.84 1.91 1993 4 24 3 7.4 1.98 1993 4 24 6 3.65 2.06 1993 4 24 9 5.58 2.07 1993 4 24 12 6.65 2.04 1993 4 24 15 10.88 2.03 1993 4 24 18 12.88 2.2 1993 4 24 21 10.03 2.38 1993 4 25 0 5.16 2.36 1993 4 25 3 3.69 2.28 1993 4 25 6 5.45 2.18 1993 4 25 9 6.56 2.07 1993 4 25 12 6.4 1.93 1993 4 25 15 9.41 1.9 1993 4 25 18 9.65 2.14 1993 4 25 21 8.86 2.14 1993 4 26 0 8.18 2.34 1993 4 26 3 8.1 2.46 1993 4 26 6 5.44 2.49 1993 4 26 9 6.5 2.51 1993 4 26 12 6.66 2.58 1993 4 26 15 5.12 2.46 1993 4 26 18 5.12 2.24 1993 4 26 21 8.58 2.04 1993 4 27 0 10.27 1.89 1993 4 27 3 11.65 1.94 1993 4 27 6 12.18 2.32 1993 4 27 9 13.67 2.94 1993 4 27 12 11.69 3.58 1993 4 27 15 14.05 4 1993 4 27 18 13.23 5.25 1993 4 27 21 13.1 5.12 1993 4 28 0 10.29 4.89 1993 4 28 3 10.81 4.51 1993 4 28 6 7.49 3.96 1993 4 28 9 7.63 3.45 1993 4 28 12 5.26 3 1993 4 28 15 7.68 2.56 1993 4 28 18 7.06 2.17 1993 4 28 21 8.73 1.82 1993 4 29 0 11.36 1.77 1993 4 29 3 12.11 2.07 1993 4 29 6 10.96 2.42 1993 4 29 9 12.12 2.7 1993 4 29 12 15.01 3.32 1993 4 29 15 10.58 3.59 1993 4 29 18 11.41 3.34 1993 4 29 21 9.54 3.08 1993 4 30 0 10.53 2.87 1993 4 30 3 11.02 2.71 1993 4 30 6 10.55 2.6 1993 4 30 9 11.24 2.43 1993 4 30 12 10.42 2.3 1993 4 30 15 10.5 2.29 1993 4 30 18 12.24 2.48 1993 4 30 21 7.36 2.43 1993 5 1 0 6.3 2.34 1993 5 1 3 3.94 2.12 1993 5 1 6 2.2 1.85 1993 5 1 9 1.91 1.67 1993 5 1 12 3.83 1.52 1993 5 1 15 6.32 1.44 1993 5 1 18 7.81 1.26 1993 5 1 21 8.82 1.16 1993 5 2 0 7.75 1.19 1993 5 2 3 9.09 1.32 1993 5 2 6 11.24 1.57 1993 5 2 9 11.78 1.99 1993 5 2 12 10.83 2.28 1993 5 2 15 12.92 2.55 1993 5 2 18 14.89 2.96 1993 5 2 21 12.67 3.26 1993 5 3 0 12.27 3.42 1993 5 3 3 12.79 3.52 1993 5 3 6 12.36 3.54 1993 5 3 9 13.34 3.88 1993 5 3 12 
8.64 4.1 1993 5 3 15 9.25 3.81 1993 5 3 18 6.73 3.5 1993 5 3 21 6.56 3.14 1993 5 4 0 6.31 2.86 1993 5 4 3 3 2.71 1993 5 4 6 1.3 2.58 1993 5 4 9 6.51 2.45 1993 5 4 12 11.29 2.34 1993 5 4 15 13.19 2.57 1993 5 4 18 15.71 3.66 1993 5 4 21 16.03 5.06 1993 5 5 0 14.41 5.25 1993 5 5 3 13.91 5.15 1993 5 5 6 10.39 4.9 1993 5 5 9 10.43 4.38 1993 5 5 12 8.16 3.87 1993 5 5 15 8.97 3.39 1993 5 5 18 7.89 2.91 1993 5 5 21 7.04 2.49 1993 5 6 0 9.73 2.11 1993 5 6 3 11.5 2 1993 5 6 6 14.63 2.1 1993 5 6 9 18.05 2.84 1993 5 6 12 21.91 4.52 1993 5 6 15 14.74 5.94 1993 5 6 18 11.28 5.6 1993 5 6 21 11.64 4.94 1993 5 7 0 10.81 4.49 1993 5 7 3 7.17 4.04 1993 5 7 6 8.58 3.72 1993 5 7 9 8.44 3.63 1993 5 7 12 12.79 3.58 1993 5 7 15 11.29 3.61 1993 5 7 18 8.84 3.67 1993 5 7 21 8.65 3.76 1993 5 8 0 9.1 3.48 1993 5 8 3 8.68 3.25 1993 5 8 6 9.99 3.1 1993 5 8 9 9.48 3.01 1993 5 8 12 11.37 2.9 1993 5 8 15 10.47 2.84 1993 5 8 18 9.2 2.8 1993 5 8 21 7.89 2.71 1993 5 9 0 7.16 2.5 1993 5 9 3 8.48 2.3 1993 5 9 6 9.52 2.25 1993 5 9 9 8.47 2.15 1993 5 9 12 7.3 2.06 1993 5 9 15 7.94 2.03 1993 5 9 18 8.86 2.12 1993 5 9 21 8.65 2.4 1993 5 10 0 4.27 2.45 1993 5 10 3 5.08 2.2 1993 5 10 6 5.57 1.93 1993 5 10 9 4.82 1.75 1993 5 10 12 4.52 1.64 1993 5 10 15 4.74 1.56 1993 5 10 18 4.74 1.47 1993 5 10 21 4.36 1.39 1993 5 11 0 6.42 1.32 1993 5 11 3 7.78 1.34 1993 5 11 6 6.56 1.41 1993 5 11 9 6.47 1.38 1993 5 11 12 4.11 1.31 1993 5 11 15 2.43 1.25 1993 5 11 18 3.65 1.2 1993 5 11 21 1.06 1.12 1993 5 12 0 1.08 1.04 1993 5 12 3 1.79 0.98 1993 5 12 6 1.9 0.96 1993 5 12 9 1.27 1.05 1993 5 12 12 7.37 1.18 1993 5 12 15 8.48 1.37 1993 5 12 18 7.05 1.46 1993 5 12 21 8.2 1.54 1993 5 13 0 7.05 1.71 1993 5 13 3 8.42 1.94 1993 5 13 6 10.5 2.03 1993 5 13 9 11.53 2.1 1993 5 13 12 12.47 2.42 1993 5 13 15 12.33 2.65 1993 5 13 18 11.31 2.68 1993 5 13 21 11.39 2.66 1993 5 14 0 11.96 2.6 1993 5 14 3 8.3 2.51 1993 5 14 6 6.55 2.33 1993 5 14 9 5.95 2.15 1993 5 14 12 6.39 2.02 1993 5 14 15 3.82 1.75 1993 5 14 18 5.3 1.54 1993 5 14 21 5.92 
1.43 1993 5 15 0 6.13 1.4 1993 5 15 3 7.94 1.44 1993 5 15 6 7.9 1.56 1993 5 15 9 9.64 1.67 1993 5 15 12 10.05 1.91 1993 5 15 15 9.42 2.14 1993 5 15 18 8.62 2.29 1993 5 15 21 7.78 2.21 1993 5 16 0 5.05 2.03 1993 5 16 3 5.15 1.83 1993 5 16 6 7.11 1.69 1993 5 16 9 7.74 1.65 1993 5 16 12 3.6 1.64 1993 5 16 15 2.66 1.61 1993 5 16 18 3.51 1.58 1993 5 16 21 4.24 1.53 1993 5 17 0 6.73 1.47 1993 5 17 3 7.41 1.44 1993 5 17 6 6.57 1.41 1993 5 17 9 5.04 1.33 1993 5 17 12 4.92 1.23 1993 5 17 15 4.28 1.16 1993 5 17 18 3.17 1.12 1993 5 17 21 2.84 1.1 1993 5 18 0 5.24 1.1 1993 5 18 3 4.99 1.2 1993 5 18 6 4.33 1.25 1993 5 18 9 4.73 1.23 1993 5 18 12 5.34 1.25 1993 5 18 15 8.91 1.34 1993 5 18 18 9.54 1.53 1993 5 18 21 11.42 1.83 1993 5 19 0 12.16 2.26 1993 5 19 3 12.11 2.62 1993 5 19 6 8.43 2.74 1993 5 19 9 8.57 2.8 1993 5 19 12 11.78 2.85 1993 5 19 15 12.4 3.09 1993 5 19 18 13.55 3.36 1993 5 19 21 13.57 3.71 1993 5 20 0 14.18 4.06 1993 5 20 3 14.11 4.54 1993 5 20 6 13.79 4.66 1993 5 20 9 12.76 4.53 1993 5 20 12 14.08 4.43 1993 5 20 15 12.68 4.18 1993 5 20 18 15.22 4 1993 5 20 21 10.27 3.82 1993 5 21 0 8.14 3.42 1993 5 21 3 8.24 3.12 1993 5 21 6 8.2 2.87 1993 5 21 9 6.93 2.69 1993 5 21 12 4.83 2.53 1993 5 21 15 10.44 2.34 1993 5 21 18 10.94 2.37 1993 5 21 21 10.03 2.35 1993 5 22 0 10.48 2.29 1993 5 22 3 9.64 2.28 1993 5 22 6 9.94 2.31 1993 5 22 9 10.01 2.42 1993 5 22 12 9.24 2.57 1993 5 22 15 8.76 2.63 1993 5 22 18 10.1 2.56 1993 5 22 21 9.57 2.47 1993 5 23 0 6.98 2.29 1993 5 23 3 6.63 2.02 1993 5 23 6 3.51 1.77 1993 5 23 9 2.46 1.6 1993 5 23 12 3.4 1.5 1993 5 23 15 1.47 1.54 1993 5 23 18 3.98 1.68 1993 5 23 21 6.94 1.79 1993 5 24 0 6.4 1.84 1993 5 24 3 5.52 1.74 1993 5 24 6 5.06 1.54 1993 5 24 9 4.79 1.44 1993 5 24 12 0.81 1.57 1993 5 24 15 3.19 1.61 1993 5 24 18 1.71 1.5 1993 5 24 21 1.75 1.37 1993 5 25 0 5.36 1.26 1993 5 25 3 5.95 1.25 1993 5 25 6 5.19 1.4 1993 5 25 9 6.26 1.57 1993 5 25 12 9.12 1.77 1993 5 25 15 9.37 1.97 1993 5 25 18 6.16 1.71 1993 5 25 21 3.33 1.44 1993 5 26 0 
4.14 1.27 1993 5 26 3 5.92 1.2 1993 5 26 6 6.19 1.26 1993 5 26 9 6.73 1.32 1993 5 26 12 6.57 1.51 1993 5 26 15 6.62 1.69 1993 5 26 18 7.58 1.79 1993 5 26 21 8.58 1.94 1993 5 27 0 6.13 1.88 1993 5 27 3 5.37 1.7 1993 5 27 6 6.11 1.55 1993 5 27 9 6.51 1.46 1993 5 27 12 5.95 1.4 1993 5 27 15 6.97 1.34 1993 5 27 18 7.58 1.34 1993 5 27 21 8.38 1.46 1993 5 28 0 10.44 1.64 1993 5 28 3 10.06 1.85 1993 5 28 6 10.02 2.09 1993 5 28 9 9.32 2.34 1993 5 28 12 7.29 2.41 1993 5 28 15 7.38 2.25 1993 5 28 18 8.16 2.2 1993 5 28 21 7.56 2.28 1993 5 29 0 8.9 2.22 1993 5 29 3 8.93 2.14 1993 5 29 6 9.95 2.11 1993 5 29 9 8.67 2.14 1993 5 29 12 8.76 2.16 1993 5 29 15 8.3 2.22 1993 5 29 18 9.95 2.32 1993 5 29 21 9.13 2.49 1993 5 30 0 8.47 2.54 1993 5 30 3 8.27 2.49 1993 5 30 6 9.4 2.36 1993 5 30 9 10.02 2.29 1993 5 30 12 7.45 2.34 1993 5 30 15 9.14 2.32 1993 5 30 18 10.24 2.48 1993 5 30 21 8 2.54 1993 5 31 0 8.59 2.55 1993 5 31 3 10.29 2.66 1993 5 31 6 12.68 3.03 1993 5 31 9 15.01 3.34 1993 5 31 12 11.87 4.04 1993 5 31 15 15.3 4.78 1993 5 31 18 14.72 4.87 1993 5 31 21 15.73 4.72 1993 6 1 0 12.6 4.25 1993 6 1 3 11.38 3.74 1993 6 1 6 7.97 3.27 1993 6 1 9 5.7 2.67 1993 6 1 12 6.76 2.11 1993 6 1 15 6.97 1.72 1993 6 1 18 5.78 1.5 1993 6 1 21 7.43 1.35 1993 6 2 0 9.58 1.27 1993 6 2 3 11.63 1.51 1993 6 2 6 14.02 1.98 1993 6 2 9 15.91 2.77 1993 6 2 12 13.13 3.62 1993 6 2 15 12.52 4.07 1993 6 2 18 9.48 4.32 1993 6 2 21 12.26 4.29 1993 6 3 0 13.5 4.48 1993 6 3 3 14.7 4.96 1993 6 3 6 16.02 5.38 1993 6 3 9 14.06 5.48 1993 6 3 12 11.24 5.34 1993 6 3 15 11.87 5.12 1993 6 3 18 7.73 4.97 1993 6 3 21 6.42 4.64 1993 6 4 0 4.42 4.34 1993 6 4 3 3.44 4.06 1993 6 4 6 6.68 3.78 1993 6 4 9 7.29 3.51 1993 6 4 12 5.32 3.29 1993 6 4 15 3.42 3.14 1993 6 4 18 4.2 3.08 1993 6 4 21 9.63 2.87 1993 6 5 0 12.39 2.72 1993 6 5 3 11.09 2.78 1993 6 5 6 10.25 2.95 1993 6 5 9 8.36 3.07 1993 6 5 12 6.48 2.89 1993 6 5 15 4.95 2.59 1993 6 5 18 3.74 2.32 1993 6 5 21 6.47 2.1 1993 6 6 0 8.53 2.04 1993 6 6 3 9.62 2.23 1993 6 6 6 7.91 
2.35 1993 6 6 9 8.73 2.28 1993 6 6 12 7.19 2.26 1993 6 6 15 8.26 2.21 1993 6 6 18 7.86 2.26 1993 6 6 21 9.12 2.34 1993 6 7 0 7.02 2.35 1993 6 7 3 6.88 2.23 1993 6 7 6 6.29 2.07 1993 6 7 9 6.63 1.94 1993 6 7 12 6.59 1.87 1993 6 7 15 5.56 1.85 1993 6 7 18 4.49 1.79 1993 6 7 21 5.55 1.74 1993 6 8 0 4.18 1.69 1993 6 8 3 4.15 1.62 1993 6 8 6 6.78 1.59 1993 6 8 9 7.15 1.62 1993 6 8 12 8.44 1.6 1993 6 8 15 8.03 1.65 1993 6 8 18 5.29 1.62 1993 6 8 21 5.75 1.46 1993 6 9 0 6.8 1.38 1993 6 9 3 7.37 1.4 1993 6 9 6 7.21 1.37 1993 6 9 9 7.97 1.39 1993 6 9 12 6.22 1.39 1993 6 9 15 3.86 1.33 1993 6 9 18 3.98 1.31 1993 6 9 21 3.86 1.28 1993 6 10 0 1.6 1.18 1993 6 10 3 0.57 1.09 1993 6 10 6 0 1.02 1993 6 10 9 0.81 0.97 1993 6 10 12 0.97 0.9 1993 6 10 15 1.68 0.82 1993 6 10 18 2.57 0.72 1993 6 10 21 3.98 0.63 1993 6 11 0 6.54 0.65 1993 6 11 3 7.6 0.81 1993 6 11 6 8.09 0.96 1993 6 11 9 9.76 1.22 1993 6 11 12 10.57 1.56 1993 6 11 15 12.09 1.95 1993 6 11 18 13.58 2.36 1993 6 11 21 12.85 2.76 1993 6 12 0 15.15 3.16 1993 6 12 3 14.9 3.83 1993 6 12 6 15.35 4.45 1993 6 12 9 14.57 4.97 1993 6 12 12 13.46 5.14 1993 6 12 15 12.88 5.05 1993 6 12 18 12.47 4.89 1993 6 12 21 11.98 4.67 1993 6 13 0 9.95 4.25 1993 6 13 3 9.45 3.76 1993 6 13 6 7.72 3.26 1993 6 13 9 6.29 2.82 1993 6 13 12 4.38 2.5 1993 6 13 15 5.49 2.29 1993 6 13 18 4.21 2.15 1993 6 13 21 4.48 2.02 1993 6 14 0 5.15 1.88 1993 6 14 3 6.84 1.75 1993 6 14 6 7.16 1.66 1993 6 14 9 7.06 1.61 1993 6 14 12 5.94 1.52 1993 6 14 15 4.66 1.45 1993 6 14 18 4.92 1.43 1993 6 14 21 3.88 1.46 1993 6 15 0 5.21 1.5 1993 6 15 3 4.11 1.55 1993 6 15 6 1.82 1.57 1993 6 15 9 2.88 1.54 1993 6 15 12 5.47 1.51 1993 6 15 15 6.78 1.52 1993 6 15 18 8.95 1.59 1993 6 15 21 5.44 1.68 1993 6 16 0 2.65 1.73 1993 6 16 3 4.4 1.74 1993 6 16 6 2.89 1.73 1993 6 16 9 1.76 1.69 1993 6 16 12 4.26 1.57 1993 6 16 15 3.06 1.46 1993 6 16 18 4.69 1.42 1993 6 16 21 5.61 1.47 1993 6 17 0 6.06 1.6 1993 6 17 3 7.63 1.74 1993 6 17 6 7.25 1.83 1993 6 17 9 8.14 1.86 1993 6 17 12 7.7 1.95 
1993 6 17 15 7.49 2.01 1993 6 17 18 7.52 1.93 1993 6 17 21 7.46 1.84 1993 6 18 0 5.76 1.81 1993 6 18 3 5.79 1.73 1993 6 18 6 3.8 1.62 1993 6 18 9 3.62 1.54 1993 6 18 12 3.46 1.41 1993 6 18 15 4.95 1.26 1993 6 18 18 5.61 1.14 1993 6 18 21 4.4 1.05 1993 6 19 0 6.4 0.95 1993 6 19 3 6.06 0.9 1993 6 19 6 9 1.06 1993 6 19 9 10.82 1.49 1993 6 19 12 7.36 1.89 1993 6 19 15 6.19 1.86 1993 6 19 18 4.6 1.62 1993 6 19 21 1.04 1.4 1993 6 20 0 3.05 1.22 1993 6 20 3 0.69 1.07 1993 6 20 6 3.15 0.95 1993 6 20 9 6.23 0.92 1993 6 20 12 8.46 0.95 1993 6 20 15 7.46 1.12 1993 6 20 18 9.68 1.46 1993 6 20 21 8.52 1.73 1993 6 21 0 8.8 1.72 1993 6 21 3 7.37 1.65 1993 6 21 6 7.11 1.59 1993 6 21 9 9.43 1.71 1993 6 21 12 6.91 1.83 1993 6 21 15 7.13 1.71 1993 6 21 18 6.75 1.6 1993 6 21 21 6.89 1.44 1993 6 22 0 7.4 1.32 1993 6 22 3 8.24 1.32 1993 6 22 6 9.52 1.42 1993 6 22 9 8.53 1.51 1993 6 22 12 8.66 1.59 1993 6 22 15 8.79 1.69 1993 6 22 18 9.83 1.73 1993 6 22 21 8.12 1.71 1993 6 23 0 8.11 1.63 1993 6 23 3 6.96 1.6 1993 6 23 6 10.69 1.66 1993 6 23 9 9.65 1.82 1993 6 23 12 8.45 1.85 1993 6 23 15 7.5 1.83 1993 6 23 18 6.34 1.69 1993 6 23 21 5.79 1.5 1993 6 24 0 5.81 1.32 1993 6 24 3 6.37 1.19 1993 6 24 6 9.79 1.17 1993 6 24 9 10.29 1.44 1993 6 24 12 7.97 1.7 1993 6 24 15 8.47 1.82 1993 6 24 18 8.59 1.77 1993 6 24 21 5.6 1.57 1993 6 25 0 6.91 1.36 1993 6 25 3 6.6 1.25 1993 6 25 6 7.28 1.2 1993 6 25 9 9.16 1.23 1993 6 25 12 8.67 1.47 1993 6 25 15 12.54 1.77 1993 6 25 18 10.03 2.36 1993 6 25 21 9.35 2.79 1993 6 26 0 7.71 2.94 1993 6 26 3 7.92 2.98 1993 6 26 6 10.55 2.93 1993 6 26 9 11.7 3.17 1993 6 26 12 10.23 3.31 1993 6 26 15 12.65 3.61 1993 6 26 18 13.45 4.08 1993 6 26 21 11.1 3.99 1993 6 27 0 13.36 3.82 1993 6 27 3 13.93 4.19 1993 6 27 6 14.4 4.4 1993 6 27 9 16.41 4.67 1993 6 27 12 13.24 4.77 1993 6 27 15 13.95 4.51 1993 6 27 18 15.14 4.56 1993 6 27 21 14.89 5.04 1993 6 28 0 14.16 5.35 1993 6 28 3 18.06 6.43 1993 6 28 6 15.62 6.37 1993 6 28 9 14.19 5.76 1993 6 28 12 13.14 5.18 1993 6 28 15 14.08 
4.85 1993 6 28 18 14.56 4.62 1993 6 28 21 14.23 4.37 1993 6 29 0 12.44 4.01 1993 6 29 3 11.67 3.56 1993 6 29 6 12.09 3.21 1993 6 29 9 11.38 3.09 1993 6 29 12 9.7 2.86 1993 6 29 15 11.28 2.77 1993 6 29 18 12.17 2.84 1993 6 29 21 11.69 2.89 1993 6 30 0 9.35 2.74 1993 6 30 3 9.67 2.5 1993 6 30 6 8.39 2.36 1993 6 30 9 10.44 2.31 1993 6 30 12 8.84 2.62 1993 6 30 15 9.19 2.67 1993 6 30 18 10.71 2.85 1993 6 30 21 10.72 3.29 1993 7 1 0 7.6 3.44 1993 7 1 3 9 3.5 1993 7 1 6 9.75 4.68 1993 7 1 9 12.56 4.35 1993 7 1 12 13.51 4.06 1993 7 1 15 12.43 3.77 1993 7 1 18 10.13 3.39 1993 7 1 21 8.74 2.94 1993 7 2 0 6.57 2.66 1993 7 2 3 6.69 2.52 1993 7 2 6 6.6 2.47 1993 7 2 9 6.43 2.44 1993 7 2 12 8.56 2.39 1993 7 2 15 8.63 2.45 1993 7 2 18 5.19 2.46 1993 7 2 21 6.09 2.34 1993 7 3 0 6.88 2.11 1993 7 3 3 6.12 1.87 1993 7 3 6 3.6 1.72 1993 7 3 9 5.2 1.62 1993 7 3 12 7.5 1.58 1993 7 3 15 7.92 1.71 1993 7 3 18 6.26 1.81 1993 7 3 21 5.44 1.8 1993 7 4 0 5.92 1.76 1993 7 4 3 5.23 1.79 1993 7 4 6 1.88 1.78 1993 7 4 9 2.31 1.68 1993 7 4 12 1.99 1.55 1993 7 4 15 1.67 1.44 1993 7 4 18 1.01 1.4 1993 7 4 21 2.97 1.42 1993 7 5 0 3.94 1.45 1993 7 5 3 4.75 1.49 1993 7 5 6 5.41 1.57 1993 7 5 9 5.23 1.59 1993 7 5 12 4.62 1.54 1993 7 5 15 3.72 1.47 1993 7 5 18 3.68 1.38 1993 7 5 21 4.25 1.28 1993 7 6 0 5.16 1.21 1993 7 6 3 4.4 1.13 1993 7 6 6 2.54 1.02 1993 7 6 9 2.44 0.89 1993 7 6 12 5.78 0.97 1993 7 6 15 8.57 0.94 1993 7 6 18 10.85 1.15 1993 7 6 21 13.39 1.8 1993 7 7 0 13.42 2.55 1993 7 7 3 15.52 3.15 1993 7 7 6 14.33 3.83 1993 7 7 9 10.2 3.9 1993 7 7 12 8.35 3.42 1993 7 7 15 7.91 3.03 1993 7 7 18 5.6 2.74 1993 7 7 21 6.55 2.5 1993 7 8 0 7.54 2.32 1993 7 8 3 6.39 2.16 1993 7 8 6 4.72 2.02 1993 7 8 9 4.92 1.89 1993 7 8 12 3.08 1.95 1993 7 8 15 1.92 1.97 1993 7 8 18 3.77 1.76 1993 7 8 21 3.43 1.54 1993 7 9 0 10.16 1.41 1993 7 9 3 8.1 1.31 1993 7 9 6 7.08 1.22 1993 7 9 9 7.23 1.14 1993 7 9 12 7.64 1.1 1993 7 9 15 7.23 1.16 1993 7 9 18 4.84 1.09 1993 7 9 21 7.32 1.07 1993 7 10 0 8.47 1.28 1993 7 10 3 9.37 
1.6 1993 7 10 6 6.87 1.77 1993 7 10 9 7.38 1.67 1993 7 10 12 6.73 1.69 1993 7 10 15 7.68 1.77 1993 7 10 18 7.21 1.78 1993 7 10 21 6.69 1.72 1993 7 11 0 4.84 1.57 1993 7 11 3 5.26 1.43 1993 7 11 6 6.07 1.36 1993 7 11 9 5.39 1.32 1993 7 11 12 4.01 1.32 1993 7 11 15 5.21 1.39 1993 7 11 18 6.76 1.56 1993 7 11 21 7.58 1.81 1993 7 12 0 8.64 2.04 1993 7 12 3 8.88 2.38 1993 7 12 6 7.9 2.7 1993 7 12 9 7.88 2.86 1993 7 12 12 8.25 2.89 1993 7 12 15 8.58 2.86 1993 7 12 18 7.84 2.84 1993 7 12 21 7.04 2.67 1993 7 13 0 7.35 2.44 1993 7 13 3 9.03 2.28 1993 7 13 6 12.04 2.23 1993 7 13 9 12.02 2.34 1993 7 13 12 12.01 2.46 1993 7 13 15 9.82 2.6 1993 7 13 18 9.15 2.56 1993 7 13 21 7.36 2.4 1993 7 14 0 5.63 2.16 1993 7 14 3 4.79 1.96 1993 7 14 6 6.7 1.78 1993 7 14 9 8.68 1.75 1993 7 14 12 8.28 1.95 1993 7 14 15 11.87 2.25 1993 7 14 18 11.41 2.67 1993 7 14 21 10.02 2.69 1993 7 15 0 14.9 2.83 1993 7 15 3 11.89 3.07 1993 7 15 6 10.81 3.1 1993 7 15 9 9.1 2.93 1993 7 15 12 9.27 2.79 1993 7 15 15 8.64 2.71 1993 7 15 18 9.26 2.49 1993 7 15 21 10.02 2.32 1993 7 16 0 9.01 2.33 1993 7 16 3 8.67 2.54 1993 7 16 6 9.27 2.49 1993 7 16 9 11.46 2.74 1993 7 16 12 8.07 2.88 1993 7 16 15 8.15 2.64 1993 7 16 18 10.89 2.51 1993 7 16 21 11.96 2.5 1993 7 17 0 13.64 2.62 1993 7 17 3 14.72 3.08 1993 7 17 6 20.06 4.02 1993 7 17 9 17.38 5.19 1993 7 17 12 18.91 5.67 1993 7 17 15 16.61 6.34 1993 7 17 18 14.24 6.44 1993 7 17 21 10.73 5.56 1993 7 18 0 8.25 4.74 1993 7 18 3 9.3 4.05 1993 7 18 6 7.08 3.53 1993 7 18 9 4.35 3.09 1993 7 18 12 3.61 2.75 1993 7 18 15 3.26 2.48 1993 7 18 18 4.49 2.3 1993 7 18 21 2.2 2.17 1993 7 19 0 4.14 2.08 1993 7 19 3 3.73 2.37 1993 7 19 6 2.11 2.74 1993 7 19 9 2.69 2.81 1993 7 19 12 1.23 2.72 1993 7 19 15 1.74 2.52 1993 7 19 18 4.37 2.27 1993 7 19 21 5.47 2.06 1993 7 20 0 1.71 1.89 1993 7 20 3 1.99 1.76 1993 7 20 6 2.63 1.7 1993 7 20 9 1.98 1.67 1993 7 20 12 5.57 1.58 1993 7 20 15 5.6 1.45 1993 7 20 18 4.82 1.32 1993 7 20 21 4.25 1.21 1993 7 21 0 7.06 1.14 1993 7 21 3 7.97 1.2 1993 7 21 
6 9.01 1.34 1993 7 21 9 9.18 1.49 1993 7 21 12 6.73 1.53 1993 7 21 15 6.87 1.37 1993 7 21 18 6.03 1.33 1993 7 21 21 6.28 1.35 1993 7 22 0 5.34 1.34 1993 7 22 3 5.3 1.33 1993 7 22 6 7.92 1.39 1993 7 22 9 7.25 1.49 1993 7 22 12 10.34 1.5 1993 7 22 15 10.85 1.72 1993 7 22 18 11.19 2 1993 7 22 21 12.15 2.3 1993 7 23 0 10.81 2.54 1993 7 23 3 8.02 2.53 1993 7 23 6 7.33 2.28 1993 7 23 9 8.95 2 1993 7 23 12 8.28 1.89 1993 7 23 15 7.77 1.9 1993 7 23 18 9.06 2 1993 7 23 21 9.08 2.11 1993 7 24 0 8.54 2.24 1993 7 24 3 9.24 2.4 1993 7 24 6 6.26 2.4 1993 7 24 9 4.41 2.16 1993 7 24 12 0.5 1.86 1993 7 24 15 5.1 1.55 1993 7 24 18 6.96 1.28 1993 7 24 21 8 1.12 1993 7 25 0 9.2 1.1 1993 7 25 3 12.26 1.43 1993 7 25 6 12.5 2.05 1993 7 25 9 15.47 2.74 1993 7 25 12 16.47 3.44 1993 7 25 15 16.18 3.89 1993 7 25 18 17.28 4.33 1993 7 25 21 15.79 4.85 1993 7 26 0 14.07 4.88 1993 7 26 3 12.78 4.55 1993 7 26 6 14.57 4.28 1993 7 26 9 13.07 4.24 1993 7 26 12 12.25 4 1993 7 26 15 10.46 3.72 1993 7 26 18 8.74 3.48 1993 7 26 21 9.32 3.34 1993 7 27 0 8.56 3.25 1993 7 27 3 7.17 3.15 1993 7 27 6 6.61 3.03 1993 7 27 9 6.51 2.93 1993 7 27 12 5.49 2.84 1993 7 27 15 4.7 2.81 1993 7 27 18 6.49 2.83 1993 7 27 21 7.5 2.73 1993 7 28 0 7.29 2.66 1993 7 28 3 7.66 2.68 1993 7 28 6 6.52 2.77 1993 7 28 9 7.01 2.79 1993 7 28 12 9.06 2.67 1993 7 28 15 8.58 2.55 1993 7 28 18 9.34 2.44 1993 7 28 21 8.61 2.38 1993 7 29 0 6.64 2.22 1993 7 29 3 7.21 2.1 1993 7 29 6 3.38 2.02 1993 7 29 9 4.65 1.92 1993 7 29 12 5.47 1.88 1993 7 29 15 5.74 1.87 1993 7 29 18 5.9 1.86 1993 7 29 21 8.26 1.83 1993 7 30 0 10.57 1.88 1993 7 30 3 10.02 1.99 1993 7 30 6 9.69 2.12 1993 7 30 9 10.65 2.25 1993 7 30 12 9.61 2.4 1993 7 30 15 10.98 2.5 1993 7 30 18 9.9 2.6 1993 7 30 21 9.04 2.62 1993 7 31 0 8.55 2.58 1993 7 31 3 8.18 2.51 1993 7 31 6 10.01 2.43 1993 7 31 9 9.44 2.42 1993 7 31 12 7.98 2.3 1993 7 31 15 8.57 2.17 1993 7 31 18 10.34 2.17 1993 7 31 21 10.18 2.31 1993 8 1 0 10.03 2.38 1993 8 1 3 9.92 2.37 1993 8 1 6 10.42 2.42 1993 8 1 9 10.39 
2.58 1993 8 1 12 10.18 2.69 1993 8 1 15 11.24 2.84 1993 8 1 18 11.27 3.12 1993 8 1 21 12.41 3.4 1993 8 2 0 13.45 3.69 1993 8 2 3 12.15 3.86 1993 8 2 6 10.37 3.8 1993 8 2 9 10.14 3.49 1993 8 2 12 11.24 3.22 1993 8 2 15 10.83 3.09 1993 8 2 18 8.91 2.92 1993 8 2 21 7.31 2.62 1993 8 3 0 8.07 2.33 1993 8 3 3 7.18 2.12 1993 8 3 6 8.18 1.95 1993 8 3 9 7.65 1.86 1993 8 3 12 6.08 1.78 1993 8 3 15 5.58 1.66 1993 8 3 18 8.03 1.62 1993 8 3 21 7.63 1.73 1993 8 4 0 10.23 1.74 1993 8 4 3 12.45 2.06 1993 8 4 6 15.52 2.66 1993 8 4 9 16.54 3.53 1993 8 4 12 15.95 4.33 1993 8 4 15 14.07 4.72 1993 8 4 18 16.04 4.75 1993 8 4 21 13.35 4.43 1993 8 5 0 13.01 3.94 1993 8 5 3 10.97 3.54 1993 8 5 6 10.2 3.33 1993 8 5 9 9.41 3.13 1993 8 5 12 6.75 2.94 1993 8 5 15 5.39 2.76 1993 8 5 18 5.87 2.6 1993 8 5 21 5.94 2.38 1993 8 6 0 7.96 2.14 1993 8 6 3 8.34 1.96 1993 8 6 6 6.91 1.83 1993 8 6 9 4.17 1.68 1993 8 6 12 3.25 1.57 1993 8 6 15 5.48 1.49 1993 8 6 18 4.99 1.44 1993 8 6 21 7.59 1.38 1993 8 7 0 7.43 1.36 1993 8 7 3 8.84 1.4 1993 8 7 6 8.05 1.52 1993 8 7 9 8.91 1.72 1993 8 7 12 8.82 1.85 1993 8 7 15 6.98 1.8 1993 8 7 18 4.87 1.69 1993 8 7 21 15.2 1.66 1993 8 8 0 16.98 3.41 1993 8 8 3 16.21 4.09 1993 8 8 6 11.67 4.43 1993 8 8 9 14.25 4.27 1993 8 8 12 14.63 4.72 1993 8 8 15 14.88 5.13 1993 8 8 18 15.2 5.23 1993 8 8 21 14.82 5.55 1993 8 9 0 14.68 5.72 1993 8 9 3 17.32 6.19 1993 8 9 6 19.45 7.14 1993 8 9 9 22.85 8.54 1993 8 9 12 21.25 9.11 1993 8 9 15 20.05 8.42 1993 8 9 18 20.22 7.94 1993 8 9 21 19.71 7.49 1993 8 10 0 17.26 6.81 1993 8 10 3 14.87 5.84 1993 8 10 6 11.72 4.85 1993 8 10 9 12.17 4.06 1993 8 10 12 9.4 3.49 1993 8 10 15 9.02 2.96 1993 8 10 18 10.68 2.64 1993 8 10 21 11.34 2.62 1993 8 11 0 10.38 2.68 1993 8 11 3 7.11 2.67 1993 8 11 6 4.67 2.6 1993 8 11 9 0.96 2.49 1993 8 11 12 2.15 2.34 1993 8 11 15 6.57 2.24 1993 8 11 18 9.67 2.11 1993 8 11 21 10.69 2.17 1993 8 12 0 8.7 2.39 1993 8 12 3 8.51 2.82 1993 8 12 6 0.93 2.89 1993 8 12 9 0.52 2.64 1993 8 12 12 1.43 2.44 1993 8 12 15 1.13 2.33 
1993 8 12 18 4.15 2.21 1993 8 12 21 5.36 2.08 1993 8 13 0 2.82 1.97 1993 8 13 3 5.46 1.87 1993 8 13 6 6.74 1.81 1993 8 13 9 8.26 1.82 1993 8 13 12 7.97 1.92 1993 8 13 15 8.55 2.1 1993 8 13 18 5.6 2.14 1993 8 13 21 4.67 2.03 1993 8 14 0 1.65 1.91 1993 8 14 3 3.01 1.77 1993 8 14 6 4.82 1.6 1993 8 14 9 7.96 1.49 1993 8 14 12 9.16 1.54 1993 8 14 15 10.38 1.71 1993 8 14 18 7.82 1.8 1993 8 14 21 6.33 1.69 1993 8 15 0 6.8 1.58 1993 8 15 3 7.1 1.57 1993 8 15 6 3.32 1.58 1993 8 15 9 5.75 1.62 1993 8 15 12 7.3 1.67 1993 8 15 15 8.12 1.78 1993 8 15 18 4.88 1.93 1993 8 15 21 5.03 2.15 1993 8 16 0 5.51 2.21 1993 8 16 3 5.91 2.12 1993 8 16 6 4.91 2.03 1993 8 16 9 5.92 1.97 1993 8 16 12 4.5 1.86 1993 8 16 15 6.49 1.74 1993 8 16 18 8.14 1.65 1993 8 16 21 8.49 1.69 1993 8 17 0 6.5 1.72 1993 8 17 3 7.12 1.66 1993 8 17 6 8.12 1.66 1993 8 17 9 9.51 1.7 1993 8 17 12 8.87 1.81 1993 8 17 15 13.24 2.16 1993 8 17 18 13.05 2.65 1993 8 17 21 10.75 2.91 1993 8 18 0 8.14 2.84 1993 8 18 3 9.96 2.84 1993 8 18 6 7.63 2.83 1993 8 18 9 8.17 2.7 1993 8 18 12 5.68 2.65 1993 8 18 15 6.86 2.44 1993 8 18 18 5.12 2.23 1993 8 18 21 3.52 2.1 1993 8 19 0 2.91 2.03 1993 8 19 3 4.24 1.86 1993 8 19 6 4.1 1.65 1993 8 19 9 4.14 1.48 1993 8 19 12 3.66 1.33 1993 8 19 15 3.74 1.19 1993 8 19 18 5.27 1.06 1993 8 19 21 3.62 0.97 1993 8 20 0 4.71 0.88 1993 8 20 3 5.81 0.83 1993 8 20 6 5.47 0.85 1993 8 20 9 7.52 0.87 1993 8 20 12 10.51 1.11 1993 8 20 15 11.93 1.67 1993 8 20 18 14.03 2.44 1993 8 20 21 12.88 3.03 1993 8 21 0 8.51 3.08 1993 8 21 3 8.86 2.79 1993 8 21 6 8.67 2.46 1993 8 21 9 7.48 2.27 1993 8 21 12 6.64 2.11 1993 8 21 15 6.43 1.99 1993 8 21 18 5.91 1.98 1993 8 21 21 5.9 2.03 1993 8 22 0 6.04 2.14 1993 8 22 3 6.36 2.24 1993 8 22 6 6.58 2.22 1993 8 22 9 6.1 2.08 1993 8 22 12 3.74 1.86 1993 8 22 15 7.39 1.61 1993 8 22 18 5.12 1.45 1993 8 22 21 5.16 1.29 1993 8 23 0 3.48 1.16 1993 8 23 3 4.56 1.07 1993 8 23 6 5.84 1 1993 8 23 9 5.14 0.93 1993 8 23 12 2.6 0.86 1993 8 23 15 4.82 0.78 1993 8 23 18 7.35 0.86 1993 8 
23 21 9.49 1.05 1993 8 24 0 9.57 1.41 1993 8 24 3 9.05 1.73 1993 8 24 6 9.49 1.82 1993 8 24 9 6.77 1.64 1993 8 24 12 5.99 1.43 1993 8 24 15 8.91 1.31 1993 8 24 18 9.86 1.42 1993 8 24 21 8.94 1.53 1993 8 25 0 8.87 1.66 1993 8 25 3 10.18 1.94 1993 8 25 6 8.87 2.13 1993 8 25 9 7.87 2.01 1993 8 25 12 5.7 1.82 1993 8 25 15 3.84 1.64 1993 8 25 18 4.84 1.49 1993 8 25 21 5.92 1.35 1993 8 26 0 5.76 1.28 1993 8 26 3 6.41 1.28 1993 8 26 6 5.87 1.33 1993 8 26 9 7.41 1.38 1993 8 26 12 7.74 1.43 1993 8 26 15 8.7 1.6 1993 8 26 18 8.09 1.74 1993 8 26 21 9.03 1.75 1993 8 27 0 7.97 1.73 1993 8 27 3 7.69 1.77 1993 8 27 6 6.62 1.77 1993 8 27 9 7.72 1.85 1993 8 27 12 9.91 2.12 1993 8 27 15 10.14 2.4 1993 8 27 18 11.81 2.69 1993 8 27 21 11.82 2.92 1993 8 28 0 13.45 3.21 1993 8 28 3 12.94 3.63 1993 8 28 6 9.52 3.57 1993 8 28 9 10.11 3.29 1993 8 28 12 9.61 3.16 1993 8 28 15 10.42 3.03 1993 8 28 18 11.39 2.89 1993 8 28 21 10.98 2.88 1993 8 29 0 10.38 2.93 1993 8 29 3 12.26 2.88 1993 8 29 6 12.34 3.09 1993 8 29 9 15.05 3.49 1993 8 29 12 15.49 4.11 1993 8 29 15 10.05 4.02 1993 8 29 18 6.51 3.53 1993 8 29 21 6.24 3.12 1993 8 30 0 7.7 2.72 1993 8 30 3 7.72 2.41 1993 8 30 6 8.67 2.2 1993 8 30 9 9.23 2.17 1993 8 30 12 11.24 2.42 1993 8 30 15 11.45 2.82 1993 8 30 18 10.81 2.98 1993 8 30 21 10.92 2.93 1993 8 31 0 9.2 2.81 1993 8 31 3 9.76 2.63 1993 8 31 6 5.62 2.45 1993 8 31 9 7.51 2.16 1993 8 31 12 7.27 1.95 1993 8 31 15 9.25 1.77 1993 8 31 18 10.25 1.8 1993 8 31 21 9.55 1.92 1993 9 1 0 8.51 1.95 1993 9 1 3 11.18 2.05 1993 9 1 6 12.59 2.33 1993 9 1 9 11.01 2.61 1993 9 1 12 7.23 2.82 1993 9 1 15 8.41 2.71 1993 9 1 18 7.92 2.55 1993 9 1 21 8.07 2.39 1993 9 2 0 6.05 2.26 1993 9 2 3 6.11 2.07 1993 9 2 6 4.43 1.88 1993 9 2 9 2.39 1.72 1993 9 2 12 0 1.6 1993 9 2 15 1.41 1.49 1993 9 2 18 4.69 1.39 1993 9 2 21 7.5 1.35 1993 9 3 0 8.26 1.39 1993 9 3 3 9.27 1.53 1993 9 3 6 7.87 1.65 1993 9 3 9 6.41 1.59 1993 9 3 12 6.06 1.41 1993 9 3 15 3.95 1.23 1993 9 3 18 2.59 1.08 1993 9 3 21 0.83 0.94 1993 9 4 0 0.24 
0.83 1993 9 4 3 0.92 0.72 1993 9 4 6 2.93 0.63 1993 9 4 9 3.99 0.62 1993 9 4 12 7.51 0.63 1993 9 4 15 8.23 0.88 1993 9 4 18 7.26 1.08 1993 9 4 21 7.64 1.26 1993 9 5 0 7.87 1.42 1993 9 5 3 8.37 1.63 1993 9 5 6 8.66 1.8 1993 9 5 9 8.2 1.8 1993 9 5 12 9.28 1.72 1993 9 5 15 8.34 1.65 1993 9 5 18 8.87 1.72 1993 9 5 21 9.3 1.9 1993 9 6 0 7.37 2.06 1993 9 6 3 8.07 1.92 1993 9 6 6 5.93 1.73 1993 9 6 9 4.58 1.51 1993 9 6 12 4.98 1.38 1993 9 6 15 4.42 1.35 1993 9 6 18 2.7 1.28 1993 9 6 21 2.83 1.24 1993 9 7 0 2.56 1.26 1993 9 7 3 0.4 1.27 1993 9 7 6 1.31 1.22 1993 9 7 9 3.52 1.19 1993 9 7 12 4.87 1.31 1993 9 7 15 3.13 1.09 1993 9 7 18 4.64 0.94 1993 9 7 21 5.69 0.94 1993 9 8 0 6.69 1.11 1993 9 8 3 6.27 1.24 1993 9 8 6 4.62 1.33 1993 9 8 9 4.79 1.41 1993 9 8 12 3.16 1.49 1993 9 8 15 2.81 1.58 1993 9 8 18 4.52 1.62 1993 9 8 21 5.29 1.67 1993 9 9 0 7.63 1.79 1993 9 9 3 9.33 2.02 1993 9 9 6 10.21 2.28 1993 9 9 9 7.98 2.54 1993 9 9 12 10.57 2.63 1993 9 9 15 11.37 2.9 1993 9 9 18 12.73 3.39 1993 9 9 21 12.48 3.77 1993 9 10 0 12.19 3.97 1993 9 10 3 16.91 4.67 1993 9 10 6 13.53 5.51 1993 9 10 9 12.38 5.16 1993 9 10 12 9.63 4.54 1993 9 10 15 9.78 3.98 1993 9 10 18 9.22 3.53 1993 9 10 21 7.71 3.07 1993 9 11 0 5.92 2.61 1993 9 11 3 9.75 2.2 1993 9 11 6 10.43 2.09 1993 9 11 9 12.31 2.17 1993 9 11 12 12.63 2.63 1993 9 11 15 10.71 3.08 1993 9 11 18 9.84 3.24 1993 9 11 21 7.26 2.95 1993 9 12 0 5.76 2.6 1993 9 12 3 4.37 2.28 1993 9 12 6 2.65 2.06 1993 9 12 9 7.54 1.82 1993 9 12 12 10.29 1.67 1993 9 12 15 12.22 1.9 1993 9 12 18 14.32 2.3 1993 9 12 21 11.26 2.54 1993 9 13 0 9.21 2.64 1993 9 13 3 9.4 2.49 1993 9 13 6 6.47 2.29 1993 9 13 9 10.76 2.14 1993 9 13 12 8.97 2.31 1993 9 13 15 10.46 2.5 1993 9 13 18 11.39 2.85 1993 9 13 21 9.29 3.05 1993 9 14 0 7.26 3.07 1993 9 14 3 10.91 3.18 1993 9 14 6 10.6 3.08 1993 9 14 9 10.06 2.87 1993 9 14 12 11.91 2.83 1993 9 14 15 12.12 3.18 1993 9 14 18 12.44 3.45 1993 9 14 21 12.37 3.67 1993 9 15 0 9.01 3.73 1993 9 15 3 10.4 3.43 1993 9 15 6 9.15 3.15 1993 
9 15 9 9.97 2.78 1993 9 15 12 12.66 3 1993 9 15 15 13.26 4.06 1993 9 15 18 8.43 3.96 1993 9 15 21 9.04 3.46 1993 9 16 0 8.76 3.09 1993 9 16 3 9.21 2.81 1993 9 16 6 4.09 2.52 1993 9 16 9 6.69 2.22 1993 9 16 12 5.32 1.99 1993 9 16 15 4.55 1.73 1993 9 16 18 2.36 1.49 1993 9 16 21 1.59 1.27 1993 9 17 0 2.88 1.09 1993 9 17 3 4.27 0.95 1993 9 17 6 6.45 0.86 1993 9 17 9 8.54 0.9 1993 9 17 12 8.24 1.14 1993 9 17 15 9.4 1.47 1993 9 17 18 8.06 1.69 1993 9 17 21 8.31 1.71 1993 9 18 0 7.77 1.8 1993 9 18 3 7.15 1.84 1993 9 18 6 5.2 1.81 1993 9 18 9 4.52 1.7 1993 9 18 12 3.86 1.6 1993 9 18 15 3.87 1.5 1993 9 18 18 4.97 1.42 1993 9 18 21 4.54 1.34 1993 9 19 0 5.28 1.28 1993 9 19 3 7.83 1.27 1993 9 19 6 11.99 1.43 1993 9 19 9 10.73 1.95 1993 9 19 12 8.24 2.11 1993 9 19 15 10.74 2.26 1993 9 19 18 11.48 2.45 1993 9 19 21 12.35 2.57 1993 9 20 0 12.12 2.66 1993 9 20 3 10.92 2.76 1993 9 20 6 11.94 2.95 1993 9 20 9 9.72 2.91 1993 9 20 12 8.34 2.6 1993 9 20 15 6.72 2.22 1993 9 20 18 9.15 1.86 1993 9 20 21 9.43 1.73 1993 9 21 0 8.12 1.69 1993 9 21 3 8.78 1.75 1993 9 21 6 8.85 1.83 1993 9 21 9 9.91 1.93 1993 9 21 12 8.85 2.06 1993 9 21 15 8.86 2.06 1993 9 21 18 9.25 2.02 1993 9 21 21 8.99 2.07 1993 9 22 0 8.12 2.05 1993 9 22 3 8.83 2.02 1993 9 22 6 8.94 2.03 1993 9 22 9 8.29 2.03 1993 9 22 12 5.51 1.85 1993 9 22 15 5.78 1.65 1993 9 22 18 6.21 1.5 1993 9 22 21 5.48 1.36 1993 9 23 0 4.69 1.22 1993 9 23 3 7.55 1.14 1993 9 23 6 8.94 1.28 1993 9 23 9 9.13 1.5 1993 9 23 12 9.41 1.73 1993 9 23 15 11.36 2.11 1993 9 23 18 9.02 2.42 1993 9 23 21 9.6 2.33 1993 9 24 0 6.07 2.12 1993 9 24 3 7.23 1.92 1993 9 24 6 6.45 1.78 1993 9 24 9 6.22 1.67 1993 9 24 12 6.13 1.62 1993 9 24 15 4.4 1.59 1993 9 24 18 2.18 1.54 1993 9 24 21 2.44 1.51 1993 9 25 0 4.22 1.47 1993 9 25 3 7.26 1.42 1993 9 25 6 7.96 1.46 1993 9 25 9 10.2 1.63 1993 9 25 12 10.78 1.95 1993 9 25 15 10.91 2.22 1993 9 25 18 11.2 2.51 1993 9 25 21 12.39 3 1993 9 26 0 9.24 3.65 1993 9 26 3 9.4 3.54 1993 9 26 6 8.83 3.35 1993 9 26 9 7.6 3.14 1993 9 
26 12 8.01 2.94 1993 9 26 15 8.71 2.74 1993 9 26 18 8.16 2.56 1993 9 26 21 9.14 2.46 1993 9 27 0 8.13 2.39 1993 9 27 3 7.98 2.23 1993 9 27 6 8.96 2.12 1993 9 27 9 8.69 2.17 1993 9 27 12 8.47 2.18 1993 9 27 15 8.8 2.13 1993 9 27 18 9.11 2.12 1993 9 27 21 12.36 2.44 1993 9 28 0 9.19 2.82 1993 9 28 3 9.78 3.12 1993 9 28 6 8.04 3.36 1993 9 28 9 9.92 3.26 1993 9 28 12 10.2 3.11 1993 9 28 15 14.36 3.23 1993 9 28 18 14.03 3.85 1993 9 28 21 12.99 4.09 1993 9 29 0 11.22 4 1993 9 29 3 9.46 3.64 1993 9 29 6 8.72 3.33 1993 9 29 9 8.11 3 1993 9 29 12 8.72 2.62 1993 9 29 15 11.78 2.59 1993 9 29 18 13.49 2.85 1993 9 29 21 13.71 3.23 1993 9 30 0 9.15 3.34 1993 9 30 3 9.32 3.03 1993 9 30 6 7.48 2.7 1993 9 30 9 7.76 2.41 1993 9 30 12 9.35 2.28 1993 9 30 15 11.16 2.37 1993 9 30 18 10.38 2.53 1993 9 30 21 10.26 2.54 1993 10 1 0 9.07 2.53 1993 10 1 3 10.65 2.47 1993 10 1 6 10.53 2.44 1993 10 1 9 11.05 2.37 1993 10 1 12 10.63 2.44 1993 10 1 15 12.11 2.72 1993 10 1 18 12.91 3.07 1993 10 1 21 12.24 3.22 1993 10 2 0 9.86 3.32 1993 10 2 3 11.36 3.29 1993 10 2 6 9.79 3.36 1993 10 2 9 10.38 3.23 1993 10 2 12 7.67 3.03 1993 10 2 15 9.06 2.77 1993 10 2 18 6.04 2.48 1993 10 2 21 3.94 2.2 1993 10 3 0 3.87 1.9 1993 10 3 3 7.9 1.57 1993 10 3 6 10.94 1.37 1993 10 3 9 11.8 1.54 1993 10 3 12 13.89 1.99 1993 10 3 15 12 2.47 1993 10 3 18 9.33 2.59 1993 10 3 21 8.63 2.39 1993 10 4 0 8.29 2.11 1993 10 4 3 8.78 1.87 1993 10 4 6 8.14 1.87 1993 10 4 9 7.83 1.85 1993 10 4 12 6.91 1.68 1993 10 4 15 4.54 1.56 1993 10 4 18 4.65 1.44 1993 10 4 21 5.58 1.26 1993 10 5 0 4.24 1.09 1993 10 5 3 6.94 0.97 1993 10 5 6 9.12 1.05 1993 10 5 9 10.69 1.28 1993 10 5 12 11.18 1.66 1993 10 5 15 13.85 2.19 1993 10 5 18 14.24 2.71 1993 10 5 21 10.15 3.07 1993 10 6 0 8.4 2.97 1993 10 6 3 8.59 2.69 1993 10 6 6 7.71 2.37 1993 10 6 9 5.84 2.11 1993 10 6 12 6.77 1.91 1993 10 6 15 6.31 1.76 1993 10 6 18 4.82 1.65 1993 10 6 21 7.98 1.61 1993 10 7 0 6.42 1.62 1993 10 7 3 10.37 1.69 1993 10 7 6 11.63 2 1993 10 7 9 10.94 2.1 1993 10 7 12 
6.24 2 1993 10 7 15 9.71 1.98 1993 10 7 18 6.77 2.04 1993 10 7 21 6.61 2.07 1993 10 8 0 4.93 1.96 1993 10 8 3 10.12 1.75 1993 10 8 6 10.73 1.88 1993 10 8 9 11.1 1.86 1993 10 8 12 11.3 2.24 1993 10 8 15 10.35 2.63 1993 10 8 18 9.96 2.71 1993 10 8 21 8.37 2.73 1993 10 9 0 5.55 2.66 1993 10 9 3 10.37 2.39 1993 10 9 6 11.86 2.28 1993 10 9 9 11.31 2.38 1993 10 9 12 9.75 2.43 1993 10 9 15 8.57 2.32 1993 10 9 18 8.83 2.18 1993 10 9 21 7.51 2.02 1993 10 10 0 5.54 1.76 1993 10 10 3 6.51 1.61 1993 10 10 6 7.01 1.57 1993 10 10 9 9.5 1.6 1993 10 10 12 8.77 1.79 1993 10 10 15 8.33 1.8 1993 10 10 18 7.41 1.65 1993 10 10 21 5.83 1.5 1993 10 11 0 2.95 1.4 1993 10 11 3 3.86 1.37 1993 10 11 6 5.64 1.41 1993 10 11 9 7.42 1.45 1993 10 11 12 10.11 1.55 1993 10 11 15 10.84 1.89 1993 10 11 18 11.56 2.27 1993 10 11 21 8.76 2.39 1993 10 12 0 5.64 2.15 1993 10 12 3 8.57 1.94 1993 10 12 6 11.74 2 1993 10 12 9 8.36 1.92 1993 10 12 12 12.34 2.06 1993 10 12 15 12.24 2.38 1993 10 12 18 13.46 3.04 1993 10 12 21 10.32 4.31 1993 10 13 0 6.94 4.63 1993 10 13 3 11.16 3.72 1993 10 13 6 10.31 3.08 1993 10 13 9 13 2.91 1993 10 13 12 11.89 3.08 1993 10 13 15 11.18 3.15 1993 10 13 18 9.48 2.92 1993 10 13 21 9.68 2.61 1993 10 14 0 6.88 2.46 1993 10 14 3 11.83 2.51 1993 10 14 6 10.33 2.79 1993 10 14 9 10.08 2.8 1993 10 14 12 8.06 2.71 1993 10 14 15 9.95 2.58 1993 10 14 18 8.96 2.58 1993 10 14 21 9.1 2.53 1993 10 15 0 4.07 2.52 1993 10 15 3 9.52 2.44 1993 10 15 6 9.17 2.5 1993 10 15 9 8.72 2.49 1993 10 15 12 10.48 2.43 1993 10 15 15 10.26 2.41 1993 10 15 18 9.49 2.33 1993 10 15 21 7.05 2.22 1993 10 16 0 4.15 2.03 1993 10 16 3 7.26 1.91 1993 10 16 6 10.45 2.02 1993 10 16 9 9.51 2.22 1993 10 16 12 10.44 2.33 1993 10 16 15 8.96 2.22 1993 10 16 18 3.45 2 1993 10 16 21 6.03 1.81 1993 10 17 0 3.49 1.81 1993 10 17 3 3.67 1.85 1993 10 17 6 5.01 1.87 1993 10 17 9 5.74 1.89 1993 10 17 12 5.75 1.83 1993 10 17 15 6.42 1.75 1993 10 17 18 6.34 1.71 1993 10 17 21 6.38 1.69 1993 10 18 0 6.52 1.71 1993 10 18 3 8.59 1.82 1993 
10 18 6 7.57 1.95 1993 10 18 9 11.42 2.07 1993 10 18 12 11.04 2.42 1993 10 18 15 10.71 2.68 1993 10 18 18 9.87 2.64 1993 10 18 21 8.72 2.4 1993 10 19 0 8.43 2.12 1993 10 19 3 8.69 1.94 1993 10 19 6 9.42 1.86 1993 10 19 9 10.45 1.94 1993 10 19 12 10.04 2.02 1993 10 19 15 8.94 2.04 1993 10 19 18 7.44 1.88 1993 10 19 21 8.31 1.73 1993 10 20 0 6.16 1.59 1993 10 20 3 4.82 1.41 1993 10 20 6 4.51 1.3 1993 10 20 9 4.68 1.22 1993 10 20 12 3.05 1.15 1993 10 20 15 3.27 1.07 1993 10 20 18 4.36 0.99 1993 10 20 21 5.84 0.99 1993 10 21 0 3.82 1.03 1993 10 21 3 3.48 0.96 1993 10 21 6 3.97 0.9 1993 10 21 9 4.73 0.86 1993 10 21 12 5.11 0.84 1993 10 21 15 6.92 0.87 1993 10 21 18 6.58 1.02 1993 10 21 21 5.92 1.02 1993 10 22 0 4.81 0.92 1993 10 22 3 5.17 0.83 1993 10 22 6 5.97 0.8 1993 10 22 9 6 0.85 1993 10 22 12 4.4 0.82 1993 10 22 15 3.93 0.73 1993 10 22 18 3.39 0.68 1993 10 22 21 4.03 0.65 1993 10 23 0 4.23 0.64 1993 10 23 3 5.44 0.67 1993 10 23 6 7.8 0.81 1993 10 23 9 7.08 1.01 1993 10 23 12 5.02 1.07 1993 10 23 15 4.72 1.01 1993 10 23 18 3.99 0.94 1993 10 23 21 4.21 0.88 1993 10 24 0 4.44 0.84 1993 10 24 3 3 0.81 1993 10 24 6 3.22 0.77 1993 10 24 9 3.09 0.74 1993 10 24 12 2.65 0.73 1993 10 24 15 4.24 0.74 1993 10 24 18 5.5 0.81 1993 10 24 21 5.82 0.93 1993 10 25 0 4.83 0.91 1993 10 25 3 6.4 0.87 1993 10 25 6 6.37 0.92 1993 10 25 9 8.61 1.2 1993 10 25 12 5.21 1.4 1993 10 25 15 5.74 1.26 1993 10 25 18 7.17 1.13 1993 10 25 21 6.14 1.11 1993 10 26 0 5.7 1.05 1993 10 26 3 7.17 1.12 1993 10 26 6 6.01 1.24 1993 10 26 9 6.13 1.23 1993 10 26 12 4.4 1.16 1993 10 26 15 5.8 1.05 1993 10 26 18 6.96 0.99 1993 10 26 21 5.17 0.97 1993 10 27 0 3.44 0.89 1993 10 27 3 3.79 0.82 1993 10 27 6 5.32 0.76 1993 10 27 9 5.37 0.78 1993 10 27 12 6.62 0.84 1993 10 27 15 8.74 1.12 1993 10 27 18 9.72 1.54 1993 10 27 21 12.02 2.11 1993 10 28 0 12.62 2.88 1993 10 28 3 16.5 3.68 1993 10 28 6 12.66 4.22 1993 10 28 9 11.1 4.35 1993 10 28 12 14.49 4.2 1993 10 28 15 14.19 4.23 1993 10 28 18 15.17 3.99 1993 10 28 21 
13.54 3.93 1993 10 29 0 9.8 3.69 1993 10 29 3 9.9 3.24 1993 10 29 6 8.33 2.92 1993 10 29 9 8.92 2.7 1993 10 29 12 9.39 2.52 1993 10 29 15 8.88 2.38 1993 10 29 18 5.87 2.19 1993 10 29 21 6.87 1.96 1993 10 30 0 6.13 1.8 1993 10 30 3 9.41 1.67 1993 10 30 6 10.25 1.81 1993 10 30 9 11.23 2.05 1993 10 30 12 9.34 2.19 1993 10 30 15 10.36 2.25 1993 10 30 18 9.17 2.4 1993 10 30 21 8.83 2.45 1993 10 31 0 4.97 2.34 1993 10 31 3 7.36 2.11 1993 10 31 6 6.75 1.94 1993 10 31 9 4.23 1.78 1993 10 31 12 3.42 1.69 1993 10 31 15 6.88 1.65 1993 10 31 18 7.38 1.61 1993 10 31 21 7.5 1.64 1993 11 1 0 4.9 1.72 1993 11 1 3 6.16 1.73 1993 11 1 6 7.64 1.75 1993 11 1 9 8.52 1.82 1993 11 1 12 6.81 1.77 1993 11 1 15 6.93 1.64 1993 11 1 18 8.51 1.52 1993 11 1 21 7.87 1.5 1993 11 2 0 8.46 1.42 1993 11 2 3 10.13 1.45 1993 11 2 6 13.19 1.86 1993 11 2 9 11.32 2.16 1993 11 2 12 12.01 2.68 1993 11 2 15 12.83 3.48 1993 11 2 18 11.9 3.75 1993 11 2 21 12.97 3.8 1993 11 3 0 10.65 3.88 1993 11 3 3 9.31 3.68 1993 11 3 6 9.21 3.21 1993 11 3 9 14.93 2.92 1993 11 3 12 15.4 3.52 1993 11 3 15 14.8 3.99 1993 11 3 18 12.89 3.88 1993 11 3 21 9.3 3.32 1993 11 4 0 5.54 2.83 1993 11 4 3 1.05 2.45 1993 11 4 6 6.48 2.17 1993 11 4 9 5.32 1.98 1993 11 4 12 5.23 1.86 1993 11 4 15 4.39 1.74 1993 11 4 18 4.49 1.57 1993 11 4 21 5.86 1.45 1993 11 5 0 5.68 1.45 1993 11 5 3 4.81 1.4 1993 11 5 6 4.01 1.34 1993 11 5 9 4.14 1.29 1993 11 5 12 2.38 1.27 1993 11 5 15 0.52 1.25 1993 11 5 18 6.33 1.22 1993 11 5 21 7.83 1.34 1993 11 6 0 6.7 1.34 1993 11 6 3 10.21 1.44 1993 11 6 6 10.37 1.73 1993 11 6 9 10.73 1.91 1993 11 6 12 9.68 2.12 1993 11 6 15 11.61 2.31 1993 11 6 18 14.48 2.7 1993 11 6 21 12.42 3.15 1993 11 7 0 8.99 3.09 1993 11 7 3 8.46 2.74 1993 11 7 6 4.85 2.32 1993 11 7 9 5.76 1.98 1993 11 7 12 8.59 1.73 1993 11 7 15 10.46 1.67 1993 11 7 18 12.14 1.9 1993 11 7 21 11.12 2.19 1993 11 8 0 9.42 2.49 1993 11 8 3 10.51 2.71 1993 11 8 6 9.35 2.73 1993 11 8 9 11.37 2.85 1993 11 8 12 8.3 3.13 1993 11 8 15 6.89 3.25 1993 11 8 18 4.3 3.2 
1993 11 8 21 6.71 3.16 1993 11 9 0 8.63 3.06 1993 11 9 3 11.4 2.99 1993 11 9 6 9.1 3.03 1993 11 9 9 8.59 3.02 1993 11 9 12 6.43 2.97 1993 11 9 15 5.75 2.95 1993 11 9 18 5.53 2.73 1993 11 9 21 6.53 2.44 1993 11 10 0 2.33 2.18 1993 11 10 3 2.77 2 1993 11 10 6 3.25 1.89 1993 11 10 9 3.25 1.85 1993 11 10 12 0.97 1.93 1993 11 10 15 4 1.87 1993 11 10 18 4.14 1.76 1993 11 10 21 6.91 1.66 1993 11 11 0 6.07 1.61 1993 11 11 3 6.6 1.54 1993 11 11 6 5.21 1.48 1993 11 11 9 3.82 1.39 1993 11 11 12 4.19 1.3 1993 11 11 15 6.37 1.23 1993 11 11 18 6.29 1.19 1993 11 11 21 5.77 1.17 1993 11 12 0 2.66 1.11 1993 11 12 3 6.78 1.06 1993 11 12 6 6.41 1.09 1993 11 12 9 7.27 1.13 1993 11 12 12 7.19 1.15 1993 11 12 15 9.21 1.31 1993 11 12 18 9.92 1.53 1993 11 12 21 9.96 1.67 1993 11 13 0 8.41 1.85 1993 11 13 3 10.08 1.95 1993 11 13 6 10.95 2.34 1993 11 13 9 7.92 2.28 1993 11 13 12 11.4 2.34 1993 11 13 15 12.08 2.61 1993 11 13 18 10.21 2.94 1993 11 13 21 9.79 3.07 1993 11 14 0 8.19 3.08 1993 11 14 3 11.93 3.1 1993 11 14 6 8.45 3 1993 11 14 9 7.19 2.69 1993 11 14 12 6.25 2.41 1993 11 14 15 5.07 2.2 1993 11 14 18 6.08 2.02 1993 11 14 21 5.65 1.86 1993 11 15 0 3.63 1.69 1993 11 15 3 5.61 1.5 1993 11 15 6 4.06 1.31 1993 11 15 9 7.09 1.16 1993 11 15 12 9.02 1.16 1993 11 15 15 11.22 1.52 1993 11 15 18 13.03 2.12 1993 11 15 21 10.49 2.51 1993 11 16 0 6.92 2.47 1993 11 16 3 7.11 2.28 1993 11 16 6 7.94 2.1 1993 11 16 9 6.86 1.98 1993 11 16 12 6.25 1.84 1993 11 16 15 9.22 1.94 1993 11 16 18 9.26 2.45 1993 11 16 21 8.99 2.5 1993 11 17 0 5.02 2.45 1993 11 17 3 6.83 2.4 1993 11 17 6 5.99 2.3 1993 11 17 9 7.92 2.06 1993 11 17 12 8.34 1.82 1993 11 17 15 10.78 1.75 1993 11 17 18 10.51 1.92 1993 11 17 21 11.2 2.26 1993 11 18 0 10.48 2.51 1993 11 18 3 12.41 2.79 1993 11 18 6 10.86 3.04 1993 11 18 9 11.27 3 1993 11 18 12 14.18 3 1993 11 18 15 12.15 3.15 1993 11 18 18 13.03 3.26 1993 11 18 21 13.07 3.35 1993 11 19 0 9.32 3.33 1993 11 19 3 12.06 3.21 1993 11 19 6 10.24 3.11 1993 11 19 9 8.88 2.81 1993 11 19 12 
7.16 2.48 1993 11 19 15 8.34 2.24 1993 11 19 18 7.88 2.04 1993 11 19 21 6.9 1.91 1993 11 20 0 3.52 1.86 1993 11 20 3 6.23 1.89 1993 11 20 6 7.53 1.94 1993 11 20 9 9.14 2.19 1993 11 20 12 6.11 2.29 1993 11 20 15 10.12 2.26 1993 11 20 18 11.97 2.41 1993 11 20 21 11.44 2.51 1993 11 21 0 9.64 2.44 1993 11 21 3 9.64 2.4 1993 11 21 6 9.68 2.4 1993 11 21 9 8.99 2.39 1993 11 21 12 9.52 2.39 1993 11 21 15 11.03 2.51 1993 11 21 18 13.76 3.1 1993 11 21 21 14.13 3.87 1993 11 22 0 11.07 4.27 1993 11 22 3 9.97 4.31 1993 11 22 6 10.22 3.9 1993 11 22 9 12.92 3.75 1993 11 22 12 12.37 3.86 1993 11 22 15 12.63 3.92 1993 11 22 18 11.62 3.77 1993 11 22 21 11.12 3.74 1993 11 23 0 9.2 3.6 1993 11 23 3 8.48 3.31 1993 11 23 6 7.46 2.96 1993 11 23 9 8.87 2.63 1993 11 23 12 9.8 2.43 1993 11 23 15 8.52 2.31 1993 11 23 18 8.28 2.09 1993 11 23 21 7.47 1.92 1993 11 24 0 4.62 1.74 1993 11 24 3 5.86 1.56 1993 11 24 6 6.91 1.43 1993 11 24 9 5.62 1.33 1993 11 24 12 5.11 1.2 1993 11 24 15 7.54 1.12 1993 11 24 18 7.4 1.14 1993 11 24 21 7.53 1.2 1993 11 25 0 8.69 1.36 1993 11 25 3 10.28 1.73 1993 11 25 6 10.18 2.16 1993 11 25 9 9.49 2.34 1993 11 25 12 8.14 2.4 1993 11 25 15 11.19 2.45 1993 11 25 18 12.36 2.66 1993 11 25 21 10.81 2.86 1993 11 26 0 12.33 2.86 1993 11 26 3 11.39 2.79 1993 11 26 6 9.4 2.69 1993 11 26 9 8.55 2.43 1993 11 26 12 10.28 2.29 1993 11 26 15 10.27 2.32 1993 11 26 18 11.35 2.44 1993 11 26 21 10.94 2.55 1993 11 27 0 9.46 2.61 1993 11 27 3 10.35 2.53 1993 11 27 6 9.36 2.34 1993 11 27 9 7.06 2.18 1993 11 27 12 5.11 2.01 1993 11 27 15 0.74 1.86 1993 11 27 18 0.42 1.72 1993 11 27 21 1.43 1.59 1993 11 28 0 0.69 1.53 1993 11 28 3 1.14 1.49 1993 11 28 6 2.82 1.4 1993 11 28 9 2.97 1.31 1993 11 28 12 4.53 1.31 1993 11 28 15 6.65 1.44 1993 11 28 18 4.85 1.67 1993 11 28 21 2.64 1.94 1993 11 29 0 4.86 2.19 1993 11 29 3 7.05 2.38 1993 11 29 6 5.79 2.5 1993 11 29 9 8.41 2.55 1993 11 29 12 10.11 2.68 1993 11 29 15 8.71 2.78 1993 11 29 18 7.62 2.76 1993 11 29 21 7.61 2.69 1993 11 30 0 4.52 2.58 
1993 11 30 3 7.88 2.49 1993 11 30 6 7.47 2.47 1993 11 30 9 10.14 2.32 1993 11 30 12 8.67 2.25 1993 11 30 15 6.89 2.39 1993 11 30 18 6.44 2.6 1993 11 30 21 2.35 2.64 1993 12 1 0 1.84 2.79 1993 12 1 3 5.86 2.85 1993 12 1 6 7.89 2.87 1993 12 1 9 9.11 2.9 1993 12 1 12 12.5 3 1993 12 1 15 13.76 3.31 1993 12 1 18 12.01 3.57 1993 12 1 21 9.72 3.62 1993 12 2 0 8.74 3.46 1993 12 2 3 7.09 3.29 1993 12 2 6 7.6 3.12 1993 12 2 9 4.59 3.01 1993 12 2 12 5 2.84 1993 12 2 15 4.36 2.65 1993 12 2 18 7.86 2.52 1993 12 2 21 13.14 2.8 1993 12 3 0 6.57 2.84 1993 12 3 3 3.57 2.76 1993 12 3 6 2.71 2.48 1993 12 3 9 6.36 2.11 1993 12 3 12 6.27 1.96 1993 12 3 15 6.68 1.89 1993 12 3 18 4.74 1.83 1993 12 3 21 2.46 1.76 1993 12 4 0 1.56 1.75 1993 12 4 3 0.56 1.75 1993 12 4 6 5.85 1.65 1993 12 4 9 8.52 1.58 1993 12 4 12 8.06 1.63 1993 12 4 15 8.05 1.77 1993 12 4 18 4.24 1.74 1993 12 4 21 4.62 1.63 1993 12 5 0 7.24 1.6 1993 12 5 3 6.58 1.62 1993 12 5 6 6.38 1.62 1993 12 5 9 6.03 1.61 1993 12 5 12 6.56 1.49 1993 12 5 15 8.69 1.44 1993 12 5 18 9.43 1.56 1993 12 5 21 9.27 1.91 1993 12 6 0 5.33 1.99 1993 12 6 3 6.61 2 1993 12 6 6 4.29 1.92 1993 12 6 9 6.08 1.83 1993 12 6 12 6.56 1.82 1993 12 6 15 5.33 1.86 1993 12 6 18 6.09 1.81 1993 12 6 21 9.01 1.78 1993 12 7 0 7.22 1.88 1993 12 7 3 11.74 2.07 1993 12 7 6 11.63 2.48 1993 12 7 9 11.6 2.81 1993 12 7 12 11.24 3.05 1993 12 7 15 9.7 3.01 1993 12 7 18 11.58 2.75 1993 12 7 21 13.57 2.8 1993 12 8 0 14.61 3.12 1993 12 8 3 16.08 3.89 1993 12 8 6 9.16 4.29 1993 12 8 9 11.64 4.17 1993 12 8 12 9.12 4.16 1993 12 8 15 7.63 3.91 1993 12 8 18 6.49 3.48 1993 12 8 21 3.34 3.02 1993 12 9 0 5.23 2.57 1993 12 9 3 8.67 2.15 1993 12 9 6 6.09 1.87 1993 12 9 9 12.41 1.8 1993 12 9 12 11.76 2.1 1993 12 9 15 14.06 2.3 1993 12 9 18 11.21 2.55 1993 12 9 21 8.1 2.37 1993 12 10 0 5.57 1.99 1993 12 10 3 6.18 1.73 1993 12 10 6 5.17 1.54 1993 12 10 9 5.47 1.38 1993 12 10 12 4.65 1.23 1993 12 10 15 6.71 1.06 1993 12 10 18 10.15 1 1993 12 10 21 10.15 1.26 1993 12 11 0 10.66 1.68 1993 12 
11 3 13.72 2.5 1993 12 11 6 15.83 3.34 1993 12 11 9 7.18 3.37 1993 12 11 12 7.31 2.98 1993 12 11 15 7.86 2.7 1993 12 11 18 5.43 2.53 1993 12 11 21 4.49 2.49 1993 12 12 0 4.08 2.61 1993 12 12 3 7.67 2.64 1993 12 12 6 13.46 2.83 1993 12 12 9 9.78 2.84 1993 12 12 12 5.07 2.61 1993 12 12 15 7.85 2.33 1993 12 12 18 8.91 2.14 1993 12 12 21 7.66 1.92 1993 12 13 0 10.05 1.72 1993 12 13 3 9.9 1.82 1993 12 13 6 7.33 2.14 1993 12 13 9 7.34 2.22 1993 12 13 12 4.41 1.86 1993 12 13 15 6.56 1.6 1993 12 13 18 6.75 1.41 1993 12 13 21 6.41 1.26 1993 12 14 0 6.61 1.2 1993 12 14 3 5.65 1.24 1993 12 14 6 5.73 1.46 1993 12 14 9 7.19 1.77 1993 12 14 12 5.12 1.85 1993 12 14 15 4.85 1.73 1993 12 14 18 5.29 1.53 1993 12 14 21 7.03 1.36 1993 12 15 0 6.44 1.24 1993 12 15 3 7.83 1.17 1993 12 15 6 10.06 1.34 1993 12 15 9 11.18 1.73 1993 12 15 12 9.76 2.07 1993 12 15 15 8.03 2.04 1993 12 15 18 10.22 2.16 1993 12 15 21 12.55 2.65 1993 12 16 0 8.37 2.96 1993 12 16 3 8.66 2.95 1993 12 16 6 8.75 2.91 1993 12 16 9 9.07 2.75 1993 12 16 12 12.04 2.61 1993 12 16 15 12.31 2.79 1993 12 16 18 16.55 3.5 1993 12 16 21 15.24 4.28 1993 12 17 0 12.45 4.45 1993 12 17 3 10.41 4.05 1993 12 17 6 5.45 3.5 1993 12 17 9 9.11 2.93 1993 12 17 12 11.84 2.68 1993 12 17 15 12.35 2.78 1993 12 17 18 13.31 3.07 1993 12 17 21 10.47 3.22 1993 12 18 0 9.57 3.07 1993 12 18 3 9.87 2.88 1993 12 18 6 10.06 2.94 1993 12 18 9 12.95 3.3 1993 12 18 12 10.7 3.47 1993 12 18 15 9.55 3.16 1993 12 18 18 7.8 2.76 1993 12 18 21 7.16 2.4 1993 12 19 0 5.16 2.16 1993 12 19 3 7.32 1.97 1993 12 19 6 6.94 1.89 1993 12 19 9 10.69 1.88 1993 12 19 12 13.48 2.16 1993 12 19 15 14.77 2.82 1993 12 19 18 14.06 3.3 1993 12 19 21 12.99 3.48 1993 12 20 0 12.91 3.51 1993 12 20 3 11.16 3.39 1993 12 20 6 6.51 2.93 1993 12 20 9 6.89 2.42 1993 12 20 12 4.72 1.97 1993 12 20 15 4.34 1.58 1993 12 20 18 4.52 1.28 1993 12 20 21 4.97 1.06 1993 12 21 0 7.13 0.92 1993 12 21 3 8.03 0.96 1993 12 21 6 6.84 1.03 1993 12 21 9 5.18 1.03 1993 12 21 12 2.77 1.1 1993 12 21 15 1.56 
1.19 1993 12 21 18 3.61 1.13 1993 12 21 21 6.22 1.06 1993 12 22 0 5.2 1.06 1993 12 22 3 5.49 1.1 1993 12 22 6 3.76 1.22 1993 12 22 9 0.66 1.43 1993 12 22 12 0.81 1.57 1993 12 22 15 1.19 1.57 1993 12 22 18 1.4 1.54 1993 12 22 21 1.06 1.54 1993 12 23 0 1.21 1.52 1993 12 23 3 0.81 1.48 1993 12 23 6 0.18 1.47 1993 12 23 9 3.99 1.48 1993 12 23 12 6.08 1.54 1993 12 23 15 5.73 1.61 1993 12 23 18 7.52 1.63 1993 12 23 21 5.62 1.62 1993 12 24 0 4.8 1.55 1993 12 24 3 7.66 1.51 1993 12 24 6 6.22 1.56 1993 12 24 9 6.57 1.6 1993 12 24 12 4.35 1.71 1993 12 24 15 7.6 1.88 1993 12 24 18 7.16 2.06 1993 12 24 21 9.15 2.19 1993 12 25 0 11.95 2.33 1993 12 25 3 13.76 2.71 1993 12 25 6 9.98 2.83 1993 12 25 9 9.76 2.88 1993 12 25 12 9.37 2.62 1993 12 25 15 10.67 2.4 1993 12 25 18 10.73 2.42 1993 12 25 21 8.8 2.4 1993 12 26 0 6.69 2.28 1993 12 26 3 5.04 2.2 1993 12 26 6 4.02 2.19 1993 12 26 9 4.38 2.16 1993 12 26 12 7.36 2.06 1993 12 26 15 9.14 2.03 1993 12 26 18 9.79 2.07 1993 12 26 21 8.58 1.98 1993 12 27 0 5.2 1.86 1993 12 27 3 3.62 1.76 1993 12 27 6 5.52 1.68 1993 12 27 9 4.36 1.6 1993 12 27 12 5.93 1.5 1993 12 27 15 7.67 1.49 1993 12 27 18 7.64 1.57 1993 12 27 21 6.68 1.55 1993 12 28 0 4.61 1.45 1993 12 28 3 2.98 1.34 1993 12 28 6 2.57 1.27 1993 12 28 9 1.94 1.22 1993 12 28 12 5.44 1.19 1993 12 28 15 8.45 1.33 1993 12 28 18 10.71 1.54 1993 12 28 21 8.76 1.88 1993 12 29 0 6.45 1.92 1993 12 29 3 6.37 1.83 1993 12 29 6 7.59 1.7 1993 12 29 9 10.78 1.68 1993 12 29 12 9.73 2.24 1993 12 29 15 7.69 2.44 1993 12 29 18 8.96 2.29 1993 12 29 21 11.67 2.34 1993 12 30 0 7.8 2.57 1993 12 30 3 10.61 2.68 1993 12 30 6 8.72 3.09 1993 12 30 9 10.7 3.31 1993 12 30 12 11.12 3.5 1993 12 30 15 10.48 3.62 1993 12 30 18 12.61 3.74 1993 12 30 21 11.01 3.89 1993 12 31 0 9.94 3.78 1993 12 31 3 9.38 3.48 1993 12 31 6 10.04 3.14 1993 12 31 9 12.34 3.05 1993 12 31 12 12.18 3.33 1993 12 31 15 12.15 3.57 1993 12 31 18 10.88 3.58 1993 12 31 21 10.99 3.51 From warren.weckesser at gmail.com Mon Jan 5 07:40:08 2015 From: 
warren.weckesser at gmail.com (Warren Weckesser) Date: Mon, 5 Jan 2015 07:40:08 -0500 Subject: [SciPy-User] Problems with fitting Weibell distribution In-Reply-To: <54AA8347.8090303@griffinpc.co.uk> References: <54AA8347.8090303@griffinpc.co.uk> Message-ID: On Mon, Jan 5, 2015 at 7:27 AM, Alun (Griffin PC) wrote: > Hi > > I am trying to fit a Weibull distribution to some data, with the following > code: > > # Python script to fit metocean data > > import scipy.stats as s > import numpy as np > import matplotlib.pyplot as plt > > # Load data > > data = np.loadtxt("meteo.prn",skiprows=1,usecols=(4,)) > > # Fit data > > p0, p1, p2, p3 = s.exponweib.fit(data, floc=0) > > # Plot data > > x = np.linspace(data.min(), data.max(), 1000) > y = s.exponweib(p0, p1, p2, p3).pdf(x) > > plt.plot(x, y) > plt.hist(data, data.max(), normed=True) > plt.show() > > Unfortunately, I don't get a distribution that looks anything like the > inputs. I have searched the web and the above code is based on a couple of > other posts but I am getting confused about the arguments that the fit > function returns and which the EXPONWEIB function needs. All help greatly > appreciated! > > Thanks!! > > Alun Griffiths > > When I run your script with scipy 0.14.0, I get the attached figure. Is that what you get? Warren > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: figure_1.png Type: image/png Size: 26358 bytes Desc: not available URL: From alun at griffinpc.co.uk Mon Jan 5 07:45:28 2015 From: alun at griffinpc.co.uk (Alun (Griffin PC)) Date: Mon, 05 Jan 2015 12:45:28 +0000 Subject: [SciPy-User] Problems with fitting Weibell distribution In-Reply-To: References: <54AA8347.8090303@griffinpc.co.uk> Message-ID: <54AA8768.7050707@griffinpc.co.uk> Hi Wayne no - but that is what I am trying to do! My plot is attached. I am using SciPy v12 best regards Alun On 05/01/2015 12:40, Warren Weckesser wrote: > > > On Mon, Jan 5, 2015 at 7:27 AM, Alun (Griffin PC) > > wrote: > > Hi > > I am trying to fit a Weibull distribution to some data, with the > following code: > > # Python script to fit metocean data > > import scipy.stats as s > import numpy as np > import matplotlib.pyplot as plt > > # Load data > > data = np.loadtxt("meteo.prn",skiprows=1,usecols=(4,)) > > # Fit data > > p0, p1, p2, p3 = s.exponweib.fit(data, floc=0) > > # Plot data > > x = np.linspace(data.min(), data.max(), 1000) > y = s.exponweib(p0, p1, p2, p3).pdf(x) > > plt.plot(x, y) > plt.hist(data, data.max(), normed=True) > plt.show() > > Unfortunately, I don't get a distribution that looks anything like > the inputs. I have searched the web and the above code is based > on a couple of other posts but I am getting confused about the > arguments that the fit function returns and which the EXPONWEIB > function needs. All help greatly appreciated! > > Thanks!! > > Alun Griffiths > > > > When I run your script with scipy 0.14.0, I get the attached figure. > Is that what you get? 
> > Warren > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: figure_1.png Type: image/png Size: 21694 bytes Desc: not available URL: From warren.weckesser at gmail.com Mon Jan 5 08:44:10 2015 From: warren.weckesser at gmail.com (Warren Weckesser) Date: Mon, 5 Jan 2015 08:44:10 -0500 Subject: [SciPy-User] Problems with fitting Weibell distribution In-Reply-To: <54AA8768.7050707@griffinpc.co.uk> References: <54AA8347.8090303@griffinpc.co.uk> <54AA8768.7050707@griffinpc.co.uk> Message-ID: On Mon, Jan 5, 2015 at 7:45 AM, Alun (Griffin PC) wrote: > Hi Wayne > > no - but that is what I am trying to do! My plot is attached. I am using > SciPy v12 > > I just tried it with scipy 0.12.1, and I got the same plot you got. Any chance you can upgrade scipy? Warren > best regards > > Alun > > > On 05/01/2015 12:40, Warren Weckesser wrote: > > > > On Mon, Jan 5, 2015 at 7:27 AM, Alun (Griffin PC) > wrote: > >> Hi >> >> I am trying to fit a Weibull distribution to some data, with the >> following code: >> >> # Python script to fit metocean data >> >> import scipy.stats as s >> import numpy as np >> import matplotlib.pyplot as plt >> >> # Load data >> >> data = np.loadtxt("meteo.prn",skiprows=1,usecols=(4,)) >> >> # Fit data >> >> p0, p1, p2, p3 = s.exponweib.fit(data, floc=0) >> >> # Plot data >> >> x = np.linspace(data.min(), data.max(), 1000) >> y = s.exponweib(p0, p1, p2, p3).pdf(x) >> >> plt.plot(x, y) >> plt.hist(data, data.max(), normed=True) >> plt.show() >> >> Unfortunately, I don't get a distribution that looks anything like the >> inputs. 
I have searched the web and the above code is based on a couple of >> other posts but I am getting confused about the arguments that the fit >> function returns and which the EXPONWEIB function needs. All help greatly >> appreciated! >> >> Thanks!! >> >> Alun Griffiths >> >> > > When I run your script with scipy 0.14.0, I get the attached figure. Is > that what you get? > > Warren > > > >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-user >> >> > > > _______________________________________________ > SciPy-User mailing listSciPy-User at scipy.orghttp://mail.scipy.org/mailman/listinfo/scipy-user > > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From alun at griffinpc.co.uk Mon Jan 5 10:05:34 2015 From: alun at griffinpc.co.uk (Alun (Griffin PC)) Date: Mon, 05 Jan 2015 15:05:34 +0000 Subject: [SciPy-User] Problems with fitting Weibell distribution In-Reply-To: References: <54AA8347.8090303@griffinpc.co.uk> <54AA8768.7050707@griffinpc.co.uk> Message-ID: <54AAA83E.6080101@griffinpc.co.uk> Hi Warren upgrading to SciPy 0.14 did the trick - thanks! Alun On 05/01/2015 13:44, Warren Weckesser wrote: > > > On Mon, Jan 5, 2015 at 7:45 AM, Alun (Griffin PC) > > wrote: > > Hi Wayne > > no - but that is what I am trying to do! My plot is attached. I > am using SciPy v12 > > > > I just tried it with scipy 0.12.1, and I got the same plot you got. > Any chance you can upgrade scipy? 
> > Warren > > > best regards > > Alun > > > On 05/01/2015 12:40, Warren Weckesser wrote: >> >> >> On Mon, Jan 5, 2015 at 7:27 AM, Alun (Griffin PC) >> > wrote: >> >> Hi >> >> I am trying to fit a Weibull distribution to some data, with >> the following code: >> >> # Python script to fit metocean data >> >> import scipy.stats as s >> import numpy as np >> import matplotlib.pyplot as plt >> >> # Load data >> >> data = np.loadtxt("meteo.prn",skiprows=1,usecols=(4,)) >> >> # Fit data >> >> p0, p1, p2, p3 = s.exponweib.fit(data, floc=0) >> >> # Plot data >> >> x = np.linspace(data.min(), data.max(), 1000) >> y = s.exponweib(p0, p1, p2, p3).pdf(x) >> >> plt.plot(x, y) >> plt.hist(data, data.max(), normed=True) >> plt.show() >> >> Unfortunately, I don't get a distribution that looks anything >> like the inputs. I have searched the web and the above code >> is based on a couple of other posts but I am getting confused >> about the arguments that the fit function returns and which >> the EXPONWEIB function needs. All help greatly appreciated! >> >> Thanks!! >> >> Alun Griffiths >> >> >> >> When I run your script with scipy 0.14.0, I get the attached >> figure. Is that what you get? >> >> Warren >> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-user >> >> >> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... 
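For reference, here is a minimal, self-contained sketch of the fit discussed in this thread, using synthetic data in place of meteo.prn (which is only partially reproduced above). The fit call and floc=0 match the original script; note that exponweib.fit returns the two shape parameters (a, c) followed by loc and scale, which is the source of the confusion about arguments:

```python
import numpy as np
import scipy.stats as s

# Synthetic stand-in for the wind-speed column of meteo.prn
# (drawn from a plain Weibull, so the fit has a known target).
data = s.weibull_min.rvs(2.0, scale=8.0, size=2000, random_state=0)

# exponweib.fit returns the two shape parameters (a, c) followed by
# loc and scale; floc=0 pins the location at zero, as in the thread.
a, c, loc, scale = s.exponweib.fit(data, floc=0)

# Evaluate the fitted density on a grid, as the original script does.
x = np.linspace(data.min(), data.max(), 1000)
pdf = s.exponweib(a, c, loc, scale).pdf(x)
```

When plotting, prefer an integer bin count, e.g. plt.hist(data, bins=int(data.max()), density=True); the original script passes the float data.max() as the bin count, and recent matplotlib uses density=True where older releases used normed=True.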
URL: From ralf.gommers at gmail.com Tue Jan 6 02:14:22 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 6 Jan 2015 08:14:22 +0100 Subject: [SciPy-User] Proceedings of EuroSciPy 2014 Message-ID: On Tue, Dec 23, 2014 at 7:06 PM, wrote: > Dear scientists using Python, > > We are glad to announce the publication of the proceedings of the 7th > European > Conference on Python in Science, EuroSciPy 2014, still in 2014! > > The proceedings cover various scientific fields in which Python and its > scientific libraries are used. You may obtain the table of contents and > all the > articles on the arXiv at http://arxiv.org/abs/1412.7030 > For convenience, the articles' titles are listed below. > > It is a useful reference to have, as the publication of software-related > scientific work is not always straightforward. > > Thanks go to all authors and reviewers for their contributions. The reviews > were conducted publicly at > https://github.com/euroscipy/euroscipy_proceedings > > Pierre de Buyl & Nelle Varoquaux, editors > > PS: there was no large announcement of the proceedings of EuroSciPy 2013. > In the > hope that this can increase their visibility, here is the URL > Proceedings of EuroSciPy 2013: http://arxiv.org/abs/1405.0166 > > Pierre de Buyl, Nelle Varoquaux: Preface > Jérôme Kieffer, Giannis Ashiotis: PyFAI: a Python library for high > performance azimuthal integration on GPU > Andrew Leonard, Huw Morgan: Temperature diagnostics of the solar > atmosphere using SunPy > Bastian Venthur, Benjamin Blankertz: Wyrm, A Pythonic Toolbox for > Brain-Computer Interfacing > Christophe Pouzat, Georgios Is.
Detorakis: SPySort: Neuronal Spike Sorting > with Python > Thomas Cokelaer, Julio Saez-Rodriguez: Using Python to Dive into > Signalling Data with CellNOpt and BioServices > Davide Monari, Francesco Cenni, Erwin Aertbeliën, Kaat Desloovere: > Py3DFreeHandUS: a library for voxel-array reconstruction using > Ultrasonography and attitude sensors > Esteban Fuentes, Hector E. Martinez: SClib, a hack for straightforward > embedded C functions in Python > Jamie A Dean, Liam C Welsh, Kevin J Harrington, Christopher M Nutting, > Sarah L Gulliford: Predictive Modelling of Toxicity Resulting from > Radiotherapy Treatments of Head and Neck Cancer > Rebecca R. Murphy, Sophie E. Jackson, David Klenerman: pyFRET: A Python > Library for Single Molecule Fluorescence Data Analysis > Robert Cimrman: Enhancing SfePy with Isogeometric Analysis > Steve Brasier, Fred Pollard: A Python-based Post-processing Toolset For > Seismic Analyses > Vladimír Lukeš, Miroslav Jiřík, Alena Jonášová, Eduard Rohan, Ondřej > Bublík, Robert Cimrman: Numerical simulation of liver perfusion: from CT > scans to FE model > > _______________________________________________ > euroscipy-org mailing list > euroscipy-org at python.org > https://mail.python.org/mailman/listinfo/euroscipy-org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From Samuel.St-Jean at USherbrooke.ca Wed Jan 7 13:05:04 2015 From: Samuel.St-Jean at USherbrooke.ca (Samuel St-Jean) Date: Wed, 07 Jan 2015 13:05:04 -0500 Subject: [SciPy-User] cephes hyp1f1 vs scipy hyp1f1 stability In-Reply-To: <54AC490D.3080802@usherbrooke.ca> References: <54AC490D.3080802@usherbrooke.ca> Message-ID: Hello fellow scipy users, I have a problem with scipy.special.hyp1f1, where it is a bit unstable for large negative values of x. For example, hyp1f1(a,b,x) with a=-0.5, b=12 and x=-1406.25 gives nan (those values are from a real, legitimate use case and not just values to make it blow up).
According to https://github.com/scipy/scipy/blob/master/scipy/special/generate_ufuncs.py at line 101, the wrapper implements some functions from the cephes library and some from the specfun_wrappers. Since the cephes library also implements hyp1f1, I was wondering if anyone has used it successfully, or if there is any reason not to use it. Is it more stable than the currently available version, and how would one use that version instead of the specfun_wrapper version (or are they simply the same)? Worst case I'll just try to wrap it with Cython I guess (which is not that easy under Windows so far, but that's another issue). mpmath also has its own version, but it's on the slow side sadly. Any suggestion for a feasible way to use the cephes version, or for a fast, working implementation? Samuel -------------- next part -------------- An HTML attachment was scrubbed... URL: From ziyuang at gmail.com Thu Jan 8 18:49:27 2015 From: ziyuang at gmail.com (Ziyuan Lin) Date: Thu, 8 Jan 2015 15:49:27 -0800 (PST) Subject: [SciPy-User] How does "line search failed" affect the optimization? Message-ID: Hi all, I am using scipy.optimize.minimize for my optimization problem. Specifically, I tried solvers "L-BFGS-B" and "TNC", but both give me "Linear search failed"-like messages on my problem. What is the reason of the failure of the line search on these solvers? Does it mean that the final "optimal" value is potentially not optimal? Thank you. Best regards, Ziyuan -------------- next part -------------- An HTML attachment was scrubbed... URL: From vaggi.federico at gmail.com Fri Jan 9 15:44:38 2015 From: vaggi.federico at gmail.com (federico vaggi) Date: Fri, 9 Jan 2015 21:44:38 +0100 Subject: [SciPy-User] SciPy-User Digest, Vol 137, Issue 7 In-Reply-To: References: Message-ID: BFGS uses a 'line-search' to find out the optimal step size it can take in the direction of the gradient.
If the line search fails, it means that the algorithm cannot find a step size small enough such that the function decreases in the gradient direction. Are you sure your gradient information is accurate? Are you using a finite differences scheme to evaluate the gradient? On Fri, Jan 9, 2015 at 7:00 PM, wrote: > Send SciPy-User mailing list submissions to > scipy-user at scipy.org > > To subscribe or unsubscribe via the World Wide Web, visit > http://mail.scipy.org/mailman/listinfo/scipy-user > or, via email, send a message with subject or body 'help' to > scipy-user-request at scipy.org > > You can reach the person managing the list at > scipy-user-owner at scipy.org > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of SciPy-User digest..." > > > Today's Topics: > > 1. How does "line search failed" affect the optimization? > (Ziyuan Lin) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Thu, 8 Jan 2015 15:49:27 -0800 (PST) > From: Ziyuan Lin > Subject: [SciPy-User] How does "line search failed" affect the > optimization? > To: scipy-user at googlegroups.com > Message-ID: > Content-Type: text/plain; charset="utf-8" > > Hi all, > > I am using scipy.optimize.minimize for my optimization problem. > Specifically, I tried solvers "L-BFGS-B" and "TNC", but both give me > "Linear search failed"-like messages on my problem. What is the reason of > the failure of the line search on these solvers? Does it mean that the > final "optimal" value is potentially not optimal? Thank you. > > Best regards, > Ziyuan > -------------- next part -------------- > An HTML attachment was scrubbed...
> URL: > http://mail.scipy.org/pipermail/scipy-user/attachments/20150108/d48992a4/attachment-0001.html > > ------------------------------ > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > > End of SciPy-User Digest, Vol 137, Issue 7 > ****************************************** > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Sun Jan 11 12:50:47 2015 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 11 Jan 2015 19:50:47 +0200 Subject: [SciPy-User] ANN: Scipy 0.15.0 release Message-ID: <54B2B7F7.4030708@iki.fi> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 Dear all, We are pleased to announce the Scipy 0.15.0 release. The 0.15.0 release contains bugfixes and new features, most important of which are mentioned in the excerpt from the release notes below. Source tarballs, binaries, and full release notes are available at https://sourceforge.net/projects/scipy/files/scipy/0.15.0/ Best regards, Pauli Virtanen ========================== SciPy 0.15.0 Release Notes ========================== SciPy 0.15.0 is the culmination of 6 months of hard work. It contains several new features, numerous bug-fixes, improved test coverage and better documentation. There have been a number of deprecations and API changes in this release, which are documented below. All users are encouraged to upgrade to this release, as there are a large number of bug-fixes and optimizations. Moreover, our development attention will now shift to bug-fix releases on the 0.16.x branch, and on adding new features on the master branch. This release requires Python 2.6, 2.7 or 3.2-3.4 and NumPy 1.5.1 or greater. 
New features ============ Linear Programming Interface ---------------------------- The new function `scipy.optimize.linprog` provides a generic linear programming interface, similar to the way `scipy.optimize.minimize` provides a generic interface to nonlinear programming optimizers. Currently the only method supported is *simplex*, which provides a two-phase, dense-matrix-based simplex algorithm. Callback functions are supported, allowing the user to monitor the progress of the algorithm. Differential evolution, a global optimizer ------------------------------------------ A new `scipy.optimize.differential_evolution` function has been added to the ``optimize`` module. Differential Evolution is an algorithm used for finding the global minimum of multivariate functions. It is stochastic in nature (does not use gradient methods), and can search large areas of candidate space, but often requires larger numbers of function evaluations than conventional gradient-based techniques. ``scipy.signal`` improvements ----------------------------- The function `scipy.signal.max_len_seq` was added, which computes a Maximum Length Sequence (MLS) signal. ``scipy.integrate`` improvements -------------------------------- It is now possible to use `scipy.integrate` routines to integrate multivariate ctypes functions, thus avoiding callbacks to Python and providing better performance. ``scipy.linalg`` improvements ----------------------------- The function `scipy.linalg.orthogonal_procrustes` for solving the Procrustes linear algebra problem was added. BLAS level 2 functions ``her``, ``syr``, ``her2`` and ``syr2`` are now wrapped in ``scipy.linalg``. ``scipy.sparse`` improvements ----------------------------- `scipy.sparse.linalg.svds` can now take a ``LinearOperator`` as its main input. ``scipy.special`` improvements ------------------------------ Values of ellipsoidal harmonic (i.e.
Lame) functions and associated normalization constants can now be computed using ``ellip_harm``, ``ellip_harm_2``, and ``ellip_normal``. New convenience functions ``entr``, ``rel_entr``, ``kl_div``, ``huber``, and ``pseudo_huber`` were added. ``scipy.sparse.csgraph`` improvements ------------------------------------- Routines ``reverse_cuthill_mckee`` and ``maximum_bipartite_matching`` for computing reorderings of sparse graphs were added. ``scipy.stats`` improvements ---------------------------- Added a Dirichlet multivariate distribution, `scipy.stats.dirichlet`. The new function `scipy.stats.median_test` computes Mood's median test. The new function `scipy.stats.combine_pvalues` implements Fisher's and Stouffer's methods for combining p-values. `scipy.stats.describe` returns a namedtuple rather than a tuple, allowing users to access results by index or by name. Deprecated features =================== The `scipy.weave` module is deprecated. It was the only module never ported to Python 3.x, and is not recommended to be used for new code - use Cython instead. In order to support existing code, ``scipy.weave`` has been packaged separately: https://github.com/scipy/weave. It is a pure Python package, and can easily be installed with ``pip install weave``. `scipy.special.bessel_diff_formula` is deprecated. It is a private function, and therefore will be removed from the public API in a following release. ``scipy.stats.nanmean``, ``nanmedian`` and ``nanstd`` functions are deprecated in favor of their numpy equivalents. Backwards incompatible changes ============================== scipy.ndimage ------------- The functions `scipy.ndimage.minimum_positions`, `scipy.ndimage.maximum_positions` and `scipy.ndimage.extrema` return positions as ints instead of floats. scipy.integrate --------------- The format of banded Jacobians in `scipy.integrate.ode` solvers is changed. Note that the previous documentation of this feature was erroneous.
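As a short illustration of the new linear programming interface announced above (the problem here is made up for illustration; 0.15.0's default method was simplex, and later releases changed the default, but the call itself is unchanged):

```python
import numpy as np
from scipy.optimize import linprog

# Maximize x + 2*y subject to x + y <= 4 and x, y >= 0.
# linprog minimizes, so the objective coefficients are negated.
c = [-1.0, -2.0]
A_ub = [[1.0, 1.0]]
b_ub = [4.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub)  # default bounds are (0, None)
# The optimum sits at the vertex (x, y) = (0, 4), objective value -8.
```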
-----BEGIN PGP SIGNATURE----- Version: GnuPG v1 iEYEARECAAYFAlSyt/cACgkQ6BQxb7O0pWA8SACfXmpUsJcXT5espj71OYpeaj5b JJwAoL10ud3q1f51A5Ij4lgqMeZGnHlj =ZmOl -----END PGP SIGNATURE----- From jni.soma at gmail.com Sun Jan 11 18:21:10 2015 From: jni.soma at gmail.com (Juan Nunez-Iglesias) Date: Mon, 12 Jan 2015 10:21:10 +1100 Subject: [SciPy-User] Why does generic_filter coerce input to float64? Message-ID: Hi all, I'm wondering, why does ndimage.generic_filter cast the input array as a float before passing it to the user-supplied function? I have at least one application that requires integer input. For small integers, this is not a problem (other than resulting in at least two unnecessary casts), but for big ones this will cause a loss of precision. I'd love to contribute a patch but unfortunately I couldn't find the place in the C-code where this happens. Thanks! Juan. -------------- next part -------------- An HTML attachment was scrubbed... URL: From peterskurt at msn.com Sun Jan 11 19:55:32 2015 From: peterskurt at msn.com (KURT PETERS) Date: Sun, 11 Jan 2015 17:55:32 -0700 Subject: [SciPy-User] optimize.minimize - help me understand arrays as variables Message-ID: I'm trying to use scipy.optimize.minimize. I've tried multiple "multivariate" methods that don't seem to actually take multivariate data and derivatives. Can someone tell me how I can make the multivariate part of the solver actually work? Here's an example: My main function the following (typical length for N is 3): input guess is a x0=np.array([1,2,3]) the optimization function returns: def calc_f3d(...): f3d = np.ones((np.max([3,N]),1) .... do some assignments to f3d[row,0] .... return np.linalg.norm(f3d) # numpy.array that's 3x1 The jacobian returns a Nx3 matrix: def jacob3d(...): df = np.ones((np.max([3,N]),3)) ... 
do some assignments to df[row,col] return df # note numpy.array that's 3x3 The optimize call is: OptimizeResult = optimize.minimize( fun=tdcalc.calc_f3d, x0=ract, jac=tdcalc.jacob3d, method='BFGS', args=(operdata,), tol=1.0e-8, options={'maxiter': 40000, 'xtol':1e-8}) <--- ops change based on whether using Newton-CG or BFGS When I use BFGS, I get: Traceback (most recent call last): File "./tdoa_calc.py", line 664, in options={'maxiter': 40000, 'gtol':1e-8}) File "/usr/lib64/python2.7/site-packages/scipy/optimize/_minimize.py", line 348, in minimize return _minimize_bfgs(fun, x0, args, jac, callback, **options) File "/usr/lib64/python2.7/site-packages/scipy/optimize/optimize.py", line 779, in _minimize_bfgs old_fval, old_old_fval) File "/usr/lib64/python2.7/site-packages/scipy/optimize/linesearch.py", line 95, in line_search_wolfe1 c1=c1, c2=c2, amax=amax, amin=amin, xtol=xtol) File "/usr/lib64/python2.7/site-packages/scipy/optimize/linesearch.py", line 147, in scalar_search_wolfe1 alpha1 = min(1.0, 1.01*2*(phi0 - old_phi0)/derphi0) ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all() When I use Newton-CG, I get: Traceback (most recent call last): File "./tdoa_calc.py", line 655, in options={'maxiter': 40000, 'xtol':1e-8}) File "/usr/lib64/python2.7/site-packages/scipy/optimize/_minimize.py", line 351, in minimize **options) File "/usr/lib64/python2.7/site-packages/scipy/optimize/optimize.py", line 1320, in _minimize_newtoncg eta = numpy.min([0.5, numpy.sqrt(maggrad)]) File "/usr/lib64/python2.7/site-packages/numpy/core/fromnumeric.py", line 1982, in amin out=out, keepdims=keepdims) File "/usr/lib64/python2.7/site-packages/numpy/core/_methods.py", line 14, in _amin out=out, keepdims=keepdims) ValueError: setting an array element with a sequence. -------------- next part -------------- An HTML attachment was scrubbed... 
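For archive readers: the two tracebacks above are consistent with `jac` returning an (N, 3) matrix rather than a flat gradient, since `minimize` expects `fun` to return a single float and `jac` a 1-D array with one entry per optimization variable. A shape-correct sketch of the same pattern follows, with a hypothetical least-squares cost standing in for the TDOA model (the matrix `A` and vector `b` are invented for illustration):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative stand-in for the residual model: r(x) = A.x - b.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 3.0],
              [1.0, 1.0, 0.0]])
b = np.array([1.0, 2.0, 3.0])

def cost(x, A, b):
    # Scalar objective: squared 2-norm of the residual vector.
    r = np.dot(A, x) - b
    return np.dot(r, r)

def grad(x, A, b):
    # Gradient of the scalar objective: shape (3,), one entry per
    # variable -- not an (N, 3) matrix of per-residual derivatives.
    return 2.0 * np.dot(A.T, np.dot(A, x) - b)

res = minimize(cost, x0=np.array([1.0, 2.0, 3.0]), args=(A, b),
               jac=grad, method='BFGS', options={'gtol': 1e-8})
```

With these shapes the Wolfe line search sees scalar directional derivatives and both BFGS and Newton-CG run without the ambiguity errors quoted above.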
URL: From andyfaff at gmail.com Sun Jan 11 19:58:31 2015 From: andyfaff at gmail.com (Andrew Nelson) Date: Mon, 12 Jan 2015 11:58:31 +1100 Subject: [SciPy-User] optimize.minimize - help me understand arrays as variables In-Reply-To: References: Message-ID: `calc_f3d` needs to return a single number, the overall 'cost'. On 12 January 2015 at 11:55, KURT PETERS wrote: > I'm trying to use scipy.optimize.minimize. > I've tried multiple "multivariate" methods that don't seem to actually > take multivariate data and derivatives. Can someone tell me how I can make > the multivariate part of the solver actually work? > > Here's an example: > My main function the following (typical length for N is 3): > > input guess is a x0=np.array([1,2,3]) > the optimization function returns: > def calc_f3d(...): > f3d = np.ones((np.max([3,N]),1) > .... do some assignments to f3d[row,0] .... > return np.linalg.norm(f3d) # numpy.array that's 3x1 > > The jacobian returns a Nx3 matrix: > def jacob3d(...): > df = np.ones((np.max([3,N]),3)) > ... 
do some assignments to df[row,col] > return df # note numpy.array that's 3x3 > > The optimize call is: > OptimizeResult = optimize.minimize( > fun=tdcalc.calc_f3d, > x0=ract, > jac=tdcalc.jacob3d, > method='BFGS', > args=(operdata,), > tol=1.0e-8, > options={'maxiter': 40000, 'xtol':1e-8}) <--- ops change based on > whether using Newton-CG or BFGS > > When I use BFGS, I get: > Traceback (most recent call last): > File "./tdoa_calc.py", line 664, in > options={'maxiter': 40000, 'gtol':1e-8}) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/_minimize.py", > line 348, in minimize > return _minimize_bfgs(fun, x0, args, jac, callback, **options) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/optimize.py", > line 779, in _minimize_bfgs > old_fval, old_old_fval) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/linesearch.py", > line 95, in line_search_wolfe1 > c1=c1, c2=c2, amax=amax, amin=amin, xtol=xtol) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/linesearch.py", > line 147, in scalar_search_wolfe1 > alpha1 = min(1.0, 1.01*2*(phi0 - old_phi0)/derphi0) > ValueError: The truth value of an array with more than one element is > ambiguous. Use a.any() or a.all() > > When I use Newton-CG, I get: > Traceback (most recent call last): > File "./tdoa_calc.py", line 655, in > options={'maxiter': 40000, 'xtol':1e-8}) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/_minimize.py", > line 351, in minimize > **options) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/optimize.py", > line 1320, in _minimize_newtoncg > eta = numpy.min([0.5, numpy.sqrt(maggrad)]) > File "/usr/lib64/python2.7/site-packages/numpy/core/fromnumeric.py", > line 1982, in amin > out=out, keepdims=keepdims) > File "/usr/lib64/python2.7/site-packages/numpy/core/_methods.py", line > 14, in _amin > out=out, keepdims=keepdims) > ValueError: setting an array element with a sequence. 
> > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > -- _____________________________________ Dr. Andrew Nelson _____________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL: From eric.moore2 at nih.gov Mon Jan 12 08:25:50 2015 From: eric.moore2 at nih.gov (Moore, Eric (NIH/NIDDK) [F]) Date: Mon, 12 Jan 2015 13:25:50 +0000 Subject: [SciPy-User] Why does generic_filter coerce input to float64? In-Reply-To: References: Message-ID: <649847CE7F259144A0FD99AC64E7326DEACB98@msgb09.nih.gov> > From: Juan Nunez-Iglesias [mailto:jni.soma at gmail.com] > Sent: Sunday, January 11, 2015 6:21 PM > To: scipy-user at scipy.org > Subject: [SciPy-User] Why does generic_filter coerce input to float64? > > Hi all, > > I'm wondering, why does ndimage.generic_filter cast the input array as a float > before passing it to the user-supplied function? I have at least one application > that requires integer input. For small integers, this is not a problem (other > than resulting in at least two unnecessary casts), but for big ones this will > cause a loss of precision. > > I'd love to contribute a patch but unfortunately I couldn't find the place in > the C-code where this happens. > > Thanks! > > Juan. > The cast happens in the expansion of the CASE_FILTER_POINT macro[1], which is used just below that in NI_GenericFilter. This looks like a deliberate choice though, so I don't think that we can just change it. One option might be to add this as a keep_dtype argument, which would default to false. Eric 1. 
https://github.com/scipy/scipy/blob/master/scipy/ndimage/src/ni_filters.c#L869 From peterskurt at msn.com Mon Jan 12 18:26:31 2015 From: peterskurt at msn.com (KURT PETERS) Date: Mon, 12 Jan 2015 16:26:31 -0700 Subject: [SciPy-User] optimize.minimize - help me understand arrays as variables (Andrew Nelson) In-Reply-To: References: Message-ID: > Date: Sun, 11 Jan 2015 17:55:32 -0700 > From: KURT PETERS > Subject: [SciPy-User] optimize.minimize - help me understand arrays as > variables > To: "scipy-user at scipy.org" > Message-ID: > Content-Type: text/plain; charset="iso-8859-1" > > I'm trying to use scipy.optimize.minimize. > I've tried multiple "multivariate" methods that don't seem to actually take multivariate data and derivatives. Can someone tell me how I can make the multivariate part of the solver actually work? > > Here's an example: > My main function the following (typical length for N is 3): > > input guess is a x0=np.array([1,2,3]) > the optimization function returns: > def calc_f3d(...): > f3d = np.ones((np.max([3,N]),1) > .... do some assignments to f3d[row,0] .... > return np.linalg.norm(f3d) # numpy.array that's 3x1 > > The jacobian returns a Nx3 matrix: > def jacob3d(...): > df = np.ones((np.max([3,N]),3)) > ... 
do some assignments to df[row,col] > return df # note numpy.array that's 3x3 > > The optimize call is: > OptimizeResult = optimize.minimize( > fun=tdcalc.calc_f3d, > x0=ract, > jac=tdcalc.jacob3d, > method='BFGS', > args=(operdata,), > tol=1.0e-8, > options={'maxiter': 40000, 'xtol':1e-8}) <--- ops change based on whether using Newton-CG or BFGS > > When I use BFGS, I get: > Traceback (most recent call last): > File "./tdoa_calc.py", line 664, in > options={'maxiter': 40000, 'gtol':1e-8}) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/_minimize.py", line 348, in minimize > return _minimize_bfgs(fun, x0, args, jac, callback, **options) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/optimize.py", line 779, in _minimize_bfgs > old_fval, old_old_fval) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/linesearch.py", line 95, in line_search_wolfe1 > c1=c1, c2=c2, amax=amax, amin=amin, xtol=xtol) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/linesearch.py", line 147, in scalar_search_wolfe1 > alpha1 = min(1.0, 1.01*2*(phi0 - old_phi0)/derphi0) > ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all() > > When I use Newton-CG, I get: > Traceback (most recent call last): > File "./tdoa_calc.py", line 655, in > options={'maxiter': 40000, 'xtol':1e-8}) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/_minimize.py", line 351, in minimize > **options) > File "/usr/lib64/python2.7/site-packages/scipy/optimize/optimize.py", line 1320, in _minimize_newtoncg > eta = numpy.min([0.5, numpy.sqrt(maggrad)]) > File "/usr/lib64/python2.7/site-packages/numpy/core/fromnumeric.py", line 1982, in amin > out=out, keepdims=keepdims) > File "/usr/lib64/python2.7/site-packages/numpy/core/_methods.py", line 14, in _amin > out=out, keepdims=keepdims) > ValueError: setting an array element with a sequence. 
> ======================================================================== > Date: Mon, 12 Jan 2015 11:58:31 +1100 > From: Andrew Nelson > Subject: Re: [SciPy-User] optimize.minimize - help me understand > arrays as variables > To: SciPy Users List > Message-ID: > > Content-Type: text/plain; charset="utf-8" > > `calc_f3d` needs to return a single number, the overall 'cost'. > the "return np.linalg.norm(f3d)" DOES return a scalar ( a single number). numpy.linalg.norm([]) returns a single number. Best, Kurt -------------- next part -------------- An HTML attachment was scrubbed... URL: From jni.soma at gmail.com Mon Jan 12 22:51:35 2015 From: jni.soma at gmail.com (Juan Nunez-Iglesias) Date: Mon, 12 Jan 2015 19:51:35 -0800 (PST) Subject: [SciPy-User] Why does generic_filter coerce input to float64? In-Reply-To: <649847CE7F259144A0FD99AC64E7326DEACB98@msgb09.nih.gov> References: <649847CE7F259144A0FD99AC64E7326DEACB98@msgb09.nih.gov> Message-ID: <1421121095363.8221d149@Nodemailer> Hi Eric, Thanks for the response! Changing it would be a very bad thing and probably break a lot of code out there. However, a kwarg would be most useful to me! Would a PR to add this be welcome? At scikit-image, we converged on "preserve_dtype" as a slightly easier-to-parse name for such a kwarg. Thoughts? Juan. On Tue, Jan 13, 2015 at 12:26 AM, Moore, Eric (NIH/NIDDK) [F] wrote: >> From: Juan Nunez-Iglesias [mailto:jni.soma at gmail.com] >> Sent: Sunday, January 11, 2015 6:21 PM >> To: scipy-user at scipy.org >> Subject: [SciPy-User] Why does generic_filter coerce input to float64? >> >> Hi all, >> >> I'm wondering, why does ndimage.generic_filter cast the input array as a float >> before passing it to the user-supplied function? I have at least one application >> that requires integer input. For small integers, this is not a problem (other >> than resulting in at least two unnecessary casts), but for big ones this will >> cause a loss of precision. 
>> >> I'd love to contribute a patch but unfortunately I couldn't find the place in >> the C-code where this happens. >> >> Thanks! >> >> Juan. >> > The cast happens in the expansion of the CASE_FILTER_POINT macro[1], which is used > just below that in NI_GenericFilter. > This looks like a deliberate choice though, so I don't think that we can just change > it. One option might be to add this as a keep_dtype argument, which would default > to false. > Eric > 1. https://github.com/scipy/scipy/blob/master/scipy/ndimage/src/ni_filters.c#L869 > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From bhmerchant at gmail.com Tue Jan 13 22:13:32 2015 From: bhmerchant at gmail.com (Brian Merchant) Date: Tue, 13 Jan 2015 19:13:32 -0800 Subject: [SciPy-User] Novice scipy user: having some efficiency issues with using scipy.integrate.odeint for my research project, and not sure how to proceed... In-Reply-To: References: Message-ID: Hi Rob, Thanks for your response! I didn't realize I had one here until today. Since I wrote that question, I have done two important things: 1) learned more math: last semester was numerical methods, this semester is dynamical systems 2) learned how to use Cython -- already, this has helped me speed up my simulations to the point where making things like bifurcation diagrams is a bigger deal for me (another thing I know PyDSTool can do too) This semester, in my dynamical systems course, we are going to be using lots of XPPAUT, but I plan on using PyDSTool. I have already had a look at tutorials like: http://www.ni.gsu.edu/~rclewley/PyDSTool/Tutorial/Tutorial_VdP.html I am excited! Again, sorry for not seeing this message earlier, and thanks for offering to help! I will post on the discussion board if I find something specific I could use some help with.
Kind regards, Brian On Wed, Oct 8, 2014 at 8:26 AM, Rob Clewley wrote: > Hi Brian, > > I know we've talked about this before, but here are some more general > suggestions about your problem. > > > > So, I changed the implementation so that everything was done in > "one-shot" > > (luckily the only non-DE rule I had so far was easily converted to a DE > > rule, although there is one coming up which I don't see any easy > conversion > > for...[that is the subject of this > > question]( > http://scicomp.stackexchange.com/questions/14765/scipy-integrate-odeint-how-can-odeint-access-a-parameter-set-that-is-evolving-i) > ). > > > > Having looked at that, your problem is not so bad. You will need to > solve a piecewise (a.k.a. "hybrid") system, but you shouldn't be > discretizing it based on time but based on events -- i.e. when your > particle crosses into a new discrete area you are calling p. You then > have a discrete state update for your parameters and then you can > restart your smooth integrator until the next transition. This is a > well-studied problem in this form and is numerically soluble with a > hybrid solver such as that provided by PyDSTool. Between positional > transitions, your dynamical system is smooth and can be solved with a > regular ODE solver. The Jacobian within each domain should be simple > enough to pre-calculate (as a function of p or

). There are a few > hybrid model examples with PyDSTool on the tutorial and I am willing > to help a bit with the setup once you've given a new script for your > problem a shot. Take a copy of a helpful example (e.g. the SLIP pogo > stick dynamics) and adapt it to set up what you can. Put in some > comments etc. of what you need to happen. > > PyDSTool will be able to convert the ODEs to C code automatically but > not the transition rules, which will still happen in python. This will > not be the *fastest* way to solve it but, more importantly, this way > will give you an accurate solution in a form that you can understand > and manipulate, and you can worry about optimizing speed later, IMO. > > -Rob > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From eric.moore2 at nih.gov Wed Jan 14 08:28:21 2015 From: eric.moore2 at nih.gov (Moore, Eric (NIH/NIDDK) [F]) Date: Wed, 14 Jan 2015 13:28:21 +0000 Subject: [SciPy-User] optimize.minimize - help me understand arrays as variables (Andrew Nelson) In-Reply-To: References: Message-ID: <649847CE7F259144A0FD99AC64E7326DEACDAC@msgb09.nih.gov> Kurt, It is difficult to say what the issue is without seeing your code. Can you post a brief example that doesn't work for you? A good thing to try until someone responds would be to take an example from the optimization tutorial [1] and try running it, and then modifying that known working script to solve your problem. Eric 1. http://docs.scipy.org/doc/scipy/reference/tutorial/optimize.html From eric.moore2 at nih.gov Wed Jan 14 08:31:57 2015 From: eric.moore2 at nih.gov (Moore, Eric (NIH/NIDDK) [F]) Date: Wed, 14 Jan 2015 13:31:57 +0000 Subject: [SciPy-User] Why does generic_filter coerce input to float64? 
In-Reply-To: <1421121095363.8221d149@Nodemailer> References: <649847CE7F259144A0FD99AC64E7326DEACB98@msgb09.nih.gov> <1421121095363.8221d149@Nodemailer> Message-ID: <649847CE7F259144A0FD99AC64E7326DEACDC0@msgb09.nih.gov> > From: Juan Nunez-Iglesias [mailto:jni.soma at gmail.com] > Sent: Monday, January 12, 2015 10:52 PM > To: SciPy Users List > Subject: Re: [SciPy-User] Why does generic_filter coerce input to float64? > > Hi Eric, > > Thanks for the response! > > Changing it would be a very bad thing and probably break a lot of code out > there. However, a kwarg would be most useful to me! Would a PR to add this > be welcome? > > At scikit-image, we converged on "preserve_dtype" as a slightly > easier-to-parse name for such a kwarg. Thoughts? > > Juan. > > Yes, I think so. From peterskurt at msn.com Wed Jan 14 10:01:44 2015 From: peterskurt at msn.com (KURT PETERS) Date: Wed, 14 Jan 2015 08:01:44 -0700 Subject: [SciPy-User] optimize.minimize - help me understand arrays as variables (Andrew Nelson) (KURT PETERS) In-Reply-To: References: Message-ID: Has ANYONE actually gotten the multivariate to work when using their own Jacobian? I haven't gotten any response based on my input below, but I have to believe someone has gotten it to work. Regards, Kurt Re: optimize.minimize - help me understand arrays as variables (Andrew Nelson) (KURT PETERS) > Date: Mon, 12 Jan 2015 16:26:31 -0700 > From: KURT PETERS > Subject: Re: [SciPy-User] optimize.minimize - help me understand > arrays as variables (Andrew Nelson) > To: "scipy-user at scipy.org" , > "andyfaff at gmail.com" > Message-ID: > Content-Type: text/plain; charset="iso-8859-1" > > > Date: Sun, 11 Jan 2015 17:55:32 -0700 > > From: KURT PETERS > > Subject: [SciPy-User] optimize.minimize - help me understand arrays as > > variables > > To: "scipy-user at scipy.org" > > Message-ID: > > Content-Type: text/plain; charset="iso-8859-1" > > > > I'm trying to use scipy.optimize.minimize. 
> > I've tried multiple "multivariate" methods that don't seem to actually take multivariate data and derivatives. Can someone tell me how I can make the multivariate part of the solver actually work? > > > > Here's an example: > > My main function the following (typical length for N is 3): > > > > input guess is a x0=np.array([1,2,3]) > > the optimization function returns: > > def calc_f3d(...): > > f3d = np.ones((np.max([3,N]),1) > > .... do some assignments to f3d[row,0] .... > > return np.linalg.norm(f3d) # numpy.array that's 3x1 > > > > The jacobian returns a Nx3 matrix: > > def jacob3d(...): > > df = np.ones((np.max([3,N]),3)) > > ... do some assignments to df[row,col] > > return df # note numpy.array that's 3x3 > > > > The optimize call is: > > OptimizeResult = optimize.minimize( > > fun=tdcalc.calc_f3d, > > x0=ract, > > jac=tdcalc.jacob3d, > > method='BFGS', > > args=(operdata,), > > tol=1.0e-8, > > options={'maxiter': 40000, 'xtol':1e-8}) <--- ops change based on whether using Newton-CG or BFGS > > > > When I use BFGS, I get: > > Traceback (most recent call last): > > File "./tdoa_calc.py", line 664, in > > options={'maxiter': 40000, 'gtol':1e-8}) > > File "/usr/lib64/python2.7/site-packages/scipy/optimize/_minimize.py", line 348, in minimize > > return _minimize_bfgs(fun, x0, args, jac, callback, **options) > > File "/usr/lib64/python2.7/site-packages/scipy/optimize/optimize.py", line 779, in _minimize_bfgs > > old_fval, old_old_fval) > > File "/usr/lib64/python2.7/site-packages/scipy/optimize/linesearch.py", line 95, in line_search_wolfe1 > > c1=c1, c2=c2, amax=amax, amin=amin, xtol=xtol) > > File "/usr/lib64/python2.7/site-packages/scipy/optimize/linesearch.py", line 147, in scalar_search_wolfe1 > > alpha1 = min(1.0, 1.01*2*(phi0 - old_phi0)/derphi0) > > ValueError: The truth value of an array with more than one element is ambiguous. 
Use a.any() or a.all() > > > > When I use Newton-CG, I get: > > Traceback (most recent call last): > > File "./tdoa_calc.py", line 655, in > > options={'maxiter': 40000, 'xtol':1e-8}) > > File "/usr/lib64/python2.7/site-packages/scipy/optimize/_minimize.py", line 351, in minimize > > **options) > > File "/usr/lib64/python2.7/site-packages/scipy/optimize/optimize.py", line 1320, in _minimize_newtoncg > > eta = numpy.min([0.5, numpy.sqrt(maggrad)]) > > File "/usr/lib64/python2.7/site-packages/numpy/core/fromnumeric.py", line 1982, in amin > > out=out, keepdims=keepdims) > > File "/usr/lib64/python2.7/site-packages/numpy/core/_methods.py", line 14, in _amin > > out=out, keepdims=keepdims) > > ValueError: setting an array element with a sequence. > > > ======================================================================== > > Date: Mon, 12 Jan 2015 11:58:31 +1100 > > From: Andrew Nelson > > Subject: Re: [SciPy-User] optimize.minimize - help me understand > > arrays as variables > > To: SciPy Users List > > Message-ID: > > > > Content-Type: text/plain; charset="utf-8" > > > > `calc_f3d` needs to return a single number, the overall 'cost'. > > > > the "return np.linalg.norm(f3d)" DOES return a scalar ( a single number). > > numpy.linalg.norm([]) returns a single number. > > Best, > Kurt -------------- next part -------------- An HTML attachment was scrubbed... URL: From jzuhone at gmail.com Wed Jan 14 18:07:01 2015 From: jzuhone at gmail.com (John ZuHone) Date: Wed, 14 Jan 2015 18:07:01 -0500 Subject: [SciPy-User] ANN: yt-3.1 release Message-ID: The yt community is proud to announce the release of yt 3.1. yt (http://yt-project.org) is an open source, community-developed toolkit for analysis and visualization of volumetric data of all types, with a particular emphasis on astrophysical simulations and nuclear engineering simulations. This is a scheduled feature release. 
Highlighted changes in yt 3.1: Major changes: ++++++++++++++ * The RADMC-3D export analysis module has been updated. * Performance improvements for grid frontends. * Added a frontend for Dark Matter-only NMSU Art simulations. * The absorption spectrum generator has been updated. * The PerspectiveCamera has been updated and a new SphericalCamera has been added. * The unit system now supports unit equivalencies and has improved support for MKS units. * Data object selection can now be chained, allowing selecting based on multiple constraints. * Added the ability to manually override the simulation unit system. * The documentation has been reorganized and has seen substantial improvements. Minor or bugfix changes: ++++++++++++++++++++++++ * The Gadget InternalEnergy and StarFormationRate fields are now read in with the correct units. * Substantial improvements for the PPVCube analysis module and support for FITS datasets. * The center of a PlotWindow plot can now be set to the maximum or minimum of any field. * Projections are now performed using an explicit path length field for all coordinate systems. * Fix for the camera.draw_line function. * Minor fixes and improvements for yt plots. * Significant documentation reorganization and improvement. * Miscellaneous code cleanup. * yt now hooks up to the python logging infrastructure in a more standard fashion, avoiding issues with yt logging showing up with using other libraries. * Improvements for the yt-rockstar interface. * It is now possible to supply a default value for get_field_parameter. * A bug in the interpretation of the units of RAMSES simulations has been fixed. * Improvements and bugfixes for the halo analysis framework. * Fix issues with the default setting for the "center" field parameter. * yt can now be run in parallel on a subset of available processors using an MPI subcommunicator. * Fix for incorrect units when loading an Athena simulation as a time series. 
* Improved support for Enzo 3.0 simulations that have not produced any active particles.
* Fix for periodic radius vector calculation.
* Improvements for the Maestro and Castro frontends.
* Clump finding is now supported for more generic types of data.
* Fix unit consistency issue when mixing dimensionless unit symbols.
* Improved memory footprint in the photon_simulator.
* Large grids in Athena datasets produced by the join_vtk script can now be optionally split, improving parallel performance.
* Slice plots now accept a "data_source" keyword argument.
* Nearest neighbor distance field added.
* Improvements for the ORION2 frontend.
* Enzo 3.0 frontend can now read active particle attributes that are arrays of any shape.
* Fixes for accessing deposit fields for FLASH data.
* Added wrapper functions for numpy array manipulation functions.
* Added support for packed HDF5 Enzo datasets.

A more comprehensive list of the changes in this release, with links to the corresponding pull requests, can be found at http://yt-project.org/docs/3.1/reference/changelog.html.

Standard Installation Methods
-----------------------------

As with previous releases, you can install yt from source using one of the following methods.

1) From the install script (http://yt-project.org/#getyt):

# Installation
$ wget http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
$ bash install_script.sh

# Update
$ yt update

2) From pip (source or binary wheel, see below for more details):

# Installation
$ pip install yt

# Update
$ pip install -U yt

3) From the Anaconda Python Distribution (https://store.continuum.io/cshop/anaconda/):

# Installation
$ conda install yt

# Update
$ conda update yt

Note that it might take a day or two for the conda package to be updated. If you are on the "stable" branch, updating will bring you from yt 3.0.2 to 3.1, incorporating all changes since 3.0.2, whereas if you are on the "dev" or "yt" branch, only the changes since your last update should be incorporated.
NEW: Installing Binary Packages via pip
---------------------------------------

New to this release is the ability to install binary packages ("wheels") using pip on Windows and Mac OS X (64-bit only for both). This has the advantage of not needing to install yt from source using a proper compiler setup, which has caused occasional problems on both of these platforms and prevented us from installing yt easily on other Python distributions.

We have so far been able to install and run the binary distribution via pip on the following platforms and Python stacks:

Windows x86_64:
* Enthought Canopy Python (https://www.enthought.com/products/canopy/)
* WinPython (http://winpython.sourceforge.net/)

Mac OS X x86_64:
* Enthought Canopy Python (https://www.enthought.com/products/canopy/)
* Homebrew Python (http://brew.sh/)
* Python.org Python
* Mac OS X's system Python
* MacPorts Python (https://www.macports.org/)

This is somewhat experimental, so other distributions may work (or not); please submit bug reports or successes to the mailing list or to the Bitbucket issues page (http://bitbucket.org/yt_analysis/yt/issues). All distributions must be Python v. 2.7.

The requirements for installing yt via this method are the same as from source:
* NumPy
* h5py
* HDF5
* SymPy
* Matplotlib
* IPython (not required, but strongly recommended)

To install a new version of yt on one of these platforms, simply do

$ pip install yt

and you should get the binary distribution automatically. Also, if your python installation is system-wide (e.g., the Mac system Python) you might need to run pip with administrator privileges.

For more information, including more installation instructions, links to community resources, and information on contributing to yt's development, please see the yt homepage at http://yt-project.org and the documentation for yt-3.1 at http://yt-project.org/docs/3.1.
yt is the product of a large community of developers and users and we are extraordinarily grateful for and proud of their contributions. Please forward this announcement on to any interested parties. As always, if you have any questions, concerns, or run into any trouble updating please don't hesitate to send a message to the mailing list or stop by our IRC channel. Thank you, The yt development team From gdmcbain at freeshell.org Wed Jan 14 19:01:56 2015 From: gdmcbain at freeshell.org (Geordie McBain) Date: Thu, 15 Jan 2015 11:01:56 +1100 Subject: [SciPy-User] optimize.minimize - help me understand arrays as variables (Andrew Nelson) In-Reply-To: References: Message-ID: 2015-01-13 10:26 GMT+11:00 KURT PETERS : > > Date: Sun, 11 Jan 2015 17:55:32 -0700 >> From: KURT PETERS >> Subject: [SciPy-User] optimize.minimize - help me understand arrays as >> variables >> To: "scipy-user at scipy.org" >> Message-ID: >> Content-Type: text/plain; charset="iso-8859-1" >> >> I'm trying to use scipy.optimize.minimize. >> I've tried multiple "multivariate" methods that don't seem to actually >> take multivariate data and derivatives. Can someone tell me how I can make >> the multivariate part of the solver actually work? >> >> Here's an example: >> My main function the following (typical length for N is 3): >> >> input guess is a x0=np.array([1,2,3]) >> the optimization function returns: >> def calc_f3d(...): >> f3d = np.ones((np.max([3,N]),1) >> .... do some assignments to f3d[row,0] .... >> return np.linalg.norm(f3d) # numpy.array that's 3x1 >> >> The jacobian returns a Nx3 matrix: >> def jacob3d(...): >> df = np.ones((np.max([3,N]),3)) >> ... do some assignments to df[row,col] >> return df # note numpy.array that's 3x3 Hello. I think that this might be the problem here: the jac should have the same shape as x, i.e. (N,), not (N, 3); the components of the jac are the partial derivatives of fun with respect to the corresponding components of x. 
Think of it as the gradient of the objective.

Here's a simple shape=(2,) example, taken from D. M. Greig's Optimisation (1980, London: Longman). The exact minimum is 0 at [1, 1].

def f(x):  # Greig (1980, p. 48)
    return (x[1] - x[0]**2)**2 + (1 - x[0])**2

def g(x):  # ibid
    return np.array([-4*x[0]*(x[1] - x[0]**2) - 2 + 2*x[0],
                     2 * (x[1] - x[0]**2)])

x = np.zeros(2)
print('Without Jacobian: ', minimize(f, x))
print('\nWith:', minimize(f, x, jac=g))

From andyfaff at gmail.com Thu Jan 15 03:46:08 2015
From: andyfaff at gmail.com (Andrew Nelson)
Date: Thu, 15 Jan 2015 19:46:08 +1100
Subject: [SciPy-User] interpolate.interp1d - constructing a cubic interpolator is Slllooooooow.
Message-ID: 

I'm intending to use interpolation in a curvefitting function. So have been investigating the use of interpolate.interp1d. I'd prefer to use cubic interpolation but it seems to take ages:

import numpy as np
from scipy.interpolate import interp1d

a = np.linspace(-2 * np.pi, 2 * np.pi, 1000)
b = np.cos(a)

%timeit interp1d(a, b)
10000 loops, best of 3: 71.6 µs per loop

%timeit interp1d(a, b, kind='cubic')
1 loops, best of 3: 5.15 s per loop

I'm wondering why it takes 5 orders of magnitude (x72000) longer to calculate a cubic interpolator than a linear interpolator?

cheers, Andrew.

--
_____________________________________
Dr. Andrew Nelson
_____________________________________

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From eraldo.pomponi at gmail.com Thu Jan 15 04:49:22 2015
From: eraldo.pomponi at gmail.com (Eraldo Pomponi)
Date: Thu, 15 Jan 2015 10:49:22 +0100
Subject: [SciPy-User] interpolate.interp1d - constructing a cubic interpolator is Slllooooooow.
In-Reply-To: 
References: 
Message-ID: 

Dear Andrew,

On Thu, Jan 15, 2015 at 9:46 AM, Andrew Nelson wrote:
> I'm intending to use interpolation in a curvefitting function. So have
> been investigating the use of interpolate.interp1d.
> I'd prefer to use cubic interpolation but it seems to take ages:
>
> import numpy as np
> from scipy.interpolate import interp1d
>
> a = np.linspace(-2 * np.pi, 2 * np.pi, 1000)
> b = np.cos(a)
>
> %timeit interp1d(a, b)
> 10000 loops, best of 3: 71.6 µs per loop
>
> %timeit interp1d(a, b, kind='cubic')
> 1 loops, best of 3: 5.15 s per loop
>
> I'm wondering why it takes 5 orders of magnitude (x72000) longer to
> calculate a cubic interpolator than a linear interpolator?

I can reproduce your results but I cannot comment on the reason why there exists such a big difference between the two cases. On the other hand, following the documentation, I would go for the use of the more recent UnivariateSpline (http://docs.scipy.org/doc/scipy-0.14.0/reference/generated/scipy.interpolate.UnivariateSpline.html#scipy.interpolate.UnivariateSpline) class that doesn't have this drawback. On my system, following your example, I get:

%timeit spl = UnivariateSpline(a,b,k=3)
1000 loops, best of 3: 271 µs per loop

Cheers,

Eraldo

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From andyfaff at gmail.com Thu Jan 15 06:04:51 2015
From: andyfaff at gmail.com (Andrew Nelson)
Date: Thu, 15 Jan 2015 22:04:51 +1100
Subject: [SciPy-User] interpolate.interp1d - constructing a cubic interpolator is Slllooooooow.
In-Reply-To: 
References: 
Message-ID: 

Great, thanks. I found that interpolate.InterpolatedUnivariateSpline is going to do the job for me.

On 15 January 2015 at 20:49, Eraldo Pomponi wrote:
> Dear Andrew,
>
> On Thu, Jan 15, 2015 at 9:46 AM, Andrew Nelson wrote:
>> I'm intending to use interpolation in a curvefitting function. So have
>> been investigating the use of interpolate.interp1d.
>> I'd prefer to use cubic interpolation but it seems to take ages:
>>
>> import numpy as np
>> from scipy.interpolate import interp1d
>>
>> a = np.linspace(-2 * np.pi, 2 * np.pi, 1000)
>> b = np.cos(a)
>>
>> %timeit interp1d(a, b)
>> 10000 loops, best of 3: 71.6 µs per loop
>>
>> %timeit interp1d(a, b, kind='cubic')
>> 1 loops, best of 3: 5.15 s per loop
>>
>> I'm wondering why it takes 5 orders of magnitude (x72000) longer to
>> calculate a cubic interpolator than a linear interpolator?
>
> I can reproduce your results but I cannot comment on the reason why there
> exists such a big difference between the two cases. On the other hand,
> following the documentation, I would go for the use of the more recent
> UnivariateSpline (
> http://docs.scipy.org/doc/scipy-0.14.0/reference/generated/scipy.interpolate.UnivariateSpline.html#scipy.interpolate.UnivariateSpline)
> class that doesn't have this drawback. On my system, following your
> example, I get:
>
> %timeit spl = UnivariateSpline(a,b,k=3)
> 1000 loops, best of 3: 271 µs per loop
>
> Cheers,
>
> Eraldo
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user

--
_____________________________________
Dr. Andrew Nelson
_____________________________________

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evgeny.burovskiy at gmail.com Thu Jan 15 06:05:09 2015
From: evgeny.burovskiy at gmail.com (Evgeni Burovski)
Date: Thu, 15 Jan 2015 11:05:09 +0000
Subject: [SciPy-User] interpolate.interp1d - constructing a cubic interpolator is Slllooooooow.
In-Reply-To: 
References: 
Message-ID: 

Hi,

First, the "why" question: interp1d(..., kind='cubic') constructs a continuously differentiable cubic spline. This requires solving an N-by-N linear system of equations, where N=1000 in your example.
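Since both routes fit a cubic interpolant through the same data, the results can be compared directly. A sketch using InterpolatedUnivariateSpline (the interpolating, s=0 variant of the class Eraldo suggested); only the construction cost differs between the two:

```python
import numpy as np
from scipy.interpolate import interp1d, InterpolatedUnivariateSpline

a = np.linspace(-2 * np.pi, 2 * np.pi, 1000)
b = np.cos(a)

# interp1d's cubic interpolator
f_interp1d = interp1d(a, b, kind='cubic')
# FITPACK's interpolating cubic spline (banded formulation)
f_fitpack = InterpolatedUnivariateSpline(a, b, k=3)

# Evaluate both on a finer grid inside the data range
x = np.linspace(-2 * np.pi, 2 * np.pi, 5000)
diff = np.max(np.abs(f_interp1d(x) - f_fitpack(x)))
print(diff)  # the two interpolants agree very closely
```

How large the construction-time gap is depends on the SciPy version in use, since the full-matrix path was later replaced; the evaluated values agree either way.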
As an implementation detail, cubic spline interpolation can be made to use banded matrices, but `interp1d(..., kind='cubic')` uses full matrices under the hood, with corresponding consequences for the time/memory footprint. On the other hand, UnivariateSpline (and its equivalent splrep/splev combo) uses a different, more efficient, formulation of the linear algebraic problem. This explains what Eraldo noted, that the UnivariateSpline is faster to construct (and likely faster to evaluate as well). Notice that to get an interpolating spline from FITPACK wrappers, you can use `s=0` like so: >>> UnivariateSpline(x, y, k=3, s=0) Now, depending on your use case you might not need a continuous differentiability. If this is the case, you do not really need a global spline fitting process. For local interpolating schemes, you can use Pchip or Akima1DInterpolator, or you can build your own cubic Hermite interpolator using PPoly or BPoly.from_derivatives. (do *not* use PiecewisePolynomial, if you care about speed at all). Finally, If you feel adventurous, you can try https://github.com/scipy/scipy/pull/3174 which replaces the implementation of `interp1d(..., kind='cubic')`, and adds several new spline constructors for interpolating and LSQ splines. That PR is not yet merged, so comments welcome. Cheers, Evgeni On Thu, Jan 15, 2015 at 9:49 AM, Eraldo Pomponi wrote: > Dear Andrew, > > On Thu, Jan 15, 2015 at 9:46 AM, Andrew Nelson wrote: >> >> I'm intending to use interpolation in a curvefitting function. So have >> been investigating the use of interpolate.interp1d. 
>> I'd prefer to use cubic interpolation but it seems to take ages:
>>
>> import numpy as np
>> from scipy.interpolate import interp1d
>>
>> a = np.linspace(-2 * np.pi, 2 * np.pi, 1000)
>> b = np.cos(a)
>>
>> %timeit interp1d(a, b)
>> 10000 loops, best of 3: 71.6 µs per loop
>>
>> %timeit interp1d(a, b, kind='cubic')
>> 1 loops, best of 3: 5.15 s per loop
>>
>> I'm wondering why it takes 5 orders of magnitude (x72000) longer to
>> calculate a cubic interpolator than a linear interpolator?
>
> I can reproduce your results but I cannot comment on the reason why there
> exists such a big difference between the two cases. On the other hand,
> following the documentation, I would go for the use of the more recent
> UnivariateSpline
> (http://docs.scipy.org/doc/scipy-0.14.0/reference/generated/scipy.interpolate.UnivariateSpline.html#scipy.interpolate.UnivariateSpline)
> class that doesn't have this drawback. On my system, following your example,
> I get:
>
> %timeit spl = UnivariateSpline(a,b,k=3)
> 1000 loops, best of 3: 271 µs per loop
>
> Cheers,
>
> Eraldo
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user

From peterskurt at msn.com Thu Jan 15 10:00:01 2015
From: peterskurt at msn.com (KURT PETERS)
Date: Thu, 15 Jan 2015 08:00:01 -0700
Subject: [SciPy-User] optimize.minimize - help me understand arrays as variables
In-Reply-To: 
References: 
Message-ID: 

See below

> Date: Thu, 15 Jan 2015 11:01:56 +1100
> From: Geordie McBain
> Subject: Re: [SciPy-User] optimize.minimize - help me understand
> arrays as variables (Andrew Nelson)
> To: SciPy Users List
> Message-ID: 
>
> Content-Type: text/plain; charset=UTF-8
>
> 2015-01-13 10:26 GMT+11:00 KURT PETERS :
> > > Date: Sun, 11 Jan 2015 17:55:32 -0700
> >> From: KURT PETERS
> >> Subject: [SciPy-User] optimize.minimize - help me understand arrays as
> >> variables
> >> To: "scipy-user at scipy.org"
> >>
Message-ID: > >> Content-Type: text/plain; charset="iso-8859-1" > >> > >> I'm trying to use scipy.optimize.minimize. > >> I've tried multiple "multivariate" methods that don't seem to actually > >> take multivariate data and derivatives. Can someone tell me how I can make > >> the multivariate part of the solver actually work? > >> > >> Here's an example: > >> My main function the following (typical length for N is 3): > >> > >> input guess is a x0=np.array([1,2,3]) > >> the optimization function returns: > >> def calc_f3d(...): > >> f3d = np.ones((np.max([3,N]),1) > >> .... do some assignments to f3d[row,0] .... > >> return np.linalg.norm(f3d) # numpy.array that's 3x1 > >> > >> The jacobian returns a Nx3 matrix: > >> def jacob3d(...): > >> df = np.ones((np.max([3,N]),3)) > >> ... do some assignments to df[row,col] > >> return df # note numpy.array that's 3x3 > > Hello. I think that this might be the problem here: the jac should > have the same shape as x, i.e. (N,), not (N, 3); the components of the > jac are the partial derivatives of fun with respect to the > corresponding components of x. Think of it as the gradient of the > objective. > > Here's a simple shape=(2,) example, taken from D. M. Greig's > Optimisation (1980, London: Longman). The exact minimum is 0 at [1, > 1]. > > def f(x): # Greig (1980, p. 48) > return (x[1] - x[0]**2)**2 + (1 - x[0])**2 > > def g(x): # ibid > return np.array([-4*x[0]*(x[1] - x[0]**2) - 2 + 2*x[0], > 2 * (x[1] - x[0]**2)]) > > x = np.zeros(2) > print('Without Jacobian: ', minimize(f, x)) > print('\nWith:', minimize(f, x, jac=g)) Geordie, I don't think that's the case. Everything I've ever learned about the Jacobian is that it's the partials of each function with respect to each variable... so two equations with two unknowns, would yield a 2x2. 
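The Jacobian of a vector-valued map is indeed a matrix, but `optimize.minimize` works on a scalar objective, so its `jac` callable must return the gradient: an array with the same shape as `x`, not an NxN matrix. Geordie's Greig example, run through BFGS, makes the shape requirement concrete (a sketch):

```python
import numpy as np
from scipy.optimize import minimize

# Scalar objective f: R^2 -> R (the Greig example from earlier in the thread)
def f(x):
    return (x[1] - x[0]**2)**2 + (1 - x[0])**2

# For a scalar objective, `jac` returns the gradient: shape (2,), same as x
def g(x):
    return np.array([-4 * x[0] * (x[1] - x[0]**2) - 2 + 2 * x[0],
                     2 * (x[1] - x[0]**2)])

res = minimize(f, np.zeros(2), jac=g, method='BFGS')
print(res.x)  # converges to approximately [1, 1]
```

With `jac` returning shape (2,), BFGS runs cleanly; returning a matrix-valued `jac` instead is consistent with the shape errors quoted earlier in the thread.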
Here's a wiki explaining what I mean: http://en.wikipedia.org/wiki/Jacobian_matrix_and_determinant

If what you're saying is right, then the people that developed the function don't know what a Jacobian is. I would find that hard to believe.

Kurt

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From davidmenhur at gmail.com Thu Jan 15 10:45:26 2015
From: davidmenhur at gmail.com (Daπid)
Date: Thu, 15 Jan 2015 16:45:26 +0100
Subject: [SciPy-User] optimize.minimize - help me understand arrays as variables
In-Reply-To: 
References: 
Message-ID: 

On 15 January 2015 at 16:00, KURT PETERS wrote:
> Geordie,
> I don't think that's the case. Everything I've ever learned about the
> Jacobian is that it's the partials of each function with respect to each
> variable... so two equations with two unknowns, would yield a 2x2.

Ah, but for a minimisation problem you can only have one equation. You can't take the minimum of a vector function, because you can't say one vector is smaller than the other (or, in other words, there are many possible definitions of smaller, and you have to choose). |R^n is not ordered.

In your case, your function to be minimised returns np.linalg.norm(f3d), that is, a single number, f(\vec x). Therefore, the jacobian is a vector: [d f(\vec x)/d x_0, d f(\vec x)/d x_1, d f(\vec x)/d x_2].

I believe your results are incorrect because of how you are defining f3d:

f3d = np.ones((np.max([3,N]),1))

There is no need for the extra dimension, just

f3d = np.ones(max(3, N))

and do the assignments to f3d[row].

/David.

From peterskurt at msn.com Thu Jan 15 12:50:27 2015
From: peterskurt at msn.com (KURT PETERS)
Date: Thu, 15 Jan 2015 10:50:27 -0700
Subject: [SciPy-User] 2.
Re: optimize.minimize - help me understand arrays as variables In-Reply-To: References: Message-ID: > Date: Thu, 15 Jan 2015 11:01:56 +1100 > From: Geordie McBain > Subject: Re: [SciPy-User] optimize.minimize - help me understand > arrays as variables (Andrew Nelson) > To: SciPy Users List > Message-ID: > > Content-Type: text/plain; charset=UTF-8 > > 2015-01-13 10:26 GMT+11:00 KURT PETERS : > > > Date: Sun, 11 Jan 2015 17:55:32 -0700 > >> From: KURT PETERS > >> Subject: [SciPy-User] optimize.minimize - help me understand arrays as > >> variables > >> To: "scipy-user at scipy.org" > >> Message-ID: > >> Content-Type: text/plain; charset="iso-8859-1" > >> > >> I'm trying to use scipy.optimize.minimize. > >> I've tried multiple "multivariate" methods that don't seem to actually > >> take multivariate data and derivatives. Can someone tell me how I can make > >> the multivariate part of the solver actually work? > >> > >> Here's an example: > >> My main function the following (typical length for N is 3): > >> > >> input guess is a x0=np.array([1,2,3]) > >> the optimization function returns: > >> def calc_f3d(...): > >> f3d = np.ones((np.max([3,N]),1) > >> .... do some assignments to f3d[row,0] .... > >> return np.linalg.norm(f3d) # numpy.array that's 3x1 > >> > >> The jacobian returns a Nx3 matrix: > >> def jacob3d(...): > >> df = np.ones((np.max([3,N]),3)) > >> ... do some assignments to df[row,col] > >> return df # note numpy.array that's 3x3 > > Hello. I think that this might be the problem here: the jac should > have the same shape as x, i.e. (N,), not (N, 3); the components of the > jac are the partial derivatives of fun with respect to the > corresponding components of x. Think of it as the gradient of the > objective. > > Here's a simple shape=(2,) example, taken from D. M. Greig's > Optimisation (1980, London: Longman). The exact minimum is 0 at [1, > 1]. > > def f(x): # Greig (1980, p. 
48) > return (x[1] - x[0]**2)**2 + (1 - x[0])**2 > > def g(x): # ibid > return np.array([-4*x[0]*(x[1] - x[0]**2) - 2 + 2*x[0], > 2 * (x[1] - x[0]**2)]) > > x = np.zeros(2) > print('Without Jacobian: ', minimize(f, x)) > print('\nWith:', minimize(f, x, jac=g)) I'm going to try the root function. I just saw the words "multivariate scalar function" in the documentation for minimize. Maybe my assumption was that it could handle multiple functions. As I read further, there's a distinction to multiple functions in the "Root" function, such as with Krylov. I'm going to see if that behaves the way I would expect. Perhaps, I misunderstood the documentation. I'm going to try that and let the group know how that worked. Kurt -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Sun Jan 18 14:22:54 2015 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 18 Jan 2015 21:22:54 +0200 Subject: [SciPy-User] ANN: Scipy 0.15.1 Message-ID: <54BC080E.7040109@iki.fi> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 Dear all, We are pleased to announce the Scipy 0.15.1 release. Scipy 0.15.1 contains only bugfixes. The module ``scipy.linalg.calc_lwork`` removed in Scipy 0.15.0 is restored. This module is not a part of Scipy's public API, and although it is available again in Scipy 0.15.1, using it is deprecated and it may be removed again in a future Scipy release. Source tarballs, binaries, and full release notes are available at https://sourceforge.net/projects/scipy/files/scipy/0.15.1/ Best regards, Pauli Virtanen ========================== SciPy 0.15.1 Release Notes ========================== SciPy 0.15.1 is a bug-fix release with no new features compared to 0.15.0. Issues fixed - ------------ * `#4413 `__: BUG: Tests too strict, f2py doesn't have to overwrite this array * `#4417 `__: BLD: avoid using NPY_API_VERSION to check not using deprecated... 
* `#4418 `__: Restore and deprecate scipy.linalg.calc_lwork -----BEGIN PGP SIGNATURE----- Version: GnuPG v1 iEYEARECAAYFAlS8CA4ACgkQ6BQxb7O0pWCmOQCgzg9AXDaqRaK5/QBWopIrv2OA WkEAn0ltDfDHFpw0zMzB9mUscAAb2xnE =JrGj -----END PGP SIGNATURE----- From renan.ee.ufsm at gmail.com Mon Jan 19 19:07:04 2015 From: renan.ee.ufsm at gmail.com (Renan Birck Pinheiro) Date: Mon, 19 Jan 2015 22:07:04 -0200 Subject: [SciPy-User] SciPy and MATLAB give different results for 'buttord' function Message-ID: I'm trying to design an analog Butterworth filter using the 'buttord' function. My settings are: Passband frequency (Fpass) = 10 Hz -> Wp = 2*pi*10 Hz Stopband frequency (Fstop) = 100 Hz -> Ws = 2*pi*100 Hz The passband and stopband losses/attenuations (Rp, Rs) are 3 and 80 dB respectively. In MATLAB I use the command line [N, Wn] = buttord(Wp, Ws, Rp, Rs, 's') which gives me N = 5, Wn = 99.581776302. But in SciPy I did this: from numpy import pi from scipy import signal Wp = 2 * pi * 10 Ws = 2 * pi * 100 Rp = 3 Rs = 80 (N, Wn) = signal.buttord(Wp, Ws, Rp, Rs, analog=True) and I get N = 5 and Wn = 62.861698649592753. Wn is different from the value that MATLAB gives. What am I doing wrong here? I read both the documentation of MATLAB and SciPy and couldn't find the problem. Thanks, -- Renan Birck Pinheiro - Chip Inside Tecnologia Acad. Engenharia Elétrica - UFSM - Santa Maria, Brasil http://renanbirck.blogspot.com - +55 55 91162798 / +55 55 99034839 *Talk is cheap, show me the code*. - Linus Torvalds -------------- next part -------------- An HTML attachment was scrubbed... 
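[Editor's note: one way to see what the two answers mean is to check both candidate cutoffs against the original specs. A minimal sketch using standard scipy.signal calls; the two Wn values are taken from the messages above, and the claim about which spec each one meets exactly is a reading of the resulting numbers, not of either implementation's source:]

```python
import numpy as np
from scipy import signal

Wp, Ws = 2 * np.pi * 10, 2 * np.pi * 100   # band edges, rad/s
Rp, Rs = 3, 80                             # allowed ripple / required attenuation, dB

# Design a 5th-order analog Butterworth at each candidate cutoff and
# evaluate its gain (in dB) at the passband and stopband edges.
gains = {}
for Wn in (99.581776302, 62.861698649592753):   # MATLAB value, SciPy 0.15 value
    b, a = signal.butter(5, Wn, analog=True)
    _, h = signal.freqs(b, a, worN=[Wp, Ws])
    gains[Wn] = 20 * np.log10(np.abs(h))

for Wn, (g_pass, g_stop) in gains.items():
    print('Wn=%8.3f  gain at Wp: %6.2f dB   gain at Ws: %7.2f dB'
          % (Wn, g_pass, g_stop))
```

Both filters satisfy the specs (no more than Rp = 3 dB lost in the passband, at least Rs = 80 dB rejected in the stopband); they differ only in which edge is met exactly. The MATLAB value hits the stopband constraint (-80 dB at Ws), while the SciPy 0.15 value hits the passband constraint (-3 dB at Wp).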
URL: From peterskurt at msn.com Mon Jan 19 20:41:36 2015 From: peterskurt at msn.com (KURT PETERS) Date: Mon, 19 Jan 2015 18:41:36 -0700 Subject: [SciPy-User] optimize.minimize - help me understand arrays as variables (KURT PETERS) In-Reply-To: References: Message-ID: > Date: Thu, 15 Jan 2015 08:00:01 -0700 > From: KURT PETERS > Subject: Re: [SciPy-User] optimize.minimize - help me understand > arrays as variables > To: "scipy-user at scipy.org" , > "gdmcbain at freeshell.org" > Message-ID: > Content-Type: text/plain; charset="iso-8859-1" > > See below > > Date: Thu, 15 Jan 2015 11:01:56 +1100 > > From: Geordie McBain > > Subject: Re: [SciPy-User] optimize.minimize - help me understand > > arrays as variables (Andrew Nelson) > > To: SciPy Users List > > Message-ID: > > > > Content-Type: text/plain; charset=UTF-8 > > > > 2015-01-13 10:26 GMT+11:00 KURT PETERS : > > > > Date: Sun, 11 Jan 2015 17:55:32 -0700 > > >> From: KURT PETERS > > >> Subject: [SciPy-User] optimize.minimize - help me understand arrays as > > >> variables > > >> To: "scipy-user at scipy.org" > > >> Message-ID: > > >> Content-Type: text/plain; charset="iso-8859-1" > > >> > > >> I'm trying to use scipy.optimize.minimize. > > >> I've tried multiple "multivariate" methods that don't seem to actually > > >> take multivariate data and derivatives. Can someone tell me how I can make > > >> the multivariate part of the solver actually work? > > >> > > >> Here's an example: > > >> My main function the following (typical length for N is 3): > > >> > > >> input guess is a x0=np.array([1,2,3]) > > >> the optimization function returns: > > >> def calc_f3d(...): > > >> f3d = np.ones((np.max([3,N]),1) > > >> .... do some assignments to f3d[row,0] .... > > >> return np.linalg.norm(f3d) # numpy.array that's 3x1 > > >> > > >> The jacobian returns a Nx3 matrix: > > >> def jacob3d(...): > > >> df = np.ones((np.max([3,N]),3)) > > >> ... 
do some assignments to df[row,col] > > >> return df # note numpy.array that's 3x3 > > > > Hello. I think that this might be the problem here: the jac should > > have the same shape as x, i.e. (N,), not (N, 3); the components of the > > jac are the partial derivatives of fun with respect to the > > corresponding components of x. Think of it as the gradient of the > > objective. > > > > Here's a simple shape=(2,) example, taken from D. M. Greig's > > Optimisation (1980, London: Longman). The exact minimum is 0 at [1, > > 1]. > > > > def f(x): # Greig (1980, p. 48) > > return (x[1] - x[0]**2)**2 + (1 - x[0])**2 > > > > def g(x): # ibid > > return np.array([-4*x[0]*(x[1] - x[0]**2) - 2 + 2*x[0], > > 2 * (x[1] - x[0]**2)]) > > > > x = np.zeros(2) > > print('Without Jacobian: ', minimize(f, x)) > > print('\nWith:', minimize(f, x, jac=g)) > > Geordie, > I don't think that's the case. Everything I've ever learned about the Jacobian is that it's the partials of each function with respect to each variable... so two equations with two unknowns, would yield a 2x2. Here's a wiki explaining what I mean: > http://en.wikipedia.org/wiki/Jacobian_matrix_and_determinant > > If what you're saying is right, then the people that developed the function don't know what a Jacobian is. I would find that hard to believe. > > Kurt > > > > > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: http://mail.scipy.org/pipermail/scipy-user/attachments/20150115/9a06ee97/attachment-0001.html > > ------------------------------ > > Message: 3 > Date: Thu, 15 Jan 2015 16:45:26 +0100 > From: Da?id > Subject: Re: [SciPy-User] optimize.minimize - help me understand > arrays as variables > To: SciPy Users List > Message-ID: > > Content-Type: text/plain; charset=UTF-8 > > On 15 January 2015 at 16:00, KURT PETERS wrote: > > Geordie, > > I don't think that's the case. 
Everything I've ever learned about the > > Jacobian is that it's the partials of each function with respect to each > > variable... so two equations with two unknowns, would yield a 2x2. > > Ah, but for a minimisation problem you can only have one equation. You > can't take the minimum of a vector function, because you can't say one > vector is smaller than the other (or, in other words, there are many > possible definitions of smaller, and you have to choose). |R^n is not > ordered. > > In your case, your function to be minimised returns > np.linalg.norm(f3d), that is, a single number, f(\vec x). Therefore, > the jacobian is a vector: [d f(\vec x)/d x_0, d f(\vec x)/d x_1, d > f(\vec x)/d x_2]. > > I believe your results are incorrect because how you are defining f3d: > > f3d = np.ones((np.max([3,N]),1) > > There is no need for the extra dimension, just > > f3d = np.ones(max(3, N)) > > And do the assignments to f3d[row] > > > /David. > > > ------------------------------ > > Message: 4 > Date: Thu, 15 Jan 2015 10:50:27 -0700 > From: KURT PETERS > Subject: [SciPy-User] 2. Re: optimize.minimize - help me understand > arrays as variables > To: "scipy-user at scipy.org" , > "gdmcbain at freeshell.org" > Message-ID: > Content-Type: text/plain; charset="iso-8859-1" > > > Date: Thu, 15 Jan 2015 11:01:56 +1100 > > From: Geordie McBain > > Subject: Re: [SciPy-User] optimize.minimize - help me understand > > arrays as variables (Andrew Nelson) > > To: SciPy Users List > > Message-ID: > > > > Content-Type: text/plain; charset=UTF-8 > > > > 2015-01-13 10:26 GMT+11:00 KURT PETERS : > > > > Date: Sun, 11 Jan 2015 17:55:32 -0700 > > >> From: KURT PETERS > > >> Subject: [SciPy-User] optimize.minimize - help me understand arrays as > > >> variables > > >> To: "scipy-user at scipy.org" > > >> Message-ID: > > >> Content-Type: text/plain; charset="iso-8859-1" > > >> > > >> I'm trying to use scipy.optimize.minimize. 
> > >> I've tried multiple "multivariate" methods that don't seem to actually > > >> take multivariate data and derivatives. Can someone tell me how I can make > > >> the multivariate part of the solver actually work? > > >> > > >> Here's an example: > > >> My main function the following (typical length for N is 3): > > >> > > >> input guess is a x0=np.array([1,2,3]) > > >> the optimization function returns: > > >> def calc_f3d(...): > > >> f3d = np.ones((np.max([3,N]),1) > > >> .... do some assignments to f3d[row,0] .... > > >> return np.linalg.norm(f3d) # numpy.array that's 3x1 > > >> > > >> The jacobian returns a Nx3 matrix: > > >> def jacob3d(...): > > >> df = np.ones((np.max([3,N]),3)) > > >> ... do some assignments to df[row,col] > > >> return df # note numpy.array that's 3x3 > > > > Hello. I think that this might be the problem here: the jac should > > have the same shape as x, i.e. (N,), not (N, 3); the components of the > > jac are the partial derivatives of fun with respect to the > > corresponding components of x. Think of it as the gradient of the > > objective. > > > > Here's a simple shape=(2,) example, taken from D. M. Greig's > > Optimisation (1980, London: Longman). The exact minimum is 0 at [1, > > 1]. > > > > def f(x): # Greig (1980, p. 48) > > return (x[1] - x[0]**2)**2 + (1 - x[0])**2 > > > > def g(x): # ibid > > return np.array([-4*x[0]*(x[1] - x[0]**2) - 2 + 2*x[0], > > 2 * (x[1] - x[0]**2)]) > > > > x = np.zeros(2) > > print('Without Jacobian: ', minimize(f, x)) > > print('\nWith:', minimize(f, x, jac=g)) > > I'm going to try the root function. I just saw the words "multivariate scalar function" in the documentation for minimize. Maybe my assumption was that it could handle multiple functions. As I read further, there's a distinction to multiple functions in the "Root" function, such as with Krylov. I'm going to see if that behaves the way I would expect. Perhaps, I misunderstood the documentation. 
> I'm going to try that and let the group know how that worked. > Kurt > I tried to use the root function. I think there's a problem in the scipy/optimize/minpack.py. In def _check_func, the function does a "len(output_shape)" on line 20. If one returns a proper (3,1) instead of (3,), the "len" will return a "1." This gives the function the false impression that the size is wrong. First off, those checks conducted seem to have a variety of "shoe-horned" checks: minpack.py:def _check_func(checker, argname, thefunc, x0, args, numinputs, output_shape=None): minpack.py: shape, dtype = _check_func('fsolve', 'func', func, x0, args, n, (n,)) minpack.py: _check_func('fsolve', 'fprime', Dfun, x0, args, n, (n,n)) minpack.py: shape, dtype = _check_func('leastsq', 'func', func, x0, args, n) minpack.py: _check_func('leastsq', 'Dfun', Dfun, x0, args, n, (n,m)) minpack.py: _check_func('leastsq', 'Dfun', Dfun, x0, args, n, (m,n)) Why is fsolve and _root_hybr forcing "output_shape" into (n,1)? Why don't they actually use the shape of the array passed? Why are they forcing the input to be flattened instead of accounting for both (3,1) and (3,) possibilities? Doesn't seem logical. Kurt -------------- next part -------------- An HTML attachment was scrubbed... URL: From peterskurt at msn.com Mon Jan 19 20:54:52 2015 From: peterskurt at msn.com (KURT PETERS) Date: Mon, 19 Jan 2015 18:54:52 -0700 Subject: [SciPy-User] SciPy and MATLAB give different results for 'buttord' function (Renan Birck Pinheiro) In-Reply-To: References: Message-ID: > Date: Mon, 19 Jan 2015 22:07:04 -0200 > From: Renan Birck Pinheiro > Subject: [SciPy-User] SciPy and MATLAB give different results for > 'buttord' function > To: scipy-user at scipy.org > Message-ID: > > Content-Type: text/plain; charset="utf-8" > > I'm trying to design a analog Butterworth filter using the 'buttord' > function. 
> My settings are: > > Passband frequency (Fpass) = 10 Hz -> Wp = 2*pi*10 Hz > Stopband frequency (Fstop) = 100 Hz -> Ws = 2*pi*100 Hz > > The passband and stopband losses/attenuations (Rp, Rs) are 3 and 80 dB > respectively. > > In MATLAB I use the command line > > [N, Wn] = buttord(Wp, Ws, Rp, Rs, 's') > > that gives me N = 5, Wn = 99.581776302. > > But in SciPy I did this: > > from numpy import pi > from scipy import signal > Wp = 2 * pi * 10 > Ws = 2 * pi * 100 > Rp = 3 > Rs = 80 > (N, Wn) = signal.buttord(Wp, Ws, Rp, Rs, analog=True) > > and I get N = 5 and Wn = 62.861698649592753. Wn is different than the value > that MATLAB gives. > > What am I doing wrong here? I read both the documentation of MATLAB and > SciPy and couldn't find the problem. > > ?Thanks,? > > -- > Renan Birck Pinheiro - Chip Inside Tecnologia > Acad. Engenharia El?trica - UFSM > - Santa Maria, Brasil > http://renanbirck.blogspot.com - +55 55 91162798 / +55 55 99034839 I think Wp and Ws are supposed to be between 0 and 1. -------------- next part -------------- An HTML attachment was scrubbed... URL: From skip at pobox.com Tue Jan 20 12:14:59 2015 From: skip at pobox.com (Skip Montanaro) Date: Tue, 20 Jan 2015 17:14:59 +0000 (UTC) Subject: [SciPy-User] scipy.signal.resample muffs my timestamps? Message-ID: I want to resample a large (400k+) dataset where x are datetime objects and y are floats. The x data are epoch seconds from the past week. For the purposes of this example, I've crudely downsampled them, choosing every 10 elements (Python prompt changed to "... " to fool Gmane). ... len(t) 43051 ... len(x) 43051 ... 
pprint([datetime.datetime.fromtimestamp(_) for _ in t[:10]]) [datetime.datetime(2015, 1, 12, 0, 0), datetime.datetime(2015, 1, 12, 0, 0, 46, 742044), datetime.datetime(2015, 1, 12, 0, 1, 3, 320089), datetime.datetime(2015, 1, 12, 0, 1, 23, 700560), datetime.datetime(2015, 1, 12, 0, 1, 44, 583401), datetime.datetime(2015, 1, 12, 0, 1, 57, 733937), datetime.datetime(2015, 1, 12, 0, 2, 38, 30245), datetime.datetime(2015, 1, 12, 0, 3, 35, 336342), datetime.datetime(2015, 1, 12, 0, 4, 23, 833251), datetime.datetime(2015, 1, 12, 0, 4, 48, 272131)] ... pprint([datetime.datetime.fromtimestamp(_) for _ in t[-10:]]) [datetime.datetime(2015, 1, 19, 23, 56, 9, 996926), datetime.datetime(2015, 1, 19, 23, 56, 12, 104080), datetime.datetime(2015, 1, 19, 23, 56, 12, 158963), datetime.datetime(2015, 1, 19, 23, 56, 12, 280701), datetime.datetime(2015, 1, 19, 23, 56, 12, 337853), datetime.datetime(2015, 1, 19, 23, 56, 22, 169709), datetime.datetime(2015, 1, 19, 23, 56, 29, 676865), datetime.datetime(2015, 1, 19, 23, 57, 14, 570601), datetime.datetime(2015, 1, 19, 23, 58, 56, 394975), datetime.datetime(2015, 1, 19, 23, 59, 37, 707367)] So, let's get started, downsampling our 43k points to 250: ... res_x, res_t = signal.resample(x, 250, t) (Final Jeopardy tune plays...) ... If I understand correctly, signal.resample should generate 250 evenly spaced points from each of the inputs. ... len(res_x) 250 ... len(res_t) 250 So far, so good. Now, look at the range of res_t: ... 
pprint([datetime.datetime.fromtimestamp(_) for _ in res_t[:10]]) [datetime.datetime(2015, 1, 12, 0, 0), datetime.datetime(2015, 1, 12, 2, 14, 9, 166940), datetime.datetime(2015, 1, 12, 4, 28, 18, 333880), datetime.datetime(2015, 1, 12, 6, 42, 27, 500820), datetime.datetime(2015, 1, 12, 8, 56, 36, 667761), datetime.datetime(2015, 1, 12, 11, 10, 45, 834701), datetime.datetime(2015, 1, 12, 13, 24, 55, 1641), datetime.datetime(2015, 1, 12, 15, 39, 4, 168581), datetime.datetime(2015, 1, 12, 17, 53, 13, 335521), datetime.datetime(2015, 1, 12, 20, 7, 22, 502461)] ... pprint([datetime.datetime.fromtimestamp(_) for _ in res_t[-10:]]) [datetime.datetime(2015, 2, 3, 8, 36, 40, 65638), datetime.datetime(2015, 2, 3, 10, 50, 49, 232578), datetime.datetime(2015, 2, 3, 13, 4, 58, 399518), datetime.datetime(2015, 2, 3, 15, 19, 7, 566458), datetime.datetime(2015, 2, 3, 17, 33, 16, 733398), datetime.datetime(2015, 2, 3, 19, 47, 25, 900338), datetime.datetime(2015, 2, 3, 22, 1, 35, 67279), datetime.datetime(2015, 2, 4, 0, 15, 44, 234219), datetime.datetime(2015, 2, 4, 2, 29, 53, 401159), datetime.datetime(2015, 2, 4, 4, 44, 2, 568099)] That doesn't look right at all. I'm sure I'm using an outdated version of scipy: ... scipy.version.version '0.9.0' but it's what I have available (it's a long story). If this is a bug requiring upgrade, I'll beat on the powers that be to get a newer version of scipy. I'm happy to provide my data to anyone who would be willing to try this exercise out using a more recent version. Thanks, Skip Montanaro From renan.ee.ufsm at gmail.com Tue Jan 20 13:11:00 2015 From: renan.ee.ufsm at gmail.com (Renan Birck Pinheiro) Date: Tue, 20 Jan 2015 16:11:00 -0200 Subject: [SciPy-User] SciPy and MATLAB give different results for 'buttord' function (Re: SciPy-User Digest, Vol 137, Issue 17) Message-ID: 2015-01-20 16:00 GMT-02:00 : > Message: 1 > Date: Mon, 19 Jan 2015 18:54:52 -0700 > From: KURT PETERS > Subject: Re: [SciPy-User] > ?? 
> SciPy and MATLAB give different results for > 'buttord' function (Renan Birck Pinheiro) > To: "scipy-user at scipy.org" > Message-ID: > Content-Type: text/plain; charset="iso-8859-1" > > > I think Wp and Ws are supposed to be between 0 and 1. > If it was a digital filter, indeed Wp and Ws would need to be between 0 and 1. But this is an analog filter, and in the SciPy documentation it says: "For analog filters, wp and ws are angular frequencies (e.g. rad/s)." Thanks, Renan -- Renan Birck Pinheiro - Chip Inside Tecnologia Acad. Engenharia Elétrica - UFSM - Santa Maria, Brasil http://renanbirck.blogspot.com - +55 55 91162798 / +55 55 99034839 *Talk is cheap, show me the code*. - Linus Torvalds -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Tue Jan 20 14:34:47 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 20 Jan 2015 20:34:47 +0100 Subject: [SciPy-User] scipy.signal.resample muffs my timestamps? In-Reply-To: References: Message-ID: On Tue, Jan 20, 2015 at 6:14 PM, Skip Montanaro wrote: > I want to resample a large (400k+) dataset where x are datetime > objects and y are floats. The x data are epoch seconds from the past > week. For the purposes of this example, I've crudely downsampled them, > choosing every 10 elements (Python prompt changed to "... " to fool > Gmane). > > ... len(t) > 43051 > ... len(x) > 43051 > ... 
pprint([datetime.datetime.fromtimestamp(_) for _ in t[:10]]) > [datetime.datetime(2015, 1, 12, 0, 0), > datetime.datetime(2015, 1, 12, 0, 0, 46, 742044), > datetime.datetime(2015, 1, 12, 0, 1, 3, 320089), > datetime.datetime(2015, 1, 12, 0, 1, 23, 700560), > datetime.datetime(2015, 1, 12, 0, 1, 44, 583401), > datetime.datetime(2015, 1, 12, 0, 1, 57, 733937), > datetime.datetime(2015, 1, 12, 0, 2, 38, 30245), > datetime.datetime(2015, 1, 12, 0, 3, 35, 336342), > datetime.datetime(2015, 1, 12, 0, 4, 23, 833251), > datetime.datetime(2015, 1, 12, 0, 4, 48, 272131)] > ... pprint([datetime.datetime.fromtimestamp(_) for _ in t[-10:]]) > [datetime.datetime(2015, 1, 19, 23, 56, 9, 996926), > datetime.datetime(2015, 1, 19, 23, 56, 12, 104080), > datetime.datetime(2015, 1, 19, 23, 56, 12, 158963), > datetime.datetime(2015, 1, 19, 23, 56, 12, 280701), > datetime.datetime(2015, 1, 19, 23, 56, 12, 337853), > datetime.datetime(2015, 1, 19, 23, 56, 22, 169709), > datetime.datetime(2015, 1, 19, 23, 56, 29, 676865), > datetime.datetime(2015, 1, 19, 23, 57, 14, 570601), > datetime.datetime(2015, 1, 19, 23, 58, 56, 394975), > datetime.datetime(2015, 1, 19, 23, 59, 37, 707367)] > > So, let's get started, downsampling our 43k points to 250: > > ... res_x, res_t = signal.resample(x, 250, t) > (Final Jeopardy tune plays...) > ... > > If I understand correctly, signal.resample should generate 250 evenly > spaced points from each of the inputs. > > ... len(res_x) > 250 > ... len(res_t) > 250 > > So far, so good. Now, look at the range of res_t: > > ... 
pprint([datetime.datetime.fromtimestamp(_) for _ in res_t[:10]]) > [datetime.datetime(2015, 1, 12, 0, 0), > datetime.datetime(2015, 1, 12, 2, 14, 9, 166940), > datetime.datetime(2015, 1, 12, 4, 28, 18, 333880), > datetime.datetime(2015, 1, 12, 6, 42, 27, 500820), > datetime.datetime(2015, 1, 12, 8, 56, 36, 667761), > datetime.datetime(2015, 1, 12, 11, 10, 45, 834701), > datetime.datetime(2015, 1, 12, 13, 24, 55, 1641), > datetime.datetime(2015, 1, 12, 15, 39, 4, 168581), > datetime.datetime(2015, 1, 12, 17, 53, 13, 335521), > datetime.datetime(2015, 1, 12, 20, 7, 22, 502461)] > ... pprint([datetime.datetime.fromtimestamp(_) for _ in res_t[-10:]]) > [datetime.datetime(2015, 2, 3, 8, 36, 40, 65638), > datetime.datetime(2015, 2, 3, 10, 50, 49, 232578), > datetime.datetime(2015, 2, 3, 13, 4, 58, 399518), > datetime.datetime(2015, 2, 3, 15, 19, 7, 566458), > datetime.datetime(2015, 2, 3, 17, 33, 16, 733398), > datetime.datetime(2015, 2, 3, 19, 47, 25, 900338), > datetime.datetime(2015, 2, 3, 22, 1, 35, 67279), > datetime.datetime(2015, 2, 4, 0, 15, 44, 234219), > datetime.datetime(2015, 2, 4, 2, 29, 53, 401159), > datetime.datetime(2015, 2, 4, 4, 44, 2, 568099)] > > That doesn't look right at all. > > I'm sure I'm using an outdated version of scipy: > > ... scipy.version.version > '0.9.0' > > but it's what I have available (it's a long story). > > If this is a bug requiring upgrade, I'll beat on the powers that be to > get a newer version of scipy. I'm happy to provide my data to anyone > who would be willing to try this exercise out using a more recent > version. > I doubt that an upgrade will fix your issue; I don't see any bug fixes to signal.resample since 0.9.0 that look relevant. 
I don't understand that this works for you at all, a quick test with ``t = list_of_datetimes`` gives me: TypeError: unsupported operand type(s) for /: 'datetime.timedelta' and 'float' If you can provide a reproducible example on a generated set of data, that would be the easiest (we can use that as a regression test). Otherwise providing your code with your actual dataset is also OK - if you send me a link or email it to me I'll have a look. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From skip at pobox.com Tue Jan 20 14:36:35 2015 From: skip at pobox.com (Skip Montanaro) Date: Tue, 20 Jan 2015 19:36:35 +0000 (UTC) Subject: [SciPy-User] scipy.signal.resample muffs my timestamps? References: Message-ID: I managed to download, build and install scipy 0.15.1. I get a similar (though quantitatively different) result. >>> len(t) 430509 >>> len(x) 430509 >>> res_x, res_t = signal.resample(x[::100], 250, t[::100]) >>> len(res_x) 250 >>> len(res_t) 250 >>> t[-1] 1421733595.509921 >>> res_t[-1] 1422456460.5224724 >>> pprint([datetime.datetime.fromtimestamp(t[0]), datetime.datetime.fromtimestamp(t[-1])]) [datetime.datetime(2015, 1, 12, 0, 0), datetime.datetime(2015, 1, 19, 23, 59, 55, 509921)] >>> pprint([datetime.datetime.fromtimestamp(res_t[0]), datetime.datetime.fromtimestamp(res_t[-1])]) [datetime.datetime(2015, 1, 12, 0, 0), datetime.datetime(2015, 1, 28, 8, 47, 40, 522472)] I assume I'm doing something wrong to cause it to expand the range like that. I didn't see any arguments in the help() output which obviously suggested I could change this particular behavior though. From josef.pktd at gmail.com Tue Jan 20 14:52:24 2015 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 20 Jan 2015 14:52:24 -0500 Subject: [SciPy-User] scipy.signal.resample muffs my timestamps? 
In-Reply-To: References: Message-ID: On Tue, Jan 20, 2015 at 2:36 PM, Skip Montanaro wrote: > I managed to download, build and install scipy 0.15.1. I get a > similar (though quantitatively different) result. > > >>> len(t) > 430509 > >>> len(x) > 430509 > >>> res_x, res_t = signal.resample(x[::100], 250, t[::100]) > >>> len(res_x) > 250 > >>> len(res_t) > 250 > >>> t[-1] > 1421733595.509921 > >>> res_t[-1] > 1422456460.5224724 > >>> pprint([datetime.datetime.fromtimestamp(t[0]), > datetime.datetime.fromtimestamp(t[-1])]) > [datetime.datetime(2015, 1, 12, 0, 0), > datetime.datetime(2015, 1, 19, 23, 59, 55, 509921)] > >>> pprint([datetime.datetime.fromtimestamp(res_t[0]), > datetime.datetime.fromtimestamp(res_t[-1])]) > [datetime.datetime(2015, 1, 12, 0, 0), > datetime.datetime(2015, 1, 28, 8, 47, 40, 522472)] > > I assume I'm doing something wrong to cause it to expand the range > like that. I didn't see any arguments in the help() output which > obviously suggested I could change this particular behavior though. > In case it's rounding issues (my guess), you could try to subtract t[0] from t, and add it again after the resample. There is a small chance it helps. Josef > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren.weckesser at gmail.com Tue Jan 20 15:35:47 2015 From: warren.weckesser at gmail.com (Warren Weckesser) Date: Tue, 20 Jan 2015 15:35:47 -0500 Subject: [SciPy-User] scipy.signal.resample muffs my timestamps? In-Reply-To: References: Message-ID: On Tue, Jan 20, 2015 at 2:36 PM, Skip Montanaro wrote: > I managed to download, build and install scipy 0.15.1. I get a > similar (though quantitatively different) result. 
> > >>> len(t) > 430509 > >>> len(x) > 430509 > >>> res_x, res_t = signal.resample(x[::100], 250, t[::100]) > >>> len(res_x) > 250 > >>> len(res_t) > 250 > >>> t[-1] > 1421733595.509921 > >>> res_t[-1] > 1422456460.5224724 > >>> pprint([datetime.datetime.fromtimestamp(t[0]), > datetime.datetime.fromtimestamp(t[-1])]) > [datetime.datetime(2015, 1, 12, 0, 0), > datetime.datetime(2015, 1, 19, 23, 59, 55, 509921)] > >>> pprint([datetime.datetime.fromtimestamp(res_t[0]), > datetime.datetime.fromtimestamp(res_t[-1])]) > [datetime.datetime(2015, 1, 12, 0, 0), > datetime.datetime(2015, 1, 28, 8, 47, 40, 522472)] > > I assume I'm doing something wrong to cause it to expand the range > like that. I didn't see any arguments in the help() output which > obviously suggested I could change this particular behavior though. > > `resample` assumes the samples are uniformly spaced, but your timestamps are not. Here are your first timestamps (from your first email): In [40]: t Out[40]: [datetime.datetime(2015, 1, 12, 0, 0), datetime.datetime(2015, 1, 12, 0, 0, 46, 742044), datetime.datetime(2015, 1, 12, 0, 1, 3, 320089), datetime.datetime(2015, 1, 12, 0, 1, 23, 700560), datetime.datetime(2015, 1, 12, 0, 1, 44, 583401), datetime.datetime(2015, 1, 12, 0, 1, 57, 733937), datetime.datetime(2015, 1, 12, 0, 2, 38, 30245), datetime.datetime(2015, 1, 12, 0, 3, 35, 336342), datetime.datetime(2015, 1, 12, 0, 4, 23, 833251), datetime.datetime(2015, 1, 12, 0, 4, 48, 272131)] `dt` holds the intervals between each timestamp. For `resample` to work as expected, these should all be the same: In [41]: dt = np.array([delta.total_seconds() for delta in np.diff(d)]) In [42]: dt Out[42]: array([ 46.742044, 16.578045, 20.380471, 20.882841, 13.150536, 40.296308, 57.306097, 48.496909, 24.43888 ]) By the way, it might be just luck that `resample` didn't crash when given a sequence of `datetime.datetime` objects for `t`. 
I don't think any of the functions in scipy.signal were explicitly designed to handle `datetime` objects. (There are no tests of such input in the test suite.) In this case, it "works" because of the formula used to create the new time values. Because `resample` assumes the input is uniformly sampled, it needs only the first time difference to figure out the new timestamps. Here's how the new time values are computed in `resample` (`Nx` and `num` are the old and new number of samples, respectively): new_t = arange(0, num) * (t[1] - t[0]) * Nx / float(num) + t[0] I.e. new_t = arange(0, num) * new_dt + t[0] where new_dt = (t[1] - t[0]) * Nx / float(num) `t[1] - t[0]` is a `datetime.timedelta` object, and `new_t` ends up as an array (with object dtype) of `datetime.datetime` instances. Warren > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gb.gabrielebrambilla at gmail.com Tue Jan 20 15:58:35 2015 From: gb.gabrielebrambilla at gmail.com (Gabriele Brambilla) Date: Tue, 20 Jan 2015 15:58:35 -0500 Subject: [SciPy-User] Solving complex equations Message-ID: Hi, is there in python any function that solve complex equations? thanks Gabriele -------------- next part -------------- An HTML attachment was scrubbed... URL: From alan.isaac at gmail.com Tue Jan 20 16:24:18 2015 From: alan.isaac at gmail.com (Alan G Isaac) Date: Tue, 20 Jan 2015 16:24:18 -0500 Subject: [SciPy-User] Solving complex equations In-Reply-To: References: Message-ID: <54BEC782.1060401@gmail.com> On 1/20/2015 3:58 PM, Gabriele Brambilla wrote: > is there in python any function that solve complex equations? 
http://docs.scipy.org/doc/scipy-0.14.0/reference/optimize.nonlin.html hth, Alan Isaac From peterskurt at msn.com Tue Jan 20 19:47:08 2015 From: peterskurt at msn.com (KURT PETERS) Date: Tue, 20 Jan 2015 17:47:08 -0700 Subject: [SciPy-User] SciPy and MATLAB give different results for 'buttord' function In-Reply-To: References: Message-ID: > Date: Mon, 19 Jan 2015 18:54:52 -0700 > From: KURT PETERS > Subject: Re: [SciPy-User] SciPy and MATLAB give different results for > 'buttord' function (Renan Birck Pinheiro) > To: "scipy-user at scipy.org" > Message-ID: > Content-Type: text/plain; charset="iso-8859-1" > > > Date: Mon, 19 Jan 2015 22:07:04 -0200 > > From: Renan Birck Pinheiro > > Subject: [SciPy-User] SciPy and MATLAB give different results for > > 'buttord' function > > To: scipy-user at scipy.org > > Message-ID: > > > > Content-Type: text/plain; charset="utf-8" > > > > I'm trying to design a analog Butterworth filter using the 'buttord' > > function. > > My settings are: > > > > Passband frequency (Fpass) = 10 Hz -> Wp = 2*pi*10 Hz > > Stopband frequency (Fstop) = 100 Hz -> Ws = 2*pi*100 Hz > > > > The passband and stopband losses/attenuations (Rp, Rs) are 3 and 80 dB > > respectively. > > > > In MATLAB I use the command line > > > > [N, Wn] = buttord(Wp, Ws, Rp, Rs, 's') > > > > that gives me N = 5, Wn = 99.581776302. > > > > But in SciPy I did this: > > > > from numpy import pi > > from scipy import signal > > Wp = 2 * pi * 10 > > Ws = 2 * pi * 100 > > Rp = 3 > > Rs = 80 > > (N, Wn) = signal.buttord(Wp, Ws, Rp, Rs, analog=True) > > > > and I get N = 5 and Wn = 62.861698649592753. Wn is different than the value > > that MATLAB gives. > > > > What am I doing wrong here? I read both the documentation of MATLAB and > > SciPy and couldn't find the problem. > > > > ?Thanks,? > > > > -- > > Renan Birck Pinheiro - Chip Inside Tecnologia > > Acad. 
Engenharia Elétrica - UFSM > > - Santa Maria, Brasil > > http://renanbirck.blogspot.com - +55 55 91162798 / +55 55 99034839 > > I think Wp and Ws are supposed to be between 0 and 1. I tried it myself and I get your MATLAB answer: file: buttest.py: #!/usr/bin/env python from numpy import pi from scipy import signal Wp = 2 * pi * 10 Ws = 2 * pi * 100 Rp = 3 Rs = 80 (N, Wn) = signal.buttord(Wp, Ws, Rp, Rs, analog=True) print(N,Wn) ~]$ ./buttest.py (5, 99.581776302787929) -------------- next part -------------- An HTML attachment was scrubbed... URL: From renan.ee.ufsm at gmail.com Tue Jan 20 19:53:51 2015 From: renan.ee.ufsm at gmail.com (Renan Birck Pinheiro) Date: Tue, 20 Jan 2015 22:53:51 -0200 Subject: [SciPy-User] SciPy and MATLAB give different results for 'buttord' function In-Reply-To: References: Message-ID: 2015-01-20 22:47 GMT-02:00 KURT PETERS : > > I tried it myself and I get your MATLAB answer: > file: buttest.py: > > #!/usr/bin/env python > from numpy import pi > from scipy import signal > Wp = 2 * pi * 10 > Ws = 2 * pi * 100 > Rp = 3 > Rs = 80 > (N, Wn) = signal.buttord(Wp, Ws, Rp, Rs, analog=True) > print(N,Wn) > > ~]$ ./buttest.py > (5, 99.581776302787929) > Which version of SciPy are you using? I tried with 0.14.0 and 0.15.0 but couldn't reproduce your result. While poking around in the issue tracker, I found this change: https://github.com/scipy/scipy/pull/3235/files where W0 = nat / ((10 ** (0.1 * abs(gstop)) - 1) ** (1.0 / (2.0 * ord))) was changed to W0 = (GPASS - 1.0) ** (-1.0 / (2.0 * ord)), which might explain something. The first implementation is how MATLAB calculates Wn. Renan -- Renan Birck Pinheiro - Chip Inside Tecnologia Acad. Engenharia Elétrica - UFSM - Santa Maria, Brasil http://renanbirck.blogspot.com - +55 55 91162798 / +55 55 99034839 *Talk is cheap, show me the code*. - Linus Torvalds -------------- next part -------------- An HTML attachment was scrubbed... 
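[Editor's note: the two W0 expressions quoted from that diff can be evaluated side by side. A sketch under the assumption — based on the quoted diff, not a full reading of scipy's buttord source — that `nat` is the stopband/passband edge ratio, `GPASS = 10**(0.1*abs(gpass))`, and the returned Wn is W0 times the passband edge:]

```python
import numpy as np

gpass, gstop, order = 3, 80, 5                    # dB, dB, filter order
passb, stopb = 2 * np.pi * 10, 2 * np.pi * 100    # band edges, rad/s
nat = stopb / passb                               # normalized stopband edge

# Pre-PR-3235 formula (the MATLAB convention): meet the stopband spec exactly.
W0_old = nat / ((10 ** (0.1 * abs(gstop)) - 1) ** (1.0 / (2.0 * order)))

# Post-PR-3235 formula: meet the passband spec exactly.
GPASS = 10 ** (0.1 * abs(gpass))
W0_new = (GPASS - 1.0) ** (-1.0 / (2.0 * order))

print(W0_old * passb)   # ~99.58, the MATLAB answer
print(W0_new * passb)   # ~62.86, the SciPy 0.15 answer
```

This reproduces both reported Wn values from the same inputs, which supports the reading that the discrepancy is a change of convention (which spec edge is met exactly) rather than a wrong answer on either side.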
URL: From sturla.molden at gmail.com Tue Jan 20 20:09:37 2015 From: sturla.molden at gmail.com (Sturla Molden) Date: Wed, 21 Jan 2015 02:09:37 +0100 Subject: [SciPy-User] SciPy and MATLAB give different results for 'buttord' function In-Reply-To: References: Message-ID: On 21/01/15 01:47, KURT PETERS wrote: > #!/usr/bin/env python > from numpy import pi > from scipy import signal > Wp = 2 * pi * 10 > Ws = 2 * pi * 100 > Rp = 3 > Rs = 80 > (N, Wn) = signal.buttord(Wp, Ws, Rp, Rs, analog=True) > print(N,Wn) > > ~]$ ./buttest.py > (5, 99.581776302787929) from numpy import pi from scipy import signal Wp = 2 * pi * 10 Ws = 2 * pi * 100 Rp = 3 Rs = 80 (N, Wn) = signal.buttord(Wp, Ws, Rp, Rs, analog=True) print(N,Wn) 5 62.8616986496 SciPy 0.15 on Python 3.4 (Anaconda) Sturla From sturla.molden at gmail.com Tue Jan 20 20:11:49 2015 From: sturla.molden at gmail.com (Sturla Molden) Date: Wed, 21 Jan 2015 02:11:49 +0100 Subject: [SciPy-User] SciPy and MATLAB give different results for 'buttord' function In-Reply-To: References: Message-ID: On 21/01/15 01:47, KURT PETERS wrote: > > > and I get N = 5 and Wn = 62.861698649592753. Wn is different than > the value > > > that MATLAB gives. Are you sure that Matlab produces the correct answer? Sturla From edrisse.chermak at gmail.com Wed Jan 21 00:29:09 2015 From: edrisse.chermak at gmail.com (Edrisse Chermak) Date: Wed, 21 Jan 2015 08:29:09 +0300 Subject: [SciPy-User] solving integro-differential equation over a square with variable boundary condition on one edge (scipy.optimize) Message-ID: <54BF3925.6010909@gmail.com> Dear Scipy users, I'm trying to solve a nonlinear integrodifferential equation over a square with Scipy by introducing a *variable* boundary condition on one edge of the square. Scipy documentation example on nonlinear solvers [1] gives a sample code on how to solve this equation, but with a set of 4 *constant* boundary conditions on each edge : P(x,1) = 1 and P=0 on any other boundary of the square [0,1]x[0,1].
That means the top edge boundary condition is P=1 and P=0 for the 3 other edges :

from scipy.optimize import newton_krylov
from numpy import cosh, zeros_like, mgrid, zeros

# Defining 2D grid of 75x75 dots :
nx, ny = 75, 75
# Defining grid steps
hx, hy = 1./(nx-1), 1./(ny-1)
# Boundary Conditions for function P(x,y) over square [0,1]x[0,1]
P_left, P_right = 0, 0   # P(1,y)=0 and P(0,y)=0
P_top, P_bottom = 1, 0   # P(x,1)=1 and P(x,0)=0

# Defining integro-differential equation for function P(x,y)
def residual(P):
    d2x = zeros_like(P)
    d2y = zeros_like(P)
    # Central Difference Approximation
    d2x[1:-1] = (P[2:] - 2*P[1:-1] + P[:-2]) / hx/hx
    d2x[0] = (P[1] - 2*P[0] + P_left) / hx/hx
    d2x[-1] = (P_right - 2*P[-1] + P[-2]) / hx/hx
    d2y[:,1:-1] = (P[:,2:] - 2*P[:,1:-1] + P[:,:-2]) / hy/hy
    d2y[:,0] = (P[:,1] - 2*P[:,0] + P_bottom) / hy/hy
    d2y[:,-1] = (P_top - 2*P[:,-1] + P[:,-2]) / hy/hy
    # expression of the integro-differential equation
    return d2x + d2y - 10*cosh(P).mean()**2

# Defining a guess starting solution
guess = zeros((nx, ny), float)
# Iterating to find the solution
sol = newton_krylov(residual, guess, method='lgmres', verbose=1)
print('Residual: %g' % abs(residual(sol)).max())
# visualize the solution
import matplotlib.pyplot as plt
x, y = mgrid[0:1:(nx*1j), 0:1:(ny*1j)]
plt.pcolor(x, y, sol)
plt.colorbar()
plt.show()

I want to solve the same problem, with a *variable* boundary condition on the top edge that is:

P_top = P(x,1) = 1/(0.5 - abs(x))

That means that P would have a different value on the top edge of the square, as shown on the following figure :

        y
        /\                 (P_top)
        |          P(x,1) = 1/(0.5-|x|)
       1|__________________
        |                  |
        |                  |
P(0,y)=0|                  | P(1,y)=0
(P_left)|                  | (P_right)
        |                  |
        |__________________|_____> x
        0                  1
              P(x,0)=0
             (P_bottom)

I tried to insert P_top with the formula shown on line 11 of the above code but I got an error message that variable 'x' is not defined. Would someone have any idea on how to properly define this P_top variable boundary condition in the residual ?
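[Editor's note] The x-dependent condition asked for here can be expressed by sampling the formula on the x grid, so P_top becomes a length-nx array that broadcasts elementwise in the residual. A minimal sketch with the grid size from the script above (note that 1/(0.5 - x) is singular at x = 0.5, so the grid spacing matters):

```python
import numpy as np

nx = 75                          # number of grid points in x, as above
x = np.linspace(0.0, 1.0, nx)    # x coordinate of each column of P

# Variable top-edge condition P(x, 1) = 1/(0.5 - x); silence the
# divide-by-zero warning if the grid happens to contain x = 0.5 exactly.
with np.errstate(divide='ignore'):
    P_top = 1.0 / (0.5 - x)

# In residual(), nothing else needs to change: the line
#   d2y[:, -1] = (P_top - 2*P[:, -1] + P[:, -2]) / hy/hy
# now broadcasts the length-nx array instead of a scalar.
print(P_top.shape)
```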
Thanks in advance, [1] http://docs.scipy.org/doc/scipy-0.14.0/reference/optimize.nonlin.html -------------- next part -------------- An HTML attachment was scrubbed... URL: From klemm at phys.ethz.ch Wed Jan 21 02:37:59 2015 From: klemm at phys.ethz.ch (Hanno Klemm) Date: Wed, 21 Jan 2015 08:37:59 +0100 Subject: [SciPy-User] solving integro-differential equation over a square with variable boundary condition on one edge (scipy.optimize) In-Reply-To: <54BF3925.6010909@gmail.com> References: <54BF3925.6010909@gmail.com> Message-ID: <3018DC80-9BFF-4EDD-A87C-EDAEB00206C5@phys.ethz.ch> , please excuse my brevity. > On 21.01.2015, at 06:29, Edrisse Chermak wrote: > > Dear Scipy users, > > I'm trying to solve a nonlinear integrodifferential equation over a square with Scipy by introducing a variable boundary condition on one edge of the square. Scipy documentation example on nonlinear solvers [1] gives a sample code on how to do solve this equation, but with a set of 4 constant boundary conditions on each edge : P(x,1) = 1 and P=0 on any other boundary of the square [0,1]x[0,1]. 
That means the top edge boundary condition is P=1 and P=0 for the 3 other edges : > from scipy.optimize import newton_krylov > from numpy import cosh, zeros_like, mgrid, zeros > > # Defining 2D grid of 75x75 dots : > nx, ny = 75, 75 > # Defining grid steps > hx, hy = 1./(nx-1), 1./(ny-1) > # Boundary Conditions for function P(x,y) over square [0,1]x[0,1] > P_left, P_right = 0, 0 # P(1,y)=0 and P(0,y)=0 > P_top, P_bottom = 1, 0 # P(x,1)=1 and P(x,0)=0 > > # Defining integro-differential equation for function P(x,y) > def residual(P): > d2x = zeros_like(P) > d2y = zeros_like(P) > # Central Difference Approximation > d2x[1:-1] = (P[2:] - 2*P[1:-1] + P[:-2]) / hx/hx > d2x[0] = (P[1] - 2*P[0] + P_left)/hx/hx > d2x[-1] = (P_right - 2*P[-1] + P[-2])/hx/hx > d2y[:,1:-1] = (P[:,2:] - 2*P[:,1:-1] + P[:,:-2])/hy/hy > d2y[:,0] = (P[:,1] - 2*P[:,0] + P_bottom)/hy/hy > d2y[:,-1] = (P_top - 2*P[:,-1] + P[:,-2])/hy/hy > # expression of the integro-differential equation > return d2x + d2y - 10*cosh(P).mean()**2 > > # Defining a guess starting solution > guess = zeros((nx, ny), float) > # Iterating to find the solution > sol = newton_krylov(residual, guess, method='lgmres', verbose=1) > print('Residual: %g' % abs(residual(sol)).max()) > # visualize the solution > import matplotlib.pyplot as plt > x, y = mgrid[0:1:(nx*1j), 0:1:(ny*1j)] > plt.pcolor(x, y, sol) > plt.colorbar() > plt.show() > I want to solve the same problem, with a variable boundary condition on the top edge that is: > > P_top = P(x,1) = 1/(0.5 - abs(x)) > That means that P would have a different value on the top edge of the square, as shown on the following figure : > > y > /\ (P_top) > | P(x,1)=1/(0.5-|x|) > 1|__________________ > | | > | | > P(0,y)=0 | | P(1,y)=0 > (P_left) | | (P_right) > | | > | | > |__________________|_____> x > 0 1 > P(x,0)=0 > (P_bottom) > > I tried to insert P_top with the formula shown on line 11 of below code but I got an error message that variable 'x' is not defined. 
Would someone have any idea on how to define properly this P_top variable boundary condition in the residual ? > > Thanks in advance, > > [1] http://docs.scipy.org/doc/scipy-0.14.0/reference/optimize.nonlin.html > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user P_top =1./(0.5 - np.linspace(0,1,nx)) should do the trick. For the interval 0,1 you don't need to take the modulus. Best, Hanno hanno.klemm at me.com Sent from my mobile device -------------- next part -------------- An HTML attachment was scrubbed... URL: From edrisse.chermak at gmail.com Wed Jan 21 02:54:12 2015 From: edrisse.chermak at gmail.com (Edrisse Chermak) Date: Wed, 21 Jan 2015 10:54:12 +0300 Subject: [SciPy-User] solving integro-differential equation over a square with variable boundary condition on one edge (scipy.optimize) In-Reply-To: <3018DC80-9BFF-4EDD-A87C-EDAEB00206C5@phys.ethz.ch> References: <54BF3925.6010909@gmail.com> <3018DC80-9BFF-4EDD-A87C-EDAEB00206C5@phys.ethz.ch> Message-ID: <54BF5B24.80203@gmail.com> Dear Hanno, It works perfectly now with your trick.. thanks so much for your kind help, I really appreciate. Best wishes, Edrisse On 01/21/2015 10:37 AM, Hanno Klemm wrote: > , please excuse my brevity. > > On 21.01.2015, at 06:29, Edrisse Chermak > wrote: > >> Dear Scipy users, >> >> I'm trying to solve a nonlinear integrodifferential equation over a >> square with Scipy by introducing a *variable* boundary condition on >> one edge of the square.Scipy documentation example on nonlinear >> solvers >> >> [1] gives a sample code on how to do solve this equation, but with a >> set of 4 *constant* boundary conditions on each edge : P(x,1) = 1 and >> P=0 on any other boundary of the square [0,1]x[0,1]. 
That means the >> top edge boundary condition is P=1 and P=0 for the 3 other edges : >> |from scipy.optimizeimport newton_krylov >> from numpyimport cosh, zeros_like, mgrid, zeros >> >> # Defining 2D grid of 75x75 dots : >> nx, ny= 75, 75 >> # Defining grid steps >> hx, hy= 1./(nx-1), 1./(ny-1) >> # Boundary Conditions for function P(x,y) over square [0,1]x[0,1] >> P_left, P_right= 0, 0 # P(1,y)=0 and P(0,y)=0 >> P_top, P_bottom= 1, 0 # P(x,1)=1 and P(x,0)=0 >> >> # Defining integro-differential equation for function P(x,y) >> def residual(P): >> d2x= zeros_like(P) >> d2y= zeros_like(P) >> # Central Difference Approximation >> d2x[1:-1] = (P[2:] - 2*P[1:-1] + P[:-2]) / hx/hx >> d2x[0] = (P[1] - 2*P[0] + P_left)/hx/hx >> d2x[-1] = (P_right- 2*P[-1] + P[-2])/hx/hx >> d2y[:,1:-1] = (P[:,2:] - 2*P[:,1:-1] + P[:,:-2])/hy/hy >> d2y[:,0] = (P[:,1] - 2*P[:,0] + P_bottom)/hy/hy >> d2y[:,-1] = (P_top- 2*P[:,-1] + P[:,-2])/hy/hy >> # expression of the integro-differential equation >> return d2x+ d2y- 10*cosh(P).mean()**2 >> >> # Defining a guess starting solution >> guess= zeros((nx, ny), float) >> # Iterating to find the solution >> sol= newton_krylov(residual, guess, method='lgmres', verbose=1) >> print('Residual: %g' % abs(residual(sol)).max()) >> # visualize the solution >> import matplotlib.pyplotas plt >> x, y= mgrid[0:1:(nx*1j), 0:1:(ny*1j)] >> plt.pcolor(x, y, sol) >> plt.colorbar() >> plt.show()| >> >> I want to solve the same problem, with a *variable* boundary >> condition on the top edge that is: >> >> |P_top= P(x,1) = 1/(0.5 - abs(x))| >> >> That means that P would have a different value on the top edge of the >> square, as shown on the following figure : >> >> y >> /\ *(P_top)* >> | *P(x,1)=1/(0.5-|x|)* >> 1|__________________ >> | | >> | | >> P(0,y)=0 | | P(1,y)=0 >> (P_left) | | (P_right) >> | | >> | | >> |__________________|_____> x >> 0 1 >> P(x,0)=0 >> (P_bottom) >> >> I tried to insert P_top with the formula shown on line 11 of below >> code but I got an 
error message that variable 'x' is not defined. >> Would someone have any idea on how to define properly this P_top >> variable boundary condition in the residual ? >> >> Thanks in advance, >> >> [1] http://docs.scipy.org/doc/scipy-0.14.0/reference/optimize.nonlin.html >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-user > > P_top =1./(0.5- np.linspace(0,1,nx)) > > should do the trick. For the interval 0,1 you don't need to take the > modulus. > > Best, > Hanno > > hanno.klemm at me.com > > Sent from my mobile device > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user -- Edrisse -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren.weckesser at gmail.com Wed Jan 21 11:44:48 2015 From: warren.weckesser at gmail.com (Warren Weckesser) Date: Wed, 21 Jan 2015 11:44:48 -0500 Subject: [SciPy-User] SciPy and MATLAB give different results for 'buttord' function In-Reply-To: References: Message-ID: On 1/20/15, Renan Birck Pinheiro wrote: > 2015-01-20 22:47 GMT-02:00 KURT PETERS : > >> >> I tried it myself and I get your MATLAB answer: >> file: buttest.py: >> >> #!/usr/bin/env python >> ?? >> from numpy import pi >> from scipy import signal >> Wp = 2 * pi * 10 >> Ws = 2 * pi * 100 >> Rp = 3 >> Rs = 80 >> (N, Wn) = signal.buttord(Wp, Ws, Rp, Rs, analog=True) >> print(N,Wn) >> >> ~]$ ./buttest.py >> (5, 99.581776302787929) >> > > Which version of SciPy are you using? I tried with 0.14.0 and 0.15.0 but > couldn't reproduce your result. > > While poking around in the issue tracker, I found this change: > https://github.com/scipy/scipy/pull/3235/files > > where > W0 = nat / ((10 ** (0.1 * abs(gstop)) - 1) ** (1.0 / (2.0 * ord))) > was changed to > W0 = (GPASS - 1.0) ** (-1.0 / (2.0 * ord)). > > which might explain something. 
The first implementation is how MATLAB > calculates Wn. When you design a Butterworth filter with buttord, there aren't enough degrees of freedom to meet all the design constraints exactly. So there is a choice of which end of the transition region hits the constraints and which end is "over-designed". The change made in 0.14.0 switched that choice from the stop-band edge to the pass-band edge. A picture will make it clear. The attached script generates the attached plot. (I changed Rp from 3 to 1.5. -3 dB coincides with the gain at Wn, that's why your Wn was the same as Wp.) The filters generated using either the old or new convention both satisfy the design constraints, so they are both correct. With the new convention, the response just bumps against the constraint at the end of the pass-band. With the old convention, the response hits the constraint at the edge of the stop-band. Warren > > Renan > -- > Renan Birck Pinheiro - Chip Inside Tecnologia > > Acad. Engenharia Elétrica - UFSM > - Santa Maria, Brasil > http://renanbirck.blogspot.com - +55 55 91162798 / +55 55 99034839 > > *Talk is cheap, show me the code*. - Linus Torvalds > -------------- next part -------------- A non-text attachment was scrubbed... Name: buttord_question.py Type: text/x-python Size: 1666 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: buttord_results.png Type: image/png Size: 49118 bytes Desc: not available URL: From renan.ee.ufsm at gmail.com Wed Jan 21 12:42:59 2015 From: renan.ee.ufsm at gmail.com (Renan Birck Pinheiro) Date: Wed, 21 Jan 2015 15:42:59 -0200 Subject: [SciPy-User] SciPy and MATLAB give different results for 'buttord' function In-Reply-To: References: Message-ID: 2015-01-21 14:44 GMT-02:00 Warren Weckesser : > > When you design a Butterworth filter with buttord, there aren't enough > degrees of freedom to meet all the design constraints exactly.
So > there is a choice of which end of the transition region hits the > constraints and which end is "over-designed". The change made in > 0.14.0 switched that choice from the stop-band edge to the pass-band > edge. > > A picture will make it clear. The attached script generates the > attached plot. (I changed Rp from 3 to 1.5. -3 dB coincides with the > gain at Wn, that's why your Wn was the same as Wp.) The filters > generated using either the old or new convention both satisfy the > design constraints, so they are both correct. With the new > convention, the response just bumps against the constraint at the end > of the pass-band. With the old convention, the response hits the > constraint at the edge of the stop-band. Hello, This pretty much explains it. Thank you. Renan -- Renan Birck Pinheiro - Chip Inside Tecnologia Acad. Engenharia Elétrica - UFSM - Santa Maria, Brasil http://renanbirck.blogspot.com - +55 55 91162798 / +55 55 99034839 *Talk is cheap, show me the code*. - Linus Torvalds -------------- next part -------------- An HTML attachment was scrubbed... URL: From toddrjen at gmail.com Sat Jan 24 04:10:13 2015 From: toddrjen at gmail.com (Todd) Date: Sat, 24 Jan 2015 10:10:13 +0100 Subject: [SciPy-User] General scientific python mailing list Message-ID: Hi, I see a lot of mailing lists for specific scientific python projects, but no mailing list for scientific python in general.
Is there any mailing > list for general scientific python users? > Depends. For code/user questions and discussions there's not really a more general list than numpy-discussion or scipy-user. For discussion on community issues/needs/infrastructure the NumFOCUS list ( https://groups.google.com/forum/#!forum/numfocus) is more appropriate. Cheers, Ralf > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From almar.klein at gmail.com Mon Jan 26 06:29:34 2015 From: almar.klein at gmail.com (Almar Klein) Date: Mon, 26 Jan 2015 12:29:34 +0100 Subject: [SciPy-User] General scientific python mailing list In-Reply-To: References: Message-ID: <54C6251E.2090403@gmail.com> > Hi, I see a lot of mailing lists for specific scientific python > projects, but no mailing list for scientific python in general. Is > there any mailing list for general scientific python users? > > > Depends. For code/user questions and discussions there's not really a > more general list than numpy-discussion or scipy-user. Since Scipy is also the name of the ecosystem (as is stated on scipy.org), I suppose this mailing list comes closest to a general scientific Python list. Maybe it's worth considering having one list for scipy-the-ecosystem and one for scipy-the-library. - Almar From zruan1991 at gmail.com Wed Jan 28 23:04:54 2015 From: zruan1991 at gmail.com (Zheng Ruan) Date: Wed, 28 Jan 2015 23:04:54 -0500 Subject: [SciPy-User] Help of understanding C code in weave Message-ID: Hi Scipy users, I am trying to read code that uses scipy.weave and I don't understand some of the code in C. To make it easy and clear, I'll just post the part that confused me. I have a numpy array (a) with a shape of (2, 623, 3, 333). And another array numpy array (d) with a shape of (1, 623, 623). 
In the C code part I have something like this: code = """ ... c = *(a + b); *(d+b) += 1; ... """ In the above code, b and c are float type in C. I just don't understand how the c value are calculated and what does "*(d+b) += 1;" do. The code is very old and I saw some deprecated warnings when I compile it. Thank you so much and any hints are welcome!!! Zheng -------------- next part -------------- An HTML attachment was scrubbed... URL: From msarahan at gmail.com Thu Jan 29 00:35:42 2015 From: msarahan at gmail.com (Michael Sarahan) Date: Wed, 28 Jan 2015 21:35:42 -0800 Subject: [SciPy-User] Help of understanding C code in weave In-Reply-To: References: Message-ID: Hi, That's not very much code to tell what exactly is going on. Based on what little is there, I'd guess that "a" and "d" are pointers to your data, and "b" is an integer offset that moves you through each array. Are you sure that b is float? It makes more sense if it is an int. Those two lines are doing the following: 1. assigning the value of c to the value of a (at offset b) 2. incrementing the value of d (at offset b) by 1. In both cases, you're probably getting hung up on pointers. You might benefit from reading (at least): http://stackoverflow.com/questions/4955198/what-does-dereferencing-a-pointer-mean Google has loads more resources, of course. Pay attention to pointers with high-dimensional data. Incrementing a pointer in a loop can go through your whole (contiguous) memory array (not just one dimension), but exactly how depends on the memory layout of your data. With Python/numpy/C, this should be row-major. Hope this helps. Michael On Wed, Jan 28, 2015 at 8:04 PM, Zheng Ruan wrote: > Hi Scipy users, > > I am trying to read code that uses scipy.weave and I don't understand some > of the code in C. To make it easy and clear, I'll just post the part that > confused me. > > I have a numpy array (a) with a shape of (2, 623, 3, 333). And another > array numpy array (d) with a shape of (1, 623, 623). 
> > In the C code part I have something like this: > > code = """ > ... > c = *(a + b); > *(d+b) += 1; > ... > """ > > In the above code, b and c are float type in C. I just don't understand > how the c value are calculated and what does "*(d+b) += 1;" do. > > The code is very old and I saw some deprecated warnings when I compile it. > > Thank you so much and any hints are welcome!!! > > Zheng > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From msarahan at gmail.com Thu Jan 29 00:37:45 2015 From: msarahan at gmail.com (Michael Sarahan) Date: Wed, 28 Jan 2015 21:37:45 -0800 Subject: [SciPy-User] Help of understanding C code in weave In-Reply-To: References: Message-ID: Sorry, point 1 doesn't read well. Here's a better statement: storing the value of a (at offset b) into c. On Wed, Jan 28, 2015 at 9:35 PM, Michael Sarahan wrote: > Hi, > > That's not very much code to tell what exactly is going on. Based on what > little is there, I'd guess that "a" and "d" are pointers to your data, and > "b" is an integer offset that moves you through each array. Are you sure > that b is float? It makes more sense if it is an int. > > Those two lines are doing the following: > > 1. assigning the value of c to the value of a (at offset b) > 2. incrementing the value of d (at offset b) by 1. > > In both cases, you're probably getting hung up on pointers. You might > benefit from reading (at least): > > http://stackoverflow.com/questions/4955198/what-does-dereferencing-a-pointer-mean > > Google has loads more resources, of course. Pay attention to pointers > with high-dimensional data. Incrementing a pointer in a loop can go > through your whole (contiguous) memory array (not just one dimension), but > exactly how depends on the memory layout of your data. 
With > Python/numpy/C, this should be row-major. > > Hope this helps. > Michael > > On Wed, Jan 28, 2015 at 8:04 PM, Zheng Ruan wrote: > >> Hi Scipy users, >> >> I am trying to read code that uses scipy.weave and I don't understand >> some of the code in C. To make it easy and clear, I'll just post the part >> that confused me. >> >> I have a numpy array (a) with a shape of (2, 623, 3, 333). And another >> array numpy array (d) with a shape of (1, 623, 623). >> >> In the C code part I have something like this: >> >> code = """ >> ... >> c = *(a + b); >> *(d+b) += 1; >> ... >> """ >> >> In the above code, b and c are float type in C. I just don't understand >> how the c value are calculated and what does "*(d+b) += 1;" do. >> >> The code is very old and I saw some deprecated warnings when I compile it. >> >> Thank you so much and any hints are welcome!!! >> >> Zheng >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-user >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From santoshvkushwaha at gmail.com Thu Jan 29 03:22:48 2015 From: santoshvkushwaha at gmail.com (Santosh Kushwaha) Date: Thu, 29 Jan 2015 08:22:48 +0000 Subject: [SciPy-User] Help of understanding C code in weave In-Reply-To: References: Message-ID: <54C9EDD8.5060209@gmail.com> On 01/29/2015 04:04 AM, Zheng Ruan wrote: > Hi Scipy users, > > I am trying to read code that uses scipy.weave and I don't understand > some of the code in C. To make it easy and clear, I'll just post the > part that confused me. > > I have a numpy array (a) with a shape of (2, 623, 3, 333). And another > array numpy array (d) with a shape of (1, 623, 623). > > In the C code part I have something like this: > > code = """ > ... > c = *(a + b); > *(d+b) += 1; > ... > """ > > In the above code, b and c are float type in C. 
I just don't > understand how the c value is calculated and what does "*(d+b) += 1;" do. > > The code is very old and I saw some deprecated warnings when I compile it. > > Thank you so much and any hints are welcome!!! > > Zheng > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user In C, the name of an array can be treated as a pointer to the beginning of a memory block as big as the array. The only difference is that the compiler keeps some extra information for arrays to track their storage requirements. You can check this with sizeof(pointer) and sizeof(array); the latter should give the total size of all elements in the array. So, back to your question: as 'a' and 'd' are arrays, they can be used just like pointers, i.e. array = pointer. Putting an asterisk (the indirection operator) in front of a pointer or array variable dereferences it, i.e. yields the value at the beginning of the array, array[0]; *array and array[0] are the same. To retrieve the value at a particular index/offset, *(array+index) and array[index] are likewise the same. So in c = *(a+b); 'a' is the array, 'b' is the offset into it, and *(a+b) is the value of the array at position 'b'; you are assigning the value of *(a+b) to the variable 'c'. In *(d+b) += 1; the value at *(d+b) is incremented by 1. And 'b' and 'c', which are float, would be implicitly cast to int by numpy -- Regards, Santosh Kushwaha -------------- next part -------------- An HTML attachment was scrubbed...
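[Editor's note] For readers porting such weave kernels back to Python, the flat-offset arithmetic above has a direct NumPy counterpart: for a C-contiguous array, the offset b in *(a + b) indexes the raveled array. A small illustration (deliberately tiny shapes, not the poster's real ones):

```python
import numpy as np

a = np.arange(24, dtype=float).reshape(2, 3, 4)  # C-contiguous by default

b = 13                    # a flat offset, as in  c = *(a + b);
c = a.ravel()[b]          # the value at that offset

# The same element through multi-dimensional indices:
idx = np.unravel_index(b, a.shape)   # row-major, matching C layout
assert c == a[idx]

d = np.zeros(a.size)
d[b] += 1.0               # the NumPy analogue of  *(d + b) += 1;
print(c, idx, d[b])
```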
URL: From daan.wynen at inria.fr Thu Jan 29 09:58:54 2015 From: daan.wynen at inria.fr (Daan Wynen) Date: Thu, 29 Jan 2015 15:58:54 +0100 Subject: [SciPy-User] fmin_l_bfgs_b stdout gets mixed into following Python stdout Message-ID: <54CA4AAE.6080309@inria.fr> Hi, I couldn't find anything about this issue, and it is not critical but annoying. Whenever I run the solver, the standard output order gets mixed up. What I want to see is the following (blank lines stripped out for saving space): ... ME SAYS: Unoptimized objective: 0.947251976788 RUNNING THE L-BFGS-B CODE * * * Machine precision = 2.220D-16 N = 50 M = 20 At X0 0 variables are exactly at the bounds At iterate 0 f= 9.47252D-01 |proj g|= 1.04635D-01 At iterate 1 f= 7.95655D-01 |proj g|= 9.16550D-02 ... At iterate 3769 f= 7.12543D-05 |proj g|= 2.07939D-05 At iterate 3770 f= 7.12527D-05 |proj g|= 1.51021D-04 * * * Tit = total number of iterations ... Total User time 0.000E+00 seconds. ME SAYS: After optimization: 7.1252665476e-05 ... Instead, what I get is this: ... ME SAYS: Unoptimized objective: 0.947251976788 RUNNING THE L-BFGS-B CODE * * * Machine precision = 2.220D-16 N = 50 M = 20 At X0 0 variables are exactly at the bounds At iterate 0 f= 9.47252D-01 |proj g|= 1.04635D-01 At iterate 1 f= 7.95655D-01 |proj g|= 9.16550D-02 ... At iterate 3147 f= 8.09457D-05 |proj g|= 5.97659D-05 ME SAYS: After optimization: 7.1252665476e-05 At iterate 3148 f= 8.09298D-05 |proj g|= 5.19516D-05 ... At iterate 3769 f= 7.12543D-05 |proj g|= 2.07939D-05 At iterate 3770 f= 7.12527D-05 |proj g|= 1.51021D-04 * * * Tit = total number of iterations ... Total User time 0.000E+00 seconds. ... Obviously, when I run an experiment, I would like to see the result at the end, and not buried somewhere in the middle of the iteration outputs. I tried passing callback=lambda _: sys.stdout.flush() and putting another flush after the whole optimization, but it didn't help. 
The example is on a very fast toy calculation, but this also happens when each iteration takes multiple seconds. I am using: python 2.7.3 (virtualenv from Fedora 18 distribution) scipy 0.14.1 numpy 1.9.1 But this also happens on Canopy. Is this maybe just to be expected when interfacing with fortran, or am I using this the wrong way? Best Regards Daan Wynen -------------- next part -------------- An HTML attachment was scrubbed... URL: From deshpande.jaidev at gmail.com Thu Jan 29 10:25:12 2015 From: deshpande.jaidev at gmail.com (Jaidev Deshpande) Date: Thu, 29 Jan 2015 20:55:12 +0530 Subject: [SciPy-User] fmin_l_bfgs_b stdout gets mixed into following Python stdout In-Reply-To: <54CA4AAE.6080309@inria.fr> References: <54CA4AAE.6080309@inria.fr> Message-ID: On Thu, Jan 29, 2015 at 8:28 PM, Daan Wynen wrote: > Hi, > > I couldn't find anything about this issue, and it is not critical but > annoying. > Whenever I run the solver, the standard output order gets mixed up. > What I want to see is the following (blank lines stripped out for saving > space): > > ... > ME SAYS: Unoptimized objective: 0.947251976788 > > RUNNING THE L-BFGS-B CODE > * * * > Machine precision = 2.220D-16 > N = 50 M = 20 > At X0 0 variables are exactly at the bounds > At iterate 0 f= 9.47252D-01 |proj g|= 1.04635D-01 > At iterate 1 f= 7.95655D-01 |proj g|= 9.16550D-02 > ... > At iterate 3769 f= 7.12543D-05 |proj g|= 2.07939D-05 > At iterate 3770 f= 7.12527D-05 |proj g|= 1.51021D-04 > * * * > Tit = total number of iterations > ... > Total User time 0.000E+00 seconds. > > ME SAYS: After optimization: 7.1252665476e-05 > ... > > > Instead, what I get is this: > > > ... > ME SAYS: Unoptimized objective: 0.947251976788 > > RUNNING THE L-BFGS-B CODE > * * * > Machine precision = 2.220D-16 > N = 50 M = 20 > At X0 0 variables are exactly at the bounds > At iterate 0 f= 9.47252D-01 |proj g|= 1.04635D-01 > At iterate 1 f= 7.95655D-01 |proj g|= 9.16550D-02 > ... 
> At iterate 3147 f= 8.09457D-05 |proj g|= 5.97659D-05 > > ME SAYS: After optimization: 7.1252665476e-05 > At iterate 3148 f= 8.09298D-05 |proj g|= 5.19516D-05 > ... > At iterate 3769 f= 7.12543D-05 |proj g|= 2.07939D-05 > At iterate 3770 f= 7.12527D-05 |proj g|= 1.51021D-04 > * * * > Tit = total number of iterations > ... > Total User time 0.000E+00 seconds. > ... > > Obviously, when I run an experiment, I would like to see the result at the > end, and not buried somewhere in the middle of the iteration outputs. > I tried passing callback=lambda _: sys.stdout.flush() and putting another > flush after the whole optimization, but it didn't help. > The example is on a very fast toy calculation, but this also happens when > each iteration takes multiple seconds. > > I am using: > python 2.7.3 (virtualenv from Fedora 18 distribution) > scipy 0.14.1 > numpy 1.9.1 > But this also happens on Canopy. > > Is this maybe just to be expected when interfacing with fortran, or am I > using this the wrong way? > > Best Regards > Daan Wynen > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > Perhaps this can be fixed by running your Python script in the unbuffered output mode, as $ python -u foo.py -- JD -------------- next part -------------- An HTML attachment was scrubbed... URL: From gidden at wisc.edu Thu Jan 29 10:32:44 2015 From: gidden at wisc.edu (Matthew Gidden) Date: Thu, 29 Jan 2015 09:32:44 -0600 Subject: [SciPy-User] fmin_l_bfgs_b stdout gets mixed into following Python stdout In-Reply-To: References: <54CA4AAE.6080309@inria.fr> Message-ID: If this works for you, note that you can also set the PYTHONUNBUFFERED environment variable (in case you're running this via a testing framework, for instance).
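[Editor's note] The two suggestions are equivalent: setting PYTHONUNBUFFERED to any non-empty value does the same as the -u flag. A quick way to confirm the variable actually reaches the interpreter — a generic sketch, not tied to the poster's script:

```python
import os
import subprocess
import sys

# Launch a child interpreter with PYTHONUNBUFFERED set and have it report
# what it sees; this mirrors running the script as `python -u foo.py`.
env = dict(os.environ, PYTHONUNBUFFERED="1")
out = subprocess.check_output(
    [sys.executable, "-c", "import os; print(os.environ.get('PYTHONUNBUFFERED'))"],
    env=env,
)
print(out.decode().strip())  # prints: 1
```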
On Thu, Jan 29, 2015 at 9:25 AM, Jaidev Deshpande < deshpande.jaidev at gmail.com> wrote: > > > On Thu, Jan 29, 2015 at 8:28 PM, Daan Wynen wrote: > >> Hi, >> >> I couldn't find anything about this issue, and it is not critical but >> annoying. >> Whenever I run the solver, the standard output order gets mixed up. >> What I want to see is the following (blank lines stripped out for saving >> space): >> >> ... >> ME SAYS: Unoptimized objective: 0.947251976788 >> >> RUNNING THE L-BFGS-B CODE >> * * * >> Machine precision = 2.220D-16 >> N = 50 M = 20 >> At X0 0 variables are exactly at the bounds >> At iterate 0 f= 9.47252D-01 |proj g|= 1.04635D-01 >> At iterate 1 f= 7.95655D-01 |proj g|= 9.16550D-02 >> ... >> At iterate 3769 f= 7.12543D-05 |proj g|= 2.07939D-05 >> At iterate 3770 f= 7.12527D-05 |proj g|= 1.51021D-04 >> * * * >> Tit = total number of iterations >> ... >> Total User time 0.000E+00 seconds. >> >> ME SAYS: After optimization: 7.1252665476e-05 >> ... >> >> >> Instead, what I get is this: >> >> >> ... >> ME SAYS: Unoptimized objective: 0.947251976788 >> >> RUNNING THE L-BFGS-B CODE >> * * * >> Machine precision = 2.220D-16 >> N = 50 M = 20 >> At X0 0 variables are exactly at the bounds >> At iterate 0 f= 9.47252D-01 |proj g|= 1.04635D-01 >> At iterate 1 f= 7.95655D-01 |proj g|= 9.16550D-02 >> ... >> At iterate 3147 f= 8.09457D-05 |proj g|= 5.97659D-05 >> >> ME SAYS: After optimization: 7.1252665476e-05 >> At iterate 3148 f= 8.09298D-05 |proj g|= 5.19516D-05 >> ... >> At iterate 3769 f= 7.12543D-05 |proj g|= 2.07939D-05 >> At iterate 3770 f= 7.12527D-05 |proj g|= 1.51021D-04 >> * * * >> Tit = total number of iterations >> ... >> Total User time 0.000E+00 seconds. >> ... >> >> Obviously, when I run an experiment, I would like to see the result at >> the end, and not buried somewhere in the middle of the iteration outputs. >> I tried passing callback=lambda _: sys.stdout.flush() and putting another >> flush after the whole optimization, but it didn't help. 
>> The example is on a very fast toy calculation, but this also happens when >> each iteration takes multiple seconds. >> >> I am using: >> python 2.7.3 (virtualenv from Fedora 18 distribution) >> scipy 0.14.1 >> numpy 1.9.1 >> But this also happens on Canopy. >> >> Is this maybe just to be expected when interfacing with fortran, or am I >> using this the wrong way? >> >> Best Regards >> Daan Wynen >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-user >> >> > Perhaps this can be fixed by running your Python script in the unbuffered > output mode, as > > $ python -u foo.py > > -- > JD > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > -- Matthew Gidden Ph.D. Candidate, Nuclear Engineering The University of Wisconsin -- Madison Ph. 225.892.3192 -------------- next part -------------- An HTML attachment was scrubbed... URL: From daan.wynen at inria.fr Thu Jan 29 10:38:49 2015 From: daan.wynen at inria.fr (Daan Wynen) Date: Thu, 29 Jan 2015 16:38:49 +0100 Subject: [SciPy-User] fmin_l_bfgs_b stdout gets mixed into following Python stdout In-Reply-To: References: <54CA4AAE.6080309@inria.fr> Message-ID: <54CA5409.8080705@inria.fr> That looks promising! This does not always happen, so it is hard to say for sure, but I haven't seen the error happen again in my examples *so far*. :) thanks! On 01/29/2015 04:32 PM, Matthew Gidden wrote: > If this works for you, not that you can also set the PYTHONUNBUFFERED > > environment variable (in case you're running this via a testing > framework, for instance). > > On Thu, Jan 29, 2015 at 9:25 AM, Jaidev Deshpande > > wrote: > > > > On Thu, Jan 29, 2015 at 8:28 PM, Daan Wynen > wrote: > > Hi, > > I couldn't find anything about this issue, and it is not > critical but annoying. 
> Whenever I run the solver, the standard output order gets > mixed up. > What I want to see is the following (blank lines stripped out > for saving space): > > ... > ME SAYS: Unoptimized objective: 0.947251976788 > > RUNNING THE L-BFGS-B CODE > * * * > Machine precision = 2.220D-16 > N = 50 M = 20 > At X0 0 variables are exactly at the bounds > At iterate 0 f= 9.47252D-01 |proj g|= 1.04635D-01 > At iterate 1 f= 7.95655D-01 |proj g|= 9.16550D-02 > ... > At iterate 3769 f= 7.12543D-05 |proj g|= 2.07939D-05 > At iterate 3770 f= 7.12527D-05 |proj g|= 1.51021D-04 > * * * > Tit = total number of iterations > ... > Total User time 0.000E+00 seconds. > > ME SAYS: After optimization: 7.1252665476e-05 > ... > > > Instead, what I get is this: > > > ... > ME SAYS: Unoptimized objective: 0.947251976788 > > RUNNING THE L-BFGS-B CODE > * * * > Machine precision = 2.220D-16 > N = 50 M = 20 > At X0 0 variables are exactly at the bounds > At iterate 0 f= 9.47252D-01 |proj g|= 1.04635D-01 > At iterate 1 f= 7.95655D-01 |proj g|= 9.16550D-02 > ... > At iterate 3147 f= 8.09457D-05 |proj g|= 5.97659D-05 > > ME SAYS: After optimization: 7.1252665476e-05 > At iterate 3148 f= 8.09298D-05 |proj g|= 5.19516D-05 > ... > At iterate 3769 f= 7.12543D-05 |proj g|= 2.07939D-05 > At iterate 3770 f= 7.12527D-05 |proj g|= 1.51021D-04 > * * * > Tit = total number of iterations > ... > Total User time 0.000E+00 seconds. > ... > > Obviously, when I run an experiment, I would like to see the > result at the end, and not buried somewhere in the middle of > the iteration outputs. > I tried passing callback=lambda _: sys.stdout.flush() and > putting another flush after the whole optimization, but it > didn't help. > The example is on a very fast toy calculation, but this also > happens when each iteration takes multiple seconds. > > I am using: > python 2.7.3 (virtualenv from Fedora 18 distribution) > scipy 0.14.1 > numpy 1.9.1 > But this also happens on Canopy. 
> > Is this maybe just to be expected when interfacing with > fortran, or am I using this the wrong way? > > Best Regards > Daan Wynen > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > > Perhaps this can be fixed by running your Python script in the > unbuffered output mode, as > > $ python -u foo.py > > -- > JD > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > > > > -- > Matthew Gidden > Ph.D. Candidate, Nuclear Engineering > The University of Wisconsin -- Madison > Ph. 225.892.3192 > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From gb.gabrielebrambilla at gmail.com Thu Jan 29 13:21:08 2015 From: gb.gabrielebrambilla at gmail.com (Gabriele Brambilla) Date: Thu, 29 Jan 2015 13:21:08 -0500 Subject: [SciPy-User] Solving equations in SciPy Message-ID: Hi all, is it possible to finding roots of equations in Scipy? Is there any function that does this task? For example an equation solver able to find the complex roots of an equation like X^4 + aX^3 + ... = 0 ? thanks Gabriele -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From cyrille.rossant at gmail.com Thu Jan 29 13:48:26 2015 From: cyrille.rossant at gmail.com (Cyrille Rossant) Date: Thu, 29 Jan 2015 19:48:26 +0100 Subject: [SciPy-User] Solving equations in SciPy In-Reply-To: References: Message-ID: If you want numerical solutions: http://docs.scipy.org/doc/scipy-0.14.0/reference/optimize.nonlin.html If you want analytical (exact, symbolic) solutions: http://docs.sympy.org/latest/modules/solvers/solvers.html 2015-01-29 19:21 GMT+01:00 Gabriele Brambilla : > Hi all, > > is it possible to finding roots of equations in Scipy? Is there any function > that does this task? For example an equation solver able to find the complex > roots of an equation like > > X^4 + aX^3 + ... = 0 > > ? > > thanks > > Gabriele > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > From gb.gabrielebrambilla at gmail.com Thu Jan 29 14:01:39 2015 From: gb.gabrielebrambilla at gmail.com (Gabriele Brambilla) Date: Thu, 29 Jan 2015 14:01:39 -0500 Subject: [SciPy-User] Solving equations in SciPy In-Reply-To: References: Message-ID: thanks Gabriele On Thu, Jan 29, 2015 at 1:48 PM, Cyrille Rossant wrote: > If you want numerical solutions: > http://docs.scipy.org/doc/scipy-0.14.0/reference/optimize.nonlin.html > If you want analytical (exact, symbolic) solutions: > http://docs.sympy.org/latest/modules/solvers/solvers.html > > 2015-01-29 19:21 GMT+01:00 Gabriele Brambilla < > gb.gabrielebrambilla at gmail.com>: > > Hi all, > > > > is it possible to finding roots of equations in Scipy? Is there any > function > > that does this task? For example an equation solver able to find the > complex > > roots of an equation like > > > > X^4 + aX^3 + ... = 0 > > > > ? 
> > > > thanks > > > > Gabriele > > > > _______________________________________________ > > SciPy-User mailing list > > SciPy-User at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Thu Jan 29 14:07:35 2015 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Thu, 29 Jan 2015 14:07:35 -0500 Subject: [SciPy-User] Solving equations in SciPy In-Reply-To: References: Message-ID: On Thu, Jan 29, 2015 at 2:01 PM, Gabriele Brambilla < gb.gabrielebrambilla at gmail.com> wrote: > thanks > > Gabriele > > On Thu, Jan 29, 2015 at 1:48 PM, Cyrille Rossant < > cyrille.rossant at gmail.com> wrote: > >> If you want numerical solutions: >> http://docs.scipy.org/doc/scipy-0.14.0/reference/optimize.nonlin.html >> If you want analytical (exact, symbolic) solutions: >> http://docs.sympy.org/latest/modules/solvers/solvers.html >> >> 2015-01-29 19:21 GMT+01:00 Gabriele Brambilla < >> gb.gabrielebrambilla at gmail.com>: >> > Hi all, >> > >> > is it possible to finding roots of equations in Scipy? Is there any >> function >> > that does this task? For example an equation solver able to find the >> complex >> > roots of an equation like >> > >> > X^4 + aX^3 + ... = 0 >> > just to add to Cyrille's answer: If you have a polynomial instead of a general non-linear function, then the functions in np.polynomial and np.roots can find all roots. Josef > > >> > ? 
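[Editor's note] Josef's np.roots suggestion handles exactly the kind of quartic in the question, returning all (possibly complex) roots from the coefficient list. A standalone illustration, not taken from the thread:

```python
import numpy as np

# Coefficients of x^4 + a*x^3 + ... , highest power first.
# Example: x^4 - 1 = 0, whose roots are 1, -1, i and -i.
coeffs = [1, 0, 0, 0, -1]
roots = np.roots(coeffs)

# Every returned root satisfies the original equation:
for r in roots:
    assert abs(r**4 - 1) < 1e-9
```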
>> > >> > thanks >> > >> > Gabriele >> > >> > _______________________________________________ >> > SciPy-User mailing list >> > SciPy-User at scipy.org >> > http://mail.scipy.org/mailman/listinfo/scipy-user >> > >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-user >> > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From emmadeona+sherpa at gmail.com Thu Jan 29 14:23:02 2015 From: emmadeona+sherpa at gmail.com (emmaonaw) Date: Thu, 29 Jan 2015 12:23:02 -0700 (MST) Subject: [SciPy-User] Problem with sherpa fitting Message-ID: <1422559382183-19935.post@n7.nabble.com> Hi, I don't know if this is the right forum to ask but I cannot find any forum where to get help with sherpa fitting, so here I go :). I am trying to fit an image using a model of background, PSF and gaussian source, using sherpa in python. I set up my model using the standard tools: show_all() Data Set: 1 Filter: Circle(112.75,164.45,111.12) name = filename.fits x0 = Float64[97344] x1 = Float64[97344] y = Float32[97344] shape = (312, 312) staterror = None syserror = None sky = None eqpos = world crval = [ 278.5712, -7.067 ] crpix = [ 157., 157.] cdelt = [ 0.009, 0.009] crota = 0 epoch = 2000 equinox = 2000 coord = logical Model: 1 (tablemodel.bkgmap + usermodel.Src1) Param Type Value Min Max Units ----- ---- ----- --- --- ----- bkgmap.ampl thawed 1 -3.40282e+38 3.40282e+38 Src1.sigma1 frozen 7.0006 1 30 Src1.ampl thawed 10 -100 100000 Src1.size thawed 10.0009 0 55.5603 Src1.xpos thawed 146.943 124.719 169.167 Src1.ypos thawed 158.894 136.608 181.18 and everything looks okay until I try to fit(): ERROR: Internal Python error in the inspect module. Below is the traceback from this internal error. 
Traceback (most recent call last): File "/Users/emma/anaconda/lib/python2.7/site-packages/IPython/core/ultratb.py", line 776, in structured_traceback records = _fixed_getinnerframes(etb, context, tb_offset) File "/Users/emma/anaconda/lib/python2.7/site-packages/IPython/core/ultratb.py", line 230, in wrapped return f(*args, **kwargs) AssertionError Unfortunately, your original traceback can not be constructed. I am stuck here, tried several things but not really sure how to continue debugging this, any ideas? is it possible crval should be positive? Thanks a lot Emma -- View this message in context: http://scipy-user.10969.n7.nabble.com/Problem-with-sherpa-fitting-tp19935.html Sent from the Scipy-User mailing list archive at Nabble.com. From zruan1991 at gmail.com Thu Jan 29 14:37:01 2015 From: zruan1991 at gmail.com (Zheng Ruan) Date: Thu, 29 Jan 2015 14:37:01 -0500 Subject: [SciPy-User] Help of understanding C code in weave In-Reply-To: <54C9EDD8.5060209@gmail.com> References: <54C9EDD8.5060209@gmail.com> Message-ID: Thank you for the explanation! That makes more sense to me. Zheng On Thu, Jan 29, 2015 at 3:22 AM, Santosh Kushwaha < santoshvkushwaha at gmail.com> wrote: > On 01/29/2015 04:04 AM, Zheng Ruan wrote: > > Hi Scipy users, > > I am trying to read code that uses scipy.weave and I don't understand > some of the code in C. To make it easy and clear, I'll just post the part > that confused me. > > I have a numpy array (a) with a shape of (2, 623, 3, 333). And another > array numpy array (d) with a shape of (1, 623, 623). > > In the C code part I have something like this: > > code = """ > ... > c = *(a + b); > *(d+b) += 1; > ... > """ > > In the above code, b and c are float type in C. I just don't understand > how the c value are calculated and what does "*(d+b) += 1;" do. > > The code is very old and I saw some deprecated warnings when I compile > it. > > Thank you so much and any hints are welcome!!! 
> Zheng
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user
>
> In C, the name of an array can be treated as a pointer to the beginning
> of a memory block as big as the array. The only difference is that the
> compiler keeps some extra information for arrays to track their storage
> requirements. You can check this with sizeof(pointer) and sizeof(array);
> the latter yields the total size of all elements in the array.
> So, back to your question: since 'a' and 'd' are arrays, they behave
> like pointers here, i.e. array = pointer.
>
> Putting an asterisk (indirection) in front of a pointer or array
> variable, as in your case, dereferences it, giving the value at the
> beginning of the array, i.e. array[0]. More generally, *(array + index)
> and array[index] are the same thing.
>
> c = *(a+b);
> Here 'a' is the array and 'b' is an offset into it, so *(a+b) is the
> value of the array at position 'b'; it is assigned to the variable 'c'.
>
> *(d+b) += 1;
> The value at *(d+b) is incremented by 1.
>
> And 'b' and 'c', which are float, would be implicitly cast to int by
> numpy
> --
> Regards,
> Santosh Kushwaha
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user
>
-------------- next part --------------
An HTML attachment was scrubbed...
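[Editor's note] The pointer arithmetic above can also be mirrored on the Python side: in the C that weave generates, `a` appears to be used as a flat pointer into the array's buffer, so *(a + b) corresponds to flat (C-order) indexing in numpy. A small standalone sketch (the shapes here are stand-ins, not the original (2, 623, 3, 333) data):

```python
import numpy as np

a = np.arange(24.0).reshape(2, 3, 4)   # stand-in for the weave input array
d = np.zeros((2, 3, 4))

b = 5                                  # flat (C-order) offset, as in *(a + b)
c = a.flat[b]                          # C: c = *(a + b);
d.flat[b] += 1                         # C: *(d + b) += 1;

print(c)          # 5.0
print(d.flat[b])  # 1.0
```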
URL: From jzuhone at gmail.com Thu Jan 29 15:55:52 2015 From: jzuhone at gmail.com (John ZuHone) Date: Thu, 29 Jan 2015 15:55:52 -0500 Subject: [SciPy-User] Problem with sherpa fitting In-Reply-To: <1422559382183-19935.post@n7.nabble.com> References: <1422559382183-19935.post@n7.nabble.com> Message-ID: <10BFA93E-7FCE-4D78-BE45-927A4548EF5F@gmail.com> Hi Emma, You might have more luck at the AstroPy mailing list: astropy at scipy.org Even better, send a ticket to the CXC Helpdesk: http://cxc.cfa.harvard.edu/helpdesk/ Best, John > On Jan 29, 2015, at 2:23 PM, emmaonaw wrote: > > Hi, > > I don't know if this is the right forum to ask but I cannot find any forum > where to get help with sherpa fitting, so here I go :). > I am trying to fit an image using a model of background, PSF and gaussian > source, using sherpa in python. > I set up my model using the standard tools: > > show_all() > > Data Set: 1 > Filter: Circle(112.75,164.45,111.12) > name = filename.fits > x0 = Float64[97344] > x1 = Float64[97344] > y = Float32[97344] > shape = (312, 312) > staterror = None > syserror = None > sky = None > eqpos = world > crval = [ 278.5712, -7.067 ] > crpix = [ 157., 157.] > cdelt = [ 0.009, 0.009] > crota = 0 > epoch = 2000 > equinox = 2000 > coord = logical > > Model: 1 > (tablemodel.bkgmap + usermodel.Src1) > Param Type Value Min Max Units > ----- ---- ----- --- --- ----- > bkgmap.ampl thawed 1 -3.40282e+38 3.40282e+38 > Src1.sigma1 frozen 7.0006 1 30 > Src1.ampl thawed 10 -100 100000 > Src1.size thawed 10.0009 0 55.5603 > Src1.xpos thawed 146.943 124.719 169.167 > Src1.ypos thawed 158.894 136.608 181.18 > > and everything looks okay until I try to fit(): > > ERROR: Internal Python error in the inspect module. > Below is the traceback from this internal error. 
> > Traceback (most recent call last): > File > "/Users/emma/anaconda/lib/python2.7/site-packages/IPython/core/ultratb.py", > line 776, in structured_traceback > records = _fixed_getinnerframes(etb, context, tb_offset) > File > "/Users/emma/anaconda/lib/python2.7/site-packages/IPython/core/ultratb.py", > line 230, in wrapped > return f(*args, **kwargs) > AssertionError > > Unfortunately, your original traceback can not be constructed. > > I am stuck here, tried several things but not really sure how to continue > debugging this, any ideas? is it possible crval should be positive? > Thanks a lot > Emma > > > > > -- > View this message in context: http://scipy-user.10969.n7.nabble.com/Problem-with-sherpa-fitting-tp19935.html > Sent from the Scipy-User mailing list archive at Nabble.com. > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From emmadeona+sherpa at gmail.com Fri Jan 30 09:17:33 2015 From: emmadeona+sherpa at gmail.com (emmaonaw) Date: Fri, 30 Jan 2015 07:17:33 -0700 (MST) Subject: [SciPy-User] Problem with sherpa fitting In-Reply-To: <10BFA93E-7FCE-4D78-BE45-927A4548EF5F@gmail.com> References: <1422559382183-19935.post@n7.nabble.com> <10BFA93E-7FCE-4D78-BE45-927A4548EF5F@gmail.com> Message-ID: <1422627453191-19938.post@n7.nabble.com> Thanks for quick answer John, I will look into that Cheers e -- View this message in context: http://scipy-user.10969.n7.nabble.com/Problem-with-sherpa-fitting-tp19935p19938.html Sent from the Scipy-User mailing list archive at Nabble.com. 
From aldcroft at head.cfa.harvard.edu Fri Jan 30 13:29:44 2015 From: aldcroft at head.cfa.harvard.edu (Aldcroft, Thomas) Date: Fri, 30 Jan 2015 13:29:44 -0500 Subject: [SciPy-User] Problem with sherpa fitting In-Reply-To: <1422559382183-19935.post@n7.nabble.com> References: <1422559382183-19935.post@n7.nabble.com> Message-ID: On Thu, Jan 29, 2015 at 2:23 PM, emmaonaw wrote: > Hi, > > I don't know if this is the right forum to ask but I cannot find any forum > where to get help with sherpa fitting, so here I go :). > Hi Emma, The official place to get help for sherpa fitting is through the Chandra X-ray Center helpdesk: http://cxc.harvard.edu/helpdesk/ In order to get a reasonable traceback, in the current version of sherpa you need to set an option in your sherpa config file. This file is ~/.sherpa.rc in your home directory, and you need the lines: [verbosity] # Sherpa Chatter level: a non-zero value will display full error traceback level : 2000 Cheers, Tom > I am trying to fit an image using a model of background, PSF and gaussian > source, using sherpa in python. > I set up my model using the standard tools: > > show_all() > > Data Set: 1 > Filter: Circle(112.75,164.45,111.12) > name = filename.fits > x0 = Float64[97344] > x1 = Float64[97344] > y = Float32[97344] > shape = (312, 312) > staterror = None > syserror = None > sky = None > eqpos = world > crval = [ 278.5712, -7.067 ] > crpix = [ 157., 157.] > cdelt = [ 0.009, 0.009] > crota = 0 > epoch = 2000 > equinox = 2000 > coord = logical > > Model: 1 > (tablemodel.bkgmap + usermodel.Src1) > Param Type Value Min Max Units > ----- ---- ----- --- --- ----- > bkgmap.ampl thawed 1 -3.40282e+38 3.40282e+38 > Src1.sigma1 frozen 7.0006 1 30 > Src1.ampl thawed 10 -100 100000 > Src1.size thawed 10.0009 0 55.5603 > Src1.xpos thawed 146.943 124.719 169.167 > Src1.ypos thawed 158.894 136.608 181.18 > > and everything looks okay until I try to fit(): > > ERROR: Internal Python error in the inspect module. 
> Below is the traceback from this internal error. > > Traceback (most recent call last): > File > "/Users/emma/anaconda/lib/python2.7/site-packages/IPython/core/ultratb.py", > line 776, in structured_traceback > records = _fixed_getinnerframes(etb, context, tb_offset) > File > "/Users/emma/anaconda/lib/python2.7/site-packages/IPython/core/ultratb.py", > line 230, in wrapped > return f(*args, **kwargs) > AssertionError > > Unfortunately, your original traceback can not be constructed. > > I am stuck here, tried several things but not really sure how to continue > debugging this, any ideas? is it possible crval should be positive? > Thanks a lot > Emma > > > > > -- > View this message in context: > http://scipy-user.10969.n7.nabble.com/Problem-with-sherpa-fitting-tp19935.html > Sent from the Scipy-User mailing list archive at Nabble.com. > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bhmerchant at gmail.com Sat Jan 31 11:49:19 2015 From: bhmerchant at gmail.com (Brian Merchant) Date: Sat, 31 Jan 2015 08:49:19 -0800 Subject: [SciPy-User] Need clarification on paper "The NumPy array: a structure for efficient numerical computation": what does "vectorize" really mean? Message-ID: In "The NumPy array: a structure for efficient numerical computation" (2011, http://arxiv.org/abs/1102.1523), the authors use the verb/adjective "vectorize" in such a way that I need clarification. On page 3, in subsection "Numerical operations on arrays: vectorization", I get the (perhaps incorrect?) impression that "vectorization" refers to "an operation that can be run using C for loops over C arrays". 
Already, the vocabulary seems a little weird to me, since if "vectorization" really just means "for loops in a low level language rather than a high level language"...why create a word for the concept based on the root word "vector"? Perhaps there is some history there that I am missing, but I can accept that definition. The answer to the following StackOverflow seems to suggest that "vectorization" means "implemented in a lower level language": http://stackoverflow.com/questions/17483042/explain-the-speed-difference-between-numpys-vectorized-function-application-vs However, on page 5, I see the following sentence: "In a non-vectorized language, no temporary arrays need to be allocated when the output values are calculated in a nested for-loop, e.g. (in C)". Hold on -- how does "vectorized" make sense here? Should I interpret it as "in a low level language, no temporary arrays need to be allocated when the output values are calculated..."? If yes, well...why would temporary arrays need to be allocated in a higher level language? Can you help me make sense of my confusion? Kind regards, Brian -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Sat Jan 31 12:44:37 2015 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 31 Jan 2015 17:44:37 +0000 Subject: [SciPy-User] Need clarification on paper "The NumPy array: a structure for efficient numerical computation": what does "vectorize" really mean? In-Reply-To: References: Message-ID: On Sat, Jan 31, 2015 at 4:49 PM, Brian Merchant wrote: > > In "The NumPy array: a structure for efficient numerical computation" (2011, http://arxiv.org/abs/1102.1523), the authors use the verb/adjective "vectorize" in such a way that I need clarification. > > On page 3, in subsection "Numerical operations on arrays: vectorization", I get the (perhaps incorrect?) impression that "vectorization" refers to "an operation that can be run using C for loops over C arrays". 
> Already, the vocabulary seems a little weird to me, since if
> "vectorization" really just means "for loops in a low level language
> rather than a high level language"...why create a word for the concept
> based on the root word "vector"? Perhaps there is some history there
> that I am missing, but I can accept that definition. The answer to the
> following StackOverflow seems to suggest that "vectorization" means
> "implemented in a lower level language":
> http://stackoverflow.com/questions/17483042/explain-the-speed-difference-between-numpys-vectorized-function-application-vs

Rather, it means that the loop over the array is implicit in the syntax of the language rather than explicit; that is, the language deals with arrays ("vectors") as first-class objects with their own mathematical operations rather than just containers of numbers that one must operate on independently. For example, Fortran 90 has vectorized operations, but it is not calling down to a lower level language to do it. It's just part of the language. E.g. http://www.cs.uwm.edu/~cs151/Bacon/Lecture/HTML/ch11s12.html

The use of the term "vector" for this does have a long history in computing, referring to things like this: http://en.wikipedia.org/wiki/Vector_processor#Early_work

> However, on page 5, I see the following sentence: "In a non-vectorized
> language, no temporary arrays need to be allocated when the output
> values are calculated in a nested for-loop, e.g. (in C)". Hold on --
> how does "vectorized" make sense here? Should I interpret it as "in a
> low level language, no temporary arrays need to be allocated when the
> output values are calculated..."? If yes, well...why would temporary
> arrays need to be allocated in a higher level language?

This isn't true of all languages with vector operations (c.f. Fortran 90), just some high-level languages.
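[Editor's note] The definition just given, that the loop over the array is implicit in the syntax, with arrays as first-class objects, can be sketched in numpy (a minimal standalone illustration):

```python
import numpy as np

b = np.array([1.0, 2.0, 3.0])
c = np.array([4.0, 5.0, 6.0])

# Explicit loop: treating the arrays as mere containers of numbers.
a_loop = np.empty_like(b)
for i in range(len(b)):
    a_loop[i] = b[i] * c[i]

# Vectorized: the element-wise loop is implicit in the expression itself
# (and runs in compiled code inside numpy).
a_vec = b * c

assert np.array_equal(a_loop, a_vec)   # both are [4., 10., 18.]
```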
As for Python, the temporaries come in because the operations are evaluated independently of the operations in the expression, and the loops happen inside of each operation. In numpy, in A = B * C + D, the product (B*C) must be calculated first into a temporary before D can be added to it. Each one of those is a separate loop. Compare this to the typical implementation in C, which would allocate the space for A and do a single loop:

    for (i = 0; i < n; i++) {
        A[i] = B[i] * C[i] + D[i];
    }

From srey at asu.edu Sat Jan 31 22:02:04 2015
From: srey at asu.edu (Serge Rey)
Date: Sat, 31 Jan 2015 20:02:04 -0700
Subject: [SciPy-User] ANN PySAL 1.9
Message-ID:

On behalf of the PySAL developers I'm happy to announce the release of version 1.9.

PySAL is a library of tools for spatial data analysis and geocomputation written in Python. PySAL 1.9 marks the tenth official release of PySAL. Thanks to the team we closed a total of 113 issues, 44 pull requests and 69 regular issues since the last release 6 months ago.

We also are very happy to have produced a new book on spatial econometrics that features the spreg module from PySAL:

Anselin, L. and S.J. Rey (2014) Modern Spatial Econometrics in Practice: A guide to GeoDa, GeoDaSpace and PySAL. http://www.amazon.com/Modern-Spatial-Econometrics-Practice-GeoDaSpace-ebook/dp/B00RI9I44K

For downloads and more details see: http://pysal.org

s.
--
Sergio (Serge) Rey
Professor, School of Geographical Sciences and Urban Planning
GeoDa Center for Geospatial Analysis and Computation
Arizona State University
http://geoplan.asu.edu/rey

Editor, Geographical Analysis
http://wileyonlinelibrary.com/journal/gean