From falcone.giuseppe at gmail.com Mon Oct 2 09:26:26 2017
From: falcone.giuseppe at gmail.com (Giuseppe Falcone)
Date: Mon, 2 Oct 2017 15:26:26 +0200
Subject: [SciPy-User] filtering object
Message-ID: 

Hi to all,

I'm new to this list and to scipy! I have a question for you...

I use pdal to process las files (lidar data). As a result of the processing I have an ndarray like this:

[( 626708.60087012, 4481781.14910498, 7.478, 0, 2, 2, 1, 0, 4, -11., 0, 0, 0., 7.478), ( 626708.34087012, 4481780.92910498, 5.418, 0, 2, 3, 1, 0, 4, -11., 0, 0, 0., 5.418), ....]

The names of the dimensions are:
(u'X', u'Y', u'Z', u'Intensity', u'ReturnNumber', u'NumberOfReturns', u'ScanDirectionFlag', u'EdgeOfFlightLine', u'Classification', u'ScanAngleRank', u'UserData', u'PointSourceId', u'GpsTime', u'HeightAboveGround')

I want to split this array into two subarrays: the first with the elements whose ReturnNumber dimension (fifth value) is 1, and the second with all the other elements.

Is there an efficient way to do this?
Thanks.

Giuseppe
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From hturesson at gmail.com Mon Oct 2 09:35:28 2017
From: hturesson at gmail.com (Hjalmar Turesson)
Date: Mon, 2 Oct 2017 09:35:28 -0400
Subject: [SciPy-User] filtering object
In-Reply-To: 
References: 
Message-ID: 

Seems like you have a structured array. Try:

B1 = A[A['ReturnNumber'] == 1]
B2 = A[A['ReturnNumber'] != 1]

where A is your original array, and B1 and B2 are the arrays selected on 'ReturnNumber'.

See about structured arrays here: https://docs.scipy.org/doc/numpy-1.13.0/user/basics.rec.html
and recarrays: https://docs.scipy.org/doc/numpy-1.13.0/reference/generated/numpy.recarray.html

Best

On Mon, Oct 2, 2017 at 9:26 AM, Giuseppe Falcone wrote:
> Hi to all,
>
> I'm new to this list and to scipy! I have a question for you...
>
> I use pdal to process las files (lidar data).
> As a result of the processing I have an ndarray like this:
>
> [( 626708.60087012, 4481781.14910498, 7.478, 0, 2, 2, 1, 0, 4, -11., 0, 0, 0., 7.478), ( 626708.34087012, 4481780.92910498, 5.418, 0, 2, 3, 1, 0, 4, -11., 0, 0, 0., 5.418), ....]
>
> The names of the dimensions are:
> (u'X', u'Y', u'Z', u'Intensity', u'ReturnNumber', u'NumberOfReturns', u'ScanDirectionFlag', u'EdgeOfFlightLine', u'Classification', u'ScanAngleRank', u'UserData', u'PointSourceId', u'GpsTime', u'HeightAboveGround')
>
> I want to split this array into two subarrays: the first with the elements whose ReturnNumber dimension (fifth value) is 1, and the second with all the other elements.
>
> Is there an efficient way to do this?
> Thanks.
>
> Giuseppe
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at python.org
> https://mail.python.org/mailman/listinfo/scipy-user
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From giovanni.bonaccorsi at gmail.com Sat Oct 7 05:35:24 2017
From: giovanni.bonaccorsi at gmail.com (Giovanni Bonaccorsi)
Date: Sat, 7 Oct 2017 11:35:24 +0200
Subject: [SciPy-User] Question about the behavior of hypergeom.cdf
Message-ID: 

Hi everyone,
first post here. I have a doubt about broadcasting in scipy.

I'm trying to apply broadcasting with hypergeom. The problem is that while hypergeom.pmf() accepts my arrays without complaining, the hypergeom.cdf() function throws this error:

TypeError: only length-1 arrays can be converted to Python scalars

Now, I don't know if this is the expected behavior of the function, but from the documentation it doesn't seem that array arguments need any particular care in hypergeom.cdf() compared to hypergeom.pmf(). Is that right or have I missed something?

I'm attaching a minimal example to reproduce the error. I'm using Python 2.7 through Anaconda, with scipy and numpy updated to the latest releases.
import numpy as np
from scipy.stats import hypergeom

# this works
x = np.array([[1,2,3],[4,5,6],[11,12,13]])
M,n,N = [20,7,12]
hypergeom.pmf(x,M,n,N)
hypergeom.cdf(x,M,n,N)

# this works
M=np.array([25])
n=np.array([3,6,9])
N=np.array([2,4,6])
hypergeom.pmf(x,M,n,N)

# this doesn't work
hypergeom.cdf(x,M,n,N)

Thanks a lot for the support,
Giovanni
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jni.soma at gmail.com Sun Oct 8 21:59:52 2017
From: jni.soma at gmail.com (Juan Nunez-Iglesias)
Date: Mon, 9 Oct 2017 12:59:52 +1100
Subject: [SciPy-User] Question about the behavior of hypergeom.cdf
In-Reply-To: 
References: 
Message-ID: <7032591c-327d-413e-9679-74507e27efb1@Spark>

I'm not familiar with this part of the codebase, but after a bit of playing around, I think this isn't yet implemented, maybe because it's too hard? The relevant call in scipy/stats/_distn_infrastructure.py is:

-> 2775         m = arange(int(self.a), k+1)

self.a is the lower bound of the cdf to be computed, in this case, an array [0, 0, 0] to match the input N = self.b = [2, 4, 6]. The problem of course is that there is no general way to vectorize np.arange, because you would get a jagged array.

I hope someone with more knowledge of the library will chime in, but I think you'll have to implement your own looping logic in the short-to-medium term?

Juan.

On 7 Oct 2017, 8:36 PM +1100, Giovanni Bonaccorsi, wrote:
> Hi everyone,
> first post here. I have a doubt about broadcasting in scipy.
>
> I'm trying to apply broadcasting with hypergeom. The problem is that while hypergeom.pmf() accepts my arrays without complaining, the hypergeom.cdf() function throws this error:
>
> TypeError: only length-1 arrays can be converted to Python scalars
>
> Now, I don't know if this is the expected behavior of the function, but from the documentation it doesn't seem that array arguments need any particular care in hypergeom.cdf() compared to hypergeom.pmf().
> Is that right or have I missed something?
>
> I'm attaching a minimal example to reproduce the error. I'm using Python 2.7 through Anaconda, with scipy and numpy updated to the latest releases.
>
> import numpy as np
> from scipy.stats import hypergeom
>
> # this works
> x = np.array([[1,2,3],[4,5,6],[11,12,13]])
> M,n,N = [20,7,12]
> hypergeom.pmf(x,M,n,N)
> hypergeom.cdf(x,M,n,N)
>
> # this works
> M=np.array([25])
> n=np.array([3,6,9])
> N=np.array([2,4,6])
> hypergeom.pmf(x,M,n,N)
>
> # this doesn't work
> hypergeom.cdf(x,M,n,N)
>
> Thanks a lot for the support,
> Giovanni
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at python.org
> https://mail.python.org/mailman/listinfo/scipy-user
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From josef.pktd at gmail.com Sun Oct 8 23:24:04 2017
From: josef.pktd at gmail.com (josef.pktd at gmail.com)
Date: Sun, 8 Oct 2017 23:24:04 -0400
Subject: [SciPy-User] Question about the behavior of hypergeom.cdf
In-Reply-To: <7032591c-327d-413e-9679-74507e27efb1@Spark>
References: <7032591c-327d-413e-9679-74507e27efb1@Spark>
Message-ID: 

On Sun, Oct 8, 2017 at 9:59 PM, Juan Nunez-Iglesias wrote:
> I'm not familiar with this part of the codebase, but after a bit of playing around, I think this isn't yet implemented, maybe because it's too hard? The relevant call in scipy/stats/_distn_infrastructure.py is:
>
> -> 2775         m = arange(int(self.a), k+1)
>
> self.a is the lower bound of the cdf to be computed, in this case, an array [0, 0, 0] to match the input N = self.b = [2, 4, 6]. The problem of course is that there is no general way to vectorize np.arange, because you would get a jagged array.
>
> I hope someone with more knowledge of the library will chime in, but I think you'll have to implement your own looping logic in the short-to-medium term?
>
> Juan.
> On 7 Oct 2017, 8:36 PM +1100, Giovanni Bonaccorsi <giovanni.bonaccorsi at gmail.com>, wrote:
>
> Hi everyone,
> first post here. I have a doubt about broadcasting in scipy.
>
> I'm trying to apply broadcasting with hypergeom. The problem is that while hypergeom.pmf() accepts my arrays without complaining, the hypergeom.cdf() function throws this error:
>
> TypeError: only length-1 arrays can be converted to Python scalars
>
> Now, I don't know if this is the expected behavior of the function, but from the documentation it doesn't seem that array arguments need any particular care in hypergeom.cdf() compared to hypergeom.pmf(). Is that right or have I missed something?
>
> I'm attaching a minimal example to reproduce the error. I'm using Python 2.7 through Anaconda, with scipy and numpy updated to the latest releases.
>
> import numpy as np
> from scipy.stats import hypergeom
>
> # this works
> x = np.array([[1,2,3],[4,5,6],[11,12,13]])
> M,n,N = [20,7,12]
> hypergeom.pmf(x,M,n,N)
> hypergeom.cdf(x,M,n,N)
>
> # this works
> M=np.array([25])
> n=np.array([3,6,9])
> N=np.array([2,4,6])
> hypergeom.pmf(x,M,n,N)
>
> # this doesn't work
> hypergeom.cdf(x,M,n,N)

I haven't seriously looked at the distribution code in a while. The reason was, and I guess still is, that the support of a distribution is not vectorized, i.e. the bounds .a and .b. This means that in distributions where the support depends on a shape parameter, some methods are not vectorized in that parameter. It might work in some cases, but there is no generic support for it in the base classes. AFAIK .a and .b are always scalars.

Looks like https://github.com/scipy/scipy/issues/1320 is still open. However, I don't remember much discussion about the discrete distributions like hypergeom.
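For readers hitting the same TypeError: a minimal sketch of the "own looping logic" workaround suggested above, broadcasting the arguments and calling the scalar cdf element-wise. The `hypergeom_cdf_bcast` helper is illustrative, not a scipy API:

```python
import numpy as np
from scipy.stats import hypergeom

def hypergeom_cdf_bcast(k, M, n, N):
    """Element-wise hypergeom.cdf over broadcast arguments (illustrative helper)."""
    k, M, n, N = np.broadcast_arrays(k, M, n, N)
    out = np.empty(k.shape)
    for idx in np.ndindex(k.shape):
        # each call is scalar, so the non-vectorized support bounds are fine
        out[idx] = hypergeom.cdf(k[idx], M[idx], n[idx], N[idx])
    return out

x = np.array([[1, 2, 3], [4, 5, 6], [11, 12, 13]])
M, n, N = np.array([25]), np.array([3, 6, 9]), np.array([2, 4, 6])
print(hypergeom_cdf_bcast(x, M, n, N).shape)  # (3, 3)
```

This is a Python-level loop over every element, so it only makes sense for moderately sized arrays, but it sidesteps the scalar `.a`/`.b` bounds entirely.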
Josef

> Thanks a lot for the support,
> Giovanni
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at python.org
> https://mail.python.org/mailman/listinfo/scipy-user
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From antoniy.py at gmail.com Wed Oct 11 13:01:31 2017
From: antoniy.py at gmail.com (Antonio Polino)
Date: Wed, 11 Oct 2017 19:01:31 +0200
Subject: [SciPy-User] Finding closest point in array - inverse of KDTree
Message-ID: 

Hello all,

I have the same question I posted on stack overflow: https://stackoverflow.com/questions/46693557/finding-closest-point-in-array-inverse-of-kdtree

I have a very large ndarray A, and a sorted list of points k (a small list, about 30 points).

For every element of A, I want to determine the closest element in the list of points k, together with the index. So something like:

    >>> A = np.asarray([3, 4, 5, 6])
    >>> k = np.asarray([4.1, 3])
    >>> values, indices
    [3, 4.1, 4.1, 4.1], [1, 0, 0, 0]

Now, the problem is that A is very, very large. So I can't do something inefficient like adding one dimension to A, taking the abs difference to k, and then taking the minimum of each column.

For now I have been using np.searchsorted, as shown in the second answer here: https://stackoverflow.com/questions/2566412/find-nearest-value-in-numpy-array but even this is too slow.

I thought of using scipy.spatial.KDTree:

    >>> d = scipy.spatial.KDTree(k)
    >>> d.query(A)

This turns out to be much slower than the searchsorted solution.

On the other hand, the array A is always the same, only k changes. So it would be beneficial to use some auxiliary structure (like an "inverse KDTree") on A, and then query the results on the small array k.

Is there something like that?
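One way to exploit k being small, as a sketch: the midpoints between consecutive sorted values of k are exactly the boundaries of the 1-D nearest-neighbor cells, so a single np.searchsorted over the tiny midpoint array classifies every element of A. The `nearest_in_k` helper below is illustrative, not an existing API, and assumes 1-D data:

```python
import numpy as np

def nearest_in_k(A, k):
    # Midpoints between consecutive sorted k values are the boundaries
    # of the 1-D nearest-neighbor cells, so one searchsorted over the
    # small midpoint array classifies every element of A.
    order = np.argsort(k)
    ks = k[order]
    midpoints = ks[:-1] + np.diff(ks) / 2
    cell = np.searchsorted(midpoints, A)  # index into sorted k
    indices = order[cell]                 # map back to original order of k
    return k[indices], indices

A = np.asarray([3, 4, 5, 6])
k = np.asarray([4.1, 3])
values, indices = nearest_in_k(A, k)
print(indices)  # [1 0 0 0]
```

This reproduces the `values, indices` pair from the example above, at the cost of one sort of k (cheap, k is small) and one searchsorted over A per query.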
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From davidmenhur at gmail.com Wed Oct 11 16:13:06 2017
From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=)
Date: Wed, 11 Oct 2017 22:13:06 +0200
Subject: [SciPy-User] Finding closest point in array - inverse of KDTree
In-Reply-To: 
References: 
Message-ID: 

On 11 October 2017 at 19:01, Antonio Polino wrote:
> On the other hand, the array A is always the same, only k changes. So it
> would be beneficial to use some auxiliary structure (like an "inverse
> KDTree") on A, and then query the results on the small array k.

You can sort A and use searchsorted to find the positions of the midpoints of k. Then, you assign the values to the group.

In your example (adding one more point and sorting k):

>>> A = np.asarray([3, 4, 5, 6])
>>> k = np.asarray([3, 4.1, 5.2])
>>> midpoints = np.array([ 3.55, 4.65])  # k[:-1] + np.diff(k) / 2
>>> np.searchsorted(A, midpoints)
array([1, 2])

So, this means that A[:1] belongs to k[0], A[1:2] belongs to k[1] and A[2:] belongs to k[2].

/David.

PS: in your later email I see your name as Ant.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From xabart at gmail.com Wed Oct 11 19:16:37 2017
From: xabart at gmail.com (Xavier Barthelemy)
Date: Thu, 12 Oct 2017 10:16:37 +1100
Subject: [SciPy-User] Finding closest point in array - inverse of KDTree
In-Reply-To: 
References: 
Message-ID: 

Hi Antonio,

I think you are looking for the scipy.spatial.distance_matrix function. I used it inside a wave crest tracking algorithm (on probably a much smaller matrix than yours), but unfortunately, there is no magical method to significantly reduce the computation time.

Have a go at it.

Xavier

2017-10-12 7:13 GMT+11:00 Daπid:

> On 11 October 2017 at 19:01, Antonio Polino wrote:
>
>> On the other hand, the array A is always the same, only k changes.
>> So it would be beneficial to use some auxiliary structure (like an "inverse KDTree") on A, and then query the results on the small array k.
>
> You can sort A and use searchsorted to find the positions of the midpoints of k. Then, you assign the values to the group.
>
> In your example (adding one more point and sorting k):
>
> >>> A = np.asarray([3, 4, 5, 6])
> >>> k = np.asarray([3, 4.1, 5.2])
> >>> midpoints = np.array([ 3.55, 4.65])  # k[:-1] + np.diff(k) / 2
> >>> np.searchsorted(A, midpoints)
> array([1, 2])
>
> So, this means that A[:1] belongs to k[0], A[1:2] belongs to k[1] and A[2:] belongs to k[2].
>
> /David.
>
> PS: in your later email I see your name as Ant.
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at python.org
> https://mail.python.org/mailman/listinfo/scipy-user
>
--
« Quand le gouvernement viole les droits du peuple, l'insurrection est, pour le peuple et pour chaque portion du peuple, le plus sacré des droits et le plus indispensable des devoirs »
("When the government violates the rights of the people, insurrection is, for the people and for each portion of the people, the most sacred of rights and the most indispensable of duties")
Déclaration des droits de l'homme et du citoyen, article 35, 1793
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From robert.kern at gmail.com Wed Oct 11 19:24:59 2017
From: robert.kern at gmail.com (Robert Kern)
Date: Wed, 11 Oct 2017 16:24:59 -0700
Subject: [SciPy-User] Finding closest point in array - inverse of KDTree
In-Reply-To: 
References: 
Message-ID: 

On Wed, Oct 11, 2017 at 10:01 AM, Ant wrote:
>
> Hello all,
>
> I have the same question I posted on stack overflow: https://stackoverflow.com/questions/46693557/finding-closest-point-in-array-inverse-of-kdtree
>
> I have a very large ndarray A, and a sorted list of points k (a small list, about 30 points).
>
> For every element of A, I want to determine the closest element in the list of points k, together with the index.
> So something like:
>
> >>> A = np.asarray([3, 4, 5, 6])
> >>> k = np.asarray([4.1, 3])
> >>> values, indices
> [3, 4.1, 4.1, 4.1], [1, 0, 0, 0]
>
> Now, the problem is that A is very, very large. So I can't do something inefficient like adding one dimension to A, take the abs difference to k, and then take the minimum of each column.
>
> For now I have been using np.searchsorted, as shown in the second answer here: https://stackoverflow.com/questions/2566412/find-nearest-value-in-numpy-array but even this is too slow.
>
> I thought of using scipy.spatial.KDTree:
>
> >>> d = scipy.spatial.KDTree(k)
> >>> d.query(A)
>
> This turns out to be much slower than the searchsorted solution.
>
> On the other hand, the array A is always the same, only k changes. So it would be beneficial to use some auxiliary structure (like an "inverse KDTree") on A, and then query the results on the small array k.
>
> Is there something like that?

The KDTree and BallTree implementations in scikit-learn have implementations for querying with other trees. Unfortunately, these implementations are hidden behind an interface that builds the query tree on demand and then throws it away. You'd have to subclass in Cython and expose the `dualtree` implementations as a Python-exposed method.

https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/neighbors/binary_tree.pxi#L1250-L1254

--
Robert Kern
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From antoniy.py at gmail.com Thu Oct 12 09:18:29 2017
From: antoniy.py at gmail.com (Ant)
Date: Thu, 12 Oct 2017 15:18:29 +0200
Subject: [SciPy-User] Finding closest point in array - inverse of KDTree
In-Reply-To: 
References: 
Message-ID: <4eb575c2-1d6f-4df9-9ee5-29d5b48dbe9e@Spark>

@Daπid
Thank you! This indeed is twice as fast as doing it normally (not counting the fixed time of sorting A, of course). I would still like to speed it up, getting another 2x speedup.
This is my code, please tell me if you have any suggestions!

*Preprocessing part*

# it can be slow, it is not repeated
indices_sort = np.argsort(A)
sortedA = A[indices_sort]
inv_indices_sort = np.argsort(indices_sort)

*Repeated part*

midpoints = k[:-1] + np.diff(k)/2
idx_aux = np.searchsorted(sortedA, midpoints)
idx = []
count = 0
final_indices = np.zeros(sortedA.shape, dtype=int)
old_obj = None
for obj in idx_aux:
    if obj != old_obj:
        idx.append((obj, count))
        old_obj = obj
    count += 1
old_idx = 0
for idx_A, idx_k in idx:
    final_indices[old_idx:idx_A] = idx_k
    old_idx = idx_A
final_indices[old_idx:] = len(k)-1
indicesClosest = final_indices[inv_indices_sort]

Thank you!

@Xavier
Thank you for your answer, but won't spatial distance be too slow? If it computes the distances from every point of A to every point of k, it is computing a lot of unnecessary things.

@Robert
Thank you for the link! Do you think that it will offer significant advantages over the searchsorted solution? I ask because I have never written anything in Cython (I wouldn't know where to start, to be fair), so I am a little reluctant to start messing with scipy internal code :)

On 12 Oct 2017, 01:26 +0200, Robert Kern, wrote:
> On Wed, Oct 11, 2017 at 10:01 AM, Ant wrote:
> >
> > Hello all,
> >
> > I have the same question I posted on stack overflow: https://stackoverflow.com/questions/46693557/finding-closest-point-in-array-inverse-of-kdtree
> >
> > I have a very large ndarray A, and a sorted list of points k (a small list, about 30 points).
> >
> > For every element of A, I want to determine the closest element in the list of points k, together with the index. So something like:
> >
> >     >>> A = np.asarray([3, 4, 5, 6])
> >     >>> k = np.asarray([4.1, 3])
> >     >>> values, indices
> >     [3, 4.1, 4.1, 4.1], [1, 0, 0, 0]
> >
> > Now, the problem is that A is very, very large.
> > So I can't do something inefficient like adding one dimension to A, take the abs difference to k, and then take the minimum of each column.
> >
> > For now I have been using np.searchsorted, as shown in the second answer here: https://stackoverflow.com/questions/2566412/find-nearest-value-in-numpy-array but even this is too slow.
> >
> > I thought of using scipy.spatial.KDTree:
> >
> >     >>> d = scipy.spatial.KDTree(k)
> >     >>> d.query(A)
> >
> > This turns out to be much slower than the searchsorted solution.
> >
> > On the other hand, the array A is always the same, only k changes. So it would be beneficial to use some auxiliary structure (like an "inverse KDTree") on A, and then query the results on the small array k.
> >
> > Is there something like that?
>
> The KDTree and BallTree implementations in scikit-learn have implementations for querying with other trees. Unfortunately, these implementations are hidden behind an interface that builds the query tree on demand and then throws it away. You'd have to subclass in Cython and expose the `dualtree` implementations as a Python-exposed method.
>
> https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/neighbors/binary_tree.pxi#L1250-L1254
>
> --
> Robert Kern
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at python.org
> https://mail.python.org/mailman/listinfo/scipy-user
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From antoniy.py at gmail.com Thu Oct 12 09:21:12 2017
From: antoniy.py at gmail.com (Ant)
Date: Thu, 12 Oct 2017 15:21:12 +0200
Subject: [SciPy-User] Finding closest point in array - inverse of KDTree
In-Reply-To: 
References: 
Message-ID: <46652ec7-1708-4c2a-9b45-1d6a41fe12c3@Spark>

Sorry, I included your reply in the one to Robert.
I am a little confused by this system :D

On 11 Oct 2017, 22:15 +0200, Daπid, wrote:
>
> On 11 October 2017 at 19:01, Antonio Polino wrote:
>
> > On the other hand, the array A is always the same, only k changes. So it would be beneficial to use some auxiliary structure (like an "inverse KDTree") on A, and then query the results on the small array k.
>
> You can sort A and use searchsorted to find the positions of the midpoints of k. Then, you assign the values to the group.
>
> In your example (adding one more point and sorting k):
>
>     >>> A = np.asarray([3, 4, 5, 6])
>     >>> k = np.asarray([3, 4.1, 5.2])
>     >>> midpoints = np.array([ 3.55, 4.65])  # k[:-1] + np.diff(k) / 2
>     >>> np.searchsorted(A, midpoints)
>     array([1, 2])
>
> So, this means that A[:1] belongs to k[0], A[1:2] belongs to k[1] and A[2:] belongs to k[2].
>
> /David.
>
> PS: in your later email I see your name as Ant.
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at python.org
> https://mail.python.org/mailman/listinfo/scipy-user
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From s.atasever at gmail.com Thu Oct 12 09:34:31 2017
From: s.atasever at gmail.com (Sema Atasever)
Date: Thu, 12 Oct 2017 16:34:31 +0300
Subject: [SciPy-User] Segmentation fault (core dumped)
Message-ID: 

Dear scipy-users:

I am trying to write a python program which uses the SciPy library to do hierarchical clustering.

After I type at the "shell> " prompt: *python SciPy_clustering.py*, it gives me the following error:

*Segmentation fault (core dumped)*

How can I fix this problem?

There is enough memory on my system (516 GB); the python code uses just 21 GB of memory.

My dataset includes 73.622 data samples (rows) and 22 features (columns). When I shrink the dataset (to 31.000 rows) it works correctly.

Thanks in Advance.
*python Clustering Code:*

# needed imports
import numpy as np
import time
from matplotlib import pyplot as plt
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.cluster.hierarchy import cophenet
from scipy.spatial.distance import pdist

start_time = time.clock()

X = np.loadtxt(open("dataset.txt", "rb"), delimiter=";")

print ('\nX.shape')
print (X.shape)

# generate the linkage matrix
Z = linkage(X, 'single')

np.savetxt('Z_output.txt', Z, fmt='%-7.5f')

c, coph_dists = cophenet(Z, pdist(X))

print("Cophenetic Correlation Coefficient : \n")
print(c)

print("\nRunning Time:")
print (time.clock() - start_time, "seconds")
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From davidmenhur at gmail.com Thu Oct 12 10:16:27 2017
From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=)
Date: Thu, 12 Oct 2017 16:16:27 +0200
Subject: [SciPy-User] Finding closest point in array - inverse of KDTree
In-Reply-To: <4eb575c2-1d6f-4df9-9ee5-29d5b48dbe9e@Spark>
References: <4eb575c2-1d6f-4df9-9ee5-29d5b48dbe9e@Spark>
Message-ID: 

On 12 October 2017 at 15:18, Ant wrote:
> @Daπid
> Thank you! This indeed is twice as fast as doing it normally (not counting the fixed time of sorting A, of course).
> I would still like to speed it up, getting another 2x speedup. This is my code, please tell me if you have any suggestions!

As always, when optimising you must profile. For A of size 3000000 and k of size 30, this is what I get:

https://gist.github.com/Dapid/ed23a1bb8e96782c0b698edecff14435

89% of the time is being spent on this line (and it gets worse as you increase the size of A):

indicesClosest = final_indices[inv_indices_sort]

I don't know off the top of my head of a faster way of doing this, so can you somehow adapt your problem to use your sorted indexes of A instead? This you can very easily rewrite unrolled in Cython; I think you can scrape a bit of time there.
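A side note on that hot line: gathering through the inverse permutation is equivalent to scattering through the forward permutation, which also avoids the second argsort used to build the inverse (a numpy-only sketch; whether it is actually faster here would need profiling):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random(10)
indices_sort = np.argsort(A)
inv_indices_sort = np.argsort(indices_sort)
final_indices = np.arange(10)  # stand-in for the per-point labels

# gathering through the inverse permutation ...
gathered = final_indices[inv_indices_sort]

# ... equals scattering through the forward permutation
scattered = np.empty_like(final_indices)
scattered[indices_sort] = final_indices

print(np.array_equal(gathered, scattered))  # True
```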
Here is a good tutorial for Numpy: http://docs.cython.org/en/latest/src/userguide/numpy_tutorial.html
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From davidmenhur at gmail.com Thu Oct 12 10:20:25 2017
From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=)
Date: Thu, 12 Oct 2017 16:20:25 +0200
Subject: [SciPy-User] Segmentation fault (core dumped)
In-Reply-To: 
References: 
Message-ID: 

How far into your code do you get? That will help narrow down the problem. Also, do you have the core dump? Can you run gdb on it?

On 12 October 2017 at 15:34, Sema Atasever wrote:
> Dear scipy-users:
>
> I am trying to write a python program which uses the SciPy library to do hierarchical clustering.
>
> After I type at the "shell> " prompt: *python SciPy_clustering.py*, it gives me the following error:
>
> *Segmentation fault (core dumped)*
>
> How can I fix this problem?
>
> There is enough memory on my system (516 GB); the python code uses just 21 GB of memory.
>
> My dataset includes 73.622 data samples (rows) and 22 features (columns). When I shrink the dataset (to 31.000 rows) it works correctly.
>
> Thanks in Advance.
>
> *python Clustering Code:*
>
> # needed imports
> import numpy as np
> import time
> from matplotlib import pyplot as plt
> from scipy.cluster.hierarchy import dendrogram, linkage
> from scipy.cluster.hierarchy import cophenet
> from scipy.spatial.distance import pdist
>
> start_time = time.clock()
>
> X = np.loadtxt(open("dataset.txt", "rb"), delimiter=";")
>
> print ('\nX.shape')
> print (X.shape)
>
> # generate the linkage matrix
> Z = linkage(X, 'single')
>
> np.savetxt('Z_output.txt', Z, fmt='%-7.5f')
>
> c, coph_dists = cophenet(Z, pdist(X))
>
> print("Cophenetic Correlation Coefficient : \n")
> print(c)
>
> print("\nRunning Time:")
> print (time.clock() - start_time, "seconds")
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at python.org
> https://mail.python.org/mailman/listinfo/scipy-user
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From stuart at stuartreynolds.net Thu Oct 12 16:52:07 2017
From: stuart at stuartreynolds.net (Stuart Reynolds)
Date: Thu, 12 Oct 2017 13:52:07 -0700
Subject: [SciPy-User] OpenBLAS/OpenMP env vars used?
Message-ID: 

I'm using statsmodels and am curious about whether its optimizers can be sped up. According to https://github.com/xianyi/OpenBLAS (which I think statsmodels uses through scipy), BLAS looks at the environment variable GOTO_NUM_THREADS for OpenMP. It looks like it does, insofar as:

python myscript.py

... works fine (but provided no speedup), and:

GOTO_NUM_THREADS=4 python myscript.py

consistently changes the behavior of my program (it hits a singular matrix error in this case).

Is there any guidance on whether these variables can be safely used?

Thanks,
- Stuart

From kmtac at LIVE.COM Thu Oct 12 16:19:25 2017
From: kmtac at LIVE.COM (. 
kt)
Date: Thu, 12 Oct 2017 20:19:25 +0000
Subject: [SciPy-User] Code of conduct: 12 Oct blog featured on planet scipy
Message-ID: 

Hello,

I don't usually post to this list, but I check the blogs at planet.scipy.org regularly. They generally have useful, interesting and appropriate content.

The 12 October 2017 post by Philip Herron seems to be a glaring exception. By the post's title -- something about "sex toys for men" -- the material seems to be inappropriate for a scientific python blog aggregator. (I didn't read the post, it could be satire or something -- but a blog post with that title should never appear on planet.scipy.org.) It also seems to be the type of thing that could give scientific python a bad name, on a high-visibility website, a link to which is on the scipy.org main page.

Perhaps a good time to test the proposed code of conduct.

Meanwhile, perhaps someone who can edit planet.scipy.org should remove the post -- sooner, rather than later.

Thanks,
kmtac
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From robert.kern at gmail.com Fri Oct 13 02:45:04 2017
From: robert.kern at gmail.com (Robert Kern)
Date: Thu, 12 Oct 2017 23:45:04 -0700
Subject: [SciPy-User] Code of conduct: 12 Oct blog featured on planet scipy
In-Reply-To: 
References: 
Message-ID: 

On Thu, Oct 12, 2017 at 1:19 PM, . kt wrote:
>
> Hello,
>
> I don't usually post to this list, but I check the blogs at planet.scipy.org regularly. They generally have useful, interesting and appropriate content.
>
> The 12 October 2017 post by Philip Herron seems to be a glaring exception. By the post's title -- something about "sex toys for men" -- the material seems to be inappropriate for a scientific python blog aggregator. (I didn't read the post, it could be satire or something -- but a blog post with that title should never appear on planet.scipy.org.)
> It also seems to be the type of thing that could give scientific python a bad name, on a high-visibility website, a link to which is on the scipy.org main page.
>
> Perhaps a good time to test the proposed code of conduct.

FWIW, it seems more likely to me that the site got hacked and taken over by a spammer.

> Meanwhile, perhaps someone who can edit planet.scipy.org should remove the post -- sooner, rather than later.

It already seems to be gone.

--
Robert Kern
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From gael.varoquaux at normalesup.org Fri Oct 13 03:47:41 2017
From: gael.varoquaux at normalesup.org (Gael Varoquaux)
Date: Fri, 13 Oct 2017 09:47:41 +0200
Subject: [SciPy-User] [SciPy-Dev] Code of conduct: 12 Oct blog featured on planet scipy
In-Reply-To: 
References: 
Message-ID: <20171013074741.GZ3536894@phare.normalesup.org>

On Thu, Oct 12, 2017 at 11:45:04PM -0700, Robert Kern wrote:
> On Thu, Oct 12, 2017 at 1:19 PM, . kt wrote:
> > Perhaps a good time to test the proposed code of conduct.

> FWIW, it seems more likely to me that the site got hacked and taken over by a
> spammer.

Indeed.

> > Meanwhile, perhaps someone who can edit planet.scipy.org should remove the post -- sooner, rather than later.

> It already seems to be gone.

Removing the feed is the first thing that I did when I woke up this morning (Paris time zone).

I have admin rights on the planet. My name is listed there as an administrator. People should not hesitate to contact me if something is wrong with the planet and I haven't noticed. I'll take action as quickly as possible.
Gaël

From Jerome.Kieffer at esrf.fr Fri Oct 13 04:46:08 2017
From: Jerome.Kieffer at esrf.fr (Jerome Kieffer)
Date: Fri, 13 Oct 2017 10:46:08 +0200
Subject: [SciPy-User] [SciPy-Dev] Code of conduct: 12 Oct blog featured on planet scipy
In-Reply-To: <20171013074741.GZ3536894@phare.normalesup.org>
References: <20171013074741.GZ3536894@phare.normalesup.org>
Message-ID: <20171013104608.06f91a8c@lintaillefer.esrf.fr>

On Fri, 13 Oct 2017 09:47:41 +0200 Gael Varoquaux wrote:
>
> I have admin rights on the planet. My name is listed there as an
> administrator. People should not hesitate to contact me if something is
> wrong with the planet and I haven't noticed. I'll take action as quickly
> as possible.

Thanks for your work, Gael. It is not always apparent that "institutional-looking" sites are actually managed by well-meaning individuals, with the limits associated with human beings.

--
Jérôme Kieffer
tel +33 476 882 445

From kmtac at LIVE.COM Fri Oct 13 10:29:42 2017
From: kmtac at LIVE.COM (. kt)
Date: Fri, 13 Oct 2017 14:29:42 +0000
Subject: [SciPy-User] [SciPy-Dev] Code of conduct: 12 Oct blog featured on planet scipy
In-Reply-To: 
References: 
Message-ID: 

Date: Fri, 13 Oct 2017 09:47:41 +0200
From: Gael Varoquaux 
To: SciPy Developers List 
Cc: SciPy Users List 
Subject: Re: [SciPy-User] [SciPy-Dev] Code of conduct: 12 Oct blog featured on planet scipy
Message-ID: <20171013074741.GZ3536894 at phare.normalesup.org>
Content-Type: text/plain; charset=iso-8859-1

>On Thu, Oct 12, 2017 at 11:45:04PM -0700, Robert Kern wrote:
> >On Thu, Oct 12, 2017 at 1:19 PM, . kt wrote:
>> > Perhaps a good time to test the proposed code of conduct.

> >FWIW, it seems more likely to me that the site got hacked and taken over by a
> >spammer.

>Indeed.

>> > Meanwhile, perhaps someone who can edit planet.scipy.org should remove the post -- sooner, rather than later.

> >It already seems to be gone.

>Removing the feed is the first thing that I did when I woke up this
>morning (Paris time zone).
>I have admin rights on the planet. My name is listed there as an >administrator. People should not hesitate to contact me if something is >wrong with the planet and I haven't noticed. I'll take action as quickly >as possible. >Gaël Thanks, Gael! I had looked for the administrator at planet scipy, but missed it despite your name being there in plain sight on the sidebar. In the unlikely event that this happens again, I'll email you directly. It's good that there was no bad intent on anyone's part. I actually did assume that the website was run by good-willing individuals. My concern was more that newcomers to python for data analysis would not have the context to know this, especially since programming is now dominated by men*. In particular, I was thinking of my sister, a mid-career MBA who says that the one gap in her education was not learning how to program. I'm planning to teach her -- and perhaps others in her small, women-owned consulting firm -- how to program in python, to replace/supplement Excel-based analysis. I would also recommend planet.scipy.org to her as a useful resource -- it's where I discovered pandas and ipython/jupyter notebook. But she and others would not have the background to know Wednesday's (hacked) post was very unusual. Given the coverage of programming in the media (e.g., https://www.theatlantic.com/magazine/archive/2017/04/why-is-silicon-valley-so-awful-to-women/517788/), it would be easy for those new to scientific python to make the wrong assumption. *(But used to be dominated by women -- https://www.smithsonianmag.com/smart-news/computer-programming-used-to-be-womens-work-718061/ , https://www.theatlantic.com/business/archive/2016/09/what-programmings-past-reveals-about-todays-gender-pay-gap/498797/.) Best, kmtac -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From winash12 at gmail.com Mon Oct 16 09:31:27 2017 From: winash12 at gmail.com (ashwin .D) Date: Mon, 16 Oct 2017 19:01:27 +0530 Subject: [SciPy-User] Obtaining contour data Message-ID: Hello, I have height data on a two dimensional equidistant latitude longitude grid and my ultimate goal is to calculate the curvature of the contours. Here is an illustration of what I want to do - http://www.indiana.edu/%7Egeog109/topics/10_Forces&Winds/sfc_trough.html. The data is plotted on a two dimensional latitude longitude grid and then the height contours are calculated. I am looking to obtain the coordinates of the contours(shown in red) in terms of latitude and longitude so that I can use that information to calculate the curvature of the contour points using a least squares method. Is it possible to do this in Scipy ? The curvature of the earth is itself irrelevant in this case and it can be assumed to be flat. Regards, Ashwin. -------------- next part -------------- An HTML attachment was scrubbed... URL: From contact at nicolas-cellier.net Mon Oct 16 10:20:01 2017 From: contact at nicolas-cellier.net (Nicolas Cellier) Date: Mon, 16 Oct 2017 16:20:01 +0200 Subject: [SciPy-User] Obtaining contour data In-Reply-To: References: Message-ID: It may help you https://stackoverflow.com/questions/5666056/matplotlib-extracting-data-from-contour-lines 2017-10-16 15:31 GMT+02:00 ashwin .D : > Hello, > I have height data on a two dimensional equidistant latitude > longitude grid and my ultimate goal is to calculate the curvature of the > contours. Here is an illustration of what I want to do - > http://www.indiana.edu/%7Egeog109/topics/10_Forces&Winds/sfc_trough.html. > The data is plotted on a two dimensional latitude longitude grid and then > the height contours are calculated. 
I am looking to obtain the coordinates > of the contours(shown in red) in terms of latitude and longitude so that I > can use that information to calculate the curvature of the contour points > using a least squares method. Is it possible to do this in Scipy ? The > curvature of the earth is itself irrelevant in this case and it can be > assumed to be flat. > > Regards, > Ashwin. > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From winash12 at gmail.com Mon Oct 16 10:24:54 2017 From: winash12 at gmail.com (ashwin .D) Date: Mon, 16 Oct 2017 19:54:54 +0530 Subject: [SciPy-User] Obtaining contour data In-Reply-To: References: Message-ID: Yes I looked at that. The thing is my data is in raster coordinates(latitude and longitude - equidistant grid). When I pull out the coordinates from collections are these polygons (vectors) or is the raster entity still retained ? On Mon, Oct 16, 2017 at 7:50 PM, Nicolas Cellier < contact at nicolas-cellier.net> wrote: > It may help you > > https://stackoverflow.com/questions/5666056/matplotlib- > extracting-data-from-contour-lines > > 2017-10-16 15:31 GMT+02:00 ashwin .D : > >> Hello, >> I have height data on a two dimensional equidistant latitude >> longitude grid and my ultimate goal is to calculate the curvature of the >> contours. Here is an illustration of what I want to do - >> http://www.indiana.edu/%7Egeog109/topics/10_Forces&Winds/sfc_trough.html. >> The data is plotted on a two dimensional latitude longitude grid and then >> the height contours are calculated. I am looking to obtain the coordinates >> of the contours(shown in red) in terms of latitude and longitude so that I >> can use that information to calculate the curvature of the contour points >> using a least squares method. Is it possible to do this in Scipy ? 
The >> curvature of the earth is itself irrelevant in this case and it can be >> assumed to be flat. >> >> Regards, >> Ashwin. >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From contact at nicolas-cellier.net Mon Oct 16 10:28:25 2017 From: contact at nicolas-cellier.net (Nicolas Cellier) Date: Mon, 16 Oct 2017 16:28:25 +0200 Subject: [SciPy-User] Obtaining contour data In-Reply-To: References: Message-ID: <1bfd9eac-17e8-4c67-9825-c076a00e6ca9@nicolas-cellier.net> If you fear that, you can easily map the raster coordinates to the real ones and convert this after extracting the contour. It's an extra step but should not be a real difficulty. Sent from TypeApp. On 16 Oct 2017 at 16:25, "ashwin .D" wrote: >Yes I looked at that. The thing is my data is in raster >coordinates (latitude and longitude - equidistant grid). When I pull out >the >coordinates from collections are these polygons (vectors) or is the >raster >entity still retained ? > >On Mon, Oct 16, 2017 at 7:50 PM, Nicolas Cellier < >contact at nicolas-cellier.net> wrote: > >> It may help you >> >> https://stackoverflow.com/questions/5666056/matplotlib- >> extracting-data-from-contour-lines >> >> 2017-10-16 15:31 GMT+02:00 ashwin .D : >> >>> Hello, >>> I have height data on a two dimensional equidistant >latitude >>> longitude grid and my ultimate goal is to calculate the curvature of >the >>> contours. Here is an illustration of what I want to do - >>> >http://www.indiana.edu/%7Egeog109/topics/10_Forces&Winds/sfc_trough.html. 
>>> The data is plotted on a two dimensional latitude longitude grid and >then >>> the height contours are calculated. I am looking to obtain the >coordinates >>> of the contours(shown in red) in terms of latitude and longitude so >that I >>> can use that information to calculate the curvature of the contour >points >>> using a least squares method. Is it possible to do this in Scipy ? >The >>> curvature of the earth is itself irrelevant in this case and it can >be >>> assumed to be flat. >>> >>> Regards, >>> Ashwin. >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > > >------------------------------------------------------------------------ > >_______________________________________________ >SciPy-User mailing list >SciPy-User at python.org >https://mail.python.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From jslavin at cfa.harvard.edu Mon Oct 16 13:39:36 2017 From: jslavin at cfa.harvard.edu (Slavin, Jonathan) Date: Mon, 16 Oct 2017 13:39:36 -0400 Subject: [SciPy-User] Obtaining contour data Message-ID: Ashwin, I have found that the scikit image routine find_contours is easier to work with than extracting the data from the ContourSet object as in the cited stackoverflow example. If you do something like from skimage.measure import find_contours cs = find_contours(array, values) (where values are the levels you want for your contours and array is the grid of values) then you get back a list of contour lines. For each contour you have a Nx2 array of effective indices into the array. You can convert to your lat-long grid by multiplying by the appropriate scales. 
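The steps above can be sketched end to end. A minimal, self-contained example follows; the height field, grid origin, spacing and contour level are synthetic placeholders, not the actual lidar-derived data:

```python
# Sketch of the approach described above: extract iso-height contours
# with scikit-image, then map fractional (row, col) indices back to
# latitude/longitude.  The grid here is a made-up radial height field
# on a 1-degree-spaced lat/lon grid.
import numpy as np
from skimage.measure import find_contours

lat0, lon0, dlat, dlon = 30.0, -100.0, 1.0, 1.0
lat = lat0 + dlat * np.arange(50)
lon = lon0 + dlon * np.arange(60)
# Synthetic "height": distance from the grid center.
height = np.hypot(*np.meshgrid(lon - lon.mean(), lat - lat.mean()))

# One list entry per connected contour line at the requested level;
# each entry is an (N, 2) array of fractional (row, col) indices.
contours = find_contours(height, 10.0)
for c in contours:
    contour_lat = lat0 + dlat * c[:, 0]  # row index -> latitude
    contour_lon = lon0 + dlon * c[:, 1]  # col index -> longitude
```

Because the indices are fractional, the index-to-coordinate mapping is just the affine transform of the grid; the resulting lat/lon polylines can then be fed to a least-squares curvature fit.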
skimage also has a nice routine called grid_points_in_poly which will return a boolean array that is True for all points of the input array that are inside the polygon to which you can use one of the contours as an input. Regards, Jon On Mon, Oct 16, 2017 at 10:31 AM, wrote: > > Date: Mon, 16 Oct 2017 19:01:27 +0530 > From: "ashwin .D" > To: scipy-user at python.org > Subject: [SciPy-User] Winds/sfc_trough.html > > . > > The data is plotted on a two dimensional latitude longitude grid and then > > the height contours are calculated. I am looking to obtain the > coordinates > > of the contours(shown in red) in terms of latitude and longitude so that > I > > can use that information to calculate the curvature of the contour > points > > using a least squares method. Is it possible to do this in Scipy ? The > > curvature of the earth is itself irrelevant in this case and it can be > > assumed to be flat. > > > > Regards, > > Ashwin. > > > > _______________________________________________ > > SciPy-User mailing list > > SciPy-User at python.org > > https://mail.python.org/mailman/listinfo/scipy-user > -- ________________________________________________________ Jonathan D. Slavin Harvard-Smithsonian CfA jslavin at cfa.harvard.edu 60 Garden Street, MS 83 phone: (617) 496-7981 Cambridge, MA 02138-1516 cell: (781) 363-0035 USA ________________________________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Oct 18 06:04:40 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 18 Oct 2017 23:04:40 +1300 Subject: [SciPy-User] ANN: second SciPy 1.0.0 release candidate Message-ID: Hi all, I'm excited to be able to announce the availability of the second (and hopefully last) release candidate of Scipy 1.0. This is a big release, and a version number that has been 16 years in the making. It contains a few more deprecations and backwards incompatible changes than an average release. 
Therefore please do test it on your own code, and report any issues on the GitHub issue tracker or on the scipy-dev mailing list. Sources and binary wheels can be found at https://pypi.python.org/pypi/scipy and https://github.com/scipy/scipy/releases/tag/v1.0.0rc2. To install with pip: pip install --pre --upgrade scipy The most important issue fixed after v1.0.0rc1 is https://github.com/scipy/scipy/issues/7969 (missing DLL in Windows wheel). Pull requests merged after v1.0.0rc1: - `#7948 `__: DOC: add note on checking for deprecations before upgrade to... - `#7952 `__: DOC: update SciPy Roadmap for 1.0 release and recent discussions. - `#7960 `__: BUG: optimize: revert changes to bfgs in gh-7165 - `#7962 `__: TST: special: mark a failing hyp2f1 test as xfail - `#7973 `__: BUG: fixed keyword in 'info' in ``_get_mem_available`` utility - `#7986 `__: TST: Relax test_trsm precision to 5 decimals - `#8001 `__: TST: fix test failures from Matplotlib 2.1 update - `#8010 `__: BUG: signal: fix crash in lfilter - `#8019 `__: MAINT: fix test failures with NumPy master Thanks to everyone who contributed to this release! Ralf ========================== SciPy 1.0.0 Release Notes ========================== .. note:: Scipy 1.0.0 is not released yet! .. contents:: SciPy 1.0.0 is the culmination of 8 months of hard work. It contains many new features, numerous bug-fixes, improved test coverage and better documentation. There have been a number of deprecations and API changes in this release, which are documented below. All users are encouraged to upgrade to this release, as there are a large number of bug-fixes and optimizations. Moreover, our development attention will now shift to bug-fix releases on the 1.0.x branch, and on adding new features on the master branch. Some of the highlights of this release are: - Major build improvements. Windows wheels are available on PyPI for the first time, and continuous integration has been set up on Windows and OS X in addition to Linux. 
- A set of new ODE solvers and a unified interface to them (`scipy.integrate.solve_ivp`). - Two new trust region optimizers and a new linear programming method, with improved performance compared to what `scipy.optimize` offered previously. - Many new BLAS and LAPACK functions were wrapped. The BLAS wrappers are now complete. This release requires Python 2.7 or 3.4+ and NumPy 1.8.2 or greater. This is also the last release to support LAPACK 3.1.x - 3.3.x. Moving the lowest supported LAPACK version to >3.2.x was long blocked by Apple Accelerate providing the LAPACK 3.2.1 API. We have decided that it's time to either drop Accelerate or, if there is enough interest, provide shims for functions added in more recent LAPACK versions so it can still be used. New features ============ `scipy.cluster` improvements ---------------------------- `scipy.cluster.hierarchy.optimal_leaf_ordering`, a function to reorder a linkage matrix to minimize distances between adjacent leaves, was added. `scipy.fftpack` improvements ---------------------------- N-dimensional versions of the discrete sine and cosine transforms and their inverses were added as ``dctn``, ``idctn``, ``dstn`` and ``idstn``. `scipy.integrate` improvements ------------------------------ A set of new ODE solvers have been added to `scipy.integrate`. The convenience function `scipy.integrate.solve_ivp` allows uniform access to all solvers. The individual solvers (``RK23``, ``RK45``, ``Radau``, ``BDF`` and ``LSODA``) can also be used directly. `scipy.linalg` improvements ---------------------------- The BLAS wrappers in `scipy.linalg.blas` have been completed. 
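For illustration, the new unified ODE interface described under `scipy.integrate` above can be exercised like this: a minimal sketch solving the toy decay equation y'(t) = -y (the problem itself is made up, the call signature is as documented):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy initial value problem: y'(t) = -y, y(0) = 1, exact solution exp(-t).
def decay(t, y):
    return -y

# Integrate over t in [0, 2], sampling the solution at five points.
sol = solve_ivp(decay, (0.0, 2.0), [1.0], t_eval=np.linspace(0.0, 2.0, 5))
print(sol.t)     # the five requested evaluation times
print(sol.y[0])  # numerical solution, close to np.exp(-sol.t)
```

Passing ``method="Radau"`` or ``method="BDF"`` selects one of the new implicit solvers for stiff problems; the default is ``"RK45"``.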
Added functions are ``*gbmv``, ``*hbmv``, ``*hpmv``, ``*hpr``, ``*hpr2``, ``*spmv``, ``*spr``, ``*tbmv``, ``*tbsv``, ``*tpmv``, ``*tpsv``, ``*trsm``, ``*trsv``, ``*sbmv``, ``*spr2``, Wrappers for the LAPACK functions ``*gels``, ``*stev``, ``*sytrd``, ``*hetrd``, ``*sytf2``, ``*hetrf``, ``*sytrf``, ``*sycon``, ``*hecon``, ``*gglse``, ``*stebz``, ``*stemr``, ``*sterf``, and ``*stein`` have been added. The function `scipy.linalg.subspace_angles` has been added to compute the subspace angles between two matrices. The function `scipy.linalg.clarkson_woodruff_transform` has been added. It finds low-rank matrix approximation via the Clarkson-Woodruff Transform. The functions `scipy.linalg.eigh_tridiagonal` and `scipy.linalg.eigvalsh_tridiagonal`, which find the eigenvalues and eigenvectors of tridiagonal hermitian/symmetric matrices, were added. `scipy.ndimage` improvements ---------------------------- Support for homogeneous coordinate transforms has been added to `scipy.ndimage.affine_transform`. The ``ndimage`` C code underwent a significant refactoring, and is now a lot easier to understand and maintain. `scipy.optimize` improvements ----------------------------- The methods ``trust-region-exact`` and ``trust-krylov`` have been added to the function `scipy.optimize.minimize`. These new trust-region methods solve the subproblem with higher accuracy at the cost of more Hessian factorizations (compared to dogleg) or more matrix vector products (compared to ncg) but usually require less nonlinear iterations and are able to deal with indefinite Hessians. They seem very competitive against the other Newton methods implemented in scipy. `scipy.optimize.linprog` gained an interior point method. Its performance is superior (both in accuracy and speed) to the older simplex method. `scipy.signal` improvements --------------------------- An argument ``fs`` (sampling frequency) was added to the following functions: ``firwin``, ``firwin2``, ``firls``, and ``remez``. 
This makes these functions consistent with many other functions in `scipy.signal` in which the sampling frequency can be specified. `scipy.signal.freqz` has been sped up significantly for FIR filters. `scipy.sparse` improvements --------------------------- Iterating over and slicing of CSC and CSR matrices is now faster by up to ~35%. The ``tocsr`` method of COO matrices is now several times faster. The ``diagonal`` method of sparse matrices now takes a parameter, indicating which diagonal to return. `scipy.sparse.linalg` improvements ---------------------------------- A new iterative solver for large-scale nonsymmetric sparse linear systems, `scipy.sparse.linalg.gcrotmk`, was added. It implements ``GCROT(m,k)``, a flexible variant of ``GCROT``. `scipy.sparse.linalg.lsmr` now accepts an initial guess, yielding potentially faster convergence. SuperLU was updated to version 5.2.1. `scipy.spatial` improvements ---------------------------- Many distance metrics in `scipy.spatial.distance` gained support for weights. The signatures of `scipy.spatial.distance.pdist` and `scipy.spatial.distance.cdist` were changed to ``*args, **kwargs`` in order to support a wider range of metrics (e.g. string-based metrics that need extra keywords). Also, an optional ``out`` parameter was added to ``pdist`` and ``cdist`` allowing the user to specify where the resulting distance matrix is to be stored `scipy.stats` improvements -------------------------- The methods ``cdf`` and ``logcdf`` were added to `scipy.stats.multivariate_normal`, providing the cumulative distribution function of the multivariate normal distribution. New statistical distance functions were added, namely `scipy.stats.wasserstein_distance` for the first Wasserstein distance and `scipy.stats.energy_distance` for the energy distance. 
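As a minimal sketch of the two new statistical distance functions mentioned above (the sample values are arbitrary):

```python
from scipy.stats import energy_distance, wasserstein_distance

# First Wasserstein (earth mover's) distance between two empirical
# distributions, each uniform over its sample values.  Here every unit
# of probability mass must travel 5 units, so the distance is 5.
d_w = wasserstein_distance([0, 1, 3], [5, 6, 8])
print(d_w)  # 5.0, up to floating-point rounding

# Energy distance between the same samples.
d_e = energy_distance([0, 1, 3], [5, 6, 8])
```

Both functions also accept optional weight arrays for non-uniform empirical distributions.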
Deprecated features =================== The following functions in `scipy.misc` are deprecated: ``bytescale``, ``fromimage``, ``imfilter``, ``imread``, ``imresize``, ``imrotate``, ``imsave``, ``imshow`` and ``toimage``. Most of those functions have unexpected behavior (like rescaling and type casting image data without the user asking for that). Other functions simply have better alternatives. ``scipy.interpolate.interpolate_wrapper`` and all functions in that submodule are deprecated. This was a never finished set of wrapper functions which is not relevant anymore. The ``fillvalue`` of `scipy.signal.convolve2d` will be cast directly to the dtypes of the input arrays in the future and checked that it is a scalar or an array with a single element. ``scipy.spatial.distance.matching`` is deprecated. It is an alias of `scipy.spatial.distance.hamming`, which should be used instead. Implementation of `scipy.spatial.distance.wminkowski` was based on a wrong interpretation of the metric definition. In scipy 1.0 it has been just deprecated in the documentation to keep retro-compatibility but is recommended to use the new version of `scipy.spatial.distance.minkowski` that implements the correct behaviour. Positional arguments of `scipy.spatial.distance.pdist` and `scipy.spatial.distance.cdist` should be replaced with their keyword version. Backwards incompatible changes ============================== The following deprecated functions have been removed from `scipy.stats`: ``betai``, ``chisqprob``, ``f_value``, ``histogram``, ``histogram2``, ``pdf_fromgamma``, ``signaltonoise``, ``square_of_sums``, ``ss`` and ``threshold``. The following deprecated functions have been removed from `scipy.stats.mstats`: ``betai``, ``f_value_wilks_lambda``, ``signaltonoise`` and ``threshold``. The deprecated ``a`` and ``reta`` keywords have been removed from `scipy.stats.shapiro`. 
The deprecated functions ``sparse.csgraph.cs_graph_components`` and ``sparse.linalg.symeig`` have been removed from `scipy.sparse`. The following deprecated keywords have been removed in `scipy.sparse.linalg`: ``drop_tol`` from ``splu``, and ``xtype`` from ``bicg``, ``bicgstab``, ``cg``, ``cgs``, ``gmres``, ``qmr`` and ``minres``. The deprecated functions ``expm2`` and ``expm3`` have been removed from `scipy.linalg`. The deprecated keyword ``q`` was removed from `scipy.linalg.expm`. And the deprecated submodule ``linalg.calc_lwork`` was removed. The deprecated functions ``C2K``, ``K2C``, ``F2C``, ``C2F``, ``F2K`` and ``K2F`` have been removed from `scipy.constants`. The deprecated ``ppform`` class was removed from `scipy.interpolate`. The deprecated keyword ``iprint`` was removed from `scipy.optimize.fmin_cobyla`. The default value for the ``zero_phase`` keyword of `scipy.signal.decimate` has been changed to True. The ``kmeans`` and ``kmeans2`` functions in `scipy.cluster.vq` changed the method used for random initialization, so using a fixed random seed will not necessarily produce the same results as in previous versions. `scipy.special.gammaln` does not accept complex arguments anymore. The deprecated functions ``sph_jn``, ``sph_yn``, ``sph_jnyn``, ``sph_in``, ``sph_kn``, and ``sph_inkn`` have been removed. Users should instead use the functions ``spherical_jn``, ``spherical_yn``, ``spherical_in``, and ``spherical_kn``. Be aware that the new functions have different signatures. The cross-class properties of `scipy.signal.lti` systems have been removed. 
The following properties/setters have been removed:

Name - (accessing/setting has been removed) - (setting has been removed)

* StateSpace - (``num``, ``den``, ``gain``) - (``zeros``, ``poles``)
* TransferFunction - (``A``, ``B``, ``C``, ``D``, ``gain``) - (``zeros``, ``poles``)
* ZerosPolesGain - (``A``, ``B``, ``C``, ``D``, ``num``, ``den``) - ()

``signal.freqz(b, a)`` with ``b`` or ``a`` >1-D raises a ``ValueError``. This was a corner case for which it was unclear that the behavior was well-defined. The method ``var`` of `scipy.stats.dirichlet` now returns a scalar rather than an ndarray when the length of alpha is 1. Other changes ============= SciPy now has a formal governance structure. It consists of a BDFL (Pauli Virtanen) and a Steering Committee. See `the governance document `_ for details. It is now possible to build SciPy on Windows with MSVC + gfortran! Continuous integration has been set up for this build configuration on Appveyor, building against OpenBLAS. Continuous integration for OS X has been set up on TravisCI. The SciPy test suite has been migrated from ``nose`` to ``pytest``. ``scipy/_distributor_init.py`` was added to allow redistributors of SciPy to add custom code that needs to run when importing SciPy (e.g. checks for hardware, DLL search paths, etc.). Support for PEP 518 (specifying build system requirements) was added - see ``pyproject.toml`` in the root of the SciPy repository. In order to have consistent function names, the function ``scipy.linalg.solve_lyapunov`` is renamed to `scipy.linalg.solve_continuous_lyapunov`. The old name is kept for backwards-compatibility. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Oct 25 06:14:07 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 25 Oct 2017 23:14:07 +1300 Subject: [SciPy-User] SciPy 1.0 released! 
Message-ID: Hi all, We are extremely pleased to announce the release of SciPy 1.0, 16 years after version 0.1 saw the light of day. It has been a long, productive journey to get here, and we anticipate many more exciting new features and releases in the future. Why 1.0 now? ------------ A version number should reflect the maturity of a project - and SciPy was a mature and stable library that is heavily used in production settings for a long time already. From that perspective, the 1.0 version number is long overdue. Some key project goals, both technical (e.g. Windows wheels and continuous integration) and organisational (a governance structure, code of conduct and a roadmap), have been achieved recently. Many of us are a bit perfectionist, and therefore are reluctant to call something "1.0" because it may imply that it's "finished" or "we are 100% happy with it". This is normal for many open source projects, however that doesn't make it right. We acknowledge to ourselves that it's not perfect, and there are some dusty corners left (that will probably always be the case). Despite that, SciPy is extremely useful to its users, on average has high quality code and documentation, and gives the stability and backwards compatibility guarantees that a 1.0 label imply. Some history and perspectives ----------------------------- - 2001: the first SciPy release - 2005: transition to NumPy - 2007: creation of scikits - 2008: scipy.spatial module and first Cython code added - 2010: moving to a 6-monthly release cycle - 2011: SciPy development moves to GitHub - 2011: Python 3 support - 2012: adding a sparse graph module and unified optimization interface - 2012: removal of scipy.maxentropy - 2013: continuous integration with TravisCI - 2015: adding Cython interface for BLAS/LAPACK and a benchmark suite - 2017: adding a unified C API with scipy.LowLevelCallable; removal of scipy.weave - 2017: SciPy 1.0 release **Pauli Virtanen** is SciPy's Benevolent Dictator For Life (BDFL). 
He says: *Truthfully speaking, we could have released a SciPy 1.0 a long time ago, so I'm happy we do it now at long last. The project has a long history, and during the years it has matured also as a software project. I believe it has well proved its merit to warrant a version number starting with unity.* *Since its conception 15+ years ago, SciPy has largely been written by and for scientists, to provide a box of basic tools that they need. Over time, the set of people active in its development has undergone some rotation, and we have evolved towards a somewhat more systematic approach to development. Regardless, this underlying drive has stayed the same, and I think it will also continue propelling the project forward in future. This is all good, since not long after 1.0 comes 1.1.* **Travis Oliphant** is one of SciPy's creators. He says: *I'm honored to write a note of congratulations to the SciPy developers and the entire SciPy community for the release of SciPy 1.0. This release represents a dream of many that has been patiently pursued by a stalwart group of pioneers for nearly 2 decades. Efforts have been broad and consistent over that time from many hundreds of people. From initial discussions to efforts coding and packaging to documentation efforts to extensive conference and community building, the SciPy effort has been a global phenomenon that it has been a privilege to participate in.* *The idea of SciPy was already in multiple people's minds in 1997 when I first joined the Python community as a young graduate student who had just fallen in love with the expressibility and extensibility of Python. The internet was just starting to bring together like-minded mathematicians and scientists in nascent electronically-connected communities. In 1998, there was a concerted discussion on the matrix-SIG, python mailing list with people like Paul Barrett, Joe Harrington, Perry Greenfield, Paul Dubois, Konrad Hinsen, David Ascher, and others. 
This discussion encouraged me in 1998 and 1999 to procrastinate my PhD and spend a lot of time writing extension modules to Python that mostly wrapped battle-tested Fortran and C-code making it available to the Python user. This work attracted the help of others like Robert Kern, Pearu Peterson and Eric Jones who joined their efforts with mine in 2000 so that by 2001, the first SciPy release was ready. This was long before Github simplified collaboration and input from others and the "patch" command and email was how you helped a project improve.* *Since that time, hundreds of people have spent an enormous amount of time improving the SciPy library and the community surrounding this library has dramatically grown. I stopped being able to participate actively in developing the SciPy library around 2010. Fortunately, at that time, Pauli Virtanen and Ralf Gommers picked up the pace of development supported by dozens of other key contributors such as David Cournapeau, Evgeni Burovski, Josef Perktold, and Warren Weckesser. While I have only been able to admire the development of SciPy from a distance for the past 7 years, I have never lost my love of the project and the concept of community-driven development. I remain driven even now by a desire to help sustain the development of not only the SciPy library but many other affiliated and related open-source projects. I am extremely pleased that SciPy is in the hands of a world-wide community of talented developers who will ensure that SciPy remains an example of how grass-roots, community-driven development can succeed.* **Fernando Perez** offers a wider community perspective: *The existence of a nascent Scipy library, and the incredible --if tiny by today's standards-- community surrounding it is what drew me into the scientific Python world while still a physics graduate student in 2001. 
Today, I am awed when I see these tools power everything from high school education to the research that led to the 2017 Nobel Prize in physics.* *Don't be fooled by the 1.0 number: this project is a mature cornerstone of the modern scientific computing ecosystem. I am grateful for the many who have made it possible, and hope to be able to contribute again to it in the future. My sincere congratulations to the whole team!* Highlights of this release -------------------------- Some of the highlights of this release are: - Major build improvements. Windows wheels are available on PyPI for the first time, and continuous integration has been set up on Windows and OS X in addition to Linux. - A set of new ODE solvers and a unified interface to them (`scipy.integrate.solve_ivp`). - Two new trust region optimizers and a new linear programming method, with improved performance compared to what `scipy.optimize` offered previously. - Many new BLAS and LAPACK functions were wrapped. The BLAS wrappers are now complete. Upgrading and compatibility --------------------------- There have been a number of deprecations and API changes in this release, which are documented below. Before upgrading, we recommend that users check that their own code does not use deprecated SciPy functionality (to do so, run your code with ``python -Wd`` and check for ``DeprecationWarning`` s). This release requires Python 2.7 or >=3.4 and NumPy 1.8.2 or greater. This is also the last release to support LAPACK 3.1.x - 3.3.x. Moving the lowest supported LAPACK version to >3.2.x was long blocked by Apple Accelerate providing the LAPACK 3.2.1 API. We have decided that it's time to either drop Accelerate or, if there is enough interest, provide shims for functions added in more recent LAPACK versions so it can still be used. 
New features ============ `scipy.cluster` improvements ---------------------------- `scipy.cluster.hierarchy.optimal_leaf_ordering`, a function to reorder a linkage matrix to minimize distances between adjacent leaves, was added. `scipy.fftpack` improvements ---------------------------- N-dimensional versions of the discrete sine and cosine transforms and their inverses were added as ``dctn``, ``idctn``, ``dstn`` and ``idstn``. `scipy.integrate` improvements ------------------------------ A set of new ODE solvers have been added to `scipy.integrate`. The convenience function `scipy.integrate.solve_ivp` allows uniform access to all solvers. The individual solvers (``RK23``, ``RK45``, ``Radau``, ``BDF`` and ``LSODA``) can also be used directly. `scipy.linalg` improvements ---------------------------- The BLAS wrappers in `scipy.linalg.blas` have been completed. Added functions are ``*gbmv``, ``*hbmv``, ``*hpmv``, ``*hpr``, ``*hpr2``, ``*spmv``, ``*spr``, ``*tbmv``, ``*tbsv``, ``*tpmv``, ``*tpsv``, ``*trsm``, ``*trsv``, ``*sbmv``, ``*spr2``, Wrappers for the LAPACK functions ``*gels``, ``*stev``, ``*sytrd``, ``*hetrd``, ``*sytf2``, ``*hetrf``, ``*sytrf``, ``*sycon``, ``*hecon``, ``*gglse``, ``*stebz``, ``*stemr``, ``*sterf``, and ``*stein`` have been added. The function `scipy.linalg.subspace_angles` has been added to compute the subspace angles between two matrices. The function `scipy.linalg.clarkson_woodruff_transform` has been added. It finds low-rank matrix approximation via the Clarkson-Woodruff Transform. The functions `scipy.linalg.eigh_tridiagonal` and `scipy.linalg.eigvalsh_tridiagonal`, which find the eigenvalues and eigenvectors of tridiagonal hermitian/symmetric matrices, were added. `scipy.ndimage` improvements ---------------------------- Support for homogeneous coordinate transforms has been added to `scipy.ndimage.affine_transform`. The ``ndimage`` C code underwent a significant refactoring, and is now a lot easier to understand and maintain. 
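As a small illustration of the new tridiagonal eigensolvers listed under `scipy.linalg` above (the 3x3 matrix below is a made-up example):

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

# Symmetric tridiagonal matrix
#   [[ 2, -1,  0],
#    [-1,  2, -1],
#    [ 0, -1,  2]]
# specified by its diagonal d and off-diagonal e.
d = np.full(3, 2.0)
e = np.full(2, -1.0)

# Eigenvalues in ascending order, eigenvectors as columns of v.
w, v = eigh_tridiagonal(d, e)
print(w)  # approximately [2 - sqrt(2), 2, 2 + sqrt(2)]
```

Passing ``eigvals_only=True`` returns just the eigenvalues, which is cheaper when the eigenvectors are not needed.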
`scipy.optimize` improvements
-----------------------------

The methods ``trust-region-exact`` and ``trust-krylov`` have been added to the function `scipy.optimize.minimize`. These new trust-region methods solve the subproblem with higher accuracy at the cost of more Hessian factorizations (compared to dogleg) or more matrix-vector products (compared to ncg), but usually require fewer nonlinear iterations and are able to deal with indefinite Hessians. They seem very competitive against the other Newton methods implemented in SciPy.

`scipy.optimize.linprog` gained an interior point method. Its performance is superior (both in accuracy and speed) to the older simplex method.


`scipy.signal` improvements
---------------------------

An argument ``fs`` (sampling frequency) was added to the following functions: ``firwin``, ``firwin2``, ``firls``, and ``remez``. This makes these functions consistent with many other functions in `scipy.signal` in which the sampling frequency can be specified.

`scipy.signal.freqz` has been sped up significantly for FIR filters.


`scipy.sparse` improvements
---------------------------

Iterating over and slicing of CSC and CSR matrices is now faster by up to ~35%.

The ``tocsr`` method of COO matrices is now several times faster.

The ``diagonal`` method of sparse matrices now takes a parameter indicating which diagonal to return.


`scipy.sparse.linalg` improvements
----------------------------------

A new iterative solver for large-scale nonsymmetric sparse linear systems, `scipy.sparse.linalg.gcrotmk`, was added. It implements ``GCROT(m,k)``, a flexible variant of ``GCROT``.

`scipy.sparse.linalg.lsmr` now accepts an initial guess, yielding potentially faster convergence.

SuperLU was updated to version 5.2.1.


`scipy.spatial` improvements
----------------------------

Many distance metrics in `scipy.spatial.distance` gained support for weights.
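A minimal sketch of the weight support mentioned above, via the ``w`` keyword of ``pdist``. With unit weights and ``p=2``, the weighted Minkowski distance reduces to the ordinary Euclidean one, which gives an easy sanity check:

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.RandomState(0)
X = rng.rand(5, 3)  # 5 points in 3 dimensions

# Weighted Minkowski distance; with p=2 and unit weights this
# is exactly the Euclidean distance.
w = np.ones(3)
d_weighted = pdist(X, metric="minkowski", p=2, w=w)
d_euclid = pdist(X, metric="euclidean")
```

Non-uniform weights would simply scale the per-coordinate contributions before the distances are aggregated.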
The signatures of `scipy.spatial.distance.pdist` and `scipy.spatial.distance.cdist` were changed to ``*args, **kwargs`` in order to support a wider range of metrics (e.g. string-based metrics that need extra keywords). Also, an optional ``out`` parameter was added to ``pdist`` and ``cdist``, allowing the user to specify where the resulting distance matrix is to be stored.


`scipy.stats` improvements
--------------------------

The methods ``cdf`` and ``logcdf`` were added to `scipy.stats.multivariate_normal`, providing the cumulative distribution function of the multivariate normal distribution.

New statistical distance functions were added, namely `scipy.stats.wasserstein_distance` for the first Wasserstein distance and `scipy.stats.energy_distance` for the energy distance.


Deprecated features
===================

The following functions in `scipy.misc` are deprecated: ``bytescale``, ``fromimage``, ``imfilter``, ``imread``, ``imresize``, ``imrotate``, ``imsave``, ``imshow`` and ``toimage``. Most of those functions have unexpected behavior (like rescaling and type casting image data without the user asking for that). Other functions simply have better alternatives.

``scipy.interpolate.interpolate_wrapper`` and all functions in that submodule are deprecated. This was a never-finished set of wrapper functions which is no longer relevant.

The ``fillvalue`` of `scipy.signal.convolve2d` will in the future be cast directly to the dtypes of the input arrays, and will be checked to be a scalar or an array with a single element.

``scipy.spatial.distance.matching`` is deprecated. It is an alias of `scipy.spatial.distance.hamming`, which should be used instead.

The implementation of `scipy.spatial.distance.wminkowski` was based on a wrong interpretation of the metric definition. In SciPy 1.0 it has only been deprecated in the documentation, to preserve backwards compatibility, but users are recommended to switch to the new version of `scipy.spatial.distance.minkowski`, which implements the correct behaviour.
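The new statistical distance functions described in the `scipy.stats` section above are easy to sanity-check: for two samples that differ by a constant shift, the first Wasserstein distance equals that shift.

```python
import numpy as np
from scipy.stats import wasserstein_distance, energy_distance

# Two small empirical samples; the second is the first shifted by 3,
# so the first Wasserstein distance between them is exactly 3.
u = [1.0, 2.0, 3.0]
v = [4.0, 5.0, 6.0]

w = wasserstein_distance(u, v)
e = energy_distance(u, v)
```

Both functions take the samples themselves (optionally with weights), not histograms, which keeps small experiments like this one-liners.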
Positional arguments of `scipy.spatial.distance.pdist` and `scipy.spatial.distance.cdist` should be replaced with their keyword versions.


Backwards incompatible changes
==============================

The following deprecated functions have been removed from `scipy.stats`: ``betai``, ``chisqprob``, ``f_value``, ``histogram``, ``histogram2``, ``pdf_fromgamma``, ``signaltonoise``, ``square_of_sums``, ``ss`` and ``threshold``.

The following deprecated functions have been removed from `scipy.stats.mstats`: ``betai``, ``f_value_wilks_lambda``, ``signaltonoise`` and ``threshold``.

The deprecated ``a`` and ``reta`` keywords have been removed from `scipy.stats.shapiro`.

The deprecated functions ``sparse.csgraph.cs_graph_components`` and ``sparse.linalg.symeig`` have been removed from `scipy.sparse`.

The following deprecated keywords have been removed in `scipy.sparse.linalg`: ``drop_tol`` from ``splu``, and ``xtype`` from ``bicg``, ``bicgstab``, ``cg``, ``cgs``, ``gmres``, ``qmr`` and ``minres``.

The deprecated functions ``expm2`` and ``expm3`` have been removed from `scipy.linalg`. The deprecated keyword ``q`` was removed from `scipy.linalg.expm`, and the deprecated submodule ``linalg.calc_lwork`` was removed.

The deprecated functions ``C2K``, ``K2C``, ``F2C``, ``C2F``, ``F2K`` and ``K2F`` have been removed from `scipy.constants`.

The deprecated ``ppform`` class was removed from `scipy.interpolate`.

The deprecated keyword ``iprint`` was removed from `scipy.optimize.fmin_cobyla`.

The default value for the ``zero_phase`` keyword of `scipy.signal.decimate` has been changed to True.

The ``kmeans`` and ``kmeans2`` functions in `scipy.cluster.vq` changed the method used for random initialization, so using a fixed random seed will not necessarily produce the same results as in previous versions.

`scipy.special.gammaln` does not accept complex arguments anymore.
The deprecated functions ``sph_jn``, ``sph_yn``, ``sph_jnyn``, ``sph_in``, ``sph_kn``, and ``sph_inkn`` have been removed. Users should instead use the functions ``spherical_jn``, ``spherical_yn``, ``spherical_in``, and ``spherical_kn``. Be aware that the new functions have different signatures.

The cross-class properties of `scipy.signal.lti` systems have been removed. The following properties/setters have been removed:

Name - (accessing/setting has been removed) - (setting has been removed)

* StateSpace - (``num``, ``den``, ``gain``) - (``zeros``, ``poles``)
* TransferFunction - (``A``, ``B``, ``C``, ``D``, ``gain``) - (``zeros``, ``poles``)
* ZerosPolesGain - (``A``, ``B``, ``C``, ``D``, ``num``, ``den``) - ()

``signal.freqz(b, a)`` with ``b`` or ``a`` more than 1-D raises a ``ValueError``. This was a corner case for which it was unclear that the behavior was well-defined.

The method ``var`` of `scipy.stats.dirichlet` now returns a scalar rather than an ndarray when the length of alpha is 1.


Other changes
=============

SciPy now has a formal governance structure. It consists of a BDFL (Pauli Virtanen) and a Steering Committee. See `the governance document <https://github.com/scipy/scipy/blob/master/doc/source/dev/governance/governance.rst>`_ for details.

It is now possible to build SciPy on Windows with MSVC + gfortran! Continuous integration has been set up for this build configuration on Appveyor, building against OpenBLAS.

Continuous integration for OS X has been set up on TravisCI.

The SciPy test suite has been migrated from ``nose`` to ``pytest``.

``scipy/_distributor_init.py`` was added to allow redistributors of SciPy to add custom code that needs to run when importing SciPy (e.g. checks for hardware, DLL search paths, etc.).

Support for PEP 518 (specifying build system requirements) was added - see ``pyproject.toml`` in the root of the SciPy repository.
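For readers unfamiliar with PEP 518, the mechanism mentioned above boils down to a ``[build-system]`` table in ``pyproject.toml``. The snippet below is a hypothetical minimal sketch of such a file, not the actual contents of SciPy's ``pyproject.toml``:

```toml
[build-system]
# Packages that must be installed before the build of the project can start.
requires = ["setuptools", "wheel", "numpy", "Cython"]
```

Build frontends such as pip read this table and install the listed requirements into an isolated environment before invoking the build.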
In order to have consistent function names, the function ``scipy.linalg.solve_lyapunov`` is renamed to `scipy.linalg.solve_continuous_lyapunov`. The old name is kept for backwards compatibility.


Authors
=======

* @arcady +
* @xoviat +
* Anton Akhmerov
* Dominic Antonacci +
* Alessandro Pietro Bardelli
* Ved Basu +
* Michael James Bedford +
* Ray Bell +
* Juan M. Bello-Rivas +
* Sebastian Berg
* Felix Berkenkamp
* Jyotirmoy Bhattacharya +
* Matthew Brett
* Jonathan Bright
* Bruno Jiménez +
* Evgeni Burovski
* Patrick Callier
* Mark Campanelli +
* CJ Carey
* Robert Cimrman
* Adam Cox +
* Michael Danilov +
* David Haberthür +
* Andras Deak +
* Philip DeBoer
* Anne-Sylvie Deutsch
* Cathy Douglass +
* Dominic Else +
* Guo Fei +
* Roman Feldbauer +
* Yu Feng
* Jaime Fernandez del Rio
* Orestis Floros +
* David Freese +
* Adam Geitgey +
* James Gerity +
* Dezmond Goff +
* Christoph Gohlke
* Ralf Gommers
* Dirk Gorissen +
* Matt Haberland +
* David Hagen +
* Charles Harris
* Lam Yuen Hei +
* Jean Helie +
* Gaute Hope +
* Guillaume Horel +
* Franziska Horn +
* Yevhenii Hyzyla +
* Vladislav Iakovlev +
* Marvin Kastner +
* Mher Kazandjian
* Thomas Keck
* Adam Kurkiewicz +
* Ronan Lamy +
* J.L. Lanfranchi +
* Eric Larson
* Denis Laxalde
* Gregory R. Lee
* Felix Lenders +
* Evan Limanto
* Julian Lukwata +
* François Magimel
* Syrtis Major +
* Charles Masson +
* Nikolay Mayorov
* Tobias Megies
* Markus Meister +
* Roman Mirochnik +
* Jordi Montes +
* Nathan Musoke +
* Andrew Nelson
* M.J. Nichol
* Juan Nunez-Iglesias
* Arno Onken +
* Nick Papior +
* Dima Pasechnik +
* Ashwin Pathak +
* Oleksandr Pavlyk +
* Stefan Peterson
* Ilhan Polat
* Andrey Portnoy +
* Ravi Kumar Prasad +
* Aman Pratik
* Eric Quintero
* Vedant Rathore +
* Tyler Reddy
* Joscha Reimer
* Philipp Rentzsch +
* Antonio Horta Ribeiro
* Ned Richards +
* Kevin Rose +
* Benoit Rostykus +
* Matt Ruffalo +
* Eli Sadoff +
* Pim Schellart
* Nico Schlömer +
* Klaus Sembritzki +
* Nikolay Shebanov +
* Jonathan Tammo Siebert
* Scott Sievert
* Max Silbiger +
* Mandeep Singh +
* Michael Stewart +
* Jonathan Sutton +
* Deep Tavker +
* Martin Thoma
* James Tocknell +
* Aleksandar Trifunovic +
* Paul van Mulbregt +
* Jacob Vanderplas
* Aditya Vijaykumar
* Pauli Virtanen
* James Webber
* Warren Weckesser
* Eric Wieser +
* Josh Wilson
* Zhiqing Xiao +
* Evgeny Zhurko
* Nikolay Zinov +
* Zé Vinícius +

A total of 121 people contributed to this release. People with a "+" by their names contributed a patch for the first time. This list of names is automatically generated, and may not be fully complete.

Cheers,
Ralf

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From charlesr.harris at gmail.com Wed Oct 25 13:09:14 2017
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Wed, 25 Oct 2017 11:09:14 -0600
Subject: [SciPy-User] [Numpy-discussion] SciPy 1.0 released!
In-Reply-To:
References:
Message-ID:

On Wed, Oct 25, 2017 at 4:14 AM, Ralf Gommers wrote:

> Hi all,
>
> We are extremely pleased to announce the release of SciPy 1.0, 16 years after
> version 0.1 saw the light of day. It has been a long, productive journey to
> get here, and we anticipate many more exciting new features and releases in
> the future.

[clip]

> Cheers,
> Ralf

Congratulations to all. SciPy provides wonderful tools that are free for all to use. That those tools are available, and easily installed, is a great boon to many who would otherwise be at a disadvantage for lack of money or access; that, in itself, will have a major impact.

Chuck
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From dhoese at gmail.com  Wed Oct 25 15:04:14 2017
From: dhoese at gmail.com (David Hoese)
Date: Wed, 25 Oct 2017 14:04:14 -0500
Subject: [SciPy-User] ANN: VisPy 0.5 released
Message-ID: <08e2f3b4-1e54-2c64-88ec-3b0eef519363@gmail.com>

Hi all,

I am happy to announce the release of VisPy 0.5. It has taken a while
for some of us new maintainers to get spun up on every component of this
project, but after more than two years since the last release VisPy is
back. Many components have been refactored, new visuals and other
features added, and over 177 pull requests merged.

What is VisPy?
--------------

VisPy is a Python library for interactive scientific visualization that
is designed to be fast, scalable, and easy to use. VisPy leverages the
computational power of modern Graphics Processing Units (GPUs) through
the OpenGL library to display very large datasets.

Applications of VisPy include:

* High-quality interactive scientific plots with millions of points.
* Direct visualization of real-time data.
* Fast interactive visualization of 3D models (meshes, volume rendering).
* OpenGL visualization demos.
* Scientific GUIs with fast, scalable visualization widgets (Qt or
  IPython notebook with WebGL).

See the Gallery and many other example scripts on the VisPy website
(http://vispy.org/).

Upgrading
---------

VisPy supports Python 2.7 and 3.x on Linux, Mac OSX, and Windows.
VisPy's heavy use of the GPU means that users will need to have modern
and up-to-date video drivers for their system. VisPy can use one of many
backends, see the documentation for details.

Due to the large refactor of VisPy, users of the previous 0.4 release
will likely have to change their code.

Links
-----

GitHub: https://github.com/vispy/vispy
Website: http://vispy.org/
Gitter (for chat): https://gitter.im/vispy/vispy
Mailing list: https://groups.google.com/forum/#!forum/vispy

Contributing
------------

Help is always welcome.
We have over 250 GitHub issues and pull requests that we are still
sorting through. Feel free to talk to us on Gitter or send in a pull
request.

Thanks,
Dave

From boyan.penkov at gmail.com  Wed Oct 25 16:01:25 2017
From: boyan.penkov at gmail.com (Boyan Penkov)
Date: Wed, 25 Oct 2017 16:01:25 -0400
Subject: [SciPy-User] scipy.integrate.solve_bvp with explicit dependence on independent variable?
Message-ID:

Hello all,

I am trying to use scipy.integrate.solve_bvp to solve a boundary value
problem of a function in one variable -- $\phi(r)$ -- for a cylindrical
geometry. The examples here are pretty good:

https://docs.scipy.org/doc/scipy-0.18.1/reference/generated/scipy.integrate.solve_bvp.html

However, when casting into a coupled first-order system, the issue here
is that I have explicit dependence on the independent variable $r$. When
I write the equivalent of

>>> def fun(x, y):
...     return np.vstack((y[1], -np.exp(y[0])))

the interpreter promptly replies "TypeError: only length-1 arrays can be
converted to Python scalars" -- this makes sense, since

So, can scipy.integrate.solve_bvp be used in problems where there's an
explicit dependence on $r$, for functions where the right-hand side is
F(dq/dx, q, r)?

Cheers!

--
Boyan Penkov

From pav at iki.fi  Wed Oct 25 18:51:22 2017
From: pav at iki.fi (Pauli Virtanen)
Date: Thu, 26 Oct 2017 00:51:22 +0200
Subject: [SciPy-User] scipy.integrate.solve_bvp with explicit dependence on independent variable?
In-Reply-To:
References:
Message-ID: <1508971882.3720.9.camel@iki.fi>

Wed, 2017-10-25 at 16:01 -0400, Boyan Penkov wrote:
[clip]
> So, can scipy.integrate.solve_bvp be used in problems where there's
> an explicit dependence on $r$, for functions where the right-hand
> side is F(dq/dx, q, r)?

Does "dq/dx" mean "dq/dr" and equation of type

    d^2 q/dr^2 = F(dq/dr, q, r)

Then yes, the independent variable is the first argument in
"def fun(r, y)".
If this is not what you are looking for, writing out explicitly the
first-order system you want to solve may bring better answers.

--
Pauli Virtanen

From boyan.penkov at gmail.com  Wed Oct 25 18:57:38 2017
From: boyan.penkov at gmail.com (Boyan Penkov)
Date: Wed, 25 Oct 2017 18:57:38 -0400
Subject: [SciPy-User] scipy.integrate.solve_bvp with explicit dependence on independent variable?
In-Reply-To: <1508971882.3720.9.camel@iki.fi>
References: <1508971882.3720.9.camel@iki.fi>
Message-ID: <43C55EDD-B750-4BE2-8B2C-0B38FA85EFA9@gmail.com>

--
Boyan Penkov
www.boyanpenkov.com

> On Oct 25, 2017, at 18:51, Pauli Virtanen wrote:
>
> Wed, 2017-10-25 at 16:01 -0400, Boyan Penkov wrote:
> [clip]
>> So, can scipy.integrate.solve_bvp be used in problems where there's
>> an explicit dependence on $r$, for functions where the right-hand
>> side is F(dq/dx, q, r)?
>
> Does "dq/dx" mean "dq/dr" and equation of type
>
>     d^2 q/dr^2 = F(dq/dr, q, r)

Whoops, yeah... you are correct, I do indeed mean dq/dr...

>
> Then yes, the independent variable is the first argument in
> "def fun(r, y)". If this is not what you are looking for, writing out
> explicitly the first-order system you want to solve may bring better
> answers.

The issue seems to be that I then generally have

x = linspace(...)

which makes the output of fun(x, y) have length len(x) + 1 and not just
2, as may be expected. Is there a syntactic solution to this?

>
> --
> Pauli Virtanen
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at python.org
> https://mail.python.org/mailman/listinfo/scipy-user
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From daniele at grinta.net  Fri Oct 27 23:24:54 2017
From: daniele at grinta.net (Daniele Nicolodi)
Date: Fri, 27 Oct 2017 21:24:54 -0600
Subject: [SciPy-User] integrate.quad(stats.norm(loc=100.0).pdf, -np.inf, np.inf) = 0 ?
Message-ID:

Hello,

sorry for the cryptic Subject, but I didn't know how else to title this.
I don't have much experience with numerical integration, so it may very
well be that I'm doing something stupid, please advise :-)

I need to compute some numerical integrals of some functions that are
very "localized" in their domain (trying to compute some posterior
probability density). A Gaussian (normal) function is a good example. I
find that if the region in which the function is significantly non zero
is away from the center of the integration interval, the integration
routines "step over" that domain interval and do not "see" the function
having values different from zero.

For example, integrating a Gaussian centered around zero works as
expected:

integrate.quad(stats.norm(loc=0.0).pdf, -np.inf, np.inf)
(0.9999999999999998, 1.0178191320905743e-08)

but if I shift the Gaussian such that the function value at 0 is
numerically zero, the integration fails:

integrate.quad(stats.norm(loc=100.0).pdf, -np.inf, np.inf)
(0.0, 0.0)

same if I have an integration interval that is not centered:

integrate.quad(stats.norm(loc=0.0).pdf, -100, 1000)
(4.471689909323402e-30, 8.454234364421014e-30)

vs:

integrate.quad(stats.norm(loc=0.0).pdf, -100, 100)
(1.0000000000000002, 1.0346447361664605e-12)

I definitely see why those examples are problematic, but I don't know
what I can do about it other than evaluating the functions on a grid to
restrict somehow the integration limits. Does anyone have any
suggestion?

Thanks!

Cheers,
Daniele
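
[Editor's note: the question above went unanswered in this digest. As a
minimal sketch of one possible workaround -- not from the thread itself --
`scipy.integrate.quad` accepts a `points` argument listing break points
where the integrand is "difficult", which steers the adaptive subdivision
toward the peak. `points` cannot be combined with infinite limits, so a
finite interval containing essentially all of the mass is used here.]

```python
from scipy import integrate, stats

pdf = stats.norm(loc=0.0).pdf

# The off-center interval from the post: the adaptive sampling never
# lands near the narrow peak at x = 0, so the integral comes back ~0.
missed, _ = integrate.quad(pdf, -100, 1000)

# Passing 'points' forces quad to subdivide at the peak location, so
# the mass is found.  The result is then close to 1.0 as expected.
found, _ = integrate.quad(pdf, -100, 1000, points=[0.0])
```

The same idea applies to the shifted Gaussian: locate the mode first
(here it is known analytically), then pass it via `points` over a finite
interval instead of integrating over (-inf, inf).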