From ralf.gommers at gmail.com Mon Oct 1 13:29:23 2012 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 1 Oct 2012 19:29:23 +0200 Subject: [SciPy-Dev] Docstrings for the optimize functions In-Reply-To: <1349028108.98033.YahooMailNeo@web31801.mail.mud.yahoo.com> References: <1349028108.98033.YahooMailNeo@web31801.mail.mud.yahoo.com> Message-ID: On Sun, Sep 30, 2012 at 8:01 PM, The Helmbolds wrote: > Problem: As we update the scipy.optimize docstrings, how should we deal > with the difference between the "new style" or 'minimize' calling sequence > and its return dictionary, and the "old style" or 'pre-minimize' calling > sequence and return values? > > Background and Discussion: The docstring for the over-arching 'minimize' > function is OK as far as it goes. However, while it mentions that the > different methods differ in the number and/or kind of parameters in both > their calling sequences and in their 'Results' dictionaries, it (quite > properly) does not go into the nitty-gritty details of these items for each > 'method'. Yet, those details should be described someplace, especially for > new users. > Described in http://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.show_options.html > > Suggested Approach: I propose the following approach to documenting the > individual 'methods' while we are in this "transition" period: > > 1. In the Summary (or Extended Summary) of the method, include words to > the effect that: Although this documentation for the most part describes > the "old style" calling sequence and return values, it is strongly > recommended that new code invoking this method use the "new style" or > 'minimize' calling sequence and return values, and that serious > consideration be given to updating existing code to the "new style". To > ease the transition, one or more examples illustrating both the "old style" > and the "new style" calling sequences and return values are provided in the > Examples section. > > 2. 
Leave the docstring's sections on Parameters, Returns, Other > Parameters, Raises, Notes, and References written as though the "old > style" calling sequence is being used. Add to these sections remarks on the > "new style" only when clarity requires it. For example, it may be helpful > to add a sentence to the Parameters and Returns sections stating that: The > Examples section shows how to adapt this to the "new style" calling > sequence and return values. > > 3. In the examples, first illustrate how to use the "old style" version. > Then add an example illustrating how exactly the same problem would be > handled when the "new style's" 'minimize' calling sequence is used. In that > example discuss just so much of the details of the method's "options > dictionary", "constraints dictionary sequence", and "Results dictionary" as > seems necessary to guide a new user. In other words, adhere to the > principle that "All documentation should be as brief as possible -- but no > briefer !!!" > Many of the fmin_ docstrings have no examples. I think your suggestion of adding first an example using that fmin_ function directly, and then below it doing the same with minimize(method='fmin_xxx') would be useful. The fmin_ docstrings already link to minimize() in the See Also section. Adding a few words to that like "minimize(method='fmin_xxx') is recommended, `fmin_` is kept for backwards compatibility" would be useful. That's all I'd do. Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
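The paired old-style/new-style example discussed above could look roughly like the following (a sketch, not taken from any actual docstring; it uses the public scipy.optimize API, with the Rosenbrock helper scipy.optimize.rosen chosen here purely for concreteness):

```python
import numpy as np
from scipy.optimize import fmin, minimize, rosen

x0 = [1.3, 0.7]

# "Old style": fmin returns the solution array directly.
xopt = fmin(rosen, x0, disp=False)

# "New style": minimize(method='Nelder-Mead') wraps the same solver and
# returns a result object whose fields (res.x, res.fun, res.success, ...)
# replace the old positional return values.
res = minimize(rosen, x0, method='Nelder-Mead')

print(xopt, res.x)  # both should land close to the minimum at [1, 1]
```

Showing both calls side by side on the same problem makes the correspondence between the positional return values and the result-object fields immediate.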
URL: From helmrp at yahoo.com Wed Oct 3 23:01:06 2012 From: helmrp at yahoo.com (The Helmbolds) Date: Wed, 3 Oct 2012 20:01:06 -0700 (PDT) Subject: [SciPy-Dev] Webroot False Positive When Attempting to Install Scipy 0.11.0 References: Message-ID: <1349319666.94987.YahooMailNeo@web31812.mail.mud.yahoo.com> I'm happy to report that Webroot responded to my inquiry about its blocking the installation of SciPy version 0.11.0 due to "Win32.Autoblock.1 detected" in part as follows: "Thank you for submitting your report. We have examined the logs from your system and found that the detected items were the result of a false positive, and are not a threat. We have updated our security definitions to address this." Bob H From ralf.gommers at gmail.com Fri Oct 5 14:55:37 2012 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Fri, 5 Oct 2012 20:55:37 +0200 Subject: [SciPy-Dev] license of special.wofz (#1741) Message-ID: Hi, http://projects.scipy.org/scipy/ticket/1741 reports that the complex error function (scipy.special.wofz) is licensed such that it can't be included in scipy. It's algorithm 680 from ACM TOMS. I found some previous discussions on TOMS licensing, but not on this function. An MIT licensed alternative is available (see #1741). Does anyone remember details about the licensing of this function? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Sun Oct 7 07:19:17 2012 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 07 Oct 2012 14:19:17 +0300 Subject: [SciPy-Dev] Make orthogonal eval_* ufuncs? In-Reply-To: <505DEB17.8080800@redtetrahedron.org> References: <505DEB17.8080800@redtetrahedron.org> Message-ID: Hi, 22.09.2012 19:45, Eric Moore kirjoitti: > There are some issues with the eval_* routines for orthogonal polynomials. > > 1. They appear to support out parameters, but they don't. > 2. They choke when handed lists rather than arrays. (This is ticket #1435) > 3. 
special.binom is exported but doesn't appear in the docs on scipy.org. > > I started to fix these issues this morning by turning them into ufuncs > and calling hyp2f1, Gamma and lgam from cephes.h and hyp1f1_wrap from > specfun_wrappers.h directly. However, the various error handling > routines defined in _cephesmodule.c (line 1185+) are actually called in > cephes/mtherr.c. This means that I can't actually do it that way, > unless I move some things around. I'd propose moving all of the eval_* > functions defined in orthogonal_eval.pyx to _cephesmodule.c and turning > them all into ufuncs. > > Thoughts? This might be useful. There's a pull request currently that tries to make the organization and adding more ufuncs easier: https://github.com/scipy/scipy/pull/333 Once this is merged, you can write the kernel functions of the ufunc loops in orthogonal_eval.pxd directly in Cython, by using the usual Cython C interface to deal with Cephes et al. The drawback here is that you need separate kernel functions for complex-valued types. Or, if you want to start right away, you can work on a branch based on that PR: git remote add pv git://github.com/pv/scipy-work.git git fetch pv git checkout -b my-ortheval-fixes pv/special-cleanup -- Pauli Virtanen From pav at iki.fi Sun Oct 7 11:40:41 2012 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 07 Oct 2012 18:40:41 +0300 Subject: [SciPy-Dev] Bundling Boost? Message-ID: Hi, I'd like to consider replacing some of the function implementations in scipy.special with versions from the C++ Boost library, cf. http://www.boost.org/doc/libs/1_51_0/libs/math/doc/sf_and_dist/html/index.html The reason is that these implementations appear to be made with more care than what we use currently, and behave more correctly in corner cases (cf. e.g. ticket #1740). To minimize work, it would be useful just to use the Boost functions directly, rather than doing manual C transcriptions. 
The drawback here is that the portion of Boost library required weighs about 8 MB of source code, and we would most likely like to bundle it, as it is not really a standard part of many installations. This does not reflect much on the compiled binary size, however. I'm not 100 % certain about the compiler support. Perhaps C++ is already mature enough to work across the platforms we care about. I'm not aware of many good BSD-compatible floating-point special function libraries, so if you know others, or would be opposed to bundling Boost, please chime in! -- Pauli Virtanen From charlesr.harris at gmail.com Sun Oct 7 12:21:45 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 7 Oct 2012 10:21:45 -0600 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Sun, Oct 7, 2012 at 9:40 AM, Pauli Virtanen wrote: > Hi, > > I'd like to consider replacing some of the function implementations in > scipy.special with versions from the C++ Boost library, cf. > > http://www.boost.org/doc/libs/1_51_0/libs/math/doc/sf_and_dist/html/index.html > > The reason is that these implementations appear to be made with more > care than what we use currently, and behave more correctly in corner > cases (cf. e.g. ticket #1740). > > To minimize work, it would be useful just to use the Boost functions > directly, rather than doing manual C transcriptions. The drawback here is > that the portion of Boost library required weighs about 8 MB of source > code, and we would most likely like to bundle it, as it is not really a > standard part of many installations. This does not reflect much on the > compiled binary size, however. > > I'm not 100 % certain about the compiler support. Perhaps C++ is already > mature enough to work across the platforms we care about. > > I'm not aware of many good BSD-compatible floating-point special > function libraries, so if you know others, or would be opposed to > bundling Boost, please chime in! 
> > I think using the boost library is a good idea. It is well tested and looks to support quad precision, something we will probably want at some point. It also looks to be highly templated and tightly integrated, so I suspect getting it properly interfaced might be non-trivial. The same holds for the distributions, but we have done much the same. It might be worth looking over the boost classes for some ideas. As to the size of the code, the current scipy/special library is ~40MB and I expect we can get rid of some of that. We should check for LLVM compatibility to make sure Apple isn't a problem, but it looks like most other C++ compilers will work, Boost does try hard for universality. Compile times will probably increase if we keep all the templates. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Sun Oct 7 12:48:15 2012 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 07 Oct 2012 19:48:15 +0300 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: 07.10.2012 19:21, Charles R Harris kirjoitti: [clip] > I think using the boost library is a good idea. It is well tested and > looks to support quad precision, something we will probably want at some > point. It also looks to be highly templated and tightly integrated, so I > suspect getting it properly interfaced might be non-trivial. The same > holds for the distributions, but we have done much the same. It might be > worth looking over the boost classes for some ideas. > > As to the size of the code, the current scipy/special library is ~40MB > and I expect we can get rid of some of that. We should check for LLVM > compatibility to make sure Apple isn't a problem, but it looks like most > other C++ compilers will work, Boost does try hard for universality. > > Compile times will probably increase if we keep all the templates. Integrating it is actually not so hard, it's here if someone wants to try (e.g. 
if it works at all on OSX): https://github.com/pv/scipy-work/commits/special-boost and you get `scipy.special._ufuncs_cxx.jv`. -- Pauli Virtanen From charlesr.harris at gmail.com Sun Oct 7 13:18:50 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 7 Oct 2012 11:18:50 -0600 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Sun, Oct 7, 2012 at 10:48 AM, Pauli Virtanen wrote: > 07.10.2012 19:21, Charles R Harris kirjoitti: > [clip] > > I think using the boost library is a good idea. It is well tested and > > looks to support quad precision, something we will probably want at some > > point. It also looks to be highly templated and tightly integrated, so I > > suspect getting it properly interfaced might be non-trivial. The same > > holds for the distributions, but we have done much the same. It might be > > worth looking over the boost classes for some ideas. > > > > As to the size of the code, the current scipy/special library is ~40MB > > and I expect we can get rid of some of that. We should check for LLVM > > compatibility to make sure Apple isn't a problem, but it looks like most > > other C++ compilers will work, Boost does try hard for universality. > > > > Compile times will probably increase if we keep all the templates. > > Integrating it is actually not so hard, it's here if someone wants to > try (e.g. if it works at all on OSX): > > https://github.com/pv/scipy-work/commits/special-boost > > and you get `scipy.special._ufuncs_cxx.jv`. > A few questions. Are you using the C compatibility option for boost? How are you dealing with errors? Any support for complex numbers? I think it is a plus that Boost supports floats, doubles, and long doubles in the C compatibility mode. I don't know if we will also need complex for some of the functions. Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From pav at iki.fi Sun Oct 7 14:04:23 2012 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 07 Oct 2012 21:04:23 +0300 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: 07.10.2012 20:18, Charles R Harris kirjoitti: [clip] > A few questions. > > Are you using the C compatibility option for boost? No, just write an extern C wrapper function that calls it. > How are you dealing with errors? I'm not --- I suppose exceptions could be caught, however. > Any support for complex numbers? AFAIK, boost doesn't support complex-valued functions. Pauli From charlesr.harris at gmail.com Sun Oct 7 14:24:14 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 7 Oct 2012 12:24:14 -0600 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Sun, Oct 7, 2012 at 12:04 PM, Pauli Virtanen wrote: > 07.10.2012 20:18, Charles R Harris kirjoitti: > [clip] > > A few questions. > > > > Are you using the C compatibility option for boost? > > No, just write an extern C wrapper function that calls it. > > > How are you dealing with errors? > > I'm not --- I suppose exceptions could be caught, however. > > > Any support for complex numbers? > > AFAIK, boost doesn't support complex-valued functions. > One could always hope ;) Grepping shows complex returns for the inverse trig functions in boost/math/complex, I think that is for C++ standards compatibility. They aren't among the special_functions, however. The math library also includes quaternions and octonions, are you planning on including the whole math library or just the special_functions? Just to be clear, I like this effort, we have already borrowed some tests from Boost, we might as well take the functions that go with them. Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
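One practical way to compare a candidate implementation such as the branch above against the current one is to spot-check reference values from Python. A small sketch (the J_1(1) value below is the standard tabulated one; the complex call exercises the AMOS-backed path that a real-only Boost implementation would not cover):

```python
from scipy.special import jv

# Reference value: J_1(1) = 0.4400505857449335... (standard tables).
ref = 0.4400505857449335

val = jv(1, 1.0)        # real argument
z = jv(1, 1.0 + 0.0j)   # complex argument; Boost's version is real-only

print(abs(val - ref) < 1e-12, abs(z - ref) < 1e-12)
```

Running such checks before and after swapping in a new backend catches accuracy regressions at the known-good points, though it says nothing about the corner cases the tickets are about.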
URL: From vanforeest at gmail.com Sun Oct 7 14:33:41 2012 From: vanforeest at gmail.com (nicky van foreest) Date: Sun, 7 Oct 2012 20:33:41 +0200 Subject: [SciPy-Dev] stats.distributions.py documentation In-Reply-To: References: Message-ID: Hi Ralf, Sorry for being so slow at getting back to your comments. I have definitely not forgotten this mail. However, for the next few weeks I have to do a considerable amount of teaching... Once my workload is a bit lower, I'll come up with a plan. Nicky On 21 September 2012 21:27, Ralf Gommers wrote: > > > On Sun, Sep 16, 2012 at 11:10 PM, nicky van foreest > wrote: >> >> Hi, >> >> Below are two proposals to handle the documentation of the scipy >> distributions. >> >> The first is to add a set of examples to each distribution, see the >> list at the end of the mail as an example. However, I actually wonder >> whether it wouldn't be better to put this stuff in the stats tutorial. >> (I recently updated this, but given the list below, it is still not >> complete.) The list below is a bit long... too long perhaps. >> >> I actually get the feeling that, given the enormous diversity of the >> distributions, it may not be possible to automatically generate a set >> of simple examples that work for each and every distribution. Such >> examples then would involve the usage of x.dist.b, and so on, and this >> is not particularly revealing to first (and second) time users. > > > This is exactly what the problem is currently. > >> >> A possible resolution is to include just one or two generic examples >> in the example doc string (e.g., dist.rvs(size = (2,3)) ), and refer >> to the tutorial for the rest. The tutorial then should show extensive >> examples for each method of the norm distribution. I assume that then >> any user of other distributions can figure out how to proceed for >> his/her own distribution. > > > This is a huge amount of work, and the generic example still won't run if > you copy-paste it into a terminal. 
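For comparison, a generic example that does run when copy-pasted into a terminal (assuming only that scipy is installed) has to carry its own import and name a concrete distribution, e.g.:

```python
from scipy.stats import norm

print(norm.cdf(0.0))            # 0.5 for the standard normal
rv = norm(loc=1.0, scale=2.0)   # a "frozen" distribution
print(rv.mean(), rv.std())      # 1.0 2.0
sample = norm.rvs(size=(2, 3))  # array of random variates, shape (2, 3)
print(sample.shape)
```

The cost of making the example self-contained this way is exactly the distribution-specific detail that the template-generated docstrings cannot supply automatically.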
> >> >> The second possibility would be to follow Josef's suggestion: >> --snip snip >> Splitting up the distributions pdf docs in tutorial into separate >> pages for individual distributions, make them nicer with code and >> graphs and link them from the docstring of the distribution. > > > Linking to the tutorial from the docstrings is a good idea, but the > docstrings themselves should be enough to get started. >> >> >> This would keep the docstring itself from blowing up, but we could get >> the full html reference if we need to. >> >> --snip snip >> >> This idea offers a lot of opportunities. In a previous mail I >> mentioned that I don't quite like that the documentation is spread >> over multiple documents. There are doc strings in distributions.py >> (leading to a bloated file), > > > It's not that bad imho. The typical docstring looks like: > """A beta prime continuous random variable. > > %(before_notes)s > > Notes > ----- > The probability density function for `betaprime` is:: > > betaprime.pdf(x, a, b) = > gamma(a+b) / (gamma(a)*gamma(b)) * x**(a-1) * (1+x)**(-a-b) > > for ``x > 0``, ``a > 0``, ``b > 0``. > > %(example)s > """ > > It can't be much shorter than that. > > >> and there is continuous.rst. Part of the >> implementation can be understood from the doc-string, typically, the >> density function, but not the rest; > > > The pdf and support are given, that's enough to define the distribution. So > that should stay. It doesn't mean we have to copy the whole wikipedia page > for each distribution. > >> >> this requires continuous.rst. >> Besides this, in case some specific distribution requires extra >> explanation/examples, this will have to be put in the doc-string, making >> distributions.py longer still. Thus, to take up Josef's suggestion, >> what about a documentation file organised like this: > > > Are you suggesting a reST page here, or a .py file with only docs, and new > magic to make part of the content show up as docstring? 
The former sounds > better to me. > >> >> >> # some tag to tell that these are the docs for the norm distribution >> # eg. >> # norm_gen >> >> Normal Distribution >> ---------------------------- >> >> Notes >> ^^^^^^^ >> # should be used by the interpreter >> The probability density function for `norm` is:: >> >> norm.pdf(x) = exp(-x**2/2)/sqrt(2*pi) >> >> Simple Examples >> ^^^^^^^^^^^^^^^^^^^^ >> # used by the interpreter >> >>> norm.rvs( size = (2,3) ) >> >> Extensive Examples >> ^^^^^^^^^^^^^^^^^^^^^^^^ >> # Not used by the interpreter, but certainly by an html viewer, >> containing graphs, hard/specific examples. >> >> Mathematical Details >> ^^^^^^^^^^^^^^^^^^^^^^ >> >> Stuff from continuous.rst >> >> # dist2_gen >> Distribution number 2 >> ----------------------------------------- >> etc >> >> It shouldn't be too hard to parse such a document, and couple each >> piece of documentation to a distribution in distributions.py (or am I >> mistaken?) as we use the class name as the tag in the documentation >> file. The doc-string for a distribution in distributions.py can then >> be removed, >> >> Nicky >> >> Example for the examples section of the docstring of norm. > > > This example is good. Perhaps the frozen distribution needs a few words of > explanation. I suggest doing a few more of these for common distributions, > and link to the norm() docstring from less common distributions. Other than > that, I wouldn't change anything about the docstrings. Built docs could be > reworked more thoroughly. 
> > Ralf > > >> >> >> Notes >> ----- >> The probability density function for `norm` is:: >> >> norm.pdf(x) = exp(-x**2/2)/sqrt(2*pi) >> >> #%(example)s >> >> Examples >> -------- >> >> Setting the mean and standard deviation: >> >> >>> from scipy.stats import norm >> >>> norm.cdf(0.0) >> >>> norm.cdf(0., 1) # set mu = loc = 1 >> >>> norm.cdf(0., 1, 2) # mu = loc = 1, scale = sigma = 2 >> >>> norm.cdf(0., loc = 1, scale = 2) # mu = loc = 1, scale = >> sigma = 2 >> >> Frozen rvs >> >> >>> norm(1., 2.).cdf(0) >> >>> x = norm(scale = 2.) >> >>> x.cdf(0.0) >> >> Moments >> >> >>> norm(loc = 2).stats() >> >>> norm.mean() >> >>> norm.moment(2, scale = 3.) >> >>> x.std() >> >>> x.var() >> >> Random number generation >> >> >>> norm.rvs(3, 1, size = (2,3)) # loc = 3, scale = 1, array of >> shape (2,3) >> >>> norm.rvs(3, 1, size = [2,3]) >> >>> x.rvs(3) # array with 3 random deviates >> >>> x.rvs([3,4]) # array of shape (3,4) with deviates >> >> Expectations >> >> >>> norm.expect(lambda x: x, loc = 1) # 1.00000 >> >>> norm.expect(lambda x: x**2, loc = 1., scale = 2.) # second >> moment >> >> Support of the distribution >> >> >>> norm.a # left limit, -np.inf here >> >>> norm.b # right limit, np.inf here >> >> Plot of the cdf >> >> >>> import numpy as np >> >>> import matplotlib.pyplot as plt >> >>> x = np.linspace(0, 3) >> >>> P = norm.cdf(x) >> >>> plt.plot(x,P) >> >>> plt.show() >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From cournape at gmail.com Sun Oct 7 15:36:34 2012 From: cournape at gmail.com (David Cournapeau) Date: Sun, 7 Oct 2012 20:36:34 +0100 Subject: [SciPy-Dev] Bundling Boost? 
In-Reply-To: References: Message-ID: On Sun, Oct 7, 2012 at 7:04 PM, Pauli Virtanen wrote: > 07.10.2012 20:18, Charles R Harris kirjoitti: > [clip] >> A few questions. >> >> Are you using the C compatibility option for boost? > > No, just write an extern C wrapper function that calls it. > >> How are you dealing with errors? > > I'm not --- I suppose exceptions could be caught, however. Is it possible to set up the boost mode so that ERRNO is used instead of exceptions? I would rather be as close to C semantics as possible, David From ralf.gommers at gmail.com Sun Oct 7 15:52:54 2012 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 7 Oct 2012 21:52:54 +0200 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Sun, Oct 7, 2012 at 6:21 PM, Charles R Harris wrote: > > > On Sun, Oct 7, 2012 at 9:40 AM, Pauli Virtanen wrote: > >> Hi, >> >> I'd like to consider replacing some of the function implementations in >> scipy.special with versions from the C++ Boost library, cf. >> >> http://www.boost.org/doc/libs/1_51_0/libs/math/doc/sf_and_dist/html/index.html >> >> The reason is that these implementations appear to be made with more >> care than what we use currently, and behave more correctly in corner >> cases (cf. e.g. ticket #1740). >> >> To minimize work, it would be useful just to use the Boost functions >> directly, rather than doing manual C transcriptions. The drawback here is >> that the portion of Boost library required weighs about 8 MB of source >> code, and we would most likely like to bundle it, as it is not really a >> standard part of many installations. This does not reflect much on the >> compiled binary size, however. >> >> I'm not 100 % certain about the compiler support. Perhaps C++ is already >> mature enough to work across the platforms we care about. >> > An obvious worry is Windows. 
MinGW is explicitly not supported, and "may or may not work": http://www.boost.org/doc/libs/1_51_0/more/getting_started/windows.html > >> I'm not aware of many good BSD-compatible floating-point special >> function libraries, so if you know others, or would be opposed to >> bundling Boost, please chime in! >> >> > I think using the boost library is a good idea. It is well tested and > looks to support quad precision, something we will probably want at some > point. It also looks to be highly templated and tightly integrated, so I > suspect getting it properly interfaced might be non-trivial. The same holds > for the distributions, but we have done much the same. It might be worth > looking over the boost classes for some ideas. > > As to the size of the code, the current scipy/special library is ~40MB and > I expect we can get rid of some of that. > It's ~11 MB for me. We should check for LLVM compatibility to make sure Apple isn't a problem, > but it looks like most other C++ compilers will work, Boost does try hard > for universality. > > Compile times will probably increase if we keep all the templates. > That's a much bigger worry than how many MB of source code it is. Does anyone have an idea of how much compile time would increase and how much RAM it would use? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Sun Oct 7 19:58:46 2012 From: fperez.net at gmail.com (Fernando Perez) Date: Sun, 7 Oct 2012 16:58:46 -0700 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Sun, Oct 7, 2012 at 8:40 AM, Pauli Virtanen wrote: > I'm not 100 % certain about the compiler support. Perhaps C++ is already > mature enough to work across the platforms we care about. It's worth having a good test of the compiler situation first. 
I have horrible memories a few years ago (not that long, though) of trying to use the Boost Graph Library for a project with python wrappers, and the Boost version this used wouldn't even compile with the newest gcc of the current ubuntu at the time. My only solution was to build gcc itself from source. So before we end up foisting upon RedHat users or similar the requirement that to build scipy they need to rebuild their compilers from source, which I suspect wouldn't be a very popular move, let's find out what is the oldest version of gcc this particular part of boost will require. Boost is great, but it's also famous for pushing compilers very, very far beyond their comfort zone. So this step should not be taken lightly. It would also be good to know: - does it compile with MS compilers? If so, what's the oldest version that works? - and what about the Intel ones? Not trying to rain on the parade, but over the last 10 years I've tried to use boost a few times, and every occasion has led to compiler pain. So I'd be cautious with putting it in as a scipy dependency. If it turns out that this part of Boost is less sensitive to compiler details, then great! I'd love to be proven wrong in my paranoia here... Cheers, f From josef.pktd at gmail.com Sun Oct 7 21:26:39 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 7 Oct 2012 21:26:39 -0400 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Sun, Oct 7, 2012 at 3:52 PM, Ralf Gommers wrote: > > > On Sun, Oct 7, 2012 at 6:21 PM, Charles R Harris > wrote: >> >> >> >> On Sun, Oct 7, 2012 at 9:40 AM, Pauli Virtanen wrote: >>> >>> Hi, >>> >>> I'd like to consider replacing some of the function implementations in >>> scipy.special with versions from the C++ Boost library, cf. 
>>> >>> http://www.boost.org/doc/libs/1_51_0/libs/math/doc/sf_and_dist/html/index.html >>> >>> The reason is that these implementations appear to be made with more >>> care than what we use currently, and behave more correctly in corner >>> cases (cf. e.g. ticket #1740). >>> >>> To minimize work, it would be useful just to use the Boost functions >>> directly, rather than doing manual C transcriptions. The drawback here is >>> that the portion of Boost library required weighs about 8 MB of source >>> code, and we would most likely like to bundle it, as it is not really a >>> standard part of many installations. This does not reflect much on the >>> compiled binary size, however. >>> >>> I'm not 100 % certain about the compiler support. Perhaps C++ is already >>> mature enough to work across the platforms we care about. > > > An obvious worry is Windows. MinGW is explicitly not supported, and "may or > may not work": > http://www.boost.org/doc/libs/1_51_0/more/getting_started/windows.html I think that line refers only to MSYS shell, not to MinGW compiler. A few years ago I managed to get quantlib with python bindings built on top of boost with MinGW without too many problems, given that I had not much idea of what I was doing. (I'm not sure I didn't get boost binaries and only build quantlib) It looks like there are lots of instructions and examples building boost with mingw on the web. Josef > >>> >>> >>> I'm not aware of many good BSD-compatible floating-point special >>> function libraries, so if you know others, or would be opposed to >>> bundling Boost, please chime in! >>> >> >> I think using the boost library is a good idea. It is well tested and >> looks to support quad precision, something we will probably want at some >> point. It also looks to be highly templated and tightly integrated, so I >> suspect getting it properly interfaced might be non-trivial. The same holds >> for the distributions, but we have done much the same. 
It might be worth >> looking over the boost classes for some ideas. >> >> As to the size of the code, the current scipy/special library is ~40MB and >> I expect we can get rid of some of that. > > > It's ~11Mb for me. > >> We should check for LLVM compatibility to make sure Apple isn't a problem, >> but it looks like most other C++ compilers will work, Boost does try hard >> for universality. >> >> Compile times will probably increase if we keep all the templates. > > > That's a much bigger worry than how many Mb of source code it is. Does > anyone have an idea of how much compile time would increase and how much RAM > it would use? > > Ralf > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From josef.pktd at gmail.com Sun Oct 7 21:31:12 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 7 Oct 2012 21:31:12 -0400 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Sun, Oct 7, 2012 at 7:58 PM, Fernando Perez wrote: > On Sun, Oct 7, 2012 at 8:40 AM, Pauli Virtanen wrote: >> I'm not 100 % certain about the compiler support. Perhaps C++ is already >> mature enough to work across the platforms we care about. > > It's worth having a good test of the compiler situation first. I have > horrible memories a few years ago (not that long, though) of trying to > use the Boost Graph Library for a project with python wrappers, and > the Boost version this used wouldn't even compile with the newest gcc > of the current ubuntu at the time. My only solution was to build gcc > itself from source. > > So before we end up foisting upon RedHat users or similar the > requirement that to build scipy they need to rebuild their compilers > from source, which I suspect wouldn't be a very popular move, let's > find out what is the oldest version of gcc this particular part of > boost will require. 
> > Boost is great, but it's also famous for pushing compilers very, very > far beyond their comfort zone. So this step should not be taken > lightly. It would also be good to know: > > - does it compile with MS compilers? If so, what's the oldest version > that works? the boost doc page that Ralf linked to has Visual Studio 2003 and 2005 as example cases for compilers and IDEs Josef > - and what about the Intel ones? > > Not trying to rain on the parade, but over the last 10 years I've > tried to use boost a few times, and every occasion has led to compiler > pain. So I'd be cautious with putting it in as a scipy dependency. > > If it turns out that this part of Boost is less sensitive to compiler > details, then great! I'd love to be proven wrong in my paranoia > here... > > Cheers, > > f > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From slasley at space.umd.edu Sun Oct 7 15:40:47 2012 From: slasley at space.umd.edu (Scott Lasley) Date: Sun, 7 Oct 2012 15:40:47 -0400 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: <1D321B84-1571-4C8A-A901-F3AEF238A142@space.umd.edu> On Oct 7, 2012, at 12:48 PM, Pauli Virtanen wrote: > 07.10.2012 19:21, Charles R Harris kirjoitti: > [clip] >> I think using the boost library is a good idea. It is well tested and >> looks to support quad precision, something we will probably want at some >> point. It also looks to be highly templated and tightly integrated, so I >> suspect getting it properly interfaced might be non-trivial. The same >> holds for the distributions, but we have done much the same. It might be >> worth looking over the boost classes for some ideas. >> >> As to the size of the code, the current scipy/special library is ~40MB >> and I expect we can get rid of some of that. 
We should check for LLVM >> compatibility to make sure Apple isn't a problem, but it looks like most >> other C++ compilers will work, Boost does try hard for universality. >> >> Compile times will probably increase if we keep all the templates. > > Integrating it is actually not so hard, it's here if someone wants to > try (e.g. if it works at all on OSX): > > https://github.com/pv/scipy-work/commits/special-boost > > and you get `scipy.special._ufuncs_cxx.jv`. > > -- > Pauli Virtanen I was able to build it with clang 4.1 under OS X 10.8.2. I don't see any *_ufuncs_cxx.jv files, but I do see these _ufunc files in ./build ./build/lib.macosx-10.7-intel-2.7/scipy/special/_ufuncs.so ./build/lib.macosx-10.7-intel-2.7/scipy/special/_ufuncs_cxx.so ./build/lib.macosx-10.7-intel-2.7/scipy/special/generate_ufuncs.py ./build/temp.macosx-10.7-intel-2.7/scipy/special/_ufuncs.o ./build/temp.macosx-10.7-intel-2.7/scipy/special/_ufuncs_cxx.o I see scipy.special._ufuncs which is not in scipy-0.12.0.dev_941351d from github In [2]: print dir(scipy.special._ufuncs) ['__builtins__', '__doc__', '__file__', '__name__', '__package__', '__test__', '_eval_chebyt', '_lambertw', 'airy', 'airye', 'bdtr', 'bdtrc', 'bdtri', 'bdtrik', 'bdtrin', 'bei', 'beip', 'ber', 'berp', 'besselpoly', 'beta', 'betainc', 'betaincinv', 'betaln', 'btdtr', 'btdtri', 'btdtria', 'btdtrib', 'cbrt', 'chdtr', 'chdtrc', 'chdtri', 'chdtriv', 'chndtr', 'chndtridf', 'chndtrinc', 'chndtrix', 'cosdg', 'cosm1', 'cotdg', 'dawsn', 'ellipe', 'ellipeinc', 'ellipj', 'ellipkinc', 'ellipkm1', 'erf', 'erfc', 'errprint', 'exp1', 'exp10', 'exp2', 'expi', 'expit', 'expm1', 'expn', 'fdtr', 'fdtrc', 'fdtri', 'fdtridfd', 'fresnel', 'gamma', 'gammainc', 'gammaincc', 'gammainccinv', 'gammaincinv', 'gammaln', 'gdtr', 'gdtrc', 'gdtria', 'gdtrib', 'gdtrix', 'hankel1', 'hankel1e', 'hankel2', 'hankel2e', 'hyp1f1', 'hyp1f2', 'hyp2f0', 'hyp2f1', 'hyp3f0', 'hyperu', 'i0', 'i0e', 'i1', 'i1e', 'it2i0k0', 'it2j0y0', 'it2struve0', 'itairy', 'iti0k0', 
'itj0y0', 'itmodstruve0', 'itstruve0', 'iv', 'ive', 'j0', 'j1', 'jn', 'jv', 'jve', 'k0', 'k0e', 'k1', 'k1e', 'kei', 'keip', 'kelvin', 'ker', 'kerp', 'kn', 'kolmogi', 'kolmogorov', 'kv', 'kve', 'log1p', 'log_ndtr', 'logit', 'lpmv', 'mathieu_a', 'mathieu_b', 'mathieu_cem', 'mathieu_modcem1', 'mathieu_modcem2', 'mathieu_modsem1', 'mathieu_modsem2', 'mathieu_sem', 'modfresnelm', 'modfresnelp', 'modstruve', 'nbdtr', 'nbdtrc', 'nbdtri', 'nbdtrik', 'nbdtrin', 'ncfdtr', 'ncfdtri', 'ncfdtridfd', 'ncfdtridfn', 'ncfdtrinc', 'nctdtr', 'nctdtridf', 'nctdtrinc', 'nctdtrit', 'ndtr', 'ndtri', 'nrdtrimn', 'nrdtrisd', 'obl_ang1', 'obl_ang1_cv', 'obl_cv', 'obl_rad1', 'obl_rad1_cv', 'obl_rad2', 'obl_rad2_cv', 'pbdv', 'pbvv', 'pbwa', 'pdtr', 'pdtrc', 'pdtri', 'pdtrik', 'pro_ang1', 'pro_ang1_cv', 'pro_cv', 'pro_rad1', 'pro_rad1_cv', 'pro_rad2', 'pro_rad2_cv', 'psi', 'radian', 'rgamma', 'round', 'shichi', 'sici', 'sindg', 'smirnov', 'smirnovi', 'spence', 'stdtr', 'stdtridf', 'stdtrit', 'struve', 'tandg', 'tklmbda', 'wofz', 'y0', 'y1', 'yn', 'yv', 'yve', 'zeta', 'zetac'] In [3]: scipy.special.test('full') Running unit tests for scipy.special NumPy version 1.8.0.dev-3f10c36 NumPy is installed in /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy-1.8.0.dev_3f10c36-py2.7-macosx-10.7-intel.egg/numpy SciPy version 0.12.0.dev-ccbfd79 SciPy is installed in /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy-0.12.0.dev_ccbfd79-py2.7-macosx-10.7-intel.egg/scipy Python version 2.7.3 (default, Jun 24 2012, 10:45:32) [GCC 4.2.1 Compatible Apple Clang 4.0 ((tags/Apple/clang-421.10.42))] nose version 1.1.2 
..................E...................................................K.K.............................................................................................................................................................................................................................................................................................................................................................................................K........K..............SSSSSSSS............................ ====================================================================== ERROR: test_iv_cephes_vs_amos_mass_test (test_basic.TestBessel) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy-0.12.0.dev_ccbfd79-py2.7-macosx-10.7-intel.egg/scipy/special/tests/test_basic.py", line 1643, in test_iv_cephes_vs_amos_mass_test v[imsk] = v.astype(int) ValueError: NumPy boolean array indexing assignment cannot assign 1000000 input values to the 125035 output values where the mask is true ---------------------------------------------------------------------- Ran 514 tests in 1.993s FAILED (KNOWNFAIL=4, SKIP=8, errors=1) Out[3]: The same test fails in scipy-0.12.0.dev_941351d From cournape at gmail.com Mon Oct 8 06:41:41 2012 From: cournape at gmail.com (David Cournapeau) Date: Mon, 8 Oct 2012 11:41:41 +0100 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Mon, Oct 8, 2012 at 12:58 AM, Fernando Perez wrote: > On Sun, Oct 7, 2012 at 8:40 AM, Pauli Virtanen wrote: >> I'm not 100 % certain about the compiler support. Perhaps C++ is already >> mature enough to work across the platforms we care about. > > It's worth having a good test of the compiler situation first. 
I have > horrible memories a few years ago (not that long, though) of trying to > use the Boost Graph Library for a project with python wrappers, and > the Boost version this used wouldn't even compile with the newest gcc > of the current ubuntu at the time. My only solution was to build gcc > itself from source. > > So before we end up foisting upon RedHat users or similar the > requirement that to build scipy they need to rebuild their compilers > from source, which I suspect wouldn't be a very popular move, let's > find out what is the oldest version of gcc this particular part of > boost will require. > > Boost is great, but it's also famous for pushing compilers very, very > far beyond their comfort zone. So this step should not be taken > lightly. It would also be good to know: > > - does it compile with MS compilers? If so, what's the oldest version > that works? > - and what about the Intel ones? > > Not trying to rain on the parade, but over the last 10 years I've > tried to use boost a few times, and every occasion has led to compiler > pain. So I'd be cautious with putting it in as a scipy dependency. We don't want to put boost altogether as a dependency, for all the reasons you're giving. In this case, I don't mind so much, because: - the templates for each special function are used in only one compilation unit, so they will be compiled only once - there is a mode where you can force the exceptions to be disabled and use ERRNO instead, so not too many issues there either. David From njs at pobox.com Mon Oct 8 07:09:04 2012 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 8 Oct 2012 12:09:04 +0100 Subject: [SciPy-Dev] Bundling Boost?
In-Reply-To: References: Message-ID: On Mon, Oct 8, 2012 at 11:41 AM, David Cournapeau wrote: > In this case, I don't mind so much, because: > - the templates for each special function are used in only one > compilation unit, so they will be compiled only once > - there is a mode where you can force the exceptions to be disabled > and use ERRNO instead, so not too many issues there either. Wait, literally 'errno'? That seems like a terrible idea, given how tightly wound that API is with low-level system interfaces and compatibility hacks. C wrappers that looked like

    extern "C" double my_c_j0_wrapper(double x, int * errflags)
    {
        try {
            return boost_j0(x);
        } catch (const boost::exception & e) {
            *errflags |= exc_to_flags(e);
            return NAN;
        }
    }

would be just as easy to auto-generate as anything else, and we'd never have to fight with thread-local storage platform compatibility issues or anything... -n From cournape at gmail.com Mon Oct 8 08:51:50 2012 From: cournape at gmail.com (David Cournapeau) Date: Mon, 8 Oct 2012 13:51:50 +0100 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Mon, Oct 8, 2012 at 12:09 PM, Nathaniel Smith wrote: > On Mon, Oct 8, 2012 at 11:41 AM, David Cournapeau wrote: >> In this case, I don't mind so much, because: >> - the templates for each special function are used in only one >> compilation unit, so they will be compiled only once >> - there is a mode where you can force the exceptions to be disabled >> and use ERRNO instead, so not too many issues there either. > > Wait, literally 'errno'? That seems like a terrible idea, given how > tightly wound that API is with low-level system interfaces and > compatibility hacks.
While I am not a fan of errno either, errno_on_error seems the closest to what libraries using C semantics would expect: http://www.boost.org/doc/libs/1_51_0/libs/math/doc/sf_and_dist/html/math_toolkit/main_overview/error_handling.html As far as compatibility goes, we have been using the C math functions which have the same behaviour for quite some time now, David From andy.terrel at gmail.com Mon Oct 8 10:22:39 2012 From: andy.terrel at gmail.com (Andy Ray Terrel) Date: Mon, 8 Oct 2012 09:22:39 -0500 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: To comment on compiler support for boost, I have supported Boost on Intel for some time. The problem compilers are pgi and xl, but there is pressure on these vendors to better support metatemplate programming. -- Andy From scopatz at gmail.com Mon Oct 8 11:41:03 2012 From: scopatz at gmail.com (Anthony Scopatz) Date: Mon, 8 Oct 2012 11:41:03 -0400 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Mon, Oct 8, 2012 at 10:22 AM, Andy Ray Terrel wrote: > To comment on compiler support for boost, I have supported Boost on > Intel for some time. The problem compilers are pgi and xl, but there > is pressure on these vendors to better support metatemplate > programming. > I second Andy's comments about pgi and xl. Otherwise, I am generally in favor of adding Boost. Be Well Anthony > > -- Andy > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Mon Oct 8 12:25:49 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 8 Oct 2012 10:25:49 -0600 Subject: [SciPy-Dev] Bundling Boost?
In-Reply-To: References: Message-ID: On Mon, Oct 8, 2012 at 9:41 AM, Anthony Scopatz wrote: > On Mon, Oct 8, 2012 at 10:22 AM, Andy Ray Terrel wrote: > >> To comment on compiler support for boost, I have supported Boost on >> Intel for some time. The problem compilers are pgi and xl, but there >> is pressure on these vendors to better support metatemplate >> progamming. >> > > I second Andy's comments about pgi and xl. Otherwise, I am generally in > favor > of adding Boost. > I don't think the special_functions make use of metatemplate techniques. The ublas templates in boost/math do, but that is a different application. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From andy.terrel at gmail.com Mon Oct 8 13:22:35 2012 From: andy.terrel at gmail.com (Andy Ray Terrel) Date: Mon, 8 Oct 2012 12:22:35 -0500 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Mon, Oct 8, 2012 at 11:25 AM, Charles R Harris wrote: > > > On Mon, Oct 8, 2012 at 9:41 AM, Anthony Scopatz wrote: >> >> On Mon, Oct 8, 2012 at 10:22 AM, Andy Ray Terrel >> wrote: >>> >>> To comment on compiler support for boost, I have supported Boost on >>> Intel for some time. The problem compilers are pgi and xl, but there >>> is pressure on these vendors to better support metatemplate >>> progamming. >> >> >> I second Andy's comments about pgi and xl. Otherwise, I am generally in >> favor >> of adding Boost. > > > I don't think the special_functions make use of metatemplate techniques. The > ublas templates in boost/math do, but that is a different application. > > Chuck Well pgi is still doing really bad. 
http://www.boost.org/development/tests/trunk/developer/math.html > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From aron at ahmadia.net Mon Oct 8 13:27:43 2012 From: aron at ahmadia.net (Aron Ahmadia) Date: Mon, 8 Oct 2012 20:27:43 +0300 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: Andy, I agree with your point but why are you linking to boost trunk as opposed to tests against a recent release? My understanding is that the IBM compiler team, for example, always makes sure that some version of boost builds against their C++ compilers, even if it is not bleeding-edge. A On Mon, Oct 8, 2012 at 8:22 PM, Andy Ray Terrel wrote: > On Mon, Oct 8, 2012 at 11:25 AM, Charles R Harris > wrote: > > > > > > On Mon, Oct 8, 2012 at 9:41 AM, Anthony Scopatz > wrote: > >> > >> On Mon, Oct 8, 2012 at 10:22 AM, Andy Ray Terrel > > >> wrote: > >>> > >>> To comment on compiler support for boost, I have supported Boost on > >>> Intel for some time. The problem compilers are pgi and xl, but there > >>> is pressure on these vendors to better support metatemplate > >>> progamming. > >> > >> > >> I second Andy's comments about pgi and xl. Otherwise, I am generally in > >> favor > >> of adding Boost. > > > > > > I don't think the special_functions make use of metatemplate techniques. > The > > ublas templates in boost/math do, but that is a different application. > > > > Chuck > > Well pgi is still doing really bad. 
> > http://www.boost.org/development/tests/trunk/developer/math.html > > > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Mon Oct 8 14:10:51 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 8 Oct 2012 12:10:51 -0600 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Mon, Oct 8, 2012 at 11:27 AM, Aron Ahmadia wrote: > Andy, I agree with your point but why are you linking to boost trunk as > opposed to tests against a recent release? My understanding is that the > IBM compiler team, for example, always makes sure that some version of > boost builds against their C++ compilers, even if it is not bleeding-edge. > The released version looks better. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From andy.terrel at gmail.com Mon Oct 8 14:51:58 2012 From: andy.terrel at gmail.com (Andy Ray Terrel) Date: Mon, 8 Oct 2012 13:51:58 -0500 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Mon, Oct 8, 2012 at 1:10 PM, Charles R Harris wrote: > > > On Mon, Oct 8, 2012 at 11:27 AM, Aron Ahmadia wrote: >> >> Andy, I agree with your point but why are you linking to boost trunk as >> opposed to tests against a recent release? My understanding is that the IBM >> compiler team, for example, always makes sure that some version of boost >> builds against their C++ compilers, even if it is not bleeding-edge. > > > The released version looks better. > > Chuck Aron, I linked to that one because it listed pgi, it didn't list xl.
I don't have access to a machine with xl *nudge**nudge* but since you do for at least another week, you might provide some insight. Chuck, unfortunately that one is better because it doesn't list pgi. Personally, I use gcc, intel, and/or clang for almost everything so I'm not pushing for keeping progress back for pgi and xl, rather just commenting on the state of compilers since Fernando brought it up. -- Andy > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From pav at iki.fi Mon Oct 8 18:19:39 2012 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 09 Oct 2012 01:19:39 +0300 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: 08.10.2012 02:58, Fernando Perez kirjoitti: [clip] > So before we end up foisting upon RedHat users or similar the > requirement that to build scipy they need to rebuild their compilers > from source, which I suspect wouldn't be a very popular move, let's > find out what is the oldest version of gcc this particular part of > boost will require. The venerable gcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-48) manages to compile this subset of Boost without problems, so I wouldn't be too worried about Linux platforms. -- Pauli Virtanen From njs at pobox.com Mon Oct 8 18:28:24 2012 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 8 Oct 2012 23:28:24 +0100 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Mon, Oct 8, 2012 at 11:19 PM, Pauli Virtanen wrote: > 08.10.2012 02:58, Fernando Perez kirjoitti: > [clip] >> So before we end up foisting upon RedHat users or similar the >> requirement that to build scipy they need to rebuild their compilers >> from source, which I suspect wouldn't be a very popular move, let's >> find out what is the oldest version of gcc this particular part of >> boost will require. 
> > The venerable gcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-48) manages to > compile this subset of Boost without problems, so I wouldn't be too > worried about Linux platforms. numpy currently requires a specific rather-old version of mingw to build on win32, might be a good idea to double-check that too. -n From ben.root at ou.edu Tue Oct 9 09:09:07 2012 From: ben.root at ou.edu (Benjamin Root) Date: Tue, 9 Oct 2012 09:09:07 -0400 Subject: [SciPy-Dev] Scipy docstrings In-Reply-To: References: Message-ID: On Fri, Sep 21, 2012 at 3:25 PM, Cera, Tim wrote: > Thanks. I looked at the scipy commits yesterday and came to the same > conclusion. > > Kindest regards, > Tim > > I was just looking at scipy.io.netcdf's docstring on the doceditor page, and it is showing a rather large change waiting for review, essentially undoing all of the work I did back in 2010. Note that the changes have yet to be merged in, but I don't know how to flag as rejected. Ben Root -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.root at ou.edu Tue Oct 9 09:14:55 2012 From: ben.root at ou.edu (Benjamin Root) Date: Tue, 9 Oct 2012 09:14:55 -0400 Subject: [SciPy-Dev] Scipy docstrings In-Reply-To: References: Message-ID: On Tue, Oct 9, 2012 at 9:09 AM, Benjamin Root wrote: > > > On Fri, Sep 21, 2012 at 3:25 PM, Cera, Tim wrote: > >> Thanks. I looked at the scipy commits yesterday and came to the same >> conclusion. >> >> Kindest regards, >> Tim >> >> > I was just looking at scipy.io.netcdf's docstring on the doceditor page, > and it is showing a rather large change waiting for review, essentially > undoing all of the work I did back in 2010. Note that the changes have yet > to be merged in, but I don't know how to flag as rejected. > > Ben Root > Actually, looking again at the docstrings, I am not at all sure if what is currently in v0.11.x is the old stuff, while what is on the doceditor is what I originally created and had merged in back in 2010. 
This is confusing. Ben Root -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Tue Oct 9 12:31:04 2012 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 09 Oct 2012 19:31:04 +0300 Subject: [SciPy-Dev] Scipy docstrings In-Reply-To: References: Message-ID: 09.10.2012 16:14, Benjamin Root kirjoitti: [clip] > Actually, looking again at the docstrings, I am not at all sure if what > is currently in v0.11.x is the old stuff, while what is on the doceditor > is what I originally created and had merged in back in 2010. This is > confusing. What is in the doc editor is still what you originally wrote: http://docs.scipy.org/scipy/docs/scipy.io.netcdf/log/ Whether the merges suggested by it always make sense is then another question. -- Pauli Virtanen From jstevenson131 at gmail.com Tue Oct 9 17:30:05 2012 From: jstevenson131 at gmail.com (Jacob Stevenson) Date: Tue, 9 Oct 2012 22:30:05 +0100 Subject: [SciPy-Dev] optimize: add algorithm for global optimization basin hopping Message-ID: Hi everyone, I just submitted a pull request to incorporate a basinhopping routine in the optimize package. Basinhopping is a powerful algorithm, but all the hard work would be done by the minimizers that already exist in scipy.optimize, so the additional code needed is really not very much. The following is from the documentation notes: Basin hopping is a random algorithm which attempts to find the global minimum of a smooth scalar function of one or more variables. The algorithm was originally described by David Wales http://www-wales.ch.cam.ac.uk/ . The algorithm is iterative with each iteration composed of the following steps 1) random displacement of the coordinates 2) local minimization 3) accept or reject the new coordinates based on the minimized function value. This global minimization method has been shown to be extremely efficient on a wide variety of problems in physics and chemistry. 
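The three steps above can be sketched in a few dozen lines of pure Python. This is only a toy illustration: the objective f below is a hypothetical one-dimensional double-well function, local_minimize is a crude gradient-descent stand-in for the scipy.optimize minimizers that do the real work in the pull request, and the Metropolis accept/reject rule is one common choice for step 3, not necessarily the one the submitted code uses.

```python
import math
import random

def f(x):
    # hypothetical double-well objective; global minimum near x = -1.04
    return (x ** 2 - 1.0) ** 2 + 0.3 * x

def local_minimize(func, x, lr=0.01, steps=500, h=1e-6):
    # crude gradient descent with a numerical derivative, standing in
    # for the scipy.optimize local minimizers the routine delegates to
    for _ in range(steps):
        grad = (func(x + h) - func(x - h)) / (2.0 * h)
        x -= lr * grad
    return x

def basinhopping(func, x0, niter=200, stepsize=2.0, T=1.0, seed=0):
    rng = random.Random(seed)
    x = local_minimize(func, x0)
    fx = func(x)
    best_x, best_f = x, fx
    for _ in range(niter):
        # 1) random displacement of the coordinates
        trial = x + rng.uniform(-stepsize, stepsize)
        # 2) local minimization
        trial = local_minimize(func, trial)
        f_trial = func(trial)
        if f_trial < best_f:
            best_x, best_f = trial, f_trial
        # 3) accept or reject the new coordinates based on the minimized
        #    function value (Metropolis rule: uphill moves sometimes taken)
        if f_trial <= fx or rng.random() < math.exp(-(f_trial - fx) / T):
            x, fx = trial, f_trial
    return best_x, best_f

print(basinhopping(f, 2.0))
```

Starting from x0 = 2.0, the first local minimization lands in the well with the higher minimum; the random displacements then let the search hop over the barrier into the deeper well near x = -1.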
It is especially efficient when the function has many minima separated by large barriers. See the Cambridge Cluster Database http://www-wales.ch.cam.ac.uk/CCD.html for database of molecular systems that have been optimized primarily using basin hopping. This database includes minimization problems exceeding 300 degrees of freedom. Thanks, Jake p.s. this is my first post and first submission -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Oct 10 14:55:25 2012 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 10 Oct 2012 20:55:25 +0200 Subject: [SciPy-Dev] optimize: add algorithm for global optimization basin hopping In-Reply-To: References: Message-ID: On Tue, Oct 9, 2012 at 11:30 PM, Jacob Stevenson wrote: > Hi everyone, I just submitted a pull request to incorporate a basinhopping > routine in the optimize package. Basinhopping is a powerful algorithm, > but all the hard work would be done by the minimizers that already exist in > scipy.optimize, so the additional code needed is really not very much. > > The following is from the documentation notes: > > Basin hopping is a random algorithm which attempts to find the global > minimum of a smooth scalar function of one or more variables. The > algorithm was originally described by David Wales > http://www-wales.ch.cam.ac.uk/ . The algorithm is iterative with each > iteration composed of the following steps > > 1) random displacement of the coordinates > > 2) local minimization > > 3) accept or reject the new coordinates based on the minimized function > value. > > This global minimization method has been shown to be extremely efficient > on a wide variety of problems in physics and chemistry. It is especially > efficient when the function has many minima separated by large barriers. 
> See the Cambridge Cluster Database http://www-wales.ch.cam.ac.uk/CCD.htmlfor database of molecular systems that have been optimized primarily using > basin hopping. This database includes minimization problems exceeding 300 > degrees of freedom. > > Thanks, > Jake > > p.s. this is my first post and first submission > Hi Jake, welcome! It looks to me like basin hopping would be a nice addition to scipy.optimize. It looks quite similar to simulated annealing, which we already have, and may be more efficient for certain classes of problems. You're hinting at that already in the notes section, but more details on that comparison would be good to put in the docs. A benchmark comparing your algorithm with anneal() would be good to see also. This PR made me wonder what other algorithms would be good to have, and what's out of scope for scipy. My feeling is that some similar stochastic algorithms to this one could be nice to have, but that other types of global optimizers are out of scope - there's OpenOpt, IPOPT, PyGMO, PyEvolve etc. for that. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Wed Oct 10 15:24:21 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 10 Oct 2012 15:24:21 -0400 Subject: [SciPy-Dev] optimize: add algorithm for global optimization basin hopping In-Reply-To: References: Message-ID: On Wed, Oct 10, 2012 at 2:55 PM, Ralf Gommers wrote: > > > On Tue, Oct 9, 2012 at 11:30 PM, Jacob Stevenson > wrote: >> >> Hi everyone, I just submitted a pull request to incorporate a basinhopping >> routine in the optimize package. Basinhopping is a powerful algorithm, but >> all the hard work would be done by the minimizers that already exist in >> scipy.optimize, so the additional code needed is really not very much. 
>> >> The following is from the documentation notes: >> >> Basin hopping is a random algorithm which attempts to find the global >> minimum of a smooth scalar function of one or more variables. The algorithm >> was originally described by David Wales http://www-wales.ch.cam.ac.uk/ . >> The algorithm is iterative with each iteration composed of the following >> steps >> >> 1) random displacement of the coordinates >> >> 2) local minimization >> >> 3) accept or reject the new coordinates based on the minimized function >> value. >> >> This global minimization method has been shown to be extremely efficient >> on a wide variety of problems in physics and chemistry. It is especially >> efficient when the function has many minima separated by large barriers. >> See the Cambridge Cluster Database http://www-wales.ch.cam.ac.uk/CCD.html >> for database of molecular systems that have been optimized primarily using >> basin hopping. This database includes minimization problems exceeding 300 >> degrees of freedom. >> >> Thanks, >> Jake >> >> p.s. this is my first post and first submission > > > Hi Jake, welcome! > > It looks to me like basin hopping would be a nice addition to > scipy.optimize. It looks quite similar to simulated annealing, which we > already have, and may be more efficient for certain classes of problems. > You're hinting at that already in the notes section, but more details on > that comparison would be good to put in the docs. A benchmark comparing your > algorithm with anneal() would be good to see also. A comparison would be useful, but I think the differences are large enough that the relative benefits would depend a lot on the problem. Anneal has too many tuning parameters, and like most of these global optimizers they use a large number of function evaluations even close to the optimum.
(my impression) My impression is that hybrid optimizers, like basinhopping, are much better for problems that have several local minima but still have a nice local behavior. Several of the problems in statsmodels would fall into this category. I sent a while ago a message to the user mailing list ("OT: global optimizati?on, hybrid global local search"). I think this kind of optimizers would be a good addition to scipy, and I would be glad to be able to use something more systematic than just picking several random starting values for local optimization. Josef > > This PR made me wonder what other algorithms would be good to have, and > what's out of scope for scipy. My feeling is that some similar stochastic > algorithms to this one could be nice to have, but that other types of global > optimizers are out of scope - there's OpenOpt, IPOPT, PyGMO, PyEvolve etc. > for that. > > Cheers, > Ralf > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From josef.pktd at gmail.com Wed Oct 10 15:38:10 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 10 Oct 2012 15:38:10 -0400 Subject: [SciPy-Dev] optimize: add algorithm for global optimization basin hopping In-Reply-To: References: Message-ID: On Wed, Oct 10, 2012 at 3:24 PM, wrote: > On Wed, Oct 10, 2012 at 2:55 PM, Ralf Gommers wrote: >> >> >> On Tue, Oct 9, 2012 at 11:30 PM, Jacob Stevenson >> wrote: >>> >>> Hi everyone, I just submitted a pull request to incorporate a basinhopping >>> routine in the optimize package. Basinhopping is a powerful algorithm, but >>> all the hard work would be done by the minimizers that already exist in >>> scipy.optimize, so the additional code needed is really not very much. 
>>> >>> The following is from the documentation notes: >>> >>> Basin hopping is a random algorithm which attempts to find the global >>> minimum of a smooth scalar function of one or more variables. The algorithm >>> was originally described by David Wales http://www-wales.ch.cam.ac.uk/ . >>> The algorithm is iterative with each iteration composed of the following >>> steps >>> >>> 1) random displacement of the coordinates >>> >>> 2) local minimization >>> >>> 3) accept or reject the new coordinates based on the minimized function >>> value. >>> >>> This global minimization method has been shown to be extremely efficient >>> on a wide variety of problems in physics and chemistry. It is especially >>> efficient when the function has many minima separated by large barriers. >>> See the Cambridge Cluster Database http://www-wales.ch.cam.ac.uk/CCD.html >>> for database of molecular systems that have been optimized primarily using >>> basin hopping. This database includes minimization problems exceeding 300 >>> degrees of freedom. >>> >>> Thanks, >>> Jake >>> >>> p.s. this is my first post and first submission >> >> >> Hi Jake, welcome! >> >> It looks to me like basin hopping would be a nice addition to >> scipy.optimize. It looks quite similar to simulated annealing, which we >> already have, and may be more efficient for certain classes of problems. >> You're hinting at that already in the notes section, but more details on >> that comparison would be good to put in the docs. A benchmark comparing your >> algorithm with anneal() would be good to see also. > > A comparison would be useful, but I think the differences are large > enough that the relative benefits would depend a lot on the problem. > > Anneal has to many tuning parameters, and like most of these global > optimizers they use a large number of function evaluations even close > to the optimum. 
(my impression) > > My impression is that hybrid optimizers, like basinhopping, are much > better for problems that have several local minima but still have a > nice local behavior. Several of the problems in statsmodels would fall > into this category. > > I sent a while ago a message to the user mailing list ("OT: global > optimization, hybrid global local search"). > > I think this kind of optimizers would be a good addition to scipy, and > I would be glad to be able to use something more systematic than just > picking several random starting values for local optimization. just as additional motivation: least trimmed squares: It calculates least squares estimation on a subset of the observations and treats the rest as outliers. There can be a very large number of local optima, given by which observations are included. The best algorithm I have seen in the literature uses a few hundred random starts, optimizes a few steps and then optimizes the 10 or 30 best to convergence. Josef > > Josef > > >> >> This PR made me wonder what other algorithms would be good to have, and >> what's out of scope for scipy. My feeling is that some similar stochastic >> algorithms to this one could be nice to have, but that other types of global >> optimizers are out of scope - there's OpenOpt, IPOPT, PyGMO, PyEvolve etc. >> for that. >> >> Cheers, >> Ralf >> >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> From ralf.gommers at gmail.com Wed Oct 10 17:49:55 2012 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 10 Oct 2012 23:49:55 +0200 Subject: [SciPy-Dev] Bundling Boost? 
In-Reply-To: References: Message-ID: On Tue, Oct 9, 2012 at 12:28 AM, Nathaniel Smith wrote: > On Mon, Oct 8, 2012 at 11:19 PM, Pauli Virtanen wrote: > > 08.10.2012 02:58, Fernando Perez wrote: > > [clip] > >> So before we end up foisting upon RedHat users or similar the > >> requirement that to build scipy they need to rebuild their compilers > >> from source, which I suspect wouldn't be a very popular move, let's > >> find out what is the oldest version of gcc this particular part of > >> boost will require. > > > > The venerable gcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-48) manages to > > compile this subset of Boost without problems, so I wouldn't be too > > worried about Linux platforms. > > numpy currently requires a specific rather-old version of mingw to > build on win32, might be a good idea to double-check that too. > Pauli's branch builds for me with MinGW. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From wardefar at iro.umontreal.ca Wed Oct 10 18:07:06 2012 From: wardefar at iro.umontreal.ca (David Warde-Farley) Date: Wed, 10 Oct 2012 18:07:06 -0400 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Sun, Oct 7, 2012 at 11:40 AM, Pauli Virtanen wrote: > Hi, > > I'd like to consider replacing some of the function implementations in > scipy.special with versions from the C++ Boost library, cf. > http://www.boost.org/doc/libs/1_51_0/libs/math/doc/sf_and_dist/html/index.html > > The reason is that these implementations appear to be made with more > care than what we use currently, and behave more correctly in corner > cases (cf. e.g. ticket #1740). > > To minimize work, it would be useful just to use the Boost functions > directly, rather than doing manual C transcriptions. The drawback here is > that the portion of Boost library required weighs about 8 MB of source > code, and we would most likely like to bundle it, as it is not really a > standard part of many installations.
This does not reflect much on the > compiled binary size, however. > > I'm not 100 % certain about the compiler support. Perhaps C++ is already > mature enough to work across the platforms we care about. > > I'm not aware of many good BSD-compatible floating-point special > function libraries, so if you know others, or would be opposed to > bundling Boost, please chime up! At SciPy2011 a few of us managed to convince Andreas Klöckner to bundle the necessary bits of boost with PyCUDA so as to remove a dependency and grow the user-base. I recall this causing trouble for some segment of his users that had boost installed on the system, but I don't know the details. Andreas (CCed): Do you remember what problems those were? Fred B. told me about it over lunch one day but I can't recall what exactly went wrong. David From mail.till at gmx.de Wed Oct 10 22:20:02 2012 From: mail.till at gmx.de (Till Stensitzki) Date: Thu, 11 Oct 2012 02:20:02 +0000 (UTC) Subject: [SciPy-Dev] optimize: add algorithm for global optimization basin hopping References: Message-ID: As far as I can tell, simulated annealing is quite out of touch with more modern optimization strategies. See http://scicomp.stackexchange.com/questions/3372/simulated-annealing-proof-of-convergence and the references mentioned, so more optimization algorithms are good. But I think the scipy staff should decide the scope of scipy: either add almost every proven and usable algorithm that has a Python implementation, or provide only the most famous ones. I prefer the first way. OT things I would like to see in scipy, if anybody has too much time: * non-negative matrix factorization * a data fitting function, aka curve_fit on steroids. Maybe lmfit, kaptyn or mpfit. * a bounded linear least squares solver, like bvls.
* Bayesian frequency estimation * NFFT From g.crosswhite at uq.edu.au Wed Oct 10 23:46:33 2012 From: g.crosswhite at uq.edu.au (Gregory Crosswhite) Date: Thu, 11 Oct 2012 13:46:33 +1000 Subject: [SciPy-Dev] Making lobpcg support complex matrix Message-ID: <50764119.6020402@uq.edu.au> Dear all, At the moment lobpcg seems to have been designed with only real numbers in mind; this is unfortunate because I have been having some trouble with eigs and was hoping to try it out as an alternative. Fortunately, it looks to me like the solution is rather simple. I was able to get it to work by replacing all instances of ".T" with ".T.conj()" --- which meant that all of the dot products were now correct for complex numbers --- and by replacing the cast to "float64" with a cast to "complex128". Changing the second cast is of course less than ideal in the cases where "float64" is indeed what is being used, but something like it is needed to make the answers not be garbage for complex input matrices. With these changes, I generated 10x10 complex Hermitian matrices A and B to serve as respectively the problem matrix and the normalization/metric matrix, and ran lobpcg with a tolerance of 1e-10. For the standard eigenvalue problem with k=2 (and random X) lobpcg got essentially the same answers as eigh for the lowest two eigenvalue/eigenvector pairs, and for the generalized eigenvalue problem it got the same eigenvalues as eigh and it got eigenvectors whose entries were within ~10^-4 of the eigenvectors returned by eigh (after dividing by the first entry to make the vectors proportionately the same) and such that the normalized overlap between the eigenvectors of lobpcg and eigh (using B as the metric) was within ~ 10^-8 of 1 (not surprising as this is the square of the first number). 
I ran a quick check where I dropped the imaginary parts of A and B and re-ran this analysis, and saw the errors fall to respectively ~ 10^-6 and 10^-12, so the algorithm gets less accurate results for complex numbers than for real numbers, though I don't know enough about how it works to speculate on why this would be. Anyway, I don't know much about lobpcg so there might be some issue that I've missed in simply adding ".conj()" everywhere a ".T" appeared to fix the dot products. I do think it would be very nice, though, to be able to use lobpcg for complex matrices, and so I would be willing to submit a patch towards this end. Thoughts? Cheers, Greg From eraldo.pomponi at gmail.com Thu Oct 11 03:35:10 2012 From: eraldo.pomponi at gmail.com (Eraldo Pomponi) Date: Thu, 11 Oct 2012 09:35:10 +0200 Subject: [SciPy-Dev] optimize: add algorithm for global optimization basin hopping In-Reply-To: References: Message-ID: Just a note: OT things I would like to see in scipy, if anybody has too much time: > * non-negative matrix factorization > NMF is already available in scikit-learn. > * a data fitting function, aka curve_fit on steroids. Maybe lmfit, kaptyn > or > mpfit. > * a bounded linear least squares solver, like bvls. > * Bayesian frequency estimation > * NFFT Cheers, Eraldo -------------- next part -------------- An HTML attachment was scrubbed... URL: From wardefar at iro.umontreal.ca Thu Oct 11 01:11:27 2012 From: wardefar at iro.umontreal.ca (David Warde-Farley) Date: Thu, 11 Oct 2012 01:11:27 -0400 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: It looks like Andreas' message didn't make it to the list, so here it is forwarded. Looks like there are a few things to learn about bundling boost, but nothing show-stopping. Of course, some of it (compiler hell) is out of our control, but "get a better compiler" is a reasonable response on most platforms (... maybe not Windows).
---------- Forwarded message ---------- From: Andreas Kloeckner Date: Wed, Oct 10, 2012 at 7:43 PM Subject: Re: [SciPy-Dev] Bundling Boost? To: David Warde-Farley , SciPy Developers List David Warde-Farley writes: > On Sun, Oct 7, 2012 at 11:40 AM, Pauli Virtanen wrote: >> Hi, >> >> I'd like to consider replacing some of the function implementations in >> scipy.special with versions from the C++ Boost library, cf. >> http://www.boost.org/doc/libs/1_51_0/libs/math/doc/sf_and_dist/html/index.html >> >> The reason is that these implementations appear to be made with more >> care than what we use currently, and behave more correctly in corner >> cases (cf. e.g. ticket #1740). >> >> To minimize work, it would be useful just to use the Boost functions >> directly, rather than doing manual C transcriptions. The drawback here is >> that the portion of Boost library required weighs about 8 MB of source >> code, and we would most likely like to bundle it, as it is not really a >> standard part of many installations. This does not reflect much on the >> compiled binary size, however. >> >> I'm not 100 % certain about the compiler support. Perhaps C++ is already >> mature enough to work across the platforms we care about. >> >> I'm not aware of many good BSD-compatible floating-point special >> function libraries, so if you know others, or would be opposed to >> bundling Boost, please chime up! > > At SciPy2011 a few of us managed to convince Andreas Klöckner to > bundle the necessary bits of boost with PyCUDA so as to remove a > dependency and grow the user-base. I recall this causing trouble for > some segment of his users that had boost installed on the system, but > I don't know the details. > > Andreas (CCed): Do you remember what problems those were? Fred B. told > me about it over lunch one day but I can't recall what exactly went > wrong. Two main failure modes: - User has boost headers in /usr/include/boost.
If stuff goes wrong, you'll build the binary parts of boost that you ship (whether you have those depends on which of the libraries you use) against the non-matching headers on the user's system. This usually results in missing symbols, but could conceivably lead to crashes. This doesn't seem to happen much if you make sure that you have a -I for your shipped headers; in practice it bites only with old/broken compilers. - Two modules using their individual shipped versions of boost::python (if you plan on using that) get loaded. Really weird stuff starts happening. The fix here is to -Dboost=mypackageboost. Strangely, this works without a hitch. That way, the two live in different C++ namespaces. They won't know about each other, but they also won't step on each other's feet. I've found both of those to be pretty benign. See here: https://github.com/inducer/aksetup/blob/master/aksetup_helper.py#L534 and here: https://github.com/inducer/bpl-subset for my work on this. ("bpl" = "boost python library"). do-extract here uses bcp, which refers to this: http://www.boost.org/doc/libs/1_51_0/tools/bcp/doc/html/index.html HTH, Andreas From sturla at molden.no Thu Oct 11 05:37:13 2012 From: sturla at molden.no (Sturla Molden) Date: Thu, 11 Oct 2012 11:37:13 +0200 Subject: [SciPy-Dev] optimize: add algorithm for global optimization basin hopping In-Reply-To: References: Message-ID: <50769349.2030300@molden.no> On 11.10.2012 04:20, Till Stensitzki wrote: > As far as I can tell, simulated annealing is quite out of touch with more modern > optimization strategies. In Bayesian statistics, Markov chain Monte Carlo is the standard integration tool. If you want a peak or a root instead of the integral, Metropolis-Hastings becomes simulated annealing. Sturla From cournape at gmail.com Thu Oct 11 04:39:15 2012 From: cournape at gmail.com (David Cournapeau) Date: Thu, 11 Oct 2012 09:39:15 +0100 Subject: [SciPy-Dev] Bundling Boost?
In-Reply-To: References: Message-ID: On Thu, Oct 11, 2012 at 6:11 AM, David Warde-Farley wrote: > It looks like Andreas' message didn't make it to the list, so here it > is forwarded. > > Looks like there are a few things to learn about bundling boost, but > nothing show-stopping. Of course, some of it (compiler hell) is out of > our control, but "get a better compiler" is a reasonable response on > most platforms (... maybe not Windows). Bundling all of boost is not on the table I think: we are just talking about the math stuff which look mostly like simple templates, in which case most issues are not relevant. Using more boost should be a different discussion. David From charlesr.harris at gmail.com Thu Oct 11 09:42:30 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Thu, 11 Oct 2012 07:42:30 -0600 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Thu, Oct 11, 2012 at 2:39 AM, David Cournapeau wrote: > On Thu, Oct 11, 2012 at 6:11 AM, David Warde-Farley > wrote: > > It looks like Andreas' message didn't make it to the list, so here it > > is forwarded. > > > > Looks like there are a few things to learn about bundling boost, but > > nothing show-stopping. Of course, some of it (compiler hell) is out of > > our control, but "get a better compiler" is a reasonable response on > > most platforms (... maybe not Windows). > > Bundling all of boost is not on the table I think: we are just talking > about the math stuff which look mostly like simple templates, in > which case most issues are not relevant. Using more boost should be a > different discussion. > Well, I did notice more boost dependencies showing up in the include files. C++ libraries tend to have a lot of intertwined parts, making it difficult to tease out just that little bit you really want. Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From cournape at gmail.com Thu Oct 11 10:10:45 2012 From: cournape at gmail.com (David Cournapeau) Date: Thu, 11 Oct 2012 15:10:45 +0100 Subject: [SciPy-Dev] Bundling Boost? In-Reply-To: References: Message-ID: On Thu, Oct 11, 2012 at 2:42 PM, Charles R Harris wrote: > > > On Thu, Oct 11, 2012 at 2:39 AM, David Cournapeau > wrote: >> >> On Thu, Oct 11, 2012 at 6:11 AM, David Warde-Farley >> wrote: >> > It looks like Andreas' message didn't make it to the list, so here it >> > is forwarded. >> > >> > Looks like there are a few things to learn about bundling boost, but >> > nothing show-stopping. Of course, some of it (compiler hell) is out of >> > our control, but "get a better compiler" is a reasonable response on >> > most platforms (... maybe not Windows). >> >> Bundling all of boost is not on the table I think: we are just talking >> about the math stuff which look mostly like simple templates, in >> which case most issues are not relevant. Using more boost should be a >> different discussion. > > > Well, I did notice more boost dependencies showing up in the include files. > C++ libraries tend to have a lot of intertwined parts, making it difficult > to tease out just that little bit you really want. I guess that's what 'well designed' mean in c++ land :) More seriously, as long as we don't use the stuff outside math/special functions, it does not matter much (except for sdist size). And surely, smart_ptr and co can be removed in some ways or others. David From charlesr.harris at gmail.com Thu Oct 11 10:35:19 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Thu, 11 Oct 2012 08:35:19 -0600 Subject: [SciPy-Dev] Bundling Boost? 
In-Reply-To: References: Message-ID: On Thu, Oct 11, 2012 at 8:10 AM, David Cournapeau wrote: > On Thu, Oct 11, 2012 at 2:42 PM, Charles R Harris > wrote: > > > > > > On Thu, Oct 11, 2012 at 2:39 AM, David Cournapeau > > wrote: > >> > >> On Thu, Oct 11, 2012 at 6:11 AM, David Warde-Farley > >> wrote: > >> > It looks like Andreas' message didn't make it to the list, so here it > >> > is forwarded. > >> > > >> > Looks like there are a few things to learn about bundling boost, but > >> > nothing show-stopping. Of course, some of it (compiler hell) is out of > >> > our control, but "get a better compiler" is a reasonable response on > >> > most platforms (... maybe not Windows). > >> > >> Bundling all of boost is not on the table I think: we are just talking > >> about the math stuff which look mostly like simple templates, in > >> which case most issues are not relevant. Using more boost should be a > >> different discussion. > > > > > > Well, I did notice more boost dependencies showing up in the include > files. > > C++ libraries tend to have a lot of intertwined parts, making it > difficult > > to tease out just that little bit you really want. > > I guess that's what 'well designed' mean in c++ land :) > I think it is called 'reusability' in c++ land :) > > More seriously, as long as we don't use the stuff outside math/special > functions, it does not matter much (except for sdist size). And > surely, smart_ptr and co can be removed in some ways or others. > > Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From ognen at enthought.com Fri Oct 12 09:38:43 2012 From: ognen at enthought.com (Ognen Duzlevski) Date: Fri, 12 Oct 2012 08:38:43 -0500 Subject: [SciPy-Dev] scipy.org will be down today for about 30 minutes Message-ID: Hello, The scipy.org machine will be down today for about 30 minutes or so. I will be adding an extra drive to the machine since it has been slowly running out of space. Thanks! 
Ognen From ndbecker2 at gmail.com Fri Oct 12 11:25:24 2012 From: ndbecker2 at gmail.com (Neal Becker) Date: Fri, 12 Oct 2012 11:25:24 -0400 Subject: [SciPy-Dev] Bundling Boost? References: Message-ID: Perhaps some day things will be easier: https://svn.boost.org/trac/boost/wiki/CMakeModularizeLibrary https://svn.boost.org/trac/boost/wiki/CMakeModularizationStatus From nils106 at googlemail.com Wed Oct 17 08:35:46 2012 From: nils106 at googlemail.com (Nils Wagner) Date: Wed, 17 Oct 2012 14:35:46 +0200 Subject: [SciPy-Dev] Making lobpcg support complex matrix In-Reply-To: <50764119.6020402@uq.edu.au> References: <50764119.6020402@uq.edu.au> Message-ID: Hi Greg, there is already a ticket for this: http://projects.scipy.org/scipy/ticket/452 Cheers, Nils On 10/11/12, Gregory Crosswhite wrote: > Dear all, > > At the moment lobpcg seems to have been designed with only real numbers > in mind; this is unfortunate because I have been having some trouble > with eigs and was hoping to try it out as an alternative. > > Fortunately, it looks to me like the solution is rather simple. I was > able to get it to work by replacing all instances of ".T" with > ".T.conj()" --- which meant that all of the dot products were now > correct for complex numbers --- and by replacing the cast to "float64" > with a cast to "complex128". Changing the second cast is of course less > than ideal in the cases where "float64" is indeed what is being used, > but something like it is needed to make the answers not be garbage for > complex input matrices. > > With these changes, I generated 10x10 complex Hermitian matrices A and B > to serve as respectively the problem matrix and the normalization/metric > matrix, and ran lobpcg with a tolerance of 1e-10.
For the standard > eigenvalue problem with k=2 (and random X) lobpcg got essentially the > same answers as eigh for the lowest two eigenvalue/eigenvector pairs, > and for the generalized eigenvalue problem it got the same eigenvalues > as eigh and it got eigenvectors whose entries were within ~10^-4 of the > eigenvectors returned by eigh (after dividing by the first entry to make > the vectors proportionately the same) and such that the normalized > overlap between the eigenvectors of lobpcg and eigh (using B as the > metric) was within ~ 10^-8 of 1 (not surprising as this is the square of > the first number). I ran a quick check where I dropped the imaginary > parts of A and B and re-ran this analysis, and saw the errors fall to > respectively ~ 10^-6 and 10^-12, so the algorithm gets less accurate > results for complex numbers than for real numbers, though I don't know > enough about how it works to speculate on why this would be. > > Anyway, I don't know much about lobpcg so there might be some issue that > I've missed in simply adding ".conj()" everywhere a ".T" appeared to fix > the dot products.
I do think it would be very nice, though, to be able > to use lobpcg for complex matrices, and so I would be willing to submit > a patch towards this end. The original LOBPCG paper IIRC talked only about real symmetric problems, but it seems other existing implementations accept also Hermitian input. So yes, a patch fixing the implementation in Scipy for complex Hermitian matrices (plus adding tests) would be welcome. The easiest way (for us, not necessarily for you at least the first time :) to submit it would be via a pull request on Github; not a requirement though. As Nils said, there's a ticket with at least some work done in this direction: http://projects.scipy.org/scipy/ticket/452 I'm not sure what is the exact status with that, but you may want to take a look if it is of some help. -- Pauli Virtanen From jstevenson131 at gmail.com Thu Oct 18 04:05:01 2012 From: jstevenson131 at gmail.com (Jacob Stevenson) Date: Thu, 18 Oct 2012 09:05:01 +0100 Subject: [SciPy-Dev] optimize: add algorithm for global optimization basin hopping In-Reply-To: <50769349.2030300@molden.no> References: <50769349.2030300@molden.no> Message-ID: <3AB2B358-253A-4194-A73A-611076519F7A@gmail.com> I thought I would mention here on the mailing list that I redid the basin hopping global optimisation algorithm with a more advanced feature set. There are now two versions basinhopping, and basinhopping_advanced. I would be very interested which version people think is more appropriate for scipy. Jake On 11 Oct 2012, at 10:37, Sturla Molden wrote: > On 11.10.2012 04:20, Till Stensitzki wrote: >> As far as i can tell, simulated annealing is quite out of touch with more modern >> optimization strategies. > > In Bayesian statistics, Markov chain Monte Carlo is the standard > integration tool. If you want a peak or a root instead of the integral, > Metropolis-Hastings becomes simulated annealing. 
> > Sturla > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From dmitrey15 at ukr.net Thu Oct 18 04:27:49 2012 From: dmitrey15 at ukr.net (Dmitrey) Date: Thu, 18 Oct 2012 08:27:49 +0000 (UTC) Subject: [SciPy-Dev] optimize: add algorithm for global optimization basin hopping References: <50769349.2030300@molden.no> <3AB2B358-253A-4194-A73A-611076519F7A@gmail.com> Message-ID: Jacob Stevenson gmail.com> writes: > > I thought I would mention here on the mailing list that I redid the basin hopping global optimisation > algorithm with a more advanced feature set. There are now two versions basinhopping, and > basinhopping_advanced. I would be very interested which version people think is more appropriate for scipy. Hi Jacob, I would like to see a comparison with mlsl or psarms, which use a similar algorithm (a combination of local and global search) and have a Python API available. You could easily do it in the OpenOpt framework ( http://openopt.org ); the additional computation time is usually insignificant. There you could compare it with some other global solvers ( http://openopt.org/GLP ); especially I would recommend interalg ( http://openopt.org/interalg ), de and asa. From pav at iki.fi Sat Oct 20 09:04:23 2012 From: pav at iki.fi (Pauli Virtanen) Date: Sat, 20 Oct 2012 13:04:23 +0000 (UTC) Subject: [SciPy-Dev] Argument casts in special functions Message-ID: Hi, Related to: https://github.com/scipy/scipy/pull/334 Currently we have the following behavior for the integer-order Bessel function: from scipy.special import yn print yn(1, 0.123) # -> -5.2816754510498471 print yn(1.5, 0.123) # -> -5.2816754510498471 That is, the floating point argument is silently truncated to an integer. This is not so optimal: the stricter behavior would be to return a `nan` to signify the domain error.
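[Editor's note: the strict-cast behavior Pauli describes can be sketched in a few lines of plain Python. This is a hypothetical illustration, not SciPy's actual implementation: the order argument is accepted only when it is exactly integral, and NaN is returned to signal the domain error otherwise.]

```python
import math

def strict_int_order(v):
    # Hypothetical strict cast for an integer-order argument: accept v
    # only if it is exactly integral; otherwise signal a domain error
    # with NaN instead of silently truncating (as yn currently does).
    if float(v).is_integer():
        return int(v)
    return math.nan

print(strict_int_order(1.0))  # 1   -> the order-1 kernel would be evaluated
print(strict_int_order(1.5))  # nan -> domain error, no silent truncation to 1
```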
The question now is: is it worth preserving the old behavior of the existing functions? My guess would be yes, but I think the unsafe casts should be used just as a legacy special case, and not as a general rule. (Also, the integer-order functions could be deprecated --- there is no speed advantage in them as the kernel selection can be done based on input / input data type). -- Pauli Virtanen From charlesr.harris at gmail.com Sat Oct 20 11:15:20 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 20 Oct 2012 09:15:20 -0600 Subject: [SciPy-Dev] Argument casts in special functions In-Reply-To: References: Message-ID: On Sat, Oct 20, 2012 at 7:04 AM, Pauli Virtanen wrote: > Hi, > > Related to: https://github.com/scipy/scipy/pull/334 > > Currently we have the following behavior for the integer-order > Bessel function: > > from scipy.special import yn > print yn(1, 0.123) # -> -5.2816754510498471 > print yn(1.5, 0.123) # -> -5.2816754510498471 > > That is, the floating point argument is silently truncated > to an integer. > > This is not so optimal: the more strict behavior would be to > return a `nan` to signify the domain error. > > The question now is: is it worth preserving the old behavior > of the existing functions? > > My guess would be yes, but I think the unsafe casts should be > used just as a legacy special case, and not as a general rule. > > (Also, the integer-order functions could be deprecated --- there > is no speed advantage in them as the kernel selection can be done > based on input / input data type). > The behavior of jn is different and its docstring is the jv docstring, so it looks like an alias of jv. For the modified functions of the first kind there is only iv, probably because in is an unfortunate name for python. However, the modified function of the second kind, kn, behaves like yn. 
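[Editor's note: Pauli's parenthetical point, that kernel selection can be done from the input data type, can be sketched as follows. This is a toy illustration; the wrapper and kernel names are hypothetical stand-ins, not SciPy internals.]

```python
def bessel_y(v, x, int_kernel, real_kernel):
    # Hypothetical real-order entry point: route exactly integral orders
    # to the integer-order kernel, everything else to the real-order one.
    if float(v).is_integer():
        return int_kernel(int(v), x)
    return real_kernel(v, x)

# Toy stand-in kernels that just record which path was taken.
calls = []
def int_kernel(n, x):
    calls.append(("int", n))
    return 0.0
def real_kernel(v, x):
    calls.append(("real", v))
    return 0.0

bessel_y(2.0, 0.5, int_kernel, real_kernel)  # integral order -> integer kernel
bessel_y(1.5, 0.5, int_kernel, real_kernel)  # non-integral order -> real-order kernel
print(calls)  # [('int', 2), ('real', 1.5)]
```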
I'd suggest leaving the various integer functions as they are, but making the real order functions dispatch to the integer computations, if there are such, for integer orders. A note could then be added to the integer order versions suggesting the real versions and at some point it would probably be reasonable to deprecate them. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From nils106 at googlemail.com Wed Oct 31 15:03:21 2012 From: nils106 at googlemail.com (Nils Wagner) Date: Wed, 31 Oct 2012 19:03:21 +0000 Subject: [SciPy-Dev] scipy.test() segfault Message-ID: Program received signal SIGSEGV, Segmentation fault. 0x00007ffff7ab2b26 in PyObject_RichCompare () from /usr/lib64/libpython2.7.so.1.0 (gdb) bt #0 0x00007ffff7ab2b26 in PyObject_RichCompare () from /usr/lib64/libpython2.7.so.1.0 #1 0x00007ffff7ab2def in PyObject_RichCompareBool () from /usr/lib64/libpython2.7.so.1.0 #2 0x00007fffe3ba8584 in OBJECT_compare (ip1=, ip2=) at scipy/signal/sigtoolsmodule.c:827 #3 0x00007ffff74b8b64 in msort_with_tmp.part.0 () from /lib64/libc.so.6 #4 0x00007ffff74b8e9b in qsort_r () from /lib64/libc.so.6 #5 0x00007fffe3ba90d6 in PyArray_OrderFilterND (op1=, op2=, order=54989472) at scipy/signal/sigtoolsmodule.c:969 #6 0x00007fffe3ba92a7 in sigtools_order_filterND (__NPY_UNUSED_TAGGEDdummy=, args=) at scipy/signal/sigtoolsmodule.c:1135 #7 0x00007ffff7affcaa in PyEval_EvalFrameEx () from /usr/lib64/libpython2.7.so.1.0 #8 0x00007ffff7b059c3 in PyEval_EvalCodeEx () from /usr/lib64/libpython2.7.so.1.0 #9 0x00007ffff7aff7ce in PyEval_EvalFrameEx () from /usr/lib64/libpython2.7.so.1.0 #10 0x00007ffff7b0204f in PyEval_EvalFrameEx () from /usr/lib64/libpython2.7.so.1.0 #11 0x00007ffff7b05727 in PyEval_EvalCodeEx () from /usr/lib64/libpython2.7.so.1.0 #12 0x00007ffff7a9ddc8 in ?? 
() from /usr/lib64/libpython2.7.so.1.0 #13 0x00007ffff7a7d464 in PyObject_Call () from /usr/lib64/libpython2.7.so.1.0 #14 0x00007ffff7b00cae in PyEval_EvalFrameEx () from /usr/lib64/libpython2.7.so.1.0 #15 0x00007ffff7b05727 in PyEval_EvalCodeEx () from /usr/lib64/libpython2.7.so.1.0 #16 0x00007ffff7a9dba1 in ?? () from /usr/lib64/libpython2.7.so.1.0 #17 0x00007ffff7a7d464 in PyObject_Call () from /usr/lib64/libpython2.7.so.1.0 #18 0x00007ffff7a87c8e in ?? () from /usr/lib64/libpython2.7.so.1.0 #19 0x00007ffff7a7d464 in PyObject_Call () from /usr/lib64/libpython2.7.so.1.0 #20 0x00007ffff7acb84a in ?? () from /usr/lib64/libpython2.7.so.1.0 #21 0x00007ffff7a7d464 in PyObject_Call () from /usr/lib64/libpython2.7.so.1.0 #22 0x00007ffff7affe75 in PyEval_EvalFrameEx () from /usr/lib64/libpython2.7.so.1.0 #23 0x00007ffff7b0204f in PyEval_EvalFrameEx () from /usr/lib64/libpython2.7.so.1.0 #24 0x00007ffff7b05727 in PyEval_EvalCodeEx () from /usr/lib64/libpython2.7.so.1.0 #25 0x00007ffff7a9ddc8 in ?? () from /usr/lib64/libpython2.7.so.1.0 #26 0x00007ffff7a7d464 in PyObject_Call () from /usr/lib64/libpython2.7.so.1.0 #27 0x00007ffff7b00cae in PyEval_EvalFrameEx () from /usr/lib64/libpython2.7.so.1.0 #28 0x00007ffff7b05727 in PyEval_EvalCodeEx () from /usr/lib64/libpython2.7.so.1.0 #29 0x00007ffff7a9dba1 in ?? () from /usr/lib64/libpython2.7.so.1.0 #30 0x00007ffff7a7d464 in PyObject_Call () from /usr/lib64/libpython2.7.so.1.0 #31 0x00007ffff7a87c8e in ?? () from /usr/lib64/libpython2.7.so.1.0 #32 0x00007ffff7a7d464 in PyObject_Call () from /usr/lib64/libpython2.7.so.1.0 #33 0x00007ffff7acb84a in ?? 
() from /usr/lib64/libpython2.7.so.1.0 #34 0x00007ffff7a7d464 in PyObject_Call () from /usr/lib64/libpython2.7.so.1.0 #35 0x00007ffff7affe75 in PyEval_EvalFrameEx () from /usr/lib64/libpython2.7.so.1.0 #36 0x00007ffff7b05727 in PyEval_EvalCodeEx () from /usr/lib64/libpython2.7.so.1.0 >> scipy.__version__ '0.12.0.dev-bb436fa' x86_64 GNU/Linux -------------- next part -------------- An HTML attachment was scrubbed... URL: