[Numpy-discussion] Showing by examples how Python-Numpy can be efficient even for computationally intensive tasks

PIERRE AUGIER pierre.augier at univ-grenoble-alpes.fr
Thu Nov 26 16:14:40 EST 2020


I changed the email subject because I'd like to focus less on CO2 (a very interesting subject, but not my focus here) and more on computing...

----- Original message -----
> From: "Andy Ray Terrel" <andy.terrel at gmail.com>
> To: "numpy-discussion" <numpy-discussion at python.org>
> Sent: Tuesday, 24 November 2020 18:27:52
> Subject: Re: [Numpy-discussion] Comment published in Nature Astronomy about The ecological impact of computing with Python

> I think we, the community, do have to take it seriously. NumPy and the rest of
> the ecosystem are trying to raise money to hire developers. This sentiment,
> which is much wider than a single paper, is a prevalent roadblock.
> 
> -- Andy

I agree. I don't know if it is a matter of scientific field, but I tend to hear more and more people explaining that they don't use Python because of performance, or saying that they don't have performance problems because they don't use Python.

Some communities (I won't give names 🙂) communicate a lot about the poor performance of Python-Numpy.

I am well aware that in many cases performance is not so important, but it is not good to have such a bad reputation. I think we have to show what can be done with Python-Numpy code to get very good performance.

----- Original message -----
> From: "Sebastian Berg" <sebastian at sipsolutions.net>
> Sent: Tuesday, 24 November 2020 18:25:02
> Subject: Re: [Numpy-discussion] Comment published in Nature Astronomy about The ecological impact of computing with Python

>> Is there already something planned to answer to Zwart (2020)?
> 
> I don't think there is any need for rebuttal. The author is right,
> you should not write the core of an N-Body simulation in Python
> :).  I completely disagree with the focus on programming
> languages/tooling, quite honestly.

I'm not a fan of this focus either. But we have to realize that many people think like that and are sensitive to such arguments. Doing so badly in all the benchmark games does not help the scientific Python community (especially in the long term).

> A PhD who writes performance critical code, must get the education
> necessary to do it well.  That may mean learning something beyond
> Python, but not replacing Python entirely.

I'm really not sure. Or at least it depends on the type of performance-critical code. I see many students and scientists who only occasionally need to write a few functions that are not terribly inefficient. I don't see why such people would need to learn and use another language.

I did my PhD (in turbulence) with Fortran (and Matlab) and I really have nothing against Fortran. However, I'm really happy that in my group we code nearly everything in Python (plus a bit of C++ for fun). For example, Fluidsim (https://foss.heptapod.net/fluiddyn/fluidsim) is ~100% Python and I know that it is very efficient (more efficient than many alternatives written with a lot of C++/Fortran). I realize that this wouldn't be possible for all kinds of code (and fluidsim uses fluidfft, written in C++ / Cython / Python), but being 100% Python has a lot of advantages (I won't list them here).

For an N-Body simulation, why not use Python? With Python, you get a very readable, clear and efficient implementation (see https://github.com/paugier/nbabel), even faster than what you get with straightforward C++/Fortran/Julia. IMHO, it is just what one needs for most PhDs in astronomy.
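To give an idea of what "readable and clear" means here, this is a minimal sketch (not the actual nbabel code) of how the pairwise gravitational accelerations can be written in high-level, vectorized NumPy with G = 1; the function name and the `softening` parameter are illustrative assumptions:

```python
import numpy as np

def compute_accelerations(positions, masses, softening=0.0):
    """Pairwise gravitational accelerations (G = 1), vectorized with broadcasting.

    A minimal high-level NumPy sketch, not the actual nbabel implementation.
    positions: (n, 3) array; masses: (n,) array.
    """
    # diff[i, j] = positions[j] - positions[i]
    diff = positions[np.newaxis, :, :] - positions[:, np.newaxis, :]
    dist2 = (diff**2).sum(axis=2) + softening**2
    np.fill_diagonal(dist2, 1.0)  # avoid division by zero on the diagonal
    inv_d3 = dist2**-1.5
    np.fill_diagonal(inv_d3, 0.0)  # a particle exerts no force on itself
    # a_i = sum_j m_j * (r_j - r_i) / |r_j - r_i|^3
    return (inv_d3[:, :, None] * masses[None, :, None] * diff).sum(axis=1)
```

This O(N²)-memory broadcasting style is exactly the kind of code that reads well but stays slow in pure NumPy for small N, which is why the acceleration tools below matter.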

Of course, for many things one needs native languages! Have a look at the C++ code produced by Pythran, it's beautiful 🙂! But I don't think every scientist who writes critical code has to become an expert in C++ or Fortran (or Julia).

I also sometimes have to read and use C++ and Fortran codes written by scientists. Sometimes (often), I tend to think that they would be more productive with other tools and reach the same performance. One could say it is only a matter of education and not of tooling, but using serious tools does not make you a serious developer, and reaching the level in C++/Fortran needed to write efficient, clean, readable and maintainable code is not so easy for a PhD student or scientist who has other things to do.

Python-Numpy is so slow for some algorithms that many Python-Numpy users would benefit from knowing how to accelerate their code. Just one example, with some elapsed times (in s) for the N-Body problem (see https://github.com/paugier/nbabel#smaller-benchmarks-between-different-python-solutions):

| Transonic-Pythran | Transonic-Numba | High-level Numpy | PyPy OOP | PyPy lists |
|-------------------|-----------------|------------------|----------|------------|
| 0.48              | 3.91            | 686              | 87       | 15         |

For comparison, we have for this case `{"c++": 0.85, "Fortran": 0.62, "Julia": 2.57}`.

Note that by just adding `from transonic import jit` to the simple high-level Numpy code and decorating the function `compute_accelerations` with `@jit`, the elapsed time drops to 8 s (an 85× speedup!, with Pythran 0.9.8).
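The pattern looks roughly like this (a sketch, not the exact nbabel source: the loop body is the usual O(N²) pairwise update, and the import guard is my addition so the snippet runs even without Transonic installed, simply with no speedup):

```python
import numpy as np

try:
    # Transonic's jit compiles the function with Pythran/Numba in the background.
    from transonic import jit
except ImportError:
    # Fallback so this sketch runs without Transonic: plain Python, no speedup.
    def jit(func):
        return func

@jit
def compute_accelerations(accelerations, masses, positions):
    """Hot loop of the N-body problem, written in plain Python-Numpy style.

    Updates `accelerations` in place from (n,) masses and (n, 3) positions.
    """
    nb_particles = masses.size
    for i in range(nb_particles - 1):
        for j in range(i + 1, nb_particles):
            delta = positions[i] - positions[j]
            distance_cube = (delta @ delta) ** 1.5  # |r_i - r_j|^3
            accelerations[i] -= masses[j] / distance_cube * delta
            accelerations[j] += masses[i] / distance_cube * delta
```

The point is that the decorated function stays ordinary, debuggable Python; the accelerator only takes over the hot loop.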

I conclude from these kinds of results that we need to tell Python users how to accelerate their Python-Numpy codes when they feel the need. I think acceleration tools should be mentioned on the NumPy website. I also think we should spend a bit of energy playing some benchmark games.

It would be much better if we could change the widespread idea about Python performance for numerical problems from "Python is very slow and ineffective for most algorithms" to "interpreted Python can be very slow, but with the existing Python accelerators one can be extremely efficient with Python".

Pierre

> 
> On Tue, Nov 24, 2020 at 11:12 AM Ilhan Polat <ilhanpolat at gmail.com> wrote:
> 
> Do we have to take it seriously to start with? Because, with absolutely no
> offense meant, I am having significant difficulty doing so.
> 
> On Tue, Nov 24, 2020 at 4:58 PM PIERRE AUGIER <pierre.augier at univ-grenoble-alpes.fr> wrote:
> 
> 
> Hi,
> 
> I recently took a bit of time to study the comment "The ecological impact of
> high-performance computing in astrophysics" published in Nature Astronomy
> (Zwart, 2020, https://www.nature.com/articles/s41550-020-1208-y,
> https://arxiv.org/pdf/2009.11295.pdf), where it is stated that "Best however,
> for the environment is to abandon Python for a more environmentally friendly
> (compiled) programming language.".
> 
> I wrote a simple Python-Numpy implementation of the problem used for this study
> (https://www.nbabel.org) and, accelerated by Transonic-Pythran, it's very
> efficient. Here are some numbers (elapsed times in s, smaller is better):
> 
> | # particles | Py  | C++ | Fortran | Julia |
> |-------------|-----|-----|---------|-------|
> | 1024        | 29  | 55  | 41      | 45    |
> | 2048        | 123 | 231 | 166     | 173   |
> 
> The code and a modified figure are here: https://github.com/paugier/nbabel
> (There is no check on the results for https://www.nbabel.org, so one still has
> to be very careful.)
> 
> I think that the Numpy community should spend a bit of energy to show what can
> be done with the existing tools to get very high performance (and low CO2
> production) with Python. This work could be the basis of a serious reply to the
> comment by Zwart (2020).
> 
> Unfortunately the Python solution in https://www.nbabel.org is very bad in
> terms of performance (and therefore CO2 production). It is also true for most
> of the Python solutions for the Computer Language Benchmarks Game
> (https://benchmarksgame-team.pages.debian.net/benchmarksgame/; codes here:
> https://salsa.debian.org/benchmarksgame-team/benchmarksgame#what-else).
> 
> We could try to fix this so that people see that in many cases, it is not
> necessary to "abandon Python for a more environmentally friendly (compiled)
> programming language". One of the longest and hardest tasks would be to
> implement the different cases of the Computer Language Benchmarks Game in
> standard and modern Python-Numpy. Then, optimizing and accelerating such code
> should be doable and we should be able to get very good performance at least
> for some cases. Good news for this project: (i) the first point can be done by
> anyone with good knowledge of Python-Numpy (many potential workers), (ii) for
> some cases, there are already good Python implementations, and (iii) the work
> can easily be parallelized.
> 
> It is not a criticism, but the (beautiful and very nice) new NumPy website
> (https://numpy.org/) is not very convincing in terms of performance. It says
> "Performant: The core of NumPy is well-optimized C code. Enjoy the flexibility
> of Python with the speed of compiled code." It's true that the core of NumPy is
> well-optimized C code, but to seriously compete with C++, Fortran or Julia in
> terms of numerical performance, one needs to use other tools to move the
> compiled-interpreted boundary outside the hot loops. So it could be reasonable
> to mention such tools (in particular Numba, Pythran, Cython and Transonic).
> 
> Is there already something planned to answer to Zwart (2020)?
> 
> Any opinions or suggestions on this potential project?
> 
> Pierre
> 
> PS: Of course, alternative Python interpreters (PyPy, GraalPython, Pyjion,
> Pyston, etc.) could also be used, especially if HPy
> (https://github.com/hpyproject/hpy) is successful (C core of NumPy written in
> HPy, Cython able to produce HPy code, etc.). However, I tend to be a bit
> skeptical about the ability of such technologies to reach very high performance
> for low-level NumPy code (performance that can be reached by replacing whole
> Python functions with optimized compiled code). Of course, I hope I'm wrong!
> IMHO, it does not remove the need for a successful HPy!
> 
> --
> Pierre Augier - CR CNRS (http://www.legi.grenoble-inp.fr)
> LEGI (UMR 5519) Laboratoire des Ecoulements Geophysiques et Industriels
> BP53, 38041 Grenoble Cedex, France tel:+33.4.56.52.86.16
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion

