From donald at stufft.io Fri Nov 16 19:50:14 2018 From: donald at stufft.io (Donald Stufft) Date: Fri, 16 Nov 2018 19:50:14 -0500 Subject: [pypy-dev] Slowdown on latest 3.5 nightly? Message-ID: I've been working on porting PyPI/Warehouse to PyPy3. Using the latest PyPy3 release (6.0.0), my tests on Warehouse take ~200 seconds to run, which is roughly within the realm of what I expected. I noticed when I tried to use the latest PyPy3 3.5 nightly (pypy-c-jit-95315-0a5cd8d36e99-linux64.tar.bz2) that my tests were randomly "hanging" (or taking so long to complete a single test that I got bored and killed it). It wasn't every time (IOW, the test it happened to would switch around each run). I thought maybe it was a problem with the functional tests, and ran only the unit tests, and it got ~19% of the way through and happened again (and in general the tests were way, way slower). I went back one nightly release, to pypy-c-jit-95312-3bb86a88e1d3-linux64.tar.bz2, and tried with that, and that works as expected, with a similar runtime to PyPy3-6.0.0 and without any hangs. From kristopher.kuhlman at gmail.com Sat Nov 17 10:39:08 2018 From: kristopher.kuhlman at gmail.com (Kris Kuhlman) Date: Sat, 17 Nov 2018 08:39:08 -0700 Subject: [pypy-dev] pypy, vmprof and mpmath Message-ID: I am using pypy to run numerical integration calculations with the arbitrary precision library mpmath (http://mpmath.org). I am using pypy2-v6.0.0-osx64 and version 1.0 of mpmath (from github). I install mpmath with pypy and use the native (python-only) version; I am not using the gmp backend. I have noticed that sometimes seemingly small changes in my scripts result in big changes in runtime with pypy, and I was wondering if there were some jit options or settings I could change to improve things. Following advice from the pypy website, I tried using vmprof to profile one of my scripts. I installed vmprof into pypy using pip, starting with "pypy -m ensurepip".
First I tried to run one of my scripts with vmprof, which typically takes <10 minutes to run, but it ran for a few hours and so I eventually killed it (it doesn't respond to ^C, so I have to put it into the background and kill it explicitly with the kill command). Then I tried just a small test script like the following: cat -> test.py import mpmath as mp print mp.pi ^D This runs fine from pypy as: $ pypy test.py 3.14159265358979 but running this same script using vmprof and pypy as pypy -m vmprof -o output.log test.py uses 4GB of ram and 100% processor, and if left running will generate an output.log file of growing size (so I kill it). If I run the following: cat -> test2.py x = 3.14159 print x ^D vmprof runs quickly and generates a small output.log file (~400kb). As an aside, if I try to use the --web option, it hangs on "Uploading to http://vmprof.com..." If I try to go to this website in my browser there is an error page about a bad gateway from Cloudflare. I seem to be able to use mpmath with pypy just fine, but the combination of pypy, mpmath, and vmprof seems to not work. Any suggestions? I am not an experienced vmprof user. Kris -------------- next part -------------- An HTML attachment was scrubbed... URL: From ronan.lamy at gmail.com Mon Nov 19 15:12:16 2018 From: ronan.lamy at gmail.com (Ronan Lamy) Date: Mon, 19 Nov 2018 20:12:16 +0000 Subject: [pypy-dev] pypy, vmprof and mpmath In-Reply-To: References: Message-ID: <94ce04d1-dc81-2ba7-56e6-9b7944558c31@gmail.com> On 17/11/18 at 15:39, Kris Kuhlman wrote: > I am using pypy to run numerical integration calculations with the > arbitrary precision library mpmath (http://mpmath.org). > > I am using pypy2-v6.0.0-osx64 and version 1.0 of mpmath (from github). I > install mpmath with pypy and use the native (python-only) version; I am > not using the gmp backend.
> > [...] I tried your test script on Linux and everything works as expected locally. So I guess the issue is specific to OS X, or to your own system. Try using the --no-native option, as in 'pypy -m vmprof -o output.log --no-native test.py'. That should disable most of the platform-specific code, and it doesn't lose much information on pypy anyway.
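A stand-in workload can make this suggestion easy to check without the actual integration script. The toy below is hypothetical (it is not from this thread): it mimics an mpmath-style pure-Python bignum computation using only the stdlib, so one could run it as `pypy -m vmprof -o output.log --no-native toy_pi.py` to see whether the native-unwinding path is what misbehaves:

```python
# toy_pi.py -- hypothetical stand-in for an mpmath-like workload: an
# arbitrary-precision approximation of pi driven by many small
# Python-level bignum operations, the kind of code vmprof samples heavily.
from fractions import Fraction

def arctan_inv(x, terms):
    """arctan(1/x) as an exact Fraction, truncated after `terms` terms."""
    total = Fraction(0)
    for k in range(terms):
        # Alternating Taylor series for arctan(1/x).
        total += Fraction((-1) ** k, (2 * k + 1) * x ** (2 * k + 1))
    return total

def pi_machin(terms=20):
    # Machin's formula: pi/4 = 4*arctan(1/5) - arctan(1/239)
    return 4 * (4 * arctan_inv(5, terms) - arctan_inv(239, terms))

if __name__ == "__main__":
    print(float(pi_machin()))  # ~3.141592653589793
```

If this small script also stalls under vmprof with native profiling enabled but completes with --no-native, that would point at the native-unwinding code rather than at mpmath itself.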
From kristopher.kuhlman at gmail.com Tue Nov 20 08:47:15 2018 From: kristopher.kuhlman at gmail.com (Kris Kuhlman) Date: Tue, 20 Nov 2018 06:47:15 -0700 Subject: [pypy-dev] pypy, vmprof and mpmath In-Reply-To: <94ce04d1-dc81-2ba7-56e6-9b7944558c31@gmail.com> References: <94ce04d1-dc81-2ba7-56e6-9b7944558c31@gmail.com> Message-ID: Thank you for the suggestion. Using the --no-native option, both the test script and my original numerical integration script run under vmprof on pypy. Kris On Mon, Nov 19, 2018 at 1:12 PM Ronan Lamy wrote: > On 17/11/18 at 15:39, Kris Kuhlman wrote: > > [...] > > I tried your test script on Linux and everything works as expected > locally. So I guess the issue is specific to OS X, or to your own system. > > Try using the --no-native option, as in 'pypy -m vmprof -o output.log > --no-native test.py'. That should disable most of the platform-specific > code, and it doesn't lose much information on pypy anyway. -------------- next part -------------- An HTML attachment was scrubbed... URL: From matti.picus at gmail.com Tue Nov 20 14:27:18 2018 From: matti.picus at gmail.com (Matti Picus) Date: Tue, 20 Nov 2018 11:27:18 -0800 Subject: [pypy-dev] Slowdown on latest 3.5 nightly?
In-Reply-To: References: Message-ID: <040f0eac-d0a2-932a-f79e-ba49c4cf9634@gmail.com> On 16/11/18 4:50 pm, Donald Stufft wrote: > I've been working on porting PyPI/Warehouse to PyPy3. Using the latest PyPy3 release (6.0.0), my tests on Warehouse take ~200 seconds to run, which is roughly within the realm of what I expected. > > I noticed when I tried to use the latest PyPy3 3.5 nightly (pypy-c-jit-95315-0a5cd8d36e99-linux64.tar.bz2) that my tests were randomly "hanging" (or taking so long to complete a single test that I got bored and killed it). It wasn't every time (IOW, the test it happened to would switch around each run). I thought maybe it was a problem with the functional tests, and ran only the unit tests, and it got ~19% of the way through and happened again (and in general the tests were way, way slower). > > I went back one nightly release, to pypy-c-jit-95312-3bb86a88e1d3-linux64.tar.bz2, and tried with that, and that works as expected, with a similar runtime to PyPy3-6.0.0 and without any hangs. That is very strange: the only difference between those two commits is changing a space.bytes_w() call to space.charbuf_w() in PyUnicode_FromEncodedObject, which should have no effect (plus some testing changes). Could you try the latest nightly (e1b0f8e6c29c from 2018-11-20 or after) from http://buildbot.pypy.org/nightly/py3.5 ? I refactored that code to call space.bytes_w() if possible before other checks. If you have extra time, maybe you could even try the unicode-utf8-py3 nightly, http://buildbot.pypy.org/nightly/unicode-utf8-py3, which is a WIP to use utf8 internally everywhere without converting back and forth to unicode. It would be nice to know if that is any faster (or even works). Matti From armin.rigo at gmail.com Tue Nov 20 23:29:36 2018 From: armin.rigo at gmail.com (Armin Rigo) Date: Wed, 21 Nov 2018 06:29:36 +0200 Subject: [pypy-dev] Slowdown on latest 3.5 nightly?
In-Reply-To: <040f0eac-d0a2-932a-f79e-ba49c4cf9634@gmail.com> References: <040f0eac-d0a2-932a-f79e-ba49c4cf9634@gmail.com> Message-ID: Hi Matti, On 20/11/2018, Matti Picus wrote: > On 16/11/18 4:50 pm, Donald Stufft wrote: > If you have extra time, maybe you could even try the unicode-utf8-py3 > nightly, http://buildbot.pypy.org/nightly/unicode-utf8-py3 which is a > WIP to use utf8 internally everywhere without converting back and forth > to unicode. It would be nice to know if that is any faster (or even works). That's even assuming the slow-down is in any way related to unicodes. I think we might as well assume that the slowdown is not related to any change in the code and is due to other random effects. If we want to know more precisely what is going on, the first step would be to get the real code that is unexpectedly slower on one pypy than on the other, and try to run it ourselves---e.g. inside gdb, to start with. Armin From donald at stufft.io Wed Nov 21 10:30:00 2018 From: donald at stufft.io (Donald Stufft) Date: Wed, 21 Nov 2018 10:30:00 -0500 Subject: [pypy-dev] Slowdown on latest 3.5 nightly? In-Reply-To: References: <040f0eac-d0a2-932a-f79e-ba49c4cf9634@gmail.com> Message-ID: <4E7F57CD-6E5F-4A7F-928A-853D32EAC367@stufft.io> > On Nov 20, 2018, at 11:29 PM, Armin Rigo wrote: > > Hi Matti, > > [...] > > That's even assuming the slow-down is in any way related to unicodes. > I think we might as well assume that the slowdown is not related to > any change in the code and is due to other random effects. If we want to > know more precisely what is going on, the first step would be to get > the real code that is unexpectedly slower on one pypy than on the > other, and try to run it ourselves---e.g. inside gdb, to start with. It's in a branch on Github https://github.com/pypa/warehouse/pull/5057 which runs everything in docker: ``make build`` then ``make serve``, and then in another terminal ``make tests`` runs the tests. That may make it a lot harder to get a gdb going though :( It's easy to switch which nightly is being used though; that's just a line in the Dockerfile to pick which download from PyPy to use. I'm going to go ahead and test those other nightlies later today though. -------------- next part -------------- An HTML attachment was scrubbed... URL: From matti.picus at gmail.com Wed Nov 21 11:27:06 2018 From: matti.picus at gmail.com (Matti Picus) Date: Wed, 21 Nov 2018 08:27:06 -0800 Subject: [pypy-dev] Slowdown on latest 3.5 nightly? In-Reply-To: <4E7F57CD-6E5F-4A7F-928A-853D32EAC367@stufft.io> References: <040f0eac-d0a2-932a-f79e-ba49c4cf9634@gmail.com> <4E7F57CD-6E5F-4A7F-928A-853D32EAC367@stufft.io> Message-ID: <2e52b60e-4607-1305-685e-6ac6ea1ae9cc@gmail.com> On 21/11/18 7:30 am, Donald Stufft wrote: >> On Nov 20, 2018, at 11:29 PM, Armin Rigo wrote: >> >> Hi Matti, >> >> [...] >> >> That's even assuming the slow-down is in any way related to unicodes. >> I think we might as well assume that the slowdown is not related to >> any change in the code and is due to other random effects.
If we want to >> know more precisely what is going on, the first step would be to get >> the real code that is unexpectedly slower on one pypy than on the >> other, and try to run it ourselves---e.g. inside gdb, to start with. > > It's in a branch on Github > https://github.com/pypa/warehouse/pull/5057 which runs everything in > docker: ``make build`` then ``make serve``, and then in another > terminal ``make tests`` runs the tests. That may make it a lot harder > to get a gdb going though :( It's easy to switch which nightly is being > used though; that's just a line in the Dockerfile to pick which > download from PyPy to use. > > I'm going to go ahead and test those other nightlies later today though. It seems my call to test the unicode-utf8 branch may have been premature; it crashes on lib-python tests having to do with test_codecencodings_*.py. Maybe not worth your time yet. Matti From donald at stufft.io Fri Nov 23 15:20:35 2018 From: donald at stufft.io (Donald Stufft) Date: Fri, 23 Nov 2018 15:20:35 -0500 Subject: [pypy-dev] Slowdown on latest 3.5 nightly? In-Reply-To: <2e52b60e-4607-1305-685e-6ac6ea1ae9cc@gmail.com> References: <040f0eac-d0a2-932a-f79e-ba49c4cf9634@gmail.com> <4E7F57CD-6E5F-4A7F-928A-853D32EAC367@stufft.io> <2e52b60e-4607-1305-685e-6ac6ea1ae9cc@gmail.com> Message-ID: > On Nov 21, 2018, at 11:27 AM, Matti Picus wrote: > > On 21/11/18 7:30 am, Donald Stufft wrote: >> >> [...] >> >> I'm going to go ahead and test those other nightlies later today though.
> > It seems my call to test the unicode-utf8 branch may have been premature; it crashes on lib-python tests having to do with test_codecencodings_*.py. > > Maybe not worth your time yet. > I went ahead and tested the latest py3.5 nightly (pypy-c-jit-95353-773010593365-linux64.tar.bz2) and it did not exhibit the slowdown. -------------- next part -------------- An HTML attachment was scrubbed... URL:
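As a footnote to Matti's earlier message in this thread: the ordering change he describes in PyUnicode_FromEncodedObject (prefer the exact-bytes fast path, fall back to a generic buffer path only for other objects) can be sketched in plain Python. This is a loose, hypothetical illustration, not PyPy's actual RPython code; `from_encoded_object` and its two branches merely stand in for the `space.bytes_w()` / `space.charbuf_w()` calls named in the thread:

```python
# Hypothetical sketch of the dispatch order Matti describes: try the
# cheap exact-bytes accessor first, and fall back to the generic
# character-buffer accessor only for other buffer-like objects.
def from_encoded_object(obj, encoding, errors="strict"):
    if isinstance(obj, bytes):
        # Fast path: stands in for space.bytes_w(obj); a real bytes
        # object exposes its data directly, with no buffer negotiation.
        data = obj
    else:
        # Generic path: stands in for space.charbuf_w(obj); accepts any
        # object exposing the buffer protocol (bytearray, memoryview, ...).
        data = bytes(memoryview(obj))
    return data.decode(encoding, errors)
```

The observable result is the same on either path, which is the point: the refactor only ensures the common bytes case never pays the generic-buffer cost, and that is why a measurable slowdown from the earlier one-line change was so surprising.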