From brianvanderburg2 at aim.com Sun Jul 3 13:05:13 2016
From: brianvanderburg2 at aim.com (Brian Allen Vanderburg II)
Date: Sun, 3 Jul 2016 13:05:13 -0400
Subject: [Distutils] Module import order issues.
Message-ID: <577945C9.8040504@aim.com>

The issue I'm having is when I install a test/local version of a package
in my user site-packages, but the same package is also installed in a
system path and has a .pth entry in the system easy-install.pth. It
seems that from a path perspective, items under the user path should
automatically take precedence over the system path. However, because of
the way the paths are rearranged in easy-install.pth, the system path
always takes precedence, since the easy-install.pth line moves all added
items to the front of the path.

So while it seems like the path should be something like:

cwd:core-paths:user-site-paths:user-pth-paths:system-dist-paths:system-pth-paths

The path turns out to be:

cwd:user-pth-paths:system-pth-paths:core-paths:user-site-paths:system-dist-paths

Is there a reason why the .pth files even contain the final line that
rearranges the paths so that all the .eggs are at the start of the
system path?

For the time being, my work-around is to manually add a .pth file to my
user site-packages path for each item in site-packages as well, and add
the lines needed so that when the system easy-install.pth is processed,
it doesn't move them all:

import sys; sys.__plen = len(sys.path)

import sys; new=sys.path[sys.__plen:]; del sys.path[sys.__plen:]; p=getattr(sys,'__egginsert',0); sys.path[p:p]=new; sys.__egginsert = p+len(new)

All of that is required because, without it, sys.__egginsert will still
be unset, and when site.py processes the system path, items in
easy-install.pth will be moved above any user .pth items as well.
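[Editor's note: the rearrangement those two .pth lines perform can be replayed in isolation. The sketch below uses a stand-in object instead of the real sys module, with invented path entries, so nothing in the running interpreter is touched:]

```python
# Replay of the two lines found in an easy-install.pth file, run against
# a stand-in for the sys module. The path entry names are invented for
# illustration.
import types

fake_sys = types.SimpleNamespace(path=["cwd", "core", "user-site", "system-dist"])

# First .pth line: "import sys; sys.__plen = len(sys.path)"
fake_sys.__plen = len(fake_sys.path)

# site.py then appends each egg path listed in the .pth file.
fake_sys.path += ["system-egg-a", "system-egg-b"]

# Final .pth line: splice everything added since __plen in at
# __egginsert, which defaults to 0 -- i.e. the very front of the path,
# ahead of even cwd and the user site-packages entry.
new = fake_sys.path[fake_sys.__plen:]
del fake_sys.path[fake_sys.__plen:]
p = getattr(fake_sys, "__egginsert", 0)
fake_sys.path[p:p] = new
fake_sys.__egginsert = p + len(new)

print(fake_sys.path)
```

Run like this, the two invented egg entries jump in front of everything else, which is exactly the inversion described above.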
However, ideally this path should automatically have precedence over
system packages, and it would if not for this line in the various
easy-install.pth files:

import sys; new=sys.path[sys.__plen:]; del sys.path[sys.__plen:]; p=getattr(sys,'__egginsert',0); sys.path[p:p]=new; sys.__egginsert = p+len(new)

I guess I just find myself wondering what the purpose of the above line
in easy-install.pth files is (for python3, but apparently not python2),
and why not just keep the paths in the order that site.py would add
them.

Thanks,

Brian Allen Vanderburg II

From brianvanderburg2 at aim.com Mon Jul 4 09:21:46 2016
From: brianvanderburg2 at aim.com (Brian Allen Vanderburg II)
Date: Mon, 4 Jul 2016 09:21:46 -0400
Subject: [Distutils] Module import order issues.
In-Reply-To: 
References: <577945C9.8040504@aim.com>
Message-ID: <577A62EA.3000308@aim.com>

I have devised the following hack which works for the time being. I add
the following line to two new files called 000-fix-easyinstall.pth and
zzz-fix-easyinstall.pth:

import sys; sys.__egginsert = len(sys.path)

This causes the next path rearrangement by an easy-install.pth to happen
after all current paths.

Why two? The .pth files are sorted by site.py. So
000-fix-easyinstall.pth will get processed first and set __egginsert
after any current paths (such as the core paths and the user
site-packages path). This way, when the user easy-install.pth is
processed (if there is one), any path rearrangements will occur after
them.

Now, the problem with just one is that if there are any other user .pth
files that get processed after the user easy-install.pth, they will add
paths but not update the egginsert value. So when the system
easy-install.pth gets invoked, those paths will be moved above the user
paths that were added after the user easy-install.pth. The solution to
this is the second fix file, zzz-fix-easyinstall.pth. Since it is
processed last, it ensures that the egginsert is after any added paths.
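[Editor's note: the effect of the fix files can be replayed the same way, again with a stand-in for sys and invented entries. Pre-setting __egginsert makes the later easy-install.pth splice land after the existing paths instead of at the front:]

```python
# Replay of the 000-/zzz-fix-easyinstall.pth hack against a stand-in
# for the sys module. Path entry names are invented for illustration.
import types

fake_sys = types.SimpleNamespace(path=["cwd", "core", "user-site"])

# The fix file, sorted to run first:
# "import sys; sys.__egginsert = len(sys.path)"
fake_sys.__egginsert = len(fake_sys.path)

# The system easy-install.pth then runs its usual pair of lines.
fake_sys.__plen = len(fake_sys.path)
fake_sys.path += ["system-egg"]

new = fake_sys.path[fake_sys.__plen:]
del fake_sys.path[fake_sys.__plen:]
p = getattr(fake_sys, "__egginsert", 0)
fake_sys.path[p:p] = new
fake_sys.__egginsert = p + len(new)

print(fake_sys.path)
```

With __egginsert already pointing past the existing entries, the egg is spliced in at the end, leaving cwd, core, and user site-packages ahead of it.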
To see it in action:

import sys
for i in sys.path:
    print(i)

Before:

/home/allen/.local/lib/python3.4/site-packages/setuptools
/usr/local/lib/python3.4/dist-packages/psutil-3.4.2-py3.4-linux-x86_64.egg
/usr/local/lib/python3.4/dist-packages/docker_py-1.7.0_rc2-py3.4.egg
/usr/local/lib/python3.4/dist-packages/raven-5.10.1-py3.4.egg
/usr/local/lib/python3.4/dist-packages/Jinja2-2.8-py3.4.egg
/usr/local/lib/python3.4/dist-packages/aiohttp-0.20.2-py3.4-linux-x86_64.egg
/usr/local/lib/python3.4/dist-packages/jsonschema-2.5.1-py3.4.egg
/usr/local/lib/python3.4/dist-packages/websocket_client-0.35.0-py3.4.egg
/usr/local/lib/python3.4/dist-packages/requests-2.9.1-py3.4.egg
/usr/local/lib/python3.4/dist-packages/MarkupSafe-0.23-py3.4-linux-x86_64.egg
/usr/lib/python3/dist-packages
/usr/local/lib/python3.4/dist-packages/paramiko-1.16.0-py3.4.egg
/usr/local/lib/python3.4/dist-packages/configobj-5.0.6-py3.4.egg
/usr/local/lib/python3.4/dist-packages/ecdsa-0.13-py3.4.egg
/usr/local/lib/python3.4/dist-packages/pycrypto-2.6.1-py3.4-linux-x86_64.egg
/usr/local/lib/python3.4/dist-packages/gns3_server-1.4.6-py3.4.egg
/usr/local/lib/python3.4/dist-packages/gns3_gui-1.4.6-py3.4.egg
/usr/local/lib/python3.4/dist-packages/gns3_net_converter-1.3.0-py3.4.egg
/usr/lib/python3.4
/usr/lib/python3.4/plat-x86_64-linux-gnu
/usr/lib/python3.4/lib-dynload
/home/allen/.local/lib/python3.4/site-packages
/usr/local/lib/python3.4/dist-packages

After:

/usr/lib/python3.4
/usr/lib/python3.4/plat-x86_64-linux-gnu
/usr/lib/python3.4/lib-dynload
/home/allen/.local/lib/python3.4/site-packages
/home/allen/.local/lib/python3.4/site-packages/setuptools
/usr/local/lib/python3.4/dist-packages/psutil-3.4.2-py3.4-linux-x86_64.egg
/usr/local/lib/python3.4/dist-packages/docker_py-1.7.0_rc2-py3.4.egg
/usr/local/lib/python3.4/dist-packages/raven-5.10.1-py3.4.egg
/usr/local/lib/python3.4/dist-packages/Jinja2-2.8-py3.4.egg
/usr/local/lib/python3.4/dist-packages/aiohttp-0.20.2-py3.4-linux-x86_64.egg
/usr/local/lib/python3.4/dist-packages/jsonschema-2.5.1-py3.4.egg
/usr/local/lib/python3.4/dist-packages/websocket_client-0.35.0-py3.4.egg
/usr/local/lib/python3.4/dist-packages/requests-2.9.1-py3.4.egg
/usr/local/lib/python3.4/dist-packages/MarkupSafe-0.23-py3.4-linux-x86_64.egg
/usr/lib/python3/dist-packages
/usr/local/lib/python3.4/dist-packages/paramiko-1.16.0-py3.4.egg
/usr/local/lib/python3.4/dist-packages/configobj-5.0.6-py3.4.egg
/usr/local/lib/python3.4/dist-packages/ecdsa-0.13-py3.4.egg
/usr/local/lib/python3.4/dist-packages/pycrypto-2.6.1-py3.4-linux-x86_64.egg
/usr/local/lib/python3.4/dist-packages/gns3_server-1.4.6-py3.4.egg
/usr/local/lib/python3.4/dist-packages/gns3_gui-1.4.6-py3.4.egg
/usr/local/lib/python3.4/dist-packages/gns3_net_converter-1.3.0-py3.4.egg
/usr/local/lib/python3.4/dist-packages

This keeps the core paths first, then the user paths, then the system
paths.

Brian Allen Vanderburg II

On 07/04/2016 04:59 AM, Piotr Dobrogost wrote:
> On Sun, Jul 3, 2016 at 7:05 PM, Brian Allen Vanderburg II via
> Distutils-SIG wrote:
>> The issue I'm having is when I install a test/local version of a package
>> in my user site-packages, but the same package is also installed in a
>> system path and has a .pth entry in the system easy-install.pth. It
>> seems that from a path perspective, items under the user path should
>> automatically take precedence over the system path. However, because of
>> the way the paths are rearranged in easy-install.pth, the system path
>> always takes precedence since the easy-install.pth line moves all added
>> items to the front of the path.
> You might want to take a look at issue "pip list reports wrong version
> of package installed both in system site and user site" at
> https://github.com/pypa/pip/issues/1018.
> One "solution" is not to install any packages in virtualenv with
> setuptools (which hacks the easy-install.pth file as you described).
> Install only with pip instead.
>
> Regards,
> Piotr Dobrogost
>
>> So while it seems like the path should be something like:
>>
>> cwd:core-paths:user-site-paths:user-pth-paths:system-dist-paths:system-pth-paths
>>
>> The path turns out to be:
>>
>> cwd:user-pth-paths:system-pth-paths:core-paths:user-site-paths:system-dist-paths
>>
>> Is there a reason why the .pth files even contain the final line that
>> rearranges the paths for all the .eggs to be at the start of the system
>> path?
>>
>> For the time being, my work-around is to manually add a .pth file to my
>> user site-packages path for each item in site-packages as well, and add
>> the lines needed so that when the system easy-install.pth is processed,
>> it doesn't move all:
>>
>> import sys; sys.__plen = len(sys.path)
>>
>> import sys; new=sys.path[sys.__plen:]; del sys.path[sys.__plen:];
>> p=getattr(sys,'__egginsert',0); sys.path[p:p]=new; sys.__egginsert =
>> p+len(new)
>>
>> All of that is required, because without it, sys.__egginsert will still
>> be unset, and when site.py processes the system path, items in
>> easy-install.pth will be moved above any user .pth
>> items as well.
>>
>> However, ideally it seems like this path should automatically have
>> precedence over system packages, and it would if not for this line in
>> the various easy-install.pth files:
>>
>> import sys; new=sys.path[sys.__plen:]; del sys.path[sys.__plen:];
>> p=getattr(sys,'__egginsert',0); sys.path[p:p]=new; sys.__egginsert =
>> p+len(new)
>>
>> I guess I just find myself wondering what is the purpose of the above
>> line in easy-install.pth files (for python3, but apparently not
>> python2), and why not just keep the path in the order that site.py would
>> add them.
>>
>> Thanks,
>>
>> Brian Allen Vanderburg II
>>
>>
>> _______________________________________________
>> Distutils-SIG maillist - Distutils-SIG at python.org
>> https://mail.python.org/mailman/listinfo/distutils-sig

From brianvanderburg2 at aim.com Mon Jul 4 11:49:39 2016
From: brianvanderburg2 at aim.com (Brian Allen Vanderburg II)
Date: Mon, 4 Jul 2016 11:49:39 -0400
Subject: [Distutils] bdist_wheel alters namespace packages paths:
Message-ID: <577A8593.9090309@aim.com>

In a manner similar to my previous messages about path manipulation from
the .pth files, I've noticed that a wheel created by bdist_wheel also
introduces some path manipulation issues for namespace packages.

First, it seems that setup.py, when creating the wheel, does not install
the __init__.py namespace file. When the nspkg.pth file gets loaded by
site.py, it creates the dummy namespace module in sys.modules and sets
the __path__ to contain where it was found. There seem to be a couple of
issues with this:

Every namespace package under the same namespace must use the same
method to register its path. That is, every namespace package under the
same namespace will need an nspkg.pth file to extend the module path.

This also prevents loading any namespace packages from the current
directory, such as for testing purposes, since the dummy module is
created by the nspkg.pth and seems to prevent the normal scanning of
sys.path for namespace packages during import.

Brian Allen Vanderburg II

From chris at simplistix.co.uk Mon Jul 4 11:53:43 2016
From: chris at simplistix.co.uk (Chris Withers)
Date: Mon, 4 Jul 2016 16:53:43 +0100
Subject: [Distutils] package made up of only .so's?
Message-ID: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk>

Hi All,

I need to build a package which is made up of a set of .so's.

These .so's are compiled elsewhere, so I just need to write an
appropriate setup.py and bdist_wheel to get what I want. But that first
part is where I'm struggling.
Each of these .so's is essentially a top level python module. How do I
tell setuptools' setup() function to basically just roll these into a
wheel?

Any help gratefully received!

Chris

From dholth at gmail.com Mon Jul 4 12:27:08 2016
From: dholth at gmail.com (Daniel Holth)
Date: Mon, 04 Jul 2016 16:27:08 +0000
Subject: [Distutils] package made up of only .so's?
In-Reply-To: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk>
References: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk>
Message-ID: 

Try my new project, enscons. https://bitbucket.org/dholth/enscons . You
can just wheel up whatever you want, without bothering with all this
setup.py nonsense. Put your metadata in pyproject.toml; enscons'
setup2toml script might help you generate that from setup.py. It might
look something like this:

import pytoml as toml
import enscons

metadata = dict(toml.load(open('pyproject.toml')))['tool']['enscons']

# most specific binary, non-manylinux1 tag should be at the top of this list
if True:
    import wheel.pep425tags
    for tag in wheel.pep425tags.get_supported():
        full_tag = '-'.join(tag)
        if not 'manylinux' in tag:
            break

env = Environment(tools=['default', 'packaging', enscons.generate],
                  PACKAGE_METADATA=metadata,
                  WHEEL_TAG=full_tag,
                  ROOT_IS_PURELIB=False)

sources = Glob('*.so')
env.Whl('platlib', sources, root='.')

On Mon, Jul 4, 2016 at 12:07 PM Chris Withers wrote:

> Hi All,
>
> I need to build a package which is made up of a set of .so's.
>
> These .so's are compiled elsewhere, so I just need to write an
> appropriate setup.py and bdist_wheel to get what I want. But that first
> part is where I'm struggling.
>
> Each of these .so's is essentially a top level python module. How do I
> tell setuptools' setup() function to basically just roll these into a
> wheel?
>
> Any help gratefully received!
>
>
> Chris
>
> _______________________________________________
> Distutils-SIG maillist - Distutils-SIG at python.org
> https://mail.python.org/mailman/listinfo/distutils-sig
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From contact at ionelmc.ro Mon Jul 4 12:31:16 2016
From: contact at ionelmc.ro (=?UTF-8?Q?Ionel_Cristian_M=C4=83rie=C8=99?=)
Date: Mon, 4 Jul 2016 19:31:16 +0300
Subject: [Distutils] package made up of only .so's?
In-Reply-To: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk>
References: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk>
Message-ID: 

On Mon, Jul 4, 2016 at 6:53 PM, Chris Withers wrote:

> Each of these .so's is essentially a top level python module. How do I
> tell setuptools' setup() function to basically just roll these into a wheel?

If you want to package arbitrary files, take a look at this setup.py:
https://github.com/pytest-dev/pytest-cov/blob/master/setup.py

It should be simple enough to replicate, and it handles most ways to
install, besides wheels.

Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From njs at pobox.com Mon Jul 4 16:11:37 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Mon, 4 Jul 2016 13:11:37 -0700
Subject: [Distutils] package made up of only .so's?
In-Reply-To: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk>
References: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk>
Message-ID: 

You could also probably use flit to do what you want:
https://pypi.python.org/pypi/flit

(It might require that you bundle the different modules into a single
top level package. You might want to consider doing that regardless.)

-n

On Jul 4, 2016 9:07 AM, "Chris Withers" wrote:

> Hi All,
>
> I need to build a package which is made up of a set of .so's.
>
> These .so's are compiled elsewhere, so I just need to write an appropriate
> setup.py and bdist_wheel to get what I want.
> But that first part is where
> I'm struggling.
>
> Each of these .so's is essentially a top level python module. How do I
> tell setuptools' setup() function to basically just roll these into a wheel?
>
> Any help gratefully received!
>
>
> Chris
>
> _______________________________________________
> Distutils-SIG maillist - Distutils-SIG at python.org
> https://mail.python.org/mailman/listinfo/distutils-sig
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From encukou at gmail.com Tue Jul 5 03:53:24 2016
From: encukou at gmail.com (Petr Viktorin)
Date: Tue, 5 Jul 2016 09:53:24 +0200
Subject: [Distutils] Breaking up the stdlib (Was: [Python-Dev] release cadence)
In-Reply-To: 
References: <5772F17A.1080902@hastings.org> <5774CD41.9030601@hastings.org>
Message-ID: <177a7bdf-6744-3b18-29d4-5e6b6e1f17ab@gmail.com>

On 07/04/2016 12:19 AM, Steve Dower wrote:
> My thinking on this issue was that some/most packages from the stdlib
> would move into site-packages. Certainly I'd expect asyncio to be in
> this category, and probably typing. Even going as far as email and
> urllib would potentially be beneficial (to those packages, is my thinking).
>
> Obviously not every single module can do this, but there are plenty that
> aren't low-level dependencies for other modules that could. Depending on
> particular versions of these then becomes a case of adding normal
> package version constraints - we could even bundle version information
> for non-updateable packages so that installs fail on incompatible Python
> versions.
>
> The "Uber repository" could be a requirements.txt that pulls down wheels
> for the selected stable versions of each package so that we still
> distribute all the same code with the same stability, but users have
> much more ability to patch their own stdlib after install.
>
> (FWIW, we use a system similar to this at Microsoft for building Visual
> Studio, so I can vouch that it works on much more complicated software
> than Python.)

While we're on the subject, I'd like to offer another point for
consideration: not all implementations of Python can provide the full
stdlib, and not everyone wants the full stdlib.

For MicroPython, most of Python's batteries are too heavy. Tkinter on
Android is probably not useful enough for people to port it. Weakref
can't be emulated nicely in Javascript. If packages had a way to opt out
of needing the whole standard library, and instead specify the stdlib
subset they need, answering questions like "will this run on my phone?"
and "what piece of the stdlib do we want to port next?" would be easier.

Both Debian and Fedora package some parts of the stdlib separately
(tkinter, venv, tests), and have opt-in subsets of the stdlib for
minimal systems (python-minimal, system-python). Tools like pyinstaller
run magic heuristics to determine what parts of the stdlib can be left
out. It would help these projects if the "not all of stdlib is
installed" case was handled more systematically at the CPython or
distutils level.

As I said at the Language Summit, this is just talk; I don't currently
have the resources to drive this effort. But if anyone is thinking of
splitting the stdlib, please keep these points in mind as well. I think
that, at least, if "pip install -U asyncio" becomes possible, "pip
uninstall --yes-i-know-what-im-doing asyncio" should be possible as
well.

> From: Paul Moore
> Sent: 7/3/2016 14:23
> To: Brett Cannon
> Cc: Guido van Rossum; Nick Coghlan; Python-Dev; Steve Dower
> Subject: Re: [Python-Dev] release cadence (was: Request for CPython
> 3.5.3 release)
>
> On 3 July 2016 at 22:04, Brett Cannon wrote:
>> This last bit is what I would advocate if we broke the stdlib out unless an
>> emergency patch release is warranted for a specific module (e.g.
>> like asyncio that started this discussion). Obviously backporting is its own
>> thing.
>
> It's also worth noting that pip has no mechanism for installing an
> updated stdlib module, as everything goes into site-packages, and the
> stdlib takes precedence over site-packages unless you get into
> sys.path hacking abominations like setuptools uses (or at least used
> to use, I don't know if it still does). So as things stand,
> independent patch releases of stdlib modules would need to be manually
> copied into place.
>
> Allowing users to override the stdlib opens up a different can of
> worms - not necessarily one that we couldn't resolve, but IIRC, it was
> always a deliberate policy that overriding the stdlib wasn't possible
> (that's why backports have names like unittest2...)
>

From chris at simplistix.co.uk Thu Jul 7 06:40:46 2016
From: chris at simplistix.co.uk (Chris Withers)
Date: Thu, 7 Jul 2016 11:40:46 +0100
Subject: [Distutils] package made up of only .so's?
In-Reply-To: 
References: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk>
Message-ID: <2a64a11d-72cf-b984-63ff-482dc1b47372@simplistix.co.uk>

On 04/07/2016 17:31, Ionel Cristian Mărieș wrote:
>
> On Mon, Jul 4, 2016 at 6:53 PM, Chris Withers wrote:
>
>> Each of these .so's is essentially a top level python module. How do
>> I tell setuptools' setup() function to basically just roll these
>> into a wheel?
>
> If you want to package arbitrary files take a look at this setup.py:
> https://github.com/pytest-dev/pytest-cov/blob/master/setup.py
> It should be simple enough to replicate, and it handles most ways to
> install, besides wheels.

Not sure I'm following; I don't see any handling for non-.py files in
there... What line(s) should I be looking at?

Chris

From chris at simplistix.co.uk Thu Jul 7 06:49:56 2016
From: chris at simplistix.co.uk (Chris Withers)
Date: Thu, 7 Jul 2016 11:49:56 +0100
Subject: [Distutils] package made up of only .so's?
In-Reply-To: 
References: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk>
Message-ID: <2aef2608-d884-53dd-8b03-5a3abd1a4e9a@simplistix.co.uk>

An HTML attachment was scrubbed...
URL: 

From takowl at gmail.com Thu Jul 7 07:05:00 2016
From: takowl at gmail.com (Thomas Kluyver)
Date: Thu, 7 Jul 2016 12:05:00 +0100
Subject: [Distutils] package made up of only .so's?
In-Reply-To: <2aef2608-d884-53dd-8b03-5a3abd1a4e9a@simplistix.co.uk>
References: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk> <2aef2608-d884-53dd-8b03-5a3abd1a4e9a@simplistix.co.uk>
Message-ID: 

Hi Chris,

On 7 July 2016 at 11:49, Chris Withers wrote:

> flit does look nice and clean, but appears to only support one
> module/package?
> Thomas, is that ever likely to change?

That's by design, I'm afraid. I like one top-level module to correspond
to one installable distribution. Flit also assumes that the wheels it's
building are pure Python, so they get -none-any tags.

Feel free to cannibalise code from flit to build your specific wheels,
though. It's relatively easy to create a wheel without lots of tooling.
For instance, I have a script that unpacks the Windows binary installers
for PyQt4, reassembles the files into wheels, and uploads them to PyPI:
https://github.com/takluyver/pyqt4_windows_whl

Thomas

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From chris at simplistix.co.uk Thu Jul 7 07:13:03 2016
From: chris at simplistix.co.uk (Chris Withers)
Date: Thu, 7 Jul 2016 12:13:03 +0100
Subject: [Distutils] package made up of only .so's?
In-Reply-To: 
References: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk>
Message-ID: <766fbb2f-d7e6-257c-da0f-293bdbc98827@simplistix.co.uk>

An HTML attachment was scrubbed...
URL: 

From dholth at gmail.com Thu Jul 7 07:36:53 2016
From: dholth at gmail.com (Daniel Holth)
Date: Thu, 07 Jul 2016 11:36:53 +0000
Subject: [Distutils] package made up of only .so's?
In-Reply-To: <766fbb2f-d7e6-257c-da0f-293bdbc98827@simplistix.co.uk>
References: <24e0350e-034d-2c8a-81d2-7c869f9f7716@simplistix.co.uk> <766fbb2f-d7e6-257c-da0f-293bdbc98827@simplistix.co.uk>
Message-ID: 

Please install directly from the repository for now.

On Thu, Jul 7, 2016, 07:13 Chris Withers wrote:

> The release version on pypi doesn't seem to work...
>
> (Appears to be missing setup2toml and doesn't specify pytoml as a
> dependency.)
>
> What's the current recommended way to install enscons?
>
> Chris
>
> On 04/07/2016 17:27, Daniel Holth wrote:
>
> Try my new project, enscons. https://bitbucket.org/dholth/enscons . You
> can just wheel up whatever you want, without bothering with all this
> setup.py nonsense. Put your metadata in pyproject.toml, enscons' setup2toml
> script might help you generate that from setup.py. It might look something
> like this:
>
> import pytoml as toml
> import enscons
>
> metadata = dict(toml.load(open('pyproject.toml')))['tool']['enscons']
>
> # most specific binary, non-manylinux1 tag should be at the top of this list
> if True:
>     import wheel.pep425tags
>     for tag in wheel.pep425tags.get_supported():
>         full_tag = '-'.join(tag)
>         if not 'manylinux' in tag:
>             break
>
> env = Environment(tools=['default', 'packaging', enscons.generate],
>                   PACKAGE_METADATA=metadata,
>                   WHEEL_TAG=full_tag,
>                   ROOT_IS_PURELIB=False)
>
> sources = Glob('*.so')
> env.Whl('platlib', sources, root='.')
>
> On Mon, Jul 4, 2016 at 12:07 PM Chris Withers wrote:
>
>> Hi All,
>>
>> I need to build a package which is made up of a set of .so's.
>>
>> These .so's are compiled elsewhere, so I just need to write an
>> appropriate setup.py and bdist_wheel to get what I want. But that first
>> part is where I'm struggling.
>>
>> Each of these .so's is essentially a top level python module. How do I
>> tell setuptools' setup() function to basically just roll these into a
>> wheel?
>>
>> Any help gratefully received!
>>
>>
>> Chris
>>
>> _______________________________________________
>> Distutils-SIG maillist - Distutils-SIG at python.org
>> https://mail.python.org/mailman/listinfo/distutils-sig

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From dimaqq at gmail.com Tue Jul 12 07:55:37 2016
From: dimaqq at gmail.com (Dima Tisnek)
Date: Tue, 12 Jul 2016 13:55:37 +0200
Subject: [Distutils] Outdated packages on pypi
Message-ID: 

Hi all,

Is anyone working on pruning old packages from pypi?

I found something last updated in 2014 which, looking at the source,
appears half-done. The Github link doesn't work any longer, there's no
description, etc.

I managed to find the author's email address out of band, and he
responded that he can't remember the password, yada yada.

I wonder if some basic automation is possible here -- check if URLs are
reachable and if the existing package satisfies basic requirements;
failing that, mark it as "possibly out of date".

d.

From glyph at twistedmatrix.com Tue Jul 12 16:45:05 2016
From: glyph at twistedmatrix.com (Glyph Lefkowitz)
Date: Tue, 12 Jul 2016 13:45:05 -0700
Subject: [Distutils] Outdated packages on pypi
In-Reply-To: 
References: 
Message-ID: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com>

> On Jul 12, 2016, at 4:55 AM, Dima Tisnek wrote:
>
> Hi all,
>
> Is anyone working on pruning old packages from pypi?
>
> I found something last updated in 2014, which, looking at the source
> appears half-done.
> Github link doesn't work any longer, no description, etc.
>
> I managed to find author's email address out of band, and he responded
> that he can't remember the password, yada yada.
>
> I wonder if some basic automation is possible here -- check if url's
> are reachable and if existing package satisfies basic requirements,
> failing that mark it as "possibly out of date"

My feeling is that there should be a "dead man's switch" sort of
mechanism for this. Require manual intervention from at least one
package owner at least once a year. I believe if you dig around in the
archives there's been quite a bit of discussion around messaging to
package owners and that sort of thing - and the main sticking point is
that someone needs to volunteer to do the work on Warehouse. Are you
that person? :)

-glyph

From donald at stufft.io Wed Jul 13 00:54:02 2016
From: donald at stufft.io (Donald Stufft)
Date: Wed, 13 Jul 2016 00:54:02 -0400
Subject: [Distutils] Outdated packages on pypi
In-Reply-To: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com>
References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com>
Message-ID: <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io>

> On Jul 12, 2016, at 4:45 PM, Glyph Lefkowitz wrote:
>
> My feeling is that there should be a "dead man's switch" sort of
> mechanism for this. Require manual intervention from at least one
> package owner at least once a year. I believe if you dig around in the
> archives there's been quite a bit of discussion around messaging to
> package owners and that sort of thing - and the main sticking point is
> that someone needs to volunteer to do the work on Warehouse. Are you
> that person? :)

I suspect any change like this will require some sort of PEP or
something similar to it. It's something that I think is going to be hard
to get just right (if it's something we want to do at all). Software can
be "finished" without needing more releases, and sometimes projects stop
getting updates until the maintainer has more time (or a new maintainer
comes along). An example is setuptools, which had no releases between
Oct 2009 and Jun 2013.
Another nice example is ``wincertstore``, which has had two releases,
one in 2013 and one in 2014, and is one of the most downloaded projects
on PyPI. It doesn't need any updates because it's just a wrapper around
Windows APIs via ctypes.

Another thing we need to be careful about is what do we do once said
dead man's switch triggers? We can't just release the package to allow
anyone to register it; that's just pointing a security-shaped footgun at
the foot of every person using that project. It doesn't make sense to
block new uploads for that project, since there's no point to
disallowing new uploads. Flagging it to allow someone to "take over"
(possibly with some sort of review) has some of the same security-shaped
footguns, as well as a problem with deciding who to trust with a name or
not.

Donald Stufft

From baptiste at bitsofnetworks.org Wed Jul 13 08:42:21 2016
From: baptiste at bitsofnetworks.org (Baptiste Jonglez)
Date: Wed, 13 Jul 2016 14:42:21 +0200
Subject: [Distutils] Missing IPv6 support on pypi.python.org
In-Reply-To: 
References: <20151108200441.GB28216@lud.polynome.dn42>
Message-ID: <20160713124221.GA14650@lud.polynome.dn42>

As a follow-up, Fastly now provides an option to enable IPv6 (but this
is not enabled by default). See:
https://github.com/pypa/pypi-legacy/issues/90#issuecomment-231240046

Does pypi plan to participate in this program? It would be nice!

Thanks,
Baptiste

On Sun, Nov 08, 2015 at 09:13:49PM -0500, Donald Stufft wrote:
> I'm pretty sure that PyPI will get IPv6 support as soon as Fastly
> supports it and not any sooner. I know they're working on making it
> happen but I don't think they have a public timeline for it yet.
>
> On November 8, 2015 at 4:34:32 PM, Baptiste Jonglez (baptiste at
> bitsofnetworks.org) wrote:
> > Hi,
> >
> > pypi.python.org is currently not reachable over IPv6.
> >
> > I know this issue was brought up before [1,2].
> > This is a real issue for
> > us, because our backend servers are IPv6-only (clients never need to talk
> > to backend servers, they go through IPv4-enabled HTTP frontends).
> >
> > So, deploying packages from pypi on the IPv6-only servers is currently a
> > pain. What is the roadmap to add IPv6 support? It seems that Fastly has
> > already deployed IPv6 [3].
> >
> > Thanks,
> > Baptiste
> >
> >
> > [1] https://mail.python.org/pipermail/distutils-sig/2014-June/024465.html
> > [2] https://bitbucket.org/pypa/pypi/issues/90/missing-ipv6-connectivity
> > [3] http://bgp.he.net/AS54113#_prefixes6
> > _______________________________________________
> > Distutils-SIG maillist - Distutils-SIG at python.org
> > https://mail.python.org/mailman/listinfo/distutils-sig
>
> -----------------
> Donald Stufft
> PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: not available
URL: 

From alex.gronholm at nextday.fi Wed Jul 13 09:08:35 2016
From: alex.gronholm at nextday.fi (=?UTF-8?Q?Alex_Gr=c3=b6nholm?=)
Date: Wed, 13 Jul 2016 16:08:35 +0300
Subject: [Distutils] Missing IPv6 support on pypi.python.org
In-Reply-To: <20160713124221.GA14650@lud.polynome.dn42>
References: <20151108200441.GB28216@lud.polynome.dn42> <20160713124221.GA14650@lud.polynome.dn42>
Message-ID: <57863D53.60801@nextday.fi>

The legacy software might have issues with IPv6, so I doubt this will
happen before Warehouse replaces Cheeseshop as the new PyPI.

13.07.2016, 15:42, Baptiste Jonglez wrote:
> As a follow-up, Fastly now provides an option to enable IPv6 (but this is
> not enabled by default).
>
> See: https://github.com/pypa/pypi-legacy/issues/90#issuecomment-231240046
>
> Does pypi plan to participate in this program? It would be nice!
> > Thanks, > Baptiste > > On Sun, Nov 08, 2015 at 09:13:49PM -0500, Donald Stufft wrote: >> I'm pretty sure that PyPI will get IPv6 support as soon as Fastly supports it and not any sooner. I know they're working on making it happen but I don't think they have a public timeline for it yet. >> >> On November 8, 2015 at 4:34:32 PM, Baptiste Jonglez (baptiste at bitsofnetworks.org) wrote: >>> Hi, >>> >>> pypi.python.org is currently not reachable over IPv6. >>> >>> I know this issue was brought up before [1,2]. This is a real issue for >>> us, because our backend servers are IPv6-only (clients never need to talk >>> to backend servers, they go through IPv4-enabled HTTP frontends). >>> >>> So, deploying packages from pypi on the IPv6-only servers is currently a >>> pain. What is the roadmap to add IPv6 support? It seems that Fastly has >>> already deployed IPv6 [3]. >>> >>> Thanks, >>> Baptiste >>> >>> >>> [1] https://mail.python.org/pipermail/distutils-sig/2014-June/024465.html >>> [2] https://bitbucket.org/pypa/pypi/issues/90/missing-ipv6-connectivity >>> [3] http://bgp.he.net/AS54113#_prefixes6 >>> _______________________________________________ >>> Distutils-SIG maillist - Distutils-SIG at python.org >>> https://mail.python.org/mailman/listinfo/distutils-sig >>> >> ----------------- >> Donald Stufft >> PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA >> >> >> >> >> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> https://mail.python.org/mailman/listinfo/distutils-sig -------------- next part -------------- An HTML attachment was scrubbed...
URL: From chris at withers.org Wed Jul 13 11:25:43 2016 From: chris at withers.org (Chris Withers) Date: Wed, 13 Jul 2016 16:25:43 +0100 Subject: [Distutils] Missing IPv6 support on pypi.python.org In-Reply-To: <57863D53.60801@nextday.fi> References: <20151108200441.GB28216@lud.polynome.dn42> <20160713124221.GA14650@lud.polynome.dn42> <57863D53.60801@nextday.fi> Message-ID: An HTML attachment was scrubbed... URL: From brett at python.org Wed Jul 13 12:49:33 2016 From: brett at python.org (Brett Cannon) Date: Wed, 13 Jul 2016 16:49:33 +0000 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> Message-ID: On Tue, 12 Jul 2016 at 21:54 Donald Stufft wrote: > > > On Jul 12, 2016, at 4:45 PM, Glyph Lefkowitz > wrote: > > > > My feeling is that there should be a "dead man's switch" sort of > mechanism for this. Require manual intervention from at least one package > owner at least once a year. I believe if you dig around in the archives > there's been quite a bit of discussion around messaging to package owners > and that sort of thing - and the main sticking point is that someone needs > to volunteer to do the work on Warehouse. Are you that person? :) > > [SNIP] > > Another thing we need to be careful about is what do we do once said dead > man's switch triggers? We can't just release the package to allow anyone to > register it, that's just pointing a security shaped footgun at the foot of > every person using that project. It doesn't make sense to block new uploads > for that project since there's no point to disallowing new uploads. > Flagging it to allow someone to "take over"
My assumption was that if a project was flagged as no longer maintained, then it would literally just get some clear banner/label/whatever to let people know that if they start using the project that they shouldn't necessarily expect bug-fixes. And if people wanted to get really fancy, expose this metadata such that some tool could easily warn you that you have dependencies that have been flagged as unsupported code. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jim at jimfulton.info Wed Jul 13 13:24:40 2016 From: jim at jimfulton.info (Jim Fulton) Date: Wed, 13 Jul 2016 13:24:40 -0400 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: Message-ID: On Tue, Jul 12, 2016 at 7:55 AM, Dima Tisnek wrote: > Hi all, > > Is anyone working on pruning old packages from pypi? > > I found something last updated in 2014, which, looking at the source > appears half-done. > Github link doesn't work any longer, no description, etc. > > I managed to find author's email address out of band, and he responded > that he can't remember the password, yada yada. > > I wonder if some basic automation is possible here -- check if URLs > are reachable and if existing package satisfies basic requirements, > failing that mark it as "possibly out of date" I'm curious why you view this as a problem that needs to be solved? - Do you want to take over the name yourself? - Are you afraid someone will stumble on this package and use it? Something else?
Jim -- Jim Fulton http://jimfulton.info From donald at stufft.io Wed Jul 13 13:30:46 2016 From: donald at stufft.io (Donald Stufft) Date: Wed, 13 Jul 2016 13:30:46 -0400 Subject: [Distutils] Missing IPv6 support on pypi.python.org In-Reply-To: <20160713124221.GA14650@lud.polynome.dn42> References: <20151108200441.GB28216@lud.polynome.dn42> <20160713124221.GA14650@lud.polynome.dn42> Message-ID: > On Jul 13, 2016, at 8:42 AM, Baptiste Jonglez wrote: > > As a follow-up, Fastly now provides an option to enable IPv6 (but this is > not enabled by default). > > See: https://github.com/pypa/pypi-legacy/issues/90#issuecomment-231240046 > > Does pypi plan to participate in this program? It would be nice! It's not an option that we can turn on at will. It's currently in Limited Availability and you have to ask Fastly to be put on a list and they'll select which of their customers they want to invite into the LA program. We have asked to be put on the list and we're waiting to see if they accepted us into the LA. -- Donald Stufft From jim at jimfulton.info Wed Jul 13 13:41:22 2016 From: jim at jimfulton.info (Jim Fulton) Date: Wed, 13 Jul 2016 13:41:22 -0400 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> Message-ID: Well said. IMO, package names shouldn't be reused. Also, IMO, we have a namespace problem, for which there's a common solution that we avoid (domain based names, which can also be reused, but ...). OTOH, here's an idea. What if in addition to the project name, we also assigned a unique id? When a package was added to a consuming project, we'd store both the package's name, and its project id. When looking up a package, we'd supply both the project name and id. If a name was reused, the new project with the same name would have a new project id and wouldn't be confused with the old one.
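[A rough sketch of the name-plus-unique-id lookup described above. Everything here is hypothetical illustration, not an actual PyPI API: the point is only that a consumer pins both the name and the id it first resolved, so a reused name can never silently resolve to a different project.]

```python
# Hypothetical sketch of the (name, project id) scheme -- not a real PyPI API.
import itertools


class Index:
    def __init__(self):
        self._ids = itertools.count(1)
        self._by_name = {}   # name -> id of the project currently holding it
        self._projects = {}  # id -> project record; ids are never reused

    def register(self, name):
        """Register a name; re-registering a name always yields a NEW id."""
        pid = next(self._ids)
        self._by_name[name] = pid
        self._projects[pid] = {"name": name, "releases": []}
        return pid

    def lookup(self, name, pid):
        """Resolve a pinned (name, id) pair; fail loudly if the name moved."""
        if self._by_name.get(name) != pid:
            raise LookupError("%r no longer refers to project %d" % (name, pid))
        return self._projects[pid]


index = Index()
old_id = index.register("foobar")  # original author registers the name
new_id = index.register("foobar")  # name later reused by someone else
assert old_id != new_id            # consumers pinned to old_id now fail loudly
```

Old releases could still be served by id without advertising them, which is what makes the name reuse "pretty safe" in this scheme.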
We could even still serve the old project's releases without advertising them. This way, if we did decide to reuse a name, we could do so pretty safely. Jim On Wed, Jul 13, 2016 at 12:54 AM, Donald Stufft wrote: > >> On Jul 12, 2016, at 4:45 PM, Glyph Lefkowitz wrote: >> >> My feeling is that there should be a "dead man's switch" sort of mechanism for this. Require manual intervention from at least one package owner at least once a year. I believe if you dig around in the archives there's been quite a bit of discussion around messaging to package owners and that sort of thing - and the main sticking point is that someone needs to volunteer to do the work on Warehouse. Are you that person? :) > > > I suspect any change like this will require some sort of PEP or something similar to it. It's something that I think is going to be hard to get just right (if it's something we want to do at all). > > Software can be "finished" without needing more releases, and sometimes projects stop getting updates until the maintainer has more time (or a new maintainer comes along). An example is setuptools which had no releases between Oct 2009 and Jun 2013. Another nice example is ``wincertstore`` which has had two releases, one in 2013 and one in 2014, and is one of the most downloaded projects on PyPI. It doesn't need any updates because it's just a wrapper around Windows APIs via ctypes. > > Another thing we need to be careful about is what do we do once said dead man's switch triggers? We can't just release the package to allow anyone to register it, that's just pointing a security shaped footgun at the foot of every person using that project. It doesn't make sense to block new uploads for that project since there's no point to disallowing new uploads. Flagging it to allow someone to "take over" (possibly with some sort of review) has some of the security shaped footguns as well as a problem with deciding who to trust with a name or not. > > --
> Donald Stufft > > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig -- Jim Fulton http://jimfulton.info From dimaqq at gmail.com Wed Jul 13 14:08:20 2016 From: dimaqq at gmail.com (Dima Tisnek) Date: Wed, 13 Jul 2016 20:08:20 +0200 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: Message-ID: I came across a package by accident. A mate made a reasonable mistake typing in a pip command, and something odd got installed. For a moment I even suspected that package in question was some kind of malware, so I went to download it manually (not via pip install), and realised that the package was not updated for a long while, didn't have a description and the GitHub link was broken. That overall got me thinking about namespace pollution in pip, that once something is pushed in, it's likely to stay there forever. I figured, with so many packages in pypi, what's the percentage that cannot be removed by author for a simple reason that, in the worst case, author is dead? Btw., I'm not a fan of domain names (too finicky, change more often than short names) or unique ids (humans don't handle them well). I'd rather see something similar to Linux distributions where there's a curated repository "core" and a few semi-official, like "extra" and "community," and for some, "testing." A name foobar resolves to core/foobar- if that exists, and if not some subset of other repositories is used. This way, an outdated package can be moved to another repo without breaking install base. In fact, curation without namespaces will already be pretty good. d. On 13 July 2016 at 19:24, Jim Fulton wrote: > On Tue, Jul 12, 2016 at 7:55 AM, Dima Tisnek wrote: >> Hi all, >> >> Is anyone working on pruning old packages from pypi? >> >> I found something last updated in 2014, which, looking at the source >> appears half-done.
>> Github link doesn't work any longer, no description, etc. >> >> I managed to find author's email address out of band, and he responded >> that he can't remember the password, yada yada. >> >> I wonder if some basic automation is possible here -- check if URLs >> are reachable and if existing package satisfies basic requirements, >> failing that mark it as "possibly out of date" > > I'm curious why you view this as a problem that needs to be solved? > > - Do you want to take over the name yourself? > > - Are you afraid someone will stumble on this package and use it? > > Something else? > > Jim > > -- > Jim Fulton > http://jimfulton.info From dmitry.trofimov at jetbrains.com Wed Jul 13 14:43:42 2016 From: dmitry.trofimov at jetbrains.com (Dmitry Trofimov) Date: Wed, 13 Jul 2016 20:43:42 +0200 Subject: [Distutils] PyPI index workaround Message-ID: Hi, to have information about available packages, PyCharm IDE currently parses the PyPI index page (https://pypi.python.org/pypi?%3Aaction=index). As it is going to be deprecated soon, we are looking for a workaround. What we need is, making one request, to get the name and the version of all PyPI packages. Then we cache this information in the IDE ( https://github.com/JetBrains/intellij-community/blob/7e16c042a19767d5f548c84f88cc5edd5f9d1721/python/src/com/jetbrains/python/packaging/PyPIPackageUtil.java ). What official API could you advise us to look at? Any hint is appreciated. Best regards, Dmitry -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Wed Jul 13 14:57:11 2016 From: donald at stufft.io (Donald Stufft) Date: Wed, 13 Jul 2016 14:57:11 -0400 Subject: [Distutils] PyPI index workaround In-Reply-To: References: Message-ID: > On Jul 13, 2016, at 2:43 PM, Dmitry Trofimov wrote: > > Hi, > > to have information about available packages, PyCharm IDE currently parses > the PyPI index page (https://pypi.python.org/pypi?%3Aaction=index ).
> As it is going to be deprecated soon, we are looking for a workaround. > > What we need is, making one request, to get the name and the version of > all PyPI packages. Then we cache this information in the IDE ( > https://github.com/JetBrains/intellij-community/blob/7e16c042a19767d5f548c84f88cc5edd5f9d1721/python/src/com/jetbrains/python/packaging/PyPIPackageUtil.java > ). By name and version, do you mean the latest version? -- Donald Stufft -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Wed Jul 13 15:04:41 2016 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 13 Jul 2016 20:04:41 +0100 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: Message-ID: On 13 July 2016 at 19:08, Dima Tisnek wrote: > I'd rather see something similar to Linux distributions where there's > a curated repository "core" and a few semi-official, like "extra" and > "community," and for some, "testing." > A name foobar resolves to core/foobar- if that exists, and if > not some subset of other repositories is used. > This way, an outdated package can be moved to another repo without > breaking install base. > > In fact, curation without namespaces will already be pretty good. Who would take on the work of maintaining a curated repository? To do anything like a reasonable job would be a *lot* of work. Paul From donald at stufft.io Wed Jul 13 15:06:48 2016 From: donald at stufft.io (Donald Stufft) Date: Wed, 13 Jul 2016 15:06:48 -0400 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: Message-ID: <888353C1-A9CE-4508-9A9A-7EF7731CD5D2@stufft.io> > On Jul 13, 2016, at 2:08 PM, Dima Tisnek wrote: > > I'd rather see something similar to Linux distributions where there's > a curated repository "core" and a few semi-official, like "extra" and > "community," and for some, "testing." > A name foobar resolves to core/foobar- if that exists, and if > not some subset of other repositories is used.
> This way, an outdated package can be moved to another repo without > breaking install base. PyPI is unlikely to *ever* become a curated repository. The time and effort it would take to do that, even if we decided we wanted to, is not something that we have available to us. Beyond that though, I think that would fundamentally change PyPI in a way that is for the worse. One of the goals of PyPI is to enable anyone to publish a package, whether they're as well known and trusted as Guido, or some unknown person from the backwoods of Pennsylvania. We try very hard to remain neutral in terms of whether one package is "better" than another package and try to present largely unbiased information [1]. It would not be particularly hard, technically speaking, for someone to maintain a curated set of packages on top of what PyPI provides already. This would not need to be an official PyPI thing, but if one rose to some prominence it would be easy enough to direct folks to it who want that sort of thing. [1] To the extent that any information at all is unbiased. -- Donald Stufft From donald at stufft.io Wed Jul 13 15:25:39 2016 From: donald at stufft.io (Donald Stufft) Date: Wed, 13 Jul 2016 15:25:39 -0400 Subject: [Distutils] PyPI index workaround In-Reply-To: References: Message-ID: <8114470F-5B75-4FDB-B858-CD746B79C92F@stufft.io> > On Jul 13, 2016, at 3:12 PM, Mikhail Golubev wrote: > > I'm sorry, I should have posted my commentary here, not in the separate thread. > > We have some issues with suggested "/simple" endpoint. Despite the need to scrape the web page, old endpoint allowed us to quickly find latest versions of the packages hosted on PyPI. We did a single request on IDE startup and showed outdated installed packages in the settings later. Index "/simple" however contains only package names and links to the dedicated pages with their artifacts (not for each of them, though).
It means that now we have to make tons of individual requests to find the latest published version for each installed package. Isn't it going to load the service even worse? > > So, yes, we're interested most in the latest version of a package. > Ok, we don't currently have an API like that (largely because nobody has come up with a use case that was pressing enough to need to devote resources to it). It was requested though, and is being tracked by https://github.com/pypa/warehouse/issues/347 . This is likely enough to pull this issue onto my radar as a sooner-rather-than-later issue. -- Donald Stufft -------------- next part -------------- An HTML attachment was scrubbed... URL: From dmitry.trofimov at jetbrains.com Wed Jul 13 15:40:28 2016 From: dmitry.trofimov at jetbrains.com (Dmitry Trofimov) Date: Wed, 13 Jul 2016 21:40:28 +0200 Subject: [Distutils] PyPI index workaround In-Reply-To: <8114470F-5B75-4FDB-B858-CD746B79C92F@stufft.io> References: <8114470F-5B75-4FDB-B858-CD746B79C92F@stufft.io> Message-ID: > > > Ok, we don't currently have an API like that (largely because nobody has > come up with a use case that was pressing enough to need to devote > resources to it). It was requested though, and is being tracked by > https://github.com/pypa/warehouse/issues/347. This is likely enough to > pull this issue onto my radar as a sooner-rather-than-later issue. Does that mean that PyPI index page will live for a while until the new API is implemented? On Wed, Jul 13, 2016 at 9:25 PM, Donald Stufft wrote: > > On Jul 13, 2016, at 3:12 PM, Mikhail Golubev wrote: > > I'm sorry, I should have posted my commentary here, not in the separate thread. > > >> We have some issues with suggested "/simple" endpoint. Despite the need >> to scrape the web page, old endpoint allowed us to quickly find latest >> versions of the packages hosted on PyPI. We did a single request on IDE >> startup and showed outdated installed packages in the settings later.
Index >> "/simple" however contains only package names and links to the dedicated >> pages with their artifacts (not for each of them, though). It means that >> now we have to make tons of individual requests to find the latest >> published version for each installed package. Isn't it going to load the >> service even worse? > > > So, yes, we're interested most in the latest version of a package. > > > > Ok, we don't currently have an API like that (largely because nobody has > come up with a use case that was pressing enough to need to devote > resources to it). It was requested though, and is being tracked by > https://github.com/pypa/warehouse/issues/347. This is likely enough to > pull this issue onto my radar as a sooner-rather-than-later issue. > > > -- > Donald Stufft > > > > -- Dmitry Trofimov PyCharm Team Lead JetBrains http://www.jetbrains.com The Drive To Develop -------------- next part -------------- An HTML attachment was scrubbed... URL: From glyph at twistedmatrix.com Wed Jul 13 15:52:50 2016 From: glyph at twistedmatrix.com (Glyph Lefkowitz) Date: Wed, 13 Jul 2016 12:52:50 -0700 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> Message-ID: <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> > On Jul 12, 2016, at 9:54 PM, Donald Stufft wrote: > > >> On Jul 12, 2016, at 4:45 PM, Glyph Lefkowitz wrote: >> >> My feeling is that there should be a "dead man's switch" sort of mechanism for this. Require manual intervention from at least one package owner at least once a year. I believe if you dig around in the archives there's been quite a bit of discussion around messaging to package owners and that sort of thing - and the main sticking point is that someone needs to volunteer to do the work on Warehouse. Are you that person?
:) > > > I suspect any change like this will require some sort of PEP or something similar to it. It's something that I think is going to be hard to get just right (if it's something we want to do at all). > > Software can be "finished" without needing more releases, "The software isn't finished until the last user is dead." :-) > and sometimes projects stop getting updates until the maintainer has more time (or a new maintainer comes along). Yes; the whole point here is to have some way for people to know that a new maintainer is needed. > An example is setuptools which had no releases between Oct 2009 and Jun 2013. Arguably setuptools _was_ badly broken though, and if it had been obvious earlier on that it was in a bad situation perhaps we'd be further along by now :-). > Another nice example is ``wincertstore`` which has had two releases, one in 2013 and one in 2014, and is one of the most downloaded projects on PyPI. It doesn't need any updates because it's just a wrapper around Windows APIs via ctypes. Except it does need testing against new versions of Python. No Python :: 3.5 classifier on it, for example! And right at the top of its description, a security fix. The point of such a switch is to be able to push it and respond; not to tell the maintainer "you have to do a new release!" but rather to prompt the maintainer to explicitly acknowledge "the reason I have not done a new release is not that I haven't been paying attention; I am alive, I'm paying attention, and we don't need any maintenance, someone is still watching". > Another thing we need to be careful about is what do we do once said dead man's switch triggers? We can't just release the package to allow anyone to register it, that's just pointing a security shaped footgun at the foot of every person using that project. It doesn't make sense to block new uploads for that project since there's no point to disallowing new uploads. Flagging it to allow someone to "take over"
(possibly with some sort of review) has some of the security shaped footguns as well as a problem with deciding who to trust with a name or not. The primary thing would be to have a banner on the page and a warning from `pip install`. Those of us close to the heart of the Python community already have various ways of reading the tea leaves to know that things are likely to be unmaintained or bitrotting; the main purpose of such a feature would be to have an automated way for people who don't personally know all the prominent package authors and see them at conferences and meetups all the time to get this information. For example: nobody should be using PIL, they should be using pillow. Yet there's no way for a new user to figure this out by just looking at https://pypi.io/project/PIL/ :). I think that the adjudication process for stealing a name from an existing owner is something that still bears discussion, but separately. Whatever that process is, you'd have to go through it fully after a package becomes thusly "abandoned", and for the reasons you cite, it absolutely should not be automated. Perhaps it shouldn't even be the way to deal with it - maybe the most you should be able to do in this case is to expand the "this is unmaintained" warning with a pointer to a different replacement name. -glyph -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Wed Jul 13 15:56:45 2016 From: donald at stufft.io (Donald Stufft) Date: Wed, 13 Jul 2016 15:56:45 -0400 Subject: [Distutils] PyPI index workaround In-Reply-To: References: <8114470F-5B75-4FDB-B858-CD746B79C92F@stufft.io> Message-ID: <1051DE9F-813F-40E6-AAC3-3A4EA1DF7A18@stufft.io> > On Jul 13, 2016, at 3:40 PM, Dmitry Trofimov wrote: > > Does that mean that PyPI index page will live for a while until the new API is implemented? Yes, though I'm looking at this right now. I do have a question here though.
If I understand the dialog, this is to provide a way for people to upgrade packages they have installed, and to tell them if there is a newer version or not. So my question here is why do you need the latest version for *every* package instead of just the ones you have installed? If you narrow it down to just the ones that are installed, then the number of HTTP requests needed with the current APIs goes down from ~80,000 to likely <100 or even <50 in most cases. -- Donald Stufft -------------- next part -------------- An HTML attachment was scrubbed... URL: From qsolo825 at gmail.com Wed Jul 13 15:05:51 2016 From: qsolo825 at gmail.com (Mikhail Golubev) Date: Wed, 13 Jul 2016 22:05:51 +0300 Subject: [Distutils] Deprecation of the endpoint "/pypi?%3Aaction=index" Message-ID: Hi guys. I'd like to clarify Dmitry's question a bit. We have some issues with suggested "/simple" endpoint. Despite the need to scrape the web page, old endpoint allowed us to quickly find latest versions of the packages hosted on PyPI. We did a single request on IDE startup and showed outdated installed packages in the settings later. Index "/simple" however contains only package names and links to the dedicated pages with their artifacts (not for each of them, though). It means that now we have to make tons of individual requests to find the latest published version for each installed package. Isn't it going to load the service even worse? -- Best regards Mikhail Golubev -------------- next part -------------- An HTML attachment was scrubbed... URL: From qsolo825 at gmail.com Wed Jul 13 15:12:36 2016 From: qsolo825 at gmail.com (Mikhail Golubev) Date: Wed, 13 Jul 2016 22:12:36 +0300 Subject: [Distutils] PyPI index workaround In-Reply-To: References: Message-ID: I'm sorry, I should have posted my commentary here, not in the separate thread. > We have some issues with suggested "/simple" endpoint.
Despite the need to > scrape the web page, old endpoint allowed us to quickly find latest versions > of the packages hosted on PyPI. We did a single request on IDE startup and > showed outdated installed packages in the settings later. Index "/simple" > however contains only package names and links to the dedicated pages with > their artifacts (not for each of them, though). It means that now we have > to make tons of individual requests to find the latest published version > for each installed package. Isn't it going to load the service even worse? So, yes, we're interested most in the latest version of a package. 2016-07-13 21:57 GMT+03:00 Donald Stufft : > > On Jul 13, 2016, at 2:43 PM, Dmitry Trofimov < > dmitry.trofimov at jetbrains.com> wrote: > > Hi, > > to have information about available packages, PyCharm IDE currently parses > the PyPI index page (https://pypi.python.org/pypi?%3Aaction=index). > As it is going to be deprecated soon, we are looking for a workaround. > > What we need is, making one request, to get the name and the version of > all PyPI packages. Then we cache this information in the IDE ( > https://github.com/JetBrains/intellij-community/blob/7e16c042a19767d5f548c84f88cc5edd5f9d1721/python/src/com/jetbrains/python/packaging/PyPIPackageUtil.java > ). > > > By name and version, do you mean the latest version? > > -- > Donald Stufft > > > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -- Best regards Mikhail Golubev -------------- next part -------------- An HTML attachment was scrubbed...
URL: From qsolo825 at gmail.com Wed Jul 13 16:21:24 2016 From: qsolo825 at gmail.com (Mikhail Golubev) Date: Wed, 13 Jul 2016 23:21:24 +0300 Subject: [Distutils] PyPI index workaround In-Reply-To: <1051DE9F-813F-40E6-AAC3-3A4EA1DF7A18@stufft.io> References: <8114470F-5B75-4FDB-B858-CD746B79C92F@stufft.io> <1051DE9F-813F-40E6-AAC3-3A4EA1DF7A18@stufft.io> Message-ID: Right, sorry, that initial question wasn't clear about that. We need the latest versions only for installed packages. Nonetheless, as you noted, it's still several dozen consecutive requests to "/simple/" for each PyCharm session of every user. Can you handle that? 2016-07-13 22:56 GMT+03:00 Donald Stufft : > > On Jul 13, 2016, at 3:40 PM, Dmitry Trofimov < > dmitry.trofimov at jetbrains.com> wrote: > > Does that mean that PyPI index page will live for a while until the new > API is implemented? > > > Yes, though I'm looking at this right now. > > I do have a question here though. If I understand the dialog, this is to > provide a way for people to upgrade packages they have installed, and to > tell them if there is a newer version or not. So my question here is why do > you need the latest version for *every* package instead of just the ones > you have installed? > > If you narrow it down to just the ones that are installed, then the number > of HTTP requests needed with the current APIs goes down from ~80,000 to > likely <100 or even <50 in most cases. > > -- > Donald Stufft > > > > -- Best regards Mikhail Golubev -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve.dower at python.org Wed Jul 13 16:23:22 2016 From: steve.dower at python.org (Steve Dower) Date: Wed, 13 Jul 2016 13:23:22 -0700 Subject: [Distutils] PyPI index workaround In-Reply-To: References: Message-ID: I'm also interested (for the same support in Visual Studio) though we're unaffected by this change. A batch API to get info for many packages would be great.
Currently we scrape simple and then post JSON queries for individual packages. Cheers, Steve Top-posted from my Windows Phone -----Original Message----- From: "Mikhail Golubev" Sent: 7/13/2016 13:04 To: "Donald Stufft" Cc: "distutils-sig at python.org" Subject: Re: [Distutils] PyPI index workaround I'm sorry, I should have posted my commentary here, not in the separate thread. We have some issues with suggested "/simple" endpoint. Despite the need to scrape the web page, old endpoint allowed us to quickly find latest versions of the packages hosted on PyPI. We did a single request on IDE startup and showed outdated installed packages in the settings later. Index "/simple" however contains only package names and links to the dedicated pages with their artifacts (not for each of them, though). It means that now we have to make tons of individual requests to find the latest published version for each installed package. Isn't it going to load the service even worse? So, yes, we're interested most in the latest version of a package. 2016-07-13 21:57 GMT+03:00 Donald Stufft : On Jul 13, 2016, at 2:43 PM, Dmitry Trofimov wrote: Hi, to have information about available packages, PyCharm IDE currently parses the PyPI index page (https://pypi.python.org/pypi?%3Aaction=index). As it is going to be deprecated soon, we are looking for a workaround. What we need is, making one request, to get the name and the version of all PyPI packages. Then we cache this information in the IDE (https://github.com/JetBrains/intellij-community/blob/7e16c042a19767d5f548c84f88cc5edd5f9d1721/python/src/com/jetbrains/python/packaging/PyPIPackageUtil.java). By name and version, do you mean the latest version? -- Donald Stufft _______________________________________________ Distutils-SIG maillist - Distutils-SIG at python.org https://mail.python.org/mailman/listinfo/distutils-sig -- Best regards Mikhail Golubev -------------- next part -------------- An HTML attachment was scrubbed...
URL: From donald at stufft.io Wed Jul 13 16:46:57 2016 From: donald at stufft.io (Donald Stufft) Date: Wed, 13 Jul 2016 16:46:57 -0400 Subject: [Distutils] PyPI index workaround In-Reply-To: References: <8114470F-5B75-4FDB-B858-CD746B79C92F@stufft.io> <1051DE9F-813F-40E6-AAC3-3A4EA1DF7A18@stufft.io> Message-ID: > On Jul 13, 2016, at 4:21 PM, Mikhail Golubev wrote: > > Right, sorry, that initial question wasn't clear about that. > > We need the latest versions only for installed packages. Nonetheless, as you noted, it's still several dozen consecutive requests to "/simple/" for each PyCharm session of every user. > > Can you handle that? The short answer is yes. The longer answer is that we have Fastly acting as a CDN in front of PyPI, and serving an item out of the cache in Fastly is essentially free for us in terms of resources (obviously Fastly needs to handle that load, but they're well equipped to handle much larger loads than we are). Thus, the more cacheable a URL is (and the longer lived a particular cache item can be), the easier it is for us to scale that URL on PyPI. The URL you're currently using has a few downsides that prevent it from being cached effectively: * The URL is a "UI" URL, so it includes information like the currently logged in user and thus we need to Vary: Cookie, which means it's less likely to be cached at all, since each unique Cookie header adds another response to be cached for that URL, and Fastly will only save ~200 responses per URL before it starts to evict some. * Similarly, since it's a "UI" URL people expect it to update fairly quickly; because legacy PyPI wasn't implemented with long lived caching with purging on updates in mind, it was easier to simply implement it with a short (5 minute IIRC) TTL on the cached object rather than long lived TTLs with purging (as we do on the "API" URLs). 
* Responses that act as collections of projects need to be invalidated anytime something changes that may invalidate that collection. In an API that lists every project and the latest version, that means it needs to be invalidated anytime something releases a new version. Compare that to looking at /simple/ and then either accessing /simple/<project>/ or /pypi/<project>/json (all of which are cached for long periods of time and purged on demand). * None of those are "UI" URLs, so they have long cache times and they do not Vary on Cookie. * For /simple/ we don't list any versions, we only list the projects themselves. This means that we only need to invalidate this page whenever a brand new project is added to PyPI or an existing project is completely deleted. This occurs far less than someone releasing an existing project. * For /simple/ we don't need to do any particularly heavy-duty querying, it's a simple select on an ~80k length table (versus a select on an 80k length table, with a join to a 500k length table) and is fairly quick to render. * For /simple/<project>/ and /pypi/<project>/json these are scoped to an individual project, so they can be cached for a very long time and only invalidated when that particular project releases, not when _any_ project releases. This means that the likelihood we can serve one of these out of cache is VERY high. * For /simple/<project>/ and /pypi/<project>/json our SQL queries are relatively quick because they don't need to operate over the entire table, but only over the records for one single project. Given all of the above, and the fact that listing every project and their latest version is *slow* and resource intensive, yes it's very likely that doing that will be far better for our ability to serve your requests, because the extra requests will almost certainly be able to be served straight from the Fastly caches and never hit our origin servers at all. -- Donald Stufft -------------- next part -------------- An HTML attachment was scrubbed... 
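[To make the recommended pattern concrete, here is a minimal Python sketch of the per-project lookup described above: one small, highly cacheable request per installed package against the /pypi/<project>/json endpoint. The base URL and the {"info": {"version": ...}} payload shape follow PyPI's JSON API; the host and helper names are illustrative, not part of any official client.]

```python
# Sketch: one cache-friendly request per installed package, instead of
# scraping the full index page. Assumes PyPI's /pypi/<project>/json API.
import json
from urllib.request import urlopen

def project_json_url(project, base="https://pypi.org/pypi"):
    """Build the per-project JSON URL (cacheable, purged on release)."""
    return "%s/%s/json" % (base, project)

def latest_version(payload):
    """Extract the latest released version from a JSON API payload."""
    return payload["info"]["version"]

def fetch_latest(project):
    """Fetch the latest version over the network: one request per project."""
    with urlopen(project_json_url(project)) as resp:
        return latest_version(json.load(resp))

if __name__ == "__main__":
    # e.g. fetch_latest("pip") -- each such request is almost always
    # served straight out of the Fastly cache.
    print(project_json_url("pip"))
```

An IDE would loop this over its installed packages only, which is the <100-request pattern suggested earlier in the thread.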
URL: From donald at stufft.io Wed Jul 13 16:54:09 2016 From: donald at stufft.io (Donald Stufft) Date: Wed, 13 Jul 2016 16:54:09 -0400 Subject: [Distutils] PyPI index workaround In-Reply-To: References: <8114470F-5B75-4FDB-B858-CD746B79C92F@stufft.io> <1051DE9F-813F-40E6-AAC3-3A4EA1DF7A18@stufft.io> Message-ID: <75001BE6-DC81-4484-B5AD-4B9C8AF14A02@stufft.io> > On Jul 13, 2016, at 4:21 PM, Mikhail Golubev wrote: > > Can you handle that? Oh, and just to put things in scale, in the past 30 days: * PyPI has served > 3 billion HTTP requests. * PyPI has served > 327TB of bandwidth. * The 95%tile for cache hit vs cache miss is 92%. * We regularly serve >1,000 concurrent requests - https://s.caremad.io/QDTlK0mRj7/ -- Donald Stufft -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve.dower at python.org Wed Jul 13 16:54:48 2016 From: steve.dower at python.org (Steve Dower) Date: Wed, 13 Jul 2016 13:54:48 -0700 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> Message-ID: <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> On 13Jul2016 1252, Glyph Lefkowitz wrote: > The primary thing would be to have a banner on the page and a warning > from `pip install`. Those of us close to the heart of the Python > community already have various ways of reading the tea leaves to know > that things are likely to be unmaintained or bitrotting; the main > purpose of such a feature would be to have an automated way for people > who /don't/ personally know all the prominent package authors and see > them at conferences and meetups all the time to get this information. > For example: nobody should be using PIL, they should be using pillow. 
> Yet there's no way for a new user to figure this out by just looking > at https://pypi.io/project/PIL/ :). > > I think that the adjudication process for stealing a name from an > existing owner is something that still bears discussion, but separately. > Whatever that process is, you'd have to go through it fully after a > package becomes thusly "abandoned", and for the reasons you cite, it > absolutely should not be automated. Perhaps it shouldn't even be the > way to deal with it - maybe the most you should be able to do in this > case is to expand the "this is unmaintained" warning with a pointer to a > different replacement name. I like this. Maybe if a maintainer doesn't trigger the switch/publish anything for a year, a banner appears on the page with a publicly editable (votable?) list of alternative packages - thinking something similar to a reviews system with an "I found this review helpful" button. Possibly such user-contributed content would be valuable anyway, but the "probably abandoned" state just moves it to the top of the page instead of the bottom. Cheers, Steve From glyph at twistedmatrix.com Wed Jul 13 17:56:46 2016 From: glyph at twistedmatrix.com (Glyph Lefkowitz) Date: Wed, 13 Jul 2016 14:56:46 -0700 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> Message-ID: <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> > On Jul 13, 2016, at 1:54 PM, Steve Dower wrote: > > Possibly such user-contributed content would be valuable anyway https://alternativeto.net but for PyPI? :) -glyph -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From steve.dower at python.org Wed Jul 13 21:32:27 2016 From: steve.dower at python.org (Steve Dower) Date: Wed, 13 Jul 2016 18:32:27 -0700 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> Message-ID: <5786EBAB.5060709@python.org> On 13Jul2016 1456, Glyph Lefkowitz wrote: > >> On Jul 13, 2016, at 1:54 PM, Steve Dower > > wrote: >> >> Possibly such user-contributed content would be valuable anyway > > https://alternativeto.net but for PyPI? :) Or just more general reviews/warnings/info. "Doesn't work with IronPython", "Works fine on 3.5 even though it doesn't say so", etc. Restrict it to 140 chars, signed in users, only allow linking to other PyPI packages, let the maintainer delete comments (or mark them as disputed) and I think you'd avoid abuse (or rants/detailed bug reports/etc.). Maybe automatically clear all comments on each new release as well. Doesn't have to be complicated and fancy - just enough that users can help each other when maintainers disappear. Cheers, Steve From qsolo825 at gmail.com Thu Jul 14 05:30:12 2016 From: qsolo825 at gmail.com (=?UTF-8?B?0JzQuNGF0LDQuNC7INCT0L7Qu9GD0LHQtdCy?=) Date: Thu, 14 Jul 2016 12:30:12 +0300 Subject: [Distutils] PyPI index workaround In-Reply-To: <75001BE6-DC81-4484-B5AD-4B9C8AF14A02@stufft.io> References: <8114470F-5B75-4FDB-B858-CD746B79C92F@stufft.io> <1051DE9F-813F-40E6-AAC3-3A4EA1DF7A18@stufft.io> <75001BE6-DC81-4484-B5AD-4B9C8AF14A02@stufft.io> Message-ID: Ok, you convinced me that these extra requests from PyCharm won't cause you any problems. 
Impressive stats, by the way :) We will focus on migrating our packaging-related features to these new endpoints; hopefully, it won't take long. Note, however, that we need to prepare updates for already released versions of PyCharm. We'll let you know as soon as everything is ready. Ernest W. Durbin III suggested changing the User-Agent, so that it's clear which requests come from PyCharm. To me it seems a fair point. A batch API, as mentioned by Steve Dower, would be very welcome, anyway. Also, the "/simple" index is still an HTML page. Honestly, it's a bit cumbersome that this information can be received only by scraping HTML while for everything else there are a JSON REST API and XML-RPC. Is anyone from PyPA attending EuroPython next week? We could discuss these matters further there. 2016-07-13 23:54 GMT+03:00 Donald Stufft : > > On Jul 13, 2016, at 4:21 PM, Mikhail Golubev wrote: > > Can you handle that? > > > > Oh, and just to put things in scale, in the past 30 days: > > * PyPI has served > 3 billion HTTP requests. > * PyPI has served > 327TB of bandwidth. > * The 95%tile for cache hit vs cache miss is 92%. > * We regularly serve >1,000 concurrent requests - > https://s.caremad.io/QDTlK0mRj7/ > > -- > Donald Stufft > > > > -- Best regards Mikhail Golubev -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Thu Jul 14 06:57:13 2016 From: wes.turner at gmail.com (Wes Turner) Date: Thu, 14 Jul 2016 05:57:13 -0500 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> Message-ID: There are types to describe this graph. 
Thing > CreativeWork > SoftwareApplication CreativeWork.comment r: [Comment] http://schema.org/SoftwareApplication http://schema.org/Comment ( #PEP426JSONLD because this is a graph of SoftwareApplication(s); now with TOML metadata ) There could be edge types as well. e.g. what is the relation between PIL/Pillow. - maintainerSuggests - communitySuggests - communitySaysUnmaintained - unaddressedVulns - etc Adding an embedded JS comments widget does/would add some additional maintenance burden (because user-generated content). Authors can specify an email address as structured data; and whatever they consider relevant in the long_description. On Jul 13, 2016 9:33 PM, "Steve Dower" wrote: On 13Jul2016 1456, Glyph Lefkowitz wrote: > > On Jul 13, 2016, at 1:54 PM, Steve Dower > > wrote: >> >> Possibly such user-contributed content would be valuable anyway >> > > https://alternativeto.net but for PyPI? :) > Or just more general reviews/warnings/info. "Doesn't work with IronPython", "Works fine on 3.5 even though it doesn't say so", etc. Restrict it to 140 chars, signed in users, only allow linking to other PyPI packages, let the maintainer delete comments (or mark them as disputed) and I think you'd avoid abuse (or rants/detailed bug reports/etc.). Maybe automatically clear all comments on each new release as well. Doesn't have to be complicated and fancy - just enough that users can help each other when maintainers disappear. Cheers, Steve _______________________________________________ Distutils-SIG maillist - Distutils-SIG at python.org https://mail.python.org/mailman/listinfo/distutils-sig -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Thu Jul 14 08:38:52 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 14 Jul 2016 22:38:52 +1000 Subject: [Distutils] Deprecation of the endpoint "/pypi?%3Aaction=index" In-Reply-To: References: Message-ID: On 14 July 2016 at 05:05, Mikhail Golubev wrote: > Hi guys. 
I'd like to clarify Dmitry's question a bit. > > We have some issues with suggested "/simple" endpoint. Despite the need to > scrap the web page, old endpoint allowed us to quickly find latest versions > of the packages hosted on PyPI. We did a single request on IDE startup and > showed outdated installed packages in the settings later. Index "/simple" > however contains only package names and links to the dedicated pages with > their artifacts (not for each of them, though). It means that now we have to > make tons of individual requests to find the latest published version for > each installed package. Isn't it going to load the service even worse? Donald addressed this in the other thread, but summarising it quickly here (mainly for the benefit of the list archives): - the "/simple" API is really cache friendly, and hence almost always gets served out of the Fastly cache instead of hitting the main app server - the main index endpoint isn't cache friendly at all, so many (most?) requests to it will hit the main app server - that main app server is the weak link in the current setup that's responsible for the 503 errors folks still sometimes see - rendering the index page can be one of the culprits contributing to 503's So if folks are using pypi.python.org to get this info, we'd prefer multiple requests to the /simple API over a single request to the index page. However, Donald's also going to look at providing a suitable batch query endpoint on https://pypi.io/ that folks can start using, even before Warehouse officially launches (similar to the way we're already recommending Warehouse as the preferred upload server). Cheers, Nick. 
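[Since the /simple index Nick describes is plain HTML containing only one anchor per project, extracting the names is straightforward. A rough sketch, fed a canned two-entry snippet rather than a live response since the exact markup can vary:]

```python
# Sketch: pull project names out of a /simple index page, which is just
# a list of <a> elements with no version information.
from html.parser import HTMLParser

class SimpleIndexParser(HTMLParser):
    """Collect the text content of every <a> tag on a /simple index page."""
    def __init__(self):
        super().__init__()
        self.projects = []
        self._in_anchor = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_anchor = True

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_anchor = False

    def handle_data(self, data):
        if self._in_anchor:
            self.projects.append(data.strip())

# Canned sample standing in for a live /simple response.
SAMPLE = ('<html><body><a href="/simple/pip/">pip</a>\n'
          '<a href="/simple/setuptools/">setuptools</a></body></html>')
parser = SimpleIndexParser()
parser.feed(SAMPLE)
print(parser.projects)  # ['pip', 'setuptools']
```

Because the page carries only names, a client still needs the per-project endpoints to learn versions, which is exactly the cache-friendly split described above.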
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Thu Jul 14 08:43:47 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 14 Jul 2016 22:43:47 +1000 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: Message-ID: On 14 July 2016 at 05:04, Paul Moore wrote: > On 13 July 2016 at 19:08, Dima Tisnek wrote: >> I'd rather see something similar to Linux distributions where there's >> a curated repository "core" and a few semi-official, like "extra" and >> "community," and for some, "testing." >> A name foobar resolves to core/foobar- if that exists, and if >> not some subset of other repositories is used. >> This way, an outdated package can be moved to another repo without >> breaking install base. >> >> In fact, curation without namespaces will already be pretty good. > > Who would take on the work of maintaining a curated repository? To do > anything like a reasonable job would be a *lot* of work. Work that Linux distros already do, hence ideas like https://fedoraproject.org/wiki/Env_and_Stacks/Projects/LanguageSpecificRepositories and https://fedoraproject.org/wiki/Env_and_Stacks/Projects/PackageReviewProcessRedesign Convincing Linux distros to change their review processes to be less precious about package formats is a long political battle to fight, but it's more sustainable in the long run than trying to build *new* curation focused communities for each different language ecosystem that pops up :) Cheers, Nick. P.S. 
The conda community would be another example of a collaborative project curation effort, albeit one closer to the traditional Linux distro model (where review is accompanied by a change in packaging format) -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From robin at reportlab.com Thu Jul 14 09:37:16 2016 From: robin at reportlab.com (Robin Becker) Date: Thu, 14 Jul 2016 14:37:16 +0100 Subject: [Distutils] can someone explain why I have started to get these pip failures Message-ID: > (myenv) rptlab at app0:~/myenv > $ python > Python 2.7.11+ (default, Apr 17 2016, 14:00:29) > [GCC 5.3.1 20160413] on linux2 > Type "help", "copyright", "credits" or "license" for more information. >>>> > (myenv) rptlab at app0:~/myenv > $ pip --version > pip 8.1.2 from /home/rptlab/myenv/local/lib/python2.7/site-packages (python 2.7) > (myenv) rptlab at app0:~/myenv > pip install -r requirements.txt > Collecting MySQL-python<1.3,>=1.2.5 (from -r requirements.txt (line 1)) > Downloading MySQL-python-1.2.5.zip (108kB) > 100% |################################| 112kB 2.1MB/s > Could not import setuptools which is required to install from a source distribution. 
> Traceback (most recent call last): > File "/home/rptlab/myenv/local/lib/python2.7/site-packages/pip/req/req_install.py", line 372, in setup_py > import setuptools # noqa > File "/home/rptlab/myenv/local/lib/python2.7/site-packages/setuptools/__init__.py", line 11, in <module> > from setuptools.extern.six.moves import filterfalse, map > File "/home/rptlab/myenv/local/lib/python2.7/site-packages/setuptools/extern/__init__.py", line 1, in <module> > from pkg_resources.extern import VendorImporter > ImportError: No module named pkg_resources.extern > (myenv) rptlab at app0:~/myenv > $ pip install setuptools -U > Requirement already up-to-date: setuptools in ./lib/python2.7/site-packages This is on an Ubuntu 16.04 LTS system > $ uname -a > Linux app0 4.4.0-28-generic #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux -- Robin Becker From graffatcolmingov at gmail.com Thu Jul 14 10:41:51 2016 From: graffatcolmingov at gmail.com (Ian Cordasco) Date: Thu, 14 Jul 2016 09:41:51 -0500 Subject: [Distutils] can someone explain why I have started to get these pip failures In-Reply-To: References: Message-ID: Try: pip install --force-reinstall setuptools -U On Thu, Jul 14, 2016 at 8:37 AM, Robin Becker wrote: >> (myenv) rptlab at app0:~/myenv >> $ python >> Python 2.7.11+ (default, Apr 17 2016, 14:00:29) >> [GCC 5.3.1 20160413] on linux2 >> Type "help", "copyright", "credits" or "license" for more information. 
>>>>> >>>>> > >> (myenv) rptlab at app0:~/myenv >> >> $ pip --version >> pip 8.1.2 from /home/rptlab/myenv/local/lib/python2.7/site-packages >> (python 2.7) > > > >> (myenv) rptlab at app0:~/myenv >> >> pip install -r requirements.txt >> Collecting MySQL-python<1.3,>=1.2.5 (from -r requirements.txt (line 1)) >> Downloading MySQL-python-1.2.5.zip (108kB) >> 100% |################################| 112kB 2.1MB/s >> Could not import setuptools which is required to install from a source >> distribution. >> Traceback (most recent call last): >> File "/home/rptlab/> (myenv) >> rptlab at app0:~/myenv/local/lib/python2.7/site-packages/pip/req/req_install.py", >> line 372, in setup_py >> import setuptools # noqa >> File "/home/rptlab/> (myenv) >> rptlab at app0:~/myenv/local/lib/python2.7/site-packages/setuptools/__init__.py", >> line 11, in >> from setuptools.extern.six.moves import filterfalse, map >> File "/home/rptlab/> (myenv) >> rptlab at app0:~/myenv/local/lib/python2.7/site-packages/setuptools/extern/__init__.py", >> line 1, in >> from pkg_resources.extern import VendorImporter >> ImportError: No module named pkg_resources.extern > >> (myenv) rptlab at app0:~/myenv >> >> $ pip install setuptools -U >> Requirement already up-to-date: setuptools in >> ./lib/python2.7/site-packages > > > > This is on a unbuntu 16.04lts system > >> $ uname -a >> Linux app0 4.4.0-28-generic #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 >> x86_64 x86_64 x86_64 GNU/Linux > > > -- > Robin Becker > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig From donald at stufft.io Thu Jul 14 10:44:19 2016 From: donald at stufft.io (Donald Stufft) Date: Thu, 14 Jul 2016 10:44:19 -0400 Subject: [Distutils] PyPI index workaround In-Reply-To: References: <8114470F-5B75-4FDB-B858-CD746B79C92F@stufft.io> <1051DE9F-813F-40E6-AAC3-3A4EA1DF7A18@stufft.io> <75001BE6-DC81-4484-B5AD-4B9C8AF14A02@stufft.io> 
Message-ID: <59CD6D63-683F-454F-9E85-C0084757A48F@stufft.io> > On Jul 14, 2016, at 5:30 AM, Mikhail Golubev wrote: > > Ok, you convinced me that these extra requests from PyCharm won't cause you any problems. Impressive stats, by the way :) > > We will focus on migrating our packaging-related features to these new endpoints; hopefully, it won't take long. Note, however, that we need to prepare updates for already released versions of PyCharm. We'll let you know as soon as everything is ready. > > Ernest W. Durbin III suggested changing the User-Agent, so that it's clear which requests come from PyCharm. To me it seems a fair point. > > A batch API, as mentioned by Steve Dower, would be very welcome, anyway. Also, the "/simple" index is still an HTML page. Honestly, it's a bit cumbersome that this information can be received only by scraping HTML while for everything else there are a JSON REST API and XML-RPC. Yea, I plan on a new "next gen" API in Warehouse at some point that will be much cleaner overall and not require multiple different formats to use :). For the record, XML-RPC should be avoided where possible as well; we can't cache that in the CDN either (because it's a POST request to the same URL for all routes, and the CDN can't inspect the body of a POST request to determine the cache key). > > Is anyone from PyPA attending EuroPython next week? We could discuss these matters further there. I'm not. I'm not sure if anyone else is. -- Donald Stufft -------------- next part -------------- An HTML attachment was scrubbed... 
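[The cache-key problem Donald mentions is easy to see from the wire format: an XML-RPC call puts the method name inside the POST body, so every call targets the same URL and a CDN has nothing to key a cache entry on. A small illustration using the standard library; the method name package_releases is just an example of a PyPI XML-RPC method.]

```python
# Sketch: serialize an XML-RPC call and note that the method being
# invoked appears only inside the XML request body, never in the URL --
# which is why a CDN cannot cache XML-RPC responses per method.
import xmlrpc.client

body = xmlrpc.client.dumps(("requests",), methodname="package_releases")
print(body)  # <methodCall><methodName>package_releases</methodName>...
```

Every such call is POSTed to the same endpoint, so from the CDN's point of view all XML-RPC traffic looks identical.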
URL: From robin at reportlab.com Thu Jul 14 11:01:23 2016 From: robin at reportlab.com (Robin Becker) Date: Thu, 14 Jul 2016 16:01:23 +0100 Subject: [Distutils] can someone explain why I have started to get these pip failures In-Reply-To: References: Message-ID: On 14/07/2016 15:41, Ian Cordasco wrote: > Try: > > pip install --force-reinstall setuptools -U > I didn't do the force-reinstall and for some reason when I cleaned both ~/.cache/pip and ~/.pip the pip install -r requirements.txt did work. I have tried various solutions proposed in the past eg sudo apt-get install --reinstall python-pkg-resources but nothing seems to work. I did the cache cleanups in desperation mode. I did try pip install -U setuptools, but it says it is up to date. If this might be a case of the underlying python changing on the server I have turned off automatic security updates. I would like to try and understand why this happens as then I might have some way of fixing it. > On Thu, Jul 14, 2016 at 8:37 AM, Robin Becker wrote: >>> (myenv) rptlab at app0:~/myenv ......... >> Robin Becker -- Robin Becker From fungi at yuggoth.org Thu Jul 14 11:14:21 2016 From: fungi at yuggoth.org (Jeremy Stanley) Date: Thu, 14 Jul 2016 15:14:21 +0000 Subject: [Distutils] can someone explain why I have started to get these pip failures In-Reply-To: References: Message-ID: <20160714151421.GN2458@yuggoth.org> On 2016-07-14 16:01:23 +0100 (+0100), Robin Becker wrote: > On 14/07/2016 15:41, Ian Cordasco wrote: > >Try: > > > >pip install --force-reinstall setuptools -U > > > > I didn't do the force-reinstall and for some reason when I cleaned both > ~/.cache/pip and ~/.pip the pip install -r requirements.txt did work. > > I have tried various solutions proposed in the past eg > > sudo apt-get install --reinstall python-pkg-resources > > but nothing seems to work. > > I did the cache cleanups in desperation mode. > > I did try pip install -U setuptools, but it says it is up to date. 
> > If this might be a case of the underlying python changing on the server I > have turned off automatic security updates. > > I would like to try and understand this happens as then I might have some > wya of fixing it. You really should avoid mixing pip-installed packages in the system context with distro-provided Python libraries, otherwise you will run into these sorts of issues constantly. I help maintain some very, very large test infrastructure for Python-based tools and libraries: in scenarios where we use pip to install anything system-wide we first make sure to scrub every last distro-provided Python library from the system along with any other Python-based applications that might depend on them, and only then we bootstrap pip completely independent of distro packaging (downloading and running get-pip.py). Also whenever possible, we instead rely on pip install within virtualenvs _without_ --system-site-packages, so that there's no risk of interaction with any Python libraries that might somehow get subsequently installed on the system. -- Jeremy Stanley From robin at reportlab.com Thu Jul 14 12:23:24 2016 From: robin at reportlab.com (Robin Becker) Date: Thu, 14 Jul 2016 17:23:24 +0100 Subject: [Distutils] can someone explain why I have started to get these pip failures In-Reply-To: <20160714151421.GN2458@yuggoth.org> References: <20160714151421.GN2458@yuggoth.org> Message-ID: <09561887-c292-2e50-a4e8-1881259237ff@chamonix.reportlab.co.uk> On 14/07/2016 16:14, Jeremy Stanley wrote: > On 2016-07-14 16:01:23 +0100 (+0100), Robin Becker wrote: >> On 14/07/2016 15:41, Ian Cordasco wrote: >>> Try: ............. >> >> I would like to try and understand this happens as then I might have some >> wya of fixing it. > > You really should avoid mixing pip-installed packages in the system > context with distro-provided Python libraries, otherwise you will > run into these sorts of issues constantly. I help maintain some not really sure why this is an issue. 
All of my problems are in virtual environments and I never use the --system-site-packages flag. I have never used pip to install anything system wide (so far as I know). Step 0 in after setting up an environment is pip install -U pip setuptools Of course you are right in that the python is a distro provided one; and also the pip and virtualenv etc etc. > very, very large test infrastructure for Python-based tools and > libraries: in scenarios where we use pip to install anything > system-wide we first make sure to scrub every last distro-provided > Python library from the system along with any other Python-based > applications that might depend on them, and only then we bootstrap > pip completely independent of distro packaging (downloading and > running get-pip.py). Also whenever possible, we instead rely on pip > install within virtualenvs _without_ --system-site-packages, so that > there's no risk of interaction with any Python libraries that might > somehow get subsequently installed on the system. > I used always to build python from source in older ubuntus, but that was because we wanted the latest python 2.x etc etc. Using a local copy prevents the os from smashing stuff, but means more work whenever a serious upgrade is required. When I run python -mvirtualenv I end up with an environment that has pip and setuptools already. Are you saying I should do /usr/bin/python -mvirtualenv --no-pip --no-setuptools myenv myenv/bin/python get_pip.py and then proceed from there? Or is it foolish to rely on the system python at all? I haven't seen this problem in ubuntu 14.04, but that may be just luck. I certainly notice some new behaviour ie the system pip seems to want to assert --user whereas it used to fail for lack of rights in installing into /usr/lib/python.... 
-- Robin Becker From p.f.moore at gmail.com Thu Jul 14 12:39:04 2016 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 14 Jul 2016 17:39:04 +0100 Subject: [Distutils] can someone explain why I have started to get these pip failures In-Reply-To: <09561887-c292-2e50-a4e8-1881259237ff@chamonix.reportlab.co.uk> References: <20160714151421.GN2458@yuggoth.org> <09561887-c292-2e50-a4e8-1881259237ff@chamonix.reportlab.co.uk> Message-ID: On 14 July 2016 at 17:23, Robin Becker wrote: > I used always to build python from source in older ubuntus, but that was > because we wanted the latest python 2.x etc etc. Using a local copy prevents > the os from smashing stuff, but means more work whenever a serious upgrade > is required. > > When I run python -mvirtualenv I end up with an environment that has pip and > setuptools already. Are you saying I should do > > /usr/bin/python -mvirtualenv --no-pip --no-setuptools myenv > myenv/bin/python get_pip.py > > and then proceed from there? Or is it foolish to rely on the system python > at all? > > I haven't seen this problem in ubuntu 14.04, but that may be just luck. > > I certainly notice some new behaviour ie the system pip seems to want to > assert --user whereas it used to fail for lack of rights in installing into > /usr/lib/python.... Yes, apparently Ubuntu have patched pip to make --user the default. I've seen some bug reports caused by this patch, so it's possible that it's what is causing you problems here. Unfortunately, as that is an Ubuntu patch you'd need to report it to them. Paul From dholth at gmail.com Thu Jul 14 12:45:52 2016 From: dholth at gmail.com (Daniel Holth) Date: Thu, 14 Jul 2016 16:45:52 +0000 Subject: [Distutils] can someone explain why I have started to get these pip failures In-Reply-To: References: <20160714151421.GN2458@yuggoth.org> <09561887-c292-2e50-a4e8-1881259237ff@chamonix.reportlab.co.uk> Message-ID: It's from Debian. They had time to break pip, but they don't have time to fix it again. 
On Thu, Jul 14, 2016 at 12:39 PM Paul Moore wrote: > On 14 July 2016 at 17:23, Robin Becker wrote: > > I used always to build python from source in older ubuntus, but that was > > because we wanted the latest python 2.x etc etc. Using a local copy > prevents > > the os from smashing stuff, but means more work whenever a serious > upgrade > > is required. > > > > When I run python -mvirtualenv I end up with an environment that has pip > and > > setuptools already. Are you saying I should do > > > > /usr/bin/python -mvirtualenv --no-pip --no-setuptools myenv > > myenv/bin/python get_pip.py > > > > and then proceed from there? Or is it foolish to rely on the system > python > > at all? > > > > I haven't seen this problem in ubuntu 14.04, but that may be just luck. > > > > I certainly notice some new behaviour ie the system pip seems to want to > > assert --user whereas it used to fail for lack of rights in installing > into > > /usr/lib/python.... > > Yes, apparently Ubuntu have patched pip to make --user the default. > I've seen some bug reports caused by this patch, so it's possible that > it's what is causing you problems here. Unfortunately, as that is an > Ubuntu patch you'd need to report it to them. > > Paul > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From fungi at yuggoth.org Thu Jul 14 13:30:33 2016 From: fungi at yuggoth.org (Jeremy Stanley) Date: Thu, 14 Jul 2016 17:30:33 +0000 Subject: [Distutils] can someone explain why I have started to get these pip failures In-Reply-To: <09561887-c292-2e50-a4e8-1881259237ff@chamonix.reportlab.co.uk> References: <20160714151421.GN2458@yuggoth.org> <09561887-c292-2e50-a4e8-1881259237ff@chamonix.reportlab.co.uk> Message-ID: <20160714173033.GP2458@yuggoth.org> On 2016-07-14 17:23:24 +0100 (+0100), Robin Becker wrote: > not really sure why this is an issue. All of my problems are in virtual > environments and I never use the --system-site-packages flag. > > I have never used pip to install anything system wide (so far as I know). Ahh, whoops, after rereading back through some of your earlier messages in the thread I see this is indeed all in the scope of a virtualenv. > Step 0 in after setting up an environment is pip install -U pip setuptools > > Of course you are right in that the python is a distro provided one; and > also the pip and virtualenv etc etc. [...] I've noticed in the past that for some reason the system pkg_resources can end up getting used by setuptools (on Debian/Ubuntu it's unvendored and split into a separate python-pkg-resources deb). I haven't had an opportunity to track it down. You might try myenv/bin/python -c 'import sys;print sys.path' and then check the results in order to see which the first one is to provide pkg_resources. Hunting around a bit, this looks similar to https://github.com/pypa/setuptools/issues/497 so see if any of the suggestions mentioned there help at all. > When I run python -mvirtualenv I end up with an environment that has pip and > setuptools already. Are you saying I should do > > /usr/bin/python -mvirtualenv --no-pip --no-setuptools myenv > myenv/bin/python get_pip.py [...] 
No, but you might try bootstrapping a newer version of virtualenv into its own virtualenv, like:

virtualenv foo
foo/bin/pip install -U pip setuptools virtualenv
foo/bin/virtualenv myenv

Doing that on some systems where I'm forced to start from distro-provided pip/setuptools/virtualenv has worked out consistently well. -- Jeremy Stanley From daniel at ddbeck.com Thu Jul 14 09:19:44 2016 From: daniel at ddbeck.com (Daniel D. Beck) Date: Thu, 14 Jul 2016 14:19:44 +0100 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> Message-ID: On Thu, Jul 14, 2016 at 11:57 AM, Wes Turner wrote: > Adding an embedded JS comments widget does/would add some additional > maintenance burden (because user-generated content). Free-form, user-generated content on PyPI would become a pathway for harassment and abuse. Introducing user-generated content on PyPI would necessarily put an emotional burden on package maintainers in addition to the maintenance burden (unless PyPI moderators are going to screen content before maintainers and users see it - given the dearth of resources for PyPI as it is, this strikes me as exceedingly unlikely). - Daniel -------------- next part -------------- An HTML attachment was scrubbed... 
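Picking the pkg_resources thread back up: Jeremy's suggestion earlier to inspect sys.path can be made a bit more direct by asking the interpreter which file it would actually import. A stdlib-only sketch (Python 3; the script name check.py is just for illustration):

```python
# Report which location on sys.path supplies pkg_resources, to spot a
# distro-provided copy shadowing the one a virtualenv expects.
# Run it with the virtualenv's interpreter, e.g.: myenv/bin/python check.py
import importlib.util

def module_origin(name):
    # Return the file that `import name` would load, or None if not found.
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

if __name__ == "__main__":
    print("pkg_resources comes from:", module_origin("pkg_resources"))
```

If the printed path points into a system dist-packages directory rather than into the virtualenv, the system copy is winning the import race.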
URL: From thomas at kluyver.me.uk Thu Jul 14 14:02:52 2016 From: thomas at kluyver.me.uk (Thomas Kluyver) Date: Thu, 14 Jul 2016 19:02:52 +0100 Subject: [Distutils] Add data-requires-python attribute to download links in simple repository API Message-ID: <1468519372.3694908.666461825.5C356355@webmail.messagingengine.com> In a discussion about how to allow pip to select a package version compatible with the target Python version, Donald suggested adding a data-requires-python attribute to links in the simple repository API, which pip uses to find candidate downloads. ... This would expose the Requires-Python metadata field already specified in PEP 345. There is a separate PR for setuptools to allow specifying this value. I have opened a PR to add this to PEP 503, which defines the simple repository API, and Donald asked that I post about it here for comments: https://github.com/python/peps/pull/56 Thanks, Thomas From dholth at gmail.com Thu Jul 14 17:30:51 2016 From: dholth at gmail.com (Daniel Holth) Date: Thu, 14 Jul 2016 21:30:51 +0000 Subject: [Distutils] Add data-requires-python attribute to download links in simple repository API In-Reply-To: <1468519372.3694908.666461825.5C356355@webmail.messagingengine.com> References: <1468519372.3694908.666461825.5C356355@webmail.messagingengine.com> Message-ID: Lgtm On Thu, Jul 14, 2016, 16:49 Thomas Kluyver wrote: > In a discussion about how to allow pip to select a package version > compatible with the target Python version, Donald suggested adding a > data-requires-python attribute to links in the simple repository API, > which pip uses to find candidate downloads. > > ... > > This would expose the Requires-Python metadata field already specified > in PEP 345. There is a separate PR for setuptools to allow specifying > this value. 
> > I have opened a PR to add this to PEP 503, which defines the simple > repository API, and Donald asked that I post about it here for comments: > > https://github.com/python/peps/pull/56 > > Thanks, > Thomas > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve.dower at python.org Thu Jul 14 18:51:20 2016 From: steve.dower at python.org (Steve Dower) Date: Thu, 14 Jul 2016 15:51:20 -0700 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> Message-ID: <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> On 14Jul2016 0619, Daniel D. Beck wrote: > Free-form, user-generated content on PyPI would become a pathway for > harassment and abuse. Introducing user-generated content on PyPI would > necessarily put an emotional burden on package maintainers in addition > to the maintenance burden (unless PyPI moderators are going to screen > content before maintainers and users see it - given the dearth of > resources for PyPI as it is, this strikes me as exceedingly unlikely). 
This is why I listed a set of restrictions to help prevent that: * 140 chars (flexible, but short enough to prevent rants) * users must be logged in * no external links * maintainers can delete/dispute comments * clear comments on each new release * one comment per user per package (implied, but I didn't explicitly call it out in my previous email) Do you really think this will be worse than the current state, where abusers *only* have access Twitter, github, reddit and email to harass package maintainers? Assuming harassment is not going to be a problem, is there value in letting people add comments directly on the page where users seem to keep ending up? Cheers, Steve From contact at ionelmc.ro Thu Jul 14 19:37:56 2016 From: contact at ionelmc.ro (=?UTF-8?Q?Ionel_Cristian_M=C4=83rie=C8=99?=) Date: Fri, 15 Jul 2016 02:37:56 +0300 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> Message-ID: On Fri, Jul 15, 2016 at 1:51 AM, Steve Dower wrote: > This is why I listed a set of restrictions to help prevent that: > > * 140 chars (flexible, but short enough to prevent rants) > Did you mean to write "provoke" instead of "prevent"? If we can learn one thing from Twitter it's that such a limit favors short and brutish comments over the more nuanced and thoughtful ones - which take way more character space, of course. I don't get what all this fuss about comments on PyPI is about. Such a feature seems unnecessary. There are plenty of ways to assess how well maintained a package is. 
If a package maintainer wants comments or feedback there's the url/long_description fields. Thanks, -- Ionel Cristian Mărieș, http://blog.ionelmc.ro -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Thu Jul 14 20:05:30 2016 From: donald at stufft.io (Donald Stufft) Date: Thu, 14 Jul 2016 20:05:30 -0400 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> Message-ID: <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> > On Jul 14, 2016, at 6:51 PM, Steve Dower wrote: [...] 
> Assuming harassment is not going to be a problem, is there value in letting people add comments directly on the page where users seem to keep ending up? > I don't believe you can assume that harassment is not going to be a problem. There's a fundamental power dynamic here, where publishing your project to PyPI is not *entirely* optional. It's optional in the sense that nobody is going to force you to publish your project there, but it's hard to interact fully with the Python ecosystem as a whole for your project if you don't at least add an entry for it there. Given that we have this dynamic, we need to be particularly careful how the features we add can be used against people, particularly against the most vulnerable people in our community. We sadly live in a world where our industry is incredibly toxic to, well basically everyone but white guys and are actively hostile towards efforts to seeing a community become more inclusive. These are people who will regularly create multiple twitter accounts in order to spam harassment at people (in 140 characters) to get around cases where the person has blocked them. These are people who will flood comments on GitHub issue trackers for projects they don't even use to bitch about someone changing some pronouns to be more inclusive. Consider that a rude comment can completely crush someone's motivation to learn Python, or to maintain a package. It can make our community seem all that more hostile and I don't think the vast majority of comments are going to actually be very useful. I suspect they will largely be used as yet another support venue for random users who are confused (and deleting them doesn't help those users either). We had a comments and review system years ago (before my time TBH) and the backlash against it was so great that it was a major point of contention on catalog-sig where Package authors wanted it to be gotten rid of and the maintainers at the time pushing back to keep it. 
We (obviously) eventually got rid of it, and I think that is pretty indicative of the idea in general. Any sort of user created content requires us, the people running PyPI, to moderate to some degree. We have to do it now with people who create projects with vulgar or offensive names and I don't believe that we have the man power available to us to moderate the comments of a much larger feature that is going to incentivize people to make negative comments (and let's be real, 95% of the comments are going to be negative, people rarely reach out to say they're happy but they're always ready to complain). This is a lot of words to say that I would be very against this kind of feature on PyPI. I am not *entirely* against some sort of automated marker for possibly unmaintained packages, but even that I'm sketchy on. Allowing people to poop their own content onto project pages for a project they don't own is just not tenable I think. -- Donald Stufft From steve.dower at python.org Thu Jul 14 20:19:07 2016 From: steve.dower at python.org (Steve Dower) Date: Thu, 14 Jul 2016 17:19:07 -0700 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> Message-ID: Sure, if it's been tried before and people couldn't control themselves back then (before my time too, and the internet wasn't as blatantly toxic six+ years ago as it is now) then that's reason enough not to try again. 
I'm still keen to find a way to redirect people to useful forks or alternative packages that doesn't require thousands of mentions at conferences for all time ala PIL. Top-posted from my Windows Phone -----Original Message----- From: "Donald Stufft" Sent: 7/14/2016 17:06 To: "Steve Dower" Cc: "Daniel D. Beck" ; "distutils-sig" Subject: Re: [Distutils] Outdated packages on pypi [...] -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Thu Jul 14 20:25:11 2016 From: donald at stufft.io (Donald Stufft) Date: Thu, 14 Jul 2016 20:25:11 -0400 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> Message-ID: > On Jul 14, 2016, at 8:19 PM, Steve Dower wrote: > > I'm still keen to find a way to redirect people to useful forks or alternative packages that doesn't require thousands of mentions at conferences for all time ala PIL. I'm not opposed to this but we'll want to make sure we're careful about how we do it. PIL is an easy example where the maintainer is gone and there is a community fork of it. 
But I struggle to come up with very many more examples of this where there is something that is: - Popular enough that enough people are tripping over it to make it worth it. - There is a clear successor (or successors). Off the top of my head I can only really think of PIL, and *maybe* suds. Unless there's a lot of these maybe all we really need is a policy for when administrators can/will edit the page to direct people towards a different project or a way to add an admin message directing people to another project. -- Donald Stufft -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve.dower at python.org Thu Jul 14 21:21:47 2016 From: steve.dower at python.org (Steve Dower) Date: Thu, 14 Jul 2016 18:21:47 -0700 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> Message-ID: I forget the exact names but there's a range of SQL Server packages that also fit in here. Perhaps I get to hear more complaints about those because of where I work :) But you're right, it may be a small enough problem to handle it that way. Top-posted from my Windows Phone -----Original Message----- From: "Donald Stufft" Sent: 7/14/2016 17:25 To: "Steve Dower" Cc: "Daniel D. Beck" ; "distutils-sig" Subject: Re: [Distutils] Outdated packages on pypi On Jul 14, 2016, at 8:19 PM, Steve Dower wrote: I'm still keen to find a way to redirect people to useful forks or alternative packages that doesn't require thousands of mentions at conferences for all time ala PIL. 
[...] -------------- next part -------------- An HTML attachment was scrubbed... URL: From mbacchi at gmail.com Thu Jul 14 19:45:26 2016 From: mbacchi at gmail.com (Matt Bacchi) Date: Thu, 14 Jul 2016 19:45:26 -0400 Subject: [Distutils] PyPI index workaround Message-ID: OT: I hope you're going to provide a setting to allow the user to disable this unnecessary and bandwidth intensive 'feature'? -Matt From: "?????? ???????" To: Donald Stufft Cc: Dmitry Trofimov , distutils-sig at python.org Date: Wed, 13 Jul 2016 23:21:24 +0300 Subject: Re: [Distutils] PyPI index workaround Right, sorry, that initial question wasn't clear about that. We need the latest versions only for installed packages. Nonetheless, as you noted, it's still several dozens consecutive requests to "/simple/" for each PyCharm session of every user. Can you handle that? -------------- next part -------------- An HTML attachment was scrubbed... 
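As an aside on the workaround itself: latest-version lookups don't have to scrape /simple/ pages, since PyPI also serves per-project JSON metadata. A hedged sketch (the /pypi/&lt;name&gt;/json URL shape is PyPI's JSON API; the opener parameter is only a test seam, not part of any real client):

```python
import json
from urllib.request import urlopen

PYPI_JSON_URL = "https://pypi.org/pypi/{name}/json"

def latest_version(name, opener=urlopen):
    # Fetch PyPI's JSON metadata for `name` and return its latest version.
    # `opener` is injectable so the function can be exercised offline.
    with opener(PYPI_JSON_URL.format(name=name)) as resp:
        return json.load(resp)["info"]["version"]
```

A client like PyCharm would still issue one request per installed package, but the JSON responses are small and cacheable, which is roughly the trade-off being discussed above.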
URL: From thomas at kluyver.me.uk Fri Jul 15 09:59:39 2016 From: thomas at kluyver.me.uk (Thomas Kluyver) Date: Fri, 15 Jul 2016 14:59:39 +0100 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> Message-ID: <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> On Fri, Jul 15, 2016, at 01:25 AM, Donald Stufft wrote: > Off the top of my head I can only really think of PIL, and *maybe* > suds. Unless there's a lot of these maybe all we really need is a > policy for when administrators can/will edit the page to direct people > towards a different project or a way to add an admin message directing > people to another project. Proposal: let's put some such manual intervention policy in place for now. Apply it for PIL to point to Pillow, and query the active suds forks to see if there's a generally agreed successor. If this works well, great! If the admins are flooded with 'successor requests', then we can come back to question of an automated mechanism. If there are too many abandoned packages with competing successors, that's a trickier problem to solve, but at least we'd be considering it with more information. As further examples: pydot, pexpect and python-modernize have all been unmaintained, leading to forks springing up. In all three cases, some of the forkers eventually coordinated to contact the original maintainer, get upload rights, and make new releases with the original name. It would certainly have been nice if that could have happened sooner in each case, but I doubt that any technical fix would have made a big difference. 
Thomas -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Sat Jul 16 03:35:46 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 16 Jul 2016 17:35:46 +1000 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: On 15 July 2016 at 23:59, Thomas Kluyver wrote: > On Fri, Jul 15, 2016, at 01:25 AM, Donald Stufft wrote: > > Off the top of my head I can only really think of PIL, and *maybe* suds. > Unless there's a lot of these maybe all we really need is a policy for when > administrators can/will edit the page to direct people towards a different > project or a way to add an admin message directing > people to another project. > > Proposal: let's put some such manual intervention policy in place for now. > Apply it for PIL to point to Pillow, and query the active suds forks to see > if there's a generally agreed successor. > > If this works well, great! If the admins are flooded with 'successor > requests', then we can come back to question of an automated mechanism. If > there are too many abandoned packages with competing successors, that's a > trickier problem to solve, but at least we'd be considering it with more > information. +1, although I'd propose decoupling the policy aspect ("Project X has been declared unmaintained, with Project Y as its official successor") from the implementation aspect of how that policy is applied in PyPI and pip. 
That way we wouldn't be adding to the existing workload of the PyPI admins - their involvement would just be implementing the collective policy decisions of the PyPA, rather than being directly tasked with making those policy decisions themselves. For example, suppose we had a "Replacement Packages" page on packaging.python.org, that documented cases like PIL -> Pillow, where: - a de facto community standard package has become unmaintained - attempts to reach the existing maintainers to transfer ownership have failed - a de facto replacement package has emerged - the overall newcomer experience for the Python ecosystem is being harmed by legacy documentation that still recommends the old de facto standard Adding new entries to that page would then require filing an issue at https://github.com/pypa/python-packaging-user-guide/issues/ to establish: - the package being replaced is a de facto community standard - the package being replaced is important to the user experience of newcomers to the Python ecosystem - the package being replaced has become unmaintained - a newer community fork has gained sufficient traction to be deemed a de facto successor - the existing maintainer has been contacted, and is unresponsive to requests to accept help with maintenance If *all* of those points are credibly established, *then* the package replacement would be added to the "Replacement Packages" list on packaging.python.org. How that list was utilised in PyPI and pip, as well as in other package introspection tools (e.g. IDEs, VersionEye), would then be the decision of the designers of those tools. > As further examples: pydot, pexpect and python-modernize have all been > unmaintained, leading to forks springing up. In all three cases, some of the > forkers eventually coordinated to contact the original maintainer, get > upload rights, and make new releases with the original name. 
It would > certainly have been nice if that could have happened sooner in each case, > but I doubt that any technical fix would have made a big difference. The PyCA folks obtaining maintenance access to PyOpenSSL would be another example of this being navigated successfully without a long term split. One of the longest running eventually resolved examples to date would be the multi-year setuptools/distribute split, and I'd actually consider that the ideal outcome of this process in general: while we understand entirely that folks may need to step away from open source software maintenance for a wide range of reasons, we strongly prefer to see projects providing critical functionality handed over to a new set of maintainers that have earned the trust of either the original maintainer or the wider community rather than letting them languish indefinitely. We can't mandate that any given project invest time in succession planning though, so having a system in place to designate successor projects at the ecosystem level when maintainers aren't able to resolve it at a project level makes sense. Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From wes.turner at gmail.com Sat Jul 16 09:47:16 2016 From: wes.turner at gmail.com (Wes Turner) Date: Sat, 16 Jul 2016 08:47:16 -0500 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: On Jul 16, 2016 3:36 AM, "Nick Coghlan" wrote: > > On 15 July 2016 at 23:59, Thomas Kluyver wrote: > > On Fri, Jul 15, 2016, at 01:25 AM, Donald Stufft wrote: > > > > Off the top of my head I can only really think of PIL, and *maybe* suds. > > Unless there?s a lot of these maybe all we really need is a policy for when > > administrators can/will edit the page to direct people towards a different > > project or a way to add an admin message directing people to another > > project. > > > > > > Proposal: let's put some such manual intervention policy in place for now. > > Apply it for PIL to point to Pillow, and query the active suds forks to see > > if there's a generally agreed successor. > > > > If this works well, great! If the admins are flooded with 'successor > > requests', then we can come back to question of an automated mechanism. If > > there are too many abandoned packages with competing successors, that's a > > trickier problem to solve, but at least we'd be considering it with more > > information. 
> > +1, although I'd propose decoupling the policy aspect ("Project X has > been declared unmaintained, with Project Y as its official successor") > from the implementation aspect of how that policy is applied in PyPI > and pip. That way we wouldn't be adding to the existing workload of > the PyPI admins - their involvement would just be implementing the > collective policy decisions of the PyPA, rather than being directly > tasked with making those policy decisions themselves. > > For example, suppose we had a "Replacement Packages" page on > packaging.python.org, that documented cases like PIL -> Pillow, where: > > - a de facto community standard package has become unmaintained > - attempts to reach the existing maintainers to transfer ownership have failed > - a de facto replacement package has emerged > - the overall newcomer experience for the Python ecosystem is being > harmed by legacy documentation that still recommends the old de facto > standard > > Adding new entries to that page would then require filing an issue at > https://github.com/pypa/python-packaging-user-guide/issues/ to > establish: > > - the package being replaced is a de facto community standard > - the package being replaced is important to the user experience of > newcomers to the Python ecosystem > - the package being replaced has become unmaintained > - a newer community fork has gained sufficient traction to be deemed a > de facto successor > - the existing maintainer has been contacted, and is unresponsive to > requests to accept help with maintenance > > If *all* of those points are credibly established, *then* the package > replacement would be added to the "Replacement Packages" list on > packaging.python.org. > > How that list was utilised in PyPI and pip, as well as in other > package introspection tools (e.g. IDEs, VersionEye), would then be the > decision of the designers of those tools. 
So, there could be RDFa in the project detail pages and a JSONLD key/dict in the project metadata indicating this community-reviewed edge or edges. As an unreified triple: (pypi:PIL pypa:recommendsPackageInstead pypi:pillow) As a reified edge: _:1234 a pypa:SupersededByEdge; schema:dateCreated iso8601; schema:description "reason"; schema:url ; pypa:origPackage pypi:PIL; pypa:otherPackage pypi:pillow; . These are N3/Turtle syntax, which is expressible as a JSONLD block in HTML and/or as RDFa HTML. (#PEP426JSONLD) > > > As further examples: pydot, pexpect and python-modernize have all been > > unmaintained, leading to forks springing up. In all three cases, some of the > > forkers eventually coordinated to contact the original maintainer, get > > upload rights, and make new releases with the original name. It would > > certainly have been nice if that could have happened sooner in each case, > > but I doubt that any technical fix would have made a big difference. > > The PyCA folks obtaining maintenance access to PyOpenSSL would be > another example of this being navigated successfully without a long > term split. > > One of the longest running eventually resolved examples to date would > be the multi-year setuptools/distribute split, and I'd actually > consider that the ideal outcome of this process in general: while we > understand entirely that folks may need to step away from open source > software maintenance for a wide range of reasons, we strongly prefer > to see projects providing critical functionality handed over to a new > set of maintainers that have earned the trust of either the original > maintainer or the wider community rather than letting them languish > indefinitely. > > We can't mandate that any given project invest time in succession > planning though, so having a system in place to designate successor > projects at the ecosystem level when maintainers aren't able to > resolve it at a project level makes sense. > > Cheers, > Nick. 
> > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig -------------- next part -------------- An HTML attachment was scrubbed... URL: From xav.fernandez at gmail.com Sun Jul 17 07:32:30 2016 From: xav.fernandez at gmail.com (Xavier Fernandez) Date: Sun, 17 Jul 2016 13:32:30 +0200 Subject: [Distutils] Add data-requires-python attribute to download links in simple repository API In-Reply-To: References: <1468519372.3694908.666461825.5C356355@webmail.messagingengine.com> Message-ID: LGTM also :) For the record, I've started a related PR to check Requires-Python in pip at install time: https://github.com/pypa/pip/pull/3846 It should still be useful once PEP 503 is amended, for packages or package indexes that could not expose the Requires-Python information via this new mechanism. Regards, Xavier On Thu, Jul 14, 2016 at 11:30 PM, Daniel Holth wrote: > Lgtm > > On Thu, Jul 14, 2016, 16:49 Thomas Kluyver wrote: > >> In a discussion about how to allow pip to select a package version >> compatible with the target Python version, Donald suggested adding a >> data-requires-python attribute to links in the simple repository API, >> which pip uses to find candidate downloads. >> >> ... >> >> This would expose the Requires-Python metadata field already specified >> in PEP 345. There is a separate PR for setuptools to allow specifying >> this value. 
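[Editor's note: as an illustration of the attribute being proposed here, a minimal sketch of how a client might pull data-requires-python out of a simple-index page. The markup shape is assumed from the PEP 503 amendment under discussion; real pip additionally evaluates the specifier with the third-party `packaging` library rather than just collecting it.]

```python
# Sketch: collect (href, requires_python) pairs from a PEP 503
# "simple" index page carrying the proposed data-requires-python
# attribute.  The sample markup below is an assumption based on the
# discussion, not copied from any real index.
from html.parser import HTMLParser


class SimpleIndexParser(HTMLParser):
    """Record every <a> link together with its data-requires-python value."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            if "href" in attrs:
                # HTMLParser unescapes entity references in attribute
                # values, so "&gt;=3.5" arrives as ">=3.5".
                self.links.append(
                    (attrs["href"], attrs.get("data-requires-python"))
                )


page = """
<a href="demo-1.0.tar.gz">demo-1.0.tar.gz</a>
<a href="demo-2.0.tar.gz" data-requires-python="&gt;=3.5">demo-2.0.tar.gz</a>
"""
parser = SimpleIndexParser()
parser.feed(page)
print(parser.links)
```

A link without the attribute simply yields None, matching the PEP's intent that the field stay optional for indexes that cannot provide it.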
>> >> I have opened a PR to add this to PEP 503, which defines the simple >> repository API, and Donald asked that I post about it here for comments: >> >> https://github.com/python/peps/pull/56 >> >> Thanks, >> Thomas >> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> https://mail.python.org/mailman/listinfo/distutils-sig >> > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Sun Jul 17 09:06:22 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 17 Jul 2016 23:06:22 +1000 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: On 16 July 2016 at 23:47, Wes Turner wrote: > So, there could be RDFa in the project detail pages and a JSONLD key/dict in > the project metadata indicating this community-reviewed edge or edges. Wes, once again, please stop attempting to inject JSON-LD into every metadata discussion we have on this list. We already know you like it. However, despite being simpler than RDFa, JSON-LD is still overengineered for our purposes, so we're not going to ask people to go read the JSON-LD spec in order to understand any particular aspect of our APIs or metadata formats. 
If you'd like to set up a PyPI dataset in one of the semantic web projects, then please feel free to do so, but adding that kind of information to the upstream metadata isn't a priority, any more than it's a priority to add native support for things like CPE [1] or SWID [2]. Regards, Nick. [1] https://scap.nist.gov/specifications/cpe/ [2] http://tagvault.org/swid-tags/what-are-swid-tags/ -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From wes.turner at gmail.com Sun Jul 17 12:56:53 2016 From: wes.turner at gmail.com (Wes Turner) Date: Sun, 17 Jul 2016 11:56:53 -0500 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: If you have an alternate way to represent a graph with JSON, which is indexable as RDF named graph quads and cryptographically signable irrespective of data ordering or representation format (RDFa, JSONLD) with ld-signatures, I'd be interested to hear how said format solves for that problem. These contain checksums. https://web-payments.org/specs/source/ld-signatures/ RDFa (with http://schema.org/SoftwareApplication and pypa:-specific classes and properties (types and attributes) for things like SupersededBy) would be advantageous because, then, public and local RDF(a) search engines could easily assist with structured data search. 
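[Editor's note: for readers unfamiliar with the JSON-LD shape being floated, the "superseded by" edge Wes sketched earlier in the thread might serialize roughly as below. Every pypa:* term and the context URL are hypothetical illustrations, not part of any adopted vocabulary.]

```python
# Illustrative only: a JSON-LD-style document for the hypothetical
# "PIL is superseded by Pillow" edge, built as a plain Python dict.
import json

superseded_edge = {
    "@context": {
        "schema": "http://schema.org/",
        "pypa": "https://example.invalid/pypa-terms#",  # hypothetical vocabulary
        "pypi": "https://pypi.org/project/",
    },
    "@type": "pypa:SupersededByEdge",
    "schema:dateCreated": "2016-07-17T00:00:00Z",
    "schema:description": "PIL is unmaintained; Pillow is the community fork",
    "pypa:origPackage": {"@id": "pypi:PIL"},
    "pypa:otherPackage": {"@id": "pypi:pillow"},
}

print(json.dumps(superseded_edge, indent=2))
```

Because it is ordinary JSON, such a record could be served by any index or aggregator without requiring consumers to understand RDF at all; the @context only matters to tools that choose to interpret it.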
Adding data-requires-python *is* a stopgap solution which, initially, seems less bandwidth-burdensome; but requires all downstream data consumers to re-implement an ad-hoc parser; which is unnecessary if it's already understood that a graph description semantics which works in JSON (as JSONLD) and HTML has already solved that problem many times over. On Jul 17, 2016 9:06 AM, "Nick Coghlan" wrote: > On 16 July 2016 at 23:47, Wes Turner wrote: > > So, there could be RDFa in the project detail pages and a JSONLD > key/dict in > > the project metadata indicating this community-reviewed edge or edges. > > Wes, once again, please stop attempting to inject JSON-LD into every > metadata discussion we have on this list. We already know you like it. > > However, despite being simpler than RDFa, JSON-LD is still > overengineered for our purposes, so we're not going to ask people to > go read the JSON-LD spec in order to understand any particular aspect > of our APIs or metadata formats. > > If you'd like to set up a PyPI dataset in one of the semantic web > projects, then please feel free to do so, but adding that kind of > information to the upstream metadata isn't a priority, any more than > it's a priority to add native support for things like CPE [1] or SWID > [2]. > > Regards, > Nick. > > [1] https://scap.nist.gov/specifications/cpe/ > [2] http://tagvault.org/swid-tags/what-are-swid-tags/ > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ncoghlan at gmail.com Tue Jul 19 02:37:45 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 19 Jul 2016 16:37:45 +1000 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: On 18 July 2016 at 02:56, Wes Turner wrote: > If you have an alternate way to represent a graph with JSON, which is > indexable as as RDF named graph quads and cryptographically signable > irrespective of data ordering or representation format (RDFa, JSONLD) with > ld-signatures, > I'd be interested to hear how said format solves for that problem. It doesn't, but someone *that isn't PyPI* can still grab the data set, throw it into a graph database like Neo4j, calculate the cross references, and then republish the result as a publicly available data set for the semantic web. That way, the semantic linking won't need to be limited just to the Python ecosystem, it will be able to span ecosystems, as happens with cases like npm build dependencies (where node-gyp is the de facto C extension build toolchain for Node.js, and that's written in Python, so NPM dependency analysis needs to be able to cross the gap into the Python packaging world) and with frontend asset pipelines in Python (where applications often want to bring in additional JavaScript dependencies via npm rather than vendoring them). 
Given that we already have services like libraries.io and release-monitoring.org for ecosystem independent tracking of upstream releases, they're more appropriate projects to target for the addition of semantic linking support to project metadata, as having one or two public semantic linking projects like that for the entirety of the open source ecosystem would make a lot more sense than each language community creating their own independent solutions that would still need to be stitched together later. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From wes.turner at gmail.com Tue Jul 19 03:25:11 2016 From: wes.turner at gmail.com (Wes Turner) Date: Tue, 19 Jul 2016 02:25:11 -0500 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: On Jul 19, 2016 2:37 AM, "Nick Coghlan" wrote: > > On 18 July 2016 at 02:56, Wes Turner wrote: > > If you have an alternate way to represent a graph with JSON, which is > > indexable as as RDF named graph quads and cryptographically signable > > irrespective of data ordering or representation format (RDFa, JSONLD) with > > ld-signatures, > > I'd be interested to hear how said format solves for that problem. > > It doesn't, but someone *that isn't PyPI* can still grab the data set, > throw it into a graph database like Neo4j, calculate the cross > references, and then republish the result as a publicly available data > set for the semantic web. 
That way, the semantic linking won't need to > be limited just to the Python ecosystem, it will be able to span > ecosystems, as happens with cases like npm build dependencies (where > node-gyp is the de facto C extension build toolchain for Node.js, and > that's written in Python, so NPM dependency analysis needs to be able > to cross the gap into the Python packaging world) and with frontend > asset pipelines in Python (where applications often want to bring in > additional JavaScript dependencies via npm rather than vendoring > them). > > Given that we already have services like libraries.io and > release-monitoring.org for ecosystem independent tracking of upstream > releases, they're more appropriate projects to target for the addition > of semantic linking support to project metadata, as having one or two > public semantic linking projects like that for the entirety of the > open source ecosystem would make a lot more sense than each language > community creating their own independent solutions that would still > need to be stitched together later. so, language/packaging-specific subclasses of e.g http://schema.org/SoftwareApplication and native linked data would reduce the need for post-hoc parsing and batch-processing. there are many benefits to being able to JOIN on URIs and version strings here. I'll stop now because OT; the relevant concern here was/is that, if there are PyPI-maintainer redirects to other packages, that metadata should probably be signed (and might as well be JSONLD, because this is a graph of packages and metadata). And there should be a disclaimer regarding auto-following said redirects. Also, --find-links makes it dangerous to include comments with links. #PEP426JSONLD > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From wes.turner at gmail.com Tue Jul 19 04:13:11 2016 From: wes.turner at gmail.com (Wes Turner) Date: Tue, 19 Jul 2016 03:13:11 -0500 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: so, there's a need for specifying the {PyPI} package URI in setup.py and then generating meta.jsonld from setup.py and then generating JSONLD in a warehouse/pypa view; because that's where they keep the actual metadata (package platform versions, checksums, potentially supersededBy redirects) and then a signing key for a) package maintainer-supplied metadata and b) package repository metadata (which is/would be redundant but comforting) and then third-party services like NVD, CVEdetails, and stack metadata aggregation services - "PEP 426: Define a JSON-LD context as part of the proposal" https://github.com/pypa/interoperability-peps/issues/31 - "Expressing dependencies (between data, software, content...)" https://github.com/schemaorg/schemaorg/issues/975 sorry to hijack the thread; i hear "more links and metadata in an auxiliary schema" and think 'RDF is the semantic web solution for this graph problem' On Jul 19, 2016 3:59 AM, "Nick Coghlan" wrote: > On 19 July 2016 at 17:25, Wes Turner wrote: > > On Jul 19, 2016 2:37 AM, "Nick Coghlan" wrote: > >> Given that we already have services like libraries.io and > >> release-monitoring.org for ecosystem independent tracking of upstream > >> releases, they're more appropriate projects to target for the addition > >> of semantic linking 
support to project metadata, as having one or two > >> public semantic linking projects like that for the entirety of the > >> open source ecosystem would make a lot more sense than each language > >> community creating their own independent solutions that would still > >> need to be stitched together later. > > > > so, language/packaging-specific subclasses of e.g > > http://schema.org/SoftwareApplication and native linked data would > reduce > > the need for post-hoc parsing and batch-processing. > > Anyone sufficiently interested in the large scale open source > dependency management problem to fund work on it is going to want a > language independent solution, rather than a language specific one. > Folks only care about unambiguous software identification systems like > CPE and SWID when managing large infrastructure installations, and any > system of infrastructure that large is going to be old enough and > sprawling enough to include multiple language stacks. > > At the same time, nobody cares about this kind of problem when all > they want to do is publish their hobby project or experimental proof > of concept somewhere that their friends and peers can easily get to > it, which means it doesn't make sense to expect all software > publishers to provide the information themselves, and as a language > ecosystem with a strong focus on inclusive education, we *certainly* > don't want to make it a barrier to engagement with Python's default > publishing toolchain. > > > there are many benefits to being able to JOIN on URIs and version strings > > here. > > > > I'll stop now because OT; the relevant concern here was/is that, if > there > > are PyPI-maintainer redirects to other packages, that metadata should > > probably be signed > > Metadata signing support is a different problem, and one we want to > pursue for a range of reasons. > > > (and might as well be JSONLD, because this is a graph of > > packages and metadata) > > There is no "might as well" here. 
At the language level, there's a > relevant analogy with Guido's work on gradual typing - talk to someone > for whom a 20 person team is small, and a 10k line project is barely > worth mentioning and their reaction is going to be "of course you want > to support static type analysis", while someone that thinks a 5 person > team is unthinkably large and a 1k line utility is terribly bloated > isn't going to see any value in it whatsoever. > > In the context of packaging metadata, supporting JSON-LD and RDFa is > akin to providing PEP 484 type information for Python APIs - are they > potentially useful? Absolutely. Are there going to be folks that see > the value in them, and invest the time in designing a way to use them > to describe Python packages? Absolutely (and depending on how a few > other things work out, one of them may even eventually be me in a > release-monitoring.org context). > > But it doesn't follow that it then makes sense to make them a > *dependency* of our interoperability specifications, rather than an > optional add-on - we want folks just doing relatively simple things > (like writing web services in Python) to be able to remain blissfully > unaware that there's a world of large scale open source software > supply chain management out there that benefits from having ways of > doing things that are standardised across language ecosystems. > > Regards, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ncoghlan at gmail.com Tue Jul 19 03:59:43 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 19 Jul 2016 17:59:43 +1000 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: On 19 July 2016 at 17:25, Wes Turner wrote: > On Jul 19, 2016 2:37 AM, "Nick Coghlan" wrote: >> Given that we already have services like libraries.io and >> release-monitoring.org for ecosystem independent tracking of upstream >> releases, they're more appropriate projects to target for the addition >> of semantic linking support to project metadata, as having one or two >> public semantic linking projects like that for the entirety of the >> open source ecosystem would make a lot more sense than each language >> community creating their own independent solutions that would still >> need to be stitched together later. > > so, language/packaging-specific subclasses of e.g > http://schema.org/SoftwareApplication and native linked data would reduce > the need for post-hoc parsing and batch-processing. Anyone sufficiently interested in the large scale open source dependency management problem to fund work on it is going to want a language independent solution, rather than a language specific one. Folks only care about unambiguous software identification systems like CPE and SWID when managing large infrastructure installations, and any system of infrastructure that large is going to be old enough and sprawling enough to include multiple language stacks. 
At the same time, nobody cares about this kind of problem when all they want to do is publish their hobby project or experimental proof of concept somewhere that their friends and peers can easily get to it, which means it doesn't make sense to expect all software publishers to provide the information themselves, and as a language ecosystem with a strong focus on inclusive education, we *certainly* don't want to make it a barrier to engagement with Python's default publishing toolchain. > there are many benefits to being able to JOIN on URIs and version strings > here. > > I'll stop now because OT; the relevant concern here was/is that, if there > are PyPI-maintainer redirects to other packages, that metadata should > probably be signed Metadata signing support is a different problem, and one we want to pursue for a range of reasons. > (and might as well be JSONLD, because this is a graph of > packages and metadata) There is no "might as well" here. At the language level, there's a relevant analogy with Guido's work on gradual typing - talk to someone for whom a 20 person team is small, and a 10k line project is barely worth mentioning and their reaction is going to be "of course you want to support static type analysis", while someone that thinks a 5 person team is unthinkably large and a 1k line utility is terribly bloated isn't going to see any value in it whatsoever. In the context of packaging metadata, supporting JSON-LD and RDFa is akin to providing PEP 484 type information for Python APIs - are they potentially useful? Absolutely. Are there going to be folks that see the value in them, and invest the time in designing a way to use them to describe Python packages? Absolutely (and depending on how a few other things work out, one of them may even eventually be me in a release-monitoring.org context). 
But it doesn't follow that it then makes sense to make them a *dependency* of our interoperability specifications, rather than an optional add-on - we want folks just doing relatively simple things (like writing web services in Python) to be able to remain blissfully unaware that there's a world of large scale open source software supply chain management out there that benefits from having ways of doing things that are standardised across language ecosystems. Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From qsolo825 at gmail.com Tue Jul 19 06:21:59 2016 From: qsolo825 at gmail.com (=?UTF-8?B?0JzQuNGF0LDQuNC7INCT0L7Qu9GD0LHQtdCy?=) Date: Tue, 19 Jul 2016 12:21:59 +0200 Subject: [Distutils] PyPI index workaround In-Reply-To: References: Message-ID: Well, it isn't that bandwidth intensive, these recent changes only cause extra (but tiny and potentially faster) requests to PyPI. We've also decided to update versions of packages only once a day unless the user explicitly forces a refresh of the package list. If PyPI is not available for some reason, e.g. due to network problems, we merely don't show package versions, it won't break any other IDE functionality. 2016-07-15 1:45 GMT+02:00 Matt Bacchi : > OT: I hope you're going to provide a setting to allow the user to disable > this unnecessary and bandwidth intensive 'feature'? > > -Matt > > From: "Mikhail Golubev" > To: Donald Stufft > Cc: Dmitry Trofimov , > distutils-sig at python.org > Date: Wed, 13 Jul 2016 23:21:24 +0300 > Subject: Re: [Distutils] PyPI index workaround > Right, sorry, that initial question wasn't clear about that. > > We need the latest versions only for installed packages. Nonetheless, as > you noted, it's still several dozen consecutive requests to > "/simple/" for each PyCharm session of every user. > > Can you handle that? 
> > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -- Best regards Mikhail Golubev -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Tue Jul 19 08:44:23 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 19 Jul 2016 22:44:23 +1000 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: On 19 July 2016 at 18:13, Wes Turner wrote: > so, there's a need for specifying the {PyPI} package URI in setup.py Not really - tools can make a reasonable guess about the source PyPI URL based purely on the name and version. For non-PyPI hosted packages, the extra piece of info needed is the index server URL. > and then generating meta.jsonld from setup.py No, a JSON-LD generator would start with a rendered metadata format, not the raw setup.py. > and then generating JSONLD in a warehouse/pypa view; because that's where > they keep the actual metadata (package platform versions, checksums, > potentially supersededBy redirects) No, there is no requirement for this to be a PyPI feature. Absolutely none. > and then a signing key for a) package maintainer-supplied metadata and b) > package repository metadata (which is/would be redundant but comforting) This is already covered (thoroughly) in PEPs 458 and 480, and has nothing to do with metadata linking. 
> and then third-party services like NVD, CVEdetails, and stack metadata > aggregation services And this is the other reason why it doesn't make sense to do this on PyPI itself - the publisher provided metadata from PyPI is only one piece of the project metadata puzzle (issue trackers and source code repositories are another one, as are the communication metrics collected by the likes of Bitergia). For a data aggregator, supporting multiple language ecosystems, and multiple issue trackers, and multiple code hosting sites is an M+N+O scale problem (where M is the number of language ecosystems supported, etc). By contrast, if you try to solve this problem in the package publication services for each individual language, you turn it into an M*(N+O) scale problem, where you need to give each language-specific service the ability to collect metadata from all those other sources. This means that since we don't have a vested interest in adding more functionality to PyPI that doesn't specifically *need* to be there (and in fact actively want to avoid doing so), we can say "Conformance to semantic web standards is a problem for aggregation services like libraries.io and release-monitoring.org to solve, not for us to incorporate directly into PyPI". > sorry to hijack the thread; i hear "more links and metadata in an > auxiliary schema" and think 'RDF is the semantic web solution for this > graph problem' I know, and you're not wrong about that. Where you're running into trouble is that you're trying to insist that it is the responsibility of the initial data *publishers* to conform to the semantic web standards, and it *isn't* - that job is one for the data aggregators that have an interest in making it easier for people to work across multiple data sets managed by different groups of people. 
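[Editor's note: a quick worked example of the M+N+O versus M*(N+O) scaling argument above, using illustrative numbers of my own choosing rather than anything from the thread.]

```python
# Nick's scaling argument in concrete numbers: suppose 10 language
# ecosystems (M), 5 issue-tracker types (N) and 5 code-hosting sites (O).
# One cross-ecosystem aggregator builds M + N + O integrations; pushing
# the job into each language's publication service costs M * (N + O),
# because every ecosystem re-integrates every tracker and host.
M, N, O = 10, 5, 5

aggregator_integrations = M + N + O        # single aggregator spans all three sets
per_ecosystem_integrations = M * (N + O)   # each ecosystem duplicates N + O work

print(aggregator_integrations, per_ecosystem_integrations)
```

With these numbers the aggregator needs 20 integrations against 100 for the per-ecosystem approach, and the gap widens linearly in M.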
For publication platforms managing a single subgraph, native support for JSON-LD and RDFa introduces unwanted complexity by expanding the data model to incorporate all of the relational concepts defined in those standards. Well funded platforms may have the development capacity to spare to spend time on such activities, but PyPI isn't such a platform. By contrast, for aggregators managing a graph-of-graphs problem, JSON-LD and RDFa introduce normalisation across data sets that *reduces* overall complexity, since most of the details of the subgraphs can be ignored, as you focus instead on the links between the entities they contain. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From wes.turner at gmail.com Tue Jul 19 11:41:25 2016 From: wes.turner at gmail.com (Wes Turner) Date: Tue, 19 Jul 2016 10:41:25 -0500 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: On Jul 19, 2016 8:44 AM, "Nick Coghlan" wrote: > > On 19 July 2016 at 18:13, Wes Turner wrote: > > so, there's a need for specifying the {PyPI} package URI in setup.py > > Not really - tools can make a reasonable guess about the source PyPI > URL based purely on the name and version. For non-PyPI hosted > packages, the extra piece of info needed is the index server URL. So, the index server URL is in pip.conf or .pydistutils.cfg or setup.cfg OR specified on the commandline? 
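[Editor's note: Nick's point that "tools can make a reasonable guess about the source PyPI URL based purely on the name and version" can be sketched in a few lines. The /project/ URL layout and the helper name are assumptions for illustration, not an actual pip or PyPI API; the name normalisation follows PEP 503.]

```python
import re


def guess_pypi_url(name, version, index="https://pypi.org"):
    """Best-effort project URL from name and version alone.

    Hypothetical helper: no metadata field is consulted, only the
    PEP 503 normalisation rule (lowercase; runs of -, _ and . become -)
    plus an assumed /project/<name>/<version>/ layout.
    """
    normalized = re.sub(r"[-_.]+", "-", name).lower()
    return f"{index}/project/{normalized}/{version}/"


print(guess_pypi_url("Django", "1.9.7"))
# -> https://pypi.org/project/django/1.9.7/
```

For a package hosted elsewhere, only the `index` argument changes, which is exactly the "extra piece of info" Nick mentions for non-PyPI hosts.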
> > > and then generating meta.jsonld from setup.py > > No, a JSON-LD generator would start with a rendered metadata format, > not the raw setup.py. "pydist.json", my mistake https://github.com/pypa/interoperability-peps/issues/31#issuecomment-139657247 - pydist.json - metadata.json (wheel) - pydist.jsonld > > > and then generating JSONLD in a warehouse/pypa view; because that's where > > they keep the actual metadata (package platform versions, checksums, > > potentially supersededBy redirects) > > No, there is no requirement for this to be a PyPI feature. Absolutely none. > > > and then a signing key for a) package maintainer-supplied metadata and b) > > package repository metadata (which is/would be redundant but comforting) > > This is already covered (thoroughly) in PEPs 458 and 480, and has > nothing to do with metadata linking. ld-signatures can be used to sign {RDF, JSONLD, RDFa} and attach the signature to the document. https://web-payments.org/specs/source/ld-signatures/ - JWS only works with JSON formats (and not RDF) https://www.python.org/dev/peps/pep-0480/ - Does this yet include signing potentially cached JSON metadata used by actual tools like e.g. pip? - How do you feel about redirects when a package is superseded and nobody can convince the maintainer to update the long_description? > > > and then third-party services like NVD, CVEdetails, and stack metadata > > aggregation services > > And this is the other reason why it doesn't make sense to do this on > PyPI itself - the publisher provided metadata from PyPI is only one > piece of the project metadata puzzle (issue trackers and source code > repositories are another one, as are the communication metrics > collected by the likes of Bitergia). AFAIU, the extra load of fielding vulnerability reports for responsibly PyPI-hosted packages is beyond the scope of the PyPI and Warehouse packages. 
> > For a data aggregator, supporting multiple language ecosystems, and > multiple issue trackers, and multiple code hosting sites is an M+N+O > scale problem (where M is the number of language ecosystems supported, > etc). By contrast, if you try to solve this problem in the package > publication services for each individual language, you turn it into an > M*(N+O) scale problem, where you need to give each language-specific > service the ability to collect metadata from all those other sources. Are you saying that, for release-monitoring.org (a service you are somehow financially associated with), you have already invested the time to read the existing PyPI metadata; but not e.g. the 'python' or 'python-dev' OS package metadata? Debian has an RDF endpoint. - https://packages.qa.debian.org/p/python-defaults.html - https://packages.qa.debian.org/p/python-defaults.ttl - But there's yet no easy way to JOIN metadata down the graph of downstream OS packages to PyPI archives to source repository changesets; not without RDF and not without writing unnecessary language/packaging-community-specific {INI, JSON, TOML, YAMLLD} parsers. O-estimations aside, when a data publisher publishes web standard data, everyone can benefit; because the upper bound on network effects is N**2 (Metcalfe's Law) > > This means that since we don't have a vested interest in adding more > functionality to PyPI that doesn't specifically *need* to be there > (and in fact actively want to avoid doing so), we can say "Conformance > to semantic web standards is a problem for aggregation services like > libraries.io and release-monitoring.org to solve, not for us to > incorporate directly into PyPI". A view producing JSONLD. 
Probably right about here: https://github.com/pypa/warehouse/blob/master/warehouse/packaging/views.py Because there are a few (possibly backwards compatible) changes that could be made here so that we could just add @context to the existing JSON record (thus making it JSONLD, which anyone can read and index without a domain-specific parser): https://github.com/pypa/warehouse/blob/master/warehouse/legacy/api/json.py IIRC: https://github.com/pypa/interoperability-peps/issues/31#issuecomment-233195564 > > > sorry to hijack the thread; i hear "more links and metadata in an > > auxilliary schema" and think 'RDF is the semantic web solution for this > > graph problem' > > I know, and you're not wrong about that. Where you're running into > trouble is that you're trying to insist that it is the responsibility > of the initial data *publishers* to conform to the semantic web > standards, and it *isn't* - that job is one for the data aggregators > that have an interest in making it easier for people to work across > multiple data sets managed by different groups of people. No, after-the-fact transformation is wasteful and late. A bit of advice for data publishers: http://5stardata.info/en/ > > For publication platforms managing a single subgraph, native support > for JSON-LD and RDFa introduces unwanted complexity by expanding the > data model to incorporate all of the relational concepts defined in > those standards. Well funded platforms may have the development > capacity to spare to spend time on such activities, but PyPI isn't > such a platform. This is Warehouse: https://github.com/pypa/warehouse It is maintainable. https://www.pypa.io/en/latest/help/ > > By contrast, for aggregators managing a graph-of-graphs problem, > JSON-LD and RDFa introduce normalisation across data sets that > *reduces* overall complexity, since most of the details of the > subgraphs can be ignored, as you focus instead on the links between > the entities they contain. > > Cheers, > Nick. 
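As a rough sketch of the "just add @context" idea: the field names below mimic the PyPI JSON record only loosely, and the schema.org mapping is purely illustrative, not an agreed vocabulary:

```python
import json

# Hypothetical PyPI-style metadata record (field names illustrative)
record = {
    "info": {
        "name": "example-package",
        "version": "1.0.0",
        "home_page": "https://example.org",
    }
}

# Adding an @context mapping turns the plain JSON into JSON-LD: each key
# is mapped onto a term in a shared vocabulary (schema.org here, purely
# as an illustration -- the real vocabulary would need to be agreed on).
jsonld = dict(record["info"])
jsonld["@context"] = {
    "name": "http://schema.org/name",
    "version": "http://schema.org/version",
    "home_page": "http://schema.org/url",
}
jsonld["@type"] = "http://schema.org/SoftwareApplication"

print(json.dumps(jsonld, indent=2, sort_keys=True))
```

Consumers that don't care about linked data can keep ignoring the `@context` and `@type` keys and read the record as ordinary JSON, which is what makes this kind of change a candidate for being backwards compatible.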
> > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Tue Jul 19 22:45:02 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 20 Jul 2016 12:45:02 +1000 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: On 20 July 2016 at 01:41, Wes Turner wrote: > A view producing JSONLD. > > Probably right about here: > https://github.com/pypa/warehouse/blob/master/warehouse/packaging/views.py Then stop trying to guilt other people into implementing JSON-LD support for you, and submit a patch to implement it yourself. Requirements: - zero additional learning overhead for newcomers to Python packaging - near-zero additional maintenance overhead for tooling maintainers that don't care about the semantic web If you can meet those requirements, then your rationale of "package dependencies are a linked graph represented as JSON, so we might as well support expressing them as JSON-LD" applies. Your best bet for that would likely be to make it an optional Warehouse feature (e.g. an alternate endpoint that adds the JSON-LD metadata), rather than a formal part of the interoperability specifications. If you find you can't make it unobtrusive and optional, then you'd be proving my point that introducing JSON-LD adds further cognitive overhead to an already complicated system for zero practical gain to the vast majority of users of that system. Regards, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From wes.turner at gmail.com Wed Jul 20 00:13:03 2016 From: wes.turner at gmail.com (Wes Turner) Date: Tue, 19 Jul 2016 23:13:03 -0500 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: I think you're right that we should identify the stakeholders here. Which clients consume PyPI JSON? @dstufft Is there a User Agent report for the PyPI and the warehouse legacy JSON views? ... https://code.activestate.com/lists/python-distutils-sig/25457/ Are there still pending metadata PEPs that would also need to be JSONLD-ified? On Jul 19, 2016 10:45 PM, "Nick Coghlan" wrote: > > On 20 July 2016 at 01:41, Wes Turner wrote: > > A view producing JSONLD. > > > > Probably right about here: > > https://github.com/pypa/warehouse/blob/master/warehouse/packaging/views.py > > Then stop trying to guilt other people into implementing JSON-LD > support for you, and submit a patch to implement it yourself. > > Requirements: > > - zero additional learning overhead for newcomers to Python packaging Should be transparent to the average bear. > - near-zero additional maintenance overhead for tooling maintainers > that don't care about the semantic web Is it of value to link CVE reports with the package metadata? > > If you can meet those requirements, then your rationale of "package > dependencies are a linked graph represented as JSON, so we might as > well support expressing them as JSON-LD" applies. 
Your best bet for > that would likely be to make it an optional Warehouse feature (e.g. an > alternate endpoint that adds the JSON-LD metadata), rather than a > formal part of the interoperability specifications. - Another cached view > > If you find you can't make it unobtrusive and optional, then you'd be > proving my point that introducing JSON-LD adds further cognitive > overhead to an already complicated system for zero practical gain to > the vast majority of users of that system. There are a number of additional value propositions and use cases here: https://github.com/pypa/interoperability-peps/issues/31 When I find the time > > Regards, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Wed Jul 20 00:24:35 2016 From: wes.turner at gmail.com (Wes Turner) Date: Tue, 19 Jul 2016 23:24:35 -0500 Subject: [Distutils] Warehouse re: Celery < 4 In-Reply-To: References: Message-ID: From @asksol "Time to pin your versions if you haven't already. Celery 4 is out soon: https://t.co/XpZqbjt91t" http://docs.celeryproject.org/en/master/whatsnew-4.0.html - https://github.com/pypa/warehouse/blob/master/requirements/main.in > celery>=3.1 Should be: celery>=3.1,<4.0 I'm on my phone -------------- next part -------------- An HTML attachment was scrubbed... 
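The intent of that pin (accept the Celery 3.x series, refuse 4.0) can be sanity-checked with a small sketch; the dotted-version parsing here is deliberately naive, since real tools follow PEP 440 semantics:

```python
def version_tuple(v):
    # naive dotted-version parse; real tools follow PEP 440
    return tuple(int(part) for part in v.split("."))

def in_range(version, minimum, exclusive_max):
    # models a ">=minimum,<exclusive_max" style specifier
    return (version_tuple(minimum)
            <= version_tuple(version)
            < version_tuple(exclusive_max))

assert in_range("3.1.23", "3.1", "4.0")   # a 3.x release is accepted
assert not in_range("4.0", "3.1", "4.0")  # 4.0 itself is refused
assert not in_range("2.5", "3.1", "4.0")  # too old
```

Note that in a real requirements file the two clauses must be separated by a comma: `celery>=3.1,<4.0` is a valid requirement string, while `celery>3.1<4.0` is not.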
URL: From ncoghlan at gmail.com Wed Jul 20 00:57:33 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 20 Jul 2016 14:57:33 +1000 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <38B36DC7-984D-41DB-A694-2C482FFC89E3@twistedmatrix.com> <6C21342D-2D50-40A4-B9E4-3F6F2C41060B@stufft.io> <94D4D9B2-4BA1-4235-A262-9069737BB7A6@twistedmatrix.com> <6bfd6749-ec17-5151-aa88-e4c1756dba00@python.org> <4ED1229F-69E0-4090-8F37-7053A4A6C1ED@twistedmatrix.com> <5786EBAB.5060709@python.org> <1b9986f6-8b99-bd87-3555-fc34ac89a412@python.org> <5E046B3F-8E62-4E4D-8777-46C56ABEF0D4@stufft.io> <1468591179.1297610.667247169.007C4342@webmail.messagingengine.com> Message-ID: On 20 July 2016 at 14:13, Wes Turner wrote: >> - near-zero additional maintenance overhead for tooling maintainers >> that don't care about the semantic web > > Is it of value to link CVE reports with the package metadata? On PyPI, the main value would be in publisher notification (i.e. if folks maintaining projects on PyPI aren't tracking CVE reports directly, it would be nice if they could opt in to having PyPI do it for them rather than having to learn how to navigate the CVE ecosystem themselves - "Maintainers are actively monitoring CVE notifications" would then become a piece of metadata PyPI could potentially publish to help distinguish people's personal side projects from projects with funded developers supporting them). Similarly, given suitable investment in Warehouse development, PyPI could be enhanced to provide a front-end to the experimental Distributed Weakness Filing system, where folks can request assignment of CVE numbers in a more automated way than the traditional process. 
However, for clients, the problem with relying on PyPI for CVE notifications is that what you actually want as a developer is a situation where your security notifications are independent of the particular ecosystem providing the components, and also where a compromise of your connection to the software publication platform doesn't inhibit your ability to be alerted to security concerns. While there *are* ecosystem specific services already operating in that domain (e.g. requires.io for Python), the cross-language ones like VersionEye.com and dependencyci.com are more valuable when you're running complex infrastructure, since they abstract away the ecosystem-specific differences. While the previously mentioned libraries.io is the release notification component for dependencyci.com, release-monitoring.org is a service written in Python by the Fedora Infrastructure team to provide upstream release notifications to Linux distributions that's been around for a while longer, hence why that tends to be my own main point of interest. Cheers, Nick. P.S. For anyone that isn't aware, understanding and helping to manage the sustainability of Red Hat's software supply chain makes up a respectable portion of my day job. 
In the Python ecosystem, that just happens to include pointing out things that volunteers probably shouldn't invest their own time in implementing, since they're good candidates for funded commercial development ;) -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From chris.barker at noaa.gov Fri Jul 22 11:47:53 2016 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Fri, 22 Jul 2016 08:47:53 -0700 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: Message-ID: <4808390596708755952@unknownmsgid> Getting to this thread late, but it didn't seem that it was resolved in the least, so I'll add my $0.02 > That overall got me thinking about namespace pollution in pip, that > once something is pushed in, it's likely to stay there forever. This REALLY is a problem, and one that will only get worse. It would be nice to work out an approach before it gets too bad. > I'd rather see something similar to Linux distributions where there's > a curated repository "core" As pointed out, that's not going to happen. > and a few semi-official, like "extra" and > "community," and for some, "testing." But namespaces like these could still be implemented without curation. Right now, the barrier to entry to putting a package up on PyPI is very, very low. There are a lot of 'toy', or 'experimental', or 'well intentioned but abandoned' projects on there. If there was a clear and obvious place for folks to put these packages while they work out the kinks, and determine whether the package is really going to fly, I think people would use it. And we could have mostly-automatic policies about seemingly abandoned projects as well -- moving them to the "unmaintained" channel, or.... As for the 'deadman's switch' idea -- that's a great idea. 
Sure, there are projects with no active maintainer that people rely on, but almost by definition, if folks care about it, we'd be able to find someone willing to put their finger on the dead man's button once a year.... As for issues like setuptools and PIL, where there is a shift in maintenance of a large, high-profile package, we really need a benevolent oligarchy for PyPA that can resolve these issues by hand. As pointed out -- this really doesn't come up often. However, in these discussions, I've observed a common theme: folks in the community bring up issues about unmaintained packages, namespace pollution, etc.; the core PyPA folks respond with generally well reasoned arguments why proposed solutions won't fly. But it's totally unclear to me whether the core devs don't think these are problems worth addressing, or think they can only be addressed with major effort that no one has time for. If the core devs think it's fine and dandy like it is, we can all stop talking about it. By the way, there is an experiment underway with a "curated" community package repository for conda packages: https://conda-forge.github.io It's semi-curated, in the sense that only the core devs can approve a package being added, but once added, anyone can maintain it. It's going very well, but I'm not sure how it's going to scale. So far, 900 or so packages, 80 or so maintainers. Which I think is very impressive for a young system, but still a lot smaller than it could be. But I think PyPA should keep an eye on it -- I'm sure there will be lessons learned. -CHB > A name foobar resolves to core/foobar- if that exists, and if > not some subset of other repositories is used. > This way, an outdated package can be moved to another repo without > breaking install base. > > In fact, curation without namespaces will already be pretty good. > > d. 
> >> On 13 July 2016 at 19:24, Jim Fulton wrote: >>> On Tue, Jul 12, 2016 at 7:55 AM, Dima Tisnek wrote: >>> Hi all, >>> >>> Is anyone working on pruning old packages from pypi? >>> >>> I found something last updated in 2014, which, looking at the source >>> appears half-done. >>> Github link doesn't work any longer, no description, etc. >>> >>> I managed to find author's email address out of band, and he responded >>> that he can't remember the password, yada yada. >>> >>> I wonder if some basic automation is possible here -- check if url's >>> are reachable and if existing package satisfies basic requirements, >>> failing that mark it as "possibly out of date" >> >> I'm curious why you view this as a problem that needs to be solved? >> >> - Do you want to take over the name yourself? >> >> - Are you afraid someone will stumble on this package and use it? >> >> Something else? >> >> Jim >> >> -- >> Jim Fulton >> http://jimfulton.info > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig From p.f.moore at gmail.com Fri Jul 22 12:32:03 2016 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 22 Jul 2016 17:32:03 +0100 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <4808390596708755952@unknownmsgid> References: <4808390596708755952@unknownmsgid> Message-ID: On 22 July 2016 at 16:47, Chris Barker - NOAA Federal wrote: > But it's totally unclear to me whether the core devs don't think these > are problems worth addressing, or think they can only be addresses > with major effort that no one has time for. Speaking for myself, it's the latter. I think there are issues worth resolving, but the implementation would need more effort than we have available. Also, there's a lot of sensitivity in this area. 
Any solution is almost certain to ruffle some feathers, so a lot of the effort needed could well be in the form of diplomatically managing project maintainers' expectations/views. Which is pretty draining, and the people doing the work are likely to have a high burnout rate. (In my opinion). Paul From donald at stufft.io Fri Jul 22 12:39:44 2016 From: donald at stufft.io (Donald Stufft) Date: Fri, 22 Jul 2016 12:39:44 -0400 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <4808390596708755952@unknownmsgid> References: <4808390596708755952@unknownmsgid> Message-ID: <67CBF31E-5513-4167-B22B-6EDA2933E76D@stufft.io> > On Jul 22, 2016, at 11:47 AM, Chris Barker - NOAA Federal wrote: > > > If the core devs think it's fine and dandy like it is, we can all stop > talking about it. I think they're certainly a problem. The current solutions that have been proposed have their own problems of course, and problems enough that I don't feel comfortable implementing them. Personally I don't currently have the time to work on a better solution but if someone did that'd be fine with me. I would mention though that it's possible there *is* no solution to this problem that doesn't bring with it its own problems that are worse than the problem at hand. I'm not saying that's the case, but just mentioning that it may be so. > > By the way, there is an experiment underway with a "curated" community > package repository for conda packages: > > https://conda-forge.github.io > > It's semi-curated, in the sense that only the core devs can approve a > package being added, but once added, anyone can maintain it. > > It's going very well, but I'm not sure how it's going to scale. So > far, 900 or so packages, 80 or so maintainers. Which I think is very > impressive for a young system, but still a lot smaller than it could > be. > > But I think PyPA should keep an eye on it -- I'm sure there will be > lessons learned. conda-forge and PyPI are not really similar. 
For conda-forge they're largely just repackaging other software for the conda system. This makes it function better when you have just any one of the core team able to work on stuff, because all they're doing is adjusting the build, upgrading versions etc. They're not working on the projects themselves. Curated also goes against the sort of "spirit" of PyPI. It's a place where anyone can release something and immediately be able to be part of the ecosystem. Adding curation on top of that would change the nature of PyPI, maybe for the better or maybe for the worse, but it would fundamentally change PyPI. -- Donald Stufft From donald at stufft.io Fri Jul 22 12:43:07 2016 From: donald at stufft.io (Donald Stufft) Date: Fri, 22 Jul 2016 12:43:07 -0400 Subject: [Distutils] Warehouse re: Celery < 4 In-Reply-To: References: Message-ID: <53A9CA79-6C94-4E6D-9083-CEA456F763B9@stufft.io> This should be fine. We pin versions in the deployment and we can't land changes without passing tests. > On Jul 20, 2016, at 12:24 AM, Wes Turner wrote: > > From @asksol "Time to pin your versions if you haven't already. Celery 4 is out soon: https://t.co/XpZqbjt91t " > > http://docs.celeryproject.org/en/master/whatsnew-4.0.html > - https://github.com/pypa/warehouse/blob/master/requirements/main.in > > celery>=3.1 > > Should be: > > celery>=3.1,<4.0 > > I'm on my phone > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig -- Donald Stufft -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From leorochael at gmail.com Fri Jul 22 13:47:36 2016 From: leorochael at gmail.com (Leonardo Rochael Almeida) Date: Fri, 22 Jul 2016 14:47:36 -0300 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <4808390596708755952@unknownmsgid> Message-ID: We've been discussing here at least two different problems related to package maintainership: 1. Abandoned/no-longer-maintained, but previously useful packages 2. namespace and package idea space pollution due to tests/aborted attempts/packaging inexperience. I don't have a good idea about 1, and it's clear from this discussion that curation as a solution for 2 is currently non-workable/undesirable due to both lack of manpower and a desire not to discourage contributions. Even if curation is not involved, we also don't want to up the barrier to entry such that we discourage new entrants into the ecosystem. A level of namespacing on the project name (but not (necessarily) on the python module/package level) was suggested as a way to reduce 2. I just thought of another, orthogonal, suggestion for 2: There is a TestPyPI server but it feels like few people use it (I certainly never did). What if you were only allowed to push a new project onto PyPI proper if you already have a similarly named project on TestPyPI and at least one release of that project there? Notice that I'm not suggesting that all files need to touch TestPyPI before going to PyPI, although that could be considered in the future. I'm also not suggesting a waiting period between publishing on TestPyPI and PyPI, but this could also be considered if we see people abusing the fact that dumping some garbage on TestPyPI is the only step necessary before dumping some garbage on PyPI. And we could make it so TestPyPI is a bit more flexible with files than PyPI itself. 
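The proposed graduation rule could be sketched like this (the registry contents and function name are hypothetical):

```python
# Hypothetical registries: project name -> number of releases
TEST_PYPI = {"mynewtool": 1}
PYPI = {"flask": 42}

def may_publish_to_pypi(project):
    # Sketch of the proposed rule: a brand-new project may only be
    # created on PyPI proper if a similarly named project already has
    # at least one release on TestPyPI. Existing projects are unaffected.
    if project in PYPI:
        return True
    return TEST_PYPI.get(project, 0) >= 1

assert may_publish_to_pypi("flask")        # already on PyPI
assert may_publish_to_pypi("mynewtool")    # graduated from TestPyPI
assert not may_publish_to_pypi("garbage")  # never released on TestPyPI
```

An optional waiting period, as suggested, would just add a timestamp check to the TestPyPI lookup.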
For instance we could allow deletion and re-upload of files with the same name there, which is currently disallowed on PyPI, so people can test-drive their releases before pushing files onto PyPI proper (don't know if this is already the case). "Hey!" I hear you saying, "If we're just moving the namespace reservation from PyPI to TestPyPI, aren't we just moving problem number 2 around without really solving it?" Well, if someone does some test on TestPyPI and later abandons it, we don't need to have as many qualms about removing the project as we do with a project on PyPI proper. We could even have an official grace period, like a few months, before we consider that a project on TestPyPI without a PyPI equivalent can be considered abandoned. And if we do implement a minimum period on TestPyPI before graduation to PyPI, this would also give us a measure of protection from malicious typo squatting: http://incolumitas.com/2016/06/08/typosquatting-package-managers/ Regards, Leo On 22 July 2016 at 13:39, Donald Stufft wrote: > > > On Jul 22, 2016, at 11:47 AM, Chris Barker - NOAA Federal < > chris.barker at noaa.gov> wrote: > > > > > > If the core devs think it's fine and dandy like it is, we can all stop > > talking about it. > > I think they're certainly a problem. The current solutions that have been > proposed have their own problems of course, and problems enough that I > don't feel comfortable implementing them. Personally I don't currently have > the time to work on a better solution but if someone did that'd be fine > with me. > > I would mention though that it's possible there *is* no solution to this > problem that doesn't bring with it its own problems that are worse than > the problem at hand. I'm not saying that's the case, but just mentioning > that it may be so. 
> > > > > By the way, there is an experiment underway with a "curated" community > > package repository for conda packages: > > > > https://conda-forge.github.io > > > > It's semi-curated, in the sense that only the core devs can approve a > > package being added, but once added, anyone can maintain it. > > > > It's going very well, but I'm not sure how it's going to scale. So > > far, 900 or so packages, 80 or so maintainers. Which I think is very > > impressive for a young system, but still a lot smaller than it could > > be. > > > > But I think PyPA should keep an eye on it -- I'm sure there will be > > lessons learned. > > conda-forge and PyPI are not really similar. For conda-forge they're > largely > just repackaging other software for the conda system. This makes it > function > better when you have just any one of the core team able to work on stuff, > because all they're doing is adjusting the build, upgrading versions etc. > They're not working on the projects themselves. > > Curated also goes against the sort of "spirit" of PyPI. It's a place where > anyone > can release something and immediately be able to be part of the ecosystem. > Adding > curation on top of that would change the nature of PyPI, maybe for the > better or > maybe for the worse, but it would fundamentally change PyPI. > > -- > Donald Stufft > > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From randy at thesyrings.us Fri Jul 22 13:57:26 2016 From: randy at thesyrings.us (Randy Syring) Date: Fri, 22 Jul 2016 13:57:26 -0400 Subject: [Distutils] Outdated packages on pypi In-Reply-To: References: <4808390596708755952@unknownmsgid> Message-ID: <494c016b-8999-70e0-9090-456709531bf9@thesyrings.us> On 07/22/2016 12:39 PM, Donald Stufft wrote: >> On Jul 22, 2016, at 11:47 AM, Chris Barker - NOAA Federal wrote: >> >> >> If the core devs think it's fine and dandy like it is, we can all stop >> talking about it. > I think they're certainly a problem. The current solutions that have been > proposed have their own problems of course, and problems enough that I > don't feel comfortable implementing them. Personally I don't currently have > the time to work on a better solution but if someone did that'd be fine > with me. > > I would mention though that it's possible there *is* no solution to this > problem that doesn't bring with it its own problems that are worse than > the problem at hand. I'm not saying that's the case, but just mentioning > that it may be so. Is there a place where the currently proposed solutions are briefly outlined? One solution that seems apparent to me is to move to an org/package hierarchy like what GitHub has. By default, packages get published under a default namespace: default/flask legacy/flask (you get the point, probably need a better name) unless the user has registered on pypi for an organization and publishes the package under that org: pallets/flask You would still have contention at the org level, but my guess is this contention would be much less significant than the current contention that is faced with only having a single-level namespace for package names. You could further improve this by having org creation requests either A) approved to prevent name squatting or B) have an appeal process for org name squatting that is blatant (e.g. 
I register the "google" or "pypa" org) and/or C) expire orgs that are no longer maintained. The details of both A & B & C would be tricky to get right, but the rules would at least be decided on from the beginning, so people know what the conditions are. If they don't like those conditions, then they don't get an org, and the situation they are in with name contention is exactly the same as it is now. All legacy packages operate under the current ruleset. All orgs and their packages operate under the new ruleset. Hopefully avoiding complaints of "you changed the game on us." You could also operate the org registration idea under "beta" conditions for first couple years to work out kinks in the process and warn people up-front that the rules could change during that time. By mapping all current packages under some "legacy" namespace, there should be room for backwards compatibility. So, if my projects require "flask" either pip or Warehouse knows to return "legacy/flask." Has this been proposed before? Any interest? *Randy Syring* Husband | Father | Redeemed Sinner /"For what does it profit a man to gain the whole world and forfeit his soul?" (Mark 8:36 ESV)/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From prometheus235 at gmail.com Fri Jul 22 22:13:17 2016 From: prometheus235 at gmail.com (Nick Timkovich) Date: Fri, 22 Jul 2016 21:13:17 -0500 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <494c016b-8999-70e0-9090-456709531bf9@thesyrings.us> References: <4808390596708755952@unknownmsgid> <494c016b-8999-70e0-9090-456709531bf9@thesyrings.us> Message-ID: A more conservative approach might be to flag high-risk, typo-prone package names as requiring moderator approval to register. Some combination of looking at common 404s (or whatever happens when a client asks for a non-existent package), some string metrics (Levenshtein, Jaro, whatever) to an existing package, etc. could be used. 
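The string-metric check described above could be sketched like this (illustrative only: the `POPULAR` set, the `needs_review` helper, and the threshold are invented placeholders, not anything PyPI actually runs):

```python
def levenshtein(a, b):
    # Classic dynamic-programming edit distance between two strings.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# Hypothetical list of high-value names to protect; a real deployment
# would derive this from download counts and observed 404s.
POPULAR = {"requests", "numpy", "django", "flask"}

def needs_review(name, threshold=2):
    # Flag a registration that is a near-miss (but not an exact match)
    # of a popular package name.
    return any(0 < levenshtein(name, p) <= threshold for p in POPULAR)

print(needs_review("requsets"))   # True: two edits away from "requests"
print(needs_review("requests"))   # False: exact match, not a typo
```

Note that a transposition like "requsets" costs two plain Levenshtein edits, so a threshold of 1 would miss it; Damerau-Levenshtein (which counts transpositions as one edit) would be a natural refinement.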
Light moderation should be able to tell that someone who wants to register "requsets" should maybe be challenged as to why. Someone that wants to typo-attack would need to go after many more low-value names. That said, is there some sort of rate-limiter in place to prevent someone from registering hundreds of names in a short span? Probably easy to defeat with multiple accounts though? In any event, figuring out how to best attack PyPI/pip users is the best way to improve security. On Fri, Jul 22, 2016 at 12:57 PM, Randy Syring wrote: > > On 07/22/2016 12:39 PM, Donald Stufft wrote: > > On Jul 22, 2016, at 11:47 AM, Chris Barker - NOAA Federal wrote: > > > If the core devs think it's fine and dandy like it is, we can all stop > talking about it. > > I think they're certainly a problem. The current solutions that have been > proposed have their own problems of course, and problems enough that I > don't feel comfortable implementing them. Personally I don't currently have > the time to work on a better solution but if someone did that'd be fine > with me. > > I would mention though that it's possible there *is* no solution to this > problem that doesn't bring with it its own problems that are worse than > the problem at hand. I'm not saying that's the case, but just mentioning > that it may be so. > > Is there a place where the currently proposed solutions are briefly > outlined? > > One solution that seems apparent to me is to move to an org/package > hierarchy like what GitHub has. 
By default, packages get published under a > default namespace: > > default/flask > legacy/flask > (you get the point, probably need a better name) > > unless the user has registered on pypi for an organization and publishes > the package under that org: > > pallets/flask > > You would still have contention at the org level, but my guess is this > contention would be much less significant than the current contention that > is faced with only having a single-level namespace for package names. You > could further improve this by having org creation requests either A) > approved to prevent name squatting or B) have an appeal process for org > name squatting that is blatant (e.g. I register the "google" or "pypa" org) > and/or C) expire orgs that are no longer maintained. > > The details of both A & B & C would be tricky to get right, but the rules > would at least be decided on from the beginning, so people know what the > conditions are. If they don't like those conditions, then they don't get > an org, and the situation they are in with name contention is exactly the > same as it is now. All legacy packages operate under the current ruleset. > All orgs and their packages operate under the new ruleset. Hopefully > avoiding complaints of "you changed the game on us." You could also > operate the org registration idea under "beta" conditions for first couple > years to work out kinks in the process and warn people up-front that the > rules could change during that time. > > By mapping all current packages under some "legacy" namespace, there > should be room for backwards compatibility. So, if my projects require > "flask" either pip or Warehouse knows to return "legacy/flask." > > Has this been proposed before? Any interest? > > *Randy Syring* > Husband | Father | Redeemed Sinner > > > *"For what does it profit a man to gain the whole world and forfeit his > soul?" 
(Mark 8:36 ESV)* > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Sat Jul 23 00:20:41 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 23 Jul 2016 14:20:41 +1000 Subject: [Distutils] Outdated packages on pypi In-Reply-To: <4808390596708755952@unknownmsgid> References: <4808390596708755952@unknownmsgid> Message-ID: [Good replies from Donald, Paul, et al already, but rather than replying to individual points, I figure it's best to just respond to Chris's original question with my own thoughts] On 23 July 2016 at 01:47, Chris Barker - NOAA Federal wrote: > Right now, the barrier to entry to putting a package up on PyPI is > very, very low. There are a lot of 'toy', or 'experimental', or 'well > intentioned but abandoned' projects on there. > > If there was a clear and obvious place for folks to put these packages > while they work out the kinks, and determine whether the package is > really going to fly, I think people would use it. That place is PyPI. Having a separate "maybe good, maybe bad" location for experimental packages (which is the way a lot of people use GitHub repos these days, relying on direct-from-VCS installs) leads to a persistent problem where folks later decide "I want to publish this officially", go to claim the name on PyPI as well and find they have a conflict. As further examples of similar "multiple authoritative namespaces" problems, we sometimes see folks creating Python projects specifically for Linux distros rather than upstream Python causing name collisions when the upstream project is later packaged for that distro (e.g. 
python-mock conflicting with Fedora's RPM "mock" build tool - resolved by the Fedora library being renamed to "mockbuild" while keeping "mock" as the CLI name), and we also see cross-ecosystem conflicts (e.g. python-pip and perl-pip conflicting on the "pip" CLI name, resolved by the good graces of the perl-pip maintainer in ceding the unqualified name to the Python package). You can also look at the number of semantically versioned packages that enjoy huge adoption even before the author(s) pull the trigger on a 1.0 release (e.g. requests, SQLAlchemy, the TOML spec off the top of my head), which reveals that package authors often have higher standards for "good enough for 1.0" than their prospective users do (the standard for users is generally "it's good enough to solve my current problem", while the standard for maintainers is more likely to be "it isn't a hellish nightmare to maintain as people start finding more corner cases that didn't previously occur to me/us"). The other instinctive answer is "namespaces solve this!", but the truth is they don't, as:

1. Namespaces tend to belong to organisations, and particularly for utility projects, "this utility happened to be developed by this organisation" is entirely arbitrary and mostly irrelevant (except insofar as if you trust a particular org you may trust their libraries more, but you can get that from the metadata).

2. If you want to build a genuinely inclusive open source project, branding it with the name of your company is one of the *worst* things you can do (since it prevents any chance of a feeling of shared ownership by the entire community)

3. Python already allows distribution package names to differ from import package names, as well as supporting namespace packages, which means folks *could* have adopted namespaces-by-convention if they were a genuinely compelling solution. That hasn't happened, which suggests there's an inherent user experience problem with the idea. 
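The point about namespace packages can be made concrete. The sketch below (assuming Python 3.3+ and PEP 420 implicit namespace packages; the `acme` name and module contents are invented for the example) shows two independently installable distributions sharing one import namespace:

```python
import os
import sys
import tempfile

# Two separate "distributions", each shipping one portion of the "acme"
# namespace. There is deliberately no acme/__init__.py: PEP 420 merges
# the portions found across different sys.path entries into a single
# namespace package at import time.
root = tempfile.mkdtemp()
for dist, module in [("dist_a", "util"), ("dist_b", "web")]:
    pkg_dir = os.path.join(root, dist, "acme")
    os.makedirs(pkg_dir)
    with open(os.path.join(pkg_dir, module + ".py"), "w") as f:
        f.write("NAME = %r\n" % module)
    sys.path.insert(0, os.path.join(root, dist))

from acme import util, web  # both portions resolve despite separate roots
print(util.NAME, web.NAME)
```

So the machinery for namespaced distribution exists; the argument above is that the convention never caught on, not that it is technically impossible.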
Would requests be more discoverable if Kenneth had called it "kreitz-requests" instead? What if we had "org-pocoo-flask" instead of just plain "flask"? Or "ljworld-django"? How many folks haven't even looked at "zc.interface" because the association with Zope prompts them to dismiss it out of hand? Organisational namespaces on a service like GitHub are *absolutely* useful, but what they're replacing is the model where different organisations run their own version control server (just as different organisations may run their own private Python index today), rather than being a good thing to expose directly to end users that just want to locate and use a piece of software. > However, in these discussions, I've observed a common theme: folks in > the community bring up issues about unmaintained packages, namespace > pollution, etc. the core PyPA folks respond with generally well > reasoned arguments why proposed solutions won't fly. > > But it's totally unclear to me whether the core devs don't think these > are problems worth addressing, or think they can only be addressed > with major effort that no one has time for. If we accept my premise that "single global namespace, flat except by convention" really does offer the most attractive overall user experience for a software distribution ecosystem, what's missing from the status quo on PyPI? In a word? Gardening. Post-publication curation like that provided by Linux distros and other redistributor communities (including conda-forge) can help with the "What's worth my time and attention?" 
question (we can think of this as filtering the output of an orchard, and only passing along the best fruit), but it can't address the problem of undesirable name collisions and other problems on the main index itself (we can think of this as actually weeding the original orchard, and pruning the trees when necessary). However, we don't have anyone that's specifically responsible for looking after the shared orchard that is PyPI, and this isn't something we can reasonably ask volunteers to do, as it's an intensely political and emotionally draining task where your primary responsibility is deciding if and when it's appropriate to *take people's toys away*. As an added bonus, becoming more active in content curation as a platform provider means potentially opening yourself up to lawsuits as well, if folks object to either your reclamation of a name they previously controlled, or else your refusal to reclaim a particular name for *their* purposes. So while first-come-first-served namespace management definitely doesn't provide the best possible user experience for Pythonistas, it *does* minimise the volume of ongoing namespace maintenance work required, as well as the PSF's exposure to legal liability as the platform provider. These concerns aren't something that "policy enforcement can be automated" really addresses either, as even if you have algorithmic enforcement, those "The robots have decided to take your project away from you" emails are still going to be going to real people, and those folks may still be understandably upset. "Our algorithms did it, not our staff" is also a pretty thin legal reed to pin your hopes on if somebody turns out to be upset enough to sue you about it. 
This means the entire situation changes if the PSF's Packaging Working Group receives sufficient funding (either directly through https://donate.pypi.io or through general PSF sponsorships) to staff at least one full-time "PyPI Publisher Support" role (in addition to the other points on the WG's TODO list), as well as to pay for analyses of the legal implications of having more formal content curation policies. However, in the absence of such ongoing funding support, the current laissez-faire policies will necessarily remain in place, as they're the only affordable option. Regards, Nick. P.S. If folks who work at end user organisations for whom contributing something like $10k a year to the PSF for a Silver sponsorship (https://www.python.org/psf/sponsorship/) would be a rounding error in the annual budget want to see change in this kind of area, then advocating within your organisation to become PSF sponsor members on the basis of "We want to enable the PSF to work on community infrastructure improvement activities it's currently not pursuing for lack of resources" is probably one of *the most helpful things you can do* for the wider Python community. As individuals, we can at most sustainably contribute a few hours a week as volunteers, or maybe wrangle a full-time upstream contribution role if we're lucky enough to have or find a supportive employer. By contrast, the PSF is in a position to look for ways to let volunteers focus their time and energy on activities that are more inherently rewarding, by directing funding towards the many necessary-but-draining activities that go into supporting a collaborative software development community. The more resources the PSF has available, the more time and energy it will be able to devote towards identifying and addressing unwanted sources of friction in the collaborative process. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From rosuav at gmail.com Fri Jul 22 13:48:25 2016 From: rosuav at gmail.com (Chris Angelico) Date: Sat, 23 Jul 2016 03:48:25 +1000 Subject: [Distutils] Proposal: "Install and save" Message-ID: In teaching my students how to use third-party Python modules, I generally recommend starting every project with "python3 -m venv env", and then installing dependencies into the virtual environment. They then need a requirements.txt to keep track of them. There are two workflows for that: $ pip install flask $ pip freeze >requirements.txt or: $ echo flask >>requirements.txt $ pip install -r requirements.txt Over in the JavaScript world, npm has a much tidier way to do things. I propose adding an equivalent to pip: $ pip install --save flask which will do the same as the second option above (or possibly write it out with a version number as per 'pip freeze' - bikeshed away). As well as cutting two commands down to one, it's heaps easier in the multiple installation case - "pip install --save flask sqlalchemy gunicorn" is much clearer than messing around with creating a multi-line file; and the 'pip freeze' option always ends up listing unnecessary dependencies. Thoughts? ChrisA From alex.gronholm at nextday.fi Sat Jul 23 09:32:48 2016 From: alex.gronholm at nextday.fi (Alex Grönholm) Date: Sat, 23 Jul 2016 16:32:48 +0300 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: Message-ID: <57937200.7000403@nextday.fi> I'm -1 on this because requirements.txt is not really the standard way to list dependencies. In the Python world, setup.py is the equivalent of Node's package.json. But as it is Python code, it cannot so easily be programmatically modified. 
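The proposed `--save` behaviour could be prototyped outside pip itself. A rough sketch (the `install_and_save` helper and its behaviour are hypothetical; pip has no `--save` option):

```python
import subprocess
import sys

def pins_for(freeze_output, packages):
    # Pick out the "name==version" lines that pip freeze reported for
    # just the packages the user asked to install (case-insensitive).
    pinned = {line.split("==")[0].lower(): line
              for line in freeze_output.splitlines() if "==" in line}
    return [pinned[p.lower()] for p in packages if p.lower() in pinned]

def install_and_save(packages, req_file="requirements.txt"):
    # Install, then append exact pins for the new packages to the file,
    # combining the two workflows described above into one step.
    subprocess.check_call([sys.executable, "-m", "pip", "install"] + packages)
    frozen = subprocess.check_output(
        [sys.executable, "-m", "pip", "freeze"], universal_newlines=True)
    with open(req_file, "a") as f:
        for line in pins_for(frozen, packages):
            f.write(line + "\n")
```

Unlike a raw `pip freeze > requirements.txt`, this records only the packages explicitly requested, sidestepping the "unnecessary dependencies" problem mentioned in the proposal.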
22.07.2016, 20:48, Chris Angelico wrote: > In teaching my students how to use third-party Python modules, I > generally recommend starting every project with "python3 -m venv env", > and then installing dependencies into the virtual environment. They then > need a requirements.txt to keep track of them. There are two workflows > for that: > > $ pip install flask > $ pip freeze >requirements.txt > > or: > > $ echo flask >>requirements.txt > $ pip install -r requirements.txt > > Over in the JavaScript world, npm has a much tidier way to do things. > I propose adding an equivalent to pip: > > $ pip install --save flask > > which will do the same as the second option above (or possibly write > it out with a version number as per 'pip freeze' - bikeshed away). As > well as cutting two commands down to one, it's heaps easier in the > multiple installation case - "pip install --save flask sqlalchemy > gunicorn" is much clearer than messing around with creating a > multi-line file; and the 'pip freeze' option always ends up listing > unnecessary dependencies. > > Thoughts? > > ChrisA > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig From thomas at kluyver.me.uk Sat Jul 23 10:04:45 2016 From: thomas at kluyver.me.uk (Thomas Kluyver) Date: Sat, 23 Jul 2016 15:04:45 +0100 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: <57937200.7000403@nextday.fi> References: <57937200.7000403@nextday.fi> Message-ID: <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> On Sat, Jul 23, 2016, at 02:32 PM, Alex Grönholm wrote: > I'm -1 on this because requirements.txt is not really the standard way > to list dependencies. > In the Python world, setup.py is the equivalent of Node's package.json. > But as it is > Python code, it cannot so easily be programmatically modified. 
Packaging based on declarative metadata: http://flit.readthedocs.io/en/latest/ We have a bit of a divide. Specifying dependencies in setup.py (or flit.ini, or upcoming pyproject.toml) is the standard for library and tool packages that are intended to be published on PyPI and installed with pip. requirements.txt is generally used for applications which will be distributed or deployed by other means. As I understand it, in the Javascript world package.json is used in both cases. Is that something Python should try to emulate? Is it hard to achieve given the limitations of setup.py that you pointed out? Thomas From alex.gronholm at nextday.fi Sat Jul 23 10:10:53 2016 From: alex.gronholm at nextday.fi (Alex Grönholm) Date: Sat, 23 Jul 2016 17:10:53 +0300 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> Message-ID: <57937AED.9000200@nextday.fi> 23.07.2016, 17:04, Thomas Kluyver wrote: > On Sat, Jul 23, 2016, at 02:32 PM, Alex Grönholm wrote: >> I'm -1 on this because requirements.txt is not really the standard way >> to list dependencies. >> In the Python world, setup.py is the equivalent of Node's package.json. >> But as it is >> Python code, it cannot so easily be programmatically modified. > Packaging based on declarative metadata: > http://flit.readthedocs.io/en/latest/ > > > We have a bit of a divide. Specifying dependencies in setup.py (or > flit.ini, or upcoming pyproject.toml) is the standard for library and > tool packages that are intended to be published on PyPI and installed > with pip. requirements.txt is generally used for applications which will > be distributed or deployed by other means. > > As I understand it, in the Javascript world package.json is used in both > cases. Is that something Python should try to emulate? 
Is it hard to > achieve given the limitations of setup.py that you pointed out? This topic has been beaten to death. There is no way to cram the complexities of C extension compilation setup into purely declarative metadata. Distutils2 tried and failed. Just look at the setup.py files of some popular projects and imagine all that logic expressed in declarative metadata. > Thomas > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig From dholth at gmail.com Sat Jul 23 10:43:36 2016 From: dholth at gmail.com (Daniel Holth) Date: Sat, 23 Jul 2016 14:43:36 +0000 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: <57937AED.9000200@nextday.fi> References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> Message-ID: Here is my attempt. The SConstruct (like a Makefile) builds the extension. The .toml file gives the static metadata. No need to put the two in the same file. https://bitbucket.org/dholth/cryptacular/src/tip/SConstruct https://bitbucket.org/dholth/cryptacular/src/tip/pyproject.toml On Sat, Jul 23, 2016 at 10:11 AM Alex Grönholm wrote: > 23.07.2016, 17:04, Thomas Kluyver wrote: > On Sat, Jul 23, 2016, at 02:32 PM, Alex Grönholm wrote: > >> I'm -1 on this because requirements.txt is not really the standard way > >> to list dependencies. > >> In the Python world, setup.py is the equivalent of Node's package.json. > >> But as it is > >> Python code, it cannot so easily be programmatically modified. > > Packaging based on declarative metadata: > > http://flit.readthedocs.io/en/latest/ > > > > > > We have a bit of a divide. Specifying dependencies in setup.py (or > > flit.ini, or upcoming pyproject.toml) is the standard for library and > > tool packages that are intended to be published on PyPI and installed > > with pip. 
requirements.txt is generally used for applications which will > > be distributed or deployed by other means. > > > > As I understand it, in the Javascript world package.json is used in both > > cases. Is that something Python should try to emulate? Is it hard to > > achieve given the limitations of setup.py that you pointed out? > This topic has been beaten to death. There is no way to cram the > complexities of C extension compilation setup into purely declarative > metadata. Distutils2 tried and failed. Just look at the setup.py files > of some popular projects and imagine all that logic expressed in > declarative metadata. > > Thomas > > _______________________________________________ > > Distutils-SIG maillist - Distutils-SIG at python.org > > https://mail.python.org/mailman/listinfo/distutils-sig > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From alex.gronholm at nextday.fi Sat Jul 23 11:37:02 2016 From: alex.gronholm at nextday.fi (Alex Grönholm) Date: Sat, 23 Jul 2016 18:37:02 +0300 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> Message-ID: <57938F1E.10502@nextday.fi> pip doesn't yet support pyproject.toml, does it? 23.07.2016, 17:43, Daniel Holth wrote: > Here is my attempt. The SConstruct (like a Makefile) builds the > extension. The .toml file gives the static metadata. No need to put > the two in the same file. 
> > https://bitbucket.org/dholth/cryptacular/src/tip/SConstruct > > https://bitbucket.org/dholth/cryptacular/src/tip/pyproject.toml > > On Sat, Jul 23, 2016 at 10:11 AM Alex Grönholm > > wrote: > > 23.07.2016, 17:04, Thomas Kluyver wrote: > On Sat, Jul 23, 2016, at 02:32 PM, Alex Grönholm wrote: > >> I'm -1 on this because requirements.txt is not really the > standard way > >> to list dependencies. > >> In the Python world, setup.py is the equivalent of Node's > package.json. > >> But as it is > >> Python code, it cannot so easily be programmatically modified. > > Packaging based on declarative metadata: > > http://flit.readthedocs.io/en/latest/ > > > > > > We have a bit of a divide. Specifying dependencies in setup.py (or > > flit.ini, or upcoming pyproject.toml) is the standard for > library and > > tool packages that are intended to be published on PyPI and > installed > > with pip. requirements.txt is generally used for applications > which will > > be distributed or deployed by other means. > > > > As I understand it, in the Javascript world package.json is used > in both > > cases. Is that something Python should try to emulate? Is it hard to > > achieve given the limitations of setup.py that you pointed out? > This topic has been beaten to death. There is no way to cram the > complexities of C extension compilation setup into purely declarative > metadata. Distutils2 tried and failed. Just look at the setup.py files > of some popular projects and imagine all that logic expressed in > declarative metadata. > > Thomas > > _______________________________________________ > > Distutils-SIG maillist - Distutils-SIG at python.org > > > https://mail.python.org/mailman/listinfo/distutils-sig > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dholth at gmail.com Sat Jul 23 13:35:30 2016 From: dholth at gmail.com (Daniel Holth) Date: Sat, 23 Jul 2016 17:35:30 +0000 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: <57938F1E.10502@nextday.fi> References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> <57938F1E.10502@nextday.fi> Message-ID: Not yet. Someone should fix that. On Sat, Jul 23, 2016, 11:37 Alex Grönholm wrote: > pip doesn't yet support pyproject.toml, does it? > > > 23.07.2016, 17:43, Daniel Holth wrote: > > Here is my attempt. The SConstruct (like a Makefile) builds the extension. > The .toml file gives the static metadata. No need to put the two in the > same file. > > https://bitbucket.org/dholth/cryptacular/src/tip/SConstruct > > https://bitbucket.org/dholth/cryptacular/src/tip/pyproject.toml > > On Sat, Jul 23, 2016 at 10:11 AM Alex Grönholm > wrote: > >> 23.07.2016, 17:04, Thomas Kluyver wrote: >> > On Sat, Jul 23, 2016, at 02:32 PM, Alex Grönholm wrote: >> >> I'm -1 on this because requirements.txt is not really the standard way >> >> to list dependencies. >> >> In the Python world, setup.py is the equivalent of Node's package.json. >> >> But as it is >> >> Python code, it cannot so easily be programmatically modified. >> > Packaging based on declarative metadata: >> > http://flit.readthedocs.io/en/latest/ >> > >> > >> > We have a bit of a divide. Specifying dependencies in setup.py (or >> > flit.ini, or upcoming pyproject.toml) is the standard for library and >> > tool packages that are intended to be published on PyPI and installed >> > with pip. requirements.txt is generally used for applications which will >> > be distributed or deployed by other means. >> > >> > As I understand it, in the Javascript world package.json is used in both >> > cases. Is that something Python should try to emulate? Is it hard to >> > achieve given the limitations of setup.py that you pointed out? 
>> This topic has been beaten to death. There is no way to cram the >> complexities of C extension compilation setup into purely declarative >> metadata. Distutils2 tried and failed. Just look at the setup.py files >> of some popular projects and imagine all that logic expressed in >> declarative metadata. >> > Thomas >> > _______________________________________________ >> > Distutils-SIG maillist - Distutils-SIG at python.org >> > https://mail.python.org/mailman/listinfo/distutils-sig >> >> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> https://mail.python.org/mailman/listinfo/distutils-sig >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicholas.chammas at gmail.com Sat Jul 23 14:40:26 2016 From: nicholas.chammas at gmail.com (Nicholas Chammas) Date: Sat, 23 Jul 2016 18:40:26 +0000 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI Message-ID: This may be a heretical idea, and it's definitely not something anyone is likely to take on anytime soon, but I'd like to put it up for discussion and see what people think. I often find myself wanting to show gratitude to the authors or maintainers of a package by giving money. Most of the time, it is easier for me to give money than time. After all, contributing time in a meaningful way to a project can be quite difficult. I wonder how many others find themselves in a similar situation, wanting to chip a few bucks in as a token of gratitude for someone's work, either one-time or on an ongoing basis. You can already do this today, of course, with services like PayPal, Gratipay, and Salt. But the process is scattered and different for each Python package and the group of people behind it. What's more, it seems wrong that these third-party services should capture some of the value generated by the Python community (e.g. 
by charging some transaction fee) without any of it going back to support that community (e.g. via the PSF). So that raises the question: How about if people could contribute money to the people behind projects they felt grateful for via PyPI? PyPI is already the community's central package hub. Perhaps it could also be the community's central hub for contributing money. The central goal would be to create a low-friction ecosystem of monetary contributions that benefits Python package authors/maintainers, as well as the larger Python ecosystem. At a high level, the ecosystem would be opt-in, and contributions would go directly to their intended recipients, with tiny cuts of each transaction going to the payment processor and some Python community organization like the PSF. I know a more concrete proposal would have to address a lot of details (e.g. like how to split contributions across multiple maintainers), and perhaps there is no way to find the resources to build or maintain such a thing in the first place. But just for now I'd like to separate the essence of the idea from the practical concerns of implementing it. Assume the work is already done, and we have such a system of monetary contributions in place today. My questions are:

- Would such a system benefit the Python community?
- Would the cut of transactions that went to the PSF (or similar) be enough to cover the cost of maintaining the system and meaningfully impact other Python efforts?
- Would such a system create perverse incentives?
- Would it just be a dud that very few people would use?

I'm trying to get a feel for whether the idea has any potential. Nick -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From donald at stufft.io Sat Jul 23 15:11:56 2016 From: donald at stufft.io (Donald Stufft) Date: Sat, 23 Jul 2016 15:11:56 -0400 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: > On Jul 23, 2016, at 2:40 PM, Nicholas Chammas wrote: > > I know a more concrete proposal would have to address a lot of details (e.g. like how to split contributions across multiple maintainers), and perhaps there is no way to find the resources to build or maintain such a thing in the first place. But just for now I'd like to separate the essence of the idea from the practical concerns of implementing it. I'm mulling over the idea in my head, but one other thing we'd need to figure out is the *legality* of doing this and if it's something the PSF is willing to do at all. -- Donald Stufft -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Sat Jul 23 15:22:36 2016 From: njs at pobox.com (Nathaniel Smith) Date: Sat, 23 Jul 2016 12:22:36 -0700 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: On Jul 23, 2016 11:40 AM, "Nicholas Chammas" wrote: > [...] > You can already do this today, of course, with services like PayPal, Gratipay, and Salt. But the process is scattered and different for each Python package and the group of people behind it. What's more, it seems wrong that these third-party services should capture some of the value generated by the Python community (e.g. by charging some transaction fee) without any of it going back to support that community (e.g. via the PSF). These kinds of money transfer services are pretty competitive, and AFAIK their transaction fees are largely set by companies like VISA, plus the actual costs of running this kind of business. 
(And this is much more than just shuffling bits around - there are all kinds of complicated regulations you have to comply with around issues like "how do you know that organized crime isn't using your service for money laundering". IIRC every country and US state has their own idea about what counts as adequate safeguards.) I think it's vanishingly unlikely that the PSF could provide these services more efficiently than these dedicated companies. So I don't think the PSF is going to get rich on transaction fees. OTOH, if we give up on that part of the idea, then it becomes much easier :-). It'd be straightforward for PyPI to provide a "how to donate to this project" box on each project page, that has links to whatever donation processing service(s) the project prefers. -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From glyph at twistedmatrix.com Sat Jul 23 15:22:54 2016 From: glyph at twistedmatrix.com (Glyph Lefkowitz) Date: Sat, 23 Jul 2016 12:22:54 -0700 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: <3361DE5B-4D35-47EC-84F8-013EEA29FE47@twistedmatrix.com> > On Jul 23, 2016, at 12:11 PM, Donald Stufft wrote: > > >> On Jul 23, 2016, at 2:40 PM, Nicholas Chammas > wrote: >> >> I know a more concrete proposal would have to address a lot of details (e.g. like how to split contributions across multiple maintainers), and perhaps there is no way to find the resources to build or maintain such a thing in the first place. But just for now I?d like to separate essence of idea from the practical concerns of implementing it. > > > I?m mulling over the idea in my head, but one other thing we?d need to figure out is the *legality* of doing this and if it?s something the PSF is willing to do at all. This was my initial reaction as well. It would be awesome if it worked! 
It would potentially go a long way to addressing the now much-discussed problem of funding open source infrastructure. But it is also a legal and financial mine-field. Even if a lawyer says it's OK and it's possible to comply with the law, you still generate a lot of work for an accountant to actually do the complying. https://gratipay.com is a good, recent example of an apparently simple idea like this running into severe legal consequences and nearly imploding as a result. Another potential problem that may not be initially obvious: due to the somewhat ambiguous nature of the funding structure, they also became a popular payment processor for nazis and white supremacists, since it's hard to get paid for producing nazi propaganda on other platforms. Of course, PyPI might always be used as an update platform for malware or a C&C control point too, so it's not like there are no risks in operating it as it currently stands, but money always has the potential to make things worse. I don't want to be doom-and-gloom here; in fact I would _very_ much like to see this project happen. I just think that in order to do it in a way which doesn't backfire horribly, it has to be responsibly staffed at the outset so that problems like these, that we know about, can be addressed up front, and the inevitable ones that don't seem obvious at the moment have a clearly responsible person to go fix them as they arise, in a timely way. -glyph -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From steve.dower at python.org Sat Jul 23 15:25:06 2016 From: steve.dower at python.org (Steve Dower) Date: Sat, 23 Jul 2016 12:25:06 -0700 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: <5793C492.30904@python.org> On 23Jul2016 1211, Donald Stufft wrote: > >> On Jul 23, 2016, at 2:40 PM, Nicholas Chammas >> > wrote: >> >> I know a more concrete proposal would have to address a lot of details >> (e.g. like how to split contributions across multiple maintainers), >> and perhaps there is no way to find the resources to build or maintain >> such a thing in the first place. But just for now I'd like to separate >> essence of idea from the practical concerns of implementing it. > > > I'm mulling over the idea in my head, but one other thing we'd need to > figure out is the *legality* of doing this and if it's something the PSF > is willing to do at all. Maybe it's as simple as a user profile "Donations URL" field that is also listed on any packages owned by that user? Or possibly a per-package URL? (The latter may work out better for people who manage both personal and work packages, which is currently the system we're using at Microsoft until we get a centralised publishing system set up.) Cheers, Steve From dholth at gmail.com Sat Jul 23 15:35:26 2016 From: dholth at gmail.com (Daniel Holth) Date: Sat, 23 Jul 2016 19:35:26 +0000 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: <5793C492.30904@python.org> References: <5793C492.30904@python.org> Message-ID: Have you seen https://rubytogether.org ? It looks like exactly what you are proposing, only with more blocks. On Sat, Jul 23, 2016 at 3:25 PM Steve Dower wrote: > On 23Jul2016 1211, Donald Stufft wrote: > > > >> On Jul 23, 2016, at 2:40 PM, Nicholas Chammas > >> > wrote: > >> > >> I know a more concrete proposal would have to address a lot of details > >> (e.g. 
like how to split contributions across multiple maintainers), > >> and perhaps there is no way to find the resources to build or maintain > >> such a thing in the first place. But just for now I'd like to separate > >> essence of idea from the practical concerns of implementing it. > > > > > > I'm mulling over the idea in my head, but one other thing we'd need to > > figure out is the *legality* of doing this and if it's something the PSF > > is willing to do at all. > > Maybe it's as simple as a user profile "Donations URL" field that is > also listed on any packages owned by that user? Or possibly a > per-package URL? (The latter may work out better for people who manage > both personal and work packages, which is currently the system we're > using at Microsoft until we get a centralised publishing system set up.) > > Cheers, > Steve > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicholas.chammas at gmail.com Sat Jul 23 15:54:10 2016 From: nicholas.chammas at gmail.com (Nicholas Chammas) Date: Sat, 23 Jul 2016 19:54:10 +0000 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: On Sat, Jul 23, 2016 at 3:22 PM Nathaniel Smith wrote: > On Jul 23, 2016 11:40 AM, "Nicholas Chammas" > wrote: > > > [...] > > > > You can already do this today, of course, with services like PayPal, > Gratipay, and Salt. But the process is scattered and different for each > Python package and the group of people behind it. What's more, it seems > wrong that these third-party services should capture some of the value > generated by the Python community (e.g. by charging some transaction fee) > without any of it going back to support that community (e.g. via the PSF). 
> > These kinds of money transfer services are pretty competitive, and AFAIK > their transaction fees are largely set by companies like VISA, plus the > actual costs of running this kind of business. (And this is much more than > just shuffling bits around - there are all kinds of complicated regulations > you have to comply with around issues like "how do you know that organized > crime isn't using your service for money laundering". IIRC every country > and US state has their own idea about what counts as adequate safeguards.) > I think it's vanishingly unlikely that the PSF could provide these services > more efficiently than these dedicated companies. So I don't think the PSF > is going to get rich on transaction fees. > Agreed that it doesn't make sense for us to reinvent this wheel, at least not unless the idea is massively successful. I was thinking the PSF could perhaps start off by partnering with one of these organizations who have already figured this stuff out. The PSF/PyPI would bring a whole new group of users to their service, and in return the service provider (e.g. Salt) would give the PSF some cut of their fee. Not sure if that's viable (maybe the margins are too low?), but that's what I was thinking. Nick -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicholas.chammas at gmail.com Sat Jul 23 16:20:32 2016 From: nicholas.chammas at gmail.com (Nicholas Chammas) Date: Sat, 23 Jul 2016 20:20:32 +0000 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: <5793C492.30904@python.org> Message-ID: On Sat, Jul 23, 2016 at 3:35 PM Daniel Holth wrote: > Have you seen https://rubytogether.org ? It looks like exactly what you > are proposing, only with more blocks. > Interesting. I hadn't heard of that before now. 
I'm not sure what you mean by "with more blocks", but that idea seems focused on funding core Ruby infrastructure, whereas I'm proposing a way for users to fund any Python project they want, with the added benefit that by funding those projects they automatically also fund core Python infrastructure too. A rough -- but in my view very apt -- analogy for what I'm proposing would be the iOS App Store. Users buy apps they want; when that happens, the app developers benefit by getting paid for their work, and the general iOS ecosystem also benefits since the maintainer of that ecosystem, Apple, gets a cut of those payments. Of course, in our case we are talking about voluntary monetary contributions and not prices, and we are talking about tiny fees and not 15-30% cuts. But the basic idea is the same: Fund the core infrastructure by capturing some of the value generated by that infrastructure. The core infrastructure would be things like PyPI, tooling like pip, and Python itself. The value generated by that infrastructure, for the purposes of this discussion, would be the monetary contributions people made to projects they appreciated. And capturing some of that value would be taking a tiny cut out of those contributions and reinvesting it in the core. Nick -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Sat Jul 23 19:40:21 2016 From: wes.turner at gmail.com (Wes Turner) Date: Sat, 23 Jul 2016 18:40:21 -0500 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: <5793C492.30904@python.org> Message-ID: - https://en.wikipedia.org/wiki/Micropayment - https://en.wikipedia.org/wiki/Micro-donation - https://en.wikipedia.org/wiki/Gratis_versus_libre - https://en.wikipedia.org/wiki/Business_models_for_open-source_software#Voluntary_donations - http://alternativeto.net/software/gittip/ (TIL GitTip is now Gratipay) - https://gratipay.com/about/ ... 
https://en.wikipedia.org/wiki/Capitalization_table ... https://en.wikipedia.org/wiki/Bug_bounty_program ... On Saturday, July 23, 2016, Nicholas Chammas wrote: > On Sat, Jul 23, 2016 at 3:35 PM Daniel Holth > wrote: > >> Have you seen https://rubytogether.org ? It looks like exactly what you >> are proposing, only with more blocks. >> > > Interesting. I hadn't heard of that before now. > > I'm not sure what you mean by "with more blocks", but that idea seems > focused on funding core Ruby infrastructure, whereas I'm proposing a way > for users to fund any Python project they want, with the added benefit that > by funding those projects they automatically also fund core Python > infrastructure too. > > A rough -- but in my view very apt -- analogy for what I'm proposing would > be the iOS App Store. Users buy apps they want; when that happens, the app > developers benefit by getting paid for their work, and the general iOS > ecosystem also benefits since the maintainer of that ecosystem, Apple, gets > a cut of those payments. > > Of course, in our case we are talking about voluntary monetary > contributions and not prices, and we are talking about tiny fees and not > 15-30% cuts. But the basic idea is the same: Fund the core infrastructure > by capturing some of the value generated by that infrastructure. > > The core infrastructure would be things like PyPI, tooling like pip, and > Python itself. The value generated by that infrastructure, for the purposes > of this discussion, would be the monetary contributions people made to > projects they appreciated. And capturing some of that value would be taking > a tiny cut out of those contributions and reinvesting it in the core. > > Nick > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ncoghlan at gmail.com Sat Jul 23 22:53:06 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 24 Jul 2016 12:53:06 +1000 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> Message-ID: On 24 July 2016 at 00:04, Thomas Kluyver wrote: > As I understand it, in the Javascript world package.json is used in both > cases. Is that something Python should try to emulate? Is it hard to > achieve given the limitations of setup.py that you pointed out? There are other problems with the "one file to rule them all" approach, one of which is that libraries should aim to be as loose as is practical in declaring their dependencies, while applications should treat dependency updates like any other code change: require them to go through pre-merge CI by pinning an exact version of the dependency in requirements.txt. Donald wrote a good explanation of the distinction a few years ago: https://caremad.io/2013/07/setup-vs-requirement/ The best current toolset I know for managing the distinction in the case of an application venv or a user's ad hoc working environment is actually pip-tools: https://github.com/nvie/pip-tools#readme That splits the "explicitly declared application dependencies" out into a requirements.in file, which pip-compile then turns into a conventional (but fully pinned) requirements.txt file. The pip-sync tool then says "make this environment exactly match this requirements file". 
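The requirements.in to requirements.txt flow described above can be sketched in a few lines. Real pip-compile resolves the full dependency tree against PyPI; in this simplified illustration the resolved versions are just a stand-in dictionary, and the version numbers are hypothetical.

```python
# Minimal sketch of the pip-compile idea: turn loosely declared
# dependencies (requirements.in) into fully pinned output (requirements.txt).
# The resolved-version dict stands in for a real resolver; versions are made up.
resolved = {"flask": "0.11.1", "sqlalchemy": "1.0.14"}

def compile_requirements(declared):
    """Pin each declared package name to its resolved version, sorted."""
    return ["%s==%s" % (name, resolved[name]) for name in sorted(declared)]

print(compile_requirements(["sqlalchemy", "flask"]))
```

The loose file stays human-edited and minimal, while the pinned file is fully machine-generated, which is what makes the "both files go into source control" workflow reproducible.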
Personally, I could definitely see a feature like "pip-newdep requirements.in '' ''" being relevant in pip-tools as a shorthand for something like: echo '' >> requirements.in && echo '' >> requirements.in && pip-compile requirements.in && pip install -r requirements.txt (A full pip-sync wouldn't be appropriate though, since that may uninstall dev requirements added via a separate requirements file) I'm not sure it makes sense for pip itself though - pip's a bit more neutral than that on precisely how people do their environment management (even "pip install -r requirements.txt" is just an emergent convention). Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Sat Jul 23 23:19:45 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 24 Jul 2016 13:19:45 +1000 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: On 24 July 2016 at 04:40, Nicholas Chammas wrote: > This may be a heretical idea, and it's definitely not something anyone is > likely to take on anytime soon, but I'd like to put it up for discussion and > see what people think. The PSF wouldn't want to get involved in the actual money transfer (facilitating international monetary transfers is complicated at the best of times, facilitating them without jeopardising the PSF's public interest charity status would be even worse), but one of the things I'd personally like to see happen post Warehouse migration is along the lines of what Nathaniel Smith suggested: we could adjust the publisher facing UX to explicitly nudge people towards explaining how ongoing development of their project is funded, and make it not only acceptable, but encouraged, for people to engage in fundraising activities on their project pages. The public project pages would then include that sustainability information, and we'd also make it available as part of the project metadata available through the service API. 
It would then be up to publishers to decide if and how they wanted to seek funds (PayPal, Patreon, Gratipay, BountySource Salt, etc), rather than the PyPA or the PSF making that decision on their behalf. (However, we could also consider being open to code contributions from those kinds of companies that made it easy for publishers to integrate their services with PyPI) If folks publishing software through PyPI didn't personally want or need additional funds (e.g. when it's a fully funded institutional project, or if it's someone's personal side project that they have no interest in turning into a paid job), then we could let them opt in to using the relevant space on the project page to display the logo(s) of the sponsoring institution(s), encourage contributions to the PSF, or leave it blank entirely. Cheers, Nick. P.S. As far as RubyTogether goes, that's closer to what the PSF is doing with the Packaging Working Group - providing a centrally administered shared funding pool for sustaining engineering on common community infrastructure. 
The Python equivalent to that is now for organisations to either sign up as PSF Sponsors (or at least explain to the PSF what they would like to see in improved expenditure reporting before they would sign up as sponsors), or else to make an earmarked donation specifically to the community packaging infrastructure via https://donate.pypi.io/ It's not the same process or problem as the "help Python project users to effectively manage their supply chain by providing them with ways to fund Python project publishers" -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From wes.turner at gmail.com Sun Jul 24 01:07:19 2016 From: wes.turner at gmail.com (Wes Turner) Date: Sun, 24 Jul 2016 00:07:19 -0500 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: https://donate.pypi.io/ - Minimum donation: $5 - 1 year PSF Associate Member: $99 ( https://en.wikipedia.org/wiki/CiviCRM ) SSL certs were a primary cost before letsencrypt (which is sponsored by a large number of organizations): - https://letsencrypt.org/ - https://mozilla.github.io/server-side-tls/ssl-config-generator/ ... Third party payment services handle PCI-DSS (: https://en.wikipedia.org/wiki/Payment_Card_Industry_Data_Security_Standard On Saturday, July 23, 2016, Nick Coghlan wrote: > On 24 July 2016 at 04:40, Nicholas Chammas > wrote: > > This may be a heretical idea, and it?s definitely not something anyone is > > likely to take on anytime soon, but I?d like to put it up for discussion > and > > see what people think. 
> > The PSF wouldn't want to get involved in the actual money transfer > (facilitating international monetary transfers is complicated at the > best of times, facilitating them without jeopardising the PSF's public > interest charity status would be even worse), but one of the things > I'd personally like to see happen post Warehouse migration is along > the lines of what Nathaniel Smith suggested: we could adjust the > publisher facing UX to explicitly nudge people towards explaining how > ongoing development of their project is funded, and make it not only > acceptable, but encouraged, for people to engage in fundraising > activities on their project pages. The public project pages would then > include that sustainability information, and we'd also make it > available as part of the project metadata available through the > service API. > > It would then be up to publishers to decide if and how how they wanted > to seek funds (PayPal, Patreon, Gratipay, BountySource Salt, etc), > rather than the PyPA or the PSF making that decision on their behalf. > (However, we could also consider being open to code contributions from > those kinds of companies that made it easy for publishers to integrate > their services with PyPI) > > If folks publishing software through PyPI didn't personally want or > need additional funds (e.g. when it's a fully funded institutional > project, or if it's someone's personal side project that they have no > interest in turning into a paid job), then we could let them opt in to > using the relevant space on the project page to display the logo(s) of > the sponsoring institution(s), encourage contributions to the PSF, or > leave it blank entirely. > > Cheers, > Nick. > > P.S. As far as RubyTogether goes, that's closer to what the PSF is > doing with the Packaging Working Group - providing a centrally > administered shared funding pool for sustaining engineering on common > community infrastructure. 
The Python equivalent to that is now for > organisations to either sign up as PSF Sponsors (or at least explain > to the PSF what they would like to see in improved expenditure > reporting before they would sign up as sponsors), or else to make an > earmarked donation specifically to the community packaging > infrastructure via https://donate.pypi.io/ > > It's not the same process or problem as the "help Python project users > to effectively manage their supply chain by providing them with ways > to fund Python project publishers" > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, > Australia > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pombredanne at nexb.com Sun Jul 24 02:33:57 2016 From: pombredanne at nexb.com (Philippe Ombredanne) Date: Sun, 24 Jul 2016 08:33:57 +0200 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> <57938F1E.10502@nextday.fi> Message-ID: On Sat, Jul 23, 2016, 17:04, Thomas Kluyver wrote: > As I understand it, in the Javascript world package.json is used in > both > cases. Is that something Python should try to emulate? Is it hard to > achieve given the limitations of setup.py that you pointed out? Thomas: this is not entirely correct: a node package with native code needs to have node-gyp [1] installed, which in turn relies on the gyp-based Python tool ;) to actually build the natives using actual makefile-like build scripts. IMO, any kind of mildly engaged package build for NPM requires a rather more involved effort than its Python world equivalent. And you can see the JS packaging world walking somewhat of the same path that has been walked here before. 
To get some feeling of what they are going through, read this thread [2] On Sat, Jul 23, 2016 at 10:11 AM Alex Grönholm wrote: > This topic has been beaten to death. There is no way to cram the > complexities of C extension compilation setup into purely declarative > metadata. Distutils2 tried and failed. Just look at the setup.py files > of some popular projects and imagine all that logic expressed in > declarative metadata. Indeed! There is no magic and no free lunch there. [1] https://github.com/nodejs/node-gyp [2] https://github.com/npm/npm/issues/1891 -- Cordially Philippe Ombredanne From rosuav at gmail.com Sat Jul 23 09:56:34 2016 From: rosuav at gmail.com (Chris Angelico) Date: Sat, 23 Jul 2016 23:56:34 +1000 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: <57937200.7000403@nextday.fi> References: <57937200.7000403@nextday.fi> Message-ID: On Sat, Jul 23, 2016 at 11:32 PM, Alex Grönholm wrote: > I'm -1 on this because requirements.txt is not really the standard way to > list dependencies. > In the Python world, setup.py is the equivalent of Node's package.json. But > as it is > Python code, it cannot so easily be programmatically modified. For other packages, yes; what about for applications? How is a web application to best say "I need Flask, and markdown, and SQLAlchemy"? I'm willing to consider workflow changes (among other advantages, they don't have to wait until the new pip gets pushed out to everyone). 
ChrisA From rosuav at gmail.com Sat Jul 23 23:50:19 2016 From: rosuav at gmail.com (Chris Angelico) Date: Sun, 24 Jul 2016 13:50:19 +1000 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> Message-ID: On Sun, Jul 24, 2016 at 12:53 PM, Nick Coghlan wrote: > Personally, I could definitely see a feature like "pip-newdep > requirements.in '' ''" being relevant in pip-tools as a > shorthand for something like: > > echo '' >> requirements.in && echo '' >> > requirements.in && pip-compile requirements.in && pip install -r > requirements.txt Can the file name be made implicit? If so, it would come to almost what I was originally looking for: pip-newdep sqlalchemy flask and it'd DTRT as regards building the requirements. If I understand you correctly, requirements.in would store only the names of packages required (no version info), and then requirements.txt would look like the output of 'pip freeze' but only for the packages listed in requirements.in? Because that would be _perfect_. Both files go into source control, and the task "update to the latest versions of things and check if it all works" would be straight-forward. ChrisA From rosuav at gmail.com Sat Jul 23 23:54:54 2016 From: rosuav at gmail.com (Chris Angelico) Date: Sun, 24 Jul 2016 13:54:54 +1000 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> Message-ID: On Sun, Jul 24, 2016 at 1:50 PM, Chris Angelico wrote: > then requirements.txt would look like > the output of 'pip freeze' but only for the packages listed in > requirements.in? Oops, should have read the readme first. It's listing the dependencies of those packages, too. 
Which is not a problem; the main issue I have with 'pip freeze' is that it often picks up standard library modules like argparse and wsgiref, which then cause installation issues. ChrisA From ncoghlan at gmail.com Sun Jul 24 10:23:20 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 25 Jul 2016 00:23:20 +1000 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> Message-ID: On 24 July 2016 at 13:50, Chris Angelico wrote: > On Sun, Jul 24, 2016 at 12:53 PM, Nick Coghlan wrote: >> Personally, I could definitely see a feature like "pip-newdep >> requirements.in '' ''" being relevant in pip-tools as a >> shorthand for something like: >> >> echo '' >> requirements.in && echo '' >> >> requirements.in && pip-compile requirements.in && pip install -r >> requirements.txt > > Can the file name be made implicit? "requirements.in" is the default in pip-compile if you don't otherwise specify a source file, so yes, you could rely on that. However, I'm not really the right person to ask - that would be Vincent Driessen, as the author of pip-tools :) > and it'd DTRT as regards building the requirements. If I understand > you correctly, requirements.in would store only the names of packages > required (no version info), You can still put version constraints in requirements.in, but they'd generally only be for known incompatibilities (i.e. minimum versions, plus sometimes maximum versions if there's a backwards compatibility break) > and then requirements.txt would look like > the output of 'pip freeze' but only for the packages listed in > requirements.in? Because that would be _perfect_. Both files go into > source control, and the task "update to the latest versions of things > and check if it all works" would be straight-forward. 
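The pip freeze annoyance Chris describes above (stdlib backports like argparse and wsgiref leaking into freeze output and then breaking installation) can be sketched as a simple post-filter. The blacklist here is a small hand-picked set for illustration, not an exhaustive list of stdlib names.

```python
# Sketch of filtering stdlib-shadowing entries out of `pip freeze` output.
# STDLIB_BACKPORTS is a tiny illustrative set, not a complete inventory.
STDLIB_BACKPORTS = {"argparse", "wsgiref"}

def clean_freeze(frozen_lines):
    """Drop freeze entries whose project name shadows a stdlib module."""
    keep = []
    for line in frozen_lines:
        name = line.split("==")[0].lower()
        if name not in STDLIB_BACKPORTS:
            keep.append(line)
    return keep

print(clean_freeze(["Flask==0.11.1", "argparse==1.2.1", "wsgiref==0.1.2"]))
```

This is also roughly what restricting the output to the packages named in requirements.in achieves: entries that were never asked for simply never make it into the pinned file.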
Yep, and if I understand the logic correctly, "pip-compile" will add new dependencies, but leave existing ones alone if they still meet the requirements, while "pip-compile --upgrade" will try to upgrade everything to the latest version. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From brett at python.org Sun Jul 24 12:21:33 2016 From: brett at python.org (Brett Cannon) Date: Sun, 24 Jul 2016 16:21:33 +0000 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> <57938F1E.10502@nextday.fi> Message-ID: On Sat, 23 Jul 2016 at 10:36 Daniel Holth wrote: > Not yet. Someone should fix that. > There is an issue tracking that if someone gets adventurous enough to write up a PR: https://github.com/pypa/pip/issues/3691 . -Brett > > On Sat, Jul 23, 2016, 11:37 Alex Grönholm > wrote: >> pip doesn't yet support pyproject.toml does it? >> >> >> 23.07.2016, 17:43, Daniel Holth kirjoitti: >> >> Here is my attempt. The SConstruct (like a Makefile) builds the >> extension. The .toml file gives the static metadata. No need to put the two >> in the same file. >> >> https://bitbucket.org/dholth/cryptacular/src/tip/SConstruct >> >> https://bitbucket.org/dholth/cryptacular/src/tip/pyproject.toml >> >> On Sat, Jul 23, 2016 at 10:11 AM Alex Grönholm >> wrote: >>> 23.07.2016, 17:04, Thomas Kluyver kirjoitti: >>> > On Sat, Jul 23, 2016, at 02:32 PM, Alex Grönholm wrote: >>> >> I'm -1 on this because requirements.txt is not really the standard way >>> >> to list dependencies. >>> >> In the Python world, setup.py is the equivalent of Node's >>> package.json. >>> >> But as it is >>> >> Python code, it cannot so easily be programmatically modified. >>> > Packaging based on declarative metadata: >>> > http://flit.readthedocs.io/en/latest/ >>> > >>> > >>> > We have a bit of a divide. 
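Nick's summary of pip-compile's update behaviour earlier in the thread (keep existing pins unless upgrading, add pins for anything new) can be sketched with plain dicts. The function name and version numbers are hypothetical; real pip-compile also checks that kept pins still satisfy the declared constraints.

```python
# Sketch of pip-compile vs pip-compile --upgrade: existing pins survive
# a plain recompile, while --upgrade moves everything to the latest.
def recompile(existing_pins, latest, upgrade=False):
    """existing_pins / latest are {name: version} dicts; returns new pins."""
    if upgrade:
        return dict(latest)
    pins = dict(latest)
    pins.update(existing_pins)  # existing pins win unless upgrading
    return pins

old = {"flask": "0.10.1"}
latest = {"flask": "0.11.1", "markdown": "2.6.6"}
print(recompile(old, latest))        # keeps flask 0.10.1, adds markdown
print(recompile(old, latest, True))  # everything moves to latest
```

This captures why the pinned file stays stable across routine recompiles: only packages with no existing pin pick up a new version.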
Specifying dependencies in setup.py (or >>> > flit.ini, or upcoming pyproject.toml) is the standard for library and >>> > tool packages that are intended to be published on PyPI and installed >>> > with pip. requirements.txt is generally used for applications which >>> will >>> > be distributed or deployed by other means. >>> > >>> > As I understand it, in the Javascript world package.json is used in >>> both >>> > cases. Is that something Python should try to emulate? Is it hard to >>> > achieve given the limitations of setup.py that you pointed out? >>> This topic has been beaten to death. There is no way to cram the >>> complexities of C extension compilation setup into purely declarative >>> metadata. Distutils2 tried and failed. Just look at the setup.py files >>> of some popular projects and imagine all that logic expressed in >>> declarative metadata. >>> > Thomas >>> > _______________________________________________ >>> > Distutils-SIG maillist - Distutils-SIG at python.org >>> > https://mail.python.org/mailman/listinfo/distutils-sig >>> >>> _______________________________________________ >>> Distutils-SIG maillist - Distutils-SIG at python.org >>> https://mail.python.org/mailman/listinfo/distutils-sig >>> >> >> _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From robin at reportlab.com Mon Jul 25 08:04:41 2016 From: robin at reportlab.com (Robin Becker) Date: Mon, 25 Jul 2016 13:04:41 +0100 Subject: [Distutils] bulk upload Message-ID: <2b783ea0-cdf0-df2f-b277-44d1d7602180@chamonix.reportlab.co.uk> I have started to make manylinux wheels for reportlab. Our work flow is split across multiple machines. In the end we create a total of 19 package files (10 manylinux, 8 windows + 1 source); these total 53Mb. 
1) Is there a convenient way to upload a new version starting from the package files themselves? Normally we try to test the packages before they are uploaded which implies we cannot just use the distutils upload command. 2) I assume I cannot just keep on uploading new versions to pypi. Presumably I would have to delete a micro release before uploading a new one and only keep significant releases. 3) The manylinux builds are significantly larger than the windows ones because the manylinux build is not statically linking those bits of freetype which we use. Is there a way to detect that I'm building under manylinux? -- Robin Becker From dholth at gmail.com Mon Jul 25 10:30:07 2016 From: dholth at gmail.com (Daniel Holth) Date: Mon, 25 Jul 2016 14:30:07 +0000 Subject: [Distutils] bulk upload In-Reply-To: <2b783ea0-cdf0-df2f-b277-44d1d7602180@chamonix.reportlab.co.uk> References: <2b783ea0-cdf0-df2f-b277-44d1d7602180@chamonix.reportlab.co.uk> Message-ID: 1. There is a tool called twine that is the best way to upload to pypi 2. I'm not aware of any aggregate limits but I'm pretty sure each individual file can only be so big 3. Maybe the platform returns as manylinux1? Set an environment variable to ask for static linking, and check for it in your build script? On Mon, Jul 25, 2016 at 8:05 AM Robin Becker wrote: > I have started to make manylinux wheels for reportlab. > > Our work flow is split across multiple machines. In the end we create a > total of > 19 package files (10 manylinux, 8 windows + 1 source); these total 53Mb. > > 1) Is there a convenient way to upload a new version starting from the > package > files themselves? Normally we try to test the packages before they are > uploaded > which implies we cannot just use the distutils upload command. > > > 2) I assume I cannot just keep on uploading new versions to pypi. > Presumably I > would have to delete a micro release before uploading a new one and only > keep > significant releases. 
> > 3) The manylinux builds are significantly larger than the windows ones > because > the manylinux build is not statically linking those bits of freetype which > we > use. Is there a way to detect that I'm building under manylinux? > -- > Robin Becker > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From robin at reportlab.com Mon Jul 25 11:55:23 2016 From: robin at reportlab.com (Robin Becker) Date: Mon, 25 Jul 2016 16:55:23 +0100 Subject: [Distutils] bulk upload In-Reply-To: References: <2b783ea0-cdf0-df2f-b277-44d1d7602180@chamonix.reportlab.co.uk> Message-ID: On 25/07/2016 15:30, Daniel Holth wrote: > 1. There is a tool called twine that is the best way to upload to pypi > thanks I'll check that out. > 2. I'm not aware of any aggregate limits but I'm pretty sure each > individual file can only be so big In our private readonly pypi we have 93 releases. I don't think that burden should fall on pypi. However, it's not clear to me if I should push micro releases to pypi and then remove them when another release is made. Is there a way to remove a 'release' completely? The edit pages seem to suggest so, but does that remove the files? > > 3. Maybe the platform returns as manylinux1? Set an environment variable to > ask for static linking, and check for it in your build script? > ....... I did try manylinux1 (after PEP 513), but it didn't seem to work; looked at sys platform, os name and the platform module, but see only this > platform=Linux-3.16.0-50-generic-x86_64-with-redhat-5.11-Final > sys.platform=linux2 > os.name=posix however, it's easy enough to export an environment variable in the docker startup script. 
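Daniel's suggestion above (export an environment variable in the docker startup script, then check it in the build) can be sketched as below. This is only an illustration: the variable name REPORTLAB_MANYLINUX and the library paths are made-up assumptions, not part of any packaging spec or of reportlab's actual build.

```python
import os

def building_for_manylinux(environ=os.environ):
    """True when the docker startup script has exported the (hypothetical)
    REPORTLAB_MANYLINUX=1 flag before invoking the build."""
    return environ.get("REPORTLAB_MANYLINUX") == "1"

def freetype_link_args(environ=os.environ):
    """Link FreeType statically only inside the manylinux build container,
    dynamically everywhere else (paths are illustrative)."""
    if building_for_manylinux(environ):
        return ["/usr/local/lib/libfreetype.a"]
    return ["-lfreetype"]
```

The docker script would then run something like `export REPORTLAB_MANYLINUX=1` before calling `pip wheel`, and setup.py could feed the returned list into the extension's `extra_link_args`.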
I did try to reduce my manylinux sizes by using a library of shared object codes (ie a .a built from the PIC compile objs), but I didn't seem able to make this work properly; the resulting .so seems to contain the whole library (freetype). The windows linker seems able to pick up only the required bits so the windows wheels are much smaller. -- Robin Becker From chris.barker at noaa.gov Mon Jul 25 14:52:30 2016 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 25 Jul 2016 11:52:30 -0700 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: On Sat, Jul 23, 2016 at 12:22 PM, Nathaniel Smith wrote: > OTOH, if we give up on that part of the idea, then it becomes much easier > :-). It'd be straightforward for PyPI to provide a "how to donate to this > project" box on each project page, that has links to whatever donation > processing service(s) the project prefers. > +1 on this one -- shift all the legal hassles to the projects, but provide a tiny bit of infrastructure to make it easier for people to find. note: for a higher level of support, the PSF _could_ follow the numfocus approach: NumFocus is a properly set-up non-profit that can act as a gateway for particular projects, so that the individual projects don't need to set up all that accounting and legal infrastructure: http://www.numfocus.org/information-on-fiscal-sponsorship.html However, there is still a pretty big barrier to entry to become a sponsored organization, as there should be. So I think it would be great if PyPi could simply put a donate button in there, but not try to get into the money laundering business. -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From chris.barker at noaa.gov Mon Jul 25 15:05:19 2016 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 25 Jul 2016 12:05:19 -0700 Subject: [Distutils] bulk upload In-Reply-To: References: <2b783ea0-cdf0-df2f-b277-44d1d7602180@chamonix.reportlab.co.uk> Message-ID: On Mon, Jul 25, 2016 at 8:55 AM, Robin Becker wrote: > In our private readonly pypi we have 93 releases. I don't think that > burden should fall on pypi. However, it's not clear to me if I should push > micro releases to pypi and then remove them when another release is made. > Is there a way to remove a 'release' completely? I'm pretty sure there is no way to remove a release (at least not routinely). This is by design -- if someone has done something with that particular release, we want it to be reproducible. I see the point, but it's a little bit too bad -- I know I've got some releases up there that were replaced VERY soon due to a build error or some carelessness on my part :-) Apparently, disk space is cheap enough that PyPI doesn't need to worry about it. Are you running into any problems? I did try to reduce my manylinux sizes by using a library of shared object > codes (ie a .a built from the PIC compile objs), but I didn't seem able to > make this work properly; the resulting .so seems to contain the whole > library (freetype). is this a problem other than file sizes? I think until / if Nathaniel (or someone :-) ) comes up with a standard way to make wheels of shared libs, we'll simply have to live with large binaries. -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From donald at stufft.io Mon Jul 25 16:16:26 2016 From: donald at stufft.io (Donald Stufft) Date: Mon, 25 Jul 2016 16:16:26 -0400 Subject: [Distutils] bulk upload In-Reply-To: References: <2b783ea0-cdf0-df2f-b277-44d1d7602180@chamonix.reportlab.co.uk> Message-ID: <0211FB17-3BEF-477F-8484-DFEDF724DD70@stufft.io> > On Jul 25, 2016, at 3:05 PM, Chris Barker wrote: > > On Mon, Jul 25, 2016 at 8:55 AM, Robin Becker > wrote: > In our private readonly pypi we have 93 releases. I don't think that burden should fall on pypi. However, it's not clear to me if I should push micro releases to pypi and then remove them when another release is made. Is there a way to remove a 'release' completely? > > I'm pretty sure there is no way to remove a release (at least not routinely). This is by design -- if someone has done something with that particular release, we want it to be reproducible. Authors can delete files, releases, or projects but can never re-upload an already uploaded file, even if they delete it. It is discouraged to actually do this though (and in the future we may change it to a soft delete that just hides it from everything with the ability to restore it). It is discouraged for basically the reason you mentioned, people pin to specific versions (and sometimes specific hashes) and we don't want to break their deployments. > > I see the point, but it's a little bit too bad -- I know I've got some releases up there that were replaced VERY soon due to a build error or some carelessness on my part :-) > > Apparently, disk space is cheap enough that PyPI doesn't need to worry about it. Disk space is super cheap. We're currently using Amazon S3 to store our files, and the storage portion of our "bill" there is something like $10/month for all of PyPI (out of a total "cost" of ~$35,000/month). Almost all of our "cost" for PyPI as a whole comes from bandwidth used, not from storage. -- Donald Stufft -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From rosuav at gmail.com Mon Jul 25 17:57:56 2016 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 26 Jul 2016 07:57:56 +1000 Subject: [Distutils] bulk upload In-Reply-To: <0211FB17-3BEF-477F-8484-DFEDF724DD70@stufft.io> References: <2b783ea0-cdf0-df2f-b277-44d1d7602180@chamonix.reportlab.co.uk> <0211FB17-3BEF-477F-8484-DFEDF724DD70@stufft.io> Message-ID: On Tue, Jul 26, 2016 at 6:16 AM, Donald Stufft wrote: > Disk space is super cheap. We're currently using Amazon S3 to store our > files, and the storage portion of our "bill" there is something like > $10/month for all of PyPI (out of a total "cost" of ~$35,000/month). Almost > all of our "cost" for PyPI as a whole comes from bandwidth used not from > storage. Does anyone mirror all of PyPI? If so, "storage" suddenly also means "bandwidth". ChrisA From donald at stufft.io Mon Jul 25 18:03:52 2016 From: donald at stufft.io (Donald Stufft) Date: Mon, 25 Jul 2016 18:03:52 -0400 Subject: [Distutils] bulk upload In-Reply-To: References: <2b783ea0-cdf0-df2f-b277-44d1d7602180@chamonix.reportlab.co.uk> <0211FB17-3BEF-477F-8484-DFEDF724DD70@stufft.io> Message-ID: <4A958D8A-350B-48C5-96A4-3F8553E63907@stufft.io> > On Jul 25, 2016, at 5:57 PM, Chris Angelico wrote: > > On Tue, Jul 26, 2016 at 6:16 AM, Donald Stufft wrote: >> Disk space is super cheap. We're currently using Amazon S3 to store our >> files, and the storage portion of our "bill" there is something like >> $10/month for all of PyPI (out of a total "cost" of ~$35,000/month). Almost >> all of our "cost" for PyPI as a whole comes from bandwidth used not from >> storage. > > Does anyone mirror all of PyPI? If so, "storage" suddenly also means > "bandwidth". Yes folks do mirror all of PyPI, but it's not as simple as storage == bandwidth. The price of the bandwidth is paid generally when the file is uploaded so deleting doesn't reduce the bandwidth demands of existing mirrors. 
It *does* increase the bandwidth demands of a brand new mirror, but a single full mirror represents 0.089% of the total monthly bandwidth of PyPI and there are no indications that there are significant numbers of new mirrors being added regularly to where it would even matter. -- Donald Stufft From rosuav at gmail.com Mon Jul 25 18:08:30 2016 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 26 Jul 2016 08:08:30 +1000 Subject: [Distutils] bulk upload In-Reply-To: <4A958D8A-350B-48C5-96A4-3F8553E63907@stufft.io> References: <2b783ea0-cdf0-df2f-b277-44d1d7602180@chamonix.reportlab.co.uk> <0211FB17-3BEF-477F-8484-DFEDF724DD70@stufft.io> <4A958D8A-350B-48C5-96A4-3F8553E63907@stufft.io> Message-ID: On Tue, Jul 26, 2016 at 8:03 AM, Donald Stufft wrote: > >> On Jul 25, 2016, at 5:57 PM, Chris Angelico wrote: >> >> On Tue, Jul 26, 2016 at 6:16 AM, Donald Stufft wrote: >>> Disk space is super cheap. We're currently using Amazon S3 to store our >>> files, and the storage portion of our "bill" there is something like >>> $10/month for all of PyPI (out of a total "cost" of ~$35,000/month). Almost >>> all of our "cost" for PyPI as a whole comes from bandwidth used not from >>> storage. >> >> Does anyone mirror all of PyPI? If so, "storage" suddenly also means >> "bandwidth". > > > Yes folks do mirror all of PyPI, but it's not as simple as storage == bandwidth. The price of the bandwidth is paid generally when the file is uploaded so deleting doesn't reduce the bandwidth demands of existing mirrors. It *does* increase the bandwidth demands of a brand new mirror, but a single full mirror represents 0.089% of the total monthly bandwidth of PyPI and there are no indications that there are significant numbers of new mirrors being added regularly to where it would even matter. > Good stats, thanks. 
ChrisA From ncoghlan at gmail.com Mon Jul 25 23:24:55 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 26 Jul 2016 13:24:55 +1000 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: On 26 July 2016 at 04:52, Chris Barker wrote: > On Sat, Jul 23, 2016 at 12:22 PM, Nathaniel Smith wrote: > note: for a higher level of support, the PSF _could_ follow the numfocus > approach: > > NumFocus is a properly set-up non-profit that can act as a gateway for > particular projects, so that the individual projects don't need to set up > all that accounting and legal infrastructure: > > http://www.numfocus.org/information-on-fiscal-sponsorship.html > > However, there is still a pretty big barrier to entry to become a sponsored > organization, as there should be. The PSF has considered this, but there's not a lot of value we could provide above and beyond other organisations that already do this for open source projects in general. For example: - Software Freedom Conservancy - Software in the Public Interest - Outercurve Foundation However, similar to the more direct crowdfunding options, pointing folks towards organisations like these via the publisher-facing pages in Warehouse is certainly something that could be done, especially as their download counts start to grow. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From nicholas.chammas at gmail.com Thu Jul 28 12:09:28 2016 From: nicholas.chammas at gmail.com (Nicholas Chammas) Date: Thu, 28 Jul 2016 16:09:28 +0000 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: On Sat, Jul 23, 2016 at 3:11 PM Donald Stufft donald at stufft.io wrote: > On Jul 23, 2016, at 2:40 PM, Nicholas Chammas > wrote: > > I know a more concrete proposal would have to address a lot of details > (e.g. 
like how to split contributions across multiple maintainers), and > perhaps there is no way to find the resources to build or maintain such a > thing in the first place. But just for now I'd like to separate the essence of the > idea from the practical concerns of implementing it. > > > > I'm mulling over the idea in my head, but one other thing we'd need to > figure out is the *legality* of doing this and if it's something the PSF is > willing to do at all. > Would it simplify things for the PSF if they partnered with someone who took care of moving the money around? The PSF, via PyPI, would bring a large, opt-in user base to their doorstep, and in return the payment provider (e.g. Gratipay, Salt) would cut the PSF some tiny slice of each transaction. I don't know if this tweak makes the proposal more realistic (again, maybe the margins wouldn't work for the provider), but at least on first blush I think it would help us accomplish the following: - We offer the community a low-friction, standardized way to support projects they care about. - The PSF doesn't have to handle thorny legal and accounting issues, at least not as bad as they would if they handled the payments directly. - The PSF builds up a funding stream that can support our core infrastructure. Nick -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Fri Jul 29 06:40:38 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 29 Jul 2016 20:40:38 +1000 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: On 29 July 2016 at 02:09, Nicholas Chammas wrote: > Would it simplify things for the PSF if they partnered with someone who took > care of moving the money around? 
If a global payments provider came to the PSF (or the Packaging Working Group within the PSF) and said "Here's a proposal for you to consider and suggest amendments to before we start sending implementation patches to Warehouse", then it would likely simplify things somewhat (the PSF would mainly just need to review the proposal to ensure it didn't jeopardise the PSF's non-profit status, that the platform operator had a clear escalation process for folks sending and receiving money, and that the terms of the proposal for individual publishers were something the PSF was happy to promote to PyPI's users). However, in terms of designing such a system from scratch, picking a default payment platform is a relatively easy part of the problem - the design work around how the program is presented to package publishers and users, how folks raise questions regarding problems with money sent or received, and how we mitigate the chance of horror stories as folks naively fail to comply with their local tax laws all remain as complex problems to be addressed. (There may also end up being some challenges around age verification, as PyPI doesn't currently require you to specify your age when signing up, but also doesn't currently provide any services where that's a potential problem) > The PSF, via PyPI, would bring a large, opt-in user base to their doorstep, > and in return the payment provider (e.g. Gratipay, Salt) would cut the PSF > some tiny slice of each transaction. > > I don?t know if this tweak makes the proposal more realistic (again, maybe > the margins wouldn?t work for the provider) It does make it more realistic, but as you note in your parenthetical comment, it's an open question as to whether it would be worth the investment in design and implementation effort from the side of the platform provider (especially if they assume the PSF itself will eventually get around to funding something along these lines). Regards, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From barry at python.org Fri Jul 29 19:57:21 2016 From: barry at python.org (Barry Warsaw) Date: Fri, 29 Jul 2016 19:57:21 -0400 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI References: Message-ID: <20160729195721.0dff9103@anarchist.wooz.org> On Jul 26, 2016, at 01:24 PM, Nick Coghlan wrote: >The PSF has considered this, but there's not a lot of value we could >provide above and beyond other organisations that already do this for >open source projects in general. For example: > >- Software Freedom Conservancy >- Software in the Public Interest >- Outercurve Foundation The Free Software Foundation also runs a donation program, and GNU Mailman has benefited from this to some extent. They do take a healthy cut, which is okay with us because it helps fund their own programs and administrative costs. One thing to keep in mind is that not all developers can or want to accept donations. It can cause tax and employment headaches. But in GNU Mailman's case, having donations go to a project-centric fund has allowed us to help pay for travel and Pycon attendance for our GSoC students, at least one of which has since become a core developer. At our size, that's been way more valuable to the project than paying individual developers. Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From wes.turner at gmail.com Sat Jul 30 13:50:57 2016 From: wes.turner at gmail.com (Wes Turner) Date: Sat, 30 Jul 2016 12:50:57 -0500 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> <57938F1E.10502@nextday.fi> Message-ID: pipup has "save to a requirements.txt" functionality https://github.com/revsys/pipup It looks like it doesn't yet handle hash-checking mode (which is from peep, IIRC): - https://pip.pypa.io/en/stable/reference/pip_install/#hash-checking-mode - https://github.com/revsys/pipup/blob/master/pipup/req_files.py I think str(req_install.InstallRequirement) could/should just work? Or maybe to_requirements_str()? - https://github.com/pypa/pip/blob/master/pip/req/req_file.py - https://github.com/pypa/pip/blob/master/pip/req/req_install.py pip-tools probably has InstallRequirement.to_requirements_str()? - https://github.com/nvie/pip-tools/blob/master/piptools/writer.py - https://github.com/nvie/pip-tools/blob/master/piptools/utils.py - format_requirement() - format_specifier() Round-trip with requirements.txt files would probably be useful On Sunday, July 24, 2016, Brett Cannon wrote: > > > On Sat, 23 Jul 2016 at 10:36 Daniel Holth > wrote: > >> Not yet. Someone should fix that ? >> > There is an issue tracking that if someone gets adventurous enough to > write up a PR: https://github.com/pypa/pip/issues/3691 . > > -Brett > > >> >> On Sat, Jul 23, 2016, 11:37 Alex Gr?nholm > > wrote: >> >>> pip doesn't yet support pyproject.toml does it? >>> >>> >>> 23.07.2016, 17:43, Daniel Holth kirjoitti: >>> >>> Here is my attempt. The SConstruct (like a Makefile) builds the >>> extension. The .toml file gives the static metadata. No need to put the two >>> in the same file. 
>>> >>> https://bitbucket.org/dholth/cryptacular/src/tip/SConstruct >>> >>> https://bitbucket.org/dholth/cryptacular/src/tip/pyproject.toml >>> >>> On Sat, Jul 23, 2016 at 10:11 AM Alex Gr?nholm >> > wrote: >>> >>>> 23.07.2016, 17:04, Thomas Kluyver kirjoitti: >>>> > On Sat, Jul 23, 2016, at 02:32 PM, Alex Gr?nholm wrote: >>>> >> I'm -1 on this because requirements.txt is not really the standard >>>> way >>>> >> to list dependencies. >>>> >> In the Python world, setup.py is the equivalent of Node's >>>> package.json. >>>> >> But as it is >>>> >> Python code, it cannot so easily be programmatically modified. >>>> > Packaging based on declarative metadata: >>>> > http://flit.readthedocs.io/en/latest/ >>>> > >>>> > >>>> > We have a bit of a divide. Specifying dependencies in setup.py (or >>>> > flit.ini, or upcoming pyproject.toml) is the standard for library and >>>> > tool packages that are intended to be published on PyPI and installed >>>> > with pip. requirements.txt is generally used for applications which >>>> will >>>> > be distributed or deployed by other means. >>>> > >>>> > As I understand it, in the Javascript world package.json is used in >>>> both >>>> > cases. Is that something Python should try to emulate? Is it hard to >>>> > achieve given the limitations of setup.py that you pointed out? >>>> This topic has been beaten to death. There is no way to cram the >>>> complexities of C extension compilation setup into purely declarative >>>> metadata. Distutils2 tried and failed. Just look at the setup.py files >>>> of some popular projects and imagine all that logic expressed in >>>> declarative metadata. 
>>>> > Thomas >>>> > _______________________________________________ >>>> > Distutils-SIG maillist - Distutils-SIG at python.org >>>> >>>> > https://mail.python.org/mailman/listinfo/distutils-sig >>>> >>>> _______________________________________________ >>>> Distutils-SIG maillist - Distutils-SIG at python.org >>>> >>>> https://mail.python.org/mailman/listinfo/distutils-sig >>>> >>> >>> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> >> https://mail.python.org/mailman/listinfo/distutils-sig >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From tritium-list at sdamon.com Sat Jul 30 17:39:54 2016 From: tritium-list at sdamon.com (tritium-list at sdamon.com) Date: Sat, 30 Jul 2016 17:39:54 -0400 Subject: [Distutils] Contributing money to package authors/maintainers via PyPI In-Reply-To: References: Message-ID: <010001d1eaaa$e9461840$bbd248c0$@hotmail.com> I assert that a PayPal Donate link in a readme is sufficient. Anything more is a pure waste of precious PSF and -sig resources. If a project is large and needs significant funding, there are better avenues to get funding from or through the PSF. Besides, I do not want to ask Donald to deal with PCI compliance; no matter who writes the patch, he would be the one that needs to make sure its correct. > -----Original Message----- > From: Distutils-SIG [mailto:distutils-sig-bounces+tritium- > list=sdamon.com at python.org] On Behalf Of Nick Coghlan > Sent: Friday, July 29, 2016 6:41 AM > To: Nicholas Chammas > Cc: distutils-sig > Subject: Re: [Distutils] Contributing money to package authors/maintainers > via PyPI > > On 29 July 2016 at 02:09, Nicholas Chammas > wrote: > > Would it simplify things for the PSF if they partnered with someone who > took > > care of moving the money around? 
> > If a global payments provider came to the PSF (or the Packaging > Working Group within the PSF) and said "Here's a proposal for you to > consider and suggest amendments to before we start sending > implementation patches to Warehouse", then it would likely simplify > things somewhat (the PSF would mainly just need to review the proposal > to ensure it didn't jeopardise the PSF's non-profit status, that the > platform operator had a clear escalation process for folks sending and > receiving money, and that the terms of the proposal for individual > publishers were something the PSF was happy to promote to PyPI's > users). > > However, in terms of designing such a system from scratch, picking a > default payment platform is a relatively easy part of the problem - > the design work around how the program is presented to package > publishers and users, how folks raise questions regarding problems > with money sent or received, and how we mitigate the chance of horror > stories as folks naively fail to comply with their local tax laws all > remain as complex problems to be addressed. (There may also end up > being some challenges around age verification, as PyPI doesn't > currently require you to specify your age when signing up, but also > doesn't currently provide any services where that's a potential > problem) > > > The PSF, via PyPI, would bring a large, opt-in user base to their doorstep, > > and in return the payment provider (e.g. Gratipay, Salt) would cut the PSF > > some tiny slice of each transaction. 
> > > > I don?t know if this tweak makes the proposal more realistic (again, maybe > > the margins wouldn?t work for the provider) > > It does make it more realistic, but as you note in your parenthetical > comment, it's an open question as to whether it would be worth the > investment in design and implementation effort from the side of the > platform provider (especially if they assume the PSF itself will > eventually get around to funding something along these lines). > > Regards, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig From wes.turner at gmail.com Sat Jul 30 21:23:14 2016 From: wes.turner at gmail.com (Wes Turner) Date: Sat, 30 Jul 2016 20:23:14 -0500 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> <57938F1E.10502@nextday.fi> Message-ID: pbr also supports "environment markers" which we would want to preserve when round-tripping (reading in, modifying, and writing out) requirements.txt files; though IDK if environment markers are part of any Python Packaging Spec? from http://docs.openstack.org/developer/pbr/#environment-markers : argparse; python_version=='2.6' On Sat, Jul 30, 2016 at 12:50 PM, Wes Turner wrote: > pipup has "save to a requirements.txt" functionality > https://github.com/revsys/pipup > > It looks like it doesn't yet handle hash-checking mode (which is from > peep, IIRC): > > - https://pip.pypa.io/en/stable/reference/pip_install/#hash-checking-mode > - https://github.com/revsys/pipup/blob/master/pipup/req_files.py > > I think str(req_install.InstallRequirement) could/should just work? Or > maybe to_requirements_str()? 
> - https://github.com/pypa/pip/blob/master/pip/req/req_file.py > - https://github.com/pypa/pip/blob/master/pip/req/req_install.py > > pip-tools probably has InstallRequirement.to_requirements_str()? > > - https://github.com/nvie/pip-tools/blob/master/piptools/writer.py > - https://github.com/nvie/pip-tools/blob/master/piptools/utils.py > - format_requirement() > - format_specifier() > > Round-trip with requirements.txt files would probably be useful > > > On Sunday, July 24, 2016, Brett Cannon wrote: > >> >> >> On Sat, 23 Jul 2016 at 10:36 Daniel Holth wrote: >> >>> Not yet. Someone should fix that ? >>> >> There is an issue tracking that if someone gets adventurous enough to >> write up a PR: https://github.com/pypa/pip/issues/3691 . >> >> -Brett >> >> >>> >>> On Sat, Jul 23, 2016, 11:37 Alex Gr?nholm >>> wrote: >>> >>>> pip doesn't yet support pyproject.toml does it? >>>> >>>> >>>> 23.07.2016, 17:43, Daniel Holth kirjoitti: >>>> >>>> Here is my attempt. The SConstruct (like a Makefile) builds the >>>> extension. The .toml file gives the static metadata. No need to put the two >>>> in the same file. >>>> >>>> https://bitbucket.org/dholth/cryptacular/src/tip/SConstruct >>>> >>>> https://bitbucket.org/dholth/cryptacular/src/tip/pyproject.toml >>>> >>>> On Sat, Jul 23, 2016 at 10:11 AM Alex Gr?nholm < >>>> alex.gronholm at nextday.fi> wrote: >>>> >>>>> 23.07.2016, 17:04, Thomas Kluyver kirjoitti: >>>>> > On Sat, Jul 23, 2016, at 02:32 PM, Alex Gr?nholm wrote: >>>>> >> I'm -1 on this because requirements.txt is not really the standard >>>>> way >>>>> >> to list dependencies. >>>>> >> In the Python world, setup.py is the equivalent of Node's >>>>> package.json. >>>>> >> But as it is >>>>> >> Python code, it cannot so easily be programmatically modified. >>>>> > Packaging based on declarative metadata: >>>>> > http://flit.readthedocs.io/en/latest/ >>>>> > >>>>> > >>>>> > We have a bit of a divide. 
Specifying dependencies in setup.py (or >>>>> > flit.ini, or upcoming pyproject.toml) is the standard for library and >>>>> > tool packages that are intended to be published on PyPI and installed >>>>> > with pip. requirements.txt is generally used for applications which >>>>> will >>>>> > be distributed or deployed by other means. >>>>> > >>>>> > As I understand it, in the Javascript world package.json is used in >>>>> both >>>>> > cases. Is that something Python should try to emulate? Is it hard to >>>>> > achieve given the limitations of setup.py that you pointed out? >>>>> This topic has been beaten to death. There is no way to cram the >>>>> complexities of C extension compilation setup into purely declarative >>>>> metadata. Distutils2 tried and failed. Just look at the setup.py files >>>>> of some popular projects and imagine all that logic expressed in >>>>> declarative metadata. >>>>> > Thomas >>>>> > _______________________________________________ >>>>> > Distutils-SIG maillist - Distutils-SIG at python.org >>>>> > https://mail.python.org/mailman/listinfo/distutils-sig >>>>> >>>>> _______________________________________________ >>>>> Distutils-SIG maillist - Distutils-SIG at python.org >>>>> https://mail.python.org/mailman/listinfo/distutils-sig >>>>> >>>> >>>> _______________________________________________ >>> Distutils-SIG maillist - Distutils-SIG at python.org >>> https://mail.python.org/mailman/listinfo/distutils-sig >>> >> -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From wes.turner at gmail.com Sat Jul 30 21:24:08 2016 From: wes.turner at gmail.com (Wes Turner) Date: Sat, 30 Jul 2016 20:24:08 -0500 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> <57938F1E.10502@nextday.fi> Message-ID: (side note: I would love to work on this but am in dire need of a job (which would ideally encourage further open source contribution)) On Sat, Jul 30, 2016 at 8:23 PM, Wes Turner wrote: > pbr also supports "environment markers" > which we would want to preserve when round-tripping (reading in, > modifying, and writing out) requirements.txt files; > though IDK if environment markers are part of any Python Packaging Spec? > > from http://docs.openstack.org/developer/pbr/#environment-markers : > > argparse; python_version=='2.6' > > > On Sat, Jul 30, 2016 at 12:50 PM, Wes Turner wrote: > >> pipup has "save to a requirements.txt" functionality >> https://github.com/revsys/pipup >> >> It looks like it doesn't yet handle hash-checking mode (which is from >> peep, IIRC): >> >> - https://pip.pypa.io/en/stable/reference/pip_install/#hash-checking-mode >> - https://github.com/revsys/pipup/blob/master/pipup/req_files.py >> >> I think str(req_install.InstallRequirement) could/should just work? Or >> maybe to_requirements_str()? >> - https://github.com/pypa/pip/blob/master/pip/req/req_file.py >> - https://github.com/pypa/pip/blob/master/pip/req/req_install.py >> >> pip-tools probably has InstallRequirement.to_requirements_str()? 
>> >> - https://github.com/nvie/pip-tools/blob/master/piptools/writer.py >> - https://github.com/nvie/pip-tools/blob/master/piptools/utils.py >> - format_requirement() >> - format_specifier() >> >> Round-trip with requirements.txt files would probably be useful >> >> >> On Sunday, July 24, 2016, Brett Cannon wrote: >> >>> >>> >>> On Sat, 23 Jul 2016 at 10:36 Daniel Holth wrote: >>> >>>> Not yet. Someone should fix that ? >>>> >>> There is an issue tracking that if someone gets adventurous enough to >>> write up a PR: https://github.com/pypa/pip/issues/3691 . >>> >>> -Brett >>> >>> >>>> >>>> On Sat, Jul 23, 2016, 11:37 Alex Grönholm >>>> wrote: >>>> >>>>> pip doesn't yet support pyproject.toml does it? >>>>> >>>>> >>>>> 23.07.2016, 17:43, Daniel Holth wrote: >>>>> >>>>> Here is my attempt. The SConstruct (like a Makefile) builds the >>>>> extension. The .toml file gives the static metadata. No need to put the two >>>>> in the same file. >>>>> >>>>> https://bitbucket.org/dholth/cryptacular/src/tip/SConstruct >>>>> >>>>> https://bitbucket.org/dholth/cryptacular/src/tip/pyproject.toml >>>>> >>>>> On Sat, Jul 23, 2016 at 10:11 AM Alex Grönholm < >>>>> alex.gronholm at nextday.fi> wrote: >>>>> >>>>>> 23.07.2016, 17:04, Thomas Kluyver wrote: >>>>>> > On Sat, Jul 23, 2016, at 02:32 PM, Alex Grönholm wrote: >>>>>> >> I'm -1 on this because requirements.txt is not really the standard >>>>>> way >>>>>> >> to list dependencies. >>>>>> >> In the Python world, setup.py is the equivalent of Node's >>>>>> package.json. >>>>>> >> But as it is >>>>>> >> Python code, it cannot so easily be programmatically modified. >>>>>> > Packaging based on declarative metadata: >>>>>> > http://flit.readthedocs.io/en/latest/ >>>>>> > >>>>>> > >>>>>> > We have a bit of a divide. 
Specifying dependencies in setup.py (or >>>>>> > flit.ini, or upcoming pyproject.toml) is the standard for library >>>>>> and >>>>>> > tool packages that are intended to be published on PyPI and >>>>>> installed >>>>>> > with pip. requirements.txt is generally used for applications which >>>>>> will >>>>>> > be distributed or deployed by other means. >>>>>> > >>>>>> > As I understand it, in the Javascript world package.json is used in >>>>>> both >>>>>> > cases. Is that something Python should try to emulate? Is it hard to >>>>>> > achieve given the limitations of setup.py that you pointed out? >>>>>> This topic has been beaten to death. There is no way to cram the >>>>>> complexities of C extension compilation setup into purely declarative >>>>>> metadata. Distutils2 tried and failed. Just look at the setup.py files >>>>>> of some popular projects and imagine all that logic expressed in >>>>>> declarative metadata. >>>>>> > Thomas >>>>>> > _______________________________________________ >>>>>> > Distutils-SIG maillist - Distutils-SIG at python.org >>>>>> > https://mail.python.org/mailman/listinfo/distutils-sig >>>>>> >>>>>> _______________________________________________ >>>>>> Distutils-SIG maillist - Distutils-SIG at python.org >>>>>> https://mail.python.org/mailman/listinfo/distutils-sig >>>>>> >>>>> >>>>> _______________________________________________ >>>> Distutils-SIG maillist - Distutils-SIG at python.org >>>> https://mail.python.org/mailman/listinfo/distutils-sig >>>> >>> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Sat Jul 30 23:17:37 2016 From: donald at stufft.io (Donald Stufft) Date: Sat, 30 Jul 2016 23:17:37 -0400 Subject: [Distutils] Switch to upload.pypi.org instead of upload.pypi.io Message-ID: <2F95E510-D853-4D9F-9F76-58C87EE74559@stufft.io> Hey! The PSF was finally successful in purchasing pypi.org from the person who previously owned it. 
Previously discussions had dropped off and we assumed we weren't going to be able to purchase it, so we moved forward with pypi.io instead. However, the person recently came back around and was willing to sell it this time for a reasonable price. We don't have all of the pypi.io domains switched over to supporting or preferring pypi.org yet, but I just enabled upload.pypi.org. If you're currently uploading to Warehouse using upload.pypi.io, nothing will break, but I suggest switching your URL to upload.pypi.org instead as it may not continue to work forever. Thanks! ? Donald Stufft From annaraven at gmail.com Sat Jul 30 23:53:45 2016 From: annaraven at gmail.com (Anna Ravenscroft) Date: Sat, 30 Jul 2016 20:53:45 -0700 Subject: [Distutils] Switch to upload.pypi.org instead of upload.pypi.io In-Reply-To: <2F95E510-D853-4D9F-9F76-58C87EE74559@stufft.io> References: <2F95E510-D853-4D9F-9F76-58C87EE74559@stufft.io> Message-ID: Great timing! Thanks for letting us know. (I've updated it in the upcoming Nutshell.) On Sat, Jul 30, 2016 at 8:17 PM, Donald Stufft wrote: > Hey! > > The PSF was finally successful in purchasing pypi.org from the person who > previously owned it. Previously discussions had dropped off and we assumed > we weren't going to be able to purchase it, so we moved forward with > pypi.io instead. However, the person recently came back around and was > willing to sell it this time for a reasonable price. > > We don't have all of the pypi.io domains switched over to supporting or > preferring pypi.org yet, but I just enabled upload.pypi.org. If you're > currently uploading to Warehouse using upload.pypi.io, nothing will > break, but I suggest switching your URL to upload.pypi.org instead as it > may not continue to work forever. > > Thanks! > > ? 
Donald Stufft > > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -- cordially, Anna -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Sat Jul 30 23:56:08 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 31 Jul 2016 13:56:08 +1000 Subject: [Distutils] Switch to upload.pypi.org instead of upload.pypi.io In-Reply-To: <2F95E510-D853-4D9F-9F76-58C87EE74559@stufft.io> References: <2F95E510-D853-4D9F-9F76-58C87EE74559@stufft.io> Message-ID: On 31 July 2016 at 13:17, Donald Stufft wrote: > Hey! > > The PSF was finally successful in purchasing pypi.org from the person who previously owned it. Previously discussions had dropped off and we assumed we weren't going to be able to purchase it, so we moved forward with pypi.io instead. However, the person recently came back around and was willing to sell it this time for a reasonable price. Woohoo, that's great news! pypi.io was an acceptable fallback, but pypi.org is definitely the preferred option :) > We don't have all of the pypi.io domains switched over to supporting or preferring pypi.org yet, but I just enabled upload.pypi.org. If you're currently uploading to Warehouse using upload.pypi.io, nothing will break, but I suggest switching your URL to upload.pypi.org instead as it may not continue to work forever. +1 on the recommendation for folks to switch their configuration settings, although given that domains aren't that expensive, it's probably worth the PSF keeping control of pypi.io as an alias of pypi.org at least for a few years (and perhaps indefinitely). Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From tritium-list at sdamon.com Sat Jul 30 23:59:10 2016 From: tritium-list at sdamon.com (tritium-list at sdamon.com) Date: Sat, 30 Jul 2016 23:59:10 -0400 Subject: [Distutils] Switch to upload.pypi.org instead of upload.pypi.io In-Reply-To: References: <2F95E510-D853-4D9F-9F76-58C87EE74559@stufft.io> Message-ID: <017101d1eadf$e52bbaf0$af8330d0$@hotmail.com> > -----Original Message----- > From: Distutils-SIG [mailto:distutils-sig-bounces+tritium- > list=sdamon.com at python.org] On Behalf Of Nick Coghlan > Sent: Saturday, July 30, 2016 11:56 PM > To: Donald Stufft > Cc: distutils sig > Subject: Re: [Distutils] Switch to upload.pypi.org instead of upload.pypi.io > > On 31 July 2016 at 13:17, Donald Stufft wrote: > > Hey! > > > > The PSF was finally successful in purchasing pypi.org from the person who > previously owned it. Previously discussions had dropped off and we assumed > we weren't going to be able to purchase it, so we moved forward with > pypi.io instead. However, the person recently came back around and was > willing to sell it this time for a reasonable price. > > Woohoo, that's great news! > > pypi.io was an acceptable fallback, but pypi.org is definitely the > preferred option :) > > > We don't have all of the pypi.io domains switched over to supporting or > preferring pypi.org yet, but I just enabled upload.pypi.org. If you're currently > uploading to Warehouse using upload.pypi.io, nothing will break, but I > suggest switching your URL to upload.pypi.org instead as it may not continue > to work forever. > > +1 on the recommendation for folks to switch their configuration > settings, although given that domains aren't that expensive, it's > probably worth the PSF keeping control of pypi.io as an alias of > pypi.org at least for a few years (and perhaps indefinitely). It's important to keep pypi.io around for a while, since pypi.org isn't propagating yet. > Cheers, > Nick. 
> > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig From donald at stufft.io Sun Jul 31 00:54:15 2016 From: donald at stufft.io (Donald Stufft) Date: Sun, 31 Jul 2016 00:54:15 -0400 Subject: [Distutils] Switch to upload.pypi.org instead of upload.pypi.io In-Reply-To: <017101d1eadf$e52bbaf0$af8330d0$@hotmail.com> References: <2F95E510-D853-4D9F-9F76-58C87EE74559@stufft.io> <017101d1eadf$e52bbaf0$af8330d0$@hotmail.com> Message-ID: > On Jul 30, 2016, at 11:59 PM, tritium-list at sdamon.com wrote: > >> >> +1 on the recommendation for folks to switch their configuration >> settings, although given that domains aren't that expensive, it's >> probably worth the PSF keeping control of pypi.io as an alias of >> pypi.org at least for a few years (and perhaps indefinitely). > > It's important to keep pypi.io around for a while, since pypi.org isn't propagating yet. There are currently no plans to stop supporting pypi.io, though at some point upload.pypi.io might go away or become a redirect to upload.pypi.org (we can't redirect it currently because it's a POST request with files and the clients don't support redirects). None of the other domains for pypi.org are currently active yet. ? 
Donald Stufft From fungi at yuggoth.org Sun Jul 31 12:54:49 2016 From: fungi at yuggoth.org (Jeremy Stanley) Date: Sun, 31 Jul 2016 16:54:49 +0000 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> <57938F1E.10502@nextday.fi> Message-ID: <20160731165449.GS2458@yuggoth.org> On 2016-07-30 20:23:14 -0500 (-0500), Wes Turner wrote: > pbr also supports "environment markers" > which we would want to preserve when round-tripping (reading in, modifying, > and writing out) requirements.txt files; > though IDK if environment markers are part of any Python Packaging Spec? > > from http://docs.openstack.org/developer/pbr/#environment-markers : > > argparse; python_version=='2.6' I'm assuming you didn't follow the "conditional dependencies" link from the description in that first paragraph, as it would have taken you to the corresponding section of one: https://www.python.org/dev/peps/pep-0426/#environment-markers You should probably also see: https://www.python.org/dev/peps/pep-0496/ https://www.python.org/dev/peps/pep-0508/#environment-markers Quickly skimming relevant changelogs, environment markers have been supported by Setuptools since 0.7, and directly in requirements lists since pip 6.0. -- Jeremy Stanley From dholth at gmail.com Sun Jul 31 15:27:25 2016 From: dholth at gmail.com (Daniel Holth) Date: Sun, 31 Jul 2016 19:27:25 +0000 Subject: [Distutils] cffi & Py_LIMITED_API Message-ID: The next version of cffi will contain small changes to generate code compliant with Python's Py_LIMITED_API: https://bitbucket.org/cffi/cffi/commits/8f867f5a869f (although cffi itself is not, the extensions it generates will be). 
If we also add an appropriate supported tag to pip ~= cp3.abi3.manylinux1 and provide a way to name the generated DLL's appropriately, it may become possible to reduce the burden of distributing cffi extensions, especially for Windows. One compiled artifact should work on Python 3.2 and above. -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Sun Jul 31 15:29:04 2016 From: wes.turner at gmail.com (Wes Turner) Date: Sun, 31 Jul 2016 14:29:04 -0500 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> <57938F1E.10502@nextday.fi> <20160731165449.GS2458@yuggoth.org> Message-ID: On Jul 31, 2016 12:25 PM, "Jeremy Stanley" wrote: > > On 2016-07-30 20:23:14 -0500 (-0500), Wes Turner wrote: > > pbr also supports "environment markers" > > which we would want to preserve when round-tripping (reading in, modifying, > > and writing out) requirements.txt files; > > though IDK if environment markers are part of any Python Packaging Spec? > > > > from http://docs.openstack.org/developer/pbr/#environment-markers : > > > > argparse; python_version=='2.6' > > I'm assuming you didn't follow the "conditional dependencies" link > from the description in that first paragraph, as it would have taken > you to the corresponding section of one: > > https://www.python.org/dev/peps/pep-0426/#environment-markers > > You should probably also see: > > https://www.python.org/dev/peps/pep-0496/ > https://www.python.org/dev/peps/pep-0508/#environment-markers > > Quickly skimming relevant changelogs, environment markers have been > supported by Setuptools since 0.7, and directly in requirements > lists since pip 6.0. So we still need InstallRequirement.to_requirement_str() as described above ? 
Is there an implementation which does not mangle hashes and/or environment markers when *writing out* requirements.txt; so that --save could be implemented? > -- > Jeremy Stanley > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Sun Jul 31 16:06:53 2016 From: wes.turner at gmail.com (Wes Turner) Date: Sun, 31 Jul 2016 15:06:53 -0500 Subject: [Distutils] Proposal: "Install and save" In-Reply-To: References: <57937200.7000403@nextday.fi> <1469282685.822527.674628577.5DAF02E5@webmail.messagingengine.com> <57937AED.9000200@nextday.fi> <57938F1E.10502@nextday.fi> <20160731165449.GS2458@yuggoth.org> Message-ID: I've created a GitHub issue for this feature request: "pip install/upgrade --save" https://github.com/pypa/pip/issues/3884 On Sun, Jul 31, 2016 at 2:29 PM, Wes Turner wrote: > On Jul 31, 2016 12:25 PM, "Jeremy Stanley" wrote: > > > > On 2016-07-30 20:23:14 -0500 (-0500), Wes Turner wrote: > > > pbr also supports "environment markers" > > > which we would want to preserve when round-tripping (reading in, > modifying, > > > and writing out) requirements.txt files; > > > though IDK if environment markers are part of any Python Packaging > Spec? 
> > > > > > from http://docs.openstack.org/developer/pbr/#environment-markers : > > > > > > argparse; python_version=='2.6' > > > > I'm assuming you didn't follow the "conditional dependencies" link > > from the description in that first paragraph, as it would have taken > > you to the corresponding section of one: > > > > https://www.python.org/dev/peps/pep-0426/#environment-markers > > > > You should probably also see: > > > > https://www.python.org/dev/peps/pep-0496/ > > https://www.python.org/dev/peps/pep-0508/#environment-markers > > > > Quickly skimming relevant changelogs, environment markers have been > > supported by Setuptools since 0.7, and directly in requirements > > lists since pip 6.0. > > So we still need InstallRequirement.to_requirement_str() as described > above ? > > Is there an implementation which does not mangle hashes and/or environment > markers when *writing out* requirements.txt; > so that --save could be implemented? > > > -- > > Jeremy Stanley > > _______________________________________________ > > Distutils-SIG maillist - Distutils-SIG at python.org > > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL:
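The round-tripping question in this thread is essentially about splitting a requirements.txt line into a project name plus an environment marker and writing it back out unchanged. Below is a stdlib-only toy sketch of that split/re-serialize step. The helper names and the single-form marker evaluator are illustrative only; real code should use the third-party `packaging` library's `Requirement` and `Marker` classes, which implement the full PEP 508 grammar:

```python
import platform
import re


def split_requirement(line):
    """Split a line like "argparse; python_version=='2.6'" into
    (name, marker); marker is None when there is no semicolon part."""
    name, _, marker = line.partition(";")
    return name.strip(), (marker.strip() or None)


def unsplit_requirement(name, marker):
    """Re-serialize, carrying the marker text through verbatim."""
    return name if marker is None else "%s; %s" % (name, marker)


def evaluate_simple_marker(marker):
    """Toy evaluator for the single form python_version=='X.Y'.
    PEP 508 markers are far richer; this only handles the pbr example."""
    m = re.match(r"python_version\s*==\s*'([^']+)'$", marker)
    if m is None:
        raise ValueError("unsupported marker: %r" % marker)
    return platform.python_version().startswith(m.group(1))


line = "argparse; python_version=='2.6'"
name, marker = split_requirement(line)
print(unsplit_requirement(name, marker))  # argparse; python_version=='2.6'
print(evaluate_simple_marker(marker))     # False on any Python 3 interpreter
```

Because the marker text is carried through verbatim rather than re-parsed and re-rendered, writing the line back out cannot mangle it, which is the property a hypothetical `pip install --save` would need.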