From vladimir.v.diaz at gmail.com Thu Jan 1 17:25:23 2015 From: vladimir.v.diaz at gmail.com (Vladimir Diaz) Date: Thu, 1 Jan 2015 11:25:23 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> Message-ID: On Wed, Dec 31, 2014 at 2:51 PM, Donald Stufft wrote: > > On Dec 31, 2014, at 11:08 AM, Vladimir Diaz > wrote: > > > > > > On Wed, Dec 31, 2014 at 2:26 AM, Donald Stufft wrote: > >> >> On Dec 10, 2014, at 10:16 PM, Vladimir Diaz >> wrote: >> >> Hello everyone, >> >> I am a research programmer at the NYU School of Engineering. My >> colleagues (Trishank Kuppusamy and Justin Cappos) and I are requesting >> community feedback on our proposal, "Surviving a Compromise of PyPI." The >> two-stage proposal can be reviewed online at: >> >> PEP 458 >> http://legacy.python.org/dev/peps/pep-0458/ >> >> PEP 480 >> http://legacy.python.org/dev/peps/pep-0480/ >> >> >> Summary of the Proposal: >> >> "Surviving a Compromise of PyPI" proposes how the Python Package Index >> (PyPI) can be amended to better protect end users from altered or malicious >> packages, and to minimize the extent of PyPI compromises against affected >> users. The proposed integration allows package managers such as pip to be >> more secure against various types of security attacks on PyPI and defend >> end users from attackers responding to package requests. Specifically, >> these PEPs describe how PyPI processes should be adapted to generate and >> incorporate repository metadata, which are signed text files that describe >> the packages and metadata available on PyPI. Package managers request >> (along with the packages) the metadata on PyPI to verify the authenticity >> of packages before they are installed. 
The changes to PyPI and tools will >> be minimal by leveraging a library, The Update Framework >> , that generates and >> transparently validates the relevant metadata. >> >> The first stage of the proposal (PEP 458 >> ) uses a basic security >> model that supports verification of PyPI packages signed with cryptographic >> keys stored on PyPI, requires no action from developers and end users, and >> protects against malicious CDNs and public mirrors. To support continuous >> delivery of uploaded packages, PyPI administrators sign for uploaded >> packages with an online key stored on PyPI infrastructure. This level of >> security prevents packages from being accidentally or deliberately tampered >> with by a mirror or a CDN because the mirror or CDN will not have any of >> the keys required to sign for projects. >> >> The second stage of the proposal (PEP 480 >> ) is an extension to the >> basic security model (discussed in PEP 458) that supports end-to-end >> verification of signed packages. End-to-end signing allows both PyPI and >> developers to sign for the packages that are downloaded by end users. If >> the PyPI infrastructure were to be compromised, attackers would be unable >> to serve malicious versions of these packages without access to the >> project's developer key. As in PEP 458, no additional action is required >> by end users. However, PyPI administrators will need to periodically >> (perhaps every few months) sign metadata with an offline key. PEP 480 also >> proposes an easy-to-use key management solution for developers, how to >> interface with a potential build farm on PyPI infrastructure, and discusses >> the security benefits of end-to-end signing. The second stage of the >> proposal simultaneously supports real-time project registration and >> developer signatures, and when configured to maximize security on PyPI, >> less than 1% of end users will be at risk even if an attacker controls PyPI >> and goes undetected for a month. 
>> >> We thank Nick Coghlan and Donald Stufft for their valuable contributions, >> and Giovanni Bajo and Anatoly Techtonik for their feedback. >> >> >> I've just finished (re)reading the white paper, PEP 458, PEP 480, and >> some of the supporting documentation on the TUF website. >> > > Thanks! > > >> >> I'm confused about what exactly is contained within the TUF metadata and >> who signs what in a PEP 480 world. >> > > The following illustration shows what is contained within TUF metadata > (JSON files): > > https://github.com/vladimir-v-diaz/pep-on-pypi-with-tuf/raw/master/pep-0458/figure4.pdf > Note: In this illustration, the "snapshot" and "targets" roles are renamed > "release" and "projects", respectively. > > If you're interested in what exactly is contained in these JSON files, > here is example metadata: > > https://github.com/theupdateframework/tuf/tree/develop/examples/repository/metadata > > In a PEP 480 world, project developers sign a single JSON file. For > example, developer(s) for the "Requests" project sign their assigned JSON > file named "/targets/claimed/Requests.json". Specifically, a signature is > generated of the "signed" entry > > of the dictionary. Once the signature is generated, it is added to the > "signatures" entry > > of the JSON file. > > In figure 1 of PEP 480, PyPI signs every metadata file except those listed > under the "roles signed by developer keys" label: > https://github.com/vladimir-v-diaz/pep-maximum-security-model/blob/master/pep-0480/figure1.png > > > Ok, so authors never actually directly sign the package files themselves, > they sign for a document that contains a hash of the files they upload? > Yes, the proposal does *not* generate a signature of just the package file. The hash and size of the package file are included in the document / metadata that is signed.
The metadata also includes the relative file path of the package, identifies the hashing algorithm used, when the metadata is set to expire, the metadata's version number, etc. > > > >> Currently when you do something like ``pip install FooBar``, pip fetches >> /simple/FooBar/ to look for potential installation candidates, and when it >> finds one it downloads it and installs it. This is all "signed" by online >> keys via TLS. >> >> 1. In a TUF world, would pip still fetch /simple/FooBar/ to discover >> things to install or would it fetch some TUF metadata to find things to >> install? >> > > In the integration/demo we did with pip, we treated each /simple/ html > file as a target (listed the hash and file size of these html index pages > in TUF metadata). That is, pip still fetched /simple/FooBar/ to discover > distributions to install, but we verified the html files *and* > distributions against TUF metadata. In PEP 458, we state that "/simple" is > also listed in TUF metadata: > http://legacy.python.org/dev/peps/pep-0458/#pypi-and-tuf-metadata (last > paragraph just before the diagram). > > Another option is to avoid crawling/listing the simple index pages and > just search TUF metadata for distributions, but this approach will require > design changes to pip. We went with the approach (treat the index pages as > targets) that required minimal changes to pip. > > > So I'm personally perfectly happy to make more than minimal changes to pip > as I want to get this right rather than just bolt something onto the side. > I like this route very much, assuming we have the liberty to improve the "API" and maintain a clean design / integration; flexibility is a good thing in this regard. At the time, we were also unsure what we could change.
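[Editor's note: the signed metadata described earlier in this exchange (relative path, hash, size, expiration, version, with a detached signature over the "signed" portion) can be sketched roughly as follows. This is a simplified, hypothetical layout, and an HMAC stands in for the public-key (e.g. Ed25519) signature that real TUF metadata uses.]

```python
import hashlib
import hmac
import json

# Hypothetical package contents; a real target would be the actual
# distribution file uploaded to PyPI.
package = b"Requests-2.0.tar.gz contents"

# The "signed" portion: everything the signature protects.
signed = {
    "_type": "targets",
    "version": 1,
    "expires": "2015-07-01T00:00:00Z",
    "targets": {
        "packages/source/R/Requests/Requests-2.0.tar.gz": {
            "length": len(package),
            "hashes": {"sha256": hashlib.sha256(package).hexdigest()},
        },
    },
}

# The signature covers a canonical serialization of "signed" only; it is
# then recorded under the separate "signatures" entry.
canonical = json.dumps(signed, sort_keys=True, separators=(",", ":")).encode()
developer_key = b"not-a-real-key"  # stand-in for a private signing key
metadata = {
    "signed": signed,
    "signatures": [{
        "keyid": "deadbeef",
        "sig": hmac.new(developer_key, canonical, hashlib.sha256).hexdigest(),
    }],
}

print(sorted(metadata))  # ['signatures', 'signed']
```

Note how tampering with the package would change the sha256 digest recorded under "targets", invalidating the signature over the "signed" portion.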
The integration demo effectively kept a list of available distributions in two locations, the simple html pages (e.g., https://pypi.python.org/simple/requests/, which pip crawled & could be modified by developers through the API) and the JSON metadata. This approach can certainly be simplified / improved IMO. > > > >> 2. If it's fetching /simple/FooBar/ is that secured by TUF? >> > > Yes, see my response to (1). > > 3. If it's secured by TUF, who signs the TUF metadata that talks about >> /simple/FooBar/ in PEP 480, the author or PyPI? >> > > PEP 480 authors sign for both their project's index page and > distribution(s) (as indicated in the JSON file): > > "A claimed or recently-claimed project will need to upload in its > transaction to PyPI not just targets (a simple index as well as > distributions) but also TUF metadata. The project MAY do so by uploading a > ZIP file containing two directories, /metadata/ (containing delegated > targets metadata files) and /targets/ (containing targets such as the > project simple index and distributions that are signed by the delegated > targets metadata)." > > See the second paragraph of > http://legacy.python.org/dev/peps/pep-0480/#snapshot-process. > > > So here is my problem. I'm completely on board with the developer signing > for the distribution files. I think that makes total sense. However I worry > that requiring the developer to sign for what is essentially the > "installer" API (aka how pip discovers things to install) is going to put > us in a situation where we cannot evolve the API easily. If we modified > this PEP so that an online key signed for /simple/, what security properties > would we lose? > > It *appears* to me that the problem then would be that a compromise of > PyPI can present whatever information they want to pip as to what is > available for pip to download and install. This would mean freeze attacks, > mix and match attacks.
It would also mean that they could, in a future > world where pip can use metadata on PyPI to do dependency resolution, tell > pip that it needs to download a valid but malicious project as a dependency > of a popular project like virtualenv. > I think this is a valid problem, as you have noted. It is probably better to avoid a simple API (signed with an online key) that can be easily modified in the event of a compromise. By "evolve" the API, do you mean modify the API such as disallowing developers to re-upload distributions and thus change hashes / file sizes listed in metadata, hide available distributions, etc.? In other words, contradict what is stated in metadata in an "on the fly" manner? > However I don't think they'd be able to actually cause pip to install a > malicious copy of a good project, and I believe that we can protect against > an attacker who possesses that key from tricking pip into installing a > malicious but valid project as a fake dependency by having pip only use the > theoretical future PyPI metadata that lists dependencies as an optimization > hint for what it should download, and then once it's actually downloaded a > project like virtualenv (which has been validated to be from the real > author) peek inside that file and ensure that the metadata inside that > matches what PyPI told pip. > Assuming I have understood you correctly (I had to make certain assumptions), I think this is a good observation and assessment. My understanding:

1. Developers sign and upload PEP 480 metadata, which lists the project's distributions.

2. Developers do *not* sign the API part (/simple/ in this case).

3. A future / new dependency file format is used to handle dependency resolution. Developers also implicitly sign the new dependency file that is included in distribution(s) uploaded to PyPI.

4. Client code (python or pip?) verifies dependencies, regardless of what pip fetches from PyPI.
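[Editor's note: the check being described in steps 3-4 could look something like the following sketch (function and project names are hypothetical): pip treats PyPI's online dependency listing purely as a hint, then re-checks it against the dependency file found inside the author-verified archive.]

```python
def verify_dependency_hint(hint_deps, archive_deps):
    """Raise if PyPI's claimed dependency list diverges from the list
    found inside the author-verified distribution archive."""
    if set(hint_deps) != set(archive_deps):
        raise ValueError(
            "dependency metadata mismatch: PyPI said %r, archive says %r"
            % (sorted(hint_deps), sorted(archive_deps)))
    return sorted(archive_deps)

# Honest index: the online hint matches the verified archive's own metadata.
print(verify_dependency_hint(["requests"], ["requests"]))  # ['requests']

# Compromised index: a bogus dependency injected into the online hint.
try:
    verify_dependency_hint(["requests", "evil-package"], ["requests"])
except ValueError:
    print("mismatch detected")  # the fake dependency is caught
```

With this arrangement a compromised index can at worst cause wasted downloads (a denial of service), not the installation of an unlisted dependency.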
Now, the "optimization hint" part is to avoid having to open the distribution archive to extract the dependency file, and then continue downloading remaining dependencies? In this manner, pip can just use the project's "online" dependency information, that is also stored on PyPI, to handle dependency resolution more quickly. Once all needed distributions have been downloaded, pip can consult the PEP 480 metadata *and* the dependency file to ensure that no shenanigans have occurred. Is my understanding correct? > Is my assessment correct? Is keeping the "API" under control of PyPI a > reasonable thing to do while keeping the actual distribution files > themselves under control of the distribution authors? The reason this > worries me is that, unlike a Linux distribution or an application like > Firefox, we don't have much of a relationship with the people who are > uploading things to PyPI. So if we need to evolve the API we are not going > to be able to compel our authors to go back and re-generate new signed > metadata. > > An additional thing I see: it appears that all of the metadata in TUF has > an expiration. While I think this makes complete sense for things signed by > online keys and things signed by keys held by the PyPI administrators and/or > the PSF board, I don't think this is something we can reasonably do for things > signed by authors themselves. An author might publish something and then > disappear and never come back, and forcing them to re-sign at some point in > the future isn't something we're reasonably able to do. Is there a plan for > how to handle that? > Yes, we can add support for this behavior (i.e., the signed JSON can state {expiration: None} and pip / client can ignore expirations in this case). Another technique we may consider is moving these projects to an "abandoned" role (signed with an offline key).
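[Editor's note: a client-side expiration check that accommodates the {expiration: None} idea above might look like this hypothetical sketch; a fixed "now" is used so the example is reproducible.]

```python
from datetime import datetime, timezone

# Fixed clock so the example is deterministic; a real client would use
# datetime.now(timezone.utc).
NOW = datetime(2015, 1, 1, tzinfo=timezone.utc)

def is_expired(expires, now=NOW):
    """expires is an ISO 8601 UTC string, or None meaning 'never expires'
    (the proposed opt-out for author-signed metadata)."""
    if expires is None:
        return False
    when = datetime.strptime(expires, "%Y-%m-%dT%H:%M:%SZ")
    return when.replace(tzinfo=timezone.utc) <= now

print(is_expired(None))                    # False: abandoned project stays installable
print(is_expired("2014-06-01T00:00:00Z"))  # True: stale metadata is rejected
print(is_expired("2020-01-01T00:00:00Z"))  # False
```

The None opt-out would presumably apply only to author-signed roles; metadata signed with PyPI's online keys would still be required to carry an expiration.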
Although the PEPs do not currently mention it, we can discuss how to handle abandoned projects here if you wish, before deciding how to proceed. Justin and Trishank (CC'd) can also give feedback on this "abandoned" role. > > > Let me know exactly what needs to change in the PEPs to make everything > explained above clearer. For example, in PEP 458 we provide a > link/reference > > (last paragraph of this subsection) to the Metadata document > indicating > the content of the JSON files, but should the illustration > I've > included in this reply also be added? > > > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From barry at python.org Thu Jan 1 20:19:16 2015 From: barry at python.org (Barry Warsaw) Date: Thu, 1 Jan 2015 14:19:16 -0500 Subject: [Distutils] Bilingual namespace package conundrum Message-ID: <20150101141916.4e2f0bf0@anarchist> I hope the following makes sense; I've been a little under the weather. ;) Apologies in advance for the long data-dump, but I wanted to provide as complete information as possible. I have ported Mailman 3 to Python 3.4[*]. While working on various development tasks, I noticed something rather strange regarding (pseudo-)namespace packages which MM3 is dependent on. These libraries work for both Python 2 and 3, but of course namespace packages work completely differently in those two versions (as of Python 3.3, i.e. PEP 420). I've been thinking about how such bilingual packages can play better with PEP 420 when running under a compatible Python 3 version, and still *seem* to work well in Python 2. More on that later. It turns out this isn't actually my problem. Here's what's going wrong. My py3 branch has a fairly straightforward tox.ini. When I run `tox -e py34` it builds the hidden .tox/py34 virtualenv, installs all the dependencies into this, runs the test suite, and all is happy.
Since I do most of my development this way, I never noticed any problems. But then I wanted to work with the code in a slightly different way, so I created a virtual environment, activated it, and ran `python setup.py develop` which appeared to work fine. The problem though is that some packages are not importable by the venv's python. Let's concentrate on three of these dependencies: lazr.config, lazr.delegates, and lazr.smtptest. All three give ImportErrors when trying to import them from the interactive prompt of the venv's python. I tried something different to see if I could narrow down the problem, so I created another venv, activated it, and ran `pip install lazr.config lazr.delegates lazr.smtptest`. Firing up the venv's interactive python, I can now import all three of them just fine. So this tells me that `python setup.py develop` is broken, or at the very least behaves differently than the `pip install` route. Actually, `python setup.py install` exhibits the same problem. Just to be sure Debian's Python 3.4 package isn't introducing any weirdnesses, all of this work was done with an up-to-date build of Python from hg, using the 3.4 branch, and the virtualenvs were created with `./python -m venv`.

Let's look more closely at part of the site-packages directories of the two virtualenvs. The one that fails looks like this:

drwxrwxr-x [...] lazr.config-2.0.1-py3.4.egg
drwxrwxr-x [...] lazr.delegates-2.0.1-py3.4.egg
drwxrwxr-x [...] lazr.smtptest-2.0.2-py3.4.egg

All three directories exhibit the same pattern:

lazr.config-2.0.1-py3.4.egg
    EGG-INFO
        namespace_packages.txt (contains one line saying "lazr")
    lazr
        config
            __init__.py (contains the pseudo-namespace boilerplate code[+])
            __pycache__

[+] that boilerplate code looks like:

-----snip snip-----
# This is a namespace package, however under >= Python 3.3, let it be a true
# namespace package (i.e. this cruft isn't necessary).
import sys
if sys.hexversion < 0x30300f0:
    try:
        import pkg_resources
        pkg_resources.declare_namespace(__name__)
    except ImportError:
        import pkgutil
        __path__ = pkgutil.extend_path(__path__, __name__)
-----snip snip-----

but of course, this doesn't really play nicely with PEP 420 because it's the mere existence of the __init__.py in the namespace package that prevents PEP 420 from getting invoked. As an aside, Debian's packaging tools will actually *remove* this __init__.py for declared namespace packages in Python 3, so under Debian-land, things do work properly. The upstream tools have no such provision, so it's clear why we're getting the ImportErrors. These three packages do not share a PEP 420-style namespace, and the sys.hexversion guard prevents the old-style pseudo-namespace hack from working. This could potentially be fixable in the upstream packages, but again, more on that later.

Now let's look at the working venvs:

drwxrwxr-x [...] lazr
drwxrwxr-x [...] lazr.config-2.0.1-py3.4.egg-info
-rw-rw-r-- [...] lazr.config-2.0.1-py3.4-nspkg.pth
drwxrwxr-x [...] lazr.delegates-2.0.1-py3.4.egg-info
-rw-rw-r-- [...] lazr.delegates-2.0.1-py3.4-nspkg.pth
drwxrwxr-x [...] lazr.smtptest-2.0.2-py3.4.egg-info
-rw-rw-r-- [...] lazr.smtptest-2.0.2-py3.4-nspkg.pth

This looks quite different, and digging into the 'lazr' directory shows:

lazr
    config
    delegates
    smtptest

No __init__.py in 'lazr', and the sub-package directories contain just the code for the appropriate library.
The egg-info directories look similar, but now we also have -nspkg.pth files which contain something like the following (on a single line, as executable .pth entries require):

-----snip snip-----
import sys, types, os;p = os.path.join(sys._getframe(1).f_locals['sitedir'], *('lazr',));ie = os.path.exists(os.path.join(p,'__init__.py'));m = not ie and sys.modules.setdefault('lazr', types.ModuleType('lazr'));mp = (m or []) and m.__dict__.setdefault('__path__',[]);(p not in mp) and mp.append(p)
-----snip snip-----

which look to me a lot like they are setting up something akin to the PEP 420-ish virtual `lazr` top-level package. It all makes sense too; importing lazr.{config,delegates,smtptest} all come from the expected locations, and importing 'lazr' gives you a stripped down module object with no __file__.

Stepping back a moment, what is it I want? I want `python setup.py develop` to work properly in both Python 2 and Python 3 virtualenvs. I want clear rules we can give library authors so that they can be sure their bilingual namespace packages will work properly in all supported venv environments. How to best accomplish this?

- Should distutils/setuptools remove the namespace package __init__.py files automatically? (Something like what the Debian packaging tools do.)

- Should some other boilerplate be added to the namespace-package __init__.py files in each portion? Maybe the sys.hexversion guards should be removed so that it acts the same way in both Python 2 and Python 3.

- Should we introduce some other protocol in Python 3.5 that would allow such bilingual namespace packages to actually *be* PEP 420 namespace packages in Python 3? I took a quick review of importlib and it seems tricky, since it's the mere absence of the top-level __init__.py files that triggers PEP 420 namespace packages. So putting some code in the __init__.py won't get executed until the top-level package is attempted to be imported.
Any trigger for "hey, treat me as a namespace package" would have to be file system based, since even the namespace_packages.txt file may not exist. I'll stop here. Happy new year and let the games begin. Cheers, -Barry [*] https://mail.python.org/pipermail/mailman-announce/2014-December/000197.html -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From tseaver at palladion.com Thu Jan 1 21:20:10 2015 From: tseaver at palladion.com (Tres Seaver) Date: Thu, 01 Jan 2015 15:20:10 -0500 Subject: [Distutils] Bilingual namespace package conundrum In-Reply-To: <20150101141916.4e2f0bf0@anarchist> References: <20150101141916.4e2f0bf0@anarchist> Message-ID: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On 01/01/2015 02:19 PM, Barry Warsaw wrote: > Maybe the sys.hexversion guards should be removed so that it acts the > same way in both Python 2 and Python 3. That sounds right to me. I never really understood the motivation for PEP 420, but if the presence of that file disables it, then it should ensure the "old" behavior regardless. Tres. - -- =================================================================== Tres Seaver +1 540-429-0999 tseaver at palladion.com Palladion Software "Excellence by Design" http://palladion.com -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.4.11 (GNU/Linux) iEYEARECAAYFAlSlq/oACgkQ+gerLs4ltQ5hjgCeKMfXT0DPBS//0y/XAP2npJRW KssAniyBjuuhjVpw0ETola6v6hVuwpiP =BLOQ -----END PGP SIGNATURE----- From barry at python.org Thu Jan 1 22:54:49 2015 From: barry at python.org (Barry Warsaw) Date: Thu, 1 Jan 2015 16:54:49 -0500 Subject: [Distutils] Bilingual namespace package conundrum References: <20150101141916.4e2f0bf0@anarchist> Message-ID: <20150101165449.6dec9bba@anarchist> On Jan 01, 2015, at 03:20 PM, Tres Seaver wrote: >That sounds right to me. 
I never really understood the motivation for >PEP 420, but if the presence of that file disables it, then it should >ensure the "old" behavior regardless. The motivation is described in the PEP, but briefly, on (many, most, all?) Linux distros any single file can be owned by exactly one system package. So which package would own e.g. zope/__init__.py? Before PEP 420, the answer is "well, it's complicated". After PEP 420, the answer is "no package, because that file doesn't exist". This is why the Debian tools delete any stray zope/__init__.py files when they are creating any zope.* binary package for Python 3. I think you're right that removing the version guard will make it work, but they won't be PEP 420 packages. I don't think it's possible to make a bilingual package a pseudo-namespace package under Python 2, but a PEP 420 namespace package under Python >= 3.3. Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From donald at stufft.io Thu Jan 1 22:57:35 2015 From: donald at stufft.io (Donald Stufft) Date: Thu, 1 Jan 2015 16:57:35 -0500 Subject: [Distutils] Bilingual namespace package conundrum In-Reply-To: <20150101165449.6dec9bba@anarchist> References: <20150101141916.4e2f0bf0@anarchist> <20150101165449.6dec9bba@anarchist> Message-ID: <759E9E8B-575D-455D-B1AE-5AE91C7FF863@stufft.io> > On Jan 1, 2015, at 4:54 PM, Barry Warsaw wrote: > > On Jan 01, 2015, at 03:20 PM, Tres Seaver wrote: > >> That sounds right to me. I never really understood the motivation for >> PEP 420, but if the presence of that file disables it, then it should >> ensure the "old" behavior regardless. > > The motivation is described in the PEP, but briefly, on (many, most, all?) > Linux distros any single file can be owned by exactly one system package. So > which package would own e.g. zope/__init__.py? 
> > Before PEP 420, the answer is "well, it's complicated". After PEP 420, the > answer is "no package, because that file doesn't exist". > > This is why the Debian tools delete any stray zope/__init__.py files when they > are creating any zope.* binary package for Python 3. > > I think you're right that removing the version guard will make it work, but > they won't be PEP 420 packages. I don't think it's possible to make a > bilingual package a pseudo-namespace package under Python 2, but a PEP 420 > namespace package under Python >= 3.3. > > Cheers, > -Barry > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig I'm pretty sure the problem with setup.py develop and setup.py install is because they are installed as eggs more or less and setuptools probably doesn't support it. pip install installs it unpacked so it'll rely on built-in importing. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From tseaver at palladion.com Fri Jan 2 03:11:13 2015 From: tseaver at palladion.com (Tres Seaver) Date: Thu, 01 Jan 2015 21:11:13 -0500 Subject: [Distutils] Bilingual namespace package conundrum In-Reply-To: <759E9E8B-575D-455D-B1AE-5AE91C7FF863@stufft.io> References: <20150101141916.4e2f0bf0@anarchist> <20150101165449.6dec9bba@anarchist> <759E9E8B-575D-455D-B1AE-5AE91C7FF863@stufft.io> Message-ID: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On 01/01/2015 04:57 PM, Donald Stufft wrote: > I'm pretty sure the problem with setup.py develop and setup.py install > is because they are installed as eggs more or less and setuptools > probably doesn't support it. pip install installs it > unpacked so it'll rely on built-in importing. I'm puzzled by that notion: 'setup.py develop' and 'setup.py install' have worked for a decade with namespace packages, as long as the __init__ for them does the Right Thing (using either pkg_resources or pkgutil).
What has never worked properly is mixing pip installation of namespace packages with easy_install installation of packages in the same namespace: pip's "install in one directory without fixups" strategy screws up the __path__ for the namespace packages installed elsewhere. Tres. - -- =================================================================== Tres Seaver +1 540-429-0999 tseaver at palladion.com Palladion Software "Excellence by Design" http://palladion.com -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.4.11 (GNU/Linux) iEYEARECAAYFAlSl/kEACgkQ+gerLs4ltQ4aYQCcDImoKfLr3Xb+tkFSY9x8Oinh QT8AoNDKmxVoqdX3UMnsUg1ktZbqukZg =CPP5 -----END PGP SIGNATURE----- From donald at stufft.io Fri Jan 2 03:14:27 2015 From: donald at stufft.io (Donald Stufft) Date: Thu, 1 Jan 2015 21:14:27 -0500 Subject: [Distutils] Bilingual namespace package conundrum In-Reply-To: References: <20150101141916.4e2f0bf0@anarchist> <20150101165449.6dec9bba@anarchist> <759E9E8B-575D-455D-B1AE-5AE91C7FF863@stufft.io> Message-ID: <5B3159A7-26F2-424C-8F0E-51882C65B44B@stufft.io> > On Jan 1, 2015, at 9:11 PM, Tres Seaver wrote: > > -----BEGIN PGP SIGNED MESSAGE----- > Hash: SHA1 > > On 01/01/2015 04:57 PM, Donald Stufft wrote: > >> I'm pretty sure the problem with setup.py develop and setup.py install >> is because they are installed as eggs more or less and setuptools >> probably doesn't support it. pip install installs it >> unpacked so it'll rely on built-in importing. > > I'm puzzled by that notion: 'setup.py develop' and 'setup.py install' > have worked for a decade with namespace packages, as long as the __init__ > for them does the Right Thing (using either pkg_resources or pkgutil). > > What has never worked properly is mixing pip installation of namespace > packages with easy_install installation of packages in the same > namespace: pip's "install in one directory without fixups" strategy > screws up the __path__ for the namespace packages installed elsewhere.
> Why are you puzzled by the notion that something designed to work with one mechanism for a particular feature probably does not work with a newer, different mechanism for a particular feature? My assumption is that setuptools is ensuring that the __init__.py files are being written (or the nspkg or whatever, I forget which files are used in which cases) which is breaking the PEP 420 "implicit" namespace packages. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From barry at python.org Fri Jan 2 05:33:59 2015 From: barry at python.org (Barry Warsaw) Date: Thu, 1 Jan 2015 23:33:59 -0500 Subject: [Distutils] Bilingual namespace package conundrum References: <20150101141916.4e2f0bf0@anarchist> <20150101165449.6dec9bba@anarchist> <759E9E8B-575D-455D-B1AE-5AE91C7FF863@stufft.io> <5B3159A7-26F2-424C-8F0E-51882C65B44B@stufft.io> Message-ID: <20150101233359.5852c0fc@anarchist> On Jan 01, 2015, at 09:14 PM, Donald Stufft wrote: >Why are you puzzled by the notion that something designed to work with >one mechanism for a particular feature probably does not work with a >newer, different mechanism for a particular feature? My assumption is that >setuptools is ensuring that the __init__.py files are being written (or the >nspkg or whatever, I forget which files are used in which cases) which is >breaking the PEP 420 "implicit" namespace packages. Note that in neither case are actual PEP 420 namespace packages being used. It would be nice if both pip and setuptools could handle namespace packages, but neither do. In fact I think if they did, there wouldn't be a problem. The problem comes about because setuptools is installing them as separate eggs which are *not* PEP 420 portions. pip succeeds because it happens to install the portions under the same top-level directory, so the namespace's __init__.py just happens to get overwritten and thus works. Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed...
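[Editor's note: for reference, the PEP 420 behavior under discussion, where independently installed portions share a top-level name with no top-level __init__.py anywhere, can be demonstrated with a small sketch (hypothetical directory layout, Python >= 3.3).]

```python
import importlib
import os
import sys
import tempfile

# Two fake "installed" locations, each contributing one portion of a
# lazr-style namespace. No lazr/__init__.py is created anywhere; under
# Python >= 3.3 that absence is what makes lazr a PEP 420 namespace package.
root = tempfile.mkdtemp()
for portion in ("config", "delegates"):
    pkg_dir = os.path.join(root, "site-" + portion, "lazr", portion)
    os.makedirs(pkg_dir)
    # The portion itself is a regular package with its own __init__.py.
    with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
        f.write("name = %r\n" % portion)
    sys.path.insert(0, os.path.join(root, "site-" + portion))

importlib.invalidate_caches()
import lazr.config
import lazr.delegates

print(lazr.config.name)      # config
print(lazr.delegates.name)   # delegates
# Both sys.path entries contribute to the merged namespace __path__.
print(len(list(lazr.__path__)) >= 2)  # True
```

Dropping an __init__.py into either lazr/ directory would disable the merging and reproduce the ImportErrors described above.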
Name: not available Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From ncoghlan at gmail.com Fri Jan 2 06:57:22 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 2 Jan 2015 15:57:22 +1000 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> Message-ID: On 1 January 2015 at 05:51, Donald Stufft wrote: > > So here is my problem. I'm completely on board with the developer signing > for the distribution files. I think that makes total sense. However I worry > that requiring the developer to sign for what is essentially the > "installer" API (aka how pip discovers things to install) is going to put > us in a situation where we cannot evolve the API easily. If we modified > this PEP so that an online key signed for /simple/ what security properties > would we lose? > > It *appears* to me that the problem then would be that a compromise of > PyPI can present whatever information they want to pip as to what is > available for pip to download and install. This would mean freeze attacks, > mix and match attacks. It would also mean that they could, in a future > world where pip can use metadata on PyPI to do dependency resolution, tell > pip that it needs to download a valid but malicious project as a dependency > of a popular project like virtualenv.
> > However I don't think they'd be able to actually cause pip to install a > malicious copy of a good project, and I believe that we can protect against > an attacker who possesses that key from tricking pip into installing a > malicious but valid project as a fake dependency by having pip only use the > theoretical future PyPI metadata that lists dependencies as an optimization > hint for what it should download, and then once it's actually downloaded a > project like virtualenv (which has been validated to be from the real > author) peek inside that file and ensure that the metadata inside that > matches what PyPI told pip. > > Is my assessment correct? Is keeping the "API" under control of PyPI a > reasonable thing to do while keeping the actual distribution files > themselves under control of the distribution authors? The reason this > worries me is that, unlike a Linux distribution or an application like > Firefox, we don't have much of a relationship with the people who are > uploading things to PyPI. So if we need to evolve the API we are not going > to be able to compel our authors to go back and re-generate new signed > metadata. > I think this is a good entry point for an idea I've had kicking around in my brain for the past couple of days: what if we change the end goal of PEP 480 slightly, from "prevent attackers from compromising published PyPI metadata" to "allow developers & administrators to rapidly detect and recover from compromised PyPI metadata"? My reasoning is that when it comes to PyPI security, there are actually two major dials we can twiddle: * raising the cost of an attack (e.g. making compromise harder by distributing signing authority to developers) * reducing the benefit of an attack (e.g.
making the expected duration, and hence reach, of a compromise lower, or downgrading an artifact substitution attack to a denial of service attack) To raise the cost of a compromise through distributed signing authority, we have to solve the trust management problem - getting developer keys out to end users in a way that doesn't involve trusting the central PyPI service. That's actually a really difficult problem to solve, which is why we have situations like TLS still relying on the CA system, despite the known problems with the latter. However, the latter objective is potentially more tractable: we wouldn't need to distribute trust management out to arbitrary end users, we'd "just" need a federated group of entities that are in a position to detect that PyPI has potentially been compromised, and request a service shutdown until such time as the compromise has been investigated and resolved. This notion isn't fully evolved yet (that's why this email is so long), but it feels like a far more viable direction to me than the idea of pushing the enhanced security management problem back on to end users. Suppose, for example, there were additional independently managed validation services hosting TUF metadata for various subsets of PyPI. The enhanced security model would then involve developers opting in to uploading their package metadata to one or more of the validation servers, rather than just to the main PyPI server. pip itself wouldn't worry about checking the validation services - it would just check against the main server as it does today, so we wouldn't need to worry about how we get the root keys for the validation servers out to arbitrary client end points. That is, rather than "sign your own packages", the enhanced security model becomes "get multiple entities to sign your packages, so compromise of any one entity (including PyPI itself) can be detected and investigated appropriately". 
The *validation* services would then be responsible for checking that their own registered metadata matched the metadata being published on PyPI. If they detect a discrepancy between their own metadata and PyPI's, then we'd have a human-in-the-loop process for reporting the problem, and the most likely response would be to disable PyPI downloads while the situation was resolved. I believe something like that would change the threat landscape in a positive way, and has three very attractive features over distributed signing authority: * It's completely transparent at the point of installation - it transforms PEP 480 into a back end data integrity validation project, rather than something that affects the end user experience of the PyPI ecosystem. The changes to the installation experience would be completely covered by PEP 458. * Uploading metadata to additional servers for signing is relatively low impact on developers (if they have an automated release process, it's likely just another line in a script somewhere), significantly lowering barriers to adoption relative to asking developers to sign their own packages. 
* Folks that decide to run or use a validation server are likely going to be more closely engaged with the PyPI community, and hence easier to reach as the metadata requirements evolve.

In terms of how I believe such a change would mitigate the threat of a PyPI compromise:

* it provides a cryptographically validated way to detect a compromise of any packages registered with one or more validation services, significantly reducing the likelihood of a meaningful PyPI compromise going undetected
* in any subsequent investigation, we'd have multiple sets of cryptographically validated metadata to compare to identify exactly what was compromised, and how it was compromised
* the new attack vectors introduced (by compromising the validation services rather than PyPI itself) are *denial of service* attacks (due to PyPI downloads being disabled while the discrepancy is investigated), rather than the artifact substitution that is possible by attacking PyPI directly

That means we would move from the status quo, where a full PyPI compromise may permit silent substitution of artifacts, to one where an illicit online package substitution would likely be detected in minutes or hours for high profile projects, so the likely pay-off for an attack on the central infrastructure is a denial of service against organisations not using their own local PyPI mirrors, rather than arbitrary software installation on a wide range of systems. Another nice benefit of this approach is that it also protects against attacks on developer PyPI *accounts*, so long as they use different authentication mechanisms on the validation server over the main PyPI server. For example, larger organisations could run their *own* validation server for the packages they publish, and manage it using offline keys as recommended by TUF - that's a lot easier to do when you don't need to allow arbitrary uploads. 
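[Editor's illustration] The core check a validation service would perform is just a comparison of independently registered metadata. The sketch below is purely illustrative: the project names, artifact contents, and the use of plain dicts to stand in for signed TUF metadata are all invented for the example.

```python
import hashlib


def digest(data: bytes) -> str:
    """Stand-in for the artifact hash recorded in signed repository metadata."""
    return hashlib.sha256(data).hexdigest()


# What PyPI currently publishes (one entry has been tampered with).
pypi_metadata = {
    "examplepkg-1.0.tar.gz": digest(b"original examplepkg contents"),
    "otherpkg-2.1.tar.gz": digest(b"attacker-substituted contents"),
}

# What the developers registered with an independent validation service.
validator_metadata = {
    "examplepkg-1.0.tar.gz": digest(b"original examplepkg contents"),
    "otherpkg-2.1.tar.gz": digest(b"original otherpkg contents"),
}


def find_discrepancies(pypi: dict, validator: dict) -> list:
    """Artifacts whose registered digests disagree between the two services."""
    return sorted(name for name in validator if pypi.get(name) != validator[name])


suspect = find_discrepancies(pypi_metadata, validator_metadata)
print(suspect)
```

A non-empty result is what would feed the human-in-the-loop process: investigate, and possibly disable downloads while the situation is resolved. The point is that a silent artifact substitution becomes a detectable event.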
Specific *projects* could still be attacked (by compromising developer systems), but that's not a new threat, and outside the scope of PEP 458/480 - we're aiming to mitigate the threat of *systemic* compromise that currently makes PyPI a relatively attractive target. As far as the pragmatic aspects go, we could either go with a model where projects are encouraged to run their *own* validation services on something like OpenShift (or even a static hosting site if they generate their validation metadata locally), or else we could look for willing partners to host public PyPI metadata validation servers (e.g. the OpenStack Foundation, Fedora/Red Hat, perhaps someone from the Debian/Ubuntu/Canonical ecosystem, perhaps some of the other commercial Python redistributors) Regards, Nick. [1] Via Leigh Alexander, I was recently introduced to this excellent paper on understanding and working with the mental threat models that users actually have, rather than attempting to educate the users: https://cups.cs.cmu.edu/soups/2010/proceedings/a11_Walsh.pdf. While the paper is specifically written in the context of home PC security, I think that's good advice in general: adjusting software systems to accommodate the reality of human behaviour is usually going to be far more effective than attempting to teach humans to conform to the current needs of the software. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From donald at stufft.io Fri Jan 2 07:13:17 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 2 Jan 2015 01:13:17 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> Message-ID: > On Jan 2, 2015, at 12:57 AM, Nick Coghlan wrote: > > To raise the cost of a compromise through distributed signing authority, we have to solve the trust management problem - getting developer keys out to end users in a way that doesn't involve trusting the central PyPI service. That's actually a really difficult problem to solve, which is why we have situations like TLS still relying on the CA system, despite the known problems with the latter. I haven't read the entirety of your email, but I would like to point out that PEP 480 does not attempt to solve this problem without trusting PyPI. Rather it just moves the trust from trusting the server that runs PyPI to trusting the people running PyPI itself. TUF is fundamentally extremely similar to the CA system except there is only one CA which is scoped to a particular repository (e.g. PyPI) and it includes some distribution specific stuff like file size and delegating partial trust. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA 
From ncoghlan at gmail.com Fri Jan 2 07:33:59 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 2 Jan 2015 16:33:59 +1000 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> Message-ID: On 2 January 2015 at 16:13, Donald Stufft wrote: > > On Jan 2, 2015, at 12:57 AM, Nick Coghlan wrote: > > To raise the cost of a compromise through distributed signing authority, > we have to solve the trust management problem - getting developer keys out > to end users in a way that doesn't involve trusting the central PyPI > service. That's actually a really difficult problem to solve, which is why > we have situations like TLS still relying on the CA system, despite the > known problems with the latter. > > > I haven't read the entirety of your email, but I would like to point out > that PEP 480 does not attempt to solve this problem without trusting PyPI. > Rather it just moves the trust from trusting the server that runs PyPI to > trusting the people running PyPI itself. TUF is fundamentally extremely > similar to the CA system except there is only one CA which is scoped to a > particular repository (e.g. PyPI) and it includes some distribution > specific stuff like file size and delegating partial trust. > That's the part I meant - the signing of developer keys to delegate trust to them without needing to trust the integrity of the online PyPI service. Hence the idea of instead keeping PyPI as an entirely online service (without any offline delegation of authority), and suggesting that developers keep their *own* separately signed metadata, which can then be compared against the PyPI published metadata (both by the developers themselves and by third parties). 
Discrepancies would become a trigger for further investigation, which may include suspending the PyPI service if the discrepancy is reported by an individual or organisation that the PyPI administrators trust. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From donald at stufft.io Fri Jan 2 07:38:02 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 2 Jan 2015 01:38:02 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> Message-ID: <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> > On Jan 2, 2015, at 1:33 AM, Nick Coghlan wrote: > > On 2 January 2015 at 16:13, Donald Stufft > wrote: > >> On Jan 2, 2015, at 12:57 AM, Nick Coghlan > wrote: >> >> To raise the cost of a compromise through distributed signing authority, we have to solve the trust management problem - getting developer keys out to end users in a way that doesn't involve trusting the central PyPI service. That's actually a really difficult problem to solve, which is why we have situations like TLS still relying on the CA system, despite the known problems with the latter. > > > I haven't read the entirety of your email, but I would like to point out that PEP 480 does not attempt to solve this problem without trusting PyPI. Rather it just moves the trust from trusting the server that runs PyPI to trusting the people running PyPI itself. TUF is fundamentally extremely similar to the CA system except there is only one CA which is scoped to a particular repository (e.g. PyPI) and it includes some distribution specific stuff like file size and delegating partial trust. > > That's the part I meant - the signing of developer keys to delegate trust to them without needing to trust the integrity of the online PyPI service. 
> > Hence the idea of instead keeping PyPI as an entirely online service (without any offline delegation of authority), and suggesting that developers keep their *own* separately signed metadata, which can then be compared against the PyPI published metadata (both by the developers themselves and by third parties). Discrepancies would become a trigger for further investigation, which may include suspending the PyPI service if the discrepancy is reported by an individual or organisation that the PyPI administrators trust. I'm confused what you mean by "without needing to trust the integrity of the online PyPI service". Developer keys get signed by offline keys controlled by, I'm guessing, either myself or Richard or both. The only time we're depending on the integrity of the machine that runs PyPI, and not on an offline key possessed by someone, is during the window between the creation of a new project (the project itself, not a release of a project) and the next time the delegations get signed by the offline keys. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From ncoghlan at gmail.com Fri Jan 2 09:21:40 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 2 Jan 2015 18:21:40 +1000 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> Message-ID: On 2 January 2015 at 16:38, Donald Stufft wrote: > > On Jan 2, 2015, at 1:33 AM, Nick Coghlan wrote: > > That's the part I meant - the signing of developer keys to delegate trust > to them without needing to trust the integrity of the online PyPI service. 
> > Hence the idea of instead keeping PyPI as an entirely online service > (without any offline delegation of authority), and suggesting that > developers keep their *own* separately signed metadata, which can then be > compared against the PyPI published metadata (both by the developers > themselves and by third parties). Discrepancies would become a trigger for > further investigation, which may include suspending the PyPI service if the > discrepancy is reported by an individual or organisation that the PyPI > administrators trust. > > > I'm confused what you mean by "without needing to trust the integrity > of the online PyPI service". > > Developer keys get signed by offline keys controlled by, I'm guessing, > either myself or Richard or both. The only time we're depending on the > integrity of the machine that runs PyPI, and not on an offline key possessed > by someone, is during the window between the creation of a new project > (the project itself, not a release of a project) and the next time the > delegations get signed by the offline keys. > Yes, as I said, that's the part I mean. To avoid trusting the integrity of the online PyPI service, while still using PyPI as a trust root for the purpose of software installation, we would need to define a system whereby:

1. The PyPI administrators have a set of offline keys
2. Developers are able to supply keys to the PyPI administrators for trust delegation
3. This system has sufficiently low barriers to entry that developers are actually willing to use it
4. This system is compatible with a PyPI-run build service

We already have a hard security problem to solve (running PyPI), so adding a *second* hard security problem (running what would in effect be a CA) doesn't seem like a good approach to risk mitigation to me. 
My proposal is that we instead avoid the hard problem of running a CA entirely by advising *developers* to monitor PyPI's integrity by ensuring that what PyPI is publishing matches what they released. That is, we split the end-to-end data integrity validation problem in two and solve each part separately: * use PEP 458 to cover the PyPI -> end user link, with the end users treating PyPI as a trusted authority. End users will be able to detect tampering with the link between them and PyPI, but if the online PyPI service gets compromised, *end users won't detect it*. * use a separate metadata validation process to check that PyPI is publishing the right thing, covering both the developer -> PyPI link *and* the integrity of the PyPI service itself. The metadata validation potentially wouldn't even need to use TUF - developers could simply upload the expected hash of the artifacts they published, and the metadata validation service would check that the signed artifacts from PyPI match those hashes. The core of the idea is simply that there be a separate service (or services) which PyPI can't update, but developers uploading packages *can*. By focusing on detection & recovery, rather than prevention, we can drastically reduce the complexity of the problem to be solved, while still mitigating the major risks we care about. The potential attacks that worry me are the ones that result in silent substitution of artifacts - when it comes to denial of service attacks, there's little reason to mess about with inducing metadata validation failures when there are already far simpler options available. 
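[Editor's illustration] Nick's hash-upload variant is simple enough to sketch directly. This is a minimal, hypothetical illustration of the idea, assuming a developer publishes a SHA-256 digest at release time to a service PyPI itself cannot write to; all names and file contents are invented:

```python
import hashlib


def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


# At release time, the developer uploads this digest to the validation
# service (a service the PyPI infrastructure has no write access to).
released_artifact = b"contents of examplepkg-1.0.tar.gz"
developer_published_hash = sha256_hex(released_artifact)


def pypi_matches_release(served: bytes, expected_hash: str) -> bool:
    """Does the artifact PyPI currently serves match the developer's release?"""
    return sha256_hex(served) == expected_hash


# Unmodified artifact passes; a substituted artifact is flagged.
ok = pypi_matches_release(released_artifact, developer_published_hash)
tampered = pypi_matches_release(b"malicious replacement build", developer_published_hash)
print(ok, tampered)  # True False
```

Nothing in this check requires TUF, key distribution to end users, or any change to pip: it only answers the second of Nick's two questions (is PyPI publishing what the developer released?), leaving the first to PEP 458.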
Redistributors may decide to take advantage of the developer metadata validation support to do our own verification of source downloads, but I don't believe upstream needs to worry about that too much - if developers have the means to verify automatically that what PyPI is currently publishing matches what they released, then the redistributor side of things should take care of itself. Another way of viewing the problem is that instead of thinking of the scope of PEP 480 as PyPI delegating trust to developers, we can instead think of it as developers delegating trust to PyPI. PyPI then becomes a choke point in a network graph, rather than the root of a tree. My core idea stays the same regardless of how you look at it though: we *don't* try to solve the problem of letting end users establish in a single step that what they downloaded matches what the developer published. Instead, we aim to provide answers to the questions: * Did I just download what PyPI is currently publishing? * Is PyPI currently publishing what the developer released? There's no fundamental requirement that those two questions be answered by the *same* security system - we have the option of splitting them, and I'm starting to think that the overall UX will be better if we do. Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia 
From donald at stufft.io Fri Jan 2 09:57:28 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 2 Jan 2015 03:57:28 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> Message-ID: > On Jan 2, 2015, at 3:21 AM, Nick Coghlan wrote: > > On 2 January 2015 at 16:38, Donald Stufft > wrote: > >> On Jan 2, 2015, at 1:33 AM, Nick Coghlan > wrote: >> >> That's the part I meant - the signing of developer keys to delegate trust to them without needing to trust the integrity of the online PyPI service. >> >> Hence the idea of instead keeping PyPI as an entirely online service (without any offline delegation of authority), and suggesting that developers keep their *own* separately signed metadata, which can then be compared against the PyPI published metadata (both by the developers themselves and by third parties). Discrepancies would become a trigger for further investigation, which may include suspending the PyPI service if the discrepancy is reported by an individual or organisation that the PyPI administrators trust. > > I'm confused what you mean by "without needing to trust the integrity of the online PyPI service". > > Developer keys get signed by offline keys controlled by, I'm guessing, either myself or Richard or both. The only time we're depending on the integrity of the machine that runs PyPI, and not on an offline key possessed by someone, is during the window between the creation of a new project (the project itself, not a release of a project) and the next time the delegations get signed by the offline keys. > > Yes, as I said, that's the part I mean. To avoid trusting the integrity of the online PyPI service, while still using PyPI as a trust root for the purpose of software installation, we would need to define a system whereby: > > 1. 
The PyPI administrators have a set of offline keys > 2. Developers are able to supply keys to the PyPI administrators for trust delegation > 3. This system has sufficiently low barriers to entry that developers are actually willing to use it > 4. This system is compatible with a PyPI-run build service > > We already have a hard security problem to solve (running PyPI), so adding a *second* hard security problem (running what would in effect be a CA) doesn't seem like a good approach to risk mitigation to me. > > My proposal is that we instead avoid the hard problem of running a CA entirely by advising *developers* to monitor PyPI's integrity by ensuring that what PyPI is publishing matches what they released. That is, we split the end-to-end data integrity validation problem in two and solve each part separately: > > * use PEP 458 to cover the PyPI -> end user link, with the end users treating PyPI as a trusted authority. End users will be able to detect tampering with the link between them and PyPI, but if the online PyPI service gets compromised, *end users won't detect it*. > * use a separate metadata validation process to check that PyPI is publishing the right thing, covering both the developer -> PyPI link *and* the integrity of the PyPI service itself. > > The metadata validation potentially wouldn't even need to use TUF - developers could simply upload the expected hash of the artifacts they published, and the metadata validation service would check that the signed artifacts from PyPI match those hashes. The core of the idea is simply that there be a separate service (or services) which PyPI can't update, but developers uploading packages *can*. > > By focusing on detection & recovery, rather than prevention, we can drastically reduce the complexity of the problem to be solved, while still mitigating the major risks we care about. 
The potential attacks that worry me are the ones that result in silent substitution of artifacts - when it comes to denial of service attacks, there's little reason to mess about with inducing metadata validation failures when there are already far simpler options available. > > Redistributors may decide to take advantage of the developer metadata validation support to do our own verification of source downloads, but I don't believe upstream needs to worry about that too much - if developers have the means to verify automatically that what PyPI is currently publishing matches what they released, then the redistributor side of things should take care of itself. > > Another way of viewing the problem is that instead of thinking of the scope of PEP 480 as PyPI delegating trust to developers, we can instead think of it as developers delegating trust to PyPI. PyPI then becomes a choke point in a network graph, rather than the root of a tree. My core idea stays the same regardless of how you look at it though: we *don't* try to solve the problem of letting end users establish in a single step that what they downloaded matches what the developer published. Instead, we aim to provide answers to the questions: > > * Did I just download what PyPI is currently publishing? > * Is PyPI currently publishing what the developer released? > > There's no fundamental requirement that those two questions be answered by the *same* security system - we have the option of splitting them, and I'm starting to think that the overall UX will be better if we do. > > Regards, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia Oh I see. I was just misreading what you meant by "without trusting the integrity of the online PyPI service"; I thought you meant it in a post-PEP 480 world, but you meant it in a pre- (or without) PEP 480 world. So onto the actual thing that you've proposed! 
I have concerns about the actual feasibility of doing such a thing, some of which are similar to my concerns with doing non-mandatory PEP 480.

* If uploading to a verifier service is optional then a significant portion of authors simply won't do it, and if you install 100 things, and 99 of them are verified and 1 of them is not, then there is an attack vector that I can use to compromise you undetected (since the author didn't upload their verification somewhere else).
* It's not actually less work in general, it just pushes the work from the PyPI administrators to the community. This can work well if the community is willing to step up! However, PyPI's availability/speed problems were originally attempted to be solved by pushing the work to the community via the original mirror system, and (not to downplay the people who did step up) the response was not particularly great: there were a few mirrors at first, and as the shiny factor wore off people's mirrors shut down or stopped working or what have you.
* A number of the attacks that TUF protects against do not rely on the attacker creating malicious software packages, things like only showing known-insecure versions of a project so that they can then attack people through a known exploit. It's not *wrong* to not protect against these (most systems don't) but we'd want to explicitly decide that we're not going to.

I'd note that PEP 480 and your proposal aren't really mutually exclusive, so there's no real harm in *trying* yours and, if it fails, falling back to something like PEP 480, other than end user confusion if that gets shut down and the cost of actually developing/setting up that solution. Overall I'm +1 on things that enable better detection of a compromise, but I'm probably -0.5 or so on your specific proposal, as I think that expecting developers to upload verification data to "verification servers" is just pushing work onto other people just so we don't have to do it. 
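[Editor's illustration] Donald's first bullet, the weak link created by opt-in verification, is easy to make concrete. A hypothetical sketch (the package names are invented for the example):

```python
# If verification is opt-in, an attacker targets whichever dependency has
# no verification record: substituting it goes undetected even though the
# other installs in the set are checked against a validation service.

install_set = ["requests", "virtualenv", "left-pad-py"]  # hypothetical install
verified = {"requests", "virtualenv"}  # authors who opted in to verification

unprotected = [name for name in install_set if name not in verified]
print(unprotected)  # the attack surface left open by opt-in verification
```

One unverified package in a hundred is enough, because the attacker, not the user, chooses where to strike.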
I also think your two questions are not exactly right, because all that means is that it becomes harder to attack *everyone* via a PyPI compromise; however, it's still trivial to attack specific people if you've compromised PyPI or the CDN, since you can selectively serve maliciously signed packages depending on who is requesting them. To this end I don't think a solution that pip doesn't implement is actually going to prevent anything but very dumb attacks by an attacker who has already compromised the PyPI machines. I think another issue here is that we're effectively doing something similar to TLS except instead of domain names we have project names, and that although there are *a lot* of people who really hate the CA system, nobody has yet come up with an effective means of actually replacing it without regressing into worse security. The saving grace here is that we operate at a much smaller scale (one "DNS" root, one trust root, ~53k unique names vs. more than I feel like counting), so it's possible that solutions which don't scale at TLS scale might scale at PyPI scale. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From p.f.moore at gmail.com Fri Jan 2 12:04:40 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 2 Jan 2015 11:04:40 +0000 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> Message-ID: On 2 January 2015 at 06:38, Donald Stufft wrote: > Developer keys get signed by offline keys controlled by, I'm guessing, either > myself or Richard or both. One thought here. The issue being discussed here seems mainly to be that it's hard to manage signing of developer keys. 
That's certainly one issue, but another is that the signing process takes time. When I set up my first project [1], I did so because I had an idea one afternoon, knocked up the first draft and set up the project. If there had been a delay of a week because you and Richard were both on holiday (or even a day, because of timezones) I may not have bothered - I tend to only have the opportunity to work on things for Python in pretty short bursts. You could argue that we don't want projects on PyPI that have been set up with so little preparation - it's a valid position to take - but that's a separate matter. I just want to make the point that management isn't the only issue here. Turnaround time is also a barrier to entry that needs to be considered. And not every project that people want to publish is something major like requests or django... Paul [1] I assume I only need to set up a key once, for my PyPI account. If I need an individual key per project, the cost multiplies. And it means that the barrier is to all new projects, rather than merely to attracting new developers. From donald at stufft.io Fri Jan 2 12:21:20 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 2 Jan 2015 06:21:20 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> Message-ID: > On Jan 2, 2015, at 6:04 AM, Paul Moore wrote: > > On 2 January 2015 at 06:38, Donald Stufft wrote: >> Developer keys get signed by offline keys controlled by I?m guessing either >> myself or Richard or both. > > One thought here. The issue being discussed here seems mainly to be > that it's hard to manage signing of developer keys. That's certainly > one issue, but another is that the signing process takes time. 
When I > set up my first project [1], I did so because I had an idea one > afternoon, knocked up the first draft and set up the project. If there > had been a delay of a week because you and Richard were both on > holiday (or even a day, because of timezones) I may not have bothered > - I tend to only have the opportunity to work on things for Python in > pretty short bursts. > > You could argue that we don't want projects on PyPI that have been set > up with so little preparation - it's a valid position to take - but > that's a separate matter. I just want to make the point that > management isn't the only issue here. Turnaround time is also a > barrier to entry that needs to be considered. And not every project > that people want to publish is something major like requests or > django... > > Paul > > [1] I assume I only need to set up a key once, for my PyPI account. If > I need an individual key per project, the cost multiplies. And it > means that the barrier is to all new projects, rather than merely to > attracting new developers. To be clear, there is zero delay in being able to publish a new project; the delay is in moving a new project from being validated by an online key to an offline key. The only real difference between validation levels is that, until it's been signed by an offline key, people installing that project are vulnerable to someone compromising PyPI. This is because until the delegation of project X to a specific developer has been signed, the "chain of trust" contains a key that is sitting on the hard drive of PyPI. However, once a delegation of Project X _has_ been signed, changing that delegation would mean waiting until the next time the delegations were signed by the offline keys. This is because once a project is signed by an offline key, all further changes to the delegation require offline signing. In addition, this does not mean (I believe! 
we should verify this) that the owner of a project cannot further delegate to other people without delay, since they'll be able to sign that delegation with their own keys and won't require intervention from PyPI. So really it looks like this (abbreviated, not exactly, etc):

root (offline)
|- PyPI Admins (offline)
   |- "Unclaimed" (online)
      |- New Project that hasn't yet been signed for by PyPI Admins (offline, owned by author)
   |- Existing Project that has already been signed for by PyPI Admins (offline, owned by author)

The periodic signing process by the PyPI admins just moves a new project from being signed for by the "Unclaimed" online key to being signed for by our offline keys. This process is basically invisible to everyone involved. Does that make sense? --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From p.f.moore at gmail.com Fri Jan 2 13:45:42 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 2 Jan 2015 12:45:42 +0000 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> Message-ID: On 2 January 2015 at 11:21, Donald Stufft wrote: > To be clear, there is zero delay in being able to publish a new project; the > delay is in moving a new project from being validated by an online key > to an offline key. OK, got it. Although on the terminology front, I don't really understand what an "online key" and an "offline key" are. My mind translates them as "worse" and "better" from context, but that's about all. I'm not asking for an explanation (I can look that up) but as terms being encountered by the non-specialist, they contribute to the difficulty in reading the proposals. If there's any better way of naming these two types of keys that would be accessible to the non-specialist, that would be a great help. 
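[Editor's illustration] For readers puzzling over the same terminology as Paul, the lifecycle Donald describes can be modelled in a few lines. This is a toy state model, not the actual TUF API; the key labels and function names are invented:

```python
from dataclasses import dataclass


@dataclass
class Delegation:
    project: str
    signing_key: str  # which key currently vouches for the project


def register_project(name: str) -> Delegation:
    # Publishing is instant: the online "unclaimed" key signs at once.
    # It lives on PyPI's servers, so it is only as safe as PyPI itself.
    return Delegation(name, "unclaimed (online)")


def periodic_offline_signing(d: Delegation) -> Delegation:
    # The periodic admin run moves projects under the offline keys, after
    # which changing the delegation requires another offline signing.
    return Delegation(d.project, "pypi-admins (offline)")


d = register_project("newpkg")
before = d.signing_key        # window where a PyPI compromise could matter
d = periodic_offline_signing(d)
after = d.signing_key         # delegation changes now need the offline keys
print(before, "->", after)
```

An "online key" is one stored on the running service (fast, but lost in a server compromise); an "offline key" is kept off the infrastructure entirely (safe from a server compromise, but every use requires a human).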
(Actually, the other implication I read into "offline key" is "something that I, as the owner, have to take care of and manage" - and that scares me because I'm rubbish at managing keys or anything like that, basically anything more complex than a password - at least until we get to things like RSA keys and tokens that I use for "work" and am paid to manage as opposed to "hobby" stuff).

> The only real difference between validation levels is that
> until it's been signed by an offline key then people installing that project
> are vulnerable to someone compromising PyPI. This is because until the
> delegation of project X to a specific developer has been signed the "chain of
> trust" contains a key that is sitting on the hard drive of PyPI.
>
> However, once a delegation of Project X _has_ been signed changing that
> delegation would need to wait until the next time the delegations were signed
> by the offline keys. This is because once a project is signed by an offline
> key then all further changes to the delegation require offline signing.

"Delegation" is another term that doesn't make immediate sense. I read it as "confirm ownership", sort of. Again, it's not that I can't work out what the term means, but I don't get an immediate sense of the implications. Here, for example, it's not immediately clear whether delegation changes would be common or rare, or whether having them happen quickly would be important. (For example, if you're not available for a pip release, and we never bothered sharing the keys because it's easier just for you to have them, would we need a delegation change to do an emergency release?)

Again, this isn't something that it's important to clarify for me here and now, but I would like to see the PEP updated to clarify this sort of issue in terms that are accessible to the layman.

> In addition, this does not mean (I believe!
we should verify this) that the
> owner of a project cannot further delegate to other people without delay, since
> they'll be able to sign that delegation with their own keys and won't require
> intervention from PyPI.

See above - implies to me that if the "owner" (in terms of keys rather than project structure) is unavailable, other project members may not be able to upload (rather than as now, when they can upload with the project's PyPI account password and/or standard password recovery processes to the project email address).

> So really it looks like this (abbreviated, not exactly, etc):
>
> root (offline)
> |- PyPI Admins (offline)
> |- "Unclaimed" (online)
> |- New Project that hasn't yet been signed for by PyPI Admins
>    (offline, owned by author)
> |- Existing Project that has already been signed for by PyPI Admins
>    (offline, owned by author)

I'm not keen on the term "unclaimed". All projects on PyPI right now are "unclaimed" by that usage, which is emphatically not what "unclaimed" means intuitively to me. Surely "pip" is "claimed" by the PyPA? Maybe "unverified" works as a term (as in, verifying your account when signing up to a forum). I get the idea that unclaimed implies there's a risk, and sure there is, but this smacks of using a loaded term to rub people's noses in the fact that what they've been happily using for years "isn't safe". This happens a lot with security debates, and IMO actively discourages people from buying into the changes.

It would be useful to have *some* document (or part thereof - maybe an overview section in the PEP) structured as an objective cost/benefit proposal:

1. The current PyPI infrastructure has the following risks. We assess the chance that they might occur as X.
2. The impact on the reader, as an individual, of a compromise, would be the following.
3. The cost to the reader, under the proposed solution, of avoiding the risks, is the following.
There are probably at least two classes of reader involved - project authors and project users. If in either case one class of user has to bear some cost on behalf of the other, then that should be called out.

I believe that I (and any other readers of the proposals) should be able to sensibly assess the cost/benefit tradeoffs on my own, given the above information. My experience and judgement may not be typical, so my opinion should be taken in the context of others, but that doesn't mean I'm wrong, or that my views are inappropriate for me. For example, in 20 years of extensively using software off the internet, I have never once downloaded a piece of software that wasn't what it claimed (and I expected it) to be. So the discussion of compromising PyPI packages seems like an amazingly low risk to me[1].

> The periodic signing process by the PyPI admins just moves a new project from
> being signed for by the "Unclaimed" online key to being signed for by our
> offline keys. This process is basically invisible to everyone involved.

It's as visible to end users as the significance of describing something as "unclaimed". If nobody cared a project was "unclaimed" then it would be invisible. Otherwise, less so. Hence my preference for a less emotive term.

> Does that make sense?

Yes it does - many thanks for the explanation.

Paul

[1] That's just an example, and it would be off-topic to debate the various other things that overall contribute to why and to what level I'm currently comfortable using PyPI. And I'm not running a business that would fail if PyPI were compromised. So please just take this as a small data point, nothing more.
From donald at stufft.io Fri Jan 2 14:48:43 2015
From: donald at stufft.io (Donald Stufft)
Date: Fri, 2 Jan 2015 08:48:43 -0500
Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480
In-Reply-To:
References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io>
Message-ID:

> On Jan 2, 2015, at 7:45 AM, Paul Moore wrote:
>
> On 2 January 2015 at 11:21, Donald Stufft wrote:
>> To be clear, there is zero delay in being able to publish a new project, the
>> delay is between moving from a new project being validated by an online key
>> to an offline key.
>
> OK, got it.
>
> Although on the terminology front, I don't really understand what an
> "online key" and an "offline key" are. My mind translates them as
> "worse" and "better" from context, but that's about all. I'm not
> asking for an explanation (I can look that up) but as terms being
> encountered by the non-specialist, they contribute to the difficulty
> in reading the proposals. If there's any better way of naming these
> two types of keys, that would be accessible to the non-specialist,
> that would be a great help.
>
> (Actually, the other implication I read into "offline key" is
> "something that I, as the owner, have to take care of and manage" -
> and that scares me because I'm rubbish at managing keys or anything
> like that, basically anything more complex than a password - at least
> until we get to things like RSA keys and tokens that I use for "work"
> and am paid to manage as opposed to "hobby" stuff).

Hmm, I'm not sure if there's really a better way of naming them. Those are "standard" names for them and they are defined in the PEP under the Definitions section [1]. For PEP 458 there is zero key management done by anyone other than the people running PyPI.
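The delegation hierarchy Donald drew earlier (root -> PyPI Admins -> "Unclaimed" / individual projects) can be thought of as a chain-of-trust lookup: a project's security level is determined by the weakest key along the chain from the root down to that project. The sketch below is purely illustrative - the role names, the dict layout, and the helper functions are invented for this example and are not TUF's actual API:

```python
# Hypothetical model of the delegation tree described above. Each role
# records the kind of key that guards it and the roles it delegates to.
TRUST_TREE = {
    "root":             {"key": "offline", "delegates": ["pypi-admins"]},
    "pypi-admins":      {"key": "offline", "delegates": ["unclaimed", "existing-project"]},
    "unclaimed":        {"key": "online",  "delegates": ["new-project"]},
    "new-project":      {"key": "offline", "delegates": []},
    "existing-project": {"key": "offline", "delegates": []},
}

def chain_to(role, tree=TRUST_TREE, start="root"):
    """Return the chain of roles from the root down to `role`, or None."""
    if start == role:
        return [start]
    for child in tree[start]["delegates"]:
        chain = chain_to(role, tree, child)
        if chain is not None:
            return [start] + chain
    return None

def weakest_link(role):
    """A project is only as safe as the weakest key in its chain: if any
    role on the path is guarded by an online key, a PyPI compromise can
    rewrite the delegation."""
    chain = chain_to(role)
    return "online" if any(TRUST_TREE[r]["key"] == "online" for r in chain) else "offline"

# A new project's chain still passes through the online "unclaimed" role,
# so it inherits online-key (weaker) security until the periodic signing
# moves it directly under the offline admin keys.
assert weakest_link("new-project") == "online"
assert weakest_link("existing-project") == "offline"
```

In this model, the "periodic signing process" Donald describes is simply re-parenting `new-project` from `unclaimed` to `pypi-admins`, after which its whole chain is offline-key protected.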
It is essentially replacing using TLS for verification with using TUF for verification (which gets us a few wins, but still leaves a few enhancements on the table that we get with the addition of PEP 480). For PEP 480 authors have to manage *something*. What exactly that *something* is, is a question for PEP 480. One (poor in my opinion) possibility is an RSA key which means that authors will need to manage keeping that file around and backed up and such. Another possibility is a secondary password which is only used when needing to sign something (so during uploads and the like).

>
>> The only real difference between validation levels is that
>> until it's been signed by an offline key then people installing that project
>> are vulnerable to someone compromising PyPI. This is because until the
>> delegation of project X to a specific developer has been signed the "chain of
>> trust" contains a key that is sitting on the hard drive of PyPI.
>>
>> However, once a delegation of Project X _has_ been signed changing that
>> delegation would need to wait until the next time the delegations were signed
>> by the offline keys. This is because once a project is signed by an offline
>> key then all further changes to the delegation require offline signing.
>
> "Delegation" is another term that doesn't make immediate sense. I read
> it as "confirm ownership", sort of. Again, it's not that I can't work
> out what the term means, but I don't get an immediate sense of the
> implications. Here, for example, it's not immediately clear whether
> delegation changes would be common or rare, or whether having them
> happen quickly would be important. (For example, if you're not
> available for a pip release, and we never bothered sharing the keys
> because it's easier just for you to have them, would we need a
> delegation change to do an emergency release?)
>
> Again, this isn't something that it's important to clarify for me here
> and now, but I would like to see the PEP updated to clarify this sort
> of issue in terms that are accessible to the layman.

This isn't defined in the PEP. In PEP 480 though, if I'm the only person who has been set up to be allowed to sign for pip releases, then nobody else can release pip without intervention from PyPI Administrators. That's not entirely different than the current situation where if I was the only person added as a maintainer to the pip project on PyPI and someone else needed to do a release they couldn't without intervention from the PyPI administrators. IOW doing the required steps to enable other people's keys to sign would be part of adding them as maintainers to that project.

>
>> In addition, this does not mean (I believe! we should verify this) that the
>> owner of a project cannot further delegate to other people without delay, since
>> they'll be able to sign that delegation with their own keys and won't require
>> intervention from PyPI.
>
> See above - implies to me that if the "owner" (in terms of keys rather
> than project structure) is unavailable, other project members may not
> be able to upload (rather than as now, when they can upload with the
> project's PyPI account password and/or standard password recovery
> processes to the project email address).

If the owner or someone they delegated to, yes.

>
>> So really it looks like this (abbreviated, not exactly, etc):
>>
>> root (offline)
>> |- PyPI Admins (offline)
>> |- "Unclaimed" (online)
>> |- New Project that hasn't yet been signed for by PyPI Admins
>>    (offline, owned by author)
>> |- Existing Project that has already been signed for by PyPI Admins
>>    (offline, owned by author)
>
> I'm not keen on the term "unclaimed". All projects on PyPI right now
> are "unclaimed" by that usage, which is emphatically not what
> "unclaimed" means intuitively to me. Surely "pip" is "claimed" by the
> PyPA?
Maybe "unverified" works as a term (as in, verifying your
> account when signing up to a forum). I get the idea that unclaimed
> implies there's a risk, and sure there is, but this smacks of using a
> loaded term to rub people's noses in the fact that what they've been
> happily using for years "isn't safe". This happens a lot with security
> debates, and IMO actively discourages people from buying into the
> changes.

Unclaimed is probably a bad name for it, although this name wouldn't actually be exposed to end users. It's sort of like if we had rel="unclaimed" links on /simple/.

>
> It would be useful to have *some* document (or part thereof - maybe an
> overview section in the PEP) structured as an objective cost/benefit
> proposal:
>
> 1. The current PyPI infrastructure has the following risks. We assess
> the chance that they might occur as X.
> 2. The impact on the reader, as an individual, of a compromise, would
> be the following.
> 3. The cost to the reader, under the proposed solution, of avoiding
> the risks, is the following.
>
> There are probably at least two classes of reader involved - project
> authors and project users. If in either case one class of user has to
> bear some cost on behalf of the other, then that should be called out.
>
> I believe that I (and any other readers of the proposals) should be
> able to sensibly assess the cost/benefit tradeoffs on my own, given
> the above information. My experience and judgement may not be typical,
> so my opinion should be taken in the context of others, but that
> doesn't mean I'm wrong, or that my views are inappropriate for me. For
> example, in 20 years of extensively using software off the internet, I
> have never once downloaded a piece of software that wasn't what it
> claimed, and I expected it, to be. So the discussion of compromising
> PyPI packages seems like an amazingly low risk to me[1].

See an upcoming email about refocusing the discussion here.
>
>> The periodic signing process by the PyPI admins just moves a new project from
>> being signed for by the "Unclaimed" online key to being signed for by our
>> offline keys. This process is basically invisible to everyone involved.
>
> It's as visible to end users as the significance of describing
> something as "unclaimed". If nobody cared a project was "unclaimed"
> then it would be invisible. Otherwise, less so. Hence my preference
> for a less emotive term.
>
>> Does that make sense?
>
> Yes it does - many thanks for the explanation.
>
> Paul
>
> [1] That's just an example, and it would be off-topic to debate the
> various other things that overall contribute to why and to what level
> I'm currently comfortable using PyPI. And I'm not running a business
> that would fail if PyPI were compromised. So please just take this as
> a small data point, nothing more.

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

From donald at stufft.io Fri Jan 2 15:25:17 2015
From: donald at stufft.io (Donald Stufft)
Date: Fri, 2 Jan 2015 09:25:17 -0500
Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480
In-Reply-To:
References:
Message-ID:

> On Dec 10, 2014, at 10:16 PM, Vladimir Diaz wrote:
>
> Hello everyone,
>
> I am a research programmer at the NYU School of Engineering. My colleagues (Trishank Kuppusamy and Justin Cappos) and I are requesting community feedback on our proposal, "Surviving a Compromise of PyPI." The two-stage proposal can be reviewed online at:
>
> PEP 458
> http://legacy.python.org/dev/peps/pep-0458/
>
> PEP 480
> http://legacy.python.org/dev/peps/pep-0480/

I'm going through the PEPs again, and I think that evaluating these PEPs is made more complicated by the fact that there are two of them; I agree that splitting up the two PEPs was the right thing to do, though. What do you think about putting PEP 480 on the back burner for the moment and focusing on PEP 458?
I think this will benefit the process in a few ways:

* I believe that PEP 458 is far less controversial than PEP 480 since it's effectively just exchanging TLS for TUF for verification, and other than for the authors of the tools who need to work with it (pip, devpi, bandersnatch, PyPI, etc) it's transparent for the end users.
* It allows us to discuss the particulars of getting TUF integrated in PyPI without worrying about the far nastier problem of exposing a signing UX to end authors.
* While discussing it, we can still ensure that we leave ourselves enough flexibility so that we don't need to make any major changes for PEP 480.
* The problems that are going to crop up while implementing it (is the mirror protocol good enough? CDN purging? etc.) are most likely to happen during PEP 458.
* I think having them both being discussed at the same time is causing a lot of confusion between which proposal does what.
* By focusing on one at a time we can get a more polished PEP that is easier to understand (see below).

Overall I think the PEPs themselves need a bit of work; they currently rely a lot on reading the TUF white paper and the TUF spec in the repository to get a grasp of what's going on. I believe they also read more like suggestions about how we might go about implementing TUF rather than an actionable plan for implementing TUF. I've seen feedback from more than one person that they feel like they are having a hard time grokking what the *impact* of the PEPs will be to them without having to go through and understand the intricacies of TUF and how the PEP implements TUF.
Technically I'm listed on this PEP as an author (although I don't remember anymore what I did to get on there :D), but I care a good bit about getting something better than TLS set up, so, if y'all are willing to defer PEP 480 for right now and focus on PEP 458, and y'all want it, I'm willing to actually earn that co-author spot and help get PEP 458 into shape and help get it accepted and implemented.

Either way though, I suggest focusing on PEP 458 (with an eye towards not making any decisions which will require changes on the client side to implement PEP 480).

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From ncoghlan at gmail.com Fri Jan 2 15:55:17 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 3 Jan 2015 00:55:17 +1000
Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480
In-Reply-To:
References:
Message-ID:

On 3 January 2015 at 00:25, Donald Stufft wrote:
>
> On Dec 10, 2014, at 10:16 PM, Vladimir Diaz
> wrote:
>
> Hello everyone,
>
> I am a research programmer at the NYU School of Engineering. My
> colleagues (Trishank Kuppusamy and Justin Cappos) and I are requesting
> community feedback on our proposal, "Surviving a Compromise of PyPI." The
> two-stage proposal can be reviewed online at:
>
> PEP 458
> http://legacy.python.org/dev/peps/pep-0458/
>
> PEP 480
> http://legacy.python.org/dev/peps/pep-0480/
>
>
> I'm going through the PEPs again, and I think that evaluating these PEPs is
> more complicated by the fact that there are two of them, I agree that
> splitting
> up the two PEPs was the right thing to do though. What do you think about
> putting PEP 480 on the back burner for the moment and focus on PEP 458.
>

+1 from me - being able to resolve them in that order was my main motivation for suggesting they be split apart into separate proposals.
I just don't personally have any major open questions for PEP 458 - while I'm aware there are some significant technical details to be resolved in terms of exactly what gets signed, and how the implementation will work in practice, I think the concept is sound, and I don't have the necessary knowledge of pip and PyPI internals to have an opinion on the details of the integration.

For PEP 480, by contrast, I still have some fundamental questions about whether we should go with a model of "additional developer managed secrets" (the PEP 480 approach, where developers register an additional authentication token for uploads with the PyPI administrators, which means compromising PyPI doesn't grant the ability to do fake uploads for a service) or "additional developer managed accounts" (my new suggestion in this thread, where we collaborate with other organisations like Linux distributions and the OpenStack Foundation for federated validation of critical uploads). While the two approaches are mathematically equivalent, I suspect they're profoundly different in terms of how easy it will be for folks to grasp the relevant security properties.

> I think this will benefit the process in a few ways:
>
> * I believe that PEP 458 is far less controversial than PEP 480 since it's
> effectively just exchanging TLS for TUF for verification and other than
> for
> the authors of the tools who need to work with it (pip, devpi,
> bandersnatch,
> PyPI, etc) it's transparent for the end users.
> * It allows us to discuss the particulars of getting TUF integrated in PyPI
> without worrying about the far nastier problem of exposing a signing UX
> to
> end authors.
> * While discussing it, we can still ensure that we leave ourselves enough
> flexibility so that we don't need to make any major changes for PEP 480.
> * The problems that are going to crop up while implementing it (is the
> mirror
> protocol good enough? CDN Purging? Etc) are most likely to happen
> during
> PEP 458.
> * I think having them both being discussed at the same time is causing a
> lot of
> confusion between which proposal does what.
> * By focusing on one at a time we can get a more polished PEP that is
> easier to
> understand (see below).
>

Resolving PEP 458 first was my intention when I made the proposal to split them. I only brought the broader PEP 480 proposal into question because the validation server based design finally started to click for me earlier today.

> Overall I think the PEPs themselves need a bit of work, they currently
> rely a
> lot on reading the TUF white paper and the TUF spec in the repository to
> get a
> grasp of what's going on. I believe they also read more like suggestions
> about
> how we might go about implementing TUF rather than an actionable plan for
> implementing TUF. I've seen feedback from more than one person that they
> feel
> like they are having a hard time grokking what the *impact* of the PEPs
> will be
> to them without having to go through and understand the intricacies of TUF
> and
> how the PEP implements TUF.
>
> Technically I'm listed on this PEP as an author (although I don't remember
> anymore what I did to get on there :D),
>

IIRC, the current draft of the snapshotting mechanism relied heavily on your input regarding what was feasible in the context of the PyPI API design.

> but I care a good bit about getting
> something better than TLS setup, so, if y'all are willing to defer PEP 480
> for
> right now and focus on PEP 458 and y'all want it, I'm willing to actually
> earn
> that co-author spot and help get PEP 458 into shape and help get it
> accepted
> and implemented.
>

If the TUF folks are amenable, I think that would be great.
You've been through the PEP process a few times now, and are probably far more familiar with what's involved in creating a robust PEP than you might wish ;)

> Either way though, I suggest focus on PEP 458 (with an eye towards not
> making
> any decisions which will require changes on the client side to implement
> PEP
> 480).
>

Yep. I'll make one more post to try to clarify why I see separate validation services as having the potential to offer a superior UX in a PEP 480 context, but I'm not pushing for a short term decision on that one - I'm just aiming to plant the seeds of ideas that may be worth keeping in mind as we work on PEP 458.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From donald at stufft.io Fri Jan 2 16:02:54 2015
From: donald at stufft.io (Donald Stufft)
Date: Fri, 2 Jan 2015 10:02:54 -0500
Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480
In-Reply-To:
References:
Message-ID: <221549F4-E5A3-4880-8663-797871C0FB26@stufft.io>

> On Jan 2, 2015, at 9:55 AM, Nick Coghlan wrote:
>
> I just don't personally have any major open questions for PEP 458 - while I'm aware there are some significant technical details to be resolved in terms of exactly what gets signed, and how the implementation will work in practice, I think the concept is sound, and I don't have the necessary knowledge of pip and PyPI internals to have an opinion on the details of the integration.

To be clear, I also think that PEP 458's concept is sound and I think it's the right direction to go in.
The things I think are holding it back are nailing down the technical details and getting it so that someone can get a high level understanding of what it's actually doing without needing to read the supporting documentation (white paper, etc) and without some level of what most would call expert knowledge in the area (though I don't like to call myself an expert, I realize to most people I likely am).

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From p.f.moore at gmail.com Fri Jan 2 16:31:42 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 2 Jan 2015 15:31:42 +0000
Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480
In-Reply-To:
References:
Message-ID:

On 2 January 2015 at 14:25, Donald Stufft wrote:
> I'm going through the PEPs again, and I think that evaluating these PEPs is
> more complicated by the fact that there are two of them, I agree that
> splitting
> up the two PEPs was the right thing to do though. What do you think about
> putting PEP 480 on the back burner for the moment and focus on PEP 458.
>
> I think this will benefit the process in a few ways:
>
> * I believe that PEP 458 is far less controversial than PEP 480 since it's
> effectively just exchanging TLS for TUF for verification and other than
> for the authors of the tools who need to work with it (pip, devpi,
> bandersnatch, PyPI, etc) it's transparent for the end users.
> * It allows us to discuss the particulars of getting TUF integrated in PyPI
> without worrying about the far nastier problem of exposing a signing UX to
> end authors.
> * While discussing it, we can still ensure that we leave ourselves enough
> flexibility so that we don't need to make any major changes for PEP 480.
> * The problems that are going to crop up while implementing it (is the
> mirror protocol good enough? CDN Purging? Etc) are most likely to happen during
> PEP 458.
> * I think having them both being discussed at the same time is causing a lot
> of confusion between which proposal does what.
> * By focusing on one at a time we can get a more polished PEP that is easier
> to understand (see below).
>
> Overall I think the PEPs themselves need a bit of work, they currently rely
> a lot on reading the TUF white paper and the TUF spec in the repository to get
> a grasp of what's going on. I believe they also read more like suggestions
> about how we might go about implementing TUF rather than an actionable plan
> for implementing TUF. I've seen feedback from more than one person that they
> feel like they are having a hard time grokking what the *impact* of the PEPs will
> be to them without having to go through and understand the intricacies of TUF
> and how the PEP implements TUF.
>
> Technically I'm listed on this PEP as an author (although I don't remember
> anymore what I did to get on there :D), but I care a good bit about getting
> something better than TLS setup, so, if y'all are willing to defer PEP 480
> for right now and focus on PEP 458 and y'all want it, I'm willing to actually
> earn that co-author spot and help get PEP 458 into shape and help get it
> accepted and implemented.
>
> Either way though, I suggest focus on PEP 458 (with an eye towards not
> making any decisions which will require changes on the client side to implement
> PEP 480).

+1 on all of this. I agree that PEP 458 is (relatively speaking) an obvious thing to do, and if the people who have to do the work for it are happy, I think it should just go ahead. I'd like to see the PEPs reworded a bit to be less intimidating to the non-specialist.
For PEPs about the trust model for PyPI, it's ironic that I have to place a lot of trust in the PEP authors simply because I don't understand half of what they are saying ;-)

Paul

From chris.jerdonek at gmail.com Fri Jan 2 16:39:54 2015
From: chris.jerdonek at gmail.com (Chris Jerdonek)
Date: Fri, 2 Jan 2015 07:39:54 -0800
Subject: [Distutils] problem viewing pep 440 docs on mobile
Message-ID:

Sorry if this isn't the best list on which to bring this up, but it came up for me during the recent PEP 440 discussions.

For a while I've noticed a serious problem when viewing PEP doc pages like the following on my iPhone:

https://www.python.org/dev/peps/pep-0440/

It's a bit maddening, so I wanted to see if anyone else on this list has experienced it or can reproduce it. It happens for me with both Chrome and Safari on the latest iOS 8.1.2, though I believe it happened for me on older iOS versions, too.

After the page is fully loaded, if I scroll down, say, halfway through the page, the browser will jump back to the very top. This effectively makes it impossible to read the page. I haven't noticed this issue with any page not hosted on python.org.

Can anyone else reproduce?

Thanks,
--Chris

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From ncoghlan at gmail.com Fri Jan 2 16:51:21 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 3 Jan 2015 01:51:21 +1000
Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480
In-Reply-To:
References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io>
Message-ID:

On 2 January 2015 at 18:57, Donald Stufft wrote:
> I have concerns about the actual feasibility of doing such a thing, some
> of which are similar to my concerns with doing non-mandatory PEP 480.
>
> * If uploading to a verifier service is optional then a significant
> portion of authors simply won't do it, and if you're installing 100 things, and
> 99 of them are verified and 1 of them are not, then there is an attack
> vector that I can use to compromise you undetected (since the author didn't
> upload their verification somewhere else).
>

Independently of the technical details of the "enhanced validation" support, I now agree pip would at least need to acquire a "fully validated downloads only" mode, where it refused to install anything that wasn't trusted at the higher integrity level.

However, it's worth keeping in mind that the truly security conscious people aren't going to ever trust any package that hasn't been through at least some level of preliminary security review (whether their own, or that of an organisation they trust).

> * It's not actually less work in general, it just pushes the work from the
> PyPI administrators to the community. This can work well if the community
> is willing to step up! However, PyPI's availability/speed problems were
> originally attempted to be solved by pushing the work to the community via
> the original mirror system and (not to downplay the people who did step up)
> the response was not particularly great and the mirrors got a few at first
> and as the shiny factor wore off people's mirrors shut down or stopped
> working or what have you.
>

From a practical perspective, I think the work being pushed to the developer community is roughly equivalent. The only question is whether the secret developers are being asked to manage is a separate authentication token for PyPI uploads (which is then used to mathematically secure uploads in such a way that PyPI itself can't change the release metadata for a project), or an account on a separate validation service, either run by the developer themselves, or by a trusted third party like the OpenStack Foundation, Fedora, Mozilla, etc.
My contention is that "to support enhanced validation of your project, you, and anyone else authorised to make releases for your project, will need an account on a metadata validation server, and whenever you make a new release, you will need to register it with the metadata validation server before publishing it on PyPI (otherwise folks using enhanced validation won't be able to install it)" is a relatively simple and easy to understand approach, while "to support enhanced validation of your project, you, and anyone else authorised to make releases for your project, will need to get a developer key signed by the PyPI administrators, and no, you can't just upload the key through the PyPI web UI, because if we let you do that, it wouldn't provide the desired security properties" is inherently confusing, and there's no explanation we can provide that will make it make sense to anyone that hasn't spent years thoroughly steeped in this stuff.

Yes, the mathematics technically lets us provide the desired security guarantees within the scope of a single service, but doing it that way is confusing to users in a way that I now believe can be avoided through clear structural separation of the "content distribution" functionality of the main PyPI service and the "content validation" functionality of separate metadata validation services.

Having the validation services run by someone *other than the PSF* can also be explained in a way that makes intuitive sense even when someone doesn't understand the technical details, since it's clear that folks with privileged access to a distribution service run by the PSF wouldn't necessarily have privileged access to a validation service run by the OpenStack Foundation (etc) and vice-versa.
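The federated model Nick describes amounts to a simple cross-check: before trusting a release, a client compares what PyPI served against what independently operated validation services have on record. The sketch below is hypothetical throughout - the validator names, the response data, and the agreement threshold are all invented for illustration, and real services would of course return signed responses over the network rather than values from a dict:

```python
import hashlib

# What each (imaginary) validation server claims to know about a release.
# sha256(b"test") is used as the known-good digest in this toy example.
KNOWN_GOOD = hashlib.sha256(b"test").hexdigest()
VALIDATORS = {
    "validation.openstack.example": KNOWN_GOOD,
    "validation.fedora.example": KNOWN_GOOD,
    "validation.mozilla.example": KNOWN_GOOD,
}

def hashes_agree(package_bytes, validators, minimum=2):
    """Trust the download only if at least `minimum` independent
    validators vouch for exactly the bytes PyPI served. A compromised
    PyPI can still serve a malicious file, but it cannot make servers
    it doesn't control agree with that file's hash."""
    digest = hashlib.sha256(package_bytes).hexdigest()
    votes = sum(1 for claimed in validators.values() if claimed == digest)
    return votes >= minimum

assert hashes_agree(b"test", VALIDATORS)        # matches every validator
assert not hashes_agree(b"tampered", VALIDATORS)  # no validator vouches
```

The security argument is the same one made in the thread: an attacker must now compromise the distribution service *and* enough independent organisations simultaneously, rather than a single point of trust.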
The "independent online services" approach also provides additional flexibility that isn't offered by a purely mathematical solution - an online metadata service potentially *can* regenerate its metadata in a new format if it can be persuaded it's necessary. Having that capability represents a potential denial of service vulnerability specifically against enhanced validation mode (if a validation server is compromised), but there are already much easier ways to execute those. > * A number of the attacks that TUF protects against do not rely on the > attacker creating malicious software packages, things like only showing known > insecure versions of a project so that they can then attack people through > a known exploit. It's not *wrong* to not protect against these (most > systems don't) but we'd want to explicitly decide that we're not going to. > External metadata validation servers can still protect against downgrade attacks - "release is unexpectedly missing" is a metadata discrepancy, just like "release contents are unexpectedly different". Validation servers can also provide additional functionality, like mapping from CVEs to affected versions, that can't be sensibly offered through the same service that is responsible for publishing the software in the first place. > I'd note that PEP 480 and your proposal aren't really mutually exclusive > so there's not really harm in *trying* yours and if it fails falling back > to something like PEP 480 other than end user confusion if that gets shut > down and the cost of actually developing/setting up that solution. > > Overall I'm +1 on things that enable better detection of a compromise but > I'm probably -0.5 or so on your specific proposal as I think that expecting > developers to upload verification data to "verification servers" is just > pushing work onto other people just so we don't have to do it. 
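The two discrepancy cases described above ("release contents are unexpectedly different" and "release is unexpectedly missing") can be sketched in a few lines. The function and data structures here are purely illustrative, not part of any proposed API:

```python
# Sketch: compare what the index serves against what an independent
# validation server has on record for a project.  Each mapping goes from
# version string to the sha256 hex digest of the release file.
def check_project(index_releases, validated_releases):
    """Return a list of (kind, version) discrepancies between the sources."""
    problems = []
    for version, digest in validated_releases.items():
        if version not in index_releases:
            # A registered release the index no longer serves: this is how
            # a downgrade attack (hiding newer, fixed releases) shows up.
            problems.append(("missing", version))
        elif index_releases[version] != digest:
            # Same version, different file contents: possible tampering.
            problems.append(("altered", version))
    return problems

# Hypothetical data: the index is silently hiding 1.1 and serving a
# tampered 1.0.
validated = {"1.0": "ab" * 32, "1.1": "cd" * 32}
index = {"1.0": "ee" * 32}
print(check_project(index, validated))
```

A client in "enhanced validation" mode would refuse to install anything from a project with a non-empty discrepancy list.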
> Getting them to manage additional keys, and get them signed and registered appropriately, and then supplying them is going to be a similar amount of work, and the purpose is far more cryptic and confusing. My proposal is basically that instead of asking developers to manage signing keys, we should instead be asking them to manage an account on a validation server (or servers). Building on top of service account management also means developers can potentially leverage existing *account* security tools, rather than needing to come up with custom key management solutions for PyPI publishing keys. Technically such a validation server could be run on the PSF infrastructure, but that opens up lots of opportunities for common attacks that provide privileged access to both PyPI and the validation servers - by moving the operation of the latter out to trusted third party organisations, we'd be keeping indirect attacks through the PSF infrastructure from compromising both systems at the same time. > I also think your two questions are not exactly right, because all that > means is that it becomes harder to attack *everyone* via a PyPI compromise, > however it's still trivial to attack specific people if you've compromised > PyPI or the CDN since you can selectively serve maliciously signed packages > depending on who is requesting them. To this end I don't think a solution > that pip doesn't implement is actually going to prevent anything but very > dumb attacks by an attacker who has already compromised the PyPI machines. > Yes, while I was thinking we may be able to get away without pip providing enhanced validation support directly, I now agree we'll need to provide it. 
However, the UX of that shouldn't depend on the technical details of how enhanced validation mode is actually implemented - from an end user perspective, the key things they need to know are:

* for many cases, the default level of validation offered by pip (given PEP 458) is likely to be good enough, and will provide access to all the packages on PyPI
* for some cases, the default level of validation *isn't* good enough, and for those, pip offers an "enhanced validation" mode
* if you turn on enhanced validation, there will be a lot of software that *won't* install
* if you want to use enhanced validation, but some projects you'd like to use don't support it, then you'll either need to offer those projects assistance with supporting enhanced validation mode, consider using different dependencies, or else reconsider whether or not you really need enhanced validation for your current use case

> I think another issue here is that we're effectively doing something > similar to TLS except instead of domain names we have project names and > that although there are *a lot* of people who really hate the CA system > nobody has yet come up with an effective means of actually replacing it > without regressing into worse security. The saving grace here is that we > operate at a much smaller scale (one "DNS" root, one trust root, ~53k > unique names vs. more than I feel like counting) so it's possible that > solutions which don't scale at TLS scale might scale at PyPI scale. > It's worth noting that my validation server idea is still very much a CA-style model - it's just that instead of registering developer keys with PyPI (as in PEP 480), we'd be registering the trust roots of metadata validation servers with pip. 
All my idea really does is take the key management problem for enhanced validation away from developers, replacing it with an account management problem that they're likely to already be thoroughly familiar with, even if they don't have any experience in secure software distribution. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Fri Jan 2 17:12:32 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 2 Jan 2015 11:12:32 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> Message-ID: > On Jan 2, 2015, at 10:51 AM, Nick Coghlan wrote: > > Getting them to manage additional keys, and get them signed and registered appropriately, and then supplying them is going to be a similar amount of work, and the purpose is far more cryptic and confusing. My proposal is basically that instead of asking developers to manage signing keys, we should instead be asking them to manage an account on a validation server (or servers). I need to think more about the rest of what you've said (and I don't think it's a short term problem), but I just wanted to point out that "managing keys" can be as simple as "create a secondary pass(word|phrase) and remember it/write it down/whatever". It doesn't need to be "secure this file and copy it around to all of your computers". Likewise there's no reason that "delegate authority to someone else" can't be something like ``twine add-maintainer pip pfmoore``. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... 
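Donald's point that "managing keys" can be reduced to remembering a passphrase can be illustrated with a stdlib-only sketch. A real scheme would derive an asymmetric signing key (e.g. Ed25519) from the passphrase rather than an HMAC secret, and the salt, function names, and metadata payload here are all hypothetical:

```python
import hashlib
import hmac

# Derive a signing secret deterministically from a memorable passphrase,
# so nothing needs to be copied between the developer's machines.
def derive_key(passphrase: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"), salt, 200_000)

# Sign some release metadata with the derived secret (HMAC here only to
# keep the sketch stdlib-only; the PEPs use public-key signatures).
def sign(key: bytes, metadata: bytes) -> str:
    return hmac.new(key, metadata, hashlib.sha256).hexdigest()

salt = b"pypi:example-project"  # hypothetical per-project salt
key = derive_key("correct horse battery staple", salt)
sig = sign(key, b'{"name": "example-project", "version": "1.0"}')

# Re-entering the same passphrase on another machine reproduces the same
# key, so the signature can be recreated without moving a key file around.
assert derive_key("correct horse battery staple", salt) == key
```

The point of the sketch is purely the workflow: the secret lives in the developer's head, and key derivation happens on demand at upload time.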
URL: From qwcode at gmail.com Fri Jan 2 17:07:44 2015 From: qwcode at gmail.com (Marcus Smith) Date: Fri, 2 Jan 2015 08:07:44 -0800 Subject: [Distutils] problem viewing pep 440 docs on mobile In-Reply-To: References: Message-ID: yes, I see a similar bug on android. If I try to scroll up a *little*, it goes all the way back to the top. On Fri, Jan 2, 2015 at 7:39 AM, Chris Jerdonek wrote: > Sorry if this isn't the best list on which to bring this up, but it came > up for me during the recent PEP 440 discussions. > > For a while I've noticed a serious problem when viewing PEP doc pages like > the following on my iPhone: > > https://www.python.org/dev/peps/pep-0440/ > > It's a bit maddening, so I wanted to see if anyone else on this list has > experienced it or can reproduce it. > > It happens for me with both Chrome and Safari on the latest iOS 8.1.2, > though I believe it happened for me on older iOS versions, too. > > After the page is fully loaded, if I scroll down, say, halfway through the > page, the browser will jump back to the very top. This effectively makes > it impossible to read the page. I haven't noticed this issue with any page > not hosted on python.org. > > Can anyone else reproduce? > > Thanks, > --Chris > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Fri Jan 2 17:14:51 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 3 Jan 2015 02:14:51 +1000 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: Message-ID: On 3 January 2015 at 01:31, Paul Moore wrote: > On 2 January 2015 at 14:25, Donald Stufft wrote: > > Either way though, I suggest focusing on PEP 458 (with an eye towards not > > making any decisions which will require changes on the client side to > implement > > PEP 480). 
> > +1 on all of this. > > I agree that PEP 458 is (relatively speaking) an obvious thing to do, > and if the people who have to do the work for it are happy, I think it > should just go ahead. > > I'd like to see the PEPs reworded a bit to be less intimidating to the > non-specialist. For PEPs about the trust model for PyPI, it's ironic > that I have to place a lot of trust in the PEP authors simply because > I don't understand half of what they are saying ;-) > FWIW, Niels Ferguson and Bruce Schneier's "Practical Cryptography" was probably the single most enlightening book I've read on the topic. The NIST standards in this area are also genuinely excellent (the occasional less than ideal technical recommendation from certain government agencies notwithstanding), and if you can afford the paywall (or work for an organisation that can do so), actually reading relevant sections of IEEE 802.11i was a key part of my own learning. (My specific interest was in authentication protocols for access control, hence why I was reading the Wi-Fi WPA2 spec, but a lot of the underlying cryptographic concepts are shared between the different kinds of digital verification) For broader context, Schneier's "Secrets and Lies" is good from a technical perspective, while the more recent "Liars and Outliers" looks to situate the security mindset in a broader social environment. There's a reason Schneier is as well respected as he is - if you're ever looking for general advice on how to be pragmatically paranoid, then he's a great source to turn to. That said, while I doubt we're going to be able to completely de-jargonise the PEP details, I agree it would be worthwhile to ensure there's a clear explanation of the practical consequences for folks that we'd otherwise lose in the cryptographic weeds. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From vladimir.v.diaz at gmail.com Fri Jan 2 17:14:49 2015 From: vladimir.v.diaz at gmail.com (Vladimir Diaz) Date: Fri, 2 Jan 2015 11:14:49 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: Message-ID: Thanks for the great feedback - Nick, Donald, Paul, and Richard (off-list). I am totally fine with focusing on PEP 458 and applying the final coat of paint on this document. There's a lot of background documentation and technical details excluded from the PEPs (to avoid turning the PEP into a 15+ page behemoth), but I do agree that we should explicitly cover some of these implementation details in PEP 458. Subsections on the exact format of metadata, explanation on how metadata is signed, and how the roles are "delegated" with the library, still remain. As Paul has indicated, terminology can also be improved so as to be more readable for "non-experts." Let me know how we should collaborate on PEP 458 going forward. Guido van Rossum made minor corrections to PEP 458, and requested we reflect his changes back to the version on Github. We can either move hg.python.org/pep/pep-0458.txt to github.com/pypa or github.com/theupdateframework/pep-on-pypi-with-tuf. Thanks, Vlad On Fri, Jan 2, 2015 at 10:31 AM, Paul Moore wrote: > On 2 January 2015 at 14:25, Donald Stufft wrote: > > I'm going through the PEPs again, and I think that evaluating these PEPs is > > more complicated by the fact that there are two of them, I agree that > > splitting > > up the two PEPs was the right thing to do though. What do you think about > > putting PEP 480 on the back burner for the moment and focusing on PEP 458. 
> > > > I think this will benefit the process in a few ways: > > > > * I believe that PEP 458 is far less controversial than PEP 480 since > it's > > effectively just exchanging TLS for TUF for verification and other than > > for the authors of the tools who need to work with it (pip, devpi, > > bandersnatch, PyPI, etc) it's transparent for the end users. > > * It allows us to discuss the particulars of getting TUF integrated in > PyPI > > without worrying about the far nastier problem of exposing a signing > UX to > > end authors. > > * While discussing it, we can still ensure that we leave ourselves enough > > flexibility so that we don't need to make any major changes for PEP > 480. > > * The problems that are going to crop up while implementing it (is the > > mirror protocol good enough? CDN Purging? Etc) are most likely to > happen during > > PEP 458. > > * I think having them both being discussed at the same time is causing a > lot > > of confusion between which proposal does what. > > * By focusing on one at a time we can get a more polished PEP that is > easier > > to understand (see below). > > > > Overall I think the PEPs themselves need a bit of work, they currently > rely > > a lot on reading the TUF white paper and the TUF spec in the repository > to get > > a grasp of what's going on. I believe they also read more like > suggestions > > about how we might go about implementing TUF rather than an actionable > plan > > for implementing TUF. I've seen feedback from more than one person that > they > > feel like they are having a hard time grok'ing what the *impact* of the > PEPs will > > be to them without having to go through and understand the intricacies > of TUF > > and how the PEP implements TUF. 
> > > > Technically I'm listed on this PEP as an author (although I don't > remember > > anymore what I did to get on there :D), but I care a good bit about > getting > > something better than TLS setup, so, if y'all are willing to defer PEP > 480 > > for right now and focus on PEP 458 and y'all want it, I'm willing to > actually > > earn that co-author spot and help get PEP 458 into shape and help get it > > accepted and implemented. > > > > Either way though, I suggest focusing on PEP 458 (with an eye towards not > > making any decisions which will require changes on the client side to > implement > > PEP 480). > > +1 on all of this. > > I agree that PEP 458 is (relatively speaking) an obvious thing to do, > and if the people who have to do the work for it are happy, I think it > should just go ahead. > > I'd like to see the PEPs reworded a bit to be less intimidating to the > non-specialist. For PEPs about the trust model for PyPI, it's ironic > that I have to place a lot of trust in the PEP authors simply because > I don't understand half of what they are saying ;-) > > Paul > -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Fri Jan 2 17:15:22 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 2 Jan 2015 11:15:22 -0500 Subject: [Distutils] problem viewing pep 440 docs on mobile In-Reply-To: References: Message-ID: > On Jan 2, 2015, at 11:07 AM, Marcus Smith wrote: > > yes, I see a similar bug on android. > If I try to scroll up a *little*, it goes all the way back to the top. > > On Fri, Jan 2, 2015 at 7:39 AM, Chris Jerdonek > wrote: > Sorry if this isn't the best list on which to bring this up, but it came up for me during the recent PEP 440 discussions. 
> > For a while I've noticed a serious problem when viewing PEP doc pages like the following on my iPhone: > > https://www.python.org/dev/peps/pep-0440/ > > It's a bit maddening, so I wanted to see if anyone else on this list has experienced it or can reproduce it. > > It happens for me with both Chrome and Safari on the latest iOS 8.1.2, though I believe it happened for me on older iOS versions, too. > > After the page is fully loaded, if I scroll down, say, halfway through the page, the browser will jump back to the very top. This effectively makes it impossible to read the page. I haven't noticed this issue with any page not hosted on python.org. > > Can anyone else reproduce? > Probably the best place to report this is at https://github.com/python/pythondotorg/issues --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Fri Jan 2 17:26:16 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 2 Jan 2015 11:26:16 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: Message-ID: > On Jan 2, 2015, at 11:14 AM, Vladimir Diaz wrote: > > Thanks for the great feedback - Nick, Donald, Paul, and Richard (off-list). > > I am totally fine with focusing on PEP 458 and applying the final coat of paint on this document. > > There's a lot of background documentation and technical details excluded from the PEPs (to avoid turning the PEP into a 15+ page behemoth), but I do agree that we should explicitly cover some of these implementation details in PEP 458. Subsections on the exact format of metadata, explanation on how metadata is signed, and how the roles are "delegated" with the library, still remain. As Paul has indicated, terminology can also be improved so as to be more readable for "non-experts." > > Let me know how we should collaborate on PEP 458 going forward. 
Guido van Rossum made minor corrections to PEP 458, and requested we reflect his changes back to the version on Github. We can either move hg.python.org/pep/pep-0458.txt to github.com/pypa or github.com/theupdateframework/pep-on-pypi-with-tuf. As far as I'm concerned I'm willing to collab however is best for y'all. It appears you're doing it on Github in the https://github.com/theupdateframework/pep-on-pypi-with-tuf repository so I'm happy to make PRs there. I'm also happy to make PRs elsewhere as well though I prefer somewhere on Github. I'll sit down with PEP 458 maybe this weekend and see if I can crank out some PRs to refine it. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Fri Jan 2 17:26:56 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 3 Jan 2015 02:26:56 +1000 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> Message-ID: On 3 January 2015 at 02:12, Donald Stufft wrote: > > On Jan 2, 2015, at 10:51 AM, Nick Coghlan wrote: > > Getting them to manage additional keys, and get them signed and registered > appropriately, and then supplying them is going to be a similar amount of > work, and the purpose is far more cryptic and confusing. My proposal is > basically that instead of asking developers to manage signing keys, we > should instead be asking them to manage an account on a validation server (or > servers). > > > I need to think more about the rest of what you've said (and I don't think > it's a short term problem), but I just wanted to point out that "managing > keys" can be as simple as "create a secondary pass(word|phrase) and > remember it/write it down/whatever". 
It doesn't need to be "secure this > file and copy it around to all of your computers". Likewise there's no > reason that "delegate authority to someone else" can't be something like > ``twine add-maintainer pip pfmoore``. > Yeah, I'm confident that the UI can be made relatively straightforward regardless of how we make the actual validation work. The part I haven't got the faintest clue how to do for the PEP 480 version is building viable "folk models" of what those commands are doing on the back end that will give people confidence that they understand what is going on just from using the tools, rather than leaving them wondering why they need a secondary password, etc. From a technical perspective, I don't think the validation server idea is superior to PEP 480. Where I think it's superior is that I'm far more confident in my ability to explain to a developer with zero security background how separate validation servers provide increased security, as the separation of authority would be structural in addition to mathematical. While the real security would still be coming from the maths, a folk model that believes it is coming from the structural separation between the publication server and the metadata validation servers will be good enough for most practical purposes, and unless someone is particularly interested in the mathematical details, they can largely be handwaved away with "the separation of responsibilities between the services is enforced mathematically". Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ncoghlan at gmail.com Fri Jan 2 17:30:04 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 3 Jan 2015 02:30:04 +1000 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: Message-ID: On 3 January 2015 at 02:26, Donald Stufft wrote: > > On Jan 2, 2015, at 11:14 AM, Vladimir Diaz > wrote: > > Thanks for the great feedback - Nick, Donald, Paul, and Richard (off-list). > > I am totally fine with focusing on PEP 458 and applying the final coat of > paint on this document. > > There's a lot of background documentation and technical details excluded > from the PEPs (to avoid turning the PEP into a 15+ page behemoth), but I do > agree that we should explicitly cover some of these implementation details > in PEP 458. Subsections on the exact format of metadata, explanation on > how metadata is signed, and how the roles are "delegated" with the library, > still remain. As Paul has indicated, terminology can also be improved so as > to be more readable for "non-experts." > > Let me know how we should collaborate on PEP 458 going forward. Guido van > Rossum made minor corrections to PEP 458, and requested we reflect his > changes back to the version on Github. We can either move > hg.python.org/pep/pep-0458.txt > to > github.com/pypa or github.com/theupdateframework/pep-on-pypi-with-tuf. > > > As far as I'm concerned I'm willing to collab however is best for y'all. > It appears you're doing it on Github in the > https://github.com/theupdateframework/pep-on-pypi-with-tuf repository so > I'm happy to make PRs there. I'm also happy to make PRs elsewhere as well > though I prefer somewhere on Github. I'll sit down with PEP 458 maybe this > weekend and see if I can crank out some PRs to refine it. > It probably makes sense to pull the TUF PEPs into the new pypa/interoperability-peps repo with the rest of them, and add Vladimir et al as developers on that repo (or just to the general PyPA developers group). Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From vladimir.v.diaz at gmail.com Fri Jan 2 17:47:56 2015 From: vladimir.v.diaz at gmail.com (Vladimir Diaz) Date: Fri, 2 Jan 2015 11:47:56 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: Message-ID: I prefer pulling the TUF PEPs (available on hg.python.org) into github.com/pypa. Please add Justin, Linda, Trishank, and myself as collaborators: https://github.com/vladimir-v-diaz https://github.com/dachshund https://github.com/JustinCappos https://github.com/lvigdor P.S. Donald helped tremendously with the snapshot process, Ed25519 library, ideas, and feedback. I think that earns a spot on the authors list. On Fri, Jan 2, 2015 at 11:30 AM, Nick Coghlan wrote: > On 3 January 2015 at 02:26, Donald Stufft wrote: > >> >> On Jan 2, 2015, at 11:14 AM, Vladimir Diaz >> wrote: >> >> Thanks for the great feedback - Nick, Donald, Paul, and Richard >> (off-list). >> >> I am totally fine with focusing on PEP 458 and applying the final coat of >> paint on this document. >> >> There's a lot of background documentation and technical details excluded >> from the PEPs (to avoid turning the PEP into a 15+ page behemoth), but I do >> agree that we should explicitly cover some of these implementation details >> in PEP 458. Subsections on the exact format of metadata, explanation on >> how metadata is signed, and how the roles are "delegated" with the library, >> still remain. As Paul has indicated, terminology can also be improved so as >> to be more readable for "non-experts." >> >> Let me know how we should collaborate on PEP 458 going forward. Guido >> van Rossum made minor corrections to PEP 458, and requested we reflect his >> changes back to the version on Github. 
We can either move >> hg.python.org/pep/pep-0458.txt >> to >> github.com/pypa or github.com/theupdateframework/pep-on-pypi-with-tuf. >> >> >> As far as I'm concerned I'm willing to collab however is best for y'all. >> It appears you're doing it on Github in the >> https://github.com/theupdateframework/pep-on-pypi-with-tuf repository so >> I'm happy to make PRs there. I'm also happy to make PRs elsewhere as well >> though I prefer somewhere on Github. I'll sit down with PEP 458 maybe this >> weekend and see if I can crank out some PRs to refine it. >> > > It probably makes sense to pull the TUF PEPs into the new > pypa/interoperability-peps repo with the rest of them, and add Vladimir et > al as developers on that repo (or just to the general PyPA developers > group). > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Fri Jan 2 18:09:23 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 3 Jan 2015 03:09:23 +1000 Subject: [Distutils] Bilingual namespace package conundrum In-Reply-To: <20150101141916.4e2f0bf0@anarchist> References: <20150101141916.4e2f0bf0@anarchist> Message-ID: On 2 January 2015 at 05:19, Barry Warsaw wrote: > - Should some other boilerplate be added to the namespace-package > __init__.py > files in each portion? Maybe the sys.hexversion guards should be > removed so > that it acts the same way in both Python 2 and Python 3. > This was the intended solution when PEP 420 was written - we deliberately didn't break old-style namespace packages, we just made them redundant for code that only needs to run on 3.3+. 
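The old-style "explicit" namespace package approach referred to here relies on a one-line pkgutil-based __init__.py in every portion of the namespace. The following self-contained demo (directory and module names are illustrative) builds two on-disk portions and shows them merging into a single package:

```python
import os
import sys
import tempfile

# Build two on-disk "portions" of a namespace package 'ns', each shipping
# the classic pkgutil-style __init__.py boilerplate.
root = tempfile.mkdtemp()
for portion, modname in [("a", "mod1"), ("b", "mod2")]:
    pkg = os.path.join(root, portion, "ns")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        # The pre-PEP 420 boilerplate: extend __path__ with every 'ns'
        # directory found on sys.path.
        f.write("__path__ = __import__('pkgutil').extend_path(__path__, __name__)\n")
    with open(os.path.join(pkg, modname + ".py"), "w") as f:
        f.write("value = %r\n" % modname)
    sys.path.insert(0, os.path.join(root, portion))

# Both portions are importable under the single 'ns' package name, even
# though each lives in a different sys.path entry.
import ns.mod1
import ns.mod2
```

Under Python 3.3+ the same merge happens implicitly (PEP 420) if the `__init__.py` files are simply omitted, which is exactly the redundancy Nick describes.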
This is much easier to learn, since it means creating packages as subdirectories "just works", and automatically collecting all search path subdirectories with a given name into a common namespace by default better matches the typical behaviour of other search path based explicit import systems. However, implicit namespace packages aren't inherently *better* than the old explicit ones at runtime, as the end result is the same in either case: a module.__path__ entry that contains multiple directories. The only difference is in whether you get there by leaving out __init__.py and letting the 3.3+ import machinery handle it, or by doing it explicitly yourself. That means there's a bug to be fixed in the lazr packages - they provide an __init__.py file, thus turning off the implicit namespace package support, but then they use a version check to also turn off the explicit namespace package support. If you turn off both kinds of namespace package support, you're not going to have a namespace package at the end of it :) Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Fri Jan 2 18:23:38 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 2 Jan 2015 17:23:38 +0000 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: Message-ID: On 2 January 2015 at 16:14, Vladimir Diaz wrote: > There's a lot of background documentation and technical details excluded > from the PEPs (to avoid turning the PEP into a 15+ page behemoth), but I do > agree that we should explicitly cover some of these implementation details > in PEP 458. Subsections on the exact format of metadata, explanation on how > metadata is signed, and how the roles are "delegated" with the library, > still remain. As Paul has indicated, terminology can also be improved so as > to be more readable for "non-experts." 
There's a tension here, in that while the detail is useful, the non-expert content needs to be *fewer* words, not more - the problem is one of an intimidating wall of text, much more than that the concepts are difficult. Adding more content, however necessary it is to the implementation side, probably detracts here. Just as an example, take the first paragraph of the abstract: """ This PEP proposes how the Python Package Index (PyPI [1] ) should be integrated with The Update Framework [2] (TUF). TUF was designed to be a flexible security add-on to a software updater or package manager. The framework integrates best security practices such as separating role responsibilities, adopting the many-man rule for signing packages, keeping signing keys offline, and revocation of expired or compromised signing keys. For example, attackers would have to steal multiple signing keys stored independently to compromise a role responsible for specifying a repository's available files. Another role responsible for indicating the latest snapshot of the repository may have to be similarly compromised, and independent of the first compromised role. """ There's no way I, even with the benefit of the discussions both on the list here and off-list, can skim that sentence and get anything more meaningful than "blah, blah, blah" from it. Taking a moment to think about it is fine, but at that point you've lost me (in terms of encouraging me to read the document, rather than just debate based on what I think it's going to say). None of the "best security practices" mentioned in sentence 3 mean anything to me - they all basically look like "do more security stuff" to me, and "doing security stuff" unfortunately, from my experience, translates to "jump through annoying hoops that stop me getting my job done"... Similarly, the last 2 sentences (the example) read as saying "attackers have to work harder". 
Well good, I'd sort of hope the PEP provided a benefit of some sort :-) The following (somewhat off the cuff) may work better: """ This PEP proposes some changes to the Python Package Index (PyPI) infrastructure to enhance its resistance to attack without requiring any changes in workflow from project authors. It is similar to recent changes to use TLS for all PyPI traffic, but offers additional security over TLS. Future PEPs may look at further security enhancements that could require additional action from project authors, but this PEP is restricted to the PyPI infrastructure and mirroring software. """ Sorry for what is pretty much a hatchet job on a single paragraph taken out of context from the PEP, but hopefully you can see what I'm trying to say - skip details of security concepts, attack vectors, etc, and concentrate on "we're making things better - it's as low impact as the change to TLS - project authors won't need to do anything (at this stage)". Those are the key messages to get across straight away. A few more paragraphs in the abstract explaining in non-technical terms what's being done and why, and you're done. You can say "the rest of the document is the technical details for people with an understanding of (or interest in) the security concepts" and get on with the specifics. If you can get the non-specialists to read just the abstract section, and go away knowing they are happy to let the experts get on with the process, you've won. Put the details and the scary terms in the "for experts and interested parties only" main body. Paul. 
From vladimir.v.diaz at gmail.com Fri Jan 2 19:00:17 2015 From: vladimir.v.diaz at gmail.com (Vladimir Diaz) Date: Fri, 2 Jan 2015 13:00:17 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: Message-ID: Paul, I understand your point, and made an attempt at what you're suggesting with my initial post to this mailing list (addressing a different target audience): "Surviving a Compromise of PyPI" proposes how the Python Package Index (PyPI) can be amended to better protect end users from altered or malicious packages, and to minimize the extent of PyPI compromises against affected users. The proposed integration allows package managers such as pip to be more secure against various types of security attacks on PyPI and defend end users from attackers responding to package requests. Specifically, these PEPs describe how PyPI processes should be adapted to generate and incorporate repository metadata, which are signed text files that describe the packages and metadata available on PyPI. Package managers request (along with the packages) the metadata on PyPI to verify the authenticity of packages before they are installed. The changes to PyPI and tools will be minimal by leveraging a library, The Update Framework , that generates and transparently validates the relevant metadata. The first stage of the proposal (PEP 458 ) uses a basic security model that supports verification of PyPI packages signed with cryptographic keys stored on PyPI, requires no action from developers and end users, and protects against malicious CDNs and public mirrors. To support continuous delivery of uploaded packages, PyPI administrators sign for uploaded packages with an online key stored on PyPI infrastructure. This level of security prevents packages from being accidentally or deliberately tampered with by a mirror or a CDN because the mirror or CDN will not have any of the keys required to sign for projects." 
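[The signed-metadata flow summarized above — a repository publishes signed metadata describing its files, and the package manager verifies both the signature and the package digest before installing — can be sketched roughly as below. This is an illustrative stand-in, not TUF's actual metadata format: real TUF uses public-key signatures (e.g. Ed25519) and multiple delegated roles, whereas this sketch uses HMAC with a shared key so it runs with only the standard library.]

```python
import hashlib
import hmac
import json

# Hypothetical repository signing key. In TUF this would be a private
# key whose public half is shipped with the package manager; HMAC with
# a shared secret is a simplified stand-in for illustration only.
REPO_KEY = b"hypothetical-repository-signing-key"

def sign_metadata(packages):
    """Repository side: describe the available files and sign the description."""
    signed = {
        "packages": {name: hashlib.sha256(data).hexdigest()
                     for name, data in packages.items()},
    }
    canonical = json.dumps(signed, sort_keys=True).encode()
    signature = hmac.new(REPO_KEY, canonical, hashlib.sha256).hexdigest()
    return {"signed": signed, "signature": signature}

def verify_package(metadata, name, data):
    """Client side: check the metadata signature, then the package digest."""
    canonical = json.dumps(metadata["signed"], sort_keys=True).encode()
    expected = hmac.new(REPO_KEY, canonical, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, metadata["signature"]):
        raise ValueError("metadata signature is invalid")
    if metadata["signed"]["packages"].get(name) != hashlib.sha256(data).hexdigest():
        raise ValueError("package does not match signed metadata")
    return True
```

[A mirror or CDN that serves a tampered package fails the digest check, and one that rewrites the metadata itself fails the signature check — which is the property the basic PEP 458 model provides.]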
Perhaps parts of this introduction (and your suggestion) will work better as an abstract, instead of assuming that the PyPI admins will be the only ones reading PEP 458. Thanks again for all the feedback. On Fri, Jan 2, 2015 at 12:23 PM, Paul Moore wrote: > On 2 January 2015 at 16:14, Vladimir Diaz > wrote: > > There's a lot of background documentation and technical details excluded > > from the PEPs (to avoid turning the PEP into a 15+ page behemoth), but I > do > > agree that we should explicitly cover some of these implementation > details > > in PEP 458. Subsections on the exact format of metadata, explanation on > how > > metadata is signed, and how the roles are "delegated" with the library, > > still remain. As Paul has indicated, terminology can also be improved so > as > > to be more readable for "non-experts." > > There's a tension here, in that while the detail is useful, the > non-expert content needs to be *less* words, not more - the problem is > one of an intimidating wall of text, much more than that the concepts > are difficult. Adding more content, however necessary it is to the > implementation side, probably detracts here. > > Just as an example, take the first paragraph of the abstract: > > """ > This PEP proposes how the Python Package Index (PyPI [1] ) should be > integrated with The Update Framework [2] (TUF). TUF was designed to be > a flexible security add-on to a software updater or package manager. > The framework integrates best security practices such as separating > role responsibilities, adopting the many-man rule for signing > packages, keeping signing keys offline, and revocation of expired or > compromised signing keys. For example, attackers would have to steal > multiple signing keys stored independently to compromise a role > responsible for specifying a repository's available files. 
Another > role responsible for indicating the latest snapshot of the repository > may have to be similarly compromised, and independent of the first > compromised role. > """ > > There's no way I, even with the benefit of the discussions both on the > list here and off-list, can skim that sentence and get anything more > meaningful than "blah, blah, blah" from it. Taking a moment to think > about it is fine, but at that point you've lost me (in terms of > encouraging me to read the document, rather than just debate based on > what I think it's going to say). > > None of the "best security practices" mentioned in sentence 3 mean > anything to me - they all basically look like "do more security stuff" > to me, and "doing security stuff" unfortunately, from my experience, > translates to "jump through annoying hoops that stop me getting my job > done"... Similarly, the last 2 sentences (the example) read as saying > "attackers have to work harder". Well good, I'd sort of hope the PEP > provided a benefit of some sort :-) > > The following (somewhat off the cuff) may work better: > > """ > This PEP proposes some changes to the Python Package Index (PyPI) > infrastructure to enhance its resistance to attack without requiring > any changes in workflow from project authors. It is similar to recent > changes to use TLS for all PyPI traffic, but offers additional > security over TLS. Future PEPs may look at further security > enhancements that could require additional action from project > authors, but this PEP is restricted to the PyPI infrastructure and > mirroring software. > """ > > Sorry for what is pretty much a hatchet job on a single paragraph taken > out of context from the PEP, but hopefully you can see what I'm trying > to say - skip details of security concepts, attack vectors, etc, and > concentrate on "we're making things better - it's as low impact as the > change to TLS - project authors won't need to do anything (at this > stage)". 
Those are the key messages to get across straight away. A few > more paragraphs in the abstract explaining in non-technical terms > what's being done and why, and you're done. You can say "the rest of > the document is the technical details for people with an understanding > of (or interest in) the security concepts" and get on with the > specifics. > > If you can get the non-specialists to read just the abstract section, > and go away knowing they are happy to let the experts get on with the > process, you've won. Put the details and the scary terms in the "for > experts and interested parties only" main body. > > Paul. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vladimir.v.diaz at gmail.com Fri Jan 2 19:24:30 2015 From: vladimir.v.diaz at gmail.com (Vladimir Diaz) Date: Fri, 2 Jan 2015 13:24:30 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: <874D1500-0AFE-44FB-83C2-502A79091A7D@stufft.io> <484EB80E-E812-4B66-812E-285CF6664455@stufft.io> <353FFEBC-55D5-4A3C-87B3-FE6C5B2F55AE@stufft.io> Message-ID: Nick, Renaming the PEPs is not a problem. Perhaps "PEP 458: Securing the Link from PyPI to the End User" is another option. I am going to read the Rick Walsh paper you've linked and give some careful thought to your proposal. I'll get back to you. I had one person (off-list and recommending how to better explain 480 to non-specialists) say, "the property PEP 480 gives is that developers who sign their project protect their users even if PyPI is compromised. This is because end users are told to trust the developer keys over the keys that are kept on the PyPI server. (PyPI administrators still have a way of using keys that are kept in secure, offline storage to recover if a developer's keys are lost or stolen.)" Yes, you gotta love those "aha" moments - you're in the shower and go to grab the shampoo bottle when it hits you, "aha! That's the solution... 
Thank you, shampoo bottle of 'Head & Shoulders'" On Fri, Jan 2, 2015 at 11:26 AM, Nick Coghlan wrote: > On 3 January 2015 at 02:12, Donald Stufft wrote: > >> >> On Jan 2, 2015, at 10:51 AM, Nick Coghlan wrote: >> >> Getting them to manage additional keys, and get them signed and >> registered appropriately, and then supplying them is going to be a similar >> amount of work, and the purpose is far more cryptic and confusing. My >> proposal is basically that instead of asking developers to manage signing >> keys, we should instead be asking them to manage an account on a validation >> server (or servers). >> >> >> I need to think more about the rest of what you've said (and I don't >> think it's a short term problem), but I just wanted to point out that >> "managing keys" can be as simple as "create a secondary pass(word|phrase) >> and remember it/write it down/whatever". It doesn't need to be "secure this >> file and copy it around to all of your computers". Likewise there's no >> reason that "delegate authority to someone else" can't be something like >> ``twine add-maintainer pip pfmoore``. >> > > Yeah, I'm confident that the UI can be made relatively straightforward > regardless of how we make the actual validation work. The part I haven't > got the faintest clue how to do for the PEP 480 version is building viable > "folk models" of what those commands are doing on the back end that will > give people confidence that they understand what is going on just from > using the tools, rather than leaving them wondering why they need a > secondary password, etc. > > From a technical perspective, I don't think the validation server idea is > superior to PEP 480. Where I think it's superior is that I'm far more > confident in my ability to explain to a developer with zero security > background how separate validation servers provide increased security, as > the separation of authority would be structural in addition to > mathematical. 
While the real security would still be coming from the maths, > a folk model that believes it is coming from the structural separation > between the publication server and the metadata validation servers will be > good enough for most practical purposes, and unless someone is particularly > interested in the mathematical details, they can largely be handwaved away > with "the separation of responsibilities between the services is enforced > mathematically". > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Fri Jan 2 19:35:21 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 2 Jan 2015 13:35:21 -0500 Subject: [Distutils] Surviving a Compromise of PyPI - PEP 458 and 480 In-Reply-To: References: Message-ID: <16BB965D-14B8-481D-B346-CC2E31141827@stufft.io> > On Jan 2, 2015, at 11:47 AM, Vladimir Diaz wrote: > > I prefer pulling the TUF PEPs (available on hg.python.org ) into github.com/pypa . > > Please add Justin, Linda, Trishank, and myself as collaborators: > > https://github.com/vladimir-v-diaz > https://github.com/dachshund > https://github.com/JustinCappos > https://github.com/lvigdor > > P.S. Donald helped tremendously with the snapshot process, Ed25519 library, ideas, and feedback. I think that earns a spot on the authors list. > I've gone ahead and imported PEP 458 and the three image files it uses into https://github.com/pypa/interoperability-peps and added the above names as contributors. I probably won't get to it today but I'll try to get to it tomorrow to go through PEP 458 and PR/issue what I can find. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ncoghlan at gmail.com Sat Jan 3 03:49:58 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 3 Jan 2015 12:49:58 +1000 Subject: [Distutils] Last call for feedback: PEP 440 exclusive ordered comparison fix In-Reply-To: References: Message-ID: On 31 December 2014 at 13:52, Nick Coghlan wrote: > Donald is keen to get the updated versions of packaging/pip/setuptools out > that fix the regression in handling exclusive ordered comparison, so this > is a last call for feedback on those changes before we publish the updated > version of the PEP to python.org (and then update the tools accordingly). > > Given the proximity to New Year's celebrations, I currently plan to > publish the updated version some time on Saturday 3rd January (UTC+10). > The PEP 440 updates based on user feedback on the recent setuptools and pip releases have now been pushed to python.org: https://hg.python.org/peps/rev/289dbffc16ed There are a couple of other minor clarifications still to be made, and some existing clarifications that need to be noted in the changelog, but all currently anticipated semantic updates are included now. Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Sat Jan 3 03:51:14 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 2 Jan 2015 21:51:14 -0500 Subject: [Distutils] Last call for feedback: PEP 440 exclusive ordered comparison fix In-Reply-To: References: Message-ID: <54D4ADD3-9080-40A0-8EEF-BF8CE83B533B@stufft.io> > On Jan 2, 2015, at 9:49 PM, Nick Coghlan wrote: > > On 31 December 2014 at 13:52, Nick Coghlan > wrote: > Donald is keen to get the updated versions of packaging/pip/setuptools out that fix the regression in handling exclusive ordered comparison, so this is a last call for feedback on those changes before we publish the updated version of the PEP to python.org (and then update the tools accordingly). 
> > Given the proximity to New Year's celebrations, I currently plan to publish the updated version some time on Saturday 3rd January (UTC+10). > > The PEP 440 updates based on user feedback on the recent setuptools and pip releases have now been pushed to python.org : https://hg.python.org/peps/rev/289dbffc16ed > > There are a couple of other minor clarifications still to be made, and some existing clarifications that need to be noted in the changelog, but all currently anticipated semantic updates are included now. > This has been released in the packaging lib version 15.0. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Sat Jan 3 13:56:27 2015 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sat, 3 Jan 2015 13:56:27 +0100 Subject: [Distutils] PyPI search engine slow Message-ID: <20150103135627.11b75893@fsol> Hello, It seems that PyPI's search engine has become quite slow. For example the following URL takes ~8s. to load: https://pypi.python.org/pypi?%3Aaction=search&term=signature%20inspect&submit=search Regards Antoine. From steve at holdenweb.com Fri Jan 2 14:04:18 2015 From: steve at holdenweb.com (Steve Holden) Date: Fri, 2 Jan 2015 13:04:18 +0000 Subject: [Distutils] Immutable Files on PyPI Message-ID: > Do you seriously want to force package authors to cut a new release > just because a single uploaded distribution file is broken for > some reason and then ask all users who have already installed one > of the non-broken ones to upgrade again, even though they are not Yes. Seems to me the issues have been well discussed by knowledgeable authorities and it's time to go ahead with implementation. regards Steve -- Steve Holden steve at holdenweb.com +1 571 484 6266 @holdenweb -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From richard at python.org Sun Jan 4 23:11:10 2015 From: richard at python.org (Richard Jones) Date: Sun, 04 Jan 2015 22:11:10 +0000 Subject: [Distutils] PyPI search engine slow References: <20150103135627.11b75893@fsol> Message-ID: Thanks for pointing that out. There's not a lot we can do immediately, as any significant improvement to search will require architecture changes. Or, do what I do: use Google :) [in the Bad Old Days, we used google site search, but people complained that the results weren't pretty enough] On Sat Jan 03 2015 at 11:57:00 PM Antoine Pitrou wrote: > > Hello, > > It seems that PyPI's search engine has become quite slow. For example > the following URL takes ~8s. to load: > > https://pypi.python.org/pypi?%3Aaction=search&term= > signature%20inspect&submit=search > > Regards > > Antoine. > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sontek at gmail.com Sun Jan 4 23:15:23 2015 From: sontek at gmail.com (John Anderson) Date: Sun, 4 Jan 2015 14:15:23 -0800 Subject: [Distutils] PyPI search engine slow In-Reply-To: References: <20150103135627.11b75893@fsol> Message-ID: https://warehouse.python.org/search/project/?q=signature+inspect Returns almost instantly even though it does have different results. On Sun, Jan 4, 2015 at 2:11 PM, Richard Jones wrote: > Thanks for pointing that out. There's not a lot we can do immediately, as > any significant improvement to search will require architecture changes. > > Or, do what I do: use Google :) > > [in the Bad Old Days, we used google site search, but people complained > that the results weren't pretty enough] > > On Sat Jan 03 2015 at 11:57:00 PM Antoine Pitrou > wrote: > >> >> Hello, >> >> It seems that PyPI's search engine has become quite slow. For example >> the following URL takes ~8s. 
to load: >> >> https://pypi.python.org/pypi?%3Aaction=search&term= >> signature%20inspect&submit=search >> >> Regards >> >> Antoine. >> >> >> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> https://mail.python.org/mailman/listinfo/distutils-sig >> > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Sun Jan 4 23:24:36 2015 From: donald at stufft.io (Donald Stufft) Date: Sun, 4 Jan 2015 17:24:36 -0500 Subject: [Distutils] PyPI search engine slow In-Reply-To: References: <20150103135627.11b75893@fsol> Message-ID: <03E652E5-DF4F-4376-A88F-8D2CC09929F5@stufft.io> > On Jan 4, 2015, at 5:15 PM, John Anderson wrote: > > https://warehouse.python.org/search/project/?q=signature+inspect > > Returns almost instantly even though it does have different results. One of the many improvements that Warehouse has is that it uses Elasticsearch as a search engine instead of a pretty gnarly and slow SQL query. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: From VenkatRamReddy.k at hcl.com Tue Jan 6 07:31:02 2015 From: VenkatRamReddy.k at hcl.com (Venkat Ram Reddy K) Date: Tue, 6 Jan 2015 06:31:02 +0000 Subject: [Distutils] py2exe package for 3.4.2 version Message-ID: <36E5D16AE437E04AA964C5801754D69C043AA9DC@chn-hclt-mbs06.HCLT.CORP.HCL.IN> Hi Good Afternoon, This is Venkat from HCL Technologies. Actually I am trying to download "py2exe" package for Python 3.4.2 version. But I am not able to download "py2exe". Could you please send me "py2exe" package to me. And also could you please tell me how to extract the .whl files. waiting for your reply. Thanks In Advance&Regards, Venkat. 
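[On the .whl question just above: a wheel is an ordinary zip archive (PEP 427), so besides installing it with pip, its contents can be listed or unpacked with Python's standard zipfile module. A minimal sketch — the function names are illustrative, not part of pip or py2exe:]

```python
import zipfile

# A .whl file is a zip archive (PEP 427), so zipfile can read it
# directly. The normal route is still "pip install <name>", which also
# processes the wheel's metadata; this is only for raw inspection.

def list_wheel_contents(wheel_path):
    """Return the file names stored inside a wheel."""
    with zipfile.ZipFile(wheel_path) as whl:
        return whl.namelist()

def extract_wheel(wheel_path, dest_dir):
    """Unpack a wheel's raw contents (no install-time processing)."""
    with zipfile.ZipFile(wheel_path) as whl:
        whl.extractall(dest_dir)
```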
-------------- next part -------------- An HTML attachment was scrubbed... 
But I am not able to download > ?py2exe?. Could you please send me ?py2exe? package to me. > > > > And also could you please tell me how to extract the .whl files. pip install py2exe Paul From barry at python.org Tue Jan 6 21:39:37 2015 From: barry at python.org (Barry Warsaw) Date: Tue, 6 Jan 2015 15:39:37 -0500 Subject: [Distutils] Bilingual namespace package conundrum In-Reply-To: References: <20150101141916.4e2f0bf0@anarchist> Message-ID: <20150106153937.2de69d7b@anarchist.wooz.org> On Jan 03, 2015, at 03:09 AM, Nick Coghlan wrote: >This was the intended solution when PEP 420 was written - we deliberately >didn't break old-style namespace packages, we just made them redundant for >code that only needs to run on 3.3+. This is much easier to learn, since it >means creating packages as subdirectories "just works", and automatically >collecting all search path subdirectories with a given name into a common >namespace by default better matches the typical behaviour of other search >path based explicit import systems. > >However, implicit namespace packages aren't inherently *better* than the >old explicit ones at runtime, as the end result is the same in either case: >a module.__path__ entry that contains multiple directories. The only >difference is in whether you get there by leaving out __init__.py and >letting the 3.3+ import machinery handle it, or by doing it explicitly >yourself. > >That means there's a bug to be fixed in the lazr packages - they provide an >__init__.py file, thus turning off the implicit namespace package support, >but then they use a version check to also turn off the explicit namespace >package support. 
If you turn off both kinds of namespace package support, >you're not going to have a namespace package at the end of it :) While I agree that the lazr package's __init__.py files should not have the version checks (and I've released fixes for the few that I care about immediately), I still think it could make sense for better support in the Python installation tools to support PEP 420 namespace packages. FWIW, the Debian packaging tools explicitly remove namespace package __init__.py files from portions when they are built for Python 3 (let's assume we don't care about any Python 3 that doesn't support PEP 420). We've not had any problems with this that I know of. I suppose with a working old-style namespace __init__.py file, there's little practical effect, but for consistency it makes sense. One reason would be introspection tools, which would see a different set of attributes for `dir(lazr)` in the two cases, one where lazr is a concrete package and the other where it's a virtual/namespace package. Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From techtonik at gmail.com Wed Jan 7 15:38:03 2015 From: techtonik at gmail.com (anatoly techtonik) Date: Wed, 7 Jan 2015 17:38:03 +0300 Subject: [Distutils] Problems with download redirection and secure certificate Message-ID: Hi, I can't download package from PyPI due to redirection to HTTPS and failed certificate check. # wget http://pypi.python.org/packages/source/h/hexdump/hexdump-3.1.zip --2015-01-07 06:04:22-- http://pypi.python.org/packages/source/h/hexdump/hexdump-3.1.zip Resolving pypi.python.org... 199.27.79.223 Connecting to pypi.python.org|199.27.79.223|:80... connected. HTTP request sent, awaiting response... 
301 Moved Permanently Location: https://pypi.python.org/packages/source/h/hexdump/hexdump-3.1.zip [following] --2015-01-07 06:04:22-- https://pypi.python.org/packages/source/h/hexdump/hexdump-3.1.zip Connecting to pypi.python.org|199.27.79.223|:443... connected. ERROR: certificate common name `www.python.org' doesn't match requested host name `pypi.python.org'. To connect to pypi.python.org insecurely, use `--no-check-certificate'. This is from Debian Squeeze. -- anatoly t. From donald at stufft.io Wed Jan 7 16:38:36 2015 From: donald at stufft.io (Donald Stufft) Date: Wed, 7 Jan 2015 10:38:36 -0500 Subject: [Distutils] Problems with download redirection and secure certificate In-Reply-To: References: Message-ID: <6BDB3431-084B-4344-9FB2-6E723BDEC1C5@stufft.io> > On Jan 7, 2015, at 9:38 AM, anatoly techtonik wrote: > > Hi, > > I can't download package from PyPI due to redirection to HTTPS and > failed certificate check. > > # wget http://pypi.python.org/packages/source/h/hexdump/hexdump-3.1.zip > --2015-01-07 06:04:22-- > http://pypi.python.org/packages/source/h/hexdump/hexdump-3.1.zip > Resolving pypi.python.org... 199.27.79.223 > Connecting to pypi.python.org|199.27.79.223|:80... connected. > HTTP request sent, awaiting response... 301 Moved Permanently > Location: https://pypi.python.org/packages/source/h/hexdump/hexdump-3.1.zip > [following] > --2015-01-07 06:04:22-- > https://pypi.python.org/packages/source/h/hexdump/hexdump-3.1.zip > Connecting to pypi.python.org|199.27.79.223|:443... connected. > ERROR: certificate common name `www.python.org' doesn't match > requested host name `pypi.python.org'. > To connect to pypi.python.org insecurely, use `--no-check-certificate'. > > > This is from Debian Squeeze. Your wget is too old and doesn't support SAN names. 
https://docs.fastly.com/guides/ssl/why-do-i-see-ssl-certificate-errors-when-using-wget --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From chris.barker at noaa.gov Thu Jan 8 01:20:41 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 7 Jan 2015 16:20:41 -0800 Subject: [Distutils] Role of setuptools and eggs in "modern" distributing... In-Reply-To: References: Message-ID: OK, I started this thread a while back, as I was getting confused and having issues with intermixing python, setuptools, pip, and Anaconda / conda. Now I've figured out where I have my issue: I'm using an Anaconda distribution at the moment. I want conda to handle installing my dependencies, etc. for me. OK. However, I am also developing various python packages -- this means I can't / don't want to build and install a conda package for them; rather, I'd like to use setuptools "develop" mode. So here's the rub: When I call "setup.py develop", setuptools apparently looks for the "install_requires" packages. If it doesn't find them, it goes out and decides to apparently pip install them: gets the source from pypi, downloads it, tries to compile, etc.... Even if it does find them already installed, it does some annoying adding to easy_install.pth magic for them. This is all why I've been thinking that dependencies do not belong in setup.py -- but rather outside of setup.py (requirements.txt), and more to the point, dependency handling is a pip (or conda, or...) issue - not one that should be handled by setuptools at build time. Note that conda build usually simply calls setup.py install as well, so this can be a problem even there (though I think it usually satisfies the requirements first, so not so bad) At the end of the day, however, I think the issue is not so much where you specify dependencies, but what setuptools develop mode is doing: it should NOT go and auto-install dependencies, particularly not install-dependencies (maybe build dependencies are required...) 
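[The workflow being described — conda owns the dependencies, setuptools/pip only puts the package under development on the path — can be sketched as the following command sequence; package names are illustrative, and both tools' --no-deps options skip the install_requires resolution:]

```shell
# Let conda satisfy the dependencies...
conda install numpy requests

# ...then install only the package under development, in develop mode,
# without letting setuptools resolve install_requires itself:
python setup.py develop --no-deps

# or, the pip equivalent of develop (editable) mode:
pip install --no-deps -e .
```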
OK -- I just found the --no-deps option. So I can do what I want, but still, I don't think it belongs there at all, and it would certainly be better to have the default be no-deps. Let pip (or conda, or...) handle that. Anyone running these by hand is by definition doing things by hand; let them deal with the deps. OK, I suppose "casual" users may run setup.py install, so that mode _might_ want to auto install dependencies, if something has to. But develop mode is very much for developers, you really don't want this handled there. -Chris On Wed, Dec 31, 2014 at 9:41 AM, Chris Barker wrote: > On Wed, Dec 31, 2014 at 9:10 AM, Nick Coghlan wrote: > >> The problem always existed - it's the longstanding conflict between >> "platform independent, language specific" tooling and "platform specific, >> language independent" tooling. >> >> The former is often preferred on the developer side (since the tools are >> oriented towards building individual custom applications rather than >> maintaining a suite of applications written by different groups), while the >> latter is essential on the system integrator side (since you're dealing >> with inevitable technology sprawl in infrastructure that evolves over the >> course of years and decades). >> >> One of the best things coming out of the whole "DevOps" notion is the >> recognition that language independence and platform independence are aimed >> at solving *different problems*, and that what we need is suitable tools >> with different roles and responsibilities that interoperate effectively, >> rather than attempting to provide a single universal tool and having >> different groups of users yelling at each other for wanting to solve the >> "wrong" problem. >> >> Tools like conda and zc.buildout fit into that landscape by aiming to >> provide a platform & language independent approach to doing *application* >> level integration, which tends to have the effect of making them new >> platforms in their own right. 
>> > Indeed -- thanks for providing a clear way to talk/think about these > systems. > > I guess the trick here is that we want the different level tools to work > well with each-other. > > As conda started with python packages in mind, it does a pretty good job > with them. But I still find some conflicts between setuptools and conda -- > in particular, if you specify dependencies in setup.py (install_requires), > it can kind of make a mess of things. conda tries to ignore them, but > somehow I've had issues, even though I had specified it all in the conda's > meta.yaml. This is probably a conda bug/issue, but I'm still trying to > figure out how to best set up a python package so that it can be built > installed with the "regular" python tools, and also conda... > > Use case -- developing in a conda environment -- so I want to install > dependencies with conda, but the package under development with setuptools > "develop" mode. (conda does not (yet) have a develop mode that works...) > > Oh, and there does seem to be an odd (I think non-fatal) issue with > setuptools and conda: > > I have package A, with a bunch of stuff listed with "install_requires" > > I have all these dependencies installed with conda. > > When I run setup.py develop, setuptools goes and dumps all the > dependencies in easy_install.pth. > > I have no idea why that is, and it's probably only annoying, rather than a > problem, but I'm not sure what will happen when I upgrade one of those > dependencies with conda. > >> If you compare them to Linux distributions, then zc.buildout is a >> platform that practices the Gentoo style approach of building everything >> from source for each application, while conda uses the more common >> Debian/Fedora style of defining a binary packaging format that allows a >> redistributor to handle the build process on behalf of their users. 
>> > indeed -- and conda actually provides (to my disappointment) very little > in the way of build support -- you need to write platform dependent build > scripts to actually build the packages. > > But it does provide a nice way for me to provide a full "just install > this" distribution of my complex, ugly, hard to build packages.... > > -Chris > > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R (206) 526-6959 voice > 7600 Sand Point Way NE (206) 526-6329 fax > Seattle, WA 98115 (206) 526-6317 main reception > > Chris.Barker at noaa.gov > -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Thu Jan 8 10:50:02 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 8 Jan 2015 09:50:02 +0000 Subject: [Distutils] Role of setuptools and eggs in "modern" distributing... In-Reply-To: References: Message-ID: On 8 January 2015 at 00:20, Chris Barker wrote: > When I call "setup.py develop", setuptools apparently looks for the > "install_requires" packages. If it doesn't find them, it goes out and > decides to apparently pip install them: gets the source from PyPI, downloads, > tries to compile, etc.... Do you get any better results if you use "pip install -e ."? I'm not sure you will, but it might mean that pip does the dependency management for you rather than setuptools, and as long as conda records dependency information in a way that pip recognises that might help. > Even if it does find them already installed, it does some annoying adding to > easy_install.pth magic for them. Unfortunately, that's just the way develop mode (pip's editable mode) works. 
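[Editor's note] The easy_install.pth surprise Chris describes, and the "that's just the way develop mode works" answer, both come down to how the stdlib site module consumes .pth files at interpreter startup. A rough sketch of that per-line logic (the function name is mine; the real code lives in site.addpackage):

```python
import os

def process_pth_line(line, sitedir, path_list):
    """Roughly what site.py does with each line of a .pth file such as
    easy-install.pth (a sketch, not the actual stdlib code)."""
    line = line.rstrip("\n")
    if not line or line.startswith("#"):
        return  # blank lines and comments are ignored
    if line.startswith(("import ", "import\t")):
        # Lines starting with 'import' are *executed* at startup -- this is
        # the hook that setuptools' generated .pth tricks rely on.
        exec(line)
        return
    # Any other line is treated as a path relative to the site dir and is
    # appended (if it exists) -- develop-mode checkouts get onto sys.path
    # this way.
    full = os.path.join(sitedir, line)
    if os.path.exists(full) and full not in path_list:
        path_list.append(full)
```

So a develop-mode install is essentially just a source checkout plus one such path line in easy-install.pth pointing at it.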
It sounds to me that this is more of a conda issue - it doesn't seem to be creating standard distribution metadata to allow pip/setuptools to recognise what it has installed, and it doesn't provide its own equivalent of editable/develop mode (which would allow you to work purely within the conda framework and avoid these issues). Have you tried asking the conda folks about these issues? I thought that when I briefly tried it out, it did install packages in a way that pip could recognise - so maybe something has changed? Paul From tseaver at palladion.com Thu Jan 8 16:41:06 2015 From: tseaver at palladion.com (Tres Seaver) Date: Thu, 08 Jan 2015 10:41:06 -0500 Subject: [Distutils] Role of setuptools and eggs in "modern" distributing... In-Reply-To: References: Message-ID: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On 01/07/2015 07:20 PM, Chris Barker wrote: > OK -- I just found the --no-deps option. So I can do what I want, but > still, I don't think it belongs there and all, and certainly would be > better to have the default be no-deps. Let pip (or conda, or...) > handle that. Unlike pip (which provides no API), setuptools is a library, used by e.g. zc.buildout. The behavior which you find objectionable (installing dependencies needed to satisfy 'install_requires') is a core part of what setuptools *does* -- ripping it out (or even changing the default) would break nearly every other user of setuptools in the world. Tres. 
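[Editor's note] The "standard distribution metadata" Paul mentions is concrete: pip and setuptools decide what counts as installed by looking for *.dist-info / *.egg-info entries on the import path. A minimal sketch of that discovery step (the helper name is mine, and real tools also parse the METADATA/PKG-INFO files inside these directories):

```python
import os

def visible_distributions(paths):
    """List distribution names that pip/setuptools-style tools would
    consider installed, by scanning directories for metadata entries.
    A sketch: pass e.g. site.getsitepackages() for a real environment."""
    found = set()
    for d in paths:
        if not os.path.isdir(d):
            continue
        for entry in os.listdir(d):
            base, ext = os.path.splitext(entry)
            if ext in (".dist-info", ".egg-info"):
                # 'requests-2.5.1.dist-info' -> 'requests'
                found.add(base.split("-")[0])
    return sorted(found)
```

If conda were writing entries of this shape for the packages it installs, pip and setuptools would see them; if not, setup.py develop concludes the dependencies are missing and goes fetching.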
- -- =================================================================== Tres Seaver +1 540-429-0999 tseaver at palladion.com Palladion Software "Excellence by Design" http://palladion.com -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.4.11 (GNU/Linux) iEYEARECAAYFAlSupRIACgkQ+gerLs4ltQ5/fgCfZTk7b9+eVwLqJcztO1RggQJ2 XxkAoM0gYO+vV/sOrIcVqPFbCRVmAi+o =gRN+ -----END PGP SIGNATURE----- From robertc at robertcollins.net Tue Jan 13 02:39:45 2015 From: robertc at robertcollins.net (Robert Collins) Date: Tue, 13 Jan 2015 14:39:45 +1300 Subject: [Distutils] Bilingual namespace package conundrum In-Reply-To: <20150101141916.4e2f0bf0@anarchist> References: <20150101141916.4e2f0bf0@anarchist> Message-ID: On 2 January 2015 at 08:19, Barry Warsaw wrote: > I hope the following makes sense; I've been a little under the weather. ;) > Apologies in advance for the long data-dump, but I wanted to provide as > complete information as possible. Welcome to hell :) > I have ported Mailman 3 to Python 3.4[*]. While working on various > development tasks, I noticed something rather strange regarding > (pseudo-)namespace packages which MM3 is dependent on. These libraries work > for both Python 2 and 3, but of course namespace packages work completely > differently in those two versions (as of Python 3.3, i.e. PEP 420). I've been > thinking about how such bilingual packages can play better with PEP 420 when > running under a compatible Python 3 version, and still *seem* to work well in > Python 2. More on that later. > > It turns out this isn't actually my problem. Here's what's going wrong. > > My py3 branch has a fairly straightforward tox.ini. When I run `tox -e py34` > it builds the hidden .tox/py34 virtualenv, installs all the dependencies into > this, runs the test suite, and all is happy. Since I do most of my > development this way, I never noticed any problems. 
> > But then I wanted to work with the code in a slightly different way, so I > created a virtual environment, activated it, and ran `python setup.py develop` > which appeared to work fine. The problem though is that some packages are not > importable by the venv's python. Let's concentrate on three of these > dependencies, lazr.config, lazr.delegates, and lazr.smtptest. All three give > ImportErrors when trying to import them from the interactive prompt of the > venv's python. > > I tried something different to see if I could narrow down the problem, so I > created another venv, activated it, and ran `pip install lazr.config > lazr.delegates lazr.smtptest`. Firing up the venv's interactive python, I can > now import all three of them just fine. > > So this tells me that `python setup.py develop` is broken, or at the very > least behaves differently than the `pip install` route. Actually, `python > setup.py install` exhibits the same problem. > > Just to be sure Debian's Python 3.4 package isn't introducing any weirdnesses, > all of this work was done with an up-to-date build of Python from hg, using > the 3.4 branch, and the virtualenvs were created with `./python -m venv` > > Let's look more closely at part of the site-packages directories of the two > virtualenvs. The one that fails looks like this: > > drwxrwxr-x [...] lazr.config-2.0.1-py3.4.egg > drwxrwxr-x [...] lazr.delegates-2.0.1-py3.4.egg > drwxrwxr-x [...] lazr.smtptest-2.0.2-py3.4.egg > > All three directories exhibit the same pattern: > > lazr.config-2.0.1-py3.4.egg > EGG-INFO > namespace_packages.txt (contains one line saying "lazr") > lazr > config > __init__.py (contains the pseudo-namespace boilerplate code[+]) > __pycache__ > > [+] that boilerplate code looks like: > > -----snip snip----- > # This is a namespace package, however under >= Python 3.3, let it be a true > # namespace package (i.e. this cruft isn't necessary). 
> import sys > > if sys.hexversion < 0x30300f0: > try: > import pkg_resources > pkg_resources.declare_namespace(__name__) > except ImportError: > import pkgutil > __path__ = pkgutil.extend_path(__path__, __name__) > -----snip snip----- > > but of course, this doesn't really play nicely with PEP 420 because it's the > mere existence of the __init__.py in the namespace package that prevents PEP > 420 from getting invoked. As an aside, Debian's packaging tools will actually > *remove* this __init__.py for declared namespace packages in Python 3, so > under Debian-land, things do work properly. The upstream tools have no such > provision, so it's clear why we're getting the ImportErrors. These three > packages do not share a PEP 420-style namespace, and the sys.hexversion guard > prevents the old-style pseudo-namespace hack from working. > > This could potentially be fixable in the upstream packages, but again, more on > that later. > > Now let's look at the working venvs: > > drwxrwxr-x [...] lazr > drwxrwxr-x [...] lazr.config-2.0.1-py3.4.egg-info > -rw-rw-r-- [...] lazr.config-2.0.1-py3.4-nspkg.pth > drwxrwxr-x [...] lazr.delegates-2.0.1-py3.4.egg-info > -rw-rw-r-- [...] lazr.delegates-2.0.1-py3.4-nspkg.pth > drwxrwxr-x [...] lazr.smtptest-2.0.2-py3.4.egg-info > -rw-rw-r-- [...] lazr.smtptest-2.0.2-py3.4-nspkg.pth > > > This looks quite different, and digging into the 'lazr' directory shows: > > lazr > config > delegates > smtptest > > No __init__.py in 'lazr', and the sub-package directories contain just the > code for the appropriate library. 
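[Editor's note] The 0x30300f0 magic number in the guard quoted above is sys.hexversion for Python 3.3.0 final -- the first release with PEP 420 support. The encoding (CPython's PY_VERSION_HEX layout) unpacks like this; the decoder function is mine:

```python
def decode_hexversion(h):
    """Unpack a sys.hexversion-style value: one byte each for major,
    minor and micro, then a release-level nibble (0xA alpha, 0xB beta,
    0xC candidate, 0xF final) and a serial nibble."""
    levels = {0xA: "alpha", 0xB: "beta", 0xC: "candidate", 0xF: "final"}
    return (
        (h >> 24) & 0xFF,        # major
        (h >> 16) & 0xFF,        # minor
        (h >> 8) & 0xFF,         # micro
        levels[(h >> 4) & 0xF],  # releaselevel
        h & 0xF,                 # serial
    )

# The boilerplate's guard value is exactly 3.3.0 final:
# decode_hexversion(0x30300f0) == (3, 3, 0, 'final', 0)
```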
The egg-info directories look similar, but > now we also have -nspkg.pth files which contain something like the following: > > -----snip snip----- > import sys, types, os;p = os.path.join(sys._getframe(1).f_locals['sitedir'], *('lazr',));ie = os.path.exists(os.path.join(p,'__init__.py'));m = not ie and sys.modules.setdefault('lazr', types.ModuleType('lazr'));mp = (m or []) and m.__dict__.setdefault('__path__',[]);(p not in mp) and mp.append(p) > -----snip snip----- > > which look to me a lot like they are setting up something akin to the PEP > 420-ish virtual `lazr` top-level package. It all makes sense too; importing > lazr.{config,delegates,smtptest} all come from the expected locations and > importing 'lazr' gives you a stripped down module object with no __file__. .... > Stepping back a moment, what is it I want? I want `python setup.py develop` > to work properly in both Python 2 and Python 3 virtualenvs. I want clear > rules we can give library authors so that they can be sure their bilingual > namespace packages will work properly in all supported venv environments. > How to best accomplish this? > > - Should distutils/setuptools remove the namespace package __init__.py files > automatically? (Something like what the Debian packaging tools do.) > > - Should some other boilerplate be added to the namespace-package __init__.py > files in each portion? Maybe the sys.hexversion guards should be removed so > that it acts the same way in both Python 2 and Python 3. > > - Should we introduce some other protocol in Python 3.5 that would allow such > bilingual namespace packages to actually *be* PEP 420 namespace packages in > Python 3? I took a quick review of importlib and it seems tricky, since > it's the mere absence of a top-level __init__.py that triggers PEP > 420 namespace packages. So putting some code in the __init__.py won't get > executed until the top-level package is attempted to be imported. 
Any > trigger for "hey, treat me as a namespace package" would have to be file > system based, since even the namespace_packages.txt file may not exist. > > I'll stop here. Happy new year and let the games begin. So I dug down ridiculously deep into this when investigating a very similar issue in the openstack space. I have an alternative solution to the ones mentioned already - use importlib2 and patch the importer (e.g. in a local site.py) to allow genuine namespaces to work. (It's nearly there - just needs a little more love I think). From memory, this is the set of issues: - setuptools 'namespace' packages are actual nspkg.pth files which directly alter the state of the interpreter to create a module. - they *don't* fix up the path in that module to deal with virtualenvs (in particular sys.base_prefix will differ from sys.prefix and you need to put all possible paths that modules within the namespace are found at into the holder object) - genuine namespace packages don't work with older pythons at all - venn-diagram style splitouts (real common package with a __init__) don't work in a venv environment when some modules are in the base os and some in the virtualenv, because the path isn't adjusted... unless the common package takes care to fix up its own path... which needs to include the OS site dir, the virtual env site dir, and the local working directory (for pip install -e . support). I did in fact get a venn-diagram style setup that would work for all pythons, for all modes, by taking advantage of pip not seeing the non-virtualenv copies of things and just installing them duplicatively when that occurred, but we decided that it was probably going to end up with whack-a-mole if we'd missed any use cases along the way, so we decided to just migrate away from namespace packages altogether. If it would help I can dig up my test code etc, though it's probably all googleable by now anyhow. 
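[Editor's note] The nspkg.pth files Robert describes as "directly altering the state of the interpreter" are the dense one-liner Barry quoted. Unrolled, it does roughly the following ('demo_ns' stands in for the hard-coded namespace name, and sitedir is passed explicitly instead of being fished out of the .pth-processing stack frame with sys._getframe):

```python
import os
import sys
import types

def install_namespace_stub(sitedir, pkg="demo_ns"):
    """Sketch of what a setuptools *-nspkg.pth line does when executed
    by site.py: fabricate a namespace holder module unless a real
    __init__.py would shadow it."""
    p = os.path.join(sitedir, pkg)
    if os.path.exists(os.path.join(p, "__init__.py")):
        return None  # a real package is present; let the normal import win
    # Create (or reuse) a bare module object -- note it has no __file__,
    # much like a PEP 420 namespace package.
    m = sys.modules.setdefault(pkg, types.ModuleType(pkg))
    mp = m.__dict__.setdefault("__path__", [])
    if p not in mp:
        mp.append(p)
    return m
```

As Robert notes, nothing here consults sys.base_prefix, so only the one site dir's portion of the namespace ends up on `__path__`.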
-Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From madewokherd at gmail.com Mon Jan 12 06:48:48 2015 From: madewokherd at gmail.com (Vincent Povirk) Date: Sun, 11 Jan 2015 23:48:48 -0600 Subject: [Distutils] oneget python provider? Message-ID: Hey everyone, Microsoft is working on a thing called OneGet to be shipped in new versions of Windows, which will essentially be a standard interface for accessing whatever package managers are on the system (and bootstrapping new ones). Currently users of OneGet access it using PowerShell (but an API is planned and other interfaces will be available), and package managers interact with it through libraries called "Providers" (which currently must be implemented in .NET or PowerShell, but the system is extensible to other environments). The idea is that you'll be able to point OneGet at some sort of package file, or url, or the name of some project you vaguely remember hearing about, and it'll figure out how to install it and keep it updated. The ultimate goal is to make package management on Windows at least as easy as it is on Linux, while taking advantage of existing systems, rather than imposing a new unified package manager. It has a github repo at https://github.com/oneget/oneget and someone wrote a more detailed introduction (which hopefully makes some sense) here: http://www.howtogeek.com/200334/windows-10-includes-a-linux-style-package-manager-named-oneget/ Sorry if this doesn't make sense, I've been immersed in this and related projects for a long time, and I've internalized so much of it that I don't know what the important points are for an introduction. So, if you can ask questions to help me clarify those things, I'd appreciate it. I've been working on a OneGet provider for Python, written in C#, which would basically be a frontend for PyPI/pip, but would avoid calling out to python for certain operations like searching PyPI, and would bootstrap python/pip as needed. 
However, the project lead asked at a meeting last week how useful it would be to be able to implement providers in other languages (by writing what we're calling a "meta-provider"). I could use the work I've done so far to make this happen for Python. So that would mean that any package manager written in Python could have providers that are also written in Python. To get an idea of what a provider for a package manager looks like, this is the interface in C#: https://github.com/OneGet/oneget/blob/master/OneGet/Providers/IPackageProvider.cs and here's a builtin provider that only implements "GetInstalledPackages" and "UninstallPackage": https://github.com/OneGet/oneget/blob/master/OneGet/Builtin/ArpProvider.cs. All the methods except for "GetPackageProviderName" are optional. There might also be some metadata required, I'd need to work out some details of how this would work for Python. So, is this interesting to anyone? Would anyone want to work on making a Python package manager available through this system (especially pip, as that would save me a lot of work), assuming I can provide some good documentation/sample code and no C# or PowerShell coding is required? Did I even explain this well enough for the question to make sense? From barry at python.org Tue Jan 13 16:25:08 2015 From: barry at python.org (Barry Warsaw) Date: Tue, 13 Jan 2015 10:25:08 -0500 Subject: [Distutils] Bilingual namespace package conundrum In-Reply-To: References: <20150101141916.4e2f0bf0@anarchist> Message-ID: <20150113102508.5c249afd@anarchist.wooz.org> On Jan 13, 2015, at 02:39 PM, Robert Collins wrote: >Welcome to hell :) Well, purgatory maybe. I patched the lazr packages I cared about. :) >So I dug down ridiculously deep into this when investigating a very >similar issue in the openstack space. I have an alternative solution >to the ones mentioned already - use importlib2 and patch the importer >(e.g. in a local site.py) to allow genuine namespaces to work. 
(Its >nearly there - just needs a little more love I think). I like the idea in theory, but I'm not sure how general of a solution this will be. Not everyone can or will want to install the custom importer just to get real namespace packages in Python 2. Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From p.f.moore at gmail.com Tue Jan 13 18:05:15 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 13 Jan 2015 17:05:15 +0000 Subject: [Distutils] oneget python provider? In-Reply-To: References: Message-ID: On 12 January 2015 at 05:48, Vincent Povirk wrote: > Microsoft is working on a thing called OneGet to be shipped in new > versions of Windows, which will essentially be a standard interface > for accessing whatever package managers are on the system (and > bootstrapping new ones). [...] > So, is this interesting to anyone? Would anyone want to work on making > a Python package manager available through this system (especially > pip, as that would save me a lot of work), assuming I can provide some > good documentation/sample code and no C# or PowerShell coding is > required? Did I even explain this well enough for the question to make > sense? I'm sort of interested in this, but as far as I've been able to work out, it's only available on Windows 8 and later. As I'm still on Windows 7, that means I have limited opportunity to investigate. And as you noted yourself, the whole environment seems to be expressed in terms that don't make a huge amount of sense to outsiders (I'm not completely clear on what a "Provider" might be, and what would be involved in pip being one, if it's not already...) So basically, until it's available on Windows 7 and there's a bit more accessible documentation, I'm not likely to do anything about this. But it would be good to hear how things are going, certainly). 
Paul From p.f.moore at gmail.com Wed Jan 14 10:28:49 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 14 Jan 2015 09:28:49 +0000 Subject: [Distutils] oneget python provider? In-Reply-To: References: Message-ID: On 14 January 2015 at 01:51, Vincent Povirk wrote: > On Tue, Jan 13, 2015 at 11:05 AM, Paul Moore wrote: >> I'm sort of interested in this, but as far as I've been able to work >> out, it's only available on Windows 8 and later. As I'm still on >> Windows 7, that means I have limited opportunity to investigate. > > There's no installer for Windows 7 just yet (technically no installer > for it at all, it's just been shipped in the installers of other > things), but the .zip downloads linked from the github page > (http://oneget.org/oneget.zip) should work. > > See https://github.com/OneGet/oneget/wiki/cmdlets for some hint on how > to use it from its directory once extracted. Cool, thanks. I'll take a look. I guess the most obvious question about OneGet from a distutils-sig (Python) perspective, is what is the expected process for a user who's trying to do something with Python? At the moment, the process goes: 1. Go to www.python.org and locate the appropriate Python installer for your system (2.7, 3.4, 32-bit vs 64-bit). 2. Run the installer, selecting "Add Python to PATH". 3. Go to a command prompt and type "pip install requests matplotlib whatever". 4. Start coding. (Note for distutils-sig veterans, I know it's rarely *quite* this easy, but this is the ideal, and it certainly can be as good as this...) What would be the equivalent experience with OneGet, and how would it help with this? Given how easy the above is, it's hard to see the benefits. If it's about fitting in with the wider ecosystem, I'd be tempted to say we should wait till more people are using OneGet first. Things are changing a lot in the Python packaging world at the moment, and adding more options at this stage is likely to be counterproductive. 
(Note, I'm not saying I'm against OneGet - far from it - just that I'm not sure we should be "early adopters" here...) Paul From p.f.moore at gmail.com Wed Jan 14 10:55:20 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 14 Jan 2015 09:55:20 +0000 Subject: [Distutils] oneget python provider? In-Reply-To: References: Message-ID: On 14 January 2015 at 09:28, Paul Moore wrote: >> See https://github.com/OneGet/oneget/wiki/cmdlets for some hint on how >> to use it from its directory once extracted. > > Cool, thanks. I'll take a look. Neat - it seems to work on Win7, which is great. But I note: >find-package Python WARNING: MSG:ProviderSwidtagUnavailable Name Version Status ProviderName Source Summary ---- ------- ------ ------------ ------ ------- python 3.4.2 Available Chocolatey chocolatey Python is a programming language that let... That seems odd for a number of reasons: 1. I have Python 2.6.6, 2.7.8, 3.3.5 and 3.4.1 installed already. Should it not be showing them (so I know not to install the above blindly)? OK, get-package shows me, maybe there's a validation step I missed that won't let me overwrite what I have without confirmation (I'm not keen on trying an install to see what happens!) 2. It's not clear whether the above is the 32-bit or the 64-bit version. If I do find-package Python AllVersions, I get Version ------- 2.7.2 2.7.2.1 2.7.3 2.7.4 2.7.5 2.7.6 3.4.0 3.4.0.20140321 3.4.1 3.4.1.20140610 3.4.1.20141004 3.4.2 (only the versions, to make it paste properly). Still no indication of 32-bit vs 64-bit, multiple versions of some of the Python versions (e.g., 3.4.1). I suspect this is a complaint about Chocolatey rather than OneGet, and it's way off-topic for distutils-sig anyway, but I'd be reluctant to recommend OneGet to Python users until it's much simpler to get a clear understanding of what you're installing. 
One thought - before looking at a pip provider for OneGet, I'd rather look at what's involved in properly exposing the official python.org installers through OneGet. It *would* be nice if we could say to people "just do Install-Package Python 3.4" to get the latest release of Python 3.4 automatically installed" and know they were getting the official python.org build appropriate to their architecture (x86 or amd64). Chocolatey is fine, but I wouldn't want to blindly trust them by default... Paul. From p.f.moore at gmail.com Wed Jan 14 12:24:21 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 14 Jan 2015 11:24:21 +0000 Subject: [Distutils] oneget python provider? In-Reply-To: References: Message-ID: OK, now that I've done some reading and playing with OneGet, I have a couple of questions. On 12 January 2015 at 05:48, Vincent Povirk wrote: > I've been working on a OneGet provider for Python, written in C#, > which would basically be a frontend for PyPI/pip, but would avoid > calling out to python for certain operations like searching PyPI, and > would bootstrap python/pip as needed. If someone has a couple of versions of Python installed (2.7 and 3.4 would be a typical case) how would "Install-Package requests" know which Python installation to install requests into? The OneGet infrastructure (much like yum, apt, etc) seems to be based around managing "the" system Python, but on Windows the notion of having a single special version of Python is atypical. Unix distributions get round this by having official "python2" and "python3" packages, and corresponding official "python2-requests" and "python3-requests" packages. That needs to be backed by a whole curated package stack, though, which doesn't exist for Windows (Chocolatey ain't it, sadly...) > However, the project lead asked at a meeting last week how useful it > would be to be able to implement providers in other languages (by > writing what we're calling a "meta-provider"). 
I could use the work > I've done so far to make this happen for Python. So that would mean > that any package manager written in Python could have providers that > are also written in Python. If I understand this correctly, it's separate from the above, and would allow someone to write a provider in Python that could be used to do Install-Package from (say) their internal corporate repository. Or someone (you or the PyPA, for example) could write a provider that installed from pip, should the question of how to handle multiple mentioned above Pythons be resolved. Is that right? If so, then I'd be very interested in such a thing, although more for experimenting with alternative OneGet providers than for any Python package management reasons. In general, I'd have thought that being able to implement providers in multiple languages is "obviously" a good thing, as it allows people with no PowerShell or C# background to get involved. Having said all this, the fact that I'm the only one who's commenting on this thread may give you an idea of how many people round here are likely to be interested :-) I'd probably better shut up now... Paul From gaianeka at yahoo.com Wed Jan 14 04:29:05 2015 From: gaianeka at yahoo.com (Heng Cheong) Date: Wed, 14 Jan 2015 03:29:05 +0000 (UTC) Subject: [Distutils] Please help me install. Message-ID: <867779653.806483.1421206145701.JavaMail.yahoo@jws10653.mail.bf1.yahoo.com> Dear Python.org,I am not a programmer. I don't know how to program. I have Python 3.4. I've installed pandas and matplotlib to IDLE but they are asking for dateutils and six modules. I don't know how to install these modules with pip installer. From your website, funny sentences like "python -m pip install SomePackage" are suggested to install pip in order to be able to install dateutils and six. I don't know where should I copy&paste this "python -m pip install SomePackage" to as I've tried it in the cmd.exe and returned error. 
I've tried it in Python IDLE and returned error. As I said I don't know programming and such sentence is very alien to me. How can I install dateutil, six and pip without looking like a total idiot? And how do I install such a funny file extension as .whl? I double-click on the .whl file and there is no reaction nor response. I opened the .whl file with winrar and found a pile of .py files which I don't know what to do with them to install. And I can't find any .exe file that I can simply click and install without hassle. Please help me install without looking like an idiot. Thank you. -------------- next part -------------- An HTML attachment was scrubbed... URL: From madewokherd at gmail.com Wed Jan 14 02:51:06 2015 From: madewokherd at gmail.com (Vincent Povirk) Date: Tue, 13 Jan 2015 19:51:06 -0600 Subject: [Distutils] oneget python provider? In-Reply-To: References: Message-ID: On Tue, Jan 13, 2015 at 11:05 AM, Paul Moore wrote: > I'm sort of interested in this, but as far as I've been able to work > out, it's only available on Windows 8 and later. As I'm still on > Windows 7, that means I have limited opportunity to investigate. There's no installer for Windows 7 just yet (technically no installer for it at all, it's just been shipped in the installers of other things), but the .zip downloads linked from the github page (http://oneget.org/oneget.zip) should work. See https://github.com/OneGet/oneget/wiki/cmdlets for some hint on how to use it from its directory once extracted. From Erik.Purins at imgtec.com Wed Jan 14 18:05:12 2015 From: Erik.Purins at imgtec.com (Erik Purins) Date: Wed, 14 Jan 2015 17:05:12 +0000 Subject: [Distutils] oneget python provider? In-Reply-To: References: , Message-ID: >> One thought - before looking at a pip provider for OneGet, I'd rather look at what's involved in properly exposing the official python.org installers through OneGet. 
^This Since windows doesn't have a system cpython interpreter, this has odds of becoming 'the one true way' of getting/updating one. Likewise other non-cpython versions (ironpython, stackless, so-on). -e From p.f.moore at gmail.com Wed Jan 14 19:46:27 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 14 Jan 2015 18:46:27 +0000 Subject: [Distutils] oneget python provider? In-Reply-To: References: Message-ID: On 14 January 2015 at 17:05, Erik Purins wrote: >>> One thought - before looking at a pip provider for OneGet, I'd rather look at what's involved in properly exposing the official python.org installers through OneGet. > > ^This > > Since windows doesn't have a system cpython interpreter, this has odds of becoming 'the one true way' of getting/updating one. > Likewise other non-cpython versions (ironpython, stackless, so-on). Hmm, yes. And my current worry is that the Chocolatey version (which is exposed via OneGet) will become the de facto standard people use, which IMO is *not* good :-( Note that I have no idea what Chocolatey *do* to expose the Python installer. They may just provide a pointer to the python.org website (with some added metadata) or they may provide their own repackaged version. Also, if they are hacked, they may unwittingly provide a pointer to a compromised version. Paul From chris at simplistix.co.uk Thu Jan 15 09:54:17 2015 From: chris at simplistix.co.uk (Chris Withers) Date: Thu, 15 Jan 2015 08:54:17 +0000 Subject: [Distutils] Please help me install. In-Reply-To: <867779653.806483.1421206145701.JavaMail.yahoo@jws10653.mail.bf1.yahoo.com> References: <867779653.806483.1421206145701.JavaMail.yahoo@jws10653.mail.bf1.yahoo.com> Message-ID: <54B78039.7060409@simplistix.co.uk> Heng, I'd strongly recommend you just use Anaconda: http://continuum.io/downloads It will do everything you need from one .exe. cheers, Chris On 14/01/2015 03:29, Heng Cheong wrote: > Dear Python.org, > I am not a programmer. I don't know how to program. 
I have Python 3.4. > I've installed pandas and matplotlib to IDLE but they are asking for > dateutils and six modules. I don't know how to install these modules > with pip installer. From your website, funny sentences like "python -m > pip install SomePackage" are suggested to install pip in order to be > able to install dateutils and six. I don't know where should I > copy&paste this "python -m pip install SomePackage" to as I've tried > it in the cmd.exe and returned error. I've tried it in Python IDLE and > returned error. As I said I don't know programming and such sentence > is very alien to me. How can I install dateutil, six and pip without > looking like a total idiot? And how do I install such a funny file > extension as .whl? I double-click on the .whl file and there is no > reaction nor response. I opened the .whl file with winrar and found a > pile of .py files which I don't know what to do with them to install. > And I can't find any .exe file that I can simply click and install > without hassle. Please help me install without looking like an idiot. > Thank you. > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Thu Jan 15 10:21:11 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 15 Jan 2015 09:21:11 +0000 Subject: [Distutils] oneget python provider? In-Reply-To: References: Message-ID: On 15 January 2015 at 02:28, Vincent Povirk wrote: > I don't think we want to take that approach. 
We should be able to work > with the official releases from python.org at the very least. You may want to involve Steve Dower, as the Python 3.5 installers will be in a new format. Most significantly for you, AIUI, they will be exes rather than msi's. See the thread starting at https://mail.python.org/pipermail/python-dev/2015-January/137669.html for some details. Also, Install-Package may well need to offer some choices related to install options (system install or user install, whether to register Python file associations, etc). Paul From madewokherd at gmail.com Thu Jan 15 03:28:14 2015 From: madewokherd at gmail.com (Vincent Povirk) Date: Wed, 14 Jan 2015 20:28:14 -0600 Subject: [Distutils] oneget python provider? In-Reply-To: References: Message-ID: Get-Package lists locally installed packages, and Install-Packages lists packages available from online repos. > If someone has a couple of versions of Python installed (2.7 and 3.4 > would be a typical case) how would "Install-Package requests" know > which Python installation to install requests into? The OneGet > infrastructure (much like yum, apt, etc) seems to be based around > managing "the" system Python, but on Windows the notion of having a > single special version of Python is atypical. I'm not sure (haven't implemented installs yet), but the way my provider currently works is that the package provider scans the registry for python installs, runs them to get some basic information (currently the version and site-packages location, as well as 32-bit or 64-bit which it infers from the registry), and unless otherwise specified Get-Package will list packages from all of them. One can specify a python install using a -PythonVersion switch (which is currently implemented a stupid way but will eventually do a wildcard match, i.e. -PythonVersion 3.2 will look for 3.2.*) or by specifying the runtime directory with -PythonLocation. 
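The wildcard matching behaviour Vincent describes for the -PythonVersion switch can be sketched in a few lines of Python. This is a hypothetical helper, not code from the oneget-python provider; it only assumes the semantics stated above, that "3.2" selects any installed 3.2.* version:

```python
# Hypothetical helper (not actual oneget-python code) sketching the
# -PythonVersion wildcard match described above: "3.2" should select any
# installed 3.2.* interpreter, while a full version must match exactly.
def match_python_version(requested, installed_versions):
    """Return the installed versions a -PythonVersion value selects."""
    return [
        version for version in installed_versions
        if version == requested or version.startswith(requested + ".")
    ]
```

Under those assumptions, match_python_version("3.2", ["2.7.9", "3.2.5", "3.4.2"]) selects only "3.2.5".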
I'm starting to wonder if specifying the interpreter binary would've been better as we could then work with e.g. IronPython without needing it to be named python.exe. BTW, the source code for that is here: https://github.com/madewokherd/oneget-python For find-package, it queries PyPI without using Python, so no version selection is needed. My thinking for install-package is that it will respect the -PythonVersion and -PythonLocation switches, and will look for the newest installed Python that is compatible with the package (assuming the metadata has that information). If there is no compatible Python installed (or you specified a version that isn't installed) it will query python.org and install Python from there. No one should have to manually install Python or Pip when install-package is involved, if you need it then oneget should know to install it for you. Of course, having providers written in Python makes this a bit more complicated, and I don't have as clear an idea of how it can work. > Unix distributions get round this by having official "python2" and > "python3" packages, and corresponding official "python2-requests" and > "python3-requests" packages. That needs to be backed by a whole > curated package stack, though, which doesn't exist for Windows > (Chocolatey ain't it, sadly...) I don't think we want to take that approach. We should be able to work with the official releases from python.org at the very least. > If I understand this correctly, it's separate from the above, and > would allow someone to write a provider in Python that could be used > to do Install-Package from (say) their internal corporate repository. > Or someone (you or the PyPA, for example) could write a provider that > installed from pip, should the question of how to handle multiple > mentioned above Pythons be resolved. Is that right?
If so, then I'd be > very interested in such a thing, although more for experimenting with > alternative OneGet providers than for any Python package management > reasons. In general, I'd have thought that being able to implement > providers in multiple languages is "obviously" a good thing, as it > allows people with no PowerShell or C# background to get involved. This is separate (unless we prefer a pip provider to be implemented in Python) but would share some code. Ideally, we'd want to be able to run oneget commands with -Provider Pip without having to specify a particular install. Ideally, get-package would query all applicable python installs, and install-package would choose one. But for some providers, that sort of multiplexing may not be desired. One possible approach is for the meta-provider to look at a manifest that has a python version requirement and call into the newest python that meets the requirement, which then has the option to return with a request to repeat the command on a different Python install (or a version that isn't installed yet). I don't think the final release will have Chocolatey enabled by default. I'm a little unclear on what we want something like "Install-Package python" to do out of the box, since making it work would mean blessing some sort of central repo. We definitely want things like "Install-Package http://python.org" to work. I don't think the details are available yet, but this will locate the package using tags and some xml which will eventually point to package details including an installer (the existing msi's should work just fine because we have an msi provider that knows how to handle those). If you give a url to a .msi, I think that works now, and I think find-package will also print information about them.
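Vincent's proposed install-package selection rule, pick the newest installed Python that is compatible with the package and fall back to installing one from python.org, can be sketched like this. This is a simplification under assumed semantics, not the provider's code; real package metadata uses richer version specifiers than the bare "major.minor" strings assumed here:

```python
# Simplified sketch (assumed semantics, not oneget-python code) of the
# selection rule described above: choose the newest installed Python
# whose version matches one the package claims to support. Returning
# None would signal "query python.org and install a suitable Python".
def pick_python(installed_versions, supported):
    def as_tuple(version):
        # "3.4.2" -> (3, 4, 2), so max() compares versions numerically.
        return tuple(int(part) for part in version.split("."))
    compatible = [
        version for version in installed_versions
        if any(version == s or version.startswith(s + ".") for s in supported)
    ]
    return max(compatible, key=as_tuple) if compatible else None
```

For example, with 2.7.9, 3.2.5 and 3.4.2 installed and a package supporting 3.2 and 3.4, this picks 3.4.2.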
I could in theory write a provider that JUST gets the official python releases from python.org (and I almost have to, so that I can install it when needed), but I'm not sure whether it'd change the solution in any material way. I guess you could then do "Install-Package -Provider Python Python-3.4" but that seems a little weird. For me personally, what's most interesting about this is distributing Python code to people who are not Python developers, without requiring them to install anything extra or me having to package it and all its dependencies in some crazy Windows format. Ideally, I should be able to package my irc client (which happens to be written in Python) in the normal way Python software is packaged (wheels?), and if I do it right one should be able to do "Install-Package http://github.com/madewokherd/urk" and get it with the dependencies. Of course, most users won't want to type commands into PowerShell to install things either, but it's not hard to imagine transforming that into something normal people will use. From madewokherd at gmail.com Thu Jan 15 03:29:28 2015 From: madewokherd at gmail.com (Vincent Povirk) Date: Wed, 14 Jan 2015 20:29:28 -0600 Subject: [Distutils] oneget python provider? In-Reply-To: References: Message-ID: On Wed, Jan 14, 2015 at 8:28 PM, Vincent Povirk wrote: > Get-Package lists locally installed packages, and Install-Packages > lists packages available from online repos. Errr, I meant to say that Find-Packages lists available packages. From ben+python at benfinney.id.au Sun Jan 18 01:47:11 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Sun, 18 Jan 2015 11:47:11 +1100 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= Message-ID: <85a91gud80.fsf@benfinney.id.au> Howdy all, How can I specify to Distutils (Setuptools) that module ?foo? needs to be available for use by ?setup.py?, but should not be installed with the binary distribution? 
In the ‘python-daemon’ distribution, I have refactored a bunch of functionality to a separate top-level module (‘version’). That module is required to perform Setuptools actions — the ‘egg_info.writers’ entry point specifically — but is not needed at all by the resulting installation and should not be installed. So it's not clear how to specify this dependency. I have a ‘packages=find_packages(exclude=["test"])’ specification; but that module isn't a package and so should not (?) be collected. I have the file included in ‘MANIFEST.in’; but that only specifies what to include in the source distribution, and should not add any files to the binary. As it stands (‘python-daemon’ [0] version 2.0.3), the ‘version.py’ file is correctly included in the source distribution, correctly used by the ‘egg_info.writers’ entry point; but then ends up incorrectly installed to the run-time packages library. This causes problems for subsequent import of unrelated modules that happen to share the same name. How can I specify to Setuptools that the file is needed in the source distribution, is needed by the entry points for Setuptools, but should not be installed along with the binary distribution? [0] ; version 2.0.3 at . -- \ “The entertainment industry calls DRM "security" software, | `\ because it makes them secure from their customers.” —Cory | _o__) Doctorow, 2014-02-05 | Ben Finney From donald at stufft.io Sun Jan 18 13:22:46 2015 From: donald at stufft.io (Donald Stufft) Date: Sun, 18 Jan 2015 07:22:46 -0500 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= In-Reply-To: <85a91gud80.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> Message-ID: <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> > On Jan 17, 2015, at 7:47 PM, Ben Finney wrote: > > Howdy all, > > How can I specify to Distutils (Setuptools) that module ‘foo’
needs to > be available for use by ‘setup.py’, but should not be installed with the > binary distribution? > > In the ‘python-daemon’ distribution, I have refactored a bunch of > functionality to a separate top-level module (‘version’). That module is > required to perform Setuptools actions — the ‘egg_info.writers’ entry > point specifically — but is not needed at all by the resulting > installation and should not be installed. > > So it's not clear how to specify this dependency. I have a > ‘packages=find_packages(exclude=["test"])’ specification; but that > module isn't a package and so should not (?) be collected. I have the > file included in ‘MANIFEST.in’; but that only specifies what to include > in the source distribution, and should not add any files to the binary. > > As it stands (‘python-daemon’ [0] version 2.0.3), the ‘version.py’ file > is correctly included in the source distribution, correctly used by the > ‘egg_info.writers’ entry point; but then ends up incorrectly installed > to the run-time packages library. This causes problems for subsequent > import of unrelated modules that happen to share the same name. > > How can I specify to Setuptools that the file is needed in the source > distribution, is needed by the entry points for Setuptools, but should > not be installed along with the binary distribution? > > > [0] ; version 2.0.3 at > . > > -- > \ “The entertainment industry calls DRM "security" software, | > `\ because it makes them secure from their customers.” —Cory | > _o__) Doctorow, 2014-02-05 | > Ben Finney > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig setup_requires?
--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From ben+python at benfinney.id.au Sun Jan 18 21:35:29 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 19 Jan 2015 07:35:29 +1100 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> Message-ID: <85vbk3su7i.fsf@benfinney.id.au> Donald Stufft writes: > > On Jan 17, 2015, at 7:47 PM, Ben Finney wrote: > > As it stands (?python-daemon? [0] version 2.0.3), the ?version.py? file > > is correctly included in the source distribution, correctly used by the > > ?egg_info.writers? entry point; but then ends up incorrectly installed > > to the run-time packages library. This causes problems for subsequent > > import of unrelated modules that happen to share the same name. > > > > How can I specify to Setuptools that the file is needed in the source > > distribution, is needed by the entry points for Setuptools, but should > > not be installed along with the binary distribution? > > setup_requires? Did you want to say anything else about that, for instance, how it applies to the question or what specifically you suggest I do? As it stands, I can only quote the documentation: setup_requires A string or list of strings specifying what other distributions need to be present in order for the setup script to run. [?] The module in question is part of the same code base, and is not an ?other distribution?. So I don't know why you suggest this option. -- \ ?Every valuable human being must be a radical and a rebel, for | `\ what he must aim at is to make things better than they are.? 
| _o__) ?Niels Bohr | Ben Finney From dw+distutils-sig at hmmz.org Sun Jan 18 19:57:26 2015 From: dw+distutils-sig at hmmz.org (dw+distutils-sig at hmmz.org) Date: Sun, 18 Jan 2015 18:57:26 +0000 Subject: [Distutils] [Python-Dev] [PyPI] Why are some packages on pypi.python.org/simple, but have no page? In-Reply-To: References: Message-ID: <20150118185726.GA13065@k3> [-cc python-dev, +cc distutils-sig] On Sun, Jan 18, 2015 at 10:14:11AM -0600, Ian Cordasco wrote: > Taking one of your examples: https://pypi.python.org/simple/acid/ 404s This package previously had uploads, but I removed them since some took it as a signal of the usability of the software. A project of that name exists, but not yet in a generally distributable form. Deleting the entry enables the possibility of having to rename a long term project when it comes time to releasing it, or living with some PyPI name that doesn't match the package name. David From ncoghlan at gmail.com Mon Jan 19 02:01:17 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 19 Jan 2015 11:01:17 +1000 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= In-Reply-To: <85vbk3su7i.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> Message-ID: On 19 January 2015 at 06:35, Ben Finney wrote: > Donald Stufft writes: > >> > On Jan 17, 2015, at 7:47 PM, Ben Finney wrote: >> > As it stands (?python-daemon? [0] version 2.0.3), the ?version.py? file >> > is correctly included in the source distribution, correctly used by the >> > ?egg_info.writers? entry point; but then ends up incorrectly installed >> > to the run-time packages library. This causes problems for subsequent >> > import of unrelated modules that happen to share the same name. 
>> > >> > How can I specify to Setuptools that the file is needed in the source >> > distribution, is needed by the entry points for Setuptools, but should >> > not be installed along with the binary distribution? >> >> setup_requires? > > Did you want to say anything else about that, for instance, how it > applies to the question or what specifically you suggest I do? > > As it stands, I can only quote the documentation: > > setup_requires > > A string or list of strings specifying what other distributions need > to be present in order for the setup script to run. [?] > > > > The module in question is part of the same code base, and is not an > ?other distribution?. So I don't know why you suggest this option. If you have a build/install time only dependency that you want to distribute, you *have* to separate it out into a separate component if you don't want it to also be present at runtime. We do not, and will not, support selective installation of subcomponents, as it's too hard to audit later. Instead, such components need to be separated out into distinct packages so that the segmentation of functionality and availability is clear to both the automated tools and to other humans. "Extras" work the same way - they select whether or not to install optional *dependencies*, but deliberately can't be used to selectively install pieces of the package itself (the affected subcomponents are instead expected to do runtime checks to see if the optional dependencies are present and provide a useful error message if they're missing). Regards, Nick. 
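The runtime check Nick describes for "extras" is conventionally a guarded import: the subcomponent always ships, but probes for its optional dependency and fails with a useful message if the extra was not installed. A minimal sketch; the sqlalchemy dependency, the mypkg name and the 'db' extra are illustrative stand-ins, not from any particular project:

```python
# Minimal sketch of the runtime-check pattern for "extras" described
# above. "sqlalchemy", "mypkg" and the "db" extra name are illustrative
# assumptions, not taken from the thread.
try:
    import sqlalchemy  # the optional dependency an extra would install
except ImportError:
    sqlalchemy = None

def open_database(url):
    """Feature that only works when the optional dependency is present."""
    if sqlalchemy is None:
        raise RuntimeError(
            "database support needs the 'db' extra: pip install mypkg[db]")
    return sqlalchemy.create_engine(url)
```

The package installs the same files either way; only the dependency's presence decides whether the feature works.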
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From donald at stufft.io Mon Jan 19 02:18:52 2015 From: donald at stufft.io (Donald Stufft) Date: Sun, 18 Jan 2015 20:18:52 -0500 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= In-Reply-To: References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> Message-ID: <0CA09BAD-B20F-4234-A86A-F1C7F88188D9@stufft.io> > On Jan 18, 2015, at 8:01 PM, Nick Coghlan wrote: > > On 19 January 2015 at 06:35, Ben Finney wrote: >> Donald Stufft writes: >> >>>> On Jan 17, 2015, at 7:47 PM, Ben Finney wrote: >>>> As it stands (?python-daemon? [0] version 2.0.3), the ?version.py? file >>>> is correctly included in the source distribution, correctly used by the >>>> ?egg_info.writers? entry point; but then ends up incorrectly installed >>>> to the run-time packages library. This causes problems for subsequent >>>> import of unrelated modules that happen to share the same name. >>>> >>>> How can I specify to Setuptools that the file is needed in the source >>>> distribution, is needed by the entry points for Setuptools, but should >>>> not be installed along with the binary distribution? >>> >>> setup_requires? >> >> Did you want to say anything else about that, for instance, how it >> applies to the question or what specifically you suggest I do? >> >> As it stands, I can only quote the documentation: >> >> setup_requires >> >> A string or list of strings specifying what other distributions need >> to be present in order for the setup script to run. [?] >> >> >> >> The module in question is part of the same code base, and is not an >> ?other distribution?. So I don't know why you suggest this option. > > If you have a build/install time only dependency that you want to > distribute, you *have* to separate it out into a separate component if > you don't want it to also be present at runtime. 
We do not, and will > not, support selective installation of subcomponents, as it's too hard > to audit later. Instead, such components need to be separated out into > distinct packages so that the segmentation of functionality and > availability is clear to both the automated tools and to other humans. > > "Extras" work the same way - they select whether or not to install > optional *dependencies*, but deliberately can't be used to selectively > install pieces of the package itself (the affected subcomponents are > instead expected to do runtime checks to see if the optional > dependencies are present and provide a useful error message if they're > missing). > I'm confused what this actually is. If it's just a file you don't want installed… then don't specify it in your setup.py's setup() function in either the py_modules or the packages keyword. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From ben+python at benfinney.id.au Mon Jan 19 02:54:04 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 19 Jan 2015 12:54:04 +1100 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <0CA09BAD-B20F-4234-A86A-F1C7F88188D9@stufft.io> Message-ID: <85r3ursfgj.fsf@benfinney.id.au> Donald Stufft writes: > I'm confused what this actually is. Please look at the example I cited, ‘python-daemon’ version 2.0.3. > If it's just a file you don't want installed… then don't specify it in > your setup.py's setup() function in either the py_modules or the > packages keyword. That doesn't have the desired effect; it is installed (unwanted) to the site packages along with the packages I do want. -- \ “Faith is the determination to remain ignorant in the face of | `\ all evidence that you are ignorant.”
?Shaun Mason | _o__) | Ben Finney From ben+python at benfinney.id.au Mon Jan 19 02:59:29 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 19 Jan 2015 12:59:29 +1100 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> Message-ID: <85mw5fsf7i.fsf@benfinney.id.au> Nick Coghlan writes: > If you have a build/install time only dependency that you want to > distribute, you *have* to separate it out into a separate component if > you don't want it to also be present at runtime. So, to be clear: if this module is needed during build-time for the distribution but does not have a stable API yet, you're saying it nevertheless needs to go to a repository for download as an independent distribution? That places a large burden on developing such components. Placing them in a repository for independent download strongly implies they have a reliable public API, which in this case is not yet true. The API for this component isn't stable, and so far isn't needed beyond the packaging for this one distribution. That's why I don't want to separate it out yet. > We do not, and will not, support selective installation of > subcomponents, as it's too hard to audit later. Instead, such > components need to be separated out into distinct packages so that the > segmentation of functionality and availability is clear to both the > automated tools and to other humans. That's your authority as PyPA, of course. I hope it's clear why I'm not satisfied by the reasoning though. -- \ ?Anyone who puts a small gloss on [a] fundamental technology, | `\ calls it proprietary, and then tries to keep others from | _o__) building on it, is a thief.? 
—Tim O'Reilly, 2000-01-25 | Ben Finney From donald at stufft.io Mon Jan 19 03:06:27 2015 From: donald at stufft.io (Donald Stufft) Date: Sun, 18 Jan 2015 21:06:27 -0500 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= In-Reply-To: <85r3ursfgj.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <0CA09BAD-B20F-4234-A86A-F1C7F88188D9@stufft.io> <85r3ursfgj.fsf@benfinney.id.au> Message-ID: <7787EA0A-2D9D-4072-BD17-196C56D85A2E@stufft.io> > On Jan 18, 2015, at 8:54 PM, Ben Finney wrote: > > Donald Stufft writes: > >> I'm confused what this actually is. > > Please look at the example I cited, ‘python-daemon’ version 2.0.3. > >> If it's just a file you don't want installed… then don't specify it in >> your setup.py's setup() function in either the py_modules or the >> packages keyword. > > That doesn't have the desired effect; it is installed (unwanted) to the > site packages along with the packages I do want. > Line 53 of the setup.py for python-daemon version 2.0.3 has you calling setuptools.find_packages(exclude=["test"]) and passing the return value of that to the packages kwarg. I'm pretty sure that find_packages is going to discover version.py and add it as part of the return value, which is then passed to the packages kwarg. Adding it to the exclude will probably prevent it from being installed, although you might need to add it to MANIFEST.in to ensure it gets added to the sdist.
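Whether find_packages() picks up a top-level module is easy to probe with a toy reimplementation. find_packages_like below is a drastically simplified stand-in for the setuptools function, not its real code; it only encodes the documented rule that a package is a directory containing __init__.py:

```python
import os

# Drastically simplified stand-in for setuptools.find_packages() (not
# its actual implementation): report only directories containing an
# __init__.py. A top-level module such as version.py is not a package,
# so it can never appear in the result.
def find_packages_like(root, exclude=()):
    packages = []
    for dirpath, dirnames, filenames in os.walk(root):
        if "__init__.py" in filenames:
            name = os.path.relpath(dirpath, root).replace(os.sep, ".")
            if name not in exclude:
                packages.append(name)
    return sorted(packages)
```

On a tree containing daemon/__init__.py, test/__init__.py and a top-level version.py, calling this with exclude=("test",) yields only the daemon package, which matches what Ben reports when he prints the real find_packages() result later in the thread.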
--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From ben+python at benfinney.id.au Mon Jan 19 03:53:00 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 19 Jan 2015 13:53:00 +1100 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <0CA09BAD-B20F-4234-A86A-F1C7F88188D9@stufft.io> <85r3ursfgj.fsf@benfinney.id.au> <7787EA0A-2D9D-4072-BD17-196C56D85A2E@stufft.io> Message-ID: <85iog3scqb.fsf@benfinney.id.au> Donald Stufft writes: > I'm pretty sure that find_packages is going to discover version.py and > add it as part of the return value, which is then passed to the > packages kwarg. That doesn't match the documentation for ‘find_packages’: find_packages() walks the target directory, filtering by inclusion patterns, and finds Python packages (any directory). […] the find_packages() function returns a list of package names suitable for use as the packages argument to setup() […] That and other language in that document strongly imply that packages are the only things found by ‘find_packages’. The ‘version.py’ module is not within any package, so I infer that ‘find_packages’ should ignore it. Is that wrong? -- \ “Two rules to success in life: 1. Don't tell people everything | `\ you know.”
—Sassan Tat | _o__) | Ben Finney From ben+python at benfinney.id.au Mon Jan 19 05:11:34 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 19 Jan 2015 15:11:34 +1100 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <0CA09BAD-B20F-4234-A86A-F1C7F88188D9@stufft.io> <85r3ursfgj.fsf@benfinney.id.au> <7787EA0A-2D9D-4072-BD17-196C56D85A2E@stufft.io> <85iog3scqb.fsf@benfinney.id.au> Message-ID: <85egqrs93d.fsf@benfinney.id.au> Ben Finney writes: > Donald Stufft writes: > > > I'm pretty sure that find_packages is going to discover version.py > > and add it as part of the return value, which is then passed to > > the packages kwarg. > > That doesn't match the documentation for ‘find_packages’: > […] > > The ‘version.py’ module is not within any package, so I infer that > ‘find_packages’ should ignore it. Is that wrong? Testing with Python 2.7 and Python 3.4 demonstrates that, indeed, ‘find_packages’ ignores the file because it's not in a package. When I put in ‘setup.py’ a diagnostic line:: print(find_packages(exclude=["test"])) the output shows ‘version.py’ is not collected:: $ python2.7 ./setup.py check ['daemon'] running check $ python3.4 ./setup.py check ['daemon'] running check Which is as it should be. So should I expect that, if a module is not specified in the ‘packages’ parameter nor the ‘py_modules’ parameter, it will not be installed? -- \ “See, in my line of work you gotta keep repeating things over | `\ and over and over again, for the truth to sink in; to kinda | _o__) catapult the propaganda.” —George W.
Bush, 2005-05 | Ben Finney From ncoghlan at gmail.com Mon Jan 19 08:32:32 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 19 Jan 2015 17:32:32 +1000 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= In-Reply-To: <85egqrs93d.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <0CA09BAD-B20F-4234-A86A-F1C7F88188D9@stufft.io> <85r3ursfgj.fsf@benfinney.id.au> <7787EA0A-2D9D-4072-BD17-196C56D85A2E@stufft.io> <85iog3scqb.fsf@benfinney.id.au> <85egqrs93d.fsf@benfinney.id.au> Message-ID: On 19 January 2015 at 14:11, Ben Finney wrote: > So should I expect that, if a module is not specified in the ?packages? > parameter nor the ?py_modules? parameter, it will not be installed? It won't be installed, but I suspect it also won't end up in the sdist either. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Mon Jan 19 08:39:34 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 19 Jan 2015 17:39:34 +1000 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= In-Reply-To: <85mw5fsf7i.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> Message-ID: On 19 January 2015 at 11:59, Ben Finney wrote: > Nick Coghlan writes: > >> If you have a build/install time only dependency that you want to >> distribute, you *have* to separate it out into a separate component if >> you don't want it to also be present at runtime. > > So, to be clear: if this module is needed during build-time for the > distribution but does not have a stable API yet, you're saying it > nevertheless needs to go to a repository for download as an independent > distribution? 
I actually misunderstood your question. If you're just after the ability to say "I want to include this file in the sdist, but not in the built wheel file or installed distribution" (as I now believe you are), then you're in the implementation defined world of the significantly underspecified sdist format. I believe setting that up actually *is* possible already, but have no idea what incantation you'll need to pass to setuptools to make it do it (and the docs are unlikely to be a reliable guide). If you'd like to volunteer for the task of reverse engineering and properly documenting how sdists work (with regression tests!), that would be quite awesome. Not necessarily *fun* from your point of view, but definitely awesome from my point of view :) Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ben+python at benfinney.id.au Mon Jan 19 08:45:56 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 19 Jan 2015 18:45:56 +1100 Subject: [Distutils] =?utf-8?q?Documenting_the_Distutils_sdist_format_=28w?= =?utf-8?q?as=3A_Python_module_for_use_in_=E2=80=98setup=2Epy=E2=80=99_but?= =?utf-8?q?_not_to_install=29?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> Message-ID: <851tmrrz63.fsf_-_@benfinney.id.au> Nick Coghlan writes: > If you'd like to volunteer for the task of reverse engineering and > properly documenting how sdists work (with regression tests!), that > would be quite awesome. Not necessarily *fun* from your point of view, > but definitely awesome from my point of view :) I'll embark on that work and see how far I get. Thanks for identifying that as a desirable area to work on. The contribution will be licensed under Apache License version 2.0 equally to all recipients (i.e. not signing a PSF-specific agreement). Is there a way that can be accepted? 
-- \ ?What is it that makes a complete stranger dive into an icy | `\ river to save a solid gold baby? Maybe we'll never know.? ?Jack | _o__) Handey | Ben Finney From ben+python at benfinney.id.au Mon Jan 19 09:23:47 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 19 Jan 2015 19:23:47 +1100 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> Message-ID: <85wq4jqiuk.fsf@benfinney.id.au> Nick Coghlan writes: > I actually misunderstood your question. If you're just after the > ability to say "I want to include this file in the sdist, but not in > the built wheel file or installed distribution" (as I now believe you > are) Correct, that's the goal here. > then you're in the implementation defined world of the significantly > underspecified sdist format. I believe setting that up actually *is* > possible already, but have no idea what incantation you'll need to > pass to setuptools to make it do it (and the docs are unlikely to be a > reliable guide). My understanding, based on an answer received elsewhere [0], is that omitting the file from ?setup.py? but adding it to ?MANIFEST.in? causes it to be included in the sdist but omitted from install targets. [0] That has worked for me. But it appears to cause problems for some others, related to this module which should not be installed. Unfortunately this kind of problem (trouble post-install from a PyPI package) is difficult to test. How can I test the behaviour of ?pip? in this regard, without thrashing many releases of a package upload to PyPI? -- \ ?The internet's completely over.? Anyway, all these computers | `\ and digital gadgets are no good. They just fill your head with | _o__) numbers and that can't be good for you.? 
--Prince, 2010-07-05 | Ben Finney From ben+python at benfinney.id.au Mon Jan 19 09:37:35 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 19 Jan 2015 19:37:35 +1100 Subject: [Distutils] =?utf-8?q?Module_from_install_breaks_subsequent_insta?= =?utf-8?q?ll_of_different_distribution_=28was=3A_Python_module_for_use_in?= =?utf-8?b?IOKAmHNldHVwLnB54oCZIGJ1dCBub3QgdG8gaW5zdGFsbCk=?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> Message-ID: <85sif7qi7k.fsf_-_@benfinney.id.au> Ben Finney writes: > That has worked for me. But it appears to cause problems for some > others, related to this module which should not be installed. This discussion was prompted, for me, by this bug report . Can someone please inspect the log output from Pip and help me understand why an unrelated package is breaking when installing 'python-daemon' 2.0.3? The 'python-daemon' 2.0.3 distribution declares 'docutils' as a dependency with 'setup_requires' and 'install_requires'. So it's expected that asking Pip to install 'python-daemon' will attempt to install 'docutils'. But why does the dependency install of 'docutils' then fail with:: ===== Downloading/unpacking docutils (from python-daemon) [...] Downloading from URL https://pypi.python.org/packages/source/d/docutils/docutils-0.12.tar.gz#md5=4622263b62c5c771c03502afa3157768 (from https://pypi.python.org/simple/docutils/) Running setup.py (path:/home/pwj/.virtualenvs/venv/build/docutils/setup.py) egg_info for package docutils [...]
Installing collected packages: python-daemon, lockfile, docutils Running setup.py install for docutils Running command /home/pwj/.virtualenvs/venv/bin/python -c "import setuptools, tokenize;__file__='/home/pwj/.virtualenvs/venv/build/docutils/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-7iBtrf-record/install-record.txt --single-version-externally-managed --compile --install-headers /home/pwj/.virtualenvs/venv/include/site/python2.7 [...] Running docutils-0.12/setup.py -q bdist_egg --dist-dir /tmp/easy_install-XO8spm/docutils-0.12/egg-dist-tmp-0EsqLD Traceback (most recent call last): [...] File "/home/pwj/.virtualenvs/venv/local/lib/python2.7/site-packages/pkg_resources.py", line 2147, in load ['__name__']) ImportError: No module named version ===== Why on earth would the presence of 'version.py' in this distribution cause the install of 'docutils' to fail? Or is some other correlation going on? I have had other reports that subsequent attempts to install packages also fail with the same error. -- \ "The fact that I have no remedy for all the sorrows of the | `\ world is no reason for my accepting yours. It simply supports | _o__) the strong probability that yours is a fake." --Henry L.
Mencken | Ben Finney From marius at gedmin.as Mon Jan 19 11:25:48 2015 From: marius at gedmin.as (Marius Gedminas) Date: Mon, 19 Jan 2015 12:25:48 +0200 Subject: [Distutils] =?cp1257?q?Module_from_install_breaks_subsequent_inst?= =?cp1257?q?all_of_different_distribution_=28was=3A_Python_module_for_use_?= =?cp1257?q?in_=91setup=2Epy=92_but_not_to_install=29?= In-Reply-To: <85sif7qi7k.fsf_-_@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> Message-ID: <20150119102548.GB12805@platonas> On Mon, Jan 19, 2015 at 07:37:35PM +1100, Ben Finney wrote: > Ben Finney writes: > > That has worked for me. But it appears to cause problems for some > > others, related to this module which should not be installed. > > This discussion was prompted, for me, by this bug report > . > Can someone please inspect the log output from Pip and help me > understand why an unrelated package is breaking when installing > ?python-daemon? 2.0.3? (It would be a bit easier if Alioth didn't require creating a guest account and logging in just to read a public bug report.) > The ?python-daemon? 2.0.3 distribution declares ?docutils? as a > dependency with ?setup_requires? and ?install_requires?. So it's > expected that asking Pip to install ?python-daemon? will attempt to > install ?docutils?. > > But why does the dependency install of ?docutils? then fail with:: > > ===== > Downloading/unpacking docutils (from python-daemon) > [?] > Downloading from URL https://pypi.python.org/packages/source/d/docutils/docutils-0.12.tar.gz#md5=4622263b62c5c771c03502afa3157768 (from https://pypi.python.org/simple/docutils/) > Running setup.py (path:/home/pwj/.virtualenvs/venv/build/docutils/setup.py) egg_info for package docutils > [?] 
> Installing collected packages: python-daemon, lockfile, docutils > Running setup.py install for docutils > Running command /home/pwj/.virtualenvs/venv/bin/python -c "import setuptools, tokenize;__file__='/home/pwj/.virtualenvs/venv/build/docutils/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-7iBtrf-record/install-record.txt --single-version-externally-managed --compile --install-headers /home/pwj/.virtualenvs/venv/include/site/python2.7 > [?] > Running docutils-0.12/setup.py -q bdist_egg --dist-dir /tmp/easy_install-XO8spm/docutils-0.12/egg-dist-tmp-0EsqLD > Traceback (most recent call last): > [?] > File "/home/pwj/.virtualenvs/venv/local/lib/python2.7/site-packages/pkg_resources.py", line 2147, in load > ['__name__']) > ImportError: No module named version I've reproduced this error at a different stage: virtualenv /tmp/sandbox /tmp/sandbox/bin/pip install --no-use-wheel python-daemon==2.0.3 -I This tries to reinstall setuptools and fails while running 'python setup.py egg_info' in /tmp/pip-build-0DrVHy/setuptools, with the same ImportError: No module named version inside pkg_resources.py Very interesting. If I attempt pip install without the -I, I get a different (but also very weird error): Collecting python-daemon==2.0.3 Using cached python-daemon-2.0.3.tar.gz Traceback (most recent call last): File "", line 20, in File "/tmp/pip-build-ADeQ8n/python-daemon/setup.py", line 101, in ... File "/tmp/sandbox/lib/python2.7/site-packages/setuptools/dist.py", line 311, in fetch_build_eggs ... distutils.errors.DistutilsError: Setup script exited with error: [Errno 2] No such file or directory: u'ChangeLog' AFAICT it also happens during the setup_requires processing. I was unable to reproduce either error if I tried pip install /path/to/extracted/python-daemon but perhaps I wasn't careful enough to keep all the other variables the same? 
> ===== > > Why on earth would the presence of 'version.py' in this distribution > cause the install of 'docutils' to fail? Or is some other correlation > going on? I have had other reports that subsequent attempts to install > packages also fail with the same error. My gut instinct is to say 'setup_requires is horrible, is there any way to avoid using it'? Of course that's not a reasonable long-term strategy. It'd be good to get to the bottom of this. I tried to debug the 'ChangeLog' error with virtualenv /tmp/sandbox /tmp/sandbox/bin/pip install --no-use-wheel --no-clean python-daemon==2.0.3 cd /tmp/pip-build-zRzNwu/python-daemon strace -e chdir,open,execve -f -o /tmp/crash.log /tmp/sandbox/bin/python setup.py egg_info and it seems to me that the error happens because setuptools changes the working directory to /tmp/easy_install-qcgHIl/docutils-0.12, and then again to /tmp/easy_install-qcgHIl/docutils-0.12/temp/easy_install-5RFCuY/docutils-0.12, right before it gives control back to python-daemon's code that attempts to read ChangeLog from the current working directory. Hmm, I think this bit in your setup.py

    entry_points={
        "distutils.setup_keywords": [
            "release_date = version:validate_distutils_release_date_value",
            ],
        "egg_info.writers": [
            "{filename} = version:generate_egg_info_metadata".format(
                filename=metadata.version_info_filename),
            ],
        },

causes your code to run while setup_requires is busy installing docutils (in the same process!), and your code makes assumptions about the package it's used with ('there's a version.py and a ChangeLog in the current working directory') that are false when it's used with other packages, installed via setup_requires. HTH, Marius Gedminas -- I think one of the enduring tragedies of the 22nd century will be that during the 20th and 21st centuries we persistently treat nuclear reactors as if they're nuclear weapons, and nuclear weapons as if they're nuclear reactors.
-- Charlie Stoss -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 173 bytes Desc: Digital signature URL: From rmcgibbo at gmail.com Mon Jan 19 08:43:16 2015 From: rmcgibbo at gmail.com (Robert McGibbon) Date: Sun, 18 Jan 2015 23:43:16 -0800 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= In-Reply-To: References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> Message-ID: > If you're just after the ability to say "I want to include this file in the sdist, but not in the built wheel file or installed distribution" (as I now believe you are) I think that you can add the file to the MANIFEST.in file to achieve the desired behavior. -Robert On Sun, Jan 18, 2015 at 11:39 PM, Nick Coghlan wrote: > On 19 January 2015 at 11:59, Ben Finney > wrote: > > Nick Coghlan writes: > > > >> If you have a build/install time only dependency that you want to > >> distribute, you *have* to separate it out into a separate component if > >> you don't want it to also be present at runtime. > > > > So, to be clear: if this module is needed during build-time for the > > distribution but does not have a stable API yet, you're saying it > > nevertheless needs to go to a repository for download as an independent > > distribution? > > I actually misunderstood your question. If you're just after the > ability to say "I want to include this file in the sdist, but not in > the built wheel file or installed distribution" (as I now believe you > are), then you're in the implementation defined world of the > significantly underspecified sdist format. I believe setting that up > actually *is* possible already, but have no idea what incantation > you'll need to pass to setuptools to make it do it (and the docs are > unlikely to be a reliable guide). 
> > If you'd like to volunteer for the task of reverse engineering and > properly documenting how sdists work (with regression tests!), that > would be quite awesome. Not necessarily *fun* from your point of view, > but definitely awesome from my point of view :) > > Regards, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Mon Jan 19 17:14:29 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 20 Jan 2015 02:14:29 +1000 Subject: [Distutils] =?utf-8?q?Documenting_the_Distutils_sdist_format_=28w?= =?utf-8?q?as=3A_Python_module_for_use_in_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_but_not_to_install=29?= In-Reply-To: <851tmrrz63.fsf_-_@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <851tmrrz63.fsf_-_@benfinney.id.au> Message-ID: On 19 January 2015 at 17:45, Ben Finney wrote: > Nick Coghlan writes: > >> If you'd like to volunteer for the task of reverse engineering and >> properly documenting how sdists work (with regression tests!), that >> would be quite awesome. Not necessarily *fun* from your point of view, >> but definitely awesome from my point of view :) > > I'll embark on that work and see how far I get. Thanks for identifying > that as a desirable area to work on. > > The contribution will be licensed under Apache License version 2.0 > equally to all recipients (i.e. not signing a PSF-specific agreement). > Is there a way that can be accepted? 
Yes, the CLA is only necessary for CPython and the standard library itself, courtesy of the fun history with CNRI that currently requires relicensing of contributions to a Python specific license. For projects under the PyPA banner, including packaging.python.org, license in = license out is fine. That text is CC-BY-SA, so including Apache Licensed text shouldn't pose any problems. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Mon Jan 19 17:16:34 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 20 Jan 2015 02:16:34 +1000 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= In-Reply-To: <85wq4jqiuk.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> Message-ID: On 19 January 2015 at 18:23, Ben Finney wrote: > Unfortunately this kind of problem (trouble post-install from a PyPI > package) is difficult to test. How can I test the behaviour of ?pip? in > this regard, without thrashing many releases of a package upload to > PyPI? A local devpi instance is great for that kind of testing: http://doc.devpi.net/latest/ Regards, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From carl at oddbird.net Mon Jan 19 19:03:22 2015 From: carl at oddbird.net (Carl Meyer) Date: Mon, 19 Jan 2015 11:03:22 -0700 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= In-Reply-To: <85wq4jqiuk.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> Message-ID: <54BD46EA.3020307@oddbird.net> On 01/19/2015 01:23 AM, Ben Finney wrote: [snip] > My understanding, based on an answer received elsewhere [0], is that > omitting the file from ?setup.py? but adding it to ?MANIFEST.in? causes > it to be included in the sdist but omitted from install targets. > > [0] > > That has worked for me. But it appears to cause problems for some > others, related to this module which should not be installed. That's the right answer. > Unfortunately this kind of problem (trouble post-install from a PyPI > package) is difficult to test. How can I test the behaviour of ?pip? in > this regard, without thrashing many releases of a package upload to > PyPI? Pip can install from the filesystem just as well as from PyPI. Run "python setup.py sdist" to generate your sdist file, and then run "pip install path/to/generated/sdist.tar.gz". Carl -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From carl at meyerloewen.net Mon Jan 19 19:00:17 2015 From: carl at meyerloewen.net (Carl Meyer) Date: Mon, 19 Jan 2015 11:00:17 -0700 Subject: [Distutils] =?windows-1252?q?Python_module_for_use_in_=91setup=2E?= =?windows-1252?q?py=92_but_not_to_install?= In-Reply-To: References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> Message-ID: <54BD4631.809@meyerloewen.net> On 01/19/2015 12:39 AM, Nick Coghlan wrote: > On 19 January 2015 at 11:59, Ben Finney wrote: >> Nick Coghlan writes: > I actually misunderstood your question. If you're just after the > ability to say "I want to include this file in the sdist, but not in > the built wheel file or installed distribution" (as I now believe you > are), then you're in the implementation defined world of the > significantly underspecified sdist format. I believe setting that up > actually *is* possible already, but have no idea what incantation > you'll need to pass to setuptools to make it do it (and the docs are > unlikely to be a reliable guide). This is actually quite easy: it's a distutils feature, not a setuptools one, and it works as documented. Just put the file in MANIFEST (or add a MANIFEST.in rule matching it), but don't reference it in any of the various setup.py kwargs that cause things to be installed. Carl -------------- next part -------------- A non-text attachment was scrubbed...
Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From ben+python at benfinney.id.au Mon Jan 19 22:24:00 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Tue, 20 Jan 2015 08:24:00 +1100 Subject: [Distutils] Documenting the Distutils sdist format References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <851tmrrz63.fsf_-_@benfinney.id.au> Message-ID: <85h9vmqxan.fsf@benfinney.id.au> Nick Coghlan writes: > For projects under the PyPA banner, including packaging.python.org, > license in = license out is fine. That text is CC-BY-SA, so including > Apache Licensed text shouldn't pose any problems. Okay. So which repository should I start from? Where would the documentation go, where would the regression tests go? You appear to be distinguishing the license for executable code versus non-executable documentation. What license conventions should I be aware of? If these are answered in a 'hacking' document somewhere, a response pointing me to that document would be fine. -- \ "Quidquid latine dictum sit, altum viditur." ("Whatever is | `\ said in Latin, sounds profound.") --anonymous | _o__) | Ben Finney From ben+python at benfinney.id.au Mon Jan 19 22:37:39 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Tue, 20 Jan 2015 08:37:39 +1100 Subject: [Distutils] Module from install breaks subsequent install of different distribution References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> Message-ID: <85bnluqwnw.fsf@benfinney.id.au> Marius Gedminas writes: > (It would be a bit easier if Alioth didn't require creating a guest > account and logging in just to read a public bug report.)
You're right, I wasn't aware of that. I have filed a feature request asking that Alioth not do this. We'll see whether it is implemented. -- \ "Skepticism is the highest duty and blind faith the one | `\ unpardonable sin." --Thomas Henry Huxley, _Essays on | _o__) Controversial Questions_, 1889 | Ben Finney From ben+python at benfinney.id.au Mon Jan 19 22:57:41 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Tue, 20 Jan 2015 08:57:41 +1100 Subject: [Distutils] Module from install breaks subsequent install of different distribution References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> Message-ID: <8561c2qvqi.fsf@benfinney.id.au> Marius Gedminas writes: > On Mon, Jan 19, 2015 at 07:37:35PM +1100, Ben Finney wrote: > > Why on earth would the presence of 'version.py' in this distribution > > cause the install of 'docutils' to fail? Or is some other > > correlation going on? I have had other reports that subsequent > > attempts to install packages also fail with the same error. > > My gut instinct is to say 'setup_requires is horrible, is there any > way to avoid using it'? Of course that's not a reasonable long-term > strategy. It'd be good to get to the bottom of this. Thank you very much for spending the time to do so. > Hmm, I think this bit in your setup.py [defining entry points] [...] > causes your code to run while setup_requires is busy installing > docutils (in the same process!), and your code makes assumptions about > the package it's used with ('there's a version.py and a ChangeLog in > the current working directory') that are false when it's used with > other packages, installed via setup_requires.
My current position would be: that's a bug in Setuptools, it should not be applying entry points defined for package FOO when running the setup for some other package BAR. Is that right? Should I report a bug against Setuptools for this behaviour? -- \ "Do unto others twenty-five percent better than you expect them | `\ to do unto you. (The twenty-five percent is [to correct] for | _o__) error.)" --Linus Pauling's Golden Rule | Ben Finney From tseaver at palladion.com Tue Jan 20 00:04:17 2015 From: tseaver at palladion.com (Tres Seaver) Date: Mon, 19 Jan 2015 18:04:17 -0500 Subject: [Distutils] Module from install breaks subsequent install of different distribution In-Reply-To: <8561c2qvqi.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> Message-ID: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On 01/19/2015 04:57 PM, Ben Finney wrote: > My current position would be: that's a bug in Setuptools, it should > not be applying entry points defined for package FOO when running the > setup for some other package BAR. > > Is that right? Should I report a bug against Setuptools for this > behaviour? Notabug. setuptools itself is extensible by means of entry points. Both entry points in your example (as cited by Marius) demonstrate that: one adds support for a new keyword argument to 'setup()'[1], and the other defines a new "writer" for 'egg-info'[2]. By design, both of those are supposed to be loaded / available for any invocation of 'setup()' in a Python where they are installed (not merely for packages which "mention" them). [1] https://pythonhosted.org/setuptools/setuptools.html#adding-setup-arguments [2] https://pythonhosted.org/setuptools/setuptools.html#adding-new-egg-info-files Tres.
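[Editorial note appended to the archive: a minimal sketch of the mechanism Tres describes, with hypothetical names rather than python-daemon's actual code. Once any installed distribution advertises a `distutils.setup_keywords` entry point, setuptools resolves it for *every* `setup()` run in that Python and calls the named validator with the signature `(dist, attr, value)`.]

```python
# Hypothetical plugin: what a distribution would register via
#
#     entry_points={
#         "distutils.setup_keywords": [
#             "release_date = version:validate_release_date",
#         ],
#     }
#
# After installation, setuptools calls this for EVERY setup() invocation
# that passes a 'release_date' keyword, not just the registering project.
import datetime

try:
    from distutils.errors import DistutilsSetupError
except ImportError:  # Python 3.12+: distutils was removed from the stdlib
    class DistutilsSetupError(Exception):
        pass


def validate_release_date(dist, attr, value):
    """Validator with the (dist, attr, value) signature setuptools uses."""
    try:
        datetime.datetime.strptime(value, "%Y-%m-%d")
    except (TypeError, ValueError):
        raise DistutilsSetupError(
            "%r must be a YYYY-MM-DD date string (got %r)" % (attr, value))


class FakeDist(object):
    """Stand-in for distutils.dist.Distribution in this sketch."""
```

The global registration is exactly why the thread's bug bites: the validator (and the `egg_info.writers` hook) fires during the `docutils` install too.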
- -- =================================================================== Tres Seaver +1 540-429-0999 tseaver at palladion.com Palladion Software "Excellence by Design" http://palladion.com -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.4.11 (GNU/Linux) iEYEARECAAYFAlS9jXEACgkQ+gerLs4ltQ6L8gCghmv+gf1jTuesgd7+LU9iERbN ofIAn3KEQjxt5Zmcxrr1oi3RpW8dB1ci =59fE -----END PGP SIGNATURE----- From ben+python at benfinney.id.au Tue Jan 20 00:11:33 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Tue, 20 Jan 2015 10:11:33 +1100 Subject: [Distutils] Module from install breaks subsequent install of different distribution References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> Message-ID: <85vbk2pdqy.fsf@benfinney.id.au> Tres Seaver writes: > On 01/19/2015 04:57 PM, Ben Finney wrote: > > My current position would be: that's a bug in Setuptools, it should > > not be applying entry points defined for package FOO when running the > > setup for some other package BAR. > > > > Is that right? Should I report a bug against Setuptools for this > > behaviour? > > Notabug. setuptools itself is extensible by means of entry points. > Both entry points in your example (as cited by Marius) demonstrate > that: one adds support for a new keyword argument to 'setup()'[1], and > the other defines a new "writer" for 'egg-info'[2]. By design, both of > those are supposed to be loaded / available for any invocation of > 'setup()' in a Python where the are installed (not merely for packages > which "mention" them). I can see the logic when it's explained, but that doesn't make it not a bug IMO. That's a pretty astonishing behaviour: a declaration in the setup for one package has a mystifying effect on the setup of other packages. I won't press the point. 
Hopefully, though, someone does consider it a bug worth fixing. -- \ "To have the choice between proprietary software packages, is | `\ being able to choose your master. Freedom means not having a | _o__) master." --Richard M. Stallman, 2007-05-16 | Ben Finney From ben+python at benfinney.id.au Tue Jan 20 02:50:14 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Tue, 20 Jan 2015 12:50:14 +1100 Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_distribut?= =?utf-8?q?ions_=28was=3A_Module_from_install_breaks_subsequent_install_of?= =?utf-8?q?_different_distribution=29?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> Message-ID: <85ppaap6eh.fsf_-_@benfinney.id.au> Tres Seaver writes: > setuptools itself is extensible by means of entry points. Both entry > points in your example (as cited by Marius) demonstrate that: one adds > support for a new keyword argument to 'setup()'[1], and the other > defines a new "writer" for 'egg-info'[2]. By design, both of those are > supposed to be loaded / available for any invocation of 'setup()' in a > Python where they are installed (not merely for packages which > "mention" them). What recourse do I have, then? I'm using entry points because it seems to be the only way I can declare functionality that resides in a module alongside the 'setup.py' which itself needs third-party packages. * During the distribution build stage ('./setup.py build' or earlier), I want to parse a reST document and derive some of the distribution metadata from that. * The 'setup.py' is the wrong place for this; it's complex and deserves its own module which can be imported for unit testing.
* This also is totally unrelated to the functionality this distribution installs, so it doesn't belong in any of the packages to distribute for installation. * So I place it in a separate top-level module, 'version', only for use within 'setup.py'. * That module itself needs a third-party distribution ('docutils') from PyPI. So I need that distribution to be installed *before* the 'version' module can be used. Apparently the 'setup_requires' option is the only way to do that. * Then, in order to declare how that functionality is to be used for commands like './setup.py egg_info', I have no other way than declaring a Setuptools entry point. * Declaring a Setuptools entry point makes Setuptools think the distribution-specific module must be imported by every other distribution in the same run. Of course it's not available there so those distributions fail to install. * It even thwarts the installation of 'docutils', the very distribution that is needed for ending this circular dependency. What am I missing? How can I implement complex functionality specific to packaging this distribution, without making an utter mess? -- \ "The whole area of [treating source code as intellectual
| `\ property] is almost assuring a customer that you are not going | _o__) to do any innovation in the future." --Gary Barnett | Ben Finney From donald at stufft.io Tue Jan 20 03:14:42 2015 From: donald at stufft.io (Donald Stufft) Date: Mon, 19 Jan 2015 21:14:42 -0500 Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_distribut?= =?utf-8?q?ions_=28was=3A_Module_from_install_breaks_subsequent_install_of?= =?utf-8?q?_different_distribution=29?= In-Reply-To: <85ppaap6eh.fsf_-_@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> Message-ID: > On Jan 19, 2015, at 8:50 PM, Ben Finney wrote: > > Tres Seaver writes: > >> setuptools itself is extensible by means of entry points. Both entry >> points in your example (as cited by Marius) demonstrate that: one adds >> support for a new keyword argument to 'setup()'[1], and the other >> defines a new "writer" for 'egg-info'[2]. By design, both of those are >> supposed to be loaded / available for any invocation of 'setup()' in a >> Python where they are installed (not merely for packages which >> "mention" them). > > What recourse do I have, then? > > I'm using entry points because it seems to be the only way I can declare > functionality that resides in a module alongside the 'setup.py' which > itself needs third-party packages. > > * During the distribution build stage ('./setup.py build' or earlier), > I want to parse a reST document and derive some of the distribution > metadata from that. > > * The 'setup.py' is the wrong place for this; it's complex and deserves > its own module which can be imported for unit testing.
> > * This also is totally unrelated to the functionality this distribution > installs, so it doesn't belong in any of the packages to distribute > for installation. > > * So I place it in a separate top-level module, 'version', only for use > within 'setup.py'. > > * That module itself needs a third-party distribution ('docutils') from > PyPI. So I need that distribution to be installed *before* the > 'version' module can be used. Apparently the 'setup_requires' option > is the only way to do that. > > * Then, in order to declare how that functionality is to be used for > commands like './setup.py egg_info', I have no other way than > declaring a Setuptools entry point. > > * Declaring a Setuptools entry point makes Setuptools think the > distribution-specific module must be imported by every other > distribution in the same run. Of course it's not available there so > those distributions fail to install. > > * It even thwarts the installation of 'docutils', the very distribution > that is needed for ending this circular dependency. > > What am I missing? How can I implement complex functionality specific to > packaging this distribution, without making an utter mess? > > -- > \ "The whole area of [treating source code as intellectual | > `\ property] is almost assuring a customer that you are not going | > _o__) to do any innovation in the future." --Gary Barnett | > Ben Finney > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig I think the thing you're missing here is that the entry point information is used to say "this project supplies these entry points". setuptools does not offer the ability to have only build time entry points; it assumes the installed project supplies those entry points. The right way to handle this is to either import it in setup.py and do it there, or spin it out into its own thing that can be installed on its own.
--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From dholth at gmail.com Tue Jan 20 04:50:45 2015 From: dholth at gmail.com (Daniel Holth) Date: Mon, 19 Jan 2015 22:50:45 -0500 Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_di?= =?utf-8?q?stributions_=28was=3A_Module_from_install_breaks_subsequ?= =?utf-8?q?ent_install_of_different_distribution=29?= In-Reply-To: References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> Message-ID: Have you had a go with https://bitbucket.org/dholth/setup-requires ? I wrote it because it has always been problematic putting setup_requires in setup.py to try to get something that setup.py must import into the global namespace before it can run. I'd love to see some build helpers that take advantage of this feature. My version of setup-requires replaces setup.py with a short script that uses pip to install the setup requirements as specified in a special section in setup.cfg file, and then runs the real setup.py which does not have to mention any setup requirements, because they have already been installed by the setup-requires script. The setup requirements are installed into a subdirectory and are only available for the duration of the install. Then real-setup.py can use those requirements to accomplish whatever it has to do. On Mon, Jan 19, 2015 at 9:14 PM, Donald Stufft wrote: > >> On Jan 19, 2015, at 8:50 PM, Ben Finney wrote: >> >> Tres Seaver writes: >> >>> setuptools itself is extensible by means of entry points. 
Both entry >>> points in your example (as cited by Marius) demonstrate that: one adds >>> support for a new keyword argument to 'setup()'[1], and the other >>> defines a new "writer" for 'egg-info'[2]. By design, both of those are >>> supposed to be loaded / available for any invocation of 'setup()' in a >>> Python where the are installed (not merely for packages which >>> "mention" them). >> >> What recourse do I have, then? >> >> I'm using entry points because it seems to be the only way I can declare >> functionality that resides in a module alongside the ?setup.py? which >> itself needs third-party packages. >> >> * During the distribution build stage (?./setup.py build? or earlier), >> I want to parse a reST document and derive some of the distribution >> metadata from that. >> >> * The ?setup.py? is the wrong place for this; it's complex and deserves >> its own module which can be imported for unit testing. >> >> * This also is totally unrelated to the functionality this distribution >> installs, so it doesn't belong in any of the packages to distribute >> for installation. >> >> * So I place it in a separate top-level module, ?version?, only for use >> within ?setup.py?. >> >> * That module itself needs a third-party distribution (?docutils?) from >> PyPI. So I need that distribution to be installed *before* the >> ?version? module can be used. Apparently the ?setup_requires? option >> is the only way to do that. >> >> * Then, in order to declare how that functionality is to be used for >> commands like ?./setup.py egg_info?, I have no other way than >> declaring a Setuptools entry point. >> >> * Declaring a Setuptools entry point makes Setuptools think the >> distribution-specific module must be imported by every other >> distribution in the same run. Of course it's not available there so >> those distributions fail to install. 
>> >> * It even thwarts the installation of ?docutils?, the very distribution >> that is needed for ending this circular dependency. >> >> What am I missing? How can I implement complex functionality specific to >> packaging this distribution, without making an utter mess? >> >> -- >> \ ?The whole area of [treating source code as intellectual | >> `\ property] is almost assuring a customer that you are not going | >> _o__) to do any innovation in the future.? ?Gary Barnett | >> Ben Finney >> >> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> https://mail.python.org/mailman/listinfo/distutils-sig > > I think the thing you?re missing here is that the entry point information > is used to say ?this project supplies these entry points?. setuptools does > not offer the ability to have only build time entry points, it assumes > the installed project supplies those entry points. The right way to handle > this is to either import it in setup.py and do it there, or spin it out > into it?s own thing that can be installed on it?s own. 
> > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig From marius at gedmin.as Tue Jan 20 08:10:32 2015 From: marius at gedmin.as (Marius Gedminas) Date: Tue, 20 Jan 2015 09:10:32 +0200 Subject: [Distutils] =?cp1257?q?How_to_implement_=91setup=2Epy=92_function?= =?cp1257?q?ality_that_itself_needs_third-party_distributions_=28was=3A_Mo?= =?cp1257?q?dule_from_install_breaks_subsequent_install_of_different_distr?= =?cp1257?q?ibution=29?= In-Reply-To: <85ppaap6eh.fsf_-_@benfinney.id.au> References: <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> Message-ID: <20150120071032.GB20185@platonas> On Tue, Jan 20, 2015 at 12:50:14PM +1100, Ben Finney wrote: > Tres Seaver writes: > > setuptools itself is extensible by means of entry points. Both entry > > points in your example (as cited by Marius) demonstrate that: one adds > > support for a new keyword argument to 'setup()'[1], and the other > > defines a new "writer" for 'egg-info'[2]. By design, both of those are > > supposed to be loaded / available for any invocation of 'setup()' in a > > Python where the are installed (not merely for packages which > > "mention" them). > > What recourse do I have, then? > > I'm using entry points because it seems to be the only way I can declare > functionality that resides in a module alongside the ?setup.py? which > itself needs third-party packages. Are you aware that those entry points will be used for every single package the user tries to install after installing python-daemon? > * During the distribution build stage (?./setup.py build? 
or earlier), > I want to parse a reST document and derive some of the distribution > metadata from that. And presumably you don't want to use regexps for that. ;) > * The ?setup.py? is the wrong place for this; it's complex and deserves > its own module which can be imported for unit testing. > > * This also is totally unrelated to the functionality this distribution > installs, so it doesn't belong in any of the packages to distribute > for installation. > > * So I place it in a separate top-level module, ?version?, only for use > within ?setup.py?. These three items sound very reasonable and shouldn't be causing problems. > * That module itself needs a third-party distribution (?docutils?) from > PyPI. So I need that distribution to be installed *before* the > ?version? module can be used. Apparently the ?setup_requires? option > is the only way to do that. This is the point where my experience with setup_requires would tempt me to go "hey, regexes aren't _that_ bad". ;) > * Then, in order to declare how that functionality is to be used for > commands like ?./setup.py egg_info?, I have no other way than > declaring a Setuptools entry point. I think there's also the option of providing custom command classes to the setup() function. For completeness' sake there's also the option of monkey-patching distutils/setuptools internals (not recommended). > * Declaring a Setuptools entry point makes Setuptools think the > distribution-specific module must be imported by every other > distribution in the same run. *And in every subsequent run.* This is important. Entry points live in the installed package metadata. You shouldn't ever define an entry point that points to a package or module that won't be installed. > Of course it's not available there so those distributions fail to install. > > * It even thwarts the installation of ?docutils?, the very distribution > that is needed for ending this circular dependency. > > What am I missing? 
How can I implement complex functionality specific to > packaging this distribution, without making an utter mess? I'll attempt some suggestions: 1. Simplify metadata extraction so it doesn't rely on docutils. 2. Implement metadata extraction using custom command classes[*] and setup_requires. (Sorry about the lack of specifics: I've never done this. I'm not even 100% certain the things you want can be done this way.) [*] https://docs.python.org/2/distutils/extending.html#integrating-new-commands, the bits about setup(..., cmdclass=...); not the bits about command_packages. 3. Duplicate the metadata in your setup.py; have a unit test that extracts it from your ReST and from your setup.py and compares the two. Since your release process should involve a "run all tests" step, this ought to prevent you from making releases with mismatched metadata. (Not a big fan of this one, but it's better than some alternatives, like having duplicated metadata that must be checked for consistency by hand.) 4. Move your entry points into an installed module that checks the name of the package it's working on and does nothing if it's not "python-daemon". (Not a fan of this one, it adds a moving part that can break any subsequent Python package installation.) 5. Use some replacement of setup_requires as suggested elsewhere in this thread, so you could import docutils at the top level of a setup.py. (I've never tried this approach and have no idea how reliable it is.) HTH, Marius Gedminas -- Since this protocol deals with Firewalls there are no real security considerations. -- RFC 3093 -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 173 bytes Desc: Digital signature URL: From ncoghlan at gmail.com Tue Jan 20 11:39:32 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 20 Jan 2015 20:39:32 +1000 Subject: [Distutils] Module from install breaks subsequent install of different distribution In-Reply-To: <85vbk2pdqy.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85vbk2pdqy.fsf@benfinney.id.au> Message-ID: On 20 Jan 2015 09:11, "Ben Finney" wrote: > > Tres Seaver writes: > > > On 01/19/2015 04:57 PM, Ben Finney wrote: > > > My current position would be: that's a bug in Setuptools, it should > > > not be applying entry points defined for package FOO when running the > > > setup for some other package BAR. > > > > > > Is that right? Should I report a bug against Setuptools for this > > > behaviour? > > > > Notabug. setuptools itself is extensible by means of entry points. > > Both entry points in your example (as cited by Marius) demonstrate > > that: one adds support for a new keyword argument to 'setup()'[1], and > > the other defines a new "writer" for 'egg-info'[2]. By design, both of > > those are supposed to be loaded / available for any invocation of > > 'setup()' in a Python where the are installed (not merely for packages > > which "mention" them). > > I can see the logic when it's explained, but that doesn't make it not a > bug IMO. That's a pretty astonishing behaviour: a declaration in the > setup for one package has a mystifying effect on the setup of other > packages. > > I won't press the point. Hopefully, though, someone does consider it a > bug worth fixing. 
It's an artefact of setuptools's origins in the Chandler project: it assumes it *is* the build infrastructure for the entire integrated system, which is thoroughly wrong in the context of a Linux distribution. easy_install has the same origins and hence the same problematic assumptions. On the installation side, changing the underlying design assumption to "potentially only one component of a larger system" is one of the most critical differences between easy_install and pip. I'd like to see us achieve a similar replacement on the build side as well, but cross platform build support is such a hard problem that it's currently easier to tolerate the occasional quirk than it is to figure out a viable transition plan to migrate to a less surprising build management system. Cheers, Nick. P.S. For example, Meson is an interesting project that I encountered at LCA2015 last week and would like to learn more about. In particular, it apparently has a JSON based metadata format aimed at implementation independent integration between IDEs and build systems that may be well worth exploring further. > > -- > \ ?To have the choice between proprietary software packages, is | > `\ being able to choose your master. Freedom means not having a | > _o__) master.? ?Richard M. Stallman, 2007-05-16 | > Ben Finney > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From barry at python.org Tue Jan 20 16:42:22 2015 From: barry at python.org (Barry Warsaw) Date: Tue, 20 Jan 2015 10:42:22 -0500 Subject: [Distutils] =?utf-8?q?Python_module_for_use_in_=E2=80=98setup=2Ep?= =?utf-8?q?y=E2=80=99_but_not_to_install?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> Message-ID: <20150120104222.467a6732@anarchist.wooz.org> On Jan 19, 2015, at 07:23 PM, Ben Finney wrote: >My understanding, based on an answer received elsewhere [0], is that >omitting the file from ?setup.py? but adding it to ?MANIFEST.in? causes >it to be included in the sdist but omitted from install targets. I haven't read the whole thread (no time right now) but this is a pretty standard solution for packaging problems I've had with Debian. If a file (like MANIFEST.in itself) is not included in the sdist, then building the binary package will often fail. I forget the exact symptoms, but it's probably easily reproducible if you care. Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From ben+python at benfinney.id.au Wed Jan 21 03:28:30 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 21 Jan 2015 13:28:30 +1100 Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_distribut?= =?utf-8?q?ions?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> Message-ID: <85h9vkq33l.fsf@benfinney.id.au> Donald Stufft writes: > setuptools does not offer the ability to have only build time entry > points, it assumes the installed project supplies those entry points. I'm now convinced that Setuptools entry points are not suitable to this purpose. > spin it out into it?s own thing that can be installed on it?s own. Presumably this leads back to ?setup_requires? and ?install_requires?? > The right way to handle this is to either import it in setup.py and do > it there I'd love to. How can I ?import docutils? in ?setup.py? without creating a circular dependency: * The correct way to get Docutils installed is to run ?setup.py? and have it install the dependencies. * Running ?setup.py? performs ?import docutils?, and so will fail if Docutils is not already installed. What I need is a way to express ?ensure Docutils is installed before continuing with other Setuptools actions? in ?setup.py?. I don't know of a neat way to tell Setuptools that. -- \ ?I'm beginning to think that life is just one long Yoko Ono | `\ album; no rhyme or reason, just a lot of incoherent shrieks and | _o__) then it's over.? 
?Ian Wolff | Ben Finney From ben+python at benfinney.id.au Wed Jan 21 03:33:39 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 21 Jan 2015 13:33:39 +1100 Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_distribut?= =?utf-8?q?ions?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> Message-ID: <85d268q2v0.fsf@benfinney.id.au> Daniel Holth writes: > Have you had a go with https://bitbucket.org/dholth/setup-requires ? Thanks for the pointer. > My version of setup-requires replaces setup.py with a short script Why two separate files? Why not add the code in ?setup.py?, prior to the ?setup()? call? > that uses pip to install the setup requirements as specified in a > special section in setup.cfg file This seems even more fragile than what I'm currently faced with, to be honest. I'm trying to get away from hard dependencies on specific ways of obtaining and installing packages and instead use declarative ?I need this situation, make it so? that remain relevant without alteration as Distutils continues to evolve. Adding explicit parsing of the current Setuptools config format, and explicit calls to specific ?pip? command lines, seems to make the packaging more dependent on fast-changing APIs, not less. Perhaps I'm misunderstanding? -- \ ?If you go parachuting, and your parachute doesn't open, and | `\ you friends are all watching you fall, I think a funny gag | _o__) would be to pretend you were swimming.? 
?Jack Handey | Ben Finney From carl at oddbird.net Wed Jan 21 03:34:29 2015 From: carl at oddbird.net (Carl Meyer) Date: Tue, 20 Jan 2015 19:34:29 -0700 Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_distribut?= =?utf-8?q?ions?= In-Reply-To: <85h9vkq33l.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <85h9vkq33l.fsf@benfinney.id.au> Message-ID: <54BF1035.4010101@oddbird.net> On 01/20/2015 07:28 PM, Ben Finney wrote: > What I need is a way to express ?ensure Docutils is installed before > continuing with other Setuptools actions? in ?setup.py?. I don't know of > a neat way to tell Setuptools that. That is the precise purpose of the `setup_requires` kwarg. `setup_requires` isn't exactly "neat", though. Among other problems, if your package is installed via pip, installation of `setup_requires` dependencies will ignore installation options passed to pip, so for instance if someone were trying to install your package via an alternative index or local sdists, setuptools would still go out to PyPI for the `setup_requires` dependencies.) I would take a step back and think really hard about how badly your setup.py really needs docutils, and whether you can avoid doing this at all. There isn't any good way to do it that I know of. You could try the alternative-setup-requires that Daniel described and linked to earlier in this thread. Carl -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From ben+python at benfinney.id.au Wed Jan 21 03:38:32 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 21 Jan 2015 13:38:32 +1100 Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_distribut?= =?utf-8?q?ions?= References: <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <20150120071032.GB20185@platonas> Message-ID: <857fwgq2mv.fsf@benfinney.id.au> Marius Gedminas writes: > You shouldn't ever define an entry point that points to a package or > module that won't be installed. This has now become clear. Thanks to everyone in this series of threads who has explained this. My preference of course was to avoid entry points, but it seemed a way out of the circular dependency. I now see that this is not a solution either. > I'll attempt some suggestions: Thanks very much for making a concerted attempt to find a solution. > 2. Implement metadata extraction using custom command classes[*] and > setup_requires. This seems to be the only one which might be feasible (and without needless duplication of information, which is part of the whole point of this exercise). I will learn more and try it. -- \ ?I was in Las Vegas, at the roulette table, having a furious | `\ argument over what I considered to be an odd number.? 
?Steven | _o__) Wright | Ben Finney From ben+python at benfinney.id.au Wed Jan 21 04:27:52 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 21 Jan 2015 14:27:52 +1100 Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_distribut?= =?utf-8?q?ions?= References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <85h9vkq33l.fsf@benfinney.id.au> <54BF1035.4010101@oddbird.net> Message-ID: <85zj9cols7.fsf@benfinney.id.au> Carl Meyer writes: > On 01/20/2015 07:28 PM, Ben Finney wrote: > > What I need is a way to express ?ensure Docutils is installed before > > continuing with other Setuptools actions? in ?setup.py?. I don't > > know of a neat way to tell Setuptools that. > > That is the precise purpose of the `setup_requires` kwarg. That's too late though. What I need is to be able to instruct the build process to install Docutils so I can get stuff done *before ?setup()? is called*. Is there an API within Distutils or Setuptools that allows me to say ?I haven't declared anything yet but go get and install distribution FOO right now?? I could do that before importing the module which needs Docutils. -- \ ?The long-term solution to mountains of waste is not more | `\ landfill sites but fewer shopping centres.? 
?Clive Hamilton, | _o__) _Affluenza_, 2005 | Ben Finney From dholth at gmail.com Wed Jan 21 04:50:06 2015 From: dholth at gmail.com (Daniel Holth) Date: Tue, 20 Jan 2015 22:50:06 -0500 Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_di?= =?utf-8?q?stributions?= In-Reply-To: <85zj9cols7.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <85h9vkq33l.fsf@benfinney.id.au> <54BF1035.4010101@oddbird.net> <85zj9cols7.fsf@benfinney.id.au> Message-ID: Just use my script. Once new-metadata has been standardized pip itself will know how to install build (setup) requirements and the script will be obsolete. Until then I think you will find it works. On Jan 20, 2015 10:27 PM, "Ben Finney" wrote: > Carl Meyer writes: > > > On 01/20/2015 07:28 PM, Ben Finney wrote: > > > What I need is a way to express ?ensure Docutils is installed before > > > continuing with other Setuptools actions? in ?setup.py?. I don't > > > know of a neat way to tell Setuptools that. > > > > That is the precise purpose of the `setup_requires` kwarg. > > That's too late though. What I need is to be able to instruct the build > process to install Docutils so I can get stuff done *before ?setup()? is > called*. > > Is there an API within Distutils or Setuptools that allows me to say ?I > haven't declared anything yet but go get and install distribution FOO > right now?? I could do that before importing the module which needs > Docutils. > > -- > \ ?The long-term solution to mountains of waste is not more | > `\ landfill sites but fewer shopping centres.? 
?Clive Hamilton, | > _o__) _Affluenza_, 2005 | > Ben Finney > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From me at m.merickel.org Wed Jan 21 04:38:23 2015 From: me at m.merickel.org (Michael Merickel) Date: Tue, 20 Jan 2015 21:38:23 -0600 Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_di?= =?utf-8?q?stributions?= In-Reply-To: <85zj9cols7.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <85h9vkq33l.fsf@benfinney.id.au> <54BF1035.4010101@oddbird.net> <85zj9cols7.fsf@benfinney.id.au> Message-ID: On Tue, Jan 20, 2015 at 9:27 PM, Ben Finney wrote: > Is there an API within Distutils or Setuptools that allows me to say ?I > haven't declared anything yet but go get and install distribution FOO > right now?? I could do that before importing the module which needs > Docutils. What you're asking for is the *exact* purpose of Daniel Holth's setup-requires project linked in earlier posts. It's a downfall in the current tooling where there is no declarative configuration. 
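The bootstrap idea behind Daniel's setup-requires project can be sketched roughly as follows. This is *not* his actual script (see the Bitbucket link for that): the section name `setup-requires`, the target directory, and both function names are assumptions made for illustration, and the `configparser` spelling is Python 3's.

```python
# Rough sketch of a setup-requires-style bootstrap: read build-time
# requirements from a dedicated setup.cfg section, pip-install them into
# a private directory, put that directory on sys.path, and only then
# hand control to the real setup.py.
import configparser
import subprocess
import sys


def read_setup_requires(cfg_text):
    """Extract requirement strings from a hypothetical multi-line
    'setup-requires' option in the [metadata] section of setup.cfg."""
    parser = configparser.ConfigParser()
    parser.read_string(cfg_text)
    raw = parser.get("metadata", "setup-requires", fallback="")
    return [line.strip() for line in raw.splitlines() if line.strip()]


def bootstrap(requirements, target=".setup-requires"):
    """Install build-time requirements into a private directory and
    make them importable for the duration of this process."""
    if requirements:
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--target", target]
            + list(requirements)
        )
        sys.path.insert(0, target)
```

A wrapper setup.py would call `bootstrap(read_setup_requires(open("setup.cfg").read()))` and then exec the real setup script, which can import docutils freely.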
From donald at stufft.io Wed Jan 21 05:12:02 2015
From: donald at stufft.io (Donald Stufft)
Date: Tue, 20 Jan 2015 23:12:02 -0500
Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_distribut?= =?utf-8?q?ions?=
In-Reply-To: <85zj9cols7.fsf@benfinney.id.au>
References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <85h9vkq33l.fsf@benfinney.id.au> <54BF1035.4010101@oddbird.net> <85zj9cols7.fsf@benfinney.id.au>
Message-ID: 

> On Jan 20, 2015, at 10:27 PM, Ben Finney wrote:
>
> Carl Meyer writes:
>
>> On 01/20/2015 07:28 PM, Ben Finney wrote:
>>> What I need is a way to express “ensure Docutils is installed before
>>> continuing with other Setuptools actions” in ‘setup.py’. I don't
>>> know of a neat way to tell Setuptools that.
>>
>> That is the precise purpose of the `setup_requires` kwarg.
>
> That's too late though. What I need is to be able to instruct the build
> process to install Docutils so I can get stuff done *before ‘setup()’ is
> called*.
>
> Is there an API within Distutils or Setuptools that allows me to say “I
> haven't declared anything yet but go get and install distribution FOO
> right now”? I could do that before importing the module which needs
> Docutils.

You could also add setup_requires=['docutils'], and then pass in a distclass (defined in your setup.py) and/or custom cmdclass entries which delay the import until the bodies of the methods on those classes. Those methods run after the setup_requires code has run, so the docutils import should work.
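A hedged sketch of the delayed-import cmdclass idea described above. The `version` helper module and its `get_summary` function are hypothetical, and the kwargs are wrapped in a function only so the sketch can be exercised without actually invoking distutils; a real setup.py would end with `setup(**setup_kwargs())`.

```python
# Sketch only: declare docutils in setup_requires and delay its use
# until a command actually runs, by which time setuptools has made the
# setup_requires distributions importable.
from setuptools.command.egg_info import egg_info


class egg_info_with_metadata(egg_info):
    def run(self):
        # Import inside the method body, not at module level:
        # setup_requires has been processed before commands execute.
        import version  # hypothetical helper module that needs docutils
        self.distribution.metadata.description = version.get_summary()
        egg_info.run(self)


def setup_kwargs():
    """Keyword arguments the real setup.py would pass to setup()."""
    return dict(
        name="python-daemon",  # example project name from the thread
        setup_requires=["docutils"],
        cmdclass={"egg_info": egg_info_with_metadata},
    )
```

The lowercase class name follows the setuptools convention for command classes; the key in `cmdclass` is what determines which command it overrides.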
---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

From donald at stufft.io Wed Jan 21 05:41:16 2015
From: donald at stufft.io (Donald Stufft)
Date: Tue, 20 Jan 2015 23:41:16 -0500
Subject: [Distutils] =?utf-8?q?How_to_implement_=E2=80=98setup=2Epy?= =?utf-8?q?=E2=80=99_functionality_that_itself_needs_third-party_distribut?= =?utf-8?q?ions?=
In-Reply-To: 
References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <85h9vkq33l.fsf@benfinney.id.au> <54BF1035.4010101@oddbird.net> <85zj9cols7.fsf@benfinney.id.au>
Message-ID: 

> On Jan 20, 2015, at 10:50 PM, Daniel Holth wrote:
>
> Just use my script. Once new-metadata has been standardized pip itself
> will know how to install build (setup) requirements and the script will
> be obsolete. Until then I think you will find it works.

One downside to https://bitbucket.org/dholth/setup-requires is that it's not going to work for people who use something like ``pip install --cert`` or ``pip install --index`` (specifically the CLI flags, not environment variables or config files). This is already a problem with setup_requires, but one I have a plan to fix. It's also not going to work on Python 2.6, but that's a trivial fix.

The major downside, though, is that trying to work around the built-in setuptools mechanisms is going to go wrong, I think. I'm hoping in the next pip version to have a solution where pip itself will install setup_requires instead of setuptools handling it. However, looking at the implementation of dholth/setup-requires, I'm pretty sure that this is going to “defeat”
that and prevent pip from being able to wrest control of dholth/setup-requires style setup_requires (which will make things like ``pip install ?cert`` work). Generally I?m of the opinion that the trickier someone makes their setup.py, the worse of an idea it is and the more likely it is going to break. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: From alex at hill.net.au Wed Jan 21 07:13:16 2015 From: alex at hill.net.au (Alexander Hill) Date: Wed, 21 Jan 2015 14:13:16 +0800 Subject: [Distutils] Dependencies conditional on specific installed versions of other dependencies? Message-ID: Hi all, I have run into a bit of a head-scratcher. Prior to the upcoming version 1.8, Django included a comments app at django.contrib.comments. As of 1.8, the app has been broken out into an independent package, django-contrib-comments [1]. django-contrib-comments requires Django >= 1.5. The project I'm working on (Mezzanine) requires Django comments. I want Mezzanine to support Django 1.4 through to 1.8, so I need django-contrib-comments. But because it requires Django >= 1.5, when it's in my dependencies Django gets automatically upgraded if version 1.4 is installed, so the Mezzanine test suite isn't actually run under 1.4 [2]. Is there some way to only depend on django-contrib-comments if a minimum version of Django is installed? Or any other way around this problem? Any advice greatly appreciated! Cheers, Alex [1] https://github.com/django/django-contrib-comments [2] https://travis-ci.org/stephenmcd/mezzanine/jobs/47675518#L315 -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Thu Jan 22 06:29:41 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 22 Jan 2015 15:29:41 +1000 Subject: [Distutils] Dependencies conditional on specific installed versions of other dependencies? 
In-Reply-To: References: Message-ID: On 21 January 2015 at 16:13, Alexander Hill wrote: > Hi all, > > I have run into a bit of a head-scratcher. > > Prior to the upcoming version 1.8, Django included a comments app at > django.contrib.comments. As of 1.8, the app has been broken out into an > independent package, django-contrib-comments [1]. django-contrib-comments > requires Django >= 1.5. > > The project I'm working on (Mezzanine) requires Django comments. I want > Mezzanine to support Django 1.4 through to 1.8, so I need > django-contrib-comments. But because it requires Django >= 1.5, when it's in > my dependencies Django gets automatically upgraded if version 1.4 is > installed, so the Mezzanine test suite isn't actually run under 1.4 [2]. > > Is there some way to only depend on django-contrib-comments if a minimum > version of Django is installed? Or any other way around this problem? Any > advice greatly appreciated! The closest I can think of is to make the comments support an extra, so you'd need to specify "mezzanine[comments]" to get django-contrib-comments on 1.5+. That's unlikely to be acceptable in this case though, since existing dependencies on mezzanine would fail to install the comments support. While it won't help you right now, I did have an idea for a possible way to deal with this in PEP 426: provide a way to specify a set of "default extras" for a project. Those dependencies would still be installed by default, but you'd have an easy way to turn them off if you didn't want them. Then, in this case, you'd be able to run the Django 1.4 tests using "mezzanine[-comments]" rather than a default install to avoid attempting to install django-contrib-comments. Filed as: https://github.com/pypa/interoperability-peps/issues/18 Regards, Nick. 
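The extras mechanics Nick describes can be sketched like this. The names mirror the thread's example but this is illustrative only, not Mezzanine's actual packaging; the `pkg_resources` parsing at the end just shows how a consumer's `mezzanine[comments]` requirement string is interpreted.

```python
# Declaring comments support as an extra: a plain "pip install mezzanine"
# would not drag in django-contrib-comments, while "mezzanine[comments]"
# opts in (and transitively pulls Django >= 1.5 via that package).
from pkg_resources import Requirement

EXTRAS_REQUIRE = {
    "comments": ["django-contrib-comments"],
}
# In setup.py this would be passed as setup(..., extras_require=EXTRAS_REQUIRE).

# A consumer opts in by naming the extra in its requirement string:
req = Requirement.parse("mezzanine[comments]")
print(req.project_name, sorted(req.extras))
```

The downside Nick notes still applies: anyone depending on bare `mezzanine` would silently lose comments support, which is what motivates his "default extras" idea for PEP 426.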
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From alex at hill.net.au Thu Jan 22 07:18:55 2015 From: alex at hill.net.au (Alexander Hill) Date: Thu, 22 Jan 2015 14:18:55 +0800 Subject: [Distutils] Dependencies conditional on specific installed versions of other dependencies? In-Reply-To: References: Message-ID: Hi Nick, Just thinking some more about the scenario where Django defines a "comments" extra. Say in our setup.py we have install_requires=['django[comments] >= 1.4']. What happens when this setup.py is run with Django 1.4 already installed? Is the (missing) extra just ignored, or will Django be upgraded to a version which declares the extra? (anyone following along, please see my response to Nick below - forgot to reply all) Cheers Alex On Thu, Jan 22, 2015 at 2:04 PM, Alexander Hill wrote: > Thanks for your feedback Nick. > > Just checking my understanding: say default extras existed and we'd > implemented them as described, and I'm a Mezzanine user running Django 1.4. > I see a new version has been released, don't read the release notes > thoroughly, and run pip install --upgrade mezzanine as usual instead of > specifying [-comments]. This would still install django-contrib-comments > and upgrade Django in the process, is that right? > > Seems like the cleanest way to do this would be for Django itself to > define a "comments" extra, requiring django-contrib-comments, in Django > 1.8's setup.py. > > Cheers, > Alex > > On Thu, Jan 22, 2015 at 1:29 PM, Nick Coghlan wrote: > >> On 21 January 2015 at 16:13, Alexander Hill wrote: >> > Hi all, >> > >> > I have run into a bit of a head-scratcher. >> > >> > Prior to the upcoming version 1.8, Django included a comments app at >> > django.contrib.comments. As of 1.8, the app has been broken out into an >> > independent package, django-contrib-comments [1]. >> django-contrib-comments >> > requires Django >= 1.5. >> > >> > The project I'm working on (Mezzanine) requires Django comments. 
I want >> > Mezzanine to support Django 1.4 through to 1.8, so I need >> > django-contrib-comments. But because it requires Django >= 1.5, when >> it's in >> > my dependencies Django gets automatically upgraded if version 1.4 is >> > installed, so the Mezzanine test suite isn't actually run under 1.4 [2]. >> > >> > Is there some way to only depend on django-contrib-comments if a minimum >> > version of Django is installed? Or any other way around this problem? >> Any >> > advice greatly appreciated! >> >> The closest I can think of is to make the comments support an extra, >> so you'd need to specify "mezzanine[comments]" to get >> django-contrib-comments on 1.5+. That's unlikely to be acceptable in >> this case though, since existing dependencies on mezzanine would fail >> to install the comments support. >> >> While it won't help you right now, I did have an idea for a possible >> way to deal with this in PEP 426: provide a way to specify a set of >> "default extras" for a project. Those dependencies would still be >> installed by default, but you'd have an easy way to turn them off if >> you didn't want them. >> >> Then, in this case, you'd be able to run the Django 1.4 tests using >> "mezzanine[-comments]" rather than a default install to avoid >> attempting to install django-contrib-comments. >> >> Filed as: https://github.com/pypa/interoperability-peps/issues/18 >> >> Regards, >> Nick. >> >> -- >> Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dholth at gmail.com Thu Jan 22 16:45:24 2015 From: dholth at gmail.com (Daniel Holth) Date: Thu, 22 Jan 2015 10:45:24 -0500 Subject: [Distutils] Dependencies conditional on specific installed versions of other dependencies? In-Reply-To: References: Message-ID: It is an error to refer to an extra that has not been declared. 
In your case you would get an error if some version of Django >= 1.4 that did not declare the extra was installed. pip install pip[pop] UnknownExtra: pip 6.0.6 has no such extra feature 'pop' One thing you could do with the current technology is to have two empty mezzanine-django-old and mezzanine-django-new packages that just depend on a third mezzanine package, the appropriate version of django, and the appropriate comments add on. On Thu, Jan 22, 2015 at 1:18 AM, Alexander Hill wrote: > Hi Nick, > > Just thinking some more about the scenario where Django defines a "comments" > extra. > > Say in our setup.py we have install_requires=['django[comments] >= 1.4']. > What happens when this setup.py is run with Django 1.4 already installed? Is > the (missing) extra just ignored, or will Django be upgraded to a version > which declares the extra? > > (anyone following along, please see my response to Nick below - forgot to > reply all) > > Cheers > Alex > > On Thu, Jan 22, 2015 at 2:04 PM, Alexander Hill wrote: >> >> Thanks for your feedback Nick. >> >> Just checking my understanding: say default extras existed and we'd >> implemented them as described, and I'm a Mezzanine user running Django 1.4. >> I see a new version has been released, don't read the release notes >> thoroughly, and run pip install --upgrade mezzanine as usual instead of >> specifying [-comments]. This would still install django-contrib-comments and >> upgrade Django in the process, is that right? >> >> Seems like the cleanest way to do this would be for Django itself to >> define a "comments" extra, requiring django-contrib-comments, in Django >> 1.8's setup.py. >> >> Cheers, >> Alex >> >> On Thu, Jan 22, 2015 at 1:29 PM, Nick Coghlan wrote: >>> >>> On 21 January 2015 at 16:13, Alexander Hill wrote: >>> > Hi all, >>> > >>> > I have run into a bit of a head-scratcher. >>> > >>> > Prior to the upcoming version 1.8, Django included a comments app at >>> > django.contrib.comments. 
As of 1.8, the app has been broken out into an >>> > independent package, django-contrib-comments [1]. >>> > django-contrib-comments >>> > requires Django >= 1.5. >>> > >>> > The project I'm working on (Mezzanine) requires Django comments. I want >>> > Mezzanine to support Django 1.4 through to 1.8, so I need >>> > django-contrib-comments. But because it requires Django >= 1.5, when >>> > it's in >>> > my dependencies Django gets automatically upgraded if version 1.4 is >>> > installed, so the Mezzanine test suite isn't actually run under 1.4 >>> > [2]. >>> > >>> > Is there some way to only depend on django-contrib-comments if a >>> > minimum >>> > version of Django is installed? Or any other way around this problem? >>> > Any >>> > advice greatly appreciated! >>> >>> The closest I can think of is to make the comments support an extra, >>> so you'd need to specify "mezzanine[comments]" to get >>> django-contrib-comments on 1.5+. That's unlikely to be acceptable in >>> this case though, since existing dependencies on mezzanine would fail >>> to install the comments support. >>> >>> While it won't help you right now, I did have an idea for a possible >>> way to deal with this in PEP 426: provide a way to specify a set of >>> "default extras" for a project. Those dependencies would still be >>> installed by default, but you'd have an easy way to turn them off if >>> you didn't want them. >>> >>> Then, in this case, you'd be able to run the Django 1.4 tests using >>> "mezzanine[-comments]" rather than a default install to avoid >>> attempting to install django-contrib-comments. >>> >>> Filed as: https://github.com/pypa/interoperability-peps/issues/18 >>> >>> Regards, >>> Nick. 
>>> >>> -- >>> Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia >> >> > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > From ben+python at benfinney.id.au Fri Jan 23 03:15:51 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Fri, 23 Jan 2015 13:15:51 +1100 Subject: [Distutils] How to implement ‘setup.py’ functionality that itself needs third-party distributions References: <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <20150120071032.GB20185@platonas> <857fwgq2mv.fsf@benfinney.id.au> Message-ID: <858ugunsx4.fsf@benfinney.id.au> Ben Finney writes: > Marius Gedminas writes: > > > 2. Implement metadata extraction using custom command classes[*] and > > setup_requires. > > This seems to be the only one which might be feasible (and without > needless duplication of information, which is part of the whole point of > this exercise). I will learn more and try it. Okay, the Setuptools ‘egg_info’ command is rather hostile to extension, but less hostile than Setuptools entry points. I have cobbled together a solution. The packaging code for version 2.0.4 of ‘python-daemon’: * Has no Setuptools entry points. These are not the right tool for the job, it seems. * Adds a custom Setuptools command ‘write_version_info’ to parse the Changelog and write the version info metadata file. * Minor hack: Uses a custom ‘egg_info’ command that will recognise and run sub-commands. (Why doesn't every top-level command do this without needing to be taught?) * Major hack: Re-binds class names in order to update their base class, after dynamically importing ‘docutils’.
(If there was a way to ensure Docutils was installed before running any of the commands, this would not be needed.) Thanks for all the assistance understanding the limitations and quirks of Distutils and Setuptools. I'm not very happy with this solution, but hopefully it is modular enough that it can be changed, in some happy future when Python's packaging features are more tractable. -- \ “Programs must be written for people to read, and only | `\ incidentally for machines to execute.” —Abelson & Sussman, | _o__) _Structure and Interpretation of Computer Programs_ | Ben Finney From george at reilly.org Fri Jan 23 05:11:26 2015 From: george at reilly.org (George V. Reilly) Date: Thu, 22 Jan 2015 20:11:26 -0800 Subject: [Distutils] How to implement ‘setup.py’ functionality that itself needs third-party distributions In-Reply-To: <858ugunsx4.fsf@benfinney.id.au> References: <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <20150120071032.GB20185@platonas> <857fwgq2mv.fsf@benfinney.id.au> <858ugunsx4.fsf@benfinney.id.au> Message-ID: I haven't followed this thread very closely, but it reminds me of a question I had a couple of years ago. http://stackoverflow.com/questions/16822400/using-code-from-a-dependency-in-setup-py -- /George V. Reilly george at reilly.org Twitter: @georgevreilly *http://georgevreilly.github.io/ * -------------- next part -------------- An HTML attachment was scrubbed...
URL: From ben+python at benfinney.id.au Fri Jan 23 05:37:11 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Fri, 23 Jan 2015 15:37:11 +1100 Subject: [Distutils] How to implement ‘setup.py’ functionality that itself needs third-party distributions References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <85h9vkq33l.fsf@benfinney.id.au> Message-ID: <85zj9am7t4.fsf@benfinney.id.au> Ben Finney writes: > How can I ‘import docutils’ in ‘setup.py’ without creating a circular > dependency: > > * The correct way to get Docutils installed is to run ‘setup.py’ and > have it install the dependencies. […] > What I need is a way to express ‘ensure Docutils is installed before > continuing with other Setuptools actions’ in ‘setup.py’. I don't know of > a neat way to tell Setuptools that. Thanks to George V. Reilly for a much neater solution: # setup.py from setuptools import (dist, setup, find_packages) # Declare a requirement for Docutils in a throw-away distribution. # Surprisingly, Setuptools will magically ensure this is installed, # or raise an exception if it can't. dist.Distribution(dict(setup_requires="docutils")) # By the time we get here, the following import will work. import docutils # … other code that needs Docutils … setup(…) Should this be in the Setuptools documentation? My time over the past several weeks would have been *much* less frustrating if this was the documented One Obvious Way™ to early-load a dependency in setup. -- \ “Free thought is a necessary, but not a sufficient, condition | `\ for democracy.”
—Carl Sagan | _o__) | Ben Finney From leorochael at gmail.com Fri Jan 23 06:04:35 2015 From: leorochael at gmail.com (Leonardo Rochael Almeida) Date: Fri, 23 Jan 2015 03:04:35 -0200 Subject: [Distutils] How to implement ‘setup.py’ functionality that itself needs third-party distributions In-Reply-To: <858ugunsx4.fsf@benfinney.id.au> References: <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <85wq4jqiuk.fsf@benfinney.id.au> <85sif7qi7k.fsf_-_@benfinney.id.au> <20150119102548.GB12805@platonas> <8561c2qvqi.fsf@benfinney.id.au> <85ppaap6eh.fsf_-_@benfinney.id.au> <20150120071032.GB20185@platonas> <857fwgq2mv.fsf@benfinney.id.au> <858ugunsx4.fsf@benfinney.id.au> Message-ID: I have faced an issue with setuptools (and even plain distutils) in the past that has some resemblance to this. There was a PyPI package that contained C extensions and headers, and another python package that had more C extensions, but depended on the location of the headers on the first package to be built. Was it numpy and scipy? Anyway zope.proxy and zope.container had the same issue, as did Zope's Acquisition and ExtensionClass. The issue can be clearly summarized with the `setup.py` snippet below, which I found at [1]: from distutils.core import setup, Extension import numpy # define the extension module cos_module_np = Extension('cos_module_np', sources=['cos_module_np.c'], include_dirs=[numpy.get_include()]) # run the setup setup(ext_modules=[cos_module_np]) [1] https://scipy-lectures.github.io/advanced/interfacing_with_c/interfacing_with_c.html Semantically, it's clear that numpy is both a `setup_requires` and an `install_requires` for the `setup.py` above. It's needed both at build time for the header locations and at run time for dynamically linking / module importing.
However, even if the snippet above were using `setuptools`, there'd be no way of passing `setup_requires` to the `setup()` call above; the script would have failed with an `ImportError` long before that, in the `include_dirs` part. The way the zope packages solved this issue was horribly crude and unsatisfying, but effective: they vendored the header files (and sometimes whole extensions) of the dependency packages. First by svn:externals, and then by outright copying, when the `git` wave came around. The simplest and most backward compatible fix for setuptools I could think of would be for setup() keywords to have lazy evaluation: In setuptools, some `setup()` keywords accept parameters in multiple formats. For example `entry_points` accepts both a dictionary and a .ini formatted string [2]. [2] https://pythonhosted.org/setuptools/setuptools.html#dynamic-discovery-of-services-and-plugins If `setup()` keywords were to accept callables as well, then we could delay importing dependency packages until `setup_requires` were already satisfied, like this: from setuptools import setup, Extension def ext_modules(): import numpy # define the extension module cos_module_np = Extension('cos_module_np', sources=['cos_module_np.c'], include_dirs=[numpy.get_include()]) return [cos_module_np] # run the setup setup(setup_requires=['numpy'], ext_modules=ext_modules) Cheers, Leo On Friday, January 23, 2015, Ben Finney wrote: > Ben Finney writes: > > > Marius Gedminas writes: > > > > > 2. Implement metadata extraction using custom command classes[*] and > > > setup_requires. > > > > This seems to be the only one which might be feasible (and without > > needless duplication of information, which is part of the whole point of > > this exercise). I will learn more and try it. > > Okay, the Setuptools ‘egg_info’ command is rather hostile to extension, > but less hostile than Setuptools entry points. I have cobbled together a > solution.
> > The packaging code for version 2.0.4 of ‘python-daemon’: > > * Has no Setuptools entry points. These are not the right tool for the > job, it seems. > > * Adds a custom Setuptools command ‘write_version_info’ to parse the > Changelog and write the version info metadata file. > > * Minor hack: Uses a custom ‘egg_info’ command that will recognise and > run sub-commands. (Why doesn't every top-level command do this without > needing to be taught?) > > * Major hack: Re-binds class names in order to update their base class, > after dynamically importing ‘docutils’. (If there was a way to ensure > Docutils was installed before running any of the commands, this would > not be needed.) > > Thanks for all the assistance understanding the limitations and quirks > of Distutils and Setuptools. I'm not very happy with this solution, but > hopefully it is modular enough that it can be changed, in some happy > future when Python's packaging features are more tractable. > > -- > \ “Programs must be written for people to read, and only | > `\ incidentally for machines to execute.” —Abelson & Sussman, | > _o__) _Structure and Interpretation of Computer Programs_ | > Ben Finney > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Sat Jan 24 18:31:35 2015 From: donald at stufft.io (Donald Stufft) Date: Sat, 24 Jan 2015 12:31:35 -0500 Subject: [Distutils] Closing the Delete File + Re-upload File Loophole. Message-ID: I've pushed changes to PyPI where it is no longer possible to reuse a filename and attempting to do it will give a 400 error that says: This filename has previously been used, you should use a different version.
This does NOT prevent authors from being allowed to delete files from PyPI, however if a file is deleted from PyPI it cannot be re-uploaded again. This means that if you upload say foobar-1.0.tar.gz, and your 1.0 has a mistake in it then you *must* issue a new release to correct it. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From sontek at gmail.com Sat Jan 24 18:37:51 2015 From: sontek at gmail.com (John Anderson) Date: Sat, 24 Jan 2015 09:37:51 -0800 Subject: [Distutils] Closing the Delete File + Re-upload File Loophole. In-Reply-To: References: Message-ID: On Saturday, January 24, 2015, Donald Stufft wrote: > I've pushed changes to PyPI where it is no longer possible to reuse a > filename > and attempting to do it will give a 400 error that says: > > This filename has previously been used, you should use a different > version. > > This does NOT prevent authors from being allowed to delete files from PyPI, > however if a file is deleted from PyPI it cannot be re-uploaded again. This > means that if you upload say foobar-1.0.tar.gz, and your 1.0 has a mistake > in > it then you *must* issue a new release to correct it. > > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > My only concern is that there is no reliable way to test that your README will be parsed correctly. Is there a timeline for switching it to use https://github.com/pypa/readme? I would say the majority of the time I do a release of the same version it's because of the fragile rst parsing. If I have to run the risk of bumping versions just to fix a valid restructured text document to fit pypi parsing it'll make releasing a very stressful experience. > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From donald at stufft.io Sat Jan 24 18:38:54 2015 From: donald at stufft.io (Donald Stufft) Date: Sat, 24 Jan 2015 12:38:54 -0500 Subject: [Distutils] Closing the Delete File + Re-upload File Loophole. In-Reply-To: References: Message-ID: <255CD6EE-CC48-4C5B-A417-4191D9DFEDF4@stufft.io> > On Jan 24, 2015, at 12:37 PM, John Anderson wrote: > > > > On Saturday, January 24, 2015, Donald Stufft > wrote: > I've pushed changes to PyPI where it is no longer possible to reuse a filename > and attempting to do it will give an 400 error that says: > > This filename has previously been used, you should use a different version. > > This does NOT prevent authors from being allowed to delete files from PyPI, > however if a file is deleted from PyPI it cannot be re-uploaded again. This > means that if you upload say foobar-1.0.tar.gz, and your 1.0 has a mistake in > it then you *must* issue a new release to correct it. > > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > > My only concern is that there is no reliable way to test that your README will be parsed correctly. Is there a timeline for switch it to use https://github.com/pypa/readme ? > > I would say majority of the time I do a release of the same version it's because of the fragile rst parsing. > > If I have to run the risk of bumping versions just to fix a valid restructured text document to fit pypi parsing it'll make releasing a very stressful experience. > You can re-run register as many times as you want which is all you need to adjust the README. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Sat Jan 24 19:40:20 2015 From: donald at stufft.io (Donald Stufft) Date: Sat, 24 Jan 2015 13:40:20 -0500 Subject: [Distutils] PEP 440 on PyPI Message-ID: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> I've just started enforcing parts of PEP 440 on PyPI. 
In particular: * All new versions must be a valid PEP 440 version (including normalization). * All new versions must not have leading or trailing white space. * All new versions must not have a local version. * Going forward sort order will be determined by PEP 440 sorting. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From graffatcolmingov at gmail.com Sat Jan 24 20:06:07 2015 From: graffatcolmingov at gmail.com (Ian Cordasco) Date: Sat, 24 Jan 2015 13:06:07 -0600 Subject: [Distutils] Closing the Delete File + Re-upload File Loophole. In-Reply-To: <255CD6EE-CC48-4C5B-A417-4191D9DFEDF4@stufft.io> References: <255CD6EE-CC48-4C5B-A417-4191D9DFEDF4@stufft.io> Message-ID: On Sat, Jan 24, 2015 at 11:38 AM, Donald Stufft wrote: > > On Jan 24, 2015, at 12:37 PM, John Anderson wrote: > > > > On Saturday, January 24, 2015, Donald Stufft wrote: >> >> I've pushed changes to PyPI where it is no longer possible to reuse a >> filename >> and attempting to do it will give an 400 error that says: >> >> This filename has previously been used, you should use a different >> version. >> >> This does NOT prevent authors from being allowed to delete files from >> PyPI, >> however if a file is deleted from PyPI it cannot be re-uploaded again. >> This >> means that if you upload say foobar-1.0.tar.gz, and your 1.0 has a mistake >> in >> it then you *must* issue a new release to correct it. >> >> --- >> Donald Stufft >> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA >> > > My only concern is that there is no reliable way to test that your README > will be parsed correctly. Is there a timeline for switch it to use > https://github.com/pypa/readme? > > I would say majority of the time I do a release of the same version it's > because of the fragile rst parsing. > > If I have to run the risk of bumping versions just to fix a valid > restructured text document to fit pypi parsing it'll make releasing a very > stressful experience. 
> > > You can re-run register as many times as you want which is all you need to > adjust the README. > > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > .post{N} releases are also a good way of fixing this in the package (assuming you want the most current and correct version of the README to be what the user downloads). The .post{N} part of PEP 440 is semantically for build errors in a package where no other changes to the package have been made. I think this qualifies as a use case. From msabramo at gmail.com Sat Jan 24 19:53:25 2015 From: msabramo at gmail.com (Marc Abramowitz) Date: Sat, 24 Jan 2015 10:53:25 -0800 Subject: [Distutils] Closing the Delete File + Re-upload File Loophole. In-Reply-To: <255CD6EE-CC48-4C5B-A417-4191D9DFEDF4@stufft.io> References: <255CD6EE-CC48-4C5B-A417-4191D9DFEDF4@stufft.io> Message-ID: <7201094C-8CC0-4BF7-8E4E-DCC0E5AD7C34@gmail.com> > You can re-run register as many times as you want which is all you need to adjust the README. Maybe true but it would be pretty awesome to solve https://bitbucket.org/pypa/pypi/issue/161/rest-formatting-fails-and-there-is-no-way because repeatedly registering and doing trial and error is also not a great experience and it wastes PyPI resources. From donald at stufft.io Sat Jan 24 21:43:13 2015 From: donald at stufft.io (Donald Stufft) Date: Sat, 24 Jan 2015 15:43:13 -0500 Subject: [Distutils] PyPI Rendering Switched Message-ID: I've switched the rendering on PyPI away from using a big blob of code that was embedded into PyPI to the readme library (https://pypi.python.org/pypi/readme). Currently this only has a single API (``import readme.rst; readme.rst.render(...)``); however, in the near future I want to provide a CLI for this so people can more easily test their READMEs against how PyPI renders them.
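[Until such a CLI exists, a rough local check in the same spirit is to render the README with docutils warnings promoted to errors. This is a sketch using docutils directly, not the readme library's actual interface:]

```python
# Render a reST README locally and treat any docutils warning as a
# failure -- a stand-in sketch, not the readme library's API.
from docutils.core import publish_string
from docutils.utils import SystemMessage

def renders_cleanly(rst_source: str) -> bool:
    """Return True if the reST source renders without warnings or errors."""
    try:
        publish_string(
            rst_source,
            writer_name="html",
            # halt_level=2 promotes warnings to SystemMessage exceptions;
            # report_level=5 keeps the check quiet on stderr.
            settings_overrides={"halt_level": 2, "report_level": 5},
        )
    except SystemMessage:
        return False
    return True
```

Running a project's long_description through a check like this before registering catches most of the markup problems that would otherwise only show up on PyPI.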
The only real change should be some slight theming differences. Some things might be valid that didn't use to be valid, and certain error conditions will now render escaped HTML instead of just failing to render. Note, for the next 24 hours you may need to force refresh your browser. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From mal at egenix.com Sun Jan 25 15:32:08 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Sun, 25 Jan 2015 15:32:08 +0100 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> Message-ID: <54C4FE68.9070200@egenix.com> I think you ought to make a more prominent announcement on c.l.p, c.l.p.a and perhaps a distutils blog (if there is one). On 24.01.2015 19:40, Donald Stufft wrote: > I've just started enforcing parts of PEP 440 on PyPI. > > In particular: > > * All new versions must be a valid PEP 440 version (including normalization). > * All new versions must not have leading or trailing white space. > * All new versions must not have a local version. > * Going forward sort order will be determined by PEP 440 sorting. > > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source (#1, Jan 25 2015) >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> mxODBC Plone/Zope Database Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::::: Try our mxODBC.Connect Python Database Interface for free ! :::::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany.
CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ From donald at stufft.io Sun Jan 25 16:34:03 2015 From: donald at stufft.io (Donald Stufft) Date: Sun, 25 Jan 2015 10:34:03 -0500 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: <54C4FE68.9070200@egenix.com> References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> <54C4FE68.9070200@egenix.com> Message-ID: <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> > On Jan 25, 2015, at 9:32 AM, M.-A. Lemburg wrote: > > I think you ought to make a more prominent announcement on > c.l.p, c.l.p.a and perhaps a distutils blog (if there is one). What is c.l.p and c.l.p.a? There is no distutils blog. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From vladimir.v.diaz at gmail.com Sun Jan 25 16:42:42 2015 From: vladimir.v.diaz at gmail.com (Vladimir Diaz) Date: Sun, 25 Jan 2015 10:42:42 -0500 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> <54C4FE68.9070200@egenix.com> <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> Message-ID: Probably the Python mailing list and newsgroups (comp.lang.python and comp.lang.python.announce) https://www.python.org/community/lists/ -Vlad --- PGP key fingerprint = ACCF 9DCA 73B9 862F 93C5 6608 63F8 90AA 1D25 3935 --- On Sun, Jan 25, 2015 at 10:34 AM, Donald Stufft wrote: > > > On Jan 25, 2015, at 9:32 AM, M.-A. Lemburg wrote: > > > > I think you ought to make a more prominent announcement on > > c.l.p, c.l.p.a and perhaps a distutils blog (if there is one). > > What is c.l.p and c.l.p.a? > > There is no distutils blog. 
> > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mal at egenix.com Sun Jan 25 17:43:36 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Sun, 25 Jan 2015 17:43:36 +0100 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> <54C4FE68.9070200@egenix.com> <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> Message-ID: <54C51D38.9000103@egenix.com> On 25.01.2015 16:34, Donald Stufft wrote: > >> On Jan 25, 2015, at 9:32 AM, M.-A. Lemburg wrote: >> >> I think you ought to make a more prominent announcement on >> c.l.p, c.l.p.a and perhaps a distutils blog (if there is one). > > What is c.l.p and c.l.p.a? That's short for comp.lang.python and comp.lang.python.announce, which are gateway'ed to mailing lists: https://mail.python.org/mailman/listinfo/python-list https://mail.python.org/mailman/listinfo/python-announce-list > There is no distutils blog. Given how quickly things change, I think this would be a good way of getting the word out to a wider audience than the distutils mailing list. -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source (#1, Jan 25 2015) >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> mxODBC Plone/Zope Database Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::::: Try our mxODBC.Connect Python Database Interface for free ! :::::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. 
Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ From graffatcolmingov at gmail.com Sun Jan 25 17:46:49 2015 From: graffatcolmingov at gmail.com (Ian Cordasco) Date: Sun, 25 Jan 2015 10:46:49 -0600 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: <54C51D38.9000103@egenix.com> References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> <54C4FE68.9070200@egenix.com> <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> <54C51D38.9000103@egenix.com> Message-ID: On Sun, Jan 25, 2015 at 10:43 AM, M.-A. Lemburg wrote: > On 25.01.2015 16:34, Donald Stufft wrote: >> >>> On Jan 25, 2015, at 9:32 AM, M.-A. Lemburg wrote: >>> >>> I think you ought to make a more prominent announcement on >>> c.l.p, c.l.p.a and perhaps a distutils blog (if there is one). >> >> What is c.l.p and c.l.p.a? > > That's short for comp.lang.python and comp.lang.python.announce, > which are gateway'ed to mailing lists: > > https://mail.python.org/mailman/listinfo/python-list > https://mail.python.org/mailman/listinfo/python-announce-list > >> There is no distutils blog. > > Given how quickly things change, I think this would be a good > way of getting the word out to a wider audience than the > distutils mailing list. I've only been on this list for about a year. This PEP (and others like it) has been in motion for quite a while. I think the blog would miss far more people than the mailing list would and I'm not sure I agree with "how quickly things change". There doesn't seem to be much content for the blog and it seems like something that would just become neglected. What topics do you really think would be better suited for a blog post than a message to here and related announcement lists? 
From chris.jerdonek at gmail.com Sun Jan 25 17:51:09 2015 From: chris.jerdonek at gmail.com (Chris Jerdonek) Date: Sun, 25 Jan 2015 08:51:09 -0800 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: <54C51D38.9000103@egenix.com> References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> <54C4FE68.9070200@egenix.com> <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> <54C51D38.9000103@egenix.com> Message-ID: On Sun, Jan 25, 2015 at 8:43 AM, M.-A. Lemburg wrote: > On 25.01.2015 16:34, Donald Stufft wrote: >> >>> On Jan 25, 2015, at 9:32 AM, M.-A. Lemburg wrote: >>> >>> I think you ought to make a more prominent announcement on >>> c.l.p, c.l.p.a and perhaps a distutils blog (if there is one). >> >> What is c.l.p and c.l.p.a? > > That's short for comp.lang.python and comp.lang.python.announce, > which are gateway'ed to mailing lists: > > https://mail.python.org/mailman/listinfo/python-list > https://mail.python.org/mailman/listinfo/python-announce-list > >> There is no distutils blog. > > Given how quickly things change, I think this would be a good > way of getting the word out to a wider audience than the > distutils mailing list. A Twitter account seems ideal for this type of thing. The bar for following a Twitter account is a lot lower (and easier to do) than following a blog or subscribing to a mailing list, so I think more people could eventually be reached. Retweeting also allows word to be spread more easily and quickly. --Chris From mal at egenix.com Sun Jan 25 17:52:46 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Sun, 25 Jan 2015 17:52:46 +0100 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> <54C4FE68.9070200@egenix.com> <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> <54C51D38.9000103@egenix.com> Message-ID: <54C51F5E.3030603@egenix.com> On 25.01.2015 17:46, Ian Cordasco wrote: > On Sun, Jan 25, 2015 at 10:43 AM, M.-A. 
Lemburg wrote: >> On 25.01.2015 16:34, Donald Stufft wrote: >>> >>>> On Jan 25, 2015, at 9:32 AM, M.-A. Lemburg wrote: >>>> >>>> I think you ought to make a more prominent announcement on >>>> c.l.p, c.l.p.a and perhaps a distutils blog (if there is one). >>> >>> What is c.l.p and c.l.p.a? >> >> That's short for comp.lang.python and comp.lang.python.announce, >> which are gateway'ed to mailing lists: >> >> https://mail.python.org/mailman/listinfo/python-list >> https://mail.python.org/mailman/listinfo/python-announce-list >> >>> There is no distutils blog. >> >> Given how quickly things change, I think this would be a good >> way of getting the word out to a wider audience than the >> distutils mailing list. > > I've only been on this list for about a year. This PEP (and others > like it) has been in motion for quite a while. I think the blog would > miss far more people than the mailing list would and I'm not sure I > agree with "how quickly things change". There doesn't seem to be much > content for the blog and it seems like something that would just > become neglected. What topics do you really think would be better > suited for a blog post than a message to here and related announcement > lists? The blog posts could automatically get copied over to Twitter and mailing lists using e.g. IFTTT, and it would be possible to subscribe using RSS/Atom feeds. The advantage of a blog is having news entries persist and be easy to find, while at the same time simplifying the whole publishing process. Anyway, just a suggestion. -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source (#1, Jan 25 2015) >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> mxODBC Plone/Zope Database Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::::: Try our mxODBC.Connect Python Database Interface for free ! 
:::::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ From graffatcolmingov at gmail.com Sun Jan 25 18:01:12 2015 From: graffatcolmingov at gmail.com (Ian Cordasco) Date: Sun, 25 Jan 2015 11:01:12 -0600 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: <54C51F5E.3030603@egenix.com> References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> <54C4FE68.9070200@egenix.com> <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> <54C51D38.9000103@egenix.com> <54C51F5E.3030603@egenix.com> Message-ID: On Sun, Jan 25, 2015 at 10:52 AM, M.-A. Lemburg wrote: > On 25.01.2015 17:46, Ian Cordasco wrote: >> On Sun, Jan 25, 2015 at 10:43 AM, M.-A. Lemburg wrote: >>> On 25.01.2015 16:34, Donald Stufft wrote: >>>> >>>>> On Jan 25, 2015, at 9:32 AM, M.-A. Lemburg wrote: >>>>> >>>>> I think you ought to make a more prominent announcement on >>>>> c.l.p, c.l.p.a and perhaps a distutils blog (if there is one). >>>> >>>> What is c.l.p and c.l.p.a? >>> >>> That's short for comp.lang.python and comp.lang.python.announce, >>> which are gateway'ed to mailing lists: >>> >>> https://mail.python.org/mailman/listinfo/python-list >>> https://mail.python.org/mailman/listinfo/python-announce-list >>> >>>> There is no distutils blog. >>> >>> Given how quickly things change, I think this would be a good >>> way of getting the word out to a wider audience than the >>> distutils mailing list. >> >> I've only been on this list for about a year. This PEP (and others >> like it) has been in motion for quite a while. I think the blog would >> miss far more people than the mailing list would and I'm not sure I >> agree with "how quickly things change". There doesn't seem to be much >> content for the blog and it seems like something that would just >> become neglected. 
What topics do you really think would be better >> suited for a blog post than a message to here and related announcement >> lists? > > The blog posts could automatically get copied over to > Twitter and mailing lists using e.g. IFTTT, and it would > be possible to subscribe using RSS/Atom feeds. > > The advantage of a blog is having news entries persist and be > easy to find, while at the same time simplifying the whole > publishing process. > > Anyway, just a suggestion. I still don't understand the benefit. You could just as easily have the mailing list mirrored to a blog if we felt it would help but distutils-sig has public archives and (I think) is indexed by Google. Donald, Paul, Nick, Richard, Jason, and everyone else whose name I'm forgetting are already working a lot on PyPI, packaging, pip, setuptools, etc. I don't see the benefit in maintaining the blog on top of their other responsibilities. Maybe if someone else steps up to be an editor and take mailing list posts and turn them into blog posts, that'd be cool, but I don't see the value in making them maintain yet another thing on top of what they do. From donald at stufft.io Sun Jan 25 18:27:34 2015 From: donald at stufft.io (Donald Stufft) Date: Sun, 25 Jan 2015 12:27:34 -0500 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> <54C4FE68.9070200@egenix.com> <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> <54C51D38.9000103@egenix.com> <54C51F5E.3030603@egenix.com> Message-ID: > On Jan 25, 2015, at 12:01 PM, Ian Cordasco wrote: > > On Sun, Jan 25, 2015 at 10:52 AM, M.-A. Lemburg wrote: >> On 25.01.2015 17:46, Ian Cordasco wrote: >>> On Sun, Jan 25, 2015 at 10:43 AM, M.-A. Lemburg wrote: >>>> On 25.01.2015 16:34, Donald Stufft wrote: >>>>> >>>>>> On Jan 25, 2015, at 9:32 AM, M.-A. Lemburg wrote: >>>>>> >>>>>> I think you ought to make a more prominent announcement on >>>>>> c.l.p, c.l.p.a and perhaps a distutils blog (if there is one). 
>>>>> >>>>> What is c.l.p and c.l.p.a? >>>> >>>> That's short for comp.lang.python and comp.lang.python.announce, >>>> which are gateway'ed to mailing lists: >>>> >>>> https://mail.python.org/mailman/listinfo/python-list >>>> https://mail.python.org/mailman/listinfo/python-announce-list >>>> >>>>> There is no distutils blog. >>>> >>>> Given how quickly things change, I think this would be a good >>>> way of getting the word out to a wider audience than the >>>> distutils mailing list. >>> >>> I've only been on this list for about a year. This PEP (and others >>> like it) has been in motion for quite a while. I think the blog would >>> miss far more people than the mailing list would and I'm not sure I >>> agree with "how quickly things change". There doesn't seem to be much >>> content for the blog and it seems like something that would just >>> become neglected. What topics do you really think would be better >>> suited for a blog post than a message to here and related announcement >>> lists? >> >> The blog posts could automatically get copied over to >> Twitter and mailing lists using e.g. IFTTT, and it would >> be possible to subscribe using RSS/Atom feeds. >> >> The advantage of a blog is having news entries persist and be >> easy to find, while at the same time simplifying the whole >> publishing process. >> >> Anyway, just a suggestion. > > I still don't understand the benefit. You could just as easily have > the mailing list mirrored to a blog if we felt it would help but > distutils-sig has public archives and (I think) is indexed by Google. > Donald, Paul, Nick, Richard, Jason, and everyone else whose name I'm > forgetting are already working a lot on PyPI, packaging, pip, > setuptools, etc. I don't see the benefit in maintaining the blog on > top of their other responsibilities. 
Maybe if someone else steps up to > be an editor and take mailing list posts and turn them into blog > posts, that'd be cool, but I don't see the value in making them > maintain yet another thing on top of what they do. I've thought about doing something like this https://devcenter.heroku.com/changelog but I hadn't had the time to actually do anything to make it a thing. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From randy at thesyrings.us Sun Jan 25 20:45:11 2015 From: randy at thesyrings.us (Randy Syring) Date: Sun, 25 Jan 2015 14:45:11 -0500 Subject: [Distutils] PyPA Announcements & Twitter Message-ID: <54C547C7.3060102@thesyrings.us> I'd like to pick up a conversation that was happening on another thread: >> I've only been on this list for about a year. This PEP (and others >> like it) has been in motion for quite a while. I think the blog would >> miss far more people than the mailing list would and I'm not sure I >> agree with "how quickly things change". There doesn't seem to be much >> content for the blog and it seems like something that would just >> become neglected. What topics do you really think would be better >> suited for a blog post than a message to here and related announcement >> lists? > > The blog posts could automatically get copied over to > Twitter and mailing lists using e.g. IFTTT, and it would > be possible to subscribe using RSS/Atom feeds. > > The advantage of a blog is having news entries persist and be > easy to find, while at the same time simplifying the whole > publishing process. > > Anyway, just a suggestion. pypa.io also says: > They host projects on github and bitbucket , and discuss issues on the pypa-dev and distutils-sig mailing lists. I think the issue with the current mailing lists is one of signal-to-noise ratio. Let me give you a little background: I maintain a small number of pypi published packages.
I'm very interested in using best practices and keeping up with the more significant changes in the Python packaging ecosystem. I joined the distutils-sig list to stay informed. However, the more detailed discussions that happen on the distutils-sig list are not of much value to me. I know they are important and I very much appreciate those who are pushing things forward, it's just too much detail for my context. So, I'm on the distutils-sig list, but there is a lot of "noise" for me and not much "signal." I would rather be a part of some channel for receiving announcements that relate to the broader community of those who maintain & release packages and need to stay abreast of the Python packaging ecosystem & best practice changes. Therefore, I'd like to propose something like a packaging-announce mailing list. This could be linked to the PyPA Twitter account. Then, announcements that pertain to the broader Python community would be cross-posted to the distutils-sig and packaging-announce (and therefore Twitter). The recent post to distutils-sig "PyPI Rendering Switched" would be a great example of something that should be more broadly announced. If there is no one else currently working in the PyPA admin world who wants to do it, I'd be willing to set up and maintain the packaging-announce list as well as work on linking that list so it automatically posts to the Twitter account (and take care of manually posting in the meantime). Then, the question becomes, what gets put on packaging-announce. My guess is that it won't be hard to figure out who should have the authority to post and the overhead for making the announcement becomes exceptionally small, simply add another "to" line on the email that would have been sent to distutils-sig anyway. Feedback welcome. Thanks. *Randy Syring* Husband | Father | Redeemed Sinner /"For what does it profit a man to gain the whole world and forfeit his soul?"
(Mark 8:36 ESV)/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.jerdonek at gmail.com Sun Jan 25 20:52:51 2015 From: chris.jerdonek at gmail.com (Chris Jerdonek) Date: Sun, 25 Jan 2015 11:52:51 -0800 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: <54C547C7.3060102@thesyrings.us> References: <54C547C7.3060102@thesyrings.us> Message-ID: On Sun, Jan 25, 2015 at 11:45 AM, Randy Syring wrote: > I'd like to pick up a conversation that was happening on another thread: > > I've only been on this list for about a year. This PEP (and others > like it) has been in motion for quite a while. I think the blog would > miss far more people than the mailing list would and I'm not sure I > agree with "how quickly things change". There doesn't seem to be much > content for the blog and it seems like something that would just > become neglected. What topics do you really think would be better > suited for a blog post than a message to here and related announcement > lists? > > The blog posts could automatically get copied over to > Twitter and mailing lists using e.g. IFTTT, and it would > be possible to subscribe using RSS/Atom feeds. > > The advantage of a blog is having news entries persist and be > easy to find, while at the same time simplifying the whole > publishing process. > > Anyway, just a suggestion. > > > pypa.io also says: > > They host projects on github and bitbucket, and discuss issues on the > pypa-dev and distutils-sig mailing lists. > > > I think the issue with the current mailing lists is one of signal-to-noise > ratio. Let me give you a little background: > > I maintain a small number of pypi published packages. I'm very interested > in using best practices and keeping up with the more significant changes in > the Python packaging ecosystem. I joined the distutils-sig list to stay > informed. > > However, the more detailed discussions that happen on the distutils-sig list > are not of much value to me. 
I know they are important and I very much > appreciate those who are pushing things forward, it's just too much detail > for my context. > > So, I'm on the distutils-sig list, but there is a lot of "noise" for me and > not much "signal." I would rather be a part of some channel for receiving > announcements that relate to the broader community of those who maintain & > release packages and need to stay abreast of the Python packaging ecosystem & best > practice changes. > > Therefore, I'd like to propose something like a packaging-announce mailing > list. This could be linked to the PyPA Twitter account. Just to clarify, PyPA doesn't have a Twitter account yet, right? --Chris From randy at thesyrings.us Sun Jan 25 20:57:54 2015 From: randy at thesyrings.us (Randy Syring) Date: Sun, 25 Jan 2015 14:57:54 -0500 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: References: <54C547C7.3060102@thesyrings.us> Message-ID: <54C54AC2.2020300@thesyrings.us> On 01/25/2015 02:52 PM, Chris Jerdonek wrote: > Just to clarify, PyPA doesn't have a Twitter account yet, right? --Chris They do: https://twitter.com/thepypa *Randy Syring* Husband | Father | Redeemed Sinner /"For what does it profit a man to gain the whole world and forfeit his soul?" (Mark 8:36 ESV)/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From msabramo at gmail.com Sat Jan 24 23:11:04 2015 From: msabramo at gmail.com (Marc Abramowitz) Date: Sat, 24 Jan 2015 14:11:04 -0800 Subject: [Distutils] PyPI Rendering Switched In-Reply-To: References: Message-ID: <2DF0D744-935F-47AF-835D-B62C34E01D7F@gmail.com> > On Jan 24, 2015, at 12:43 PM, Donald Stufft wrote: > > I've switched the rendering on PyPI away from using a big blob of code that > was embedded into PyPI to the readme library (https://pypi.python.org/pypi/readme). *Applause* This is awesome news! Thanks for working on this!
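[The rendering behavior discussed in this thread — run the reST description through the readme library, and (as Donald notes later in the thread) fall back to escaped HTML on error conditions instead of just failing to render — can be illustrated with a stdlib-only sketch. The `render_description` helper and its `renderer` argument below are hypothetical stand-ins for the library's actual entry point; this shows only the fallback logic, not the library's implementation.

```python
import html


def render_description(raw, renderer=None):
    """Sketch of a render-or-escape fallback for a long_description.

    `renderer` is a hypothetical stand-in for a reST renderer such as
    the readme library's; assume it returns rendered HTML on success
    and returns None (or raises) on a rendering failure.
    """
    if renderer is not None:
        try:
            rendered = renderer(raw)
            if rendered is not None:
                return rendered
        except Exception:
            # Rendering failed: fall through to the escaped fallback.
            pass
    # Serve the raw text HTML-escaped instead of failing outright.
    return "<pre>{}</pre>".format(html.escape(raw))


# With no renderer (or a failing one), the text comes back escaped, not dropped.
print(render_description("Unclosed `literal & <tag>"))
```

The point of the fallback is that a project page always shows *something* safe for a broken README, rather than an error. — ed.]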
Marc From pydanny at gmail.com Sat Jan 24 22:16:50 2015 From: pydanny at gmail.com (Daniel Greenfeld) Date: Sat, 24 Jan 2015 13:16:50 -0800 Subject: [Distutils] PyPI Rendering Switched In-Reply-To: References: Message-ID: Thank you Donald! This is wonderful! On Sat, Jan 24, 2015 at 12:43 PM, Donald Stufft wrote: > I've switched the rendering on PyPI away from using a big blob of code that > was embedded into PyPI to the readme library (https://pypi.python.org/pypi/readme). > Currently this only has a single API (``import readme.rst; readme.rst.render(...)``) > however in the near future I want to provide a CLI for this so people can > more easily test their READMEs against how PyPI renders them. > > The only real change should be some slight theming differences. Some things > might be valid that didn't used to be valid, and certain error conditions will > now render escaped HTML instead of just failing to render. > > Note, for the next 24 hours you may need to force refresh your browser. > > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig -- 'Knowledge is Power' Daniel Greenfeld Principal at Cartwheel Web; co-author of Two Scoops of Django twoscoopspress.org | pydanny.com From msabramo at gmail.com Sun Jan 25 17:21:23 2015 From: msabramo at gmail.com (Marc Abramowitz) Date: Sun, 25 Jan 2015 08:21:23 -0800 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> <54C4FE68.9070200@egenix.com> <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> Message-ID: On Jan 25, 2015, at 7:34 AM, Donald Stufft wrote: > > >> On Jan 25, 2015, at 9:32 AM, M.-A.
Lemburg wrote: >> >> I think you ought to make a more prominent announcement on >> c.l.p, c.l.p.a and perhaps a distutils blog (if there is one). > > What is c.l.p and c.l.p.a? > > There is no distutils blog. How about the PyPA Twitter account? From msabramo at gmail.com Sun Jan 25 21:03:20 2015 From: msabramo at gmail.com (Marc Abramowitz) Date: Sun, 25 Jan 2015 12:03:20 -0800 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: References: <54C547C7.3060102@thesyrings.us> Message-ID: Actually, they do: https://twitter.com/ThePyPA But I don't know if they are in the habit of thinking to use it every time something worthy happens. On Sun, Jan 25, 2015 at 11:52 AM, Chris Jerdonek wrote: > On Sun, Jan 25, 2015 at 11:45 AM, Randy Syring > wrote: > > I'd like to pick up a conversation that was happening on another thread: > > > > I've only been on this list for about a year. This PEP (and others > > like it) has been in motion for quite a while. I think the blog would > > miss far more people than the mailing list would and I'm not sure I > > agree with "how quickly things change". There doesn't seem to be much > > content for the blog and it seems like something that would just > > become neglected. What topics do you really think would be better > > suited for a blog post than a message to here and related announcement > > lists? > > > > The blog posts could automatically get copied over to > > Twitter and mailing lists using e.g. IFTTT, and it would > > be possible to subscribe using RSS/Atom feeds. > > > > The advantage of a blog is having news entries persist and be > > easy to find, while at the same time simplifying the whole > > publishing process. > > > > Anyway, just a suggestion. > > > > > > pypa.io also says: > > > > They host projects on github and bitbucket, and discuss issues on the > > pypa-dev and distutils-sig mailing lists. > > > > > > I think the issue with the current mailing lists is one of > signal-to-noise > > ratio. 
Let me give you a little background: > > > > I maintain a small number of pypi published packages. I'm very > interested > > in using best practices and keeping up with the more significant changes > in > > the Python packaging ecosystem. I joined the distutils-sig list to stay > > informed. > > > > However, the more detailed discussions that happen on the distutils-sig > list > > are not of much value to me. I know they are important and I very much > > appreciate those who are pushing things forward, it's just too much > detail > > for my context. > > > > So, I'm on the distutils-sig list, but there is a lot of "noise" for me > and > > not much "signal." I would rather be a part of some channel for > receiving > > announcements that relate to the broader community of those who maintain > & > > release packages and need to stay abreast of the Python packaging ecosystem & > best > > practice changes. > > > > Therefore, I'd like to propose something like a packaging-announce > mailing > > list. This could be linked to the PyPA Twitter account. > > Just to clarify, PyPA doesn't have a Twitter account yet, right? > > --Chris > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From msabramo at gmail.com Sun Jan 25 21:33:47 2015 From: msabramo at gmail.com (Marc Abramowitz) Date: Sun, 25 Jan 2015 12:33:47 -0800 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: <54C547C7.3060102@thesyrings.us> References: <54C547C7.3060102@thesyrings.us> Message-ID: > On Jan 25, 2015, at 11:45 AM, Randy Syring wrote: > > Therefore, I'd like to propose something like a packaging-announce mailing list. This could be linked to the PyPA Twitter account.
Then, announcements that pertain to the broader Python community would be cross posted to the distutils-sig and packaging-announce (and therefore twitter). +1 to this. I think the key observation here is that there are two audiences when it comes to packaging: 1. A large audience of folks who are package authors and maintainers, who want to know about the latest developments and best practices. 2. The smaller audience of folks who know packaging well and are working hard to improve it and having discussions among themselves on how to best move things forward. I don't know of an information channel that serves #1 well. The ones I know of try to handle both. I think having different channels for each group will improve the experience for both groups. Randy, I'm willing to help if you want. Marc From xav.fernandez at gmail.com Sun Jan 25 22:33:58 2015 From: xav.fernandez at gmail.com (Xavier Fernandez) Date: Sun, 25 Jan 2015 22:33:58 +0100 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: <54C54AC2.2020300@thesyrings.us> References: <54C547C7.3060102@thesyrings.us> <54C54AC2.2020300@thesyrings.us> Message-ID: Maybe those announcements could simply/also go to https://www.python.org/blogs/ ? On Sun, Jan 25, 2015 at 8:57 PM, Randy Syring wrote: > On 01/25/2015 02:52 PM, Chris Jerdonek wrote: > > Just to clarify, PyPA doesn't have a Twitter account yet, right? --Chris > > > They do: https://twitter.com/thepypa > > > *Randy Syring* > Husband | Father | Redeemed Sinner > > > *"For what does it profit a man to gain the whole world and forfeit his > soul?" (Mark 8:36 ESV)* > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ncoghlan at gmail.com Mon Jan 26 12:46:13 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 26 Jan 2015 21:46:13 +1000 Subject: [Distutils] PEP 440 on PyPI In-Reply-To: References: <3E793206-0A4E-4591-A4D9-4CF648FDC79A@stufft.io> <54C4FE68.9070200@egenix.com> <58BEA451-2274-47F0-80F6-A4E31916DFE8@stufft.io> <54C51D38.9000103@egenix.com> <54C51F5E.3030603@egenix.com> Message-ID: On 26 January 2015 at 03:27, Donald Stufft wrote: > >> On Jan 25, 2015, at 12:01 PM, Ian Cordasco wrote: >> >> On Sun, Jan 25, 2015 at 10:52 AM, M.-A. Lemburg wrote: >>> The advantage of a blog is having news entries persist and be >>> easy to find, while at the same time simplifying the whole >>> publishing process. >>> >>> Anyway, just a suggestion. >> >> I still don't understand the benefit. You could just as easily have >> the mailing list mirrored to a blog if we felt it would help but >> distutils-sig has public archives and (I think) is indexed by Google. >> Donald, Paul, Nick, Richard, Jason, and everyone else whose name I'm >> forgetting are already working a lot on PyPI, packaging, pip, >> setuptools, etc. I don't see the benefit in maintaining the blog on >> top of their other responsibilities. Maybe if someone else steps up to >> be an editor and take mailing list posts and turn them into blog >> posts, that'd be cool, but I don't see the value in making them >> maintain yet another thing on top of what they do. > > I've thought about doing something like this https://devcenter.heroku.com/changelog > but I hadn't had the time to actually do anything to make it a thing. I think that's probably the kind of thing that would be most useful, rather than a full blog. It could link back to the relevant distutils-sig posts in the archives for additional context. The key is to have a place to track and report changes to PyPI-the-service rather than PyPI-the-software-project or PyPA-the-ecosystem.
Given that "single source of truth", we can figure out how best to send it other places (e.g. RSS, Twitter). (RSS in particular could be syndicated to Planet Python) As far as "why isn't the mailing list enough?" goes, the problem with the list is that it also handles a lot of *other* traffic. Similarly, @pypi on Twitter reports all the package updates, so it's too noisy to be particularly useful to report "Hey, we just changed something on PyPI, if you get unexpected behaviour, check if it may be related!" While having such a feed available won't necessarily stop people from being surprised the first time they encounter an unexpected change, being able to answer "How can I avoid being surprised like this again?" with "Subscribe to this tightly focused RSS feed" can make the difference between a genuinely unhappy user (they got a nasty surprise, and have no low investment way to reduce the risk of it happening again in the future), and a temporarily annoyed one (they still got a nasty surprise, but they also learned about a useful information feed highly relevant to their interests). Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Mon Jan 26 12:54:03 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 26 Jan 2015 21:54:03 +1000 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: References: <54C547C7.3060102@thesyrings.us> Message-ID: On 26 January 2015 at 06:33, Marc Abramowitz wrote: > >> On Jan 25, 2015, at 11:45 AM, Randy Syring wrote: >> >> Therefore, I'd like to propose something like a packaging-announce mailing list. This could be linked to the PyPA Twitter account. Then, announcements that pertain to the broader Python community would be cross posted to the distutils-sig and packaging-announce (and therefore twitter). > > +1 to this. > > I think the key observation here is that there are two audiences when it comes to packaging: > > 1. 
A large audience of folks who are package authors and maintainers, who want to know about the latest developments and best practices. > > 2. The smaller audience of folks who know packaging well and are working hard to improve it and having discussions among themselves on how to best move things forward. > > I don't know of an information channel that serves #1 well. The ones I know of try to handle both. I think having different channels for each group will improve the experience for both groups. I also had a look at the current contents of the python-announce list, and I suspect even that would be a bit high volume if we had significant PyPI and core packaging toolchain announcements intermingled with the other existing announcements related to conferences and various popular Python packages. So a packaging toolchain specific changelog/announcement channel likely makes sense. Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From randy at thesyrings.us Mon Jan 26 17:33:53 2015 From: randy at thesyrings.us (Randy Syring) Date: Mon, 26 Jan 2015 11:33:53 -0500 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: References: <54C547C7.3060102@thesyrings.us> Message-ID: <54C66C71.4090603@thesyrings.us> On 01/26/2015 06:54 AM, Nick Coghlan wrote: > I also had a look at the current contents of the python-announce list, > and I suspect even that would be a bit high volume if we had > significant PyPI and core packaging toolchain announcements > intermingled with the other existing announcements related to > conferences and various popular Python packages. FWIW, I did the same and ended up where you did. > > So a packaging toolchain specific changelog/announcement channel > likely makes sense. 
> Based on another of Nick's posts, it seems like this mailing list would want to include "significant" news/announcements for the following areas: * PyPI-the-service * PyPI-the-software-project * PyPA-the-ecosystem IMO, these areas are often linked and someone who is interested in one of them is likely interested in the others or, at least, not very annoyed at the prospect of seeing announcements related to them. So, setting up the list and the twitter connection is likely pretty easy. However, it won't work unless those who are doing the work in these areas are willing to "flag" the changes happening that need to be communicated on the packaging-announce list. Who would those people be and do they like this idea enough to be willing to support it? Is there some kind of PyPA board or committee that would need to be a part of this decision? *Randy Syring* Husband | Father | Redeemed Sinner /"For what does it profit a man to gain the whole world and forfeit his soul?" (Mark 8:36 ESV)/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From qwcode at gmail.com Mon Jan 26 18:14:31 2015 From: qwcode at gmail.com (Marcus Smith) Date: Mon, 26 Jan 2015 09:14:31 -0800 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: References: <54C547C7.3060102@thesyrings.us> Message-ID: +1 to a pypa-announce list. I personally care more about the list than twitter. at this point, probably need to post the idea to pypa-dev, and get a few +1's there, and get someone to agree to execute on the idea. there's a few people there that don't monitor distutils-sig Marcus On Sun, Jan 25, 2015 at 12:33 PM, Marc Abramowitz wrote: > > > On Jan 25, 2015, at 11:45 AM, Randy Syring wrote: > > > > Therefore, I'd like to propose something like a packaging-announce > mailing list. This could be linked to the PyPA Twitter account. 
Then, > announcements that pertain to the broader Python community would be cross > posted to the distutils-sig and packaging-announce (and therefore twitter). > > +1 to this. > > I think the key observation here is that there are two audiences when it > comes to packaging: > > 1. A large audience of folks who are package authors and maintainers, who > want to know about the latest developments and best practices. > > 2. The smaller audience of folks who know packaging well and are working > hard to improve it and having discussions among themselves on how to best > move things forward. > > I don't know of an information channel that serves #1 well. The ones I > know of try to handle both. I think having different channels for each > group will improve the experience for both groups. > > Randy, I'm willing to help if you want. > > Marc > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From randy at thesyrings.us Mon Jan 26 18:36:05 2015 From: randy at thesyrings.us (Randy Syring) Date: Mon, 26 Jan 2015 12:36:05 -0500 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: References: <54C547C7.3060102@thesyrings.us> Message-ID: <54C67B05.2020905@thesyrings.us> On 01/26/2015 12:14 PM, Marcus Smith wrote: > +1 to a pypa-announce list. > I personally care more about the list than twitter. > at this point, probably need to post the idea to pypa-dev, and get a > few +1's there, and get someone to agree to execute on the idea. > there's a few people there that don't monitor distutils-sig > Marcus > > Thanks for the pointer. I put a post out on pypa-dev. 
Randy From jannis at leidel.info Mon Jan 26 19:27:31 2015 From: jannis at leidel.info (Jannis Leidel) Date: Mon, 26 Jan 2015 19:27:31 +0100 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: References: <54C547C7.3060102@thesyrings.us> Message-ID: <1FA56BCD-4E85-4CBF-8D0C-C0EB66C3D78D@leidel.info> > On 25 Jan 2015, at 21:03, Marc Abramowitz wrote: > > Actually, they do: > > https://twitter.com/ThePyPA > > But I don't know if they are in the habit of thinking to use it every time something worthy happens. Currently this is a one-man operation that could probably be hooked up with an RSS feed, to be honest. So not something that you should rely on for anything important. But I'm +1 on pypa-announce, so I just created it over here: https://groups.google.com/forum/#!forum/pypa-announce Jannis > On Sun, Jan 25, 2015 at 11:52 AM, Chris Jerdonek wrote: > On Sun, Jan 25, 2015 at 11:45 AM, Randy Syring wrote: > > I'd like to pick up a conversation that was happening on another thread: > > > > I've only been on this list for about a year. This PEP (and others > > like it) has been in motion for quite a while. I think the blog would > > miss far more people than the mailing list would and I'm not sure I > > agree with "how quickly things change". There doesn't seem to be much > > content for the blog and it seems like something that would just > > become neglected. What topics do you really think would be better > > suited for a blog post than a message to here and related announcement > > lists? > > > > The blog posts could automatically get copied over to > > Twitter and mailing lists using e.g. IFTTT, and it would > > be possible to subscribe using RSS/Atom feeds. > > > > The advantage of a blog is having news entries persist and be > > easy to find, while at the same time simplifying the whole > > publishing process. > > > > Anyway, just a suggestion. 
> > > > > > pypa.io also says: > > > > They host projects on github and bitbucket, and discuss issues on the > > pypa-dev and distutils-sig mailing lists. > > > > > > I think the issue with the current mailing lists is one of signal-to-noise > > ratio. Let me give you a little background: > > > > I maintain a small number of pypi published packages. I'm very interested > > in using best practices and keeping up with the more significant changes in > > the Python packaging ecosystem. I joined the distutils-sig list to stay > > informed. > > > > However, the more detailed discussions that happen on the distutils-sig list > > are not of much value to me. I know they are important and I very much > > appreciate those who are pushing things forward, it's just too much detail > > for my context. > > > > So, I'm on the distituls-sig list, but there is a lot of "noise" for me and > > not much "signal." I would rather be a part of some channel for receiving > > announcements that relate to the broader community of those who maintain & > > release packages and need to stay abreast of python packing ecosystem & best > > practice changes. > > > > Therefore, I'd like to propose something like a packaging-announce mailing > > list. This could be linked to the PyPA Twitter account. > > Just to clarify, PyPA doesn't have a Twitter account yet, right? > > --Chris > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From jannis at leidel.info Mon Jan 26 20:12:41 2015 From: jannis at leidel.info (Jannis Leidel) Date: Mon, 26 Jan 2015 20:12:41 +0100 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: <54C690B4.6070608@thesyrings.us> References: <54C547C7.3060102@thesyrings.us> <1FA56BCD-4E85-4CBF-8D0C-C0EB66C3D78D@leidel.info> <54C690B4.6070608@thesyrings.us> Message-ID: <4BB93F9C-4932-424F-93E5-04AA50F879EF@leidel.info> > On 26 Jan 2015, at 20:08, Randy Syring wrote: > > On 01/26/2015 01:27 PM, Jannis Leidel wrote: >>> On 25 Jan 2015, at 21:03, Marc Abramowitz >>> wrote: >>> >>> Actually, they do: >>> >>> >>> https://twitter.com/ThePyPA >>> >>> >>> But I don't know if they are in the habit of thinking to use it every time something worthy happens. >>> >> Currently this is a one man operation that could probably be hooked up with a RSS feed to be honest. So not something that you should rely on for anything important. >> >> But I'm +1 on pypa-announce, so I just created it over here: >> https://groups.google.com/forum/#!forum/pypa-announce >> >> >> Jannis >> > > I think it's important that the twitter account be used for to promote announcements on pypa-announce. Twitter can be shared/retweeted much easier and you are going to see a better spreading of messages that way. That was why I suggested we figure out a way to link the list to the twitter account automatically. > > Who "owns" the twitter account? Are they ok with having pypa announcements show up there? I do, and yeah, I'm okay for it to show up there. I'll set up a feed2twitter thing. I asked around a few months ago when I registered it if others in the PyPA team want access to it for the occasional manual announcement, but I agree that having a mailing list may be less controversial for the receiving parties. 
I didn't get many positive responses from the others anyway :) Jannis From randy at thesyrings.us Mon Jan 26 20:22:39 2015 From: randy at thesyrings.us (Randy Syring) Date: Mon, 26 Jan 2015 14:22:39 -0500 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: <4BB93F9C-4932-424F-93E5-04AA50F879EF@leidel.info> References: <54C547C7.3060102@thesyrings.us> <1FA56BCD-4E85-4CBF-8D0C-C0EB66C3D78D@leidel.info> <54C690B4.6070608@thesyrings.us> <4BB93F9C-4932-424F-93E5-04AA50F879EF@leidel.info> Message-ID: <54C693FF.8050108@thesyrings.us> On 01/26/2015 02:12 PM, Jannis Leidel wrote: >> On 26 Jan 2015, at 20:08, Randy Syring wrote: >> >> On 01/26/2015 01:27 PM, Jannis Leidel wrote: >>>> On 25 Jan 2015, at 21:03, Marc Abramowitz >>>> wrote: >>>> >>>> Actually, they do: >>>> >>>> >>>> https://twitter.com/ThePyPA >>>> >>>> >>>> But I don't know if they are in the habit of thinking to use it every time something worthy happens. >>>> >>> Currently this is a one man operation that could probably be hooked up with a RSS feed to be honest. So not something that you should rely on for anything important. >>> >>> But I'm +1 on pypa-announce, so I just created it over here: >>> https://groups.google.com/forum/#!forum/pypa-announce >>> >>> >>> Jannis >>> >> I think it's important that the twitter account be used for to promote announcements on pypa-announce. Twitter can be shared/retweeted much easier and you are going to see a better spreading of messages that way. That was why I suggested we figure out a way to link the list to the twitter account automatically. >> >> Who "owns" the twitter account? Are they ok with having pypa announcements show up there? > I do, and yeah, I'm okay for it to show up there. I'll set up a feed2twitter thing. I asked around a few months ago when I registered it if others in the PyPA team want access to it for the occasional manual announcement, but I agree that having a mailing list may be less controversial for the receiving parties. 
I didn't get many positive responses from the others anyway :) > > Jannis > It seems this discussion might now be more appropriate on the pypa-dev list. I'll pick it up with you there. *Randy Syring* Husband | Father | Redeemed Sinner /"For what does it profit a man to gain the whole world and forfeit his soul?" (Mark 8:36 ESV)/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Tue Jan 27 07:30:20 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 27 Jan 2015 16:30:20 +1000 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: <4BB93F9C-4932-424F-93E5-04AA50F879EF@leidel.info> References: <54C547C7.3060102@thesyrings.us> <1FA56BCD-4E85-4CBF-8D0C-C0EB66C3D78D@leidel.info> <54C690B4.6070608@thesyrings.us> <4BB93F9C-4932-424F-93E5-04AA50F879EF@leidel.info> Message-ID: On 27 January 2015 at 05:12, Jannis Leidel wrote: >> Who "owns" the twitter account? Are they ok with having pypa announcements show up there? > > I do, and yeah, I'm okay for it to show up there. I'll set up a feed2twitter thing. I asked around a few months ago when I registered it if others in the PyPA team want access to it for the occasional manual announcement, but I agree that having a mailing list may be less controversial for the receiving parties. I didn't get many positive responses from the others anyway :) Thanks for tackling this folks. Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From brett at python.org Tue Jan 27 16:34:02 2015 From: brett at python.org (Brett Cannon) Date: Tue, 27 Jan 2015 15:34:02 +0000 Subject: [Distutils] PyPA Announcements & Twitter References: <54C547C7.3060102@thesyrings.us> <1FA56BCD-4E85-4CBF-8D0C-C0EB66C3D78D@leidel.info> Message-ID: On Mon Jan 26 2015 at 1:29:50 PM Jannis Leidel wrote: > > > On 25 Jan 2015, at 21:03, Marc Abramowitz wrote: > > > > Actually, they do: > > > > https://twitter.com/ThePyPA > > > > But I don't know if they are in the habit of thinking to use it every > time something worthy happens. > > Currently this is a one man operation that could probably be hooked up > with a RSS feed to be honest. So not something that you should rely on for > anything important. > > But I'm +1 on pypa-announce, so I just created it over here: > https://groups.google.com/forum/#!forum/pypa-announce > > So is that it then and this is where it's going to live and I just go and subscribe, or is more of a discussion going to happen on pypa-dev and there will be some announcement email over here when this is official? -Brett > Jannis > > > On Sun, Jan 25, 2015 at 11:52 AM, Chris Jerdonek < > chris.jerdonek at gmail.com> wrote: > > On Sun, Jan 25, 2015 at 11:45 AM, Randy Syring > wrote: > > > I'd like to pick up a conversation that was happening on another > thread: > > > > > > I've only been on this list for about a year. This PEP (and others > > > like it) has been in motion for quite a while. I think the blog would > > > miss far more people than the mailing list would and I'm not sure I > > > agree with "how quickly things change". There doesn't seem to be much > > > content for the blog and it seems like something that would just > > > become neglected. What topics do you really think would be better > > > suited for a blog post than a message to here and related announcement > > > lists? 
> > > > > > The blog posts could automatically get copied over to > > > Twitter and mailing lists using e.g. IFTTT, and it would > > > be possible to subscribe using RSS/Atom feeds. > > > > > > The advantage of a blog is having news entries persist and be > > > easy to find, while at the same time simplifying the whole > > > publishing process. > > > > > > Anyway, just a suggestion. > > > > > > > > > pypa.io also says: > > > > > > They host projects on github and bitbucket, and discuss issues on the > > > pypa-dev and distutils-sig mailing lists. > > > > > > > > > I think the issue with the current mailing lists is one of > signal-to-noise > > > ratio. Let me give you a little background: > > > > > > I maintain a small number of pypi published packages. I'm very > interested > > > in using best practices and keeping up with the more significant > changes in > > > the Python packaging ecosystem. I joined the distutils-sig list to > stay > > > informed. > > > > > > However, the more detailed discussions that happen on the > distutils-sig list > > > are not of much value to me. I know they are important and I very much > > > appreciate those who are pushing things forward, it's just too much > detail > > > for my context. > > > > > > So, I'm on the distituls-sig list, but there is a lot of "noise" for > me and > > > not much "signal." I would rather be a part of some channel for > receiving > > > announcements that relate to the broader community of those who > maintain & > > > release packages and need to stay abreast of python packing ecosystem > & best > > > practice changes. > > > > > > Therefore, I'd like to propose something like a packaging-announce > mailing > > > list. This could be linked to the PyPA Twitter account. > > > > Just to clarify, PyPA doesn't have a Twitter account yet, right? 
> > > > --Chris > > _______________________________________________ > > Distutils-SIG maillist - Distutils-SIG at python.org > > https://mail.python.org/mailman/listinfo/distutils-sig > > > > _______________________________________________ > > Distutils-SIG maillist - Distutils-SIG at python.org > > https://mail.python.org/mailman/listinfo/distutils-sig > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From randy at thesyrings.us Mon Jan 26 20:08:36 2015 From: randy at thesyrings.us (Randy Syring) Date: Mon, 26 Jan 2015 14:08:36 -0500 Subject: [Distutils] PyPA Announcements & Twitter In-Reply-To: <1FA56BCD-4E85-4CBF-8D0C-C0EB66C3D78D@leidel.info> References: <54C547C7.3060102@thesyrings.us> <1FA56BCD-4E85-4CBF-8D0C-C0EB66C3D78D@leidel.info> Message-ID: <54C690B4.6070608@thesyrings.us> On 01/26/2015 01:27 PM, Jannis Leidel wrote: >> On 25 Jan 2015, at 21:03, Marc Abramowitz wrote: >> >> Actually, they do: >> >> https://twitter.com/ThePyPA >> >> But I don't know if they are in the habit of thinking to use it every time something worthy happens. > Currently this is a one man operation that could probably be hooked up with a RSS feed to be honest. So not something that you should rely on for anything important. > > But I'm +1 on pypa-announce, so I just created it over here: https://groups.google.com/forum/#!forum/pypa-announce > > Jannis I think it's important that the twitter account be used to promote announcements on pypa-announce. Tweets can be shared/retweeted much more easily, and you are going to see a better spread of messages that way. That was why I suggested we figure out a way to link the list to the twitter account automatically. Who "owns" the twitter account? Are they ok with having pypa announcements show up there? 
*Randy Syring* Husband | Father | Redeemed Sinner /"For what does it profit a man to gain the whole world and forfeit his soul?" (Mark 8:36 ESV)/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at simplistix.co.uk Thu Jan 29 12:55:43 2015 From: chris at simplistix.co.uk (Chris Withers) Date: Thu, 29 Jan 2015 11:55:43 +0000 Subject: [Distutils] binary wheels, linux and ucs2/4 Message-ID: <54CA1FBF.4010902@simplistix.co.uk> Hi All, So, we have an internal wheel repo where we put binary wheels for those that need them. We just hit an issue where UCS4 machines have started pulling down a PyCrypto wheel called pycrypto-2.6.1-cp27-none-linux_x86_64.whl, which was built for UCS2, resulting in ugly exceptions along the lines of: Crypto/Cipher/_ARC4.so: undefined symbol: PyUnicodeUCS2_FromString I had a hunt and found: https://bitbucket.org/pypa/wheel/issue/101/python-wheels-need-ucs-tag-on-2x https://bitbucket.org/pypa/wheel/issue/63/backport-soabi-to-python-2 ...but I'm none the wiser. What's the state of play with this? How can we build PyCrypto for Python 2, both UCS2 and UCS4, and have them both in the same repo and have pip pick the right one? cheers, Chris From p.f.moore at gmail.com Thu Jan 29 13:45:36 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 29 Jan 2015 12:45:36 +0000 Subject: [Distutils] binary wheels, linux and ucs2/4 In-Reply-To: <54CA1FBF.4010902@simplistix.co.uk> References: <54CA1FBF.4010902@simplistix.co.uk> Message-ID: On 29 January 2015 at 11:55, Chris Withers wrote: > ...but I'm none the wiser. What's the state of play with this? Currently, the tags don't include the UCS setting. "It's being worked on" (as with most other aspects of Linux ABI compatibility). > How can we build PyCrypto for Python 2, both UCS2 and UCS4, and have them > both in the same repo and have pip pick the right one? At the moment, you can't, you'll need separate UCS2 and UCS4 repos. 
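As a stop-gap while the repos are split, a machine can at least detect which flavour of interpreter it is running, e.g. when deciding which internal repo to point pip at. A sketch using only the stdlib (the variable name is invented for the example):

```python
# Quick check of which flavour the running interpreter was built with:
# narrow (UCS2) CPython 2 builds cap sys.maxunicode at 0xFFFF, while
# wide (UCS4) builds report the full 0x10FFFF range.
import sys

unicode_width = 'ucs4' if sys.maxunicode > 0xffff else 'ucs2'
print(unicode_width)
```

Note that narrow builds were removed entirely in Python 3.3 by PEP 393, so this distinction only arises for the Python 2 interpreters discussed here.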
Paul From ncoghlan at gmail.com Thu Jan 29 14:37:26 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 29 Jan 2015 23:37:26 +1000 Subject: [Distutils] binary wheels, linux and ucs2/4 In-Reply-To: References: <54CA1FBF.4010902@simplistix.co.uk> Message-ID: On 29 January 2015 at 22:45, Paul Moore wrote: > On 29 January 2015 at 11:55, Chris Withers wrote: >> ...but I'm none the wiser. What's the state of play with this? > > Currently, the tags don't include the UCS setting. "It's being worked > on" (as with most other aspects of Linux ABI compatibility). It's more accurate to say "we're open to suggestions". My last idea (deriving a more specific platform tag from /etc/os-release) was shot down for several entirely valid reasons, so there isn't currently a concrete proposal on the table. I'm pretty sure we're all agreed that the current definition of the platform tag in PEP 425 is entirely inadequate for *nix systems other than Mac OS X, but a lot of us have other options there to use as a workaround or because it's what we were doing anyway (like using system packages instead of wheels for our binary distribution). So while I'll definitely need to help work out an agreed solution before we take the idea of a Fedora specific wheel repo beyond the pilot stage, we have no ETA on that at the moment, which means there's no guarantee on when I'll get to pushing such an enhancement forward myself. >> How can we build PyCrypto for Python 2, both UCS2 and UCS4, and have them >> both in the same repo and have pip pick the right one? > > At the moment, you can't, you'll need separate UCS2 and UCS4 repos. This could potentially be worked around in pip and setuptools, by having them calculate a suitable wheel ABI tag on Python 2. If we're on CPython 2.x and sysconfig.get_config_var('SOABI') returns None, then we can calculate a synthetic SOABI tag as: * the start of the SOABI tag should be "cpython-" * the next two digits will be the major/minor of the release (i.e. 
26 or 27) * the next character will be 'd' if sys.pydebug is set (I'm fairly sure, but double check this) * we can assume 'm' (for using pymalloc) safely enough * the final character will be 'u' if sys.maxunicode == 0x10ffff We're not going to add a new kind of ABI variation to Python 2 at this stage of its lifecycle, so that calculation should remain valid for as long as Python 2 remains in use. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From dholth at gmail.com Thu Jan 29 19:27:00 2015 From: dholth at gmail.com (Daniel Holth) Date: Thu, 29 Jan 2015 13:27:00 -0500 Subject: [Distutils] binary wheels, linux and ucs2/4 In-Reply-To: References: <54CA1FBF.4010902@simplistix.co.uk> Message-ID: On Thu, Jan 29, 2015 at 8:37 AM, Nick Coghlan wrote: > On 29 January 2015 at 22:45, Paul Moore wrote: >> On 29 January 2015 at 11:55, Chris Withers wrote: >>> ...but I'm none the wiser. What's the state of play with this? >> >> Currently, the tags don't include the UCS setting. "It's being worked >> on" (as with most other aspects of Linux ABI compatibility). > > It's more accurate to say "we're open to suggestions". My last idea > (deriving a more specific platform tag from /etc/os-release) was shot > down for several entirely valid reasons, so there isn't currently a > concrete proposal on the table. > > I'm pretty sure we're all agreed that the current definition of the > platform tag in PEP 425 is entirely inadequate for *nix systems other > than Mac OS X, but a lot of us have other options there to use as a > workaround or because it's what we were doing anyway (like using > system packages instead of wheels for our binary distribution). > > So while I'll definitely need to help work out an agreed solution > before we take the idea of a Fedora specific wheel repo beyond the > pilot stage, we have no ETA on that at the moment, which means there's > no guarantee on when I'll get to pushing such an enhancement forward > myself. 
> >>> How can we build PyCrypto for Python 2, both UCS2 and UCS4, and have them >>> both in the same repo and have pip pick the right one? >> >> At the moment, you can't, you'll need separate UCS2 and UCS4 repos. > > This could potentially be worked around in pip and setuptools, by > having them calculate a suitable wheel ABI tag on Python 2. > > If we're on CPython 2.x and sysconfig.get_config_var('SOABI') returns > None, then we can calculate a synthetic SOABI tag as: > > * the start of the SOABI tag should be "cpython-" > * the next two digits will be the major/minor of the release (i.e. 26 or 27) > * the next character will be 'd' if sys.pydebug is set (I'm fairly > sure, but double check this) > * we can assume 'm' (for using pymalloc) safely enough > * the final character will be 'u' if sys.maxunicode == 0x10ffff > > We're not going to add a new kind of ABI variation to Python 2 at this > stage of its lifecycle, so that calculation should remain valid for as > long as Python 2 remains in use. I would be very happy to receive a patch for: https://bitbucket.org/pypa/wheel/src/bdf053a70200c5857c250c2044a2d91da23db4a9/wheel/bdist_wheel.py?at=default#cl-150

    abi_tag = sysconfig.get_config_vars().get('SOABI', 'none')
    if abi_tag.startswith('cpython-'):
        abi_tag = 'cp' + abi_tag.rsplit('-', 1)[-1]

It would have to have a Python 2 implementation as Nick described. 
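Nick's outline can be sketched as follows. This is a minimal illustration, not the eventual wheel/pip patch: the function name and parameters are invented for the example, and the pydebug flag is taken on faith per Nick's own "double check this" caveat.

```python
# Synthetic SOABI fallback for CPython 2.x builds where
# sysconfig.get_config_var('SOABI') returns None.
def synthetic_soabi(major, minor, pydebug=False, maxunicode=0x10ffff):
    flags = ''
    if pydebug:
        flags += 'd'   # debug build
    flags += 'm'       # assume pymalloc, per the outline above
    if maxunicode == 0x10ffff:
        flags += 'u'   # wide-unicode (UCS4) build
    return 'cpython-%d%d%s' % (major, minor, flags)

print(synthetic_soabi(2, 7))                     # cpython-27mu (UCS4 build)
print(synthetic_soabi(2, 7, maxunicode=0xffff))  # cpython-27m  (UCS2 build)
```

Fed through the existing bdist_wheel logic that strips the "cpython-" prefix, "cpython-27mu" would then become the short ABI tag "cp27mu".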
and this file which controls what can be installed (duplicated in pip) but also only works on py3: https://bitbucket.org/pypa/wheel/src/bdf053a70200c5857c250c2044a2d91da23db4a9/wheel/pep425tags.py?at=default#cl-65

    abi3s = set()
    import imp
    for suffix in imp.get_suffixes():
        if suffix[0].startswith('.abi'):
            abi3s.add(suffix[0].split('.', 2)[1])

From ben+python at benfinney.id.au Fri Jan 30 01:34:49 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Fri, 30 Jan 2015 11:34:49 +1100 Subject: [Distutils] Documenting the Distutils sdist format References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <851tmrrz63.fsf_-_@benfinney.id.au> Message-ID: <851tmdjec6.fsf@benfinney.id.au> (Ping; I didn't see a response and would like to start this work soon.) Nick Coghlan writes: > For projects under the PyPA banner, including packaging.python.org, > license in = license out is fine. That text is CC-BY-SA, so including > Apache Licensed text shouldn't pose any problems. Okay. So with which repository from should I start? Where would the documentation go, where would the regression tests go? You appear to be distinguishing the license for executable code versus non-executable documentation. What license conventions should I be aware of? If these are answered in a ‘hacking’ document somewhere, a response pointing me to that document would be fine. -- \ “Quidquid latine dictum sit, altum viditur.” (“Whatever is | `\ said in Latin, sounds profound.”) 
—anonymous | _o__) | Ben Finney From donald at stufft.io Fri Jan 30 04:09:17 2015 From: donald at stufft.io (Donald Stufft) Date: Thu, 29 Jan 2015 22:09:17 -0500 Subject: [Distutils] Documenting the Distutils sdist format In-Reply-To: <851tmdjec6.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <399CE685-7EE4-4B4A-A4B2-A58FF70423B0@stufft.io> <85vbk3su7i.fsf@benfinney.id.au> <85mw5fsf7i.fsf@benfinney.id.au> <851tmrrz63.fsf_-_@benfinney.id.au> <851tmdjec6.fsf@benfinney.id.au> Message-ID: > On Jan 29, 2015, at 7:34 PM, Ben Finney wrote: > > (Ping; I didn't see a response and would like to start this work soon.) > > Nick Coghlan writes: > >> For projects under the PyPA banner, including packaging.python.org, >> license in = license out is fine. That text is CC-BY-SA, so including >> Apache Licensed text shouldn't pose any problems. > > Okay. So with which repository from > should I start? Where would the documentation go, where would the > regression tests go? > > You appear to be distinguishing the license for executable code versus > non-executable documentation. What license conventions should I be aware > of? > > If these are answered in a ‘hacking’ document somewhere, a response > pointing me to that document would be fine. packaging.python.org (if you’re looking for that) is located on Github at https://github.com/pypa/python-packaging-user-guide. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From ethan at stoneleaf.us Fri Jan 30 04:52:50 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Thu, 29 Jan 2015 19:52:50 -0800 Subject: [Distutils] Python module for use in ‘setup.py’ but not to install In-Reply-To: <85a91gud80.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> Message-ID: <54CB0012.2020203@stoneleaf.us> Ben, I appreciate the difficulty of trying to get sdists to work the way you want -- I'm in much the same boat myself. 
However, I feel that requiring a dependency simply for the installation of the main package (the main package doesn't actually use this install-dependency, correct?) is heavy-handed and should be avoided. Or have I misunderstood? -- ~Ethan~ -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 836 bytes Desc: OpenPGP digital signature URL: From ben+python at benfinney.id.au Fri Jan 30 05:58:54 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Fri, 30 Jan 2015 15:58:54 +1100 Subject: [Distutils] Python module for use in ‘setup.py’ but not to install References: <85a91gud80.fsf@benfinney.id.au> <54CB0012.2020203@stoneleaf.us> Message-ID: <85mw50j241.fsf@benfinney.id.au> Ethan Furman writes: > However, I feel that requiring a dependency simply for the > installation of the main package (the main package doesn't actually > use this install-dependency, correct?) is heavy-handed and should be > avoided. It's lighter than that; the third-party dependency is needed only for building packages (sdist, or wheel). It parses a reST document to create an extra Setuptools metadata file, and then it's not needed any more. It is ideally a build-time dependency and not an install-time dependency, but I'm having a difficult time figuring out how to distinguish those so Setuptools will pay attention. -- \ “… one of the main causes of the fall of the Roman Empire was | `\ that, lacking zero, they had no way to indicate successful | _o__) termination of their C programs.” 
—Robert Firth | Ben Finney From ethan at stoneleaf.us Sat Jan 31 21:19:39 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Sat, 31 Jan 2015 12:19:39 -0800 Subject: [Distutils] Python module for use in ‘setup.py’ but not to install In-Reply-To: <85mw50j241.fsf@benfinney.id.au> References: <85a91gud80.fsf@benfinney.id.au> <54CB0012.2020203@stoneleaf.us> <85mw50j241.fsf@benfinney.id.au> Message-ID: <54CD38DB.9030405@stoneleaf.us> On 01/29/2015 08:58 PM, Ben Finney wrote: > Ethan Furman writes: > >> However, I feel that requiring a dependency simply for the >> installation of the main package (the main package doesn't actually >> use this install-dependency, correct?) is heavy-handed and should be >> avoided. > > It is ideally a build-time dependency and not an install-time > dependency, but I'm having a difficult time figuring out how to > distinguish those so Setuptools will pay attention. Ah, so it's needed with the sdist command and not the install command? Yeah, you definitely have my sympathies with that one. Too bad there's no easy way to specify behavior based on the command used... (hopefully someone will now tell me how I am wrong about that ;) . -- ~Ethan~ -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 836 bytes Desc: OpenPGP digital signature URL: From guettliml at thomas-guettler.de Fri Jan 30 08:57:38 2015 From: guettliml at thomas-guettler.de (Thomas Güttler) Date: Fri, 30 Jan 2015 08:57:38 +0100 Subject: [Distutils] Allowed characters in setuptools.setup(name='????') Message-ID: <54CB3972.6040800@thomas-guettler.de> Hi, where is the set of allowed characters for the name argument of setuptools.setup() documented? I could not find it. Regards, Thomas Güttler
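On the question of allowed characters: the draft PEP 426 metadata spec proposes restricting distribution names to ASCII letters and digits, with '.', '-' and '_' permitted only between them, matched case-insensitively. A sketch of that proposed rule — the helper name is invented for illustration, and setuptools itself does not currently enforce this:

```python
import re

# Name rule proposed in the draft PEP 426 metadata spec: ASCII
# letters/digits, with '.', '-', '_' allowed internally, and the
# regex applied case-insensitively.
NAME_RE = re.compile(r'^([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])$',
                     re.IGNORECASE)

def is_valid_project_name(name):
    return NAME_RE.match(name) is not None

print(is_valid_project_name('requests'))        # True
print(is_valid_project_name('zope.interface'))  # True
print(is_valid_project_name('-bad-'))           # False
```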