[Distutils] option #1 plus download_url scraping

Barry Warsaw barry at python.org
Wed Jun 5 00:16:16 CEST 2013


Like many of you, I got Donald's message about the changes to URLs for
Cheeseshop packages.  My question is about the three options; I think I want a
middle ground, but I'm interested to see why you'll discourage me from it
<wink>.

IIUC, option #1 is fine for packages hosted on PyPI.  But what if our packages
are *also* hosted elsewhere, say for redundancy purposes, and that external
location needs to be scraped?

Specifically, say I have a download_url in my setup.py.  I *want* that url to
be essentially a wildcard or index page because I don't want to have to change
setup.py every time I make a release (unless of course `setup.py sdist` did it
for me).  I also can't add this url to the "Additional File URLs" page for my
package because again I'd have to change it every time I do a release.

So the middle ground I think I want is: option #1 plus scraping from
download_url, but only download_url.
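For clarity, a rough sketch of what "scraping download_url" would mean on the installer side: fetch the index page that download_url names and collect any links that look like sdists.  The HTML below stands in for such a page; all names and filenames are hypothetical:

```python
# Sketch of download_url scraping: pull sdist-looking links out of an
# index page.  Uses only the stdlib HTML parser.
from html.parser import HTMLParser

class SdistLinkParser(HTMLParser):
    """Collect href values that look like sdist archives."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value.endswith((".tar.gz", ".zip")):
                    self.links.append(value)

# Stand-in for the page at download_url: one link per release, so the
# URL in setup.py never has to change between releases.
index_html = """
<html><body>
<a href="example-package-0.9.tar.gz">0.9</a>
<a href="example-package-1.0.tar.gz">1.0</a>
<a href="NEWS.html">release notes</a>
</body></html>
"""

parser = SdistLinkParser()
parser.feed(index_html)
print(parser.links)  # the two sdist links; the NEWS page is ignored
```

This is the same kind of link-spidering the Cheeseshop changes are trying to retire for arbitrary home pages; the proposal above would confine it to the one URL the author explicitly designates.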

Am I a horrible person for wanting this?  Is there a better way?

Cheers,
-Barry


More information about the Distutils-SIG mailing list