From techtonik at gmail.com Sun May 3 10:24:08 2015 From: techtonik at gmail.com (anatoly techtonik) Date: Sun, 3 May 2015 11:24:08 +0300 Subject: [Distutils] OpenID login is broken for PyPI Message-ID: Hi, Can't login with Google and with LaunchPad. Google breakage was predictable, but LaunchPad worked before. What is the status of PyPI maintenance. A year or two ago patches were not welcome, because people were busy with warehouse. -- anatoly t. From skip.montanaro at gmail.com Sun May 3 23:21:37 2015 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Sun, 3 May 2015 16:21:37 -0500 Subject: [Distutils] Can't install google-api-python-client using pip Message-ID: I want to install google-api-python-client using pip. I can see it when I search from the command line: % pip-2.7 search google | egrep google-api google-api-python-client - Google API Client Library for Python google-apitools - client libraries for humans google-api-python-client-py3 - Google API Client Library for Python (python 3x port) google-apis-client-generator - Google Apis Client Generator I can see it in PyPI: ? When I try to install it, pip denies any knowledge of the package: % pip-2.7 install --user google-api-python-client Downloading/unpacking google-api-python-client Cannot fetch index base URL http://pypi.python.org/simple/ Could not find any downloads that satisfy the requirement google-api-python-client No distributions at all found for google-api-python-client Storing complete log in /Users/skip/.pip/pip.log I can visit http://pypi.python.org/simple/ and see that google-api-python-client is one of the links. Looking at pip.log I see that it's somehow translating "http" into "https": Could not fetch URL http://pypi.python.org/simple/google-api-python-client/: Am I missing something? Thx, Skip Montanaro -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screen Shot 2015-05-03 at 4.16.16 PM.png Type: image/png Size: 32247 bytes Desc: not available URL: From donald at stufft.io Sun May 3 23:55:45 2015 From: donald at stufft.io (Donald Stufft) Date: Sun, 3 May 2015 17:55:45 -0400 Subject: [Distutils] Can't install google-api-python-client using pip In-Reply-To: References: Message-ID: <800129C1-B7A7-4795-9519-CCE16139EBE9@stufft.io> > On May 3, 2015, at 5:21 PM, Skip Montanaro wrote: > > I want to install google-api-python-client using pip. I can see it when I search from the command line: > > % pip-2.7 search google | egrep google-api > google-api-python-client - Google API Client Library for Python > google-apitools - client libraries for humans > google-api-python-client-py3 - Google API Client Library for Python (python 3x port) > google-apis-client-generator - Google Apis Client Generator > > I can see it in PyPI: > > > ? > When I try to install it, pip denies any knowledge of the package: > > % pip-2.7 install --user google-api-python-client > Downloading/unpacking google-api-python-client > Cannot fetch index base URL http://pypi.python.org/simple/ > Could not find any downloads that satisfy the requirement google-api-python-client > No distributions at all found for google-api-python-client > Storing complete log in /Users/skip/.pip/pip.log > > I can visit http://pypi.python.org/simple/ and see that google-api-python-client is one of the links. 
Looking at pip.log I see that it's somehow translating "http" into "https": > > Could not fetch URL http://pypi.python.org/simple/google-api-python-client/ : > > Am I missing something? > > Well, PyPI redirects HTTP to HTTPS, so it?s going to be just following that redirect. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From richard at python.org Mon May 4 05:13:46 2015 From: richard at python.org (Richard Jones) Date: Mon, 04 May 2015 03:13:46 +0000 Subject: [Distutils] OpenID login is broken for PyPI In-Reply-To: References: Message-ID: Hi, PyPI is being kept on life-support, and I do not currently have time to debug LaunchPad's openid. The only active development at present is warehouse. On Mon, 4 May 2015 at 05:49 anatoly techtonik wrote: > Hi, > > Can't login with Google and with LaunchPad. > Google breakage was predictable, but LaunchPad > worked before. > > What is the status of PyPI maintenance. A year or > two ago patches were not welcome, because > people were busy with warehouse. > -- > anatoly t. > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From skip.montanaro at gmail.com Mon May 4 15:05:49 2015 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Mon, 4 May 2015 08:05:49 -0500 Subject: [Distutils] Can't install google-api-python-client using pip In-Reply-To: <800129C1-B7A7-4795-9519-CCE16139EBE9@stufft.io> References: <800129C1-B7A7-4795-9519-CCE16139EBE9@stufft.io> Message-ID: On Sun, May 3, 2015 at 4:55 PM, Donald Stufft wrote: > Well, PyPI redirects HTTP to HTTPS, so it?s going to be just following > that redirect. Hmmm... This morning this is working. I'm mystified. Skip -------------- next part -------------- An HTML attachment was scrubbed... URL: From skip.montanaro at gmail.com Mon May 4 15:08:30 2015 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Mon, 4 May 2015 08:08:30 -0500 Subject: [Distutils] Can't install google-api-python-client using pip In-Reply-To: References: <800129C1-B7A7-4795-9519-CCE16139EBE9@stufft.io> Message-ID: On Mon, May 4, 2015 at 8:05 AM, Skip Montanaro wrote: > Hmmm... This morning this is working. I'm mystified. Actually, I should rephrase that. This morning this is working from my Linux machine at work. Yesterday I was twiddling bits from my Mac laptop more-or-less at home. I say, "more-or-less", because we are in a sublet for a bit and I was using the apartment building's "business center" to get on the net. I will have to try it from somewhere else with my laptop and see if it behaves differently. Skip -------------- next part -------------- An HTML attachment was scrubbed... URL: From nad at acm.org Tue May 5 22:01:03 2015 From: nad at acm.org (Ned Deily) Date: Tue, 05 May 2015 13:01:03 -0700 Subject: [Distutils] Can't install google-api-python-client using pip References: <800129C1-B7A7-4795-9519-CCE16139EBE9@stufft.io> Message-ID: In article , Skip Montanaro wrote: > On Mon, May 4, 2015 at 8:05 AM, Skip Montanaro > wrote: > > Hmmm... This morning this is working. I'm mystified. > Actually, I should rephrase that. 
This morning this is working from my > Linux machine at work. Yesterday I was twiddling bits from my Mac laptop > more-or-less at home. I say, "more-or-less", because we are in a sublet for > a bit and I was using the apartment building's "business center" to get on > the net. I will have to try it from somewhere else with my laptop and see > if it behaves differently. Based on the messages, my guess is that you are not using the most recent version of pip and/or you may be trying to use it with an older version of OpenSSL. Try upgrading to the latest pip; you may find that the download works or that you'll get a more meaningful message. -- Ned Deily, nad at acm.org From skip.montanaro at gmail.com Wed May 6 02:57:51 2015 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Tue, 5 May 2015 19:57:51 -0500 Subject: [Distutils] Can't install google-api-python-client using pip In-Reply-To: References: <800129C1-B7A7-4795-9519-CCE16139EBE9@stufft.io> Message-ID: On Tue, May 5, 2015 at 3:01 PM, Ned Deily wrote: > > Based on the messages, my guess is that you are not using the most > recent version of pip and/or you may be trying to use it with an older > version of OpenSSL. Try upgrading to the latest pip; you may find that > the download works or that you'll get a more meaningful message. Well, that's not working either: % pip-2.7 install --upgrade pip Cannot fetch index base URL http://pypi.python.org/simple/ Could not find any downloads that satisfy the requirement pip in /Users/skip/local/lib/python2.7/site-packages Downloading/unpacking pip No distributions at all found for pip in /Users/skip/local/lib/python2.7/site-packages Storing complete log in /Users/skip/.pip/pip.log And then I had the bright idea to see if ensurepip was available: % python Python 2.7.9+ (2.7:94ec4d8cf104, Jan 24 2015, 14:56:50) [GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.56)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> % python -m ensurepip Ignoring ensurepip failure: pip 6.0.6 requires SSL/TLS which would explain what's wrong. I'll have to see why SSL/TLS is unavailable. Skip From chris.jerdonek at gmail.com Wed May 6 03:02:23 2015 From: chris.jerdonek at gmail.com (Chris Jerdonek) Date: Tue, 5 May 2015 18:02:23 -0700 Subject: [Distutils] Can't install google-api-python-client using pip In-Reply-To: References: <800129C1-B7A7-4795-9519-CCE16139EBE9@stufft.io> Message-ID: On Tue, May 5, 2015 at 5:57 PM, Skip Montanaro wrote: > On Tue, May 5, 2015 at 3:01 PM, Ned Deily wrote: >> >> Based on the messages, my guess is that you are not using the most >> recent version of pip and/or you may be trying to use it with an older >> version of OpenSSL. Try upgrading to the latest pip; you may find that >> the download works or that you'll get a more meaningful message. > > Well, that's not working either: > > % pip-2.7 install --upgrade pip > Cannot fetch index base URL http://pypi.python.org/simple/ > Could not find any downloads that satisfy the requirement pip in > /Users/skip/local/lib/python2.7/site-packages > Downloading/unpacking pip > No distributions at all found for pip in > /Users/skip/local/lib/python2.7/site-packages > Storing complete log in /Users/skip/.pip/pip.log > > And then I had the bright idea to see if ensurepip was available: > > % python > Python 2.7.9+ (2.7:94ec4d8cf104, Jan 24 2015, 14:56:50) > [GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.56)] on darwin > Type "help", "copyright", "credits" or "license" for more information. 
>>>> > % python -m ensurepip > Ignoring ensurepip failure: pip 6.0.6 requires SSL/TLS > > which would explain what's wrong. I'll have to see why SSL/TLS is unavailable. It still seems look your first approach should have said something about SSL being required, or is there a reason the message can't or shouldn't be shown in that case? --Chris From donald at stufft.io Wed May 6 15:00:22 2015 From: donald at stufft.io (Donald Stufft) Date: Wed, 6 May 2015 09:00:22 -0400 Subject: [Distutils] Can't install google-api-python-client using pip In-Reply-To: References: <800129C1-B7A7-4795-9519-CCE16139EBE9@stufft.io> Message-ID: <57837E33-838B-45C4-81A6-0CCB6AC5732B@stufft.io> > On May 5, 2015, at 9:02 PM, Chris Jerdonek wrote: > > On Tue, May 5, 2015 at 5:57 PM, Skip Montanaro wrote: >> On Tue, May 5, 2015 at 3:01 PM, Ned Deily wrote: >>> >>> Based on the messages, my guess is that you are not using the most >>> recent version of pip and/or you may be trying to use it with an older >>> version of OpenSSL. Try upgrading to the latest pip; you may find that >>> the download works or that you'll get a more meaningful message. >> >> Well, that's not working either: >> >> % pip-2.7 install --upgrade pip >> Cannot fetch index base URL http://pypi.python.org/simple/ >> Could not find any downloads that satisfy the requirement pip in >> /Users/skip/local/lib/python2.7/site-packages >> Downloading/unpacking pip >> No distributions at all found for pip in >> /Users/skip/local/lib/python2.7/site-packages >> Storing complete log in /Users/skip/.pip/pip.log >> >> And then I had the bright idea to see if ensurepip was available: >> >> % python >> Python 2.7.9+ (2.7:94ec4d8cf104, Jan 24 2015, 14:56:50) >> [GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.56)] on darwin >> Type "help", "copyright", "credits" or "license" for more information. >>>>> >> % python -m ensurepip >> Ignoring ensurepip failure: pip 6.0.6 requires SSL/TLS >> >> which would explain what's wrong. I'll have to see why SSL/TLS is unavailable. > > It still seems look your first approach should have said something > about SSL being required, or is there a reason the message can't or > shouldn't be shown in that case? Open an issue on github.com/pypa/pip please. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From skip.montanaro at gmail.com Wed May 6 15:13:11 2015 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Wed, 6 May 2015 08:13:11 -0500 Subject: [Distutils] Can't install google-api-python-client using pip In-Reply-To: <57837E33-838B-45C4-81A6-0CCB6AC5732B@stufft.io> References: <800129C1-B7A7-4795-9519-CCE16139EBE9@stufft.io> <57837E33-838B-45C4-81A6-0CCB6AC5732B@stufft.io> Message-ID: On Wed, May 6, 2015 at 8:00 AM, Donald Stufft wrote: > Open an issue on github.com/pypa/pip please. > Done: https://github.com/pypa/pip/issues/2760 Skip -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From skip.montanaro at gmail.com Thu May 7 03:16:49 2015 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Wed, 6 May 2015 20:16:49 -0500 Subject: [Distutils] Multiple pips to go along with my multiple Pythons Message-ID: I try to maintain several built and installed versions of Python, currently 2.6, 2.7, 3.2, 3.3, 3.4 and 3.5alpha, all built from the tips of their respective branches in Mercurial. What's the correct/best way to keep pip up-to-date for all those versions? 2.7, 3.4 and 3.5 have ensurepip, but I think that just makes sure you have pip, not that there is a version associated with a specific version of Python, right? Do I just download the latest tarball and run "PY setup.py install" for each version of PY? Thx, Skip From robertc at robertcollins.net Thu May 7 05:16:43 2015 From: robertc at robertcollins.net (Robert Collins) Date: Thu, 7 May 2015 15:16:43 +1200 Subject: [Distutils] Multiple pips to go along with my multiple Pythons In-Reply-To: References: Message-ID: On 7 May 2015 at 13:16, Skip Montanaro wrote: > I try to maintain several built and installed versions of Python, > currently 2.6, 2.7, 3.2, 3.3, 3.4 and 3.5alpha, all built from the > tips of their respective branches in Mercurial. > > What's the correct/best way to keep pip up-to-date for all those > versions? 2.7, 3.4 and 3.5 have ensurepip, but I think that just makes > sure you have pip, not that there is a version associated with a > specific version of Python, right? Do I just download the latest > tarball and run "PY setup.py install" for each version of PY? python -m pip install -U pip will update pip on Windows or *nix, using the current pip. If you don't have a pip at all, then yes, download + python setup.py install in the pip tree. -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From p.f.moore at gmail.com Thu May 7 10:03:38 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 7 May 2015 09:03:38 +0100 Subject: [Distutils] Multiple pips to go along with my multiple Pythons In-Reply-To: References: Message-ID: On 7 May 2015 at 04:16, Robert Collins wrote: > If you don't have a pip at all, then yes, download + python setup.py > install in the pip tree. Or use get-pip.py from https://bootstrap.pypa.io/get-pip.py (which will install from wheel, so is probably a tiny bit faster if you care). Paul From skip.montanaro at gmail.com Thu May 7 16:58:35 2015 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Thu, 7 May 2015 09:58:35 -0500 Subject: [Distutils] Multiple pips to go along with my multiple Pythons In-Reply-To: References: Message-ID: Thanks for the responses. I eventually decided to just download and expand the tarball, then execute PY setup.py clean -a PY setup.py install for each of my PYs. I then deleted the pip, pip2 and pip3 items, leaving only the pipXY versions. That avoids any confusion. While it's a bit more work on my part, it achieved the desired result - nothing but a bunch of unambiguous pipXY programs. Skip From doug at doughellmann.com Thu May 7 18:30:49 2015 From: doug at doughellmann.com (Doug Hellmann) Date: Thu, 07 May 2015 12:30:49 -0400 Subject: [Distutils] Multiple pips to go along with my multiple Pythons In-Reply-To: References: Message-ID: <1431016196-sup-6486@lrrr.local> Excerpts from Skip Montanaro's message of 2015-05-07 09:58:35 -0500: > Thanks for the responses. I eventually decided to just download and > expand the tarball, then execute > > PY setup.py clean -a > PY setup.py install > > for each of my PYs. 
I then deleted the pip, pip2 and pip3 items, > leaving only the pipXY versions. That avoids any confusion. While it's > a bit more work on my part, it achieved the desired result - nothing > but a bunch of unambiguous pipXY programs. > > Skip Depending on your OS, you might be interested in the ansible role I created for doing this on my dev VMs: http://doughellmann.com/2015/03/07/ansible-roles-for-python-developers.html It's meant for Ubuntu now, but I would be happy to have updates to make it work for other platforms. Doug From setuptools at bugs.python.org Thu May 7 22:57:48 2015 From: setuptools at bugs.python.org (Blake VandeMerwe) Date: Thu, 07 May 2015 20:57:48 +0000 Subject: [Distutils] [issue161] commands.upload.upload_file breaks when iterable 'value' is not of type tuple Message-ID: <1431032268.61.0.889209995657.issue161@psf.upfronthosting.co.za> New submission from Blake VandeMerwe: In distutils\command\upload.py: On line 152 `if type(value) != type([])` will break if value is a tuple instead of a list, which messes up the lgoic from 155-159 To duplicate, in setup.py: setup( name = 'test setup', ... ... classifiers = ( 'Development Status :: 3 - Alpha', ) ) Traceback: running upload Traceback (most recent call last): File "setup.py", line 47, in test_suite = 'tests' File "C:\Python34\lib\distutils\core.py", line 148, in setup dist.run_commands() File "C:\Python34\lib\distutils\dist.py", line 955, in run_commands self.run_command(cmd) File "C:\Python34\lib\distutils\dist.py", line 974, in run_command cmd_obj.run() File "C:\Python34\lib\distutils\command\upload.py", line 65, in run self.upload_file(command, pyversion, filename) File "C:\Python34\lib\distutils\command\upload.py", line 163, in upload_file body.write(value) TypeError: 'str' does not support the buffer interface ---------- messages: 749 nosy: blakev priority: bug status: unread title: commands.upload.upload_file breaks when iterable 'value' is not of type tuple _______________________________________________ Setuptools tracker _______________________________________________ From robertc at robertcollins.net Mon May 11 09:14:32 2015 From: robertc at robertcollins.net (Robert Collins) Date: Mon, 11 May 2015 19:14:32 +1200 Subject: [Distutils] extras in PEP-426 Message-ID: https://www.python.org/dev/peps/pep-0426/#extras-optional-dependencies Gives an example of: "name": "ComfyChair", "extras": ["warmup", "c-accelerators"] "run_requires": [ { "requires": ["SoftCushions"], "extra": "warmup" } ] "build_requires": [ { "requires": ["cython"], "extra": "c-accelerators" } ] But I am not sure how to map the current setuptools example below to PEP-426: extras_require={ 'warmup': ['softcushions', 'barheater'], 'c-accelerators': ['cython', 'barheater'], } Would we expect multiple build_requires that list barheater, with separate extra lines, in PEP-426? Like so (adjusted to be all run-requires as extra setup-requires aren't a thing in setuptools today): "name": "ComfyChair", "extras": ["warmup", "c-accelerators"] "build_requires": [ { "requires": ["SoftCushions","barheater"], "extra": "warmup" } ] "build_requires": [ { "requires": ["cython", "barheater"], "extra": "c-accelerators" } ] Since build_requires is a key, I'm more than a little confused here. Seems like the extra has to be part of the key, or we're going to be uhm, stuck. 
-Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From ncoghlan at gmail.com Mon May 11 12:39:21 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 11 May 2015 20:39:21 +1000 Subject: [Distutils] extras in PEP-426 In-Reply-To: References: Message-ID: On 11 May 2015 at 17:14, Robert Collins wrote: > https://www.python.org/dev/peps/pep-0426/#extras-optional-dependencies > > Gives an example of: > > "name": "ComfyChair", > "extras": ["warmup", "c-accelerators"] > "run_requires": [ > { > "requires": ["SoftCushions"], > "extra": "warmup" > } > ] > "build_requires": [ > { > "requires": ["cython"], > "extra": "c-accelerators" > } > ] > > > But I am not sure how to map the current setuptools example below to PEP-426: > > extras_require={ > 'warmup': ['softcushions', 'barheater'], > 'c-accelerators': ['cython', 'barheater'], > } > > Would we expect multiple build_requires that list barheater, with > separate extra lines, in PEP-426? > > Like so (adjusted to be all run-requires as extra setup-requires > aren't a thing in setuptools today): > "name": "ComfyChair", > "extras": ["warmup", "c-accelerators"] > "build_requires": [ > { > "requires": ["SoftCushions","barheater"], > "extra": "warmup" > } > ] > "build_requires": [ > { > "requires": ["cython", "barheater"], > "extra": "c-accelerators" > } > ] > > Since build_requires is a key, I'm more than a little confused here. > Seems like the extra has to be part of the key, or we're going to be > uhm, stuck. You're close, you just need to rely on the fact that build_requires is a sequence rather than a single mapping: "build_requires": [ { "requires": ["SoftCushions","barheater"], "extra": "warmup" }, { "requires": ["cython", "barheater"], "extra": "c-accelerators" } ] That's also how you mix unconditional dependencies with conditional ones, as well as apply environmental constraints to dependencies included in an extra. I don't currently have any examples of mixing and matching in the PEP, but it's *already* a monster document :( Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From dholth at gmail.com Mon May 11 16:04:07 2015 From: dholth at gmail.com (Daniel Holth) Date: Mon, 11 May 2015 10:04:07 -0400 Subject: [Distutils] extras in PEP-426 In-Reply-To: References: Message-ID: bdist_wheel tries to convert setuptools' own requirements into PEP 426 (draft) json metadata and writes it to .dist-info/. However because the document is draft the JSON metadata is not used for anything. The runtime requirements read: "run_requires": [ { "extra": "certs", "requires": [ "certifi (==1.0.1)" ] }, { "environment": "sys_platform=='win32'", "extra": "ssl", "requires": [ "wincertstore (==0.2)" ] } ], On Mon, May 11, 2015 at 6:39 AM, Nick Coghlan wrote: > On 11 May 2015 at 17:14, Robert Collins wrote: >> https://www.python.org/dev/peps/pep-0426/#extras-optional-dependencies >> >> Gives an example of: >> >> "name": "ComfyChair", >> "extras": ["warmup", "c-accelerators"] >> "run_requires": [ >> { >> "requires": ["SoftCushions"], >> "extra": "warmup" >> } >> ] >> "build_requires": [ >> { >> "requires": ["cython"], >> "extra": "c-accelerators" >> } >> ] >> >> >> But I am not sure how to map the current setuptools example below to PEP-426: >> >> extras_require={ >> 'warmup': ['softcushions', 'barheater'], >> 'c-accelerators': ['cython', 'barheater'], >> } >> >> Would we expect multiple build_requires that list barheater, with >> separate extra lines, in PEP-426? 
>> >> Like so (adjusted to be all run-requires as extra setup-requires >> aren't a thing in setuptools today): >> "name": "ComfyChair", >> "extras": ["warmup", "c-accelerators"] >> "build_requires": [ >> { >> "requires": ["SoftCushions","barheater"], >> "extra": "warmup" >> } >> ] >> "build_requires": [ >> { >> "requires": ["cython", "barheater"], >> "extra": "c-accelerators" >> } >> ] >> >> Since build_requires is a key, I'm more than a little confused here. >> Seems like the extra has to be part of the key, or we're going to be >> uhm, stuck. > > You're close, you just need to rely on the fact that build_requires is > a sequence rather than a single mapping: > > "build_requires": [ > { > "requires": ["SoftCushions","barheater"], > "extra": "warmup" > }, > { > "requires": ["cython", "barheater"], > "extra": "c-accelerators" > } > ] > > That's also how you mix unconditional dependencies with conditional > ones, as well as apply environmental constraints to dependencies > included in an extra. I don't currently have any examples of mixing > and matching in the PEP, but it's *already* a monster document :( > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig From dholth at gmail.com Tue May 12 16:44:22 2015 From: dholth at gmail.com (Daniel Holth) Date: Tue, 12 May 2015 10:44:22 -0400 Subject: [Distutils] Fwd: add install_paths.json to .egg-info and .dist-info directory In-Reply-To: References: Message-ID: Cross-posting to distutils-sig: As a prerequisite to adding useful finer-grained installations to setuptools & wheel, I propose adding install_paths.json as a metadata file within the egg-info and dist-info directories, initially containing only the current paths as a key - value mappping, for example it might contain the following keys: base data headers lib libbase platbase platlib purelib scripts userbase (~/.local) usersite (~/.local/lib/python...) These are the existing paths present as properties on the distutils/setuptools install command. Later, when we add a finer grained install scheme, those paths would also land in the new install_paths.json file. In setuptools (non-wheel installations from setup.py) it's very easy to add an additional "egg_info.writers" entry point to create this file. Wheel installers would need to be modified to generate this file. Recording the existing paths would solve a current problem with customized installs e.g. "python setup.py install --install-lib=/custom/path" ; the custom lib path is not recorded anywhere, making it difficult to locate the library. In C, libraries and headers are not really installed to standard locations. There are conventions that differ between operating systems and distributions. That is why the pkg-config command exists, which reads a similar key : path mapping from files in /usr/share/pkgconfig/ or from PKG_CONFIG_PATH. This mapping would serve a similar purpose for Python distributions that might need to locate more of their installed files besides just those installed on PYTHONPATH. Straw man API to use the recorded paths: pkg_resources.resource_filename(pkg_resources.Requirement.parse('dist_name'), '$lib/library.so') The pkg_resources API could be extended to support interpolating these paths. You would need to pass a Requirement, not an importable name from sys.modules, to locate a particular .egg-info or .dist-info. 
Under the hood pkg_resources would use Distribution._get_metadata('_install_paths.json') to read the install scheme used for that particular distribution. From ncoghlan at gmail.com Tue May 12 17:53:47 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 13 May 2015 01:53:47 +1000 Subject: [Distutils] Fwd: add install_paths.json to .egg-info and .dist-info directory In-Reply-To: References: Message-ID: On 13 May 2015 at 00:44, Daniel Holth wrote: > Cross-posting to distutils-sig: > > As a prerequisite to adding useful finer-grained installations to > setuptools & wheel, I propose adding install_paths.json as a metadata > file within the egg-info and dist-info directories, initially > containing only the current paths as a key - value mappping, for > example it might contain the following keys: > > base > data > headers > lib > libbase > platbase > platlib > purelib > scripts > userbase (~/.local) > usersite (~/.local/lib/python...) > > These are the existing paths present as properties on the > distutils/setuptools install command. Later, when we add a finer > grained install scheme, those paths would also land in the new > install_paths.json file. +1, this sounds like a good incremental step forward to me. I'm not sure how best to document it, though. Perhaps have this initial increment as an implementation detail, and then formalise it in the next version of the wheel spec. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From vinay_sajip at yahoo.co.uk Tue May 12 23:36:38 2015 From: vinay_sajip at yahoo.co.uk (Vinay Sajip) Date: Tue, 12 May 2015 21:36:38 +0000 (UTC) Subject: [Distutils] Fwd: add install_paths.json to .egg-info and .dist-info directory In-Reply-To: References: Message-ID: <605713008.1273004.1431466598831.JavaMail.yahoo@mail.yahoo.com> > From: Nick Coghlan >On 13 May 2015 at 00:44, Daniel Holth wrote: >> As a prerequisite to adding useful finer-grained installations to >> setuptools & wheel, I propose adding install_paths.json as a metadata > +1, this sounds like a good incremental step forward to me. I'm not > sure how best to document it, though. Perhaps have this initial> increment as an implementation detail, and then formalise it in the > next version of the wheel spec. Sounds like a reasonable idea to me, too. But IIUC this file will be written by an installer, but not necessarily present in the wheel - is that right? If not present in the wheel (or perhaps anyway) ISTM it makes sense to specify it in PEP 426 because it's part of the installation metadata. Regards, Vinay Sajip From dholth at gmail.com Wed May 13 00:20:05 2015 From: dholth at gmail.com (Daniel Holth) Date: Tue, 12 May 2015 18:20:05 -0400 Subject: [Distutils] Fwd: add install_paths.json to .egg-info and .dist-info directory In-Reply-To: <605713008.1273004.1431466598831.JavaMail.yahoo@mail.yahoo.com> References: <605713008.1273004.1431466598831.JavaMail.yahoo@mail.yahoo.com> Message-ID: No, PEP 426 needs to be static metadata that is not modified by the installer. The new json file would make more sense in (a successor to) PEP 376 "Database of Installed Python Distributions". In wheel it would would need to be handled as a processing step similar to writing console_scripts, or doing non-console_scripts #! rewriting, or updating MANIFEST. Too bad the wheel post-install is not as pluggable as setuptools egg-info writers. 
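For the setuptools side, a minimal sketch of such an egg_info.writers hook might look like the following (the module and function names here are made up purely for illustration, and the key set is only a starting subset, not a finished design):

import json

def write_install_paths(cmd, basename, filename):
    # 'cmd' is the running egg_info command; ask the distribution's
    # 'install' command which directories it is configured to use.
    install = cmd.distribution.get_command_obj('install')
    install.ensure_finalized()
    paths = {
        'base': install.install_base,
        'platbase': install.install_platbase,
        'purelib': install.install_purelib,
        'platlib': install.install_platlib,
        'headers': install.install_headers,
        'scripts': install.install_scripts,
        'data': install.install_data,
    }
    # write_or_delete_file() keeps the generated file in sync across
    # repeated egg_info runs.
    cmd.write_or_delete_file('install_paths', filename,
                             json.dumps(paths, indent=2, sort_keys=True))

# Registered from the plugin's setup.py with something like:
#   entry_points={'egg_info.writers': [
#       'install_paths.json = installpathswriter:write_install_paths']}

The wheel installer would need equivalent logic built into it directly, since there is no comparable hook on that side today.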
On May 12, 2015 5:36 PM, "Vinay Sajip" wrote: > > From: Nick Coghlan > >On 13 May 2015 at 00:44, Daniel Holth wrote: > > >> As a prerequisite to adding useful finer-grained installations to > >> setuptools & wheel, I propose adding install_paths.json as a metadata > > > +1, this sounds like a good incremental step forward to me. I'm not > > > sure how best to document it, though. Perhaps have this initial > > increment as an implementation detail, and then formalise it in the > > next version of the wheel spec. > > > Sounds like a reasonable idea to me, too. But IIUC this file will be > written by an installer, but not necessarily present in the wheel - is that > right? If not present in the wheel (or perhaps anyway) ISTM it makes sense > to specify it in PEP 426 because it's part of the installation metadata. > > Regards, > > Vinay Sajip > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vinay_sajip at yahoo.co.uk Wed May 13 00:53:59 2015 From: vinay_sajip at yahoo.co.uk (Vinay Sajip) Date: Tue, 12 May 2015 22:53:59 +0000 (UTC) Subject: [Distutils] Fwd: add install_paths.json to .egg-info and .dist-info directory In-Reply-To: References: Message-ID: <1109155069.1299901.1431471239534.JavaMail.yahoo@mail.yahoo.com> From: Daniel Holth > No, PEP 426 needs to be static metadata that is not modified by the installer. > The new json file would make more sense in (a successor to) PEP 376 "Database of Installed Python Distributions". Agreed. That's what I meant :-) Regards, Vinay Sajip From ben+python at benfinney.id.au Wed May 13 08:19:12 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 13 May 2015 16:19:12 +1000 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> Message-ID: <8538319ezj.fsf_-_@benfinney.id.au> Chris Barker writes: > On Tue, Apr 14, 2015 at 8:41 AM, Nick Coghlan wrote: > > > The point where I draw the line is supporting *dynamic* linking > > between modules - > > I'm confused -- you don't want a system to be able to install ONE version > of a lib that various python packages can all link to? That's really the > key use-case for me.... Agreed. A key pain point for Python distributions is the lack of support for installing *one* instance of a Python library, and other Python modules able to discover such installed libraries which meet their declared dependency. For example: * Python distribution “foo”, for Python implementation “D” on architecture “X”, declares dependency on “bar >= 1.7”. * Installing Python distribution “bar” version 1.8, on a host running Python “D” for architecture “X”, goes to a single instance for the “bar” library for that architecture and Python implementation. * Invoking the “foo” code on the same host will go looking (dynamically?) for the dependency “bar” and find version 1.8 already installed in the one instance on that host. It uses that and all is well. I'm in agreement with Chris that, while the above example may not currently play out as described, that is a fault to be fixed by improving Python's packaging and distribution tools so that it *is* a first-class use case. Nick, you seem to be arguing against that. Can you clarify? -- \ “Natural catastrophes are rare, but they come often enough. We | `\ need not force the hand of nature.”
—Carl Sagan, _Cosmos_, 1980 | _o__) | Ben Finney From ben+python at benfinney.id.au Wed May 13 08:32:48 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 13 May 2015 16:32:48 +1000 Subject: [Distutils] Beyond wheel 1.0: more fine-grained installation scheme References: Message-ID: <85y4kt7zsf.fsf@benfinney.id.au> David Cournapeau writes: > My suggestion for a better scheme would be to use an extended version > of the various default directories defined by autotools. The extension > would handle windows-specifics. This is a good direction to go, IMO. The expectations of Windows and *nix, when it comes to where files should be installed, are so disparate that abstracting the locations to functional labels which mean the same on all platforms will inevitably result in a lot of such labels. So, if we agree that such abstraction is worthwhile to pursue, the large set of labels (and perhaps even leaving it open to growing even more) should be embraced. > # Suggested variables > > The goal of supporting those variables is to take something that is > flexible enough to support almost any installation scheme, without > putting additional burden on the developer. People who do not > want/need the flexibility will not need to do anything more than what > they do today. A point to note is that “people” in that statement can be different people with conflicting wants/needs. The developer who writes the Distutils metadata may want to have as little in there as possible, at the expense of flexibility in installation. The system integrator who needs to specify many different locations for the various files in the distribution will want the opposite. They still need to work on the same code base. So perhaps one consideration is that the distribution metadata should be extensible by the integrator, without extensive patches to the Python code; and the developer's code as written still finds the files in the resulting filesystem locations. Is that feasible? > The variables I would suggest are every variable defined in > https://github.com/cournape/Bento/blob/master/bento/core/platforms/sysconfig.py#L10, > except for destdir which is not relevant here. That looks like a lot of named locations. I think that is inevitable, as argued above. > For now, I would be happy to just make a proof of concept not caring > about backward compatibility in a pip branch. Does that sound like a > workable basis to flush out an actual proposal ? Looking forward to it. Thanks! -- \ “Teach a man to make fire, and he will be warm for a day. Set a | `\ man on fire, and he will be warm for the rest of his life.” | _o__) —John A. Hrastar | Ben Finney From ncoghlan at gmail.com Wed May 13 10:36:56 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 13 May 2015 18:36:56 +1000 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: <8538319ezj.fsf_-_@benfinney.id.au> References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 13 May 2015 at 16:19, Ben Finney wrote: > Chris Barker writes: > >> On Tue, Apr 14, 2015 at 8:41 AM, Nick Coghlan wrote: >> >> > The point where I draw the line is supporting *dynamic* linking >> > between modules - >> >> I'm confused -- you don't want a system to be able to install ONE version >> of a lib that various python packages can all link to? That's really the >> key use-case for me.... > > Agreed.
A key pain point for Python distributions is the lack of support > for installing *one* instance of a Python library, and other Python > modules able to discover such installed libraries which meet their > declared dependency. Are we talking about Python libraries accessed via Python APIs, or linking to external dependencies not written in Python (including linking directly to C libraries shipped with a Python library)? It's the latter I consider to be out of scope for a language specific packaging system - Python packaging dependencies are designed to describe inter-component dependencies based on the Python import system, not dependencies based on the operating system provided C/C++ dynamic linking system. If folks are after the latter, then they want a language independent package system, like conda, nix, or the system package manager in a Linux distribution. > For example: > > * Python distribution “foo”, for Python implementation “D” on > architecture “X”, declares dependency on “bar >= 1.7”. > > * Installing Python distribution “bar” version 1.8, on a host running > Python “D” for architecture “X”, goes to a single instance for the > “bar” library for that architecture and Python implementation. > > * Invoking the “foo” code on the same host will go looking > (dynamically?) for the dependency “bar” and find version 1.8 already > installed in the one instance on that host. It uses that and all is > well. > > I'm in agreement with Chris that, while the above example may not > currently play out as described, that is a fault to be fixed by > improving Python's packaging and distribution tools so that it *is* a > first-class use case. > > Nick, you seem to be arguing against that. Can you clarify? I'm arguing against supporting direct C level dependencies between packages that rely on dynamic linking to find each other rather than going through the Python import system, as I consider that the point where you cross the line into defining a new platform of your own, rather than providing components that can plug into a range of platforms. (Another way of looking at this: if a tool can manage the Python runtime in addition to Python modules, it's a full-blown arbitrary software distribution platform, not just a Python package manager). Defining cross-platform ABIs (cf. http://bugs.python.org/issue23966) is an unholy mess that will be quite willing to consume vast amounts of time without a great deal to show for it beyond what can already be achieved more easily by telling people to just use one of the many existing systems designed specifically to solve that problem (with conda being my default recommendation if you care about Windows, and nix being my recommendation if you only care about *nix systems). Integrator oriented packaging tools and developer oriented packaging tools solve different problems for different groups of people, so I'm firmly of the opinion that trying to solve both sets of problems with a single tool will produce a result that doesn't work as well for *either* use case as separate tools can. Cheers, Nick. P.S.
The ABI definition problem is at least somewhat manageable for Windows and Mac OS X desktop/laptop environments (since you can mostly pretend that architectures other than x86_64 don't exist, with perhaps some grudging concessions to the existence of 32-bit mode), but beyond those two, things get very messy, very fast - identifying CPU architectures, CPU operating modes and kernel syscall interfaces correctly is still a hard problem in the Linux distribution space, and they've been working at it a lot longer than we have (and that's *without* getting into things like determining which vectorisation instructions are available). Folks often try to "deal" with this complexity by wishing it away, but the rise of aarch64 and IBM's creation of the OpenPOWER Foundation is making the data centre space interesting again, while in the mobile and embedded spaces it's ARM that is the default, with x86_64 attempting to make inroads. As a result of all that, distributing software that uses dynamically linked dependencies is genuinely difficult, to the point where even operating system vendors struggle to get it right. This is why "statically link all the things" keeps coming back in various guises (whether that's Go's lack of dynamic linking support, or the surge in adoption of Linux containers as a deployment technique), despite the fact that these techniques inevitably bring back the *other* problems that led to the invention of dynamic linking in the first place. The only solution that is known to work reliably for dynamic linking is to have a curated set of packages all built by the same build system, so you know they're using consistent build settings. Linux distributions provide this, as do multi-OS platforms like nix and conda. We *might* be able to provide it for Python someday if PyPI ever gets an integrated build farm, but that's still a big "if" at this point. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From holger at merlinux.eu Wed May 13 16:16:02 2015 From: holger at merlinux.eu (holger krekel) Date: Wed, 13 May 2015 14:16:02 +0000 Subject: [Distutils] new devpi-server/web/client/common: wheel testing, SHA256, ... Message-ID: <20150513141602.GS28354@merlinux.eu> devpi-{server-2.2,web-2.3,client-2.2}: plugins, wheel support, pypi compat ============================================================================ With devpi-server-2.2, devpi-web-2.3, devpi-client-2.2 you'll get a host of fixes and improvements as well as some major new features for the private pypi system: - refined devpi-server plugin architecture based on "pluggy", the plugin and hook mechanism factored out from pytest. - devpi-client "test" now support testing of universal wheels IFF you have have an sdist alongside your wheel. python-specific wheels are not supported at this point. - devpi client "refresh NAME" allows to force a refresh of a pypi package from the command line. - internal refactorings to allow future pypi compatibility (for when it changes to SHA256 checksums). Also we use SHA256 checksumming ourselves now. For many more changes and fixes, please see the respective CHANGELOG of the respective packages. UPGRADE note: devpi-server-2.2 requires to ``--export`` your 2.1 server state and then using ``--import`` with the new version before you can serve your private packages through devpi-server-2.2. UPGRADE note: if you use ``updatetrigger_jenkins``, you have to install the devpi-jenkins plugin. You will get an error during import if you forget it. 
For docs and quickstart tutorials see http://doc.devpi.net many thanks to Florian Schulze who co-implemented many of the new features. And special thanks go to the two companies who funded major parts of the above work. have fun, holger krekel, merlinux GmbH From chris.barker at noaa.gov Thu May 14 20:01:01 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 14 May 2015 11:01:01 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: > > >> I'm confused -- you don't want a system to be able to install ONE > version > >> of a lib that various python packages can all link to? That's really the > >> key use-case for me.... > > Are we talking about Python libraries accessed via Python APIs, or > linking to external dependencies not written in Python (including > linking directly to C libraries shipped with a Python library)? > I, at least, am talking about the latter. for a concrete example: libpng, for instance, might be needed by PIL, wxPython, Matplotlib, and who knows what else. At this point, if you want to build a package of any of these, you need to statically link it into each of them, or distribute shared libs with each package -- if you ware using them all together (which I do, anyway) you now have three copies of the same lib (but maybe different versions) all linked into your executable. Maybe there is no downside to that (I haven't had a problem yet), but it seems like a bad way to do it! It's the latter I consider to be out of scope for a language specific > packaging system Maybe, but it's a problem to be solved, and the Linux distros more or less solve it for us, but OS-X and Windows have no such system built in (OS-X does have Brew and macports....) > - Python packaging dependencies are designed to > describe inter-component dependencies based on the Python import > system, not dependencies based on the operating system provided C/C++ > dynamic linking system. I think there is a bit of fuzz here -- cPython, at least, uses the "the operating system provided C/C++ dynamic linking system" -- it's not a totally independent thing. If folks are after the latter, than they want > a language independent package system, like conda, nix, or the system > package manager in a Linux distribution. And I am, indeed, focusing on conda lately for this reason -- but not all my users want to use a whole new system, they just want to "pip install" and have it work. And if you are using something like conda you don't need pip or wheels anyway! > I'm arguing against supporting direct C level dependencies between > packages that rely on dynamic linking to find each other rather than > going through the Python import system, Maybe there is a mid ground. For instance, I have a complex wrapper system around a bunch of C++ code. There are maybe 6 or 7 modules that all need to link against that C++ code. On OS-X (and I think Linux, I haven't been doing those builds), we can statically link all the C++ into one python module -- then, as long as that python module is imported before the others, they will all work, and all use that same already loaded version of that library. 
(this doesn't work so nicely on Windows, unfortunately, so there, we build a dll, and have all the extensions link to it, then put the dll somewhere it gets found -- a little fuzzy on those details) So option (1) for something like libpng is to have a compiled python module that is little but a something that can be linked to ibpng, so that it can be found and loaded by cPython on import, and any other modules can then expect it to be there. This is a big old kludge, but I think could be done with little change to anything in Python or wheel, or...but it would require changes to how each package that use that lib sets itself up and checks for and install dependencies -- maybe not really possible. and it would be better if dependencies could be platform independent, which I'm not sure is supported now. option (2) would be to extend python's import mechanism a bit to allow it to do a raw "link in this arbitrary lib" action, so the lib would not have to be wrapped in a python module -- I don't know how possible that is, or if it would be worth it. > (Another way of looking at this: if a tool can manage the > Python runtime in addition to Python modules, it's a full-blown > arbitrary software distribution platform, not just a Python package > manager). > sure, but if it's ALSO a Python package manger, then why not? i.e. conda -- if we all used conda, we wouldn't need pip+wheel. > Defining cross-platform ABIs (cf. http://bugs.python.org/issue23966) > This is a mess that you need to deal with for ANY binary package -- that's why we don't distribute binary wheels on pypi for Linux, yes? > I'm > firmly of the opinion that trying to solve both sets of problems with > a single tool will produce a result that doesn't work as well for > *either* use case as separate tools can. > I'm going to point to conda again -- it solves both problems, and it's better to use it for all your packages than mingling it with pip (though you CAN mingle it with pip...). So if we say "pip and friends are not going to do that", then we are saying: we don't support a substantial class of packages, and then I wonder what the point is to supporting binary packages at all? P.S. The ABI definition problem is at least somewhat manageable for > Windows and Mac OS X desktop/laptop environments Ah -- here is a key point -- because of that, we DO support binary packages on PyPi -- but only for Windows and OS-X.. I'm just suggesting we find a way to extend that to pacakges that require a non-system non-python dependency. but beyond > those two, things get very messy, very fast - identifying CPU > architectures, CPU operating modes and kernel syscall interfaces > correctly is still a hard problem in the Linux distribution space right -- but where I am confused is where the line is drawn -- it seem sto be the line is REALLY drawn at "yuo need to compile some C (or Fortran, or???) code, rather than at "you depend on another lib" -- the C code, whether it is a third party lib, or part of your extension, still needs to be compiled to match the host platform. and that's > *without* getting into things like determining which vectorisation > instructions are available). yup -- that's why we don't have binary wheels for numpy up on PyPi at this point..... but the rise of aarch64 and IBM's > creation of the OpenPOWER Foundation is making the data centre space > interesting again, while in the mobile and embedded spaces it's ARM > that is the default, with x86_64 attempting to make inroads. > Are those the targets for binary wheels? 
I don't think so. > This is why > "statically link all the things" keeps coming back in various guises > but if you statically link, you need to build the static package right anyway -- so it doesn't actually solve the problem at hand anyway. The only solution that is known to work reliably for dynamic linking > is to have a curated set of packages all built by the same build > system, so you know they're using consistent build settings. Linux > distributions provide this, as do multi-OS platforms like nix and > conda. We *might* be able to provide it for Python someday if PyPI > ever gets an integrated build farm, but that's still a big "if" at > this point. > Ah -- here is the issue -- but I think we HAVE pretty much got what we need here -- at least for Windows and OS-X. It depends what you mean by "curated", but it seems we have a (defacto?) policy for PyPi: binary wheels should be compatible with the python.org builds. So while each package wheel is supplied by the package maintainer one way or another, rather than by a central entity, it is more or less curated -- or at least standardized. And if you are going to put a binary wheel up, you need to make sure it matches -- and that is less than trivial for packages that require a third party dependency -- but building the lib statically and then linking it in is not inherently easier than doing a dynamic link. OK -- I just remembered the missing link for doing what I proposed above for third party dynamic libs: at this point dependencies are tied to a particular package -- whereas my plan above would require a dependency ties to particular wheel, not the package as a whole. i.e: my mythical matplotlib wheel on OS-X would depend on a py_libpng module -- which could be provided as separate binary wheel. but matplotlib in general would not have that dependency -- for instance, on Linux, folks would want it to build against the system lib, and not have another dependency. Even on OS-X, homebrew users would want it to build against the homebrew lib, etc... So what would be good is a way to specify a "this build" dependency. That can be hacked in, of course, but nicer not to have to. NOTE; we ran into this with readline and iPython wheels -- I can't remember how that was resolved. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From robertc at robertcollins.net Fri May 15 01:41:07 2015 From: robertc at robertcollins.net (Robert Collins) Date: Fri, 15 May 2015 11:41:07 +1200 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 15 May 2015 at 06:01, Chris Barker wrote: >> >> I'm confused -- you don't want a system to be able to install ONE >> >> version >> >> of a lib that various python packages can all link to? That's really >> >> the >> >> key use-case for me.... > > >> >> Are we talking about Python libraries accessed via Python APIs, or >> linking to external dependencies not written in Python (including >> linking directly to C libraries shipped with a Python library)? > > > I, at least, am talking about the latter. 
for a concrete example: libpng, > for instance, might be needed by PIL, wxPython, Matplotlib, and who knows > what else. At this point, if you want to build a package of any of these, > you need to statically link it into each of them, or distribute shared libs > with each package -- if you ware using them all together (which I do, > anyway) you now have three copies of the same lib (but maybe different > versions) all linked into your executable. Maybe there is no downside to > that (I haven't had a problem yet), but it seems like a bad way to do it! If they are exchanging data structures, it will break at some point. Consider libpng; say that you have a handle to the native C struct for it in PIL, and you pass the wrapping Python object for it to Matplotlib but the struct changed between the version embedded in PIL and that in Matplotlib. Boom. If you communicate purely via Python objects that get remarshalled within each lib its safe (perhaps heavy on the footprint, but safe). -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From chris.barker at noaa.gov Fri May 15 01:54:42 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 14 May 2015 16:54:42 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Thu, May 14, 2015 at 4:41 PM, Robert Collins wrote: > > anyway) you now have three copies of the same lib (but maybe different > > versions) all linked into your executable. Maybe there is no downside to > > that (I haven't had a problem yet), but it seems like a bad way to do it! > > If they are exchanging data structures, it will break at some point. > Consider libpng; say that you have a handle to the native C struct for > it in PIL, and you pass the wrapping Python object for it to > Matplotlib but the struct changed between the version embedded in PIL > and that in Matplotlib. Boom. > > If you communicate purely via Python objects that get remarshalled > within each lib its safe (perhaps heavy on the footprint, but safe). > As far as I know -- no one tries to pass, say, libpng structure pointers around between different python packages. You are right, that would be pretty insane! the best you can do is pass python buffer objects around so you are not copying data where you don't need to. But maybe there is a use-case for passing a native lib data structure around, in which case, yes, you'd really want the lib versions to match -- yes! I suppose if I were to do this, I'd do a run-time check on the lib version number... not sure how else you could be safe in Python-land. So maybe the only real downside is some wasted disk space an memory, which are pretty cheap thee days -- but I stil l don't like it ;-) But the linker/run time/whatever can keep track of which version of a given function is called where? -CHB > -Rob > > > -- > Robert Collins > Distinguished Technologist > HP Converged Cloud > -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From p.f.moore at gmail.com Fri May 15 10:49:24 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 15 May 2015 09:49:24 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 14 May 2015 at 19:01, Chris Barker wrote: > Ah -- here is the issue -- but I think we HAVE pretty much got what we need > here -- at least for Windows and OS-X. It depends what you mean by > "curated", but it seems we have a (defacto?) policy for PyPi: binary wheels > should be compatible with the python.org builds. So while each package wheel > is supplied by the package maintainer one way or another, rather than by a > central entity, it is more or less curated -- or at least standardized. And > if you are going to put a binary wheel up, you need to make sure it matches > -- and that is less than trivial for packages that require a third party > dependency -- but building the lib statically and then linking it in is not > inherently easier than doing a dynamic link. I think the issue is that, if we have 5 different packages that depend on (say) libpng, and we're using dynamic builds, then how do those packages declare that they need access to libpng.dll? And on Windows, where does the user put libpng.dll so that it gets picked up? And how does a non-expert user do this ("put it in $DIRECTORY, update your PATH, blah blah blah" doesn't work for the average user)? In particular, on Windows, note that the shared DLL must either be in the directory where the executable is located (which is fun when you have virtualenvs, embedded interpreters, etc), or on PATH (which has other implications - suppose I have an incompatible version of libpng.dll, from mingw, say, somewhere earlier on PATH). The problem isn't so much defining a standard ABI that shared DLLs need - as you say, that's a more or less solved problem on Windows - it's managing how those shared DLLs are made available to Python extensions. And *that* is what Unix package managers do for you, and Windows doesn't have a good solution for (other than "bundle all the dependent DLLs with the app, or suffer DLL hell"). Paul PS For a fun exercise, it might be interesting to try breaking conda - find a Python extension which uses a shared DLL, and check that it works. Then grab an incompatible copy of that DLL (say a 32-bit version on a 64-bit system) and try hacking around with PATH, putting the incompatible DLL in a directory earlier on PATH than the correct one, in the Windows directory, use an embedded interpreter like mod_wsgi, tricks like that. If conda survives that, then the solution that they use might be something worth documenting and might offer an approach to solving the issue I described above. If it *doesn't* survive, then that probably implies that the general environment pip has to work in is less forgiving than the curated environment conda manages (which is, of course, the whole point of using conda - to get that curated environment :-)) From donald at stufft.io Fri May 15 15:48:21 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 15 May 2015 09:48:21 -0400 Subject: [Distutils] PyPI and Uploading Documentation Message-ID: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Hey! 
First, for anyone who isn't aware, we recently migrated PyPI and TestPyPI so that instead of storing files and documentation locally (really in a glusterfs cluster) they will store them inside of S3. This will reduce the maintenance overhead of running PyPI by two servers, since we'll no longer need to run our own glusterfs cluster, as well as improve the reliability and scalability of the PyPI service as a whole, since we've had nothing but problems from glusterfs in this regard. One of the things that this brought to light was that the documentation upload ability in PyPI is something that is not often used*; however, it is one of our slowest routes. It's not a well supported feature and I feel that it's going outside of the core competency of PyPI itself; instead PyPI should be focused on the files themselves. In addition, since the time this was added to PyPI a number of free or cheap services have come about that allow people to sanely upload raw documents without a reliance on any particular documentation system, and we've also had the rise of ReadTheDocs for when someone is using Sphinx as their documentation system. I think that it's time to retire this aspect of PyPI which has never been well supported and instead focus on just the things that are core to PyPI. I don't have a fully concrete proposal for doing this, but I wanted to reach out here and figure out if anyone had any ideas. The rough idea I have currently is to simply disable new documentation uploads and add two new small features. One will allow users to delete their existing documentation from PyPI and the other would allow them to register a redirect which would take them from the current location to wherever they move their documentation to. In order to prevent breaking documentation for projects which are defunct or not actively maintained we would maintain the archived documentation (sans what anyone has deleted) indefinitely. Ideally I hope people start to use ReadTheDocs instead of PyPI itself. I think that ReadTheDocs is a great service with heavy ties to the Python community. They will do a better job at hosting documentation than PyPI ever could since that is their core goal. In addition there is a dialog between ReadTheDocs and PyPI where there is an opportunity to add integration between the two sites as well as features to ReadTheDocs that it currently lacks that people feel are a requirement before we move PyPI's documentation to read-only. Thoughts? * Out of ~60k projects only ~2.8k have ever uploaded documentation. It's not easy to tell if all of them are still using it as their primary source of documentation though or if it's old documentation that they just can't delete. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From jim at zope.com Fri May 15 15:54:53 2015 From: jim at zope.com (Jim Fulton) Date: Fri, 15 May 2015 09:54:53 -0400 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Message-ID: On Fri, May 15, 2015 at 9:48 AM, Donald Stufft wrote: > Hey!
> > First, for anyone who isn't aware we recently migrated PyPI and TestPyPI so > that instead of storing files and documentation locally (really in a glusterfs > cluster) it will store them inside of S3. This will reduce maintenance overhead > of running PyPI by two servers since we'll no longer need to run our own > glusterfs cluster as well as improve the reliaiblity and scalability of the > PyPI service as a whole since we've had nothing but problems from glusterfs in > this regard. > > One of the things that this brought to light was that the documentation > upload ability in PyPI is something that is not often used* however it > represents something which is one of our slowest routes. It's not a well > supported feature and I feel that it's going outside of the core competancy for > PyPI itself and instead PyPI should be focused on the files themselves. In > addition since the time this was added to PyPI a number of free services or > cheap services have came about that allow people to sanely upload raw document > without a reliance on any particular documentation system and we've also had > the rise of ReadTheDocs for when someone is using Sphinx as their documentation > system. > > I think that it's time to retire this aspect of PyPI which has never been well > supported and instead focus on just the things that are core to PyPI. I don't > have a fully concrete proposal for doing this, but I wanted to reach out here > and figure out if anyone had any ideas. The rough idea I have currently is to > simply disable new documentation uploads and add two new small features. One > will allow users to delete their existing documentation from PyPI and the other > would allow them to register a redirect which would take them from the current > location to wherever they move their documentation too. In order to prevent > breaking documentation for projects which are defunct or not actively > maintained we would maintain the archived documentation (sans what anyone has > deleted) indefinetely. > > Ideally I hope people start to use ReadTheDocs instead of PyPI itself. I think > that ReadTheDocs is a great service with heavy ties to the Python community. > They will do a better job at hosting documentation than PyPI ever could since > that is their core goal. In addition there is a dialog between ReadTheDocs and > PyPI where there is an opportunity to add integration between the two sites as > well as features to ReadTheDocs that it currently lacks that people feel are a > requirement before we move PyPI's documentation to read-only. > > Thoughts? +1 > * Out of ~60k projects only ~2.8k have ever uploaded documentation. It's not > easy to tell if all of them are still using it as their primary source of > documentation though or if it's old documentation that they just can't > delete. I know I have documentation for at least one project hosted this way. I don't remember how I set that up. :) I assume there will be some way to notify owners of effected documentation. Jim -- Jim Fulton http://www.linkedin.com/in/jimfulton From graffatcolmingov at gmail.com Fri May 15 15:55:36 2015 From: graffatcolmingov at gmail.com (Ian Cordasco) Date: Fri, 15 May 2015 08:55:36 -0500 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Message-ID: On Fri, May 15, 2015 at 8:48 AM, Donald Stufft wrote: > Hey! 
> > First, for anyone who isn't aware we recently migrated PyPI and TestPyPI so > that instead of storing files and documentation locally (really in a glusterfs > cluster) it will store them inside of S3. This will reduce maintenance overhead > of running PyPI by two servers since we'll no longer need to run our own > glusterfs cluster as well as improve the reliaiblity and scalability of the > PyPI service as a whole since we've had nothing but problems from glusterfs in > this regard. > > One of the things that this brought to light was that the documentation > upload ability in PyPI is something that is not often used* however it > represents something which is one of our slowest routes. It's not a well > supported feature and I feel that it's going outside of the core competancy for > PyPI itself and instead PyPI should be focused on the files themselves. In > addition since the time this was added to PyPI a number of free services or > cheap services have came about that allow people to sanely upload raw document > without a reliance on any particular documentation system and we've also had > the rise of ReadTheDocs for when someone is using Sphinx as their documentation > system. > > I think that it's time to retire this aspect of PyPI which has never been well > supported and instead focus on just the things that are core to PyPI. I don't > have a fully concrete proposal for doing this, but I wanted to reach out here > and figure out if anyone had any ideas. The rough idea I have currently is to > simply disable new documentation uploads and add two new small features. One > will allow users to delete their existing documentation from PyPI and the other > would allow them to register a redirect which would take them from the current > location to wherever they move their documentation too. In order to prevent > breaking documentation for projects which are defunct or not actively > maintained we would maintain the archived documentation (sans what anyone has > deleted) indefinetely. > > Ideally I hope people start to use ReadTheDocs instead of PyPI itself. I think > that ReadTheDocs is a great service with heavy ties to the Python community. > They will do a better job at hosting documentation than PyPI ever could since > that is their core goal. In addition there is a dialog between ReadTheDocs and > PyPI where there is an opportunity to add integration between the two sites as > well as features to ReadTheDocs that it currently lacks that people feel are a > requirement before we move PyPI's documentation to read-only. > > Thoughts? > > * Out of ~60k projects only ~2.8k have ever uploaded documentation. It's not > easy to tell if all of them are still using it as their primary source of > documentation though or if it's old documentation that they just can't > delete. > > > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > I'm +1 on reducing the responsibilities of PyPI so it can act as an index/repository in a much more efficient manner. I'm also +1 on recommending people use ReadTheDocs. It supports more than just Sphinx so it's a rather flexible option. It's also open source, which means that anyone can contribute to it. I'm curious to hear more about integrations between PyPI and ReadTheDocs but I fully understand if they're not concrete enough to be worthy of discussion. 
-- Ian From dholth at gmail.com Fri May 15 16:20:42 2015 From: dholth at gmail.com (Daniel Holth) Date: Fri, 15 May 2015 10:20:42 -0400 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Message-ID: I'm using pypi's documentation hosting for pysdl2-cffi because I thought it would be too difficult to run the documentation generator (which parses documentation comments out of the wrapped C code) on the readthedocs server. Perhaps there is a different way to do it that I'm not familiar with. On Fri, May 15, 2015 at 9:55 AM, Ian Cordasco wrote: > On Fri, May 15, 2015 at 8:48 AM, Donald Stufft wrote: >> Hey! >> >> First, for anyone who isn't aware we recently migrated PyPI and TestPyPI so >> that instead of storing files and documentation locally (really in a glusterfs >> cluster) it will store them inside of S3. This will reduce maintenance overhead >> of running PyPI by two servers since we'll no longer need to run our own >> glusterfs cluster as well as improve the reliaiblity and scalability of the >> PyPI service as a whole since we've had nothing but problems from glusterfs in >> this regard. >> >> One of the things that this brought to light was that the documentation >> upload ability in PyPI is something that is not often used* however it >> represents something which is one of our slowest routes. It's not a well >> supported feature and I feel that it's going outside of the core competancy for >> PyPI itself and instead PyPI should be focused on the files themselves. In >> addition since the time this was added to PyPI a number of free services or >> cheap services have came about that allow people to sanely upload raw document >> without a reliance on any particular documentation system and we've also had >> the rise of ReadTheDocs for when someone is using Sphinx as their documentation >> system. >> >> I think that it's time to retire this aspect of PyPI which has never been well >> supported and instead focus on just the things that are core to PyPI. I don't >> have a fully concrete proposal for doing this, but I wanted to reach out here >> and figure out if anyone had any ideas. The rough idea I have currently is to >> simply disable new documentation uploads and add two new small features. One >> will allow users to delete their existing documentation from PyPI and the other >> would allow them to register a redirect which would take them from the current >> location to wherever they move their documentation too. In order to prevent >> breaking documentation for projects which are defunct or not actively >> maintained we would maintain the archived documentation (sans what anyone has >> deleted) indefinetely. >> >> Ideally I hope people start to use ReadTheDocs instead of PyPI itself. I think >> that ReadTheDocs is a great service with heavy ties to the Python community. >> They will do a better job at hosting documentation than PyPI ever could since >> that is their core goal. In addition there is a dialog between ReadTheDocs and >> PyPI where there is an opportunity to add integration between the two sites as >> well as features to ReadTheDocs that it currently lacks that people feel are a >> requirement before we move PyPI's documentation to read-only. >> >> Thoughts? >> >> * Out of ~60k projects only ~2.8k have ever uploaded documentation. 
It's not >> easy to tell if all of them are still using it as their primary source of >> documentation though or if it's old documentation that they just can't >> delete. >> >> >> --- >> Donald Stufft >> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA >> >> >> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> https://mail.python.org/mailman/listinfo/distutils-sig >> > > I'm +1 on reducing the responsibilities of PyPI so it can act as an > index/repository in a much more efficient manner. I'm also +1 on > recommending people use ReadTheDocs. It supports more than just Sphinx > so it's a rather flexible option. It's also open source, which means > that anyone can contribute to it. > > I'm curious to hear more about integrations between PyPI and > ReadTheDocs but I fully understand if they're not concrete enough to > be worthy of discussion. > > -- > Ian > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig From barry at python.org Fri May 15 17:38:48 2015 From: barry at python.org (Barry Warsaw) Date: Fri, 15 May 2015 11:38:48 -0400 Subject: [Distutils] PyPI and Uploading Documentation References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Message-ID: <20150515113848.1e2003b0@anarchist.wooz.org> On May 15, 2015, at 09:48 AM, Donald Stufft wrote: >One of the things that this brought to light was that the documentation >upload ability in PyPI is something that is not often used* however it >represents something which is one of our slowest routes. I use it for all my packages, mostly because it's easy for my upload workflow: `python setup.py upload_docs`. That said, with the rise of RTD, I have wondered about the usefulness of pythonhosted documentation. And because twine supports secure uploads of code, but not documentation, that unease has grown. So even while I use it, I agree it's time to consider retiring the service. One thing I definitely want to retain though is the link to "Package Documentation" from the project's PyPI page. Please do give us a way to specify that link. The PSF is a supporter of RTD, but let's all make sure they stay in business! https://readthedocs.org/sustainability/#about Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From erik.m.bray at gmail.com Fri May 15 20:23:42 2015 From: erik.m.bray at gmail.com (Erik Bray) Date: Fri, 15 May 2015 14:23:42 -0400 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Message-ID: On Fri, May 15, 2015 at 9:48 AM, Donald Stufft wrote: > Hey! > > First, for anyone who isn't aware we recently migrated PyPI and TestPyPI so > that instead of storing files and documentation locally (really in a glusterfs > cluster) it will store them inside of S3. This will reduce maintenance overhead > of running PyPI by two servers since we'll no longer need to run our own > glusterfs cluster as well as improve the reliaiblity and scalability of the > PyPI service as a whole since we've had nothing but problems from glusterfs in > this regard. 
> > One of the things that this brought to light was that the documentation > upload ability in PyPI is something that is not often used* however it > represents something which is one of our slowest routes. It's not a well > supported feature and I feel that it's going outside of the core competancy for > PyPI itself and instead PyPI should be focused on the files themselves. In > addition since the time this was added to PyPI a number of free services or > cheap services have came about that allow people to sanely upload raw document > without a reliance on any particular documentation system and we've also had > the rise of ReadTheDocs for when someone is using Sphinx as their documentation > system. > > I think that it's time to retire this aspect of PyPI which has never been well > supported and instead focus on just the things that are core to PyPI. I don't > have a fully concrete proposal for doing this, but I wanted to reach out here > and figure out if anyone had any ideas. The rough idea I have currently is to > simply disable new documentation uploads and add two new small features. One > will allow users to delete their existing documentation from PyPI and the other > would allow them to register a redirect which would take them from the current > location to wherever they move their documentation too. In order to prevent > breaking documentation for projects which are defunct or not actively > maintained we would maintain the archived documentation (sans what anyone has > deleted) indefinetely. > > Ideally I hope people start to use ReadTheDocs instead of PyPI itself. I think > that ReadTheDocs is a great service with heavy ties to the Python community. > They will do a better job at hosting documentation than PyPI ever could since > that is their core goal. In addition there is a dialog between ReadTheDocs and > PyPI where there is an opportunity to add integration between the two sites as > well as features to ReadTheDocs that it currently lacks that people feel are a > requirement before we move PyPI's documentation to read-only. > > Thoughts? > > * Out of ~60k projects only ~2.8k have ever uploaded documentation. It's not > easy to tell if all of them are still using it as their primary source of > documentation though or if it's old documentation that they just can't > delete. +1 for all the stated reasons. I have a few docs hosted on pythonhosted.org, but it's become a nuisance to maintain since it does not support multiple doc versions like ReadTheDocs, so now I've wound up with documentation for the same projects on both sites. The nuisance comes not so much in the process (like Barry wrote, I've enjoyed the simplicity of `setup.py upload_docs`), but because more often than not I've had to redirect users to the Readthedocs docs to make sure they're using the correct version of the docs. So I wish I were not locked into updating the pythonhosted.org docs and would be happy to retire them altogether (much as I appreciated the service). One question is how this would be handled at the tooling end. setup.py upload_docs would have to be retired somehow. Though it might also be nice if some simple tools were added to make it just as easy to add docs to ReadTheDocs. I know something like upload_docs doesn't really make sense, since RTD handles the checkout and build of the docs. But there's still a manual step of enabling new versions of the docs that it would be nice to make as effortless as `setup.py upload_docs`. I gues that's off-topic for the PyPI end of things though. 
Erik From donald at stufft.io Fri May 15 20:34:44 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 15 May 2015 14:34:44 -0400 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Message-ID: <4B6FBEC7-3A3B-4CFC-9452-090F9B9B70D3@stufft.io> > On May 15, 2015, at 2:23 PM, Erik Bray wrote: > > On Fri, May 15, 2015 at 9:48 AM, Donald Stufft wrote: >> Hey! >> >> First, for anyone who isn't aware we recently migrated PyPI and TestPyPI so >> that instead of storing files and documentation locally (really in a glusterfs >> cluster) it will store them inside of S3. This will reduce maintenance overhead >> of running PyPI by two servers since we'll no longer need to run our own >> glusterfs cluster as well as improve the reliaiblity and scalability of the >> PyPI service as a whole since we've had nothing but problems from glusterfs in >> this regard. >> >> One of the things that this brought to light was that the documentation >> upload ability in PyPI is something that is not often used* however it >> represents something which is one of our slowest routes. It's not a well >> supported feature and I feel that it's going outside of the core competancy for >> PyPI itself and instead PyPI should be focused on the files themselves. In >> addition since the time this was added to PyPI a number of free services or >> cheap services have came about that allow people to sanely upload raw document >> without a reliance on any particular documentation system and we've also had >> the rise of ReadTheDocs for when someone is using Sphinx as their documentation >> system. >> >> I think that it's time to retire this aspect of PyPI which has never been well >> supported and instead focus on just the things that are core to PyPI. I don't >> have a fully concrete proposal for doing this, but I wanted to reach out here >> and figure out if anyone had any ideas. The rough idea I have currently is to >> simply disable new documentation uploads and add two new small features. One >> will allow users to delete their existing documentation from PyPI and the other >> would allow them to register a redirect which would take them from the current >> location to wherever they move their documentation too. In order to prevent >> breaking documentation for projects which are defunct or not actively >> maintained we would maintain the archived documentation (sans what anyone has >> deleted) indefinetely. >> >> Ideally I hope people start to use ReadTheDocs instead of PyPI itself. I think >> that ReadTheDocs is a great service with heavy ties to the Python community. >> They will do a better job at hosting documentation than PyPI ever could since >> that is their core goal. In addition there is a dialog between ReadTheDocs and >> PyPI where there is an opportunity to add integration between the two sites as >> well as features to ReadTheDocs that it currently lacks that people feel are a >> requirement before we move PyPI's documentation to read-only. >> >> Thoughts? >> >> * Out of ~60k projects only ~2.8k have ever uploaded documentation. It's not >> easy to tell if all of them are still using it as their primary source of >> documentation though or if it's old documentation that they just can't >> delete. > > +1 for all the stated reasons. 
> > I have a few docs hosted on pythonhosted.org, but it's become a > nuisance to maintain since it does not support multiple doc versions > like ReadTheDocs, so now I've wound up with documentation for the same > projects on both sites. The nuisance comes not so much in the process > (like Barry wrote, I've enjoyed the simplicity of `setup.py > upload_docs`), but because more often than not I've had to redirect > users to the Readthedocs docs to make sure they're using the correct > version of the docs. So I wish I were not locked into updating the > pythonhosted.org docs and would be happy to retire them altogether > (much as I appreciated the service). > > One question is how this would be handled at the tooling end. > setup.py upload_docs would have to be retired somehow. Though it > might also be nice if some simple tools were added to make it just as > easy to add docs to ReadTheDocs. I know something like upload_docs > doesn't really make sense, since RTD handles the checkout and build of > the docs. But there's still a manual step of enabling new versions of > the docs that it would be nice to make as effortless as `setup.py > upload_docs`. I gues that's off-topic for the PyPI end of things > though. > > Erik So I can?t speak for ReadTheDocs, but I believe that they are considering and/or are planning on offering arbitrary HTML uploads similarly to how you can upload documentation to PyPI. I don?t know if this will actually happen and what it would look like but I know they are thinking about it. As far as retiring upload_docs, the sanest thing to do I think would be to just have PyPI return an error code that upload_docs would display to the end user. The command itself is in use by a few other systems I think so we might not want to remove it wholesale from Python itself (or maybe we do? It?s a hard question since it?s tied to an external service unlike most of the stdlib). --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From merwok at netwok.org Fri May 15 21:02:38 2015 From: merwok at netwok.org (=?windows-1252?Q?=C9ric_Araujo?=) Date: Fri, 15 May 2015 15:02:38 -0400 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: <4B6FBEC7-3A3B-4CFC-9452-090F9B9B70D3@stufft.io> References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> <4B6FBEC7-3A3B-4CFC-9452-090F9B9B70D3@stufft.io> Message-ID: <555642CE.5070505@netwok.org> Le 2015-05-15 14:34, Donald Stufft a ?crit : > As far as retiring upload_docs, the sanest thing to do I think would be > to just have PyPI return an error code that upload_docs would display > to the end user. The command itself is in use by a few other systems I think > so we might not want to remove it wholesale from Python itself (or maybe > we do? It?s a hard question since it?s tied to an external service unlike > most of the stdlib). upload_docs is implemented by setuptools, not distutils. Cheers From robertc at robertcollins.net Fri May 15 20:57:22 2015 From: robertc at robertcollins.net (Robert Collins) Date: Sat, 16 May 2015 06:57:22 +1200 Subject: [Distutils] PyPI is a sick sick hoarder Message-ID: So, I am working on pip issue 988: pip doesn't resolve packages at all. This is O(packages^alternatives_per_package): if you are resolving 10 packages with 10 versions each, there are approximately 10^10 or 10G combinations. 
10 packages with 100 versions each - 10^100. So - it's going to depend pretty heavily on some good heuristics in whatever final algorithm makes its way in, but the problem is exacerbated by PyPI's nature. Most Linux (all that I'm aware of) distributions have at most 5 versions of a package to consider at any time - installed (might be None), current release, current release security updates, new release being upgraded to, new release being upgraded to's security updates. And their common worst case is actually 2 versions: installed==current release and one new release present. They map alternatives out into separate packages (e.g. when an older soname is deliberately kept across an ABI incompatibility, you end up with 2 packages, not 2 versions of one package). So when comparing pip's challenge to apt's: apt has ~20-30K packages, with alternatives ~= 2, or pip has ~60K packages, with alternatives ~= 5.7 (I asked dstufft). Scaling the number of packages is relatively easy; scaling the number of alternatives is harder. Even 300 packages (the dependency tree for OpenStack) is ~2.4T combinations to probe. I wonder if it makes sense to give some back-pressure to people, or at the very least encourage them to remove distributions that:
- they don't support anymore
- have security holes
If folk consider PyPI a sort of historical archive then perhaps we could have a feature to select 'supported' versions by the author, and allow a query parameter to ask for all the versions. -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From jcappos at nyu.edu Fri May 15 21:18:10 2015 From: jcappos at nyu.edu (Justin Cappos) Date: Fri, 15 May 2015 15:18:10 -0400 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: One thing to consider is that if conflicts do not exist (or are very rare), the number of possible combinations is a moot point. A greedy algorithm for installation (which just chooses the most favored package to resolve each dependency) will run in linear time with the number of packages it would install, if no conflicts exist. So, what you are saying about state exploration may be true for a resolver that uses something like a SAT solver, but doesn't apply to backtracking dependency resolution (unless a huge number of conflicts occur) or simple dependency resolution (at all). SAT solvers do have heuristics to avoid this blow up, except in pathological cases. However, simple / backtracking dependency resolution systems have the further advantage of not needing to request unneeded metadata in the first place... Thanks, Justin On Fri, May 15, 2015 at 2:57 PM, Robert Collins wrote: > So, I am working on pip issue 988: pip doesn't resolve packages at all. > > This is O(packages^alternatives_per_package): if you are resolving 10 > packages with 10 versions each, there are approximately 10^10 or 10G > combinations. 10 packages with 100 versions each - 10^100. > > So - its going to depend pretty heavily on some good heuristics in > whatever final algorithm makes its way in, but the problem is > exacerbated by PyPI's nature. > > Most Linux (all that i'm aware of) distributions have at most 5 > versions of a package to consider at any time - installed(might be > None), current release, current release security updates, new release > being upgraded to, new release being upgraded to's security updates. > And their common worst case is actually 2 versions: installed==current > release and one new release present.
They map alternatives out into > separate packages (e.g. when an older soname is deliberately kept > across an ABI incompatibility, you end up with 2 packages, not 2 > versions of one package). To when comparing pip's challenge to apt's: > apt has ~20-30K packages, with altnernatives ~= 2, or > pip has ~60K packages, with alternatives ~= 5.7 (I asked dstufft) > > Scaling the number of packages is relatively easy; scaling the number > of alternatives is harder. Even 300 packages (the dependency tree for > openstack) is ~2.4T combinations to probe. > > I wonder if it makes sense to give some back-pressure to people, or at > the very least encourage them to remove distributions that: > - they don't support anymore > - have security holes > > If folk consider PyPI a sort of historical archive then perhaps we > could have a feature to select 'supported' versions by the author, and > allow a query parameter to ask for all the versions. > > -Rob > > -- > Robert Collins > Distinguished Technologist > HP Converged Cloud > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Fri May 15 21:19:37 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 15 May 2015 15:19:37 -0400 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: <15D6987E-2B8F-41BC-B1CD-74A1054CCE41@stufft.io> > On May 15, 2015, at 2:57 PM, Robert Collins wrote: > > So, I am working on pip issue 988: pip doesn't resolve packages at all. > > This is O(packages^alternatives_per_package): if you are resolving 10 > packages with 10 versions each, there are approximately 10^10 or 10G > combinations. 10 packages with 100 versions each - 10^100. > > So - its going to depend pretty heavily on some good heuristics in > whatever final algorithm makes its way in, but the problem is > exacerbated by PyPI's nature. > > Most Linux (all that i'm aware of) distributions have at most 5 > versions of a package to consider at any time - installed(might be > None), current release, current release security updates, new release > being upgraded to, new release being upgraded to's security updates. > And their common worst case is actually 2 versions: installed==current > release and one new release present. They map alternatives out into > separate packages (e.g. when an older soname is deliberately kept > across an ABI incompatibility, you end up with 2 packages, not 2 > versions of one package). To when comparing pip's challenge to apt's: > apt has ~20-30K packages, with altnernatives ~= 2, or > pip has ~60K packages, with alternatives ~= 5.7 (I asked dstufft) > > Scaling the number of packages is relatively easy; scaling the number > of alternatives is harder. Even 300 packages (the dependency tree for > openstack) is ~2.4T combinations to probe. > > I wonder if it makes sense to give some back-pressure to people, or at > the very least encourage them to remove distributions that: > - they don't support anymore > - have security holes > > If folk consider PyPI a sort of historical archive then perhaps we > could have a feature to select 'supported' versions by the author, and > allow a query parameter to ask for all the versions. > There have been a handful of projects which would only keep the latest N versions uploaded to PyPI. 
I know this primarily because it has caused people a decent amount of pain over time. It's common for deployments people have to use a requirements.txt file like ``foo==1.0`` and to just continue to pull from PyPI. Deleting the old files breaks anyone doing that, so it would require either having people bundle their deps in their repositories or some way to get at those old versions. Personally I think that we shouldn't go deleting the old versions or encouraging people to do that. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From robertc at robertcollins.net Fri May 15 21:20:05 2015 From: robertc at robertcollins.net (Robert Collins) Date: Sat, 16 May 2015 07:20:05 +1200 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: On 16 May 2015 at 06:57, Robert Collins wrote: > So, I am working on pip issue 988: pip doesn't resolve packages at all. > > This is O(packages^alternatives_per_package): if you are resolving 10 ... > Scaling the number of packages is relatively easy; scaling the number > of alternatives is harder. Even 300 packages (the dependency tree for > openstack) is ~2.4T combinations to probe. I added a check for the exact number (when the current step limit is hit): Hit step limit during resolving, 22493640689038530013767184665222125808455708963348534886974974630893524036813561125576881299950281714638872640331745747555743820280235291929928862660035516365300612827387994788286647556890876840654454905860390366740480.000000 from 4038 versions in 205 packages after 100000 steps Which indicates an alternatives factor of ~20. And AIUI PyPI has a long tail itself, so it's more common that folk will see > 5.7 factors, rather than less common. -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From ben+python at benfinney.id.au Fri May 15 21:24:04 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Sat, 16 May 2015 05:24:04 +1000 Subject: [Distutils] PyPI is a sick sick hoarder References: <15D6987E-2B8F-41BC-B1CD-74A1054CCE41@stufft.io> Message-ID: <85h9rd1w6j.fsf@benfinney.id.au> Donald Stufft writes: > > On May 15, 2015, at 2:57 PM, Robert Collins wrote: > > > > If folk consider PyPI a sort of historical archive then perhaps we > > could have a feature to select 'supported' versions by the author, > > and allow a query parameter to ask for all the versions. > > > > It's common for deployments people have to use a requirements.txt file > like ``foo==1.0`` and to just continue to pull from PyPI. Deleting the > old files breaks anyone doing that, so it would require either having > people bundle their deps in their repositories or some way to get at > those old versions. Personally I think that we shouldn't go deleting > the old versions or encouraging people to do that. Yes, it's common to consider PyPI as a repository of all versions ever released, and to treat it as an archive whose URLs will continue to make available the historical versions. -- \ "If history and science have taught us anything, it is that | `\ passion and desire are not the same as truth." --E. O.
Wilson, | _o__) _Consilience_, 1998 | Ben Finney From robertc at robertcollins.net Fri May 15 21:26:11 2015 From: robertc at robertcollins.net (Robert Collins) Date: Sat, 16 May 2015 07:26:11 +1200 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: <15D6987E-2B8F-41BC-B1CD-74A1054CCE41@stufft.io> References: <15D6987E-2B8F-41BC-B1CD-74A1054CCE41@stufft.io> Message-ID: On 16 May 2015 at 07:19, Donald Stufft wrote: > There have been a handful of projects which would only keep the latest N > versions uploaded to PyPI. I know this primarily because it has caused > people a decent amount of pain over time. It?s common for deployments people > have to use a requirements.txt file like ``foo==1.0`` and to just continue > to pull from PyPI. Deleting the old files breaks anyone doing that, so it would > require either having people bundle their deps in their repositories or > some way to get at those old versions. Personally I think that we shouldn?t > go deleting the old versions or encouraging people to do that. I think 'most recent only' is too much. Most upstreams will support more than one release. Like - I don't care what testtools release you use. OTOH, every version with distinct dependencies becomes a very expensive liability to the ecosystem here. It's beyond human scale, and well in the territory of argh wtf the universe is burning around me and my tardis has run out of power. I'm sure we can provide an escape hatch in pip (and I'm going to do that in my branch soon - offering simple 'error on conflict' and 'use first seen specifier only' strategies) while folk work on different heuristics - the actual resolver is only ~100 LOC in my branch today - the rest is refactoring (that can be made better and I plan to do so before suggesting we merge it). But a significant contributing factor is the O of the problem, and we can do something about that. I don't know what exactly, and I think we're going to need to have our creative caps firmly on to come up with something meeting the broad needs of the ecosystem: which includes pip Just Working. -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From noah at coderanger.net Fri May 15 21:51:36 2015 From: noah at coderanger.net (Noah Kantrowitz) Date: Fri, 15 May 2015 21:51:36 +0200 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: <15D6987E-2B8F-41BC-B1CD-74A1054CCE41@stufft.io> References: <15D6987E-2B8F-41BC-B1CD-74A1054CCE41@stufft.io> Message-ID: On May 15, 2015, at 9:19 PM, Donald Stufft wrote: >> >> On May 15, 2015, at 2:57 PM, Robert Collins wrote: >> >> So, I am working on pip issue 988: pip doesn't resolve packages at all. >> >> This is O(packages^alternatives_per_package): if you are resolving 10 >> packages with 10 versions each, there are approximately 10^10 or 10G >> combinations. 10 packages with 100 versions each - 10^100. >> >> So - its going to depend pretty heavily on some good heuristics in >> whatever final algorithm makes its way in, but the problem is >> exacerbated by PyPI's nature. >> >> Most Linux (all that i'm aware of) distributions have at most 5 >> versions of a package to consider at any time - installed(might be >> None), current release, current release security updates, new release >> being upgraded to, new release being upgraded to's security updates. >> And their common worst case is actually 2 versions: installed==current >> release and one new release present. They map alternatives out into >> separate packages (e.g. 
when an older soname is deliberately kept >> across an ABI incompatibility, you end up with 2 packages, not 2 >> versions of one package). To when comparing pip's challenge to apt's: >> apt has ~20-30K packages, with altnernatives ~= 2, or >> pip has ~60K packages, with alternatives ~= 5.7 (I asked dstufft) >> >> Scaling the number of packages is relatively easy; scaling the number >> of alternatives is harder. Even 300 packages (the dependency tree for >> openstack) is ~2.4T combinations to probe. >> >> I wonder if it makes sense to give some back-pressure to people, or at >> the very least encourage them to remove distributions that: >> - they don't support anymore >> - have security holes >> >> If folk consider PyPI a sort of historical archive then perhaps we >> could have a feature to select 'supported' versions by the author, and >> allow a query parameter to ask for all the versions. >> > > There have been a handful of projects which would only keep the latest N > versions uploaded to PyPI. I know this primarily because it has caused > people a decent amount of pain over time. It's common for deployments people > have to use a requirements.txt file like ``foo==1.0`` and to just continue > to pull from PyPI. Deleting the old files breaks anyone doing that, so it would > require either having people bundle their deps in their repositories or > some way to get at those old versions. Personally I think that we shouldn't > go deleting the old versions or encouraging people to do that. +1 for this. While I appreciate why Linux distros purge old versions, it is absolutely hellish for reproducibility. If you are looking for prior art, check out the Molinillo project (https://github.com/CocoaPods/Molinillo) used by Bundler and CocoaPods. It is not as complex as the Solve gem used in Chef but offers a good balance of performance in satisfying constraints and false negatives on solution failures. --Noah -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 163 bytes Desc: Message signed with OpenPGP using GPGMail URL: From robertc at robertcollins.net Fri May 15 21:58:26 2015 From: robertc at robertcollins.net (Robert Collins) Date: Sat, 16 May 2015 07:58:26 +1200 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: On 16 May 2015 at 07:18, Justin Cappos wrote: > One thing to consider is that if conflicts do not exist (or are very rare), > the number of possible combinations is a moot point. A greedy algorithm for > installation (which just chooses the most favored package to resolve each > dependency) will run in linear time with the number of packages it would > install, if no conflicts exist. > > So, what you are saying about state exploration may be true for a resolver > that uses something like a SAT solver, but doesn't apply to backtracking > dependency resolution (unless a huge number of conflicts occur) or simple > dependency resolution (at all). SAT solvers do have heuristics to avoid > this blow up, except in pathological cases. However, simple / backtracking > dependency resolution systems have the further advantage of not needing to > request unneeded metadata in the first place... Your intuition here is misleading, sorry :(.
You're right about 'hey, if everything fits it's linear', but the reason we have this bug open, with people adding a new example of it every week or so (as someone who didn't realise that pip doesn't resolve finds out and adds it to pip's issue tracker, only to have it made into a dupe), is that in practice things don't all fit. Backtracking recursive resolvers have exactly the same O as SAT. Example: say I have an ecosystem of 10 packages, A-J. And they do a release every 6 months that is guaranteed to work together, but every time some issue occurs which ends up clamping the group together - e.g. an external release breaks API and so A1's deps are disjoint with A2's, and then the same between A2 and A3. Even though A1's API is compatible with B2's: it's not internal bad code, it's just taking *one* external dep breaking its API. After 2 releases you have 10^2 combinations, but only 4 are valid at all. That's 4%. 8 releases gets you 10^8, 8 valid combinations, or 0.0000008%. Now there are two things to examine here. How likely is this to happen to PyPI users, and can a backtracker (which btw is what my code is) handle this better than a SAT solver. In terms of likelihood - OpenStack hits this every release. It's not that our libraries are incompatible with each other, it's that given 250 packages (the 200 in error I quoted just shows that the resolver hadn't obtained version data for everything), *something* breaks API compat in each 6 month release cycle, and so you end up with the whole set effectively locking together. In fact, it has happened so consistently to OpenStack that we now release our libraries with closed specifiers: >=min_version, <first_incompatible_version. Say the user has release 3 of everything installed and asks for a newer J (and we have releases 1, 2, 3, 4 of A through J). Now, for a_version in A: is going to have the installed version of A in its first step. B likewise. So we'll end up with a trace like:
A==3 B==3 C==3 D==3 E==3 F==3 G==3 H==3 I==3
J==3 error, backtrack (user specifier)
J==4 error, backtracks once the external deps are considered and the conflict is found
J==2 error, backtrack (user specifier)
J==1 error, backtrack (user specifier)
I==4
J==3 error, backtrack (user specifier)
J==4 error, backtracks somewhere in the external deps
J==2, error, backtrack (user specifier)
and so on, until we finally tick over to A==4. More generally, any already installed version (without -U) can cause a backtracking resolver to try *all* possible other combinations before realising that that installed version is the problem and bumping it. A heuristic to look for those and bump them first then hits molasses as soon as one of the installed versions needs to be kept as-is. Anyhow, my goal here is to start the conversation; pip will need some knobs because no matter how good the heuristics users will need escape hatches. (One of which is to fully specify their needs). -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From chris.barker at noaa.gov Fri May 15 21:56:01 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Fri, 15 May 2015 12:56:01 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Fri, May 15, 2015 at 1:49 AM, Paul Moore wrote: > On 14 May 2015 at 19:01, Chris Barker wrote: > > Ah -- here is the issue -- but I think we HAVE pretty much got what we > need > > here -- at least for Windows and OS-X. It depends what you mean by > > "curated", but it seems we have a (defacto?)
policy for PyPi: binary > wheels > should be compatible with the python.org builds. So while each package > wheel > is supplied by the package maintainer one way or another, rather than by > a > central entity, it is more or less curated -- or at least standardized. > And > if you are going to put a binary wheel up, you need to make sure it > matches > -- and that is less than trivial for packages that require a third party > dependency -- but building the lib statically and then linking it in is > not > inherently easier than doing a dynamic link. > > I think the issue is that, if we have 5 different packages that depend > on (say) libpng, and we're using dynamic builds, then how do those > packages declare that they need access to libpng.dll? This is the missing link -- it is a binary build dependency, not a package dependency -- so not so much that matplotlib-1.4.3 depends on libpng.x.y, but that: matplotlib-1.4.3-cp27-none-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl depends on: libpng-x.y (all those binary parts will come from the platform) That's what's missing now. And on Windows, > where does the user put libpng.dll so that it gets picked up? Well, here is the rub -- Windows dll hell really is hell -- but I think it goes into the python dll search path (sorry, not on a Windows box where I can really check this out right now), it can work -- I now have an in-house product that has multiple python modules sharing a single dll somehow.... > And how > does a non-expert user do this ("put it in $DIRECTORY, update your > PATH, blah blah blah" doesn't work for the average user)? > That's why we may need to update the tooling to handle this -- I'm not totally sure if the current wheel format can support this on Windows -- though it can on OS-X. In particular, on Windows, note that the shared DLL must either be in > the directory where the executable is located (which is fun when you > have virtualenvs, embedded interpreters, etc), or on PATH (which has > other implications - suppose I have an incompatible version of > libpng.dll, from mingw, say, somewhere earlier on PATH). > that would be dll hell, yes..... > The problem isn't so much defining a standard ABI that shared DLLs > need - as you say, that's a more or less solved problem on Windows - > it's managing how those shared DLLs are made available to Python > extensions. And *that* is what Unix package managers do for you, and > Windows doesn't have a good solution for (other than "bundle all the > dependent DLLs with the app, or suffer DLL hell"). > exactly -- but if we consider the python install to be the "app", rather than an individual python bundle, then we _may_ be OK. PS For a fun exercise, it might be interesting to try breaking conda - > Windows really is simply broken [1] in this regard -- so I'm quite sure you could break conda -- but it does seem to do a pretty good job of not being broken easily by common uses -- I can't say I know enough about Windows dll finding or conda to know how... Oh, and conda is actually broken in this regard on OS-X at this point -- if you compile your own extension in an anaconda environment, it will find a shared lib at compile time that it won't find at run time -- the conda install process fixes these, but that's a pain when under development -- i.e. you don't want to have to actually install the package with conda to run a test each time you re-build the dll.. (or even change a bit of python code...)
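(Purely as an illustration of one possible approach -- this is a sketch of a workaround, not something conda or any existing package actually does, and the .libs directory and library name here are made-up conventions -- a package could preload its own copy of the shared lib from its __init__.py with ctypes, so that extension modules imported later in the same process can resolve against that single copy:

import ctypes
import os

_here = os.path.dirname(os.path.abspath(__file__))
# hypothetical location of a shared lib bundled inside the installed package
_libpng = os.path.join(_here, ".libs", "libpng16.dylib")

if os.path.exists(_libpng):
    # RTLD_GLOBAL makes the symbols visible to extension modules
    # that get imported later in this process
    ctypes.CDLL(_libpng, mode=ctypes.RTLD_GLOBAL)

Whether the dynamic loader actually prefers the preloaded copy over one found elsewhere differs between OS-X, Linux and Windows, so treat this as a sketch rather than a recipe.)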
But in short -- I'm pretty sure there is a way, on all systems, to have a standard way to build extension modules, combined with a standard way to install shared libs, so that a lib can be shared among multiple packages. So the question remains: Is there any point? or is the current approach of statically linking all third party libs the way to go? If so, then is there any chance of getting folks to conform to this standard for PyPi hosted binary packages anyway? i.e. the curation problem. Personally, I'm on the fence here -- I really want newbies to be able to simply "pip install" as many packages as possible and get a good result when they do it. On the other hand, I've found that conda better supports this right now, so it's easier for me to simply use that for my tools. -Chris [1] My take on dll hell: a) it's inherently difficult -- which is why Linux provides a system package manager. b) however, Windows really does make it MORE difficult than it has to be: i) it looks first next the executable ii) it also looks on the PATH (rather than a separate DLL_PATH) Combine these two, and you have some folks dropping dlls next to their executable, which means they have inadvertently dropped it on the DLL search path for other apps to find it. Add to this the (very odd to me) long standing tradition of not putting extensive version numbers in dll file names, and presto: dll hell! -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From jim at zope.com Fri May 15 22:27:03 2015 From: jim at zope.com (Jim Fulton) Date: Fri, 15 May 2015 16:27:03 -0400 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: On Fri, May 15, 2015 at 2:57 PM, Robert Collins wrote: > So, I am working on pip issue 988: pip doesn't resolve packages at all. > > This is O(packages^alternatives_per_package): if you are resolving 10 > packages with 10 versions each, there are approximately 10^10 or 10G > combinations. 10 packages with 100 versions each - 10^100. > > So - its going to depend pretty heavily on some good heuristics in > whatever final algorithm makes its way in, but the problem is > exacerbated by PyPI's nature. > > Most Linux (all that i'm aware of) distributions have at most 5 > versions of a package to consider at any time - installed(might be > None), current release, current release security updates, new release > being upgraded to, new release being upgraded to's security updates. > And their common worst case is actually 2 versions: installed==current > release and one new release present. They map alternatives out into > separate packages (e.g. when an older soname is deliberately kept > across an ABI incompatibility, you end up with 2 packages, not 2 > versions of one package). To when comparing pip's challenge to apt's: > apt has ~20-30K packages, with altnernatives ~= 2, or > pip has ~60K packages, with alternatives ~= 5.7 (I asked dstufft) > > Scaling the number of packages is relatively easy; scaling the number > of alternatives is harder. Even 300 packages (the dependency tree for > openstack) is ~2.4T combinations to probe. 
> > I wonder if it makes sense to give some back-pressure to people, or at > the very least encourage them to remove distributions that: > - they don't support anymore > - have security holes > > If folk consider PyPI a sort of historical archive then perhaps we > could have a feature to select 'supported' versions by the author, and > allow a query parameter to ask for all the versions. You could simply limit the number of versions from PyPI you consider. Jim -- Jim Fulton http://www.linkedin.com/in/jimfulton From robertc at robertcollins.net Fri May 15 22:28:19 2015 From: robertc at robertcollins.net (Robert Collins) Date: Sat, 16 May 2015 08:28:19 +1200 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: On 16 May 2015 at 08:27, Jim Fulton wrote: >> If folk consider PyPI a sort of historical archive then perhaps we >> could have a feature to select 'supported' versions by the author, and >> allow a query parameter to ask for all the versions. > > You could simply limit the number of versions from PyPI > you consider. Yes - it would be nice IMO to give package authors some influence over that though. -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From p.f.moore at gmail.com Fri May 15 22:44:06 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 15 May 2015 21:44:06 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 15 May 2015 at 20:56, Chris Barker wrote: > But in short -- I'm pretty sure there is a way, on all systems, to have a > standard way to build extension modules, combined with a standard way to > install shared libs, so that a lib can be shared among multiple packages. So > the question remains: > > Is there any point? or is the current approach of statically linking all > third party libs the way to go? If someone can make it work, that would be good. But (a) nobody is actually offering to develop and maintain such a solution, and (b) it's not particularly clear how *much* of a benefit there would be (space savings aren't that important, ease of upgrade is fine as long as everything can be upgraded at once, etc...) > If so, then is there any chance of getting folks to conform to this standard > for PyPi hosted binary packages anyway? i.e. the curation problem. If it exists, and if there's a benefit, people will use it. > Personally, I'm on the fence here -- I really want newbies to be able to > simply "pip install" as many packages as possible and get a good result when > they do it. Static linking gives that on Windows FWIW. (And maybe also on OSX?) This is a key point, though - the goal shouldn't be "use dynamic linking" but rather "make the user experience as easy as possible". It may even be that the best approach (dynamic or static) differs depending on platform. > On the other hand, I've found that conda better supports this right now, so > it's easier for me to simply use that for my tools. And that's an entirely reasonable position. The only "problem" (if indeed it is a problem) is that by having two different solutions (pip/wheel and conda) splits the developer resource, which means that neither approach moves forward as fast as a combined approach does. But that's OK if the two solutions are addressing different needs (which seems to be the case for the moment). 
Paul From jcappos at nyu.edu Fri May 15 22:46:37 2015 From: jcappos at nyu.edu (Justin Cappos) Date: Fri, 15 May 2015 16:46:37 -0400 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: > > Example: say I have an ecosystem of 10 packages. A-J. And they do a > release every 6 months that is guaranteed to work together, but every > time some issue occurs which ends up clamping the group together- e.g. > an external release breaks API and so A1s deps are disjoint with A2s, > and then the same between A2 and A3. Even though A1's API is > compatible with B2's: its not internal bad code, its just taking *one* > external dep breaking its API. > > After 2 releases you have 10^2 combinations, but only 4 are valid at > all. Thats 4%. 8 releases gets you 10^8, 8 valid combinations, or > 0.0000008%. Yes, so this would not be a situation where "conflicts do not exist (or are very rare)" as my post mentioned. Is this rate of conflicts something you measured or is it a value you made up? I don't hear anyone arguing that the status quo makes sense. I think we're mostly just chatting about the right thing to optimize the solution for and what sorts of short cuts may be useful (or even necessary). Since we can measure the actual conflict and other values in practice, data seems like it may be a good path toward grounding the discussion... Thanks, Justin -------------- next part -------------- An HTML attachment was scrubbed... URL: From robertc at robertcollins.net Fri May 15 23:06:17 2015 From: robertc at robertcollins.net (Robert Collins) Date: Sat, 16 May 2015 09:06:17 +1200 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: On 16 May 2015 at 08:46, Justin Cappos wrote: >> Example: say I have an ecosystem of 10 packages. A-J. And they do a >> release every 6 months that is guaranteed to work together, but every >> time some issue occurs which ends up clamping the group together- e.g. >> an external release breaks API and so A1s deps are disjoint with A2s, >> and then the same between A2 and A3. Even though A1's API is >> compatible with B2's: its not internal bad code, its just taking *one* >> external dep breaking its API. >> >> After 2 releases you have 10^2 combinations, but only 4 are valid at >> all. Thats 4%. 8 releases gets you 10^8, 8 valid combinations, or >> 0.0000008%. > > > Yes, so this would not be a situation where "conflicts do not exist (or are > very rare)" as my post mentioned. Is this rate of conflicts something you > measured or is it a value you made up? It's drawn from the concrete example of OpenStack, which has a single group of co-installable releases that cluster together every 6 months. I don't have the actual valid/invalid ratio there because I don't have enough machines to calculate it:). -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From qwcode at gmail.com Sat May 16 01:08:37 2015 From: qwcode at gmail.com (Marcus Smith) Date: Fri, 15 May 2015 16:08:37 -0700 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: Why not start with pip at least being a "simple" fail-on-conflict resolver (vs the "1st found wins" resolver it is now)... You'd "backtrack" for the sake of re-walking when new constraints are found, but not for the purpose of solving conflicts. I know you're motivated to solve Openstack build issues, but many of the issues I've seen in the pip tracker, I think would be solved without the backtracking resolver you're trying to build. 
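To make that distinction concrete, here's a toy sketch of "fail on conflict"
layered on top of first-found-wins: keep the first pin picked for each name,
keep accumulating every requirement met while walking the dependency tree, and
stop with an error the moment a pin violates one, instead of silently
installing it anyway. The index, the tuple versions and the requirement format
are all invented for illustration; none of this is pip's actual code:

# toy index: name -> {version: [requirements]}, requirement = (name, op, version)
INDEX = {
    "app":    {(1, 0): [("lib", "<=", (2, 0)), ("plugin", ">=", (1, 0))]},
    "plugin": {(1, 0): [("lib", ">=", (3, 0))]},
    "lib":    {(2, 0): [], (3, 0): []},
}

def satisfies(version, op, bound):
    return version <= bound if op == "<=" else version >= bound

def resolve(root):
    pins = {}
    seen = []                           # every requirement encountered while walking
    queue = [(root, ">=", (0, 0))]
    while queue:
        name, op, bound = queue.pop(0)
        seen.append((name, op, bound))
        if name not in pins:
            # "first requirement wins": pick the newest version satisfying
            # only the requirement that introduced the package
            pins[name] = max(v for v in INDEX[name] if satisfies(v, op, bound))
            queue.extend(INDEX[name][pins[name]])
        # fail-on-conflict: instead of ignoring later requirements, check every
        # pin against everything seen so far and stop on the first violation
        for rname, rop, rbound in seen:
            if rname in pins and not satisfies(pins[rname], rop, rbound):
                raise SystemExit("conflict on %s: pinned %r but %s %r is required"
                                 % (rname, pins[rname], rop, rbound))
    return pins

resolve("app")   # stops with: conflict on lib: pinned (2, 0) but >= (3, 0) is required

Today's behaviour would quietly leave lib pinned at 2.0 and plugin broken; the
only change here is turning that silent outcome into a hard error, with no
attempt to backtrack and search for a better combination.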
On Fri, May 15, 2015 at 11:57 AM, Robert Collins wrote: > So, I am working on pip issue 988: pip doesn't resolve packages at all. > > This is O(packages^alternatives_per_package): if you are resolving 10 > packages with 10 versions each, there are approximately 10^10 or 10G > combinations. 10 packages with 100 versions each - 10^100. > > So - its going to depend pretty heavily on some good heuristics in > whatever final algorithm makes its way in, but the problem is > exacerbated by PyPI's nature. > > Most Linux (all that i'm aware of) distributions have at most 5 > versions of a package to consider at any time - installed(might be > None), current release, current release security updates, new release > being upgraded to, new release being upgraded to's security updates. > And their common worst case is actually 2 versions: installed==current > release and one new release present. They map alternatives out into > separate packages (e.g. when an older soname is deliberately kept > across an ABI incompatibility, you end up with 2 packages, not 2 > versions of one package). To when comparing pip's challenge to apt's: > apt has ~20-30K packages, with altnernatives ~= 2, or > pip has ~60K packages, with alternatives ~= 5.7 (I asked dstufft) > > Scaling the number of packages is relatively easy; scaling the number > of alternatives is harder. Even 300 packages (the dependency tree for > openstack) is ~2.4T combinations to probe. > > I wonder if it makes sense to give some back-pressure to people, or at > the very least encourage them to remove distributions that: > - they don't support anymore > - have security holes > > If folk consider PyPI a sort of historical archive then perhaps we > could have a feature to select 'supported' versions by the author, and > allow a query parameter to ask for all the versions. > > -Rob > > -- > Robert Collins > Distinguished Technologist > HP Converged Cloud > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From robertc at robertcollins.net Sat May 16 03:22:02 2015 From: robertc at robertcollins.net (Robert Collins) Date: Sat, 16 May 2015 13:22:02 +1200 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: On 16 May 2015 at 11:08, Marcus Smith wrote: > Why not start with pip at least being a "simple" fail-on-conflict resolver > (vs the "1st found wins" resolver it is now)... > > You'd "backtrack" for the sake of re-walking when new constraints are found, > but not for the purpose of solving conflicts. > > I know you're motivated to solve Openstack build issues, but many of the > issues I've seen in the pip tracker, I think would be solved without the > backtracking resolver you're trying to build. Well, I'm scratching the itch I have. If its too hard to get something decent, sure I might back off in my goals, but I see no point aiming for something less than all the other language specific packaging systems out there have. 
-Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From donald at stufft.io Sat May 16 03:45:30 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 15 May 2015 21:45:30 -0400 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: Message-ID: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> > On May 15, 2015, at 9:22 PM, Robert Collins wrote: > > On 16 May 2015 at 11:08, Marcus Smith wrote: >> Why not start with pip at least being a "simple" fail-on-conflict resolver >> (vs the "1st found wins" resolver it is now)... >> >> You'd "backtrack" for the sake of re-walking when new constraints are found, >> but not for the purpose of solving conflicts. >> >> I know you're motivated to solve Openstack build issues, but many of the >> issues I've seen in the pip tracker, I think would be solved without the >> backtracking resolver you're trying to build. > > Well, I'm scratching the itch I have. If its too hard to get something > decent, sure I might back off in my goals, but I see no point aiming > for something less than all the other language specific packaging > systems out there have. So what makes the other language specific packaging systems different? As far as I know all of them have complete archives (e.g. they are like PyPI where they have a lot of versions, not like Linux Distros). What can we learn from how they solved this? --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From robertc at robertcollins.net Sat May 16 03:52:51 2015 From: robertc at robertcollins.net (Robert Collins) Date: Sat, 16 May 2015 13:52:51 +1200 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> References: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> Message-ID: On 16 May 2015 at 13:45, Donald Stufft wrote: > >> On May 15, 2015, at 9:22 PM, Robert Collins wrote: >> >> On 16 May 2015 at 11:08, Marcus Smith wrote: >>> Why not start with pip at least being a "simple" fail-on-conflict resolver >>> (vs the "1st found wins" resolver it is now)... >>> >>> You'd "backtrack" for the sake of re-walking when new constraints are found, >>> but not for the purpose of solving conflicts. >>> >>> I know you're motivated to solve Openstack build issues, but many of the >>> issues I've seen in the pip tracker, I think would be solved without the >>> backtracking resolver you're trying to build. >> >> Well, I'm scratching the itch I have. If its too hard to get something >> decent, sure I might back off in my goals, but I see no point aiming >> for something less than all the other language specific packaging >> systems out there have. > > > So what makes the other language specific packaging systems different? As far > as I know all of them have complete archives (e.g. they are like PyPI where they > have a lot of versions, not like Linux Distros). What can we learn from how they > solved this? NB; I have by no means finished low hanging heuristics and space trimming stuff :). I have some simple things in mind and am sure I'll end up with something 'good enough' for day to day use. The thing I'm worried about is the long term health of the approach. Good questions. Some of it is structural I suspect. A quick rundown. 
cabal (haskell) has a backtracking solver that accepts various parameters to tell it to try harder. javascript effectively vendors every dep ever, so you end up with many copies of the same library at different versions in the same process. rust's cargo system currently solves everything in a single project only - it has no binary packaging, only vendor-into-a-binary-build packaging. The gem behaviour I'm not yet familiar with. perl I used to know but time has eroded it :/. -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From cournape at gmail.com Sat May 16 07:00:21 2015 From: cournape at gmail.com (David Cournapeau) Date: Sat, 16 May 2015 14:00:21 +0900 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> Message-ID: On Sat, May 16, 2015 at 10:52 AM, Robert Collins wrote: > On 16 May 2015 at 13:45, Donald Stufft wrote: > > > >> On May 15, 2015, at 9:22 PM, Robert Collins > wrote: > >> > >> On 16 May 2015 at 11:08, Marcus Smith wrote: > >>> Why not start with pip at least being a "simple" fail-on-conflict > resolver > >>> (vs the "1st found wins" resolver it is now)... > >>> > >>> You'd "backtrack" for the sake of re-walking when new constraints are > found, > >>> but not for the purpose of solving conflicts. > >>> > >>> I know you're motivated to solve Openstack build issues, but many of > the > >>> issues I've seen in the pip tracker, I think would be solved without > the > >>> backtracking resolver you're trying to build. > >> > >> Well, I'm scratching the itch I have. If its too hard to get something > >> decent, sure I might back off in my goals, but I see no point aiming > >> for something less than all the other language specific packaging > >> systems out there have. > > > > > > So what makes the other language specific packaging systems different? > As far > > as I know all of them have complete archives (e.g. they are like PyPI > where they > > have a lot of versions, not like Linux Distros). What can we learn from > how they > > solved this? > > NB; I have by no means finished low hanging heuristics and space > trimming stuff :). I have some simple things in mind and am sure I'll > end up with something 'good enough' for day to day use. The thing I'm > worried about is the long term health of the approach. > > Good questions. Some of it is structural I suspect. A quick rundown. > cabal (haskell) has a backtracking solver that accepts various > parameters to tell it to try harder. > javascript effectively vendors every dep ever, so you end up with many > copies of the same library at different versions in the same process. > rust's cargo system currently solves everything in a single project > only - it has no binary packaging, only vendor-into-a-binary-build > packaging. > The gem behaviour I'm not yet familiar with. > perl I used to know but time has eroded it :/. > FWIW, php uses a SAT-based solver in composer, which started as a port of libsolv (the SAT solver used by openSUSE and soon Fedora). I am no expert, but I don't understand why backtracking algorithms would to be faster than SAT, since they both potentially need to walk over the full set of possible solutions. It is hard to reason about the cost because the worst case is in theory growing exponentially in both cases. With a SAT-based algorithm for dependency resolution, it is relatively simple to apply heuristics which massively prune the search space. 
For example, when considering package A with say 10 potential versions A_1, etc..., in theory, you need to generate the rules: # - means not install, + means install - A_1 | - A_2 - A_1 | - A_3 ... and those constitute most of the rules in common cases. But it is possible to tweak the SAT implementation to replace those rules by a single "AtMost one of" rule per *package*, which means the #rules do not grow much by versions. The real difficulty of SAT-based solver is the optimization part: many actually valid solutions are not acceptable, and that's where the heuristics get more complicated. David -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Sat May 16 07:45:33 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Fri, 15 May 2015 22:45:33 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Fri, May 15, 2015 at 1:44 PM, Paul Moore wrote: > > Is there any point? or is the current approach of statically linking all > > third party libs the way to go? > > If someone can make it work, that would be good. But (a) nobody is > actually offering to develop and maintain such a solution, well, it's on my list -- but it has been for a while, so I'm trying to gauge whether it's worth putting at the top of my "things to do for python" list. It's not at the top now ;-) > and (b) > it's not particularly clear how *much* of a benefit there would be > (space savings aren't that important, ease of upgrade is fine as long > as everything can be upgraded at once, etc...) > hmm -- that may be a trick, though not a uncommon one in python package dependencies -- it maybe hard to have more than one version of a given lib installed.... > If so, then is there any chance of getting folks to conform to this > standard > > for PyPi hosted binary packages anyway? i.e. the curation problem. > > If it exists, and if there's a benefit, people will use it. > OK -- that's encouraging... > > Personally, I'm on the fence here -- I really want newbies to be able to > > simply "pip install" as many packages as possible and get a good result > when > > they do it. > > Static linking gives that on Windows FWIW. (And maybe also on OSX?) > This is a key point, though - the goal shouldn't be "use dynamic > linking" but rather "make the user experience as easy as possible". It > may even be that the best approach (dynamic or static) differs > depending on platform. > true -- though we also have another problem -- that static linking solution is actually a big pain for package maintainers -- building and linking the dependencies the right way is a pain -- and now everyone that uses a given lib has to figure out how to do it. Giving folks a dynamic lib they can use would mie it easier for tehm to build their packages -- a nice benifit there. Though it's a lot harder to provide a build environment than just the lib to link too .. I"m going to have to think more about that... > > On the other hand, I've found that conda better supports this right now, > so > > it's easier for me to simply use that for my tools. > > And that's an entirely reasonable position. 
The only "problem" (if > indeed it is a problem) is that by having two different solutions > (pip/wheel and conda) splits the developer resource, which means that > neither approach moves forward as fast as a combined approach does. > That's not the only problem -- the current split between the (more than one) scientifc python distributions, and the community of folks using python.org and pypi creates a bit of a mess for newbies. I'm reviving this conversation because i just spent a class lecture in a python class on numpy/scipy -- these students have been using a python install for months, using virtualenv, ip installing whatever they need, et. and now, to use another lib, they have to go through machination, maybe even installing a entire additional python. This is not good. And I've had to help more than one student untangle a mess of Apple Python python.org python, homebrew, and/or Anaconda -- for someone that doesn't really get python pacakging, never mond PATHS, and .bashrc vs .bash_profile, etc, it's an unholy mess. "There should be one-- and preferably only one --obvious way to do it." -- HA! But that's OK if the two solutions are addressing different needs > The needs aren't really that different, however. Oh well. Anyway, it seems like if I can find some time to prototype what I have in mind, there may be some room to make it official if it works out. If anyone else want to help -- let me know! -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From cournape at gmail.com Sat May 16 08:35:38 2015 From: cournape at gmail.com (David Cournapeau) Date: Sat, 16 May 2015 15:35:38 +0900 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Sat, May 16, 2015 at 4:56 AM, Chris Barker wrote: > On Fri, May 15, 2015 at 1:49 AM, Paul Moore wrote: > >> On 14 May 2015 at 19:01, Chris Barker wrote: >> > Ah -- here is the issue -- but I think we HAVE pretty much got what we >> need >> > here -- at least for Windows and OS-X. It depends what you mean by >> > "curated", but it seems we have a (defacto?) policy for PyPi: binary >> wheels >> > should be compatible with the python.org builds. So while each package >> wheel >> > is supplied by the package maintainer one way or another, rather than >> by a >> > central entity, it is more or less curated -- or at least standardized. >> And >> > if you are going to put a binary wheel up, you need to make sure it >> matches >> > -- and that is less than trivial for packages that require a third party >> > dependency -- but building the lib statically and then linking it in is >> not >> > inherently easier than doing a dynamic link. >> >> I think the issue is that, if we have 5 different packages that depend >> on (say) libpng, and we're using dynamic builds, then how do those >> packages declare that they need access to libpng.dll? 
> > > this is the missing link -- it is a binary build dependency, not a package > dependency -- so not such much that matplotlib-1.4.3 depends on libpng.x.y, > but that: > > > > matplotlib-1.4.3-cp27-none-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl > > depends on: > > libpng-x.y > > (all those binary parts will come from the platform) > > That's what's missing now. > > And on Windows, >> where does the user put libpng.dll so that it gets picked up? > > > Well, here is the rub -- Windows dll hell really is hell -- but I think > it goes into the python dll searchpath (sorry, not on a > Windows box where I can really check this out right now), it can work -- I > know have an in-house product that has multiple python modules sharing a > single dll somehow.... > > > >> And how >> does a non-expert user do this ("put it in $DIRECTORY, update your >> PATH, blah blah blah" doesn't work for the average user)? >> > > That's why we may need to update the tooling to handle this -- I"m not > totally sure if the current wheel format can support this on Windows -- > though it can on OS-X. > > In particular, on Windows, note that the shared DLL must either be in >> the directory where the executable is located (which is fun when you >> have virtualenvs, embedded interpreters, etc), or on PATH (which has >> other implications - suppose I have an incompatible version of >> libpng.dll, from mingw, say, somewhere earlier on PATH). >> > > that would be dll hell, yes..... > > >> The problem isn't so much defining a standard ABI that shared DLLs >> need - as you say, that's a more or less solved problem on Windows - >> it's managing how those shared DLLs are made available to Python >> extensions. And *that* is what Unix package managers do for you, and >> Windows doesn't have a good solution for (other than "bundle all the >> dependent DLLs with the app, or suffer DLL hell"). >> > > exactly -- but if we consider the python install to be the "app", rather > than an individual python bundle, then we _may_ be OK. > > PS For a fun exercise, it might be interesting to try breaking conda - >> > > Windows really is simply broken [1] in this regard -- so I'm quite sure > you could break conda -- but it does seem to do a pretty good job of not > being broken easily by common uses -- I can't say I know enough about > Windows dll finding or conda to know how... > > Oh, and conda is actually broken in this regard on OS-X at this point -- > if you compile your own extension in an anaconda environment, it will find > a shared lib at compile time that it won't find at run time. -- the conda > install process fixes these, but that's a pain when under development -- > i.e. you don't want to have to actually install the package with conda to > run a test each time you re-build the dll.. (or even change a bit of python > code...) > > But in short -- I'm pretty sure there is a way, on all systems, to have a > standard way to build extension modules, combined with a standard way to > install shared libs, so that a lib can be shared among multiple packages. > So the question remains: > There is actually no way to do that on windows without modifying the interpreter somehow. This was somehow discussed a bit at PyCon when talking about windows packaging: 1. the simple way to share DLLs across extensions is to put them in the %PATH%, but that's horrible. 2. 
there are ways to put DLLs in a shared directory *not* in the %PATH% since at least windows XP SP2 and above, through the SetDllDirectory API. With 2., you still have the issue of DLL hell, which may be resolved through naming and activation contexts. I had a brief chat with Steve where he mentioned that this may be a solution, but he was not 100 % sure IIRC. The main drawback of this solution is that it won't work when inheriting virtual environments (as you can only set a single directory). FWIW, we are about to deploy 2. @ Enthought (where we control the python interpreter, so it is much easier for us). David > Is there any point? or is the current approach of statically linking all > third party libs the way to go? > > If so, then is there any chance of getting folks to conform to this > standard for PyPi hosted binary packages anyway? i.e. the curation problem. > > Personally, I'm on the fence here -- I really want newbies to be able to > simply "pip install" as many packages as possible and get a good result > when they do it. > > On the other hand, I've found that conda better supports this right now, > so it's easier for me to simply use that for my tools. > > > -Chris > > > [1] My take on dll hell: > > a) it's inherently difficult -- which is why Linux provides a system > package manager. > > b) however, Windows really does make it MORE difficult than it has to be: > i) it looks first next the executable > ii) it also looks on the PATH (rather than a separate DLL_PATH) > Combine these two, and you have some folks dropping dlls next to their > executable, which means they have inadvertently dropped it on the DLL > search path for other apps to find it. > > Add to this the (very odd to me) long standing tradition of not putting > extensive version numbers in dll file names, and presto: dll hell! > > > > > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R (206) 526-6959 voice > 7600 Sand Point Way NE (206) 526-6329 fax > Seattle, WA 98115 (206) 526-6317 main reception > > Chris.Barker at noaa.gov > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Sat May 16 11:47:26 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 16 May 2015 19:47:26 +1000 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: <4B6FBEC7-3A3B-4CFC-9452-090F9B9B70D3@stufft.io> References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> <4B6FBEC7-3A3B-4CFC-9452-090F9B9B70D3@stufft.io> Message-ID: On 16 May 2015 at 04:34, Donald Stufft wrote: > So I can?t speak for ReadTheDocs, but I believe that they are considering > and/or are planning on offering arbitrary HTML uploads similarly to how > you can upload documentation to PyPI. I don?t know if this will actually > happen and what it would look like but I know they are thinking about it. I've never tried it with ReadTheDocs, but in theory the ".. raw:: html" docutils directive allows arbitrary HTML content to be embedded in a reStructuredText page. Regardless, "Can ReadTheDocs do X?" questions are better asked on https://groups.google.com/forum/#!forum/read-the-docs, while both GitHub and Atlassian (via BitBucket) offer free static HTML hosting. 
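For what it's worth, that part is easy to check locally with docutils itself --
a quick sketch (whether ReadTheDocs/Sphinx leaves raw output enabled is a
separate question, so treat this as docutils-only):

from docutils.core import publish_string

source = """\
.. raw:: html

   <div style="color: red">arbitrary HTML passed straight through</div>
"""

# render the reST snippet with docutils' HTML writer
html = publish_string(source, writer_name="html")
print(b'<div style="color: red">' in html)  # True if the raw block survived into the output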
In relation to the original question, +1 for attempting to phase out PyPI's documentation hosting capability in favour of delegating to RTFD or third party static HTML hosting. One possible option to explore that minimises disruption for existing users might be to stop offering it to *new* projects, while allowing existing projects to continue uploading new versions of their documentation. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From wes.turner at gmail.com Sat May 16 12:33:18 2015 From: wes.turner at gmail.com (Wes Turner) Date: Sat, 16 May 2015 05:33:18 -0500 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> <4B6FBEC7-3A3B-4CFC-9452-090F9B9B70D3@stufft.io> Message-ID: On May 16, 2015 4:55 AM, "Nick Coghlan" wrote: > > On 16 May 2015 at 04:34, Donald Stufft wrote: > > So I can?t speak for ReadTheDocs, but I believe that they are considering > > and/or are planning on offering arbitrary HTML uploads similarly to how > > you can upload documentation to PyPI. I don?t know if this will actually > > happen and what it would look like but I know they are thinking about it. > > I've never tried it with ReadTheDocs, but in theory the ".. raw:: > html" docutils directive allows arbitrary HTML content to be embedded > in a reStructuredText page. > > Regardless, "Can ReadTheDocs do X?" questions are better asked on > https://groups.google.com/forum/#!forum/ read-the-docs , ReadTheDocs is hiring! https://blog.readthedocs.com/read-the-docs-is-hiring/ > while both > GitHub and Atlassian (via BitBucket) offer free static HTML hosting. I just wrote a tool (pypi:pgs) for serving files over HTTP directly from gh-pages branches that works in conjunction with pypi:ghp-import (> gh-pages; touch .nojekyll). CloudFront DNS can sort of be used to add TLS/SSL to custom domains with GitHub Pages (and probably BitBucket) > > In relation to the original question, +1 for attempting to phase out > PyPI's documentation hosting capability in favour of delegating to > RTFD or third party static HTML hosting. One possible option to > explore that minimises disruption for existing users might be to stop > offering it to *new* projects, while allowing existing projects to > continue uploading new versions of their documentation. - [ ] DOC: migration / alternatives guide > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Sat May 16 12:53:11 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Sat, 16 May 2015 11:53:11 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 16 May 2015 at 07:35, David Cournapeau wrote: >> But in short -- I'm pretty sure there is a way, on all systems, to have a >> standard way to build extension modules, combined with a standard way to >> install shared libs, so that a lib can be shared among multiple packages. So >> the question remains: > > There is actually no way to do that on windows without modifying the > interpreter somehow. 
This was somehow discussed a bit at PyCon when talking > about windows packaging: > > 1. the simple way to share DLLs across extensions is to put them in the > %PATH%, but that's horrible. > 2. there are ways to put DLLs in a shared directory *not* in the %PATH% > since at least windows XP SP2 and above, through the SetDllDirectory API. > > With 2., you still have the issue of DLL hell, which may be resolved through > naming and activation contexts. I had a brief chat with Steve where he > mentioned that this may be a solution, but he was not 100 % sure IIRC. The > main drawback of this solution is that it won't work when inheriting virtual > environments (as you can only set a single directory). > > FWIW, we are about to deploy 2. @ Enthought (where we control the python > interpreter, so it is much easier for us). This is indeed precisely the issue. In general, Python code can run with "the executable" being in many different places - there are the standard installs, virtualenvs, and embedding scenarios to consider. So "put DLLs alongside the executable", which is often how Windows applications deal with this issue, is not a valid option (that's an option David missed out above, but that's fine as it doesn't work :-)) Putting DLLs on %PATH% *does* cause problems, and pretty severe ones. People who use ports of Unix tools, such as myself, hit this a lot - at one point I got so frustrated with various incompatible versions of libintl showing up on my PATH, all with the same name, that I went on a spree of rebuilding all of the GNU tools without libintl support, just to avoid the issue (and older versions openssl were just as bad with libeay, etc). So, as David says, you pretty much have to use SetDllDirectory and similar features to get a viable location for shared DLLs. I guess it *may* be possible to call those APIs from a Python extension that you load *before* using any shared DLLs, but that seems like a very fragile solution. It's also possible for Python 3.6+ to add a new "shared DLLs" location for such things, which the core interpreter includes (either via SetDllDirectory or by the same mechanism that adds C:\PythonXY\DLLs to the search path at the moment). But that wouldn't help older versions. So while I encourage Chris' enthusiasm in looking for a solution to this issue, I'm not sure it's as easy as he's hoping. Paul From p.f.moore at gmail.com Sat May 16 13:13:05 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Sat, 16 May 2015 12:13:05 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 16 May 2015 at 06:45, Chris Barker wrote: >> > Personally, I'm on the fence here -- I really want newbies to be able to >> > simply "pip install" as many packages as possible and get a good result >> > when >> > they do it. >> >> Static linking gives that on Windows FWIW. (And maybe also on OSX?) >> This is a key point, though - the goal shouldn't be "use dynamic >> linking" but rather "make the user experience as easy as possible". It >> may even be that the best approach (dynamic or static) differs >> depending on platform. > > > true -- though we also have another problem -- that static linking solution > is actually a big pain for package maintainers -- building and linking the > dependencies the right way is a pain -- and now everyone that uses a given > lib has to figure out how to do it. 
Giving folks a dynamic lib they can use > would mie it easier for tehm to build their packages -- a nice benifit > there. > > Though it's a lot harder to provide a build environment than just the lib to > link too .. I"m going to have to think more about that... It seems to me that the end user doesn't really have a problem here ("pip install matplotlib" works fine for me using the existing wheel). It's the package maintainers (who have to build the binaries) that have the issue because everyone ends up doing the same work over and over, building dependencies. So rather than trying to address the hard problem of dynamic linking, maybe a simpler solution is to set up a PyPI-like hosting solution for static libraries of C dependencies? It could be as simple as a github project that contained a directory for each dependency, with scripts to build Python-compatible static libraries, and probably built .lib files for the supported architectures. With a setuptools build plugin you could even just specify your libraries in setup.py, and have the plugin download the lib files automatically at build time. People add libraries to the archive simply by posting pull requests. Maybe the project maintainer maintains the actual binaries by running the builds separately and publishing them separately, or maybe PRs include binaries - either way would work (although having the maintainer do it ensures a certain level of QA that the build process is reproducible). It could even include libraries that people need for embedding, rather than extensions (I recently needed a version of libxpm compatible with Python 3.5, for building a Python-enabled vim, for example). The msys2 projects provides something very similar to this at https://github.com/Alexpux/MINGW-packages which is a repository of build scripts for various packages. Paul PS The above is described as if it's single-platform, mostly because I only tend to think about these issues from a Windows POV, but it shouldn't be hard to extend it to multi-platform. From ncoghlan at gmail.com Sat May 16 14:02:46 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 16 May 2015 22:02:46 +1000 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> Message-ID: On 16 May 2015 at 11:52, Robert Collins wrote: > On 16 May 2015 at 13:45, Donald Stufft wrote: >> >>> On May 15, 2015, at 9:22 PM, Robert Collins wrote: >>> >>> On 16 May 2015 at 11:08, Marcus Smith wrote: >>>> Why not start with pip at least being a "simple" fail-on-conflict resolver >>>> (vs the "1st found wins" resolver it is now)... >>>> >>>> You'd "backtrack" for the sake of re-walking when new constraints are found, >>>> but not for the purpose of solving conflicts. >>>> >>>> I know you're motivated to solve Openstack build issues, but many of the >>>> issues I've seen in the pip tracker, I think would be solved without the >>>> backtracking resolver you're trying to build. >>> >>> Well, I'm scratching the itch I have. If its too hard to get something >>> decent, sure I might back off in my goals, but I see no point aiming >>> for something less than all the other language specific packaging >>> systems out there have. >> >> >> So what makes the other language specific packaging systems different? As far >> as I know all of them have complete archives (e.g. they are like PyPI where they >> have a lot of versions, not like Linux Distros). What can we learn from how they >> solved this? 
> > NB; I have by no means finished low hanging heuristics and space > trimming stuff :). I have some simple things in mind and am sure I'll > end up with something 'good enough' for day to day use. The thing I'm > worried about is the long term health of the approach. Longer term, I think it makes sense to have the notion of "active" and "obsolete" versions baked into PyPI's API and the web UI. This wouldn't be baked into the package metadata itself (unlike the proposed "Obsoleted-By" field for project renaming), but rather be a dynamic reflection of whether or not *new* users should be looking at the affected version, and whether or not it should be considered as a candidate for dependency resolution when not specifically requested. (This could also replace the current "hidden versions" feature, which only hides things from the web UI, without having any impact on the information published to automated tools through the programmatic API) Tools that list outdated packages could also be simplified a bit, as their first pass could just be to check the obsolescence markers on installed packages, with the second pass being to check for newer versions of those packages. While the bare minimum would be to let project mantainers set the obsolescence flag directly, we could also potentially offer projects some automated obsolescence schemes, such as: * single active released version, anything older is marked as obsolete whenever a new (non pre-release) version is uploaded * semantic versioning, with a given maximum number of active released X versions (e.g. 2), but only the most recent (according to PEP 440) released version with a given X.* is active, everything else is obsolete * CPython-style and date-based versioning, with a given maximum number of active released X.Y versions (e.g. 2), but only the most recent (according to PEP 440) released version with a given X.Y.* is active, everything else is obsolete Pre-release versions could also be automatically flagged as obsolete by PyPI as soon as a newer version for the same release (including the final release itself) was uploaded for the given package. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From seb at nmeos.net Sat May 16 02:22:21 2015 From: seb at nmeos.net (=?ISO-8859-1?Q?S=E9bastien=20Douche?=) Date: Sat, 16 May 2015 02:22:21 +0200 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Message-ID: <1431735741.3259499.270063017.16675E69@webmail.messagingengine.com> On Fri, 15 May 2015, at 15:48, Donald Stufft wrote: > Hey! Hi Donald > Ideally I hope people start to use ReadTheDocs instead of PyPI itself. +1. Do you want to use the python.org domain (ex. "pypi.python.org/docs") or keep RTD on it own domain? -- S?bastien Douche Twitter: @sdouche http://douche.name From jcappos at nyu.edu Sat May 16 16:36:06 2015 From: jcappos at nyu.edu (Justin Cappos) Date: Sat, 16 May 2015 10:36:06 -0400 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> Message-ID: > > I am no expert, but I don't understand why backtracking algorithms would > to be faster than SAT, since they both potentially need to walk over the > full set of possible solutions. It is hard to reason about the cost because > the worst case is in theory growing exponentially in both cases. 
> This is talked about a bit in this thread: https://github.com/pypa/pip/issues/988 Each algorithm could be computationally more efficient. Basically, *if there are no conflicts* backtracking will certainly win. If there are a huge number of conflicts a SAT solver will certainly win. It's not clear where the tipping point is between the two schemes. However, a better question is does the computational difference matter? If one is a microsecond faster than the other, I don't think anyone cares. However, from the OPIUM paper (listed off of that thread), it is clear that SAT solver resolution can be slow without optimizations to make them work more like backtracking resolvers. From my experience backtracking resolvers are also slow when the conflict rate is high. This only considers computation cost though. Other factors can become more expensive than computation. For example, SAT solvers need all the rules to consider. So a SAT solution needs to effectively download the full dependency graph before starting. A backtracking dependency resolver can just download packages or dependency information as it considers them. The bandwidth cost for SAT solvers should be higher. Thanks, Justin P.S. If you'd like to talk off list, possibly over Skype, I'd be happy to talk more with you and/or Robert about minutiae that others may not care about. -------------- next part -------------- An HTML attachment was scrubbed... URL: From cournape at gmail.com Sat May 16 17:22:06 2015 From: cournape at gmail.com (David Cournapeau) Date: Sun, 17 May 2015 00:22:06 +0900 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> Message-ID: On Sat, May 16, 2015 at 11:36 PM, Justin Cappos wrote: > I am no expert, but I don't understand why backtracking algorithms would >> to be faster than SAT, since they both potentially need to walk over the >> full set of possible solutions. It is hard to reason about the cost because >> the worst case is in theory growing exponentially in both cases. >> > > This is talked about a bit in this thread: > https://github.com/pypa/pip/issues/988 > > Each algorithm could be computationally more efficient. Basically, *if > there are no conflicts* backtracking will certainly win. If there are a > huge number of conflicts a SAT solver will certainly win. It's not clear > where the tipping point is between the two schemes. > > However, a better question is does the computational difference matter? > If one is a microsecond faster than the other, I don't think anyone cares. > However, from the OPIUM paper (listed off of that thread), it is clear that > SAT solver resolution can be slow without optimizations to make them work > more like backtracking resolvers. From my experience backtracking > resolvers are also slow when the conflict rate is high. > Pure SAT is fast enough in practice in my experience (concretely: solving thousand of rules takes < 1 sec). It becomes more complicated as you need to optimize the solution, especially when you have already installed packages. This is unfortunately not as well discussed in the literature. Pseudo-boolean SAT for optimization was argued to be too slow by the 0 install people, but OTOH, this seems to be what's used in conda, so who knows :) If you SAT solver is in pure python, you can choose a "direction" of the search which is more meaningful. I believe this is what 0install does from reading http://0install.net/solver.html, and what we have in our own SAT solver code. 
I unfortunately cannot look at the 0install code myself as it is under the GPL and am working on a BSD solver implementation. I also do not know how they handle updates and already installed packages. > This only considers computation cost though. Other factors can become > more expensive than computation. For example, SAT solvers need all the > rules to consider. So a SAT solution needs to effectively download the > full dependency graph before starting. A backtracking dependency resolver > can just download packages or dependency information as it considers them. > The bandwidth cost for SAT solvers should be higher. > With a reasonable representation, I think you can make it small enough. To give an idea, our index @ Enthought containing around 20k packages takes ~340 kb compressed w/ bz2 if you only keep the data required for dependency handling (name, version and runtime dependencies), and that's using json, an inefficient encoding, so I suspect encoding all of pypi may be a few MB only fetch, which is generally faster that doing tens of http requests. The libsvolv people worked on a binary representation that may also be worth looking at. > P.S. If you'd like to talk off list, possibly over Skype, I'd be happy to > talk more with you and/or Robert about minutiae that others may not care > about. > Sure, I would be happy too. As I mentioned before, we have some code around a SAT-based solver, but it is not ready yet, which is why we kept it private (https://github.com/enthought/sat-solver). It handles well (== both speed and quality-wise) the case where nothing is installed, but behaves poorly when packages are already installed, and does not handle the update case yet. The code is also very prototype-ish, but is not too complicated to experimente with it. David -------------- next part -------------- An HTML attachment was scrubbed... URL: From dholth at gmail.com Sat May 16 17:40:58 2015 From: dholth at gmail.com (Daniel Holth) Date: Sat, 16 May 2015 11:40:58 -0400 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> Message-ID: On May 16, 2015 11:22 AM, "David Cournapeau" wrote: > > > > On Sat, May 16, 2015 at 11:36 PM, Justin Cappos wrote: >>> >>> I am no expert, but I don't understand why backtracking algorithms would to be faster than SAT, since they both potentially need to walk over the full set of possible solutions. It is hard to reason about the cost because the worst case is in theory growing exponentially in both cases. >> >> >> This is talked about a bit in this thread: https://github.com/pypa/pip/issues/988 >> >> Each algorithm could be computationally more efficient. Basically, *if there are no conflicts* backtracking will certainly win. If there are a huge number of conflicts a SAT solver will certainly win. It's not clear where the tipping point is between the two schemes. >> >> However, a better question is does the computational difference matter? If one is a microsecond faster than the other, I don't think anyone cares. However, from the OPIUM paper (listed off of that thread), it is clear that SAT solver resolution can be slow without optimizations to make them work more like backtracking resolvers. From my experience backtracking resolvers are also slow when the conflict rate is high. > > > Pure SAT is fast enough in practice in my experience (concretely: solving thousand of rules takes < 1 sec). 
It becomes more complicated as you need to optimize the solution, especially when you have already installed packages. This is unfortunately not as well discussed in the literature. Pseudo-boolean SAT for optimization was argued to be too slow by the 0 install people, but OTOH, this seems to be what's used in conda, so who knows :) Where optimizing means something like "find a solution with the newest possible releases of the required packages", not execution speed. > If you SAT solver is in pure python, you can choose a "direction" of the search which is more meaningful. I believe this is what 0install does from reading http://0install.net/solver.html, and what we have in our own SAT solver code. I unfortunately cannot look at the 0install code myself as it is under the GPL and am working on a BSD solver implementation. I also do not know how they handle updates and already installed packages. > >> >> This only considers computation cost though. Other factors can become more expensive than computation. For example, SAT solvers need all the rules to consider. So a SAT solution needs to effectively download the full dependency graph before starting. A backtracking dependency resolver can just download packages or dependency information as it considers them. The bandwidth cost for SAT solvers should be higher. > > > With a reasonable representation, I think you can make it small enough. To give an idea, our index @ Enthought containing around 20k packages takes ~340 kb compressed w/ bz2 if you only keep the data required for dependency handling (name, version and runtime dependencies), and that's using json, an inefficient encoding, so I suspect encoding all of pypi may be a few MB only fetch, which is generally faster that doing tens of http requests. > > The libsvolv people worked on a binary representation that may also be worth looking at. > >> >> P.S. If you'd like to talk off list, possibly over Skype, I'd be happy to talk more with you and/or Robert about minutiae that others may not care about. > > > Sure, I would be happy too. As I mentioned before, we have some code around a SAT-based solver, but it is not ready yet, which is why we kept it private (https://github.com/enthought/sat-solver). It handles well (== both speed and quality-wise) the case where nothing is installed, but behaves poorly when packages are already installed, and does not handle the update case yet. The code is also very prototype-ish, but is not too complicated to experimente with it. > > David > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From cournape at gmail.com Sat May 16 17:48:55 2015 From: cournape at gmail.com (David Cournapeau) Date: Sun, 17 May 2015 00:48:55 +0900 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> Message-ID: On Sun, May 17, 2015 at 12:40 AM, Daniel Holth wrote: > > On May 16, 2015 11:22 AM, "David Cournapeau" wrote: > > > > > > > > On Sat, May 16, 2015 at 11:36 PM, Justin Cappos wrote: > >>> > >>> I am no expert, but I don't understand why backtracking algorithms > would to be faster than SAT, since they both potentially need to walk over > the full set of possible solutions. It is hard to reason about the cost > because the worst case is in theory growing exponentially in both cases. 
> >> > >> > >> This is talked about a bit in this thread: > https://github.com/pypa/pip/issues/988 > >> > >> Each algorithm could be computationally more efficient. Basically, *if > there are no conflicts* backtracking will certainly win. If there are a > huge number of conflicts a SAT solver will certainly win. It's not clear > where the tipping point is between the two schemes. > >> > >> However, a better question is does the computational difference > matter? If one is a microsecond faster than the other, I don't think > anyone cares. However, from the OPIUM paper (listed off of that thread), > it is clear that SAT solver resolution can be slow without optimizations to > make them work more like backtracking resolvers. From my experience > backtracking resolvers are also slow when the conflict rate is high. > > > > > > Pure SAT is fast enough in practice in my experience (concretely: > solving thousand of rules takes < 1 sec). It becomes more complicated as > you need to optimize the solution, especially when you have already > installed packages. This is unfortunately not as well discussed in the > literature. Pseudo-boolean SAT for optimization was argued to be too slow > by the 0 install people, but OTOH, this seems to be what's used in conda, > so who knows :) > > Where optimizing means something like "find a solution with the newest > possible releases of the required packages", not execution speed. > Indeed, it was not obvious in this context :) Though in theory, optimization is more general. It could be optimizing w.r.t. a cost function taking into account #packages, download size, minimal number of changes, etc... This is where you want a pseudo-boolean SAT, which is what conda uses I think. 0install, composer and I believe libsolv took a different route, and use heuristics to find a reasonably good solution by picking the next candidate. This requires access to the internals of the SAT solver though (not a problem if you have a python implementation). David > > If you SAT solver is in pure python, you can choose a "direction" of the > search which is more meaningful. I believe this is what 0install does from > reading http://0install.net/solver.html, and what we have in our own SAT > solver code. I unfortunately cannot look at the 0install code myself as it > is under the GPL and am working on a BSD solver implementation. I also do > not know how they handle updates and already installed packages. > > > >> > >> This only considers computation cost though. Other factors can become > more expensive than computation. For example, SAT solvers need all the > rules to consider. So a SAT solution needs to effectively download the > full dependency graph before starting. A backtracking dependency resolver > can just download packages or dependency information as it considers them. > The bandwidth cost for SAT solvers should be higher. > > > > > > With a reasonable representation, I think you can make it small enough. > To give an idea, our index @ Enthought containing around 20k packages takes > ~340 kb compressed w/ bz2 if you only keep the data required for dependency > handling (name, version and runtime dependencies), and that's using json, > an inefficient encoding, so I suspect encoding all of pypi may be a few MB > only fetch, which is generally faster that doing tens of http requests. > > > > The libsvolv people worked on a binary representation that may also be > worth looking at. > > > >> > >> P.S. 
If you'd like to talk off list, possibly over Skype, I'd be happy > to talk more with you and/or Robert about minutiae that others may not care > about. > > > > > > Sure, I would be happy too. As I mentioned before, we have some code > around a SAT-based solver, but it is not ready yet, which is why we kept it > private (https://github.com/enthought/sat-solver). It handles well (== > both speed and quality-wise) the case where nothing is installed, but > behaves poorly when packages are already installed, and does not handle the > update case yet. The code is also very prototype-ish, but is not too > complicated to experimente with it. > > > > David > > > > > > _______________________________________________ > > Distutils-SIG maillist - Distutils-SIG at python.org > > https://mail.python.org/mailman/listinfo/distutils-sig > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Sat May 16 19:12:51 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 17 May 2015 03:12:51 +1000 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 15 May 2015 at 04:01, Chris Barker wrote: >> >> I'm confused -- you don't want a system to be able to install ONE >> >> version >> >> of a lib that various python packages can all link to? That's really >> >> the >> >> key use-case for me.... > > >> >> Are we talking about Python libraries accessed via Python APIs, or >> linking to external dependencies not written in Python (including >> linking directly to C libraries shipped with a Python library)? > > > I, at least, am talking about the latter. for a concrete example: libpng, > for instance, might be needed by PIL, wxPython, Matplotlib, and who knows > what else. At this point, if you want to build a package of any of these, > you need to statically link it into each of them, or distribute shared libs > with each package -- if you ware using them all together (which I do, > anyway) you now have three copies of the same lib (but maybe different > versions) all linked into your executable. Maybe there is no downside to > that (I haven't had a problem yet), but it seems like a bad way to do it! > >> It's the latter I consider to be out of scope for a language specific >> packaging system > > > Maybe, but it's a problem to be solved, and the Linux distros more or less > solve it for us, but OS-X and Windows have no such system built in (OS-X > does have Brew and macports....) Windows 10 has Chocalatey and OneGet: * https://chocolatey.org/ * http://blogs.msdn.com/b/garretts/archive/2015/01/27/oneget-and-the-windows-10-preview.aspx conda and nix then fill the niche for language independent packaging at the user level rather than the system level. >> - Python packaging dependencies are designed to >> describe inter-component dependencies based on the Python import >> system, not dependencies based on the operating system provided C/C++ >> dynamic linking system. > > I think there is a bit of fuzz here -- cPython, at least, uses the "the > operating system provided C/C++ > dynamic linking system" -- it's not a totally independent thing. I'm specifically referring to the *declaration* of dependencies here. 
While CPython itself will use the dynamic linker to load extension modules found via the import system, the loading of further dynamically linked modules beyond that point is entirely opaque not only to the interpreter runtime at module import time, but also to pip at installation time. >> If folks are after the latter, than they want >> a language independent package system, like conda, nix, or the system >> package manager in a Linux distribution. > > And I am, indeed, focusing on conda lately for this reason -- but not all my > users want to use a whole new system, they just want to "pip install" and > have it work. And if you are using something like conda you don't need pip > or wheels anyway! Correct, just as if you're relying solely on Linux system packages, you don't need pip or wheels. Aside from the fact that conda is cross-platform, the main difference between the conda community and a Linux distro is in the *kind* of software we're likely to have already done the integration work for. The key to understanding the difference in the respective roles of pip and conda is realising that there are *two* basic distribution scenarios that we want to be able to cover (I go into this in more detail in https://www.python.org/dev/peps/pep-0426/#development-distribution-and-deployment-of-python-software): * software developer/publisher -> software integrator/service operator (or data analyst) * software developer/publisher -> software integrator -> service operator (or data analyst) Note the second line has 3 groups and 2 distribution arrows, while the first line only has the 2 groups and a single distribution step. pip and the other Python specific tools cover that initial developer/publisher -> integrator link for Python projects. This means that Python developers only need to learn a single publishing toolchain (the PyPA tooling) to get started, and they'll be able to publish their software in a format that any integrator that supports Python can consume (whether that's for direct consumption in a DIY integration scenario, or to put through a redistributor's integration processes). On the consumption side, though, the nature of the PyPA tooling as a platform-independent software publication toolchain means that if you want to consume the PyPA formats directly, you need to be prepared to do your own integration work. Many public web service developers are entirely happy with that deal, but most system administrators and data analysts trying to deal with components written in multiple programming languages aren't. That latter link, where the person or organisation handling the software integration task is distinct from the person or organisation running an operational service, or carrying out some data analysis, are where the language independent redistributor tools like Chocolatey, Nix, deb, rpm, conda, Docker, etc all come in - they let a redistributor handle the integration task (or at least some of it) on behalf of their users, leaving those users free to spend more of their time on problems that are unique to them, rather than having to duplicate the redistributor's integration work on their own time. If you look at those pipelines from the service operator/data analyst end, then the *first* question to ask is "Is there a software integrator that targets the audience I am a member of?". If there is, then you're likely to have a better experience reusing their work, rather than spending time going on a DIY integration adventure. 
In those cases, the fact that the tooling you're using to consume software differs from that the original developers used to publish it *should* be a hidden implementation detail. When it isn't, it's either a sign that those of us in the "software integrator" role aren't meeting the needs of our audience adequately, or else it's a sign that that particular user made the wrong call in opting out of tackling the "DIY integration" task. >> I'm arguing against supporting direct C level dependencies between >> packages that rely on dynamic linking to find each other rather than >> going through the Python import system, > > Maybe there is a mid ground. For instance, I have a complex wrapper system > around a bunch of C++ code. There are maybe 6 or 7 modules that all need to > link against that C++ code. On OS-X (and I think Linux, I haven't been doing > those builds), we can statically link all the C++ into one python module -- > then, as long as that python module is imported before the others, they will > all work, and all use that same already loaded version of that library. > > (this doesn't work so nicely on Windows, unfortunately, so there, we build a > dll, and have all the extensions link to it, then put the dll somewhere it > gets found -- a little fuzzy on those details) > > So option (1) for something like libpng is to have a compiled python module > that is little but a something that can be linked to ibpng, so that it can > be found and loaded by cPython on import, and any other modules can then > expect it to be there. This is a big old kludge, but I think could be done > with little change to anything in Python or wheel, or...but it would require > changes to how each package that use that lib sets itself up and checks for > and install dependencies -- maybe not really possible. and it would be > better if dependencies could be platform independent, which I'm not sure is > supported now. > > option (2) would be to extend python's import mechanism a bit to allow it to > do a raw "link in this arbitrary lib" action, so the lib would not have to > be wrapped in a python module -- I don't know how possible that is, or if it > would be worth it. Your option 2 is specifically the kind of thing I don't want to support, as it's incredibly hard to do right (to the tune of "people will pay you millions of dollars a year to reduce-or-eliminate their ABI compatibility concerns"), and has the potential to replace the current you-need-to-be-able-build-this-from-source-yourself issue with "oh, look, now you have a runtime ABI incompatibility, have fun debugging that one, buddy". Your option 1 seems somewhat more plausible, as I believe it should theoretically be possible to use the PyCObject/PyCapsule API (or even just normal Python objects) to pass the relevant shared library details from a "master" module that determines which versions of external libraries to link against, to other modules that always want to load them, in a way that ensures everything is linking against a version that it is ABI compatible with. That would require someone to actually work on the necessary tooling to help with that though, as you wouldn't be able to rely on the implicit dynamic linking provided by C/C++ toolchains any more. Probably the best positioned to tackle that idea would be the Cython community, since they could generate all the required cross-platform boilerplate code automatically. 
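To make that concrete, here is a rough sketch of the "normal Python objects" variant. Everything in it is hypothetical -- the pylibpng package name, the bundled file name, the single-platform layout -- and a real version would also have to deal with per-platform file names and ABI tags:

# pylibpng/__init__.py -- hypothetical "master" module shipped as its own wheel
import ctypes
import os

_here = os.path.dirname(os.path.abspath(__file__))
# the wheel bundles the shared library alongside this module; the file name
# would really be selected per platform (.so / .dylib / .dll)
_LIB_PATH = os.path.join(_here, "libpng16.so.16")

# Loading the library here (with RTLD_GLOBAL on POSIX) keeps one copy resident
# for the lifetime of the process, so extension modules built against the same
# library name should resolve to this copy instead of loading their own.
_HANDLE = ctypes.CDLL(_LIB_PATH, mode=ctypes.RTLD_GLOBAL)

def library_path():
    """Absolute path of the vendored library, for consumers that need it."""
    return _LIB_PATH

A consuming package would then declare an ordinary dependency on the pylibpng wheel and "import pylibpng" before importing its own extension module; on Windows the mode flag is ignored by ctypes, so there you would instead be relying on the loader reusing an already-loaded DLL of the same name.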
>> (Another way of looking at this: if a tool can manage the >> Python runtime in addition to Python modules, it's a full-blown >> arbitrary software distribution platform, not just a Python package >> manager). > > sure, but if it's ALSO a Python package manger, then why not? i.e. conda -- > if we all used conda, we wouldn't need pip+wheel. conda's not a Python package manager, it's a language independent package manager that was born out of the Scientific Python community and includes Python as one of its supported languages, just like nix, deb, rpm, etc. That makes it an interesting alternative to pip on the package *consumption* side for data analysts, but it isn't currently a good fit for any of pip's other use cases (e.g. one of the scenarios I'm personally most interested in is that pip is now part of the Fedora/RHEL/CentOS build pipeline for Python based RPM packages - we universally recommend using "pip install" in the %install phase over using "setup.py install" directly) >> Defining cross-platform ABIs (cf. http://bugs.python.org/issue23966) > > This is a mess that you need to deal with for ANY binary package -- that's > why we don't distribute binary wheels on pypi for Linux, yes? Yes, the reason we don't do *nix packages on any platform other than Mac OS X is because the platform defines the CPython ABI along with everything else. It's a fair bit more manageable when we're just dealing with extension modules on Windows and Mac OS X, as we can anchor the ABI on the CPython interpreter ABI. >> I'm >> firmly of the opinion that trying to solve both sets of problems with >> a single tool will produce a result that doesn't work as well for >> *either* use case as separate tools can. > > I'm going to point to conda again -- it solves both problems, and it's > better to use it for all your packages than mingling it with pip (though you > CAN mingle it with pip...). So if we say "pip and friends are not going to > do that", then we are saying: we don't support a substantial class of > packages, and then I wonder what the point is to supporting binary packages > at all? Binary wheels already work for Python packages that have been developed with cross-platform maintainability and deployability taken into account as key design considerations (including pure Python wheels, where the binary format just serves as an installation accelerator). That category just happens to exclude almost all research and data analysis software, because it excludes the libraries at the bottom of that stack (not worry to much about deployability concerns bought the Scientific Python stack a lot of functionality, but it *did* come at a price). It's also the case that when you *are* doing your own system integration, wheels are a powerful tool for caching builds, since you can deal with ABI compatibility concerns through out of band mechanisms, such as standardising your build platform and your deployment platform on a single OS. If you both build and deploy on CentOS 6, then it doesn't matter that your wheel files may not work on CentOS 7, or Ubuntu, or Debian, or Cygwin, because you're not deploying them there, and if you switched platforms, you'd just redo your builds. >> P.S. The ABI definition problem is at least somewhat manageable for >> Windows and Mac OS X desktop/laptop environments > > Ah -- here is a key point -- because of that, we DO support binary packages > on PyPi -- but only for Windows and OS-X.. 
I'm just suggesting we find a way > to extend that to pacakges that require a non-system non-python dependency. At the point you're managing arbitrary external binary dependencies, you've lost all the constraints that let us get away with doing this for extension modules without adequate metadata, and are back to trying to solve the same arbitrary ABI problem that exists on Linux. This is multi-billion-dollar-operating-system-companies-struggle-to-get-this-right levels of difficulty that we're talking about here :) >> but beyond >> those two, things get very messy, very fast - identifying CPU >> architectures, CPU operating modes and kernel syscall interfaces >> correctly is still a hard problem in the Linux distribution space > > right -- but where I am confused is where the line is drawn -- it seem sto > be the line is REALLY drawn at "yuo need to compile some C (or Fortran, > or???) code, rather than at "you depend on another lib" -- the C code, > whether it is a third party lib, or part of your extension, still needs to > be compiled to match the host platform. The line is drawn at ABI compatibility management. We're able to fuzz that line a little bit in the case of Windows and Mac OS X extension modules because we have the python.org CPython releases to act as an anchor for the ABI definition. We don't have that at all on other *nix platforms, and we don't have it on Windows and Mac OS X either once we move beyond the CPython C ABI (which encompasses the underlying platform ABI) We *might* be able to get to the point of being able to describe platform ABIs well enough to allow public wheels for arbitrary platforms, but we haven't had any plausible sounding designs put forward for that as yet, and it still wouldn't allow depending on arbitrary external binaries (only the versions integrated with a given platform). >> but the rise of aarch64 and IBM's >> creation of the OpenPOWER Foundation is making the data centre space >> interesting again, while in the mobile and embedded spaces it's ARM >> that is the default, with x86_64 attempting to make inroads. > > Are those the targets for binary wheels? I don't think so. Yes, they'll likely end up being one of Fedora's targets for prebuilt wheel files: https://fedoraproject.org/wiki/Env_and_Stacks/Projects/UserLevelPackageManagement >> This is why >> >> "statically link all the things" keeps coming back in various guises > > but if you statically link, you need to build the static package right > anyway -- so it doesn't actually solve the problem at hand anyway. Yes it does - you just need to make sure your build environment suitably matches your target deployment environment. "Publishing on PyPI" is only one of the use cases for wheel files, and it isn't relevant to any of my own personal use cases (which all involve a PyPI independent build system, with PyPI used solely as a source of sdist archives). >> The only solution that is known to work reliably for dynamic linking >> is to have a curated set of packages all built by the same build >> system, so you know they're using consistent build settings. Linux >> distributions provide this, as do multi-OS platforms like nix and >> conda. We *might* be able to provide it for Python someday if PyPI >> ever gets an integrated build farm, but that's still a big "if" at >> this point. > > Ah -- here is the issue -- but I think we HAVE pretty much got what we need > here -- at least for Windows and OS-X. It depends what you mean by > "curated", but it seems we have a (defacto?) 
policy for PyPi: binary wheels > should be compatible with the python.org builds. So while each package wheel > is supplied by the package maintainer one way or another, rather than by a > central entity, it is more or less curated -- or at least standardized. And > if you are going to put a binary wheel up, you need to make sure it matches > -- and that is less than trivial for packages that require a third party > dependency -- but building the lib statically and then linking it in is not > inherently easier than doing a dynamic link. > > OK -- I just remembered the missing link for doing what I proposed above for > third party dynamic libs: at this point dependencies are tied to a > particular package -- whereas my plan above would require a dependency ties > to particular wheel, not the package as a whole. i.e: > > my mythical matplotlib wheel on OS-X would depend on a py_libpng module -- > which could be provided as separate binary wheel. but matplotlib in general > would not have that dependency -- for instance, on Linux, folks would want > it to build against the system lib, and not have another dependency. Even on > OS-X, homebrew users would want it to build against the homebrew lib, etc... > > So what would be good is a way to specify a "this build" dependency. That > can be hacked in, of course, but nicer not to have to. By the time you've solved all these problems I believe you'll find you have reinvented conda ;) Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Sat May 16 19:24:47 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 17 May 2015 03:24:47 +1000 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> Message-ID: On 17 May 2015 at 00:36, Justin Cappos wrote: > This only considers computation cost though. Other factors can become more > expensive than computation. For example, SAT solvers need all the rules to > consider. So a SAT solution needs to effectively download the full > dependency graph before starting. A backtracking dependency resolver can > just download packages or dependency information as it considers them. This is the defining consideration for pip at this point: a SAT solver requires publication of static dependency metadata on PyPI, which is dependent on both the Warehouse migration *and* the completion and acceptance of PEP 426. Propagation out to PyPI caching proxies and mirrors like devpi and the pulp-python plugin will then take even longer. A backtracking resolver doesn't have those gating dependencies, as it can tolerate the current dynamic metadata model. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From donald at stufft.io Sat May 16 19:28:26 2015 From: donald at stufft.io (Donald Stufft) Date: Sat, 16 May 2015 13:28:26 -0400 Subject: [Distutils] PyPI is a sick sick hoarder In-Reply-To: References: <9DE50962-02D3-4B22-BC04-1C35FFA8BA2F@stufft.io> Message-ID: <8EAD2B10-7E93-4275-9B9B-D9E3D174D75A@stufft.io> > On May 16, 2015, at 1:24 PM, Nick Coghlan wrote: > > On 17 May 2015 at 00:36, Justin Cappos wrote: >> This only considers computation cost though. Other factors can become more >> expensive than computation. For example, SAT solvers need all the rules to >> consider. So a SAT solution needs to effectively download the full >> dependency graph before starting. A backtracking dependency resolver can >> just download packages or dependency information as it considers them. 
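As an aside, the "download dependency information as it considers them" style is easy to picture. In the sketch below, which uses invented names and data structures, fetch_metadata() stands in for one index request per project, made only when the resolver first reaches that project:

def resolve(requirements, pinned, fetch_metadata):
    # requirements: list of (name, predicate) pairs still to satisfy
    # pinned: {name: version} chosen so far
    # fetch_metadata(name): one network request, returning [(version, deps), ...]
    #                       newest first, where deps is again a list of pairs
    if not requirements:
        return pinned
    (name, ok), rest = requirements[0], requirements[1:]
    if name in pinned:
        return resolve(rest, pinned, fetch_metadata) if ok(pinned[name]) else None
    for version, deps in fetch_metadata(name):    # fetched lazily, on first visit
        if not ok(version):
            continue
        found = resolve(rest + deps, dict(pinned, **{name: version}), fetch_metadata)
        if found is not None:
            return found
    return None                                   # no version worked: backtrack

index = {"A": [("1.0", [("B", lambda v: True)])],
         "B": [("2.0", []), ("1.0", [])]}
print(resolve([("A", lambda v: True)], {}, index.get))
# -> {'A': '1.0', 'B': '2.0'}

A SAT solver, by contrast, wants all of those rules in hand before it starts.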
> > This is the defining consideration for pip at this point: a SAT solver > requires publication of static dependency metadata on PyPI, which is > dependent on both the Warehouse migration *and* the completion and > acceptance of PEP 426. Propagation out to PyPI caching proxies and > mirrors like devpi and the pulp-python plugin will then take even > longer. > > A backtracking resolver doesn't have those gating dependencies, as it > can tolerate the current dynamic metadata model. > Even when we have Warehouse and PEP 426, that only gives us that data going forward, the 400k files that currently exist on PyPI still won?t have static metadata. We could parse it out for Wheels but not for anything else. For the foreseeable future any solution will need to be able to handle iteratively finding constraints. Though I think a SAT solver can do it if it can handle incremental solving or just by re-doing the SAT problem each time we discover a new constraint. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From chris.barker at noaa.gov Sat May 16 20:40:11 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Sat, 16 May 2015 11:40:11 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Fri, May 15, 2015 at 11:35 PM, David Cournapeau wrote: > On Sat, May 16, 2015 at 4:56 AM, Chris Barker > wrote: > >> >> But in short -- I'm pretty sure there is a way, on all systems, to have a >> standard way to build extension modules, combined with a standard way to >> install shared libs, so that a lib can be shared among multiple packages. >> So the question remains: >> > > There is actually no way to do that on windows without modifying the > interpreter somehow. > Darn. > This was somehow discussed a bit at PyCon when talking about windows > packaging: > > 1. the simple way to share DLLs across extensions is to put them in the > %PATH%, but that's horrible. > yes -- that has to be off the table, period. > 2. there are ways to put DLLs in a shared directory *not* in the %PATH% > since at least windows XP SP2 and above, through the SetDllDirectory API. > > With 2., you still have the issue of DLL hell, > could you clarify a bit -- I thought that this could, at least, put a dir on the search path that was specific to that python context. So it would require cooperation among all the packages being used at once, but not get tangled up with the rest of the system. but maybe I'm wrong here -- I have no idea what the heck I'm doing with this! which may be resolved through naming and activation contexts. > I guess that's what I mean by the above.. > I had a brief chat with Steve where he mentioned that this may be a > solution, but he was not 100 % sure IIRC. The main drawback of this > solution is that it won't work when inheriting virtual environments (as you > can only set a single directory). > no relative paths here? or path that can be set at run time? or maybe I"m missing what "inheriting virtual environments" means... > FWIW, we are about to deploy 2. @ Enthought (where we control the python > interpreter, so it is much easier for > us). > It'll be great to see how that works out, then. 
I take that this means that for Canopy, you've decided that statically linking everything is NOT The way to go. Which is a good data point to have. Thanks for the update. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Sat May 16 20:54:52 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Sat, 16 May 2015 19:54:52 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 16 May 2015 at 19:40, Chris Barker wrote: >> With 2., you still have the issue of DLL hell, > > could you clarify a bit -- I thought that this could, at least, put a dir on > the search path that was specific to that python context. So it would > require cooperation among all the packages being used at once, but not get > tangled up with the rest of the system. but maybe I'm wrong here -- I have > no idea what the heck I'm doing with this! Suppose Python adds C:\PythonXY\SharedDLLs to %PATH%. Suppose there's a libpng.dll in there, for matplotlib. Everything works fine. Then I install another non-Python application that uses libpng.dll, and does so by putting libpng.dll alongside the executable (a common way of making DLLs available with Windows applications). Also assume that the application installer adds the application directory to the *start* of PATH. Now, Python extensions will use this 3rd party application's DLL rather than the correct one. If it's ABI-incompatible, the Python extension will crash. If it's ABI compatible, but behaves differently (it could be a different version) there could be inconsistencies or failures. The problem is that while Python can add a DLL directory to PATH, it cannot control what *else* is on PATH, or what has priority. Paul From chris.barker at noaa.gov Sat May 16 20:58:22 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Sat, 16 May 2015 11:58:22 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Sat, May 16, 2015 at 4:13 AM, Paul Moore wrote: > > Though it's a lot harder to provide a build environment than just the > lib to > > link too .. I"m going to have to think more about that... > > It seems to me that the end user doesn't really have a problem here > ("pip install matplotlib" works fine for me using the existing wheel). > Sure -- but that's because Matthew Brett has done a lot of work to make that happen. > It's the package maintainers (who have to build the binaries) that > have the issue because everyone ends up doing the same work over and > over, building dependencies. Exactly -- It would be nice if the ecosystem made that easier. > So rather than trying to address the hard > problem of dynamic linking, maybe a simpler solution is to set up a > PyPI-like hosting solution for static libraries of C dependencies? > > It could be as simple as a github project that contained a directory > for each dependency, I started that here: https://github.com/PythonCHB/mac-builds but haven't kept it up. 
And Matthew Brett has done most of the work here: https://github.com/MacPython not sure how he's sharing the static libs -- but it could be done. With a setuptools build plugin you could even just > specify your libraries in setup.py, and have the plugin download the > lib files automatically at build time. actually, that's a pretty cool idea! you'd need place to host them -- gitHbu is no longer hosting "downloads" are they? though you could probably use github-pages.. (or somethign else) > People add libraries to the > archive simply by posting pull requests. Maybe the project maintainer > maintains the actual binaries by running the builds separately and > publishing them separately, or maybe PRs include binaries or you use a CI system to build them. Something like this is being done by a bunch of folks for conda/binstar: https://github.com/ioos/conda-recipes is just one example. PS The above is described as if it's single-platform, mostly because I > only tend to think about these issues from a Windows POV, but it > shouldn't be hard to extend it to multi-platform. > Indeed -- the MacWheels projects are, of course single platform, but could be extended. though at the end of the day, there isn't much to share between building libs on different platforms (unless you are using a cross-platfrom build tool -- why I was trying out gattai for my stuff) The conda stuff is multi-platform, though, in fact, you have to write a separate build script for each platform -- it doesn't really provide anything to help with that part. But while these efforts are moving towards removing the need for every pacakge maintainer to build the deps -- we are now duplicating the effort of trying to remove duplication of effort :-) -- but maybe just waiting for something to gain momentum and rise to the top is the answer. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Sat May 16 21:04:25 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Sat, 16 May 2015 12:04:25 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Sat, May 16, 2015 at 11:54 AM, Paul Moore wrote: > > could you clarify a bit -- I thought that this could, at least, put a > dir on > > the search path that was specific to that python context. So it would > > require cooperation among all the packages being used at once, but not > get > > tangled up with the rest of the system. but maybe I'm wrong here -- I > have > > no idea what the heck I'm doing with this! > > Suppose Python adds C:\PythonXY\SharedDLLs to %PATH%. Suppose there's > a libpng.dll in there, for matplotlib. > I think we all agree that %PATH% is NOT the option! Taht is the key source od dll hell on Windows. I was referring to the SetDllDirectory API. I don't think that gets picked up by other processes. from: https://msdn.microsoft.com/en-us/library/windows/desktop/ms686203%28v=vs.85%29.aspx It looks like you can add a path, at run time, that gets searched for dlls before the rest of the system locations. And this does to effect any other applications. 
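i.e. something along these lines -- an untested sketch, Windows-only, with a made-up directory name:

import ctypes
import os
import sys

if sys.platform == "win32":
    # hypothetical per-environment directory where shared DLLs would be installed
    dll_dir = os.path.join(sys.prefix, "DLLs", "shared")
    # Adds dll_dir to *this process's* DLL search path, ahead of the system
    # directories, without touching %PATH% or any other process. Note that only
    # one directory can be set this way; a later call replaces the earlier one.
    ctypes.windll.kernel32.SetDllDirectoryW(dll_dir)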
But you'd need to make sure this got run before any of the effected packages where loaded -- which is proabbly what David meant by needing to "control the python binary". -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From dmertz at continuum.io Sat May 16 21:04:03 2015 From: dmertz at continuum.io (David Mertz) Date: Sat, 16 May 2015 12:04:03 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages Message-ID: I've just started monitoring this SIG to get a sense of the issues and status of things. I've also just started working for Continuum Analytics. Continuum has a great desire to make 'pip' work with conda packages. Obviously, we love for users to choose the Anaconda Python distribution but many will not for a variety of reasons (many good reasons). However, we would like for users of other distros still to be able to benefit from our creation of binary packages for many platforms in the conda format. As has been discussed in recent threads on dependency solving, the way conda provides metadata apart from entire packages makes much of that work easier. But even aside from that, there are simply a large number of well-tested packages (not only for Python, it is true, so that's possibly a wrinkle in the task) we have generated in conda format. It is true that right now, a user can in principle type: % pip install conda % conda install some_conda_package But that creates two separate systems for tracking what's installed and what dependencies are resolved; and many users will not want to convert completely to conda after that step. What would be better as a user experience would be to let users do this: % pip install --upgrade pip % pip install some_conda_package Whether that second command ultimately downloads code from pyip.python.org or from repo.continuum.io is probably less important for a user experience perspective. Continuum is very happy to upload all of our conda packages to PyPI if this would improve this user experience. Obviously, the idea here would be that the user would be able to type 'pip list' and friends afterward, and have knowledge of what was installed, even as conda packages. I'm hoping members of the SIG can help me understand both the technical and social obstacles that need to be overcome before this can happen. Yours, David... -- The dead increasingly dominate and strangle both the living and the not-yet born. Vampiric capital and undead corporate persons abuse the lives and control the thoughts of homo faber. Ideas, once born, become abortifacients against new conceptions. -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Sat May 16 21:40:28 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Sat, 16 May 2015 20:40:28 +0100 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 16 May 2015 at 20:04, David Mertz wrote: > What would be better as a user experience would be to let users do this: > > % pip install --upgrade pip > % pip install some_conda_package > > Whether that second command ultimately downloads code from pyip.python.org > or from repo.continuum.io is probably less important for a user experience > perspective. 
Continuum is very happy to upload all of our conda packages to > PyPI if this would improve this user experience. Obviously, the idea here > would be that the user would be able to type 'pip list' and friends > afterward, and have knowledge of what was installed, even as conda packages. > > I'm hoping members of the SIG can help me understand both the technical and > social obstacles that need to be overcome before this can happen. My immediate thought is, what obstacles stand in the way of a "conda to wheel" conversion utility? With such a utility, a wholesale conversion of conda packages to wheels, along with hosting those wheels somewhere (binstar? PyPI isn't immediately possible as only package owners can upload files), would essentially give this capability. There presumably are issues with this approach (maybe technical, more likely social) but it seems to me that understanding *why* this approach doesn't work would be a good first step towards identifying an actual solution. Paul From p.f.moore at gmail.com Sat May 16 21:36:22 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Sat, 16 May 2015 20:36:22 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 16 May 2015 at 20:04, Chris Barker wrote: > I was referring to the SetDllDirectory API. I don't think that gets picked > up by other processes. > > from: > > https://msdn.microsoft.com/en-us/library/windows/desktop/ms686203%28v=vs.85%29.aspx > > It looks like you can add a path, at run time, that gets searched for dlls > before the rest of the system locations. And this does to effect any other > applications. But you'd need to make sure this got run before any of the > effected packages where loaded -- which is proabbly what David meant by > needing to "control the python binary". Ah, sorry - I misunderstood you. This might work, but as you say, the DLL Path change would need to run before any imports needed it. Which basically means it needs to be part of the Python interpreter startup. It *could* be run as normal user code - you just have to ensure you run it before any imports that need shared libraries. But that seems very fragile to me. I'm not sure it's viable as a generic solution. Paul From chris.barker at noaa.gov Sat May 16 22:18:36 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Sat, 16 May 2015 13:18:36 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Sat, May 16, 2015 at 10:12 AM, Nick Coghlan wrote: > > Maybe, but it's a problem to be solved, and the Linux distros more or > less > > solve it for us, but OS-X and Windows have no such system built in (OS-X > > does have Brew and macports....) > > Windows 10 has Chocalatey and OneGet: > > * https://chocolatey.org/ > * > http://blogs.msdn.com/b/garretts/archive/2015/01/27/oneget-and-the-windows-10-preview.aspx > cool -- though I don't think we want the "official" python to depend on a third party system, and one get won't be available for most users for a LONG time... The fact that OS-X users have to choose between fink, macport, homebrew or roll-your-own is a MAJOR soruce of pain for supporting the OS-X community. "More than one way to do it" is not the goal. 
conda and nix then fill the niche for language independent packaging > at the user level rather than the system level. > yup -- conda is, indeed, pretty cool. > I think there is a bit of fuzz here -- cPython, at least, uses the "the > > operating system provided C/C++ > > dynamic linking system" -- it's not a totally independent thing. > > I'm specifically referring to the *declaration* of dependencies here. > sure -- that's my point about the current "missing link" -- setuptools, pip, etc, can only declare python-package-level dependencies, not binary-level dependencies. My idea is to bundle up a shared lib in a python package -- then, if you declare a dependency on that package, you've handles the dep issue. The trick is that a particular binary wheel depends on that other binary wheel -- rather than the whole package depending on it. (that is, on linux, it would have no dependency, on OS-X it would -- but then only the wheel built for a non-macports build, etc....). I think we could hack around this by monkey-patching the wheel after it is built, so may be worth playing with to see how it works before proposing any changes to the ecosystem. > And if you are using something like conda you don't need pip > > or wheels anyway! > > Correct, just as if you're relying solely on Linux system packages, > you don't need pip or wheels. Aside from the fact that conda is > cross-platform, the main difference between the conda community and a > Linux distro is in the *kind* of software we're likely to have already > done the integration work for. > sure. but the cross-platform thing is BIG -- we NEED pip and wheel because rpm, or deb, or ... are all platform and distro dependent -- we want a way for package maintainers to support a broad audience without having to deal with 12 different package systems. The key to understanding the difference in the respective roles of pip > and conda is realising that there are *two* basic distribution > scenarios that we want to be able to cover (I go into this in more > detail in > https://www.python.org/dev/peps/pep-0426/#development-distribution-and-deployment-of-python-software > ): > hmm -- sure, they are different, but is it impossible to support both with one system? > * software developer/publisher -> software integrator/service operator > (or data analyst) > * software developer/publisher -> software integrator -> service > operator (or data analyst) > ... > On the consumption side, though, the nature of the PyPA tooling as a > platform-independent software publication toolchain means that if you > want to consume the PyPA formats directly, you need to be prepared to > do your own integration work. Exactly! and while Linux system admins can do their own system integration work, everyday users (and many Windows sys admins) can't, and we shouldn't expect them to. And, in fact, the PyPA tooling does support the more casual user much of the time -- for example, I'm in the third quarter of a Python certification class -- Intro, Web development, Advanced topics -- and only half way through the third class have I run into any problems with sticking with the PyPA tools. (except for pychecker -- not being on Pypi :-( ) Many public web service developers are > entirely happy with that deal, but most system administrators and data > analysts trying to deal with components written in multiple > programming languages aren't. > exactly -- but it's not because the audience is different in their role -- it's because different users need different python packages. 
The PyPA tools support pure-python great -- and compiled extensions without deps pretty well -- but there is a bit of gap with extensions that require other deps. It's a 90% (95%) solution... It'd be nice to get it to a 99% solution. Where is really gets ugly is where you need stuff that has nothing to do with python -- say a Julia run-time, or ... Anaconda is there to support that: their philosophy is that if you are trying to do full-on data analysis with python, you are likely to need stuff strickly beyond the python ecosystem -- your own Fortran code, numpy (which requires LLVM), etc. Maybe they are right -- but there is still a heck of a lot of stuff that you can do and stay within python, and it would be good if it was easier for web developers to use a bit of numpy, or matplotlib, or pandas in their web apps -- without having to jump to the "scipy stack" ecosystem (which does not support the web dev stuff that well yet... If you look at those pipelines from the service operator/data analyst > end, then the *first* question to ask is "Is there a software > integrator that targets the audience I am a member of?". I think that's part of my point here -- I bridge two communities -- the scientific community says: just use Anaconda or Canopy or ...., but the web developer community says "use python.org, pip, and pypi". If you need to both, there is a gap. > When it isn't, it's either > a sign that those of us in the "software integrator" role aren't > meeting the needs of our audience adequately, sure -- but where does PyPA fit in here -- having binary wheels and pypi puts us in teh role of integator -- and we aren't meeting the needs of a broad enough audience as we could -- that's my point there. If we didn't want to be an "integrator", we could have not build pypi, or pip, or wheel.... conda, rpm, macports, etc doesn't need those. I think PyPA tools could meet a braoder need with not much fudging. In some sense,the only question I have at this point is whether there is a compelling reason to better support dynamic libs -- if not, then, as Paul pointed out, all we need is a more coordinated community effort (not easy, but not a tooling question) > option (2) would be to extend python's import mechanism a bit to allow it > to > > do a raw "link in this arbitrary lib" action, so the lib would not have > to > > be wrapped in a python module -- I don't know how possible that is, or > if it > > would be worth it. > > Your option 2 is specifically the kind of thing I don't want to > support, as it's incredibly hard to do right (to the tune of "people > will pay you millions of dollars a year to reduce-or-eliminate their > ABI compatibility concerns"), and has the potential to replace the > current you-need-to-be-able-build-this-from-source-yourself issue with > "oh, look, now you have a runtime ABI incompatibility, have fun debugging that one, buddy". > fair enough -- that could be a pretty ugly nightmare. Your option 1 seems somewhat more plausible, as I believe it should > theoretically be possible to use the PyCObject/PyCapsule API (or even > just normal Python objects) to pass the relevant shared library > details from a "master" module that determines which versions of > external libraries to link against, to other modules that always want > to load them, in a way that ensures everything is linking against a > version that it is ABI compatible with. 
> > That would require someone to actually work on the necessary tooling > to help with that though, as you wouldn't be able to rely on the > implicit dynamic linking provided by C/C++ toolchains any more. > Probably the best positioned to tackle that idea would be the Cython > community, since they could generate all the required cross-platform > boilerplate code automatically. > good idea -- I'm tied in with those folks -- if I have to do any C stuff I turn to Cython already... > > sure, but if it's ALSO a Python package manger, then why not? i.e. conda > -- > > if we all used conda, we wouldn't need pip+wheel. > > conda's not a Python package manager, it's a language independent > package manager that was born out of the Scientific Python community > and includes Python as one of its supported languages, just like nix, > deb, rpm, etc. > indeed -- but it does have a bunch of python-specific features....it was built around the need to combine python with other systems. That makes it an interesting alternative to pip on the package > *consumption* side for data analysts, but it isn't currently a good > fit for any of pip's other use cases (e.g. one of the scenarios I'm > personally most interested in is that pip is now part of the > Fedora/RHEL/CentOS build pipeline for Python based RPM packages - we > universally recommend using "pip install" in the %install phase over > using "setup.py install" directly) > hmm -- conda generally uses "setup.py install" in its build scripts. And it doesn't use pip install because it wants to handle the downloading and dependencies itself (in fact, turning OFF setuptools dependency handling is an annoyance..) So I'm not sure why pip is needed here -- would it be THAT much harder to build rpms of python packages if it didn't exist? (I do see why you wouldn't want to use conda to build rpms..) But while _maybe_ if conda had been around 5 years earlier we could have not bothered with wheel, I'm not proposing that we drop it -- just that we push pip and wheel a bit farther to broaden the supported user-base. > Binary wheels already work for Python packages that have been > developed with cross-platform maintainability and deployability taken > into account as key design considerations (including pure Python > wheels, where the binary format just serves as an installation > accelerator). That category just happens to exclude almost all > research and data analysis software, because it excludes the libraries > at the bottom of that stack It doesn't quite exclude those -- just makes it harder. And while depending on Fortran, etc, is pretty unique to the data analysis stack, stuff like libpng, libcurl, etc, etc, isn't -- non-system libs are not a rare thing. > It's also the case that when you *are* doing your own system > integration, wheels are a powerful tool for caching builds, conda does this nicely as well :-) I"m not tlrying to argue, at all, that binary wheels are useless, jsu that they could be a bit more useful. > Ah -- here is a key point -- because of that, we DO support binary > packages > > on PyPi -- but only for Windows and OS-X.. I'm just suggesting we find a > way > > to extend that to pacakges that require a non-system non-python > dependency. > > At the point you're managing arbitrary external binary dependencies, > you've lost all the constraints that let us get away with doing this > for extension modules without adequate metadata, and are back to > trying to solve the same arbitrary ABI problem that exists on Linux. 
> I still don't get that -- any binary extension needs to match the ABI of the python is it used with -- a shared lib is the same problem. The line is drawn at ABI compatibility management. We're able to fuzz > that line a little bit in the case of Windows and Mac OS X extension > modules because we have the python.org CPython releases to act as an > anchor for the ABI definition. > > We don't have that at all on other *nix platforms, and we don't have > it on Windows and Mac OS X either once we move beyond the CPython C > ABI (which encompasses the underlying platform ABI) > Showing my ignorance here -- what else is there we want to support (fortran ABI maybe?) We *might* be able to get to the point of being able to describe > platform ABIs well enough to allow public wheels for arbitrary > platforms, That would be cool -- but not what I'm talking about here. I'm only talking about the ABIs we already describe. > > > Are those the targets for binary wheels? I don't think so. > > Yes, they'll likely end up being one of Fedora's targets for prebuilt > wheel files: > https://fedoraproject.org/wiki/Env_and_Stacks/Projects/UserLevelPackageManagement > cool -- but it is Fedora that will be building those wheels -- so a systems integrator. > but if you statically link, you need to build the static package right > > anyway -- so it doesn't actually solve the problem at hand anyway. > > Yes it does - you just need to make sure your build environment > suitably matches your target deployment environment. > "just?" -- that's can actually be a major pain -- at least on OS-X. > So what would be good is a way to specify a "this build" dependency. That > > can be hacked in, of course, but nicer not to have to. > > By the time you've solved all these problems I believe you'll find you > have reinvented conda ;) > I really do have less lofty goals than that, but yes -- no point in going down that route! Anyway -- I've take a lot of my time (and a bunch of others on this list). And where ai think we are at is: * No one else seems to think it's worth trying to extend the PyPa ecosystem a bit more to better support dynamic libs. (except _maybe_ Enthought?) * I still think it can be done with minimal changes, and hacked in to do the proof of concept * But I'm not sure it's something that's going to get to the top of my ToDo list anyway -- I can get my needs met with conda anyway. My real production work is deep in the SciPy stack. * So I may or may not move my ideas forward -- if I do, I'll be back with questions and maybe a more concrete proposal some day.... But I learned a lot from this conversation -- thanks! -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Sat May 16 22:38:50 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Sat, 16 May 2015 13:38:50 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Sat, May 16, 2015 at 12:04 PM, David Mertz wrote: > Continuum has a great desire to make 'pip' work with conda packages. > Obviously, we love for users to choose the Anaconda Python distribution but > many will not for a variety of reasons (many good reasons). 
> Hmm -- this strikes me as very, very , tricky -- and of course, tied in to the other thread I've been spending a bunch of time on... However, we would like for users of other distros still to be able to > benefit from our creation of binary packages for many platforms in the > conda format. > Frankly, if you want your efforts at building binaries to get used outside of Anaconda, then you shoudl be building wheels in the first place. While conda does more than pip + wheel can do -- I suppose you _could_ use wheels for the things it can support.. But on to the technical issues: conda python packages depend on other conda packages, and some of those packages are not python packages at all. The common use case here are non-python dynamic libs -- exactly the use case I've been going on in the other thread about... And conda installs those dynamic libs in a conda environment -- outside of the python environment. So you can't really use a conda package without a conda enviroment, and an installer that understands that environment (I think conda install does some lib path re-naming, yes?), i.e. conda itself. So I think that's kind of a dead end. So what about the idea of a conda-package-to-wheel converter? conda packages an wheels have a bit in common -- IIUC, they are both basically a zip of all the files you need installed. But again the problem is those dependencies on third party dynamic libs. So far that to work -- pip+wheel would have to grow a way to deal with installing, managing and using dynamic libs. See the other thread for the nightmare there... And while I'd love to see this happen, perhaps an easier route would be for conda_build to grow a "static" flag that will statically link stuff and get to somethign already supported by pip, wheel, and pypi. -Chris > > It is true that right now, a user can in principle type: > > % pip install conda > % conda install some_conda_package > > But that creates two separate systems for tracking what's installed and > what dependencies are resolved; > Indeed -- which is why some folks are working on making it easier to use conda for everything....converting a wheel to a conda package is probably easier than the other way around.. Funny -- just moments ago I wrote that it didn't seem that anyone other than me was interested in extending pip_wheel to support this kind of thing -- I guess I was wrong! Great to see you and continuum thinking about this. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Sun May 17 00:03:13 2015 From: donald at stufft.io (Donald Stufft) Date: Sat, 16 May 2015 18:03:13 -0400 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: <0591E08A-9608-4D8E-9F39-7924E2593F79@stufft.io> > On May 16, 2015, at 3:04 PM, David Mertz wrote: > > I've just started monitoring this SIG to get a sense of the issues and status of things. I've also just started working for Continuum Analytics. > > Continuum has a great desire to make 'pip' work with conda packages. Obviously, we love for users to choose the Anaconda Python distribution but many will not for a variety of reasons (many good reasons). 
> > However, we would like for users of other distros still to be able to benefit from our creation of binary packages for many platforms in the conda format. As has been discussed in recent threads on dependency solving, the way conda provides metadata apart from entire packages makes much of that work easier. But even aside from that, there are simply a large number of well-tested packages (not only for Python, it is true, so that's possibly a wrinkle in the task) we have generated in conda format. > > It is true that right now, a user can in principle type: > > % pip install conda > % conda install some_conda_package > > But that creates two separate systems for tracking what's installed and what dependencies are resolved; and many users will not want to convert completely to conda after that step. > > What would be better as a user experience would be to let users do this: > > % pip install --upgrade pip > % pip install some_conda_package > > Whether that second command ultimately downloads code from pyip.python.org or from repo.continuum.io is probably less important for a user experience perspective. Continuum is very happy to upload all of our conda packages to PyPI if this would improve this user experience. Obviously, the idea here would be that the user would be able to type 'pip list' and friends afterward, and have knowledge of what was installed, even as conda packages. > > I'm hoping members of the SIG can help me understand both the technical and social obstacles that need to be overcome before this can happen. As Paul mentioned, I?m not sure I see a major benefit to being able to ``pip install`` a conda package that doesn?t come with a lot of footguns, since any conda package either won?t be able to depend on things like Python or random C libraries or we?re going to have to just ignore those dependencies or what have you. I think a far more workable solution is one that translates a conda package to a Wheel. Practically speaking the only real benefit that conda packages has over pip is the one benefit that simply teaching pip to install conda packages won?t provide - Namely that it supports things which aren?t Python packages. However I don?t think it?s likely that we?re going to be able to install R or erlang or whatever into a virtual environment (for instance), but maybe I?m wrong. There are a few other benefits, but that?s not anything that are inherent in the two different approaches, it?s just things that conda has that pip is planning on getting, it just hasn?t gotten them yet because either we have to convince people to publish our new formats (e.g. we can?t go out and create a wheel repo of common packages) or because we haven?t gotten to it yet because dealing with the crushing legacy of PyPI?s ~400k packages is significant slow down factor. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From donald at stufft.io Sun May 17 01:16:55 2015 From: donald at stufft.io (Donald Stufft) Date: Sat, 16 May 2015 19:16:55 -0400 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: <0591E08A-9608-4D8E-9F39-7924E2593F79@stufft.io> Message-ID: <4871D671-A795-41F2-882B-AE43673F6E76@stufft.io> > On May 16, 2015, at 7:09 PM, Chris Barker wrote: > > On Sat, May 16, 2015 at 3:03 PM, Donald Stufft > wrote: > There are a few other benefits, but that's not anything that is inherent in the two different approaches, it's just things that conda has that pip is planning on getting, > > Huh? I'm confused -- didn't we just have a big thread about how pip+wheel probably ISN'T going to handle shared libs -- that those are exactly what conda packages do provide -- aside from R and Erlang, anyway :-) > > but it's not the packages in this case that we need -- it's the environment -- and I can't see how pip is going to provide a conda environment.... I never said pip was going to provide an environment, I said the main benefit conda has over pip, which pip will most likely not get in any reasonable time frame, is that it handles things which are not Python packages. A shared library is not a Python package so I'm not sure what this message is even saying? ``pip install lxml-from-conda`` is just going to flat out break because pip won't install the libxml2 shared library. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From chris.barker at noaa.gov Sun May 17 01:09:03 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Sat, 16 May 2015 16:09:03 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: <0591E08A-9608-4D8E-9F39-7924E2593F79@stufft.io> References: <0591E08A-9608-4D8E-9F39-7924E2593F79@stufft.io> Message-ID: On Sat, May 16, 2015 at 3:03 PM, Donald Stufft wrote: > There are a few other benefits, but that's not anything that is inherent > in the two different approaches, it's just things that conda has that pip > is planning on getting, > Huh? I'm confused -- didn't we just have a big thread about how pip+wheel probably ISN'T going to handle shared libs -- that those are exactly what conda packages do provide -- aside from R and Erlang, anyway :-) but it's not the packages in this case that we need -- it's the environment -- and I can't see how pip is going to provide a conda environment.... -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Sun May 17 02:51:32 2015 From: donald at stufft.io (Donald Stufft) Date: Sat, 16 May 2015 20:51:32 -0400 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Message-ID: Ok, so unless someone comes out against this in the near future here are my plans:
1. Implement the ability to delete documentation.

2. Implement the ability to add a (simple) redirect where we would essentially just send //(.*) to $REDIRECT_BASE/$1.

3. Implement the ability to point the documentation URL to something that isn't pythonhosted.org.

4. Send an email out to all projects that are currently utilizing the hosted documentation telling them that it is going away, and give them links to RTD and GitHub Pages and whatever bitbucket calls their service.

5. Disable Documentation Uploads to PyPI with an error message that tells people the service has been discontinued.

In addition to the above steps, we'll maintain any documentation that doesn't get deleted (and the above redirects) indefinitely. Serving static read-only documentation (other than deletes) is something that we can do without much trouble or cost. I think that this will cover all of the things that people in this thread have brought up as well as providing a sane migration path to go from pythonhosted.org documentation to wherever they choose to place their docs in the future. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From chris.barker at noaa.gov Sun May 17 02:50:30 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Sat, 16 May 2015 17:50:30 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: <4871D671-A795-41F2-882B-AE43673F6E76@stufft.io> References: <0591E08A-9608-4D8E-9F39-7924E2593F79@stufft.io> <4871D671-A795-41F2-882B-AE43673F6E76@stufft.io> Message-ID: On Sat, May 16, 2015 at 4:16 PM, Donald Stufft wrote: > On Sat, May 16, 2015 at 3:03 PM, Donald Stufft wrote: > >> There are a few other benefits, but that's not anything that is inherent >> in the two different approaches, it's just things that conda has that pip >> is planning on getting, >> > > Huh? I'm confused -- didn't we just have a big thread about how pip+wheel > probably ISN'T going to handle shared libs -- that those are exactly what > conda packages do provide -- aside from R and Erlang, anyway :-) > > but it's not the packages in this case that we need -- it's the > environment -- and I can't see how pip is going to provide a conda > environment.... > > > I never said pip was going to provide an environment, I said the main > benefit conda has over pip, which pip will most likely not get in any > reasonable time frame, is that it handles things which are not Python > packages. > well, I got a bit distracted by Erlang and R -- i.e. things that have nothing to do with python packages. libxml, on the other hand, is a lib that one might want to use with a python package -- so a bit more apropos here. But my confusion was about: "things that conda has that pip is planning on getting" -- what are those things? Any of the stuff that conda has that's really useful, like handling shared libs, pip is NOT getting -- yes? > A shared library is not a Python package so I'm not sure what this message > is even saying? ``pip install lxml-from-conda`` is just going to flat out > break because pip won't install the libxml2 shared library. > exactly -- if you're going to install a shared lib, you need somewhere to put it -- and that's what a conda environment provides.
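To make the "somewhere to put it" problem concrete: with pip and wheel as they stand, about the only place a project can reliably ship a shared lib is inside its own package directory, loading it explicitly at run time. A minimal sketch of that workaround in Python -- the mypkg/_libs layout and the libfoo name are made up for illustration; nothing in pip or wheel defines them:

    import ctypes
    import os

    _HERE = os.path.dirname(os.path.abspath(__file__))

    def _load_bundled_lib(filename):
        # Prefer the copy shipped inside the package (e.g. mypkg/_libs/libfoo.so,
        # a hypothetical layout), falling back to the normal loader search path.
        bundled = os.path.join(_HERE, "_libs", filename)
        if os.path.exists(bundled):
            return ctypes.CDLL(bundled)
        return ctypes.CDLL(filename)

    # e.g. _libfoo = _load_bundled_lib("libfoo.so")

Each package that does this carries its own copy of the lib, which is exactly the duplication that a shared location (a conda environment or otherwise) is meant to avoid.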
Trying not to go around in circles, but python _could_ provide a standard place in which to put shared libs -- and then pip _could_ provide a way to manage them. That would require dealing with that whole binary API problem, so we probably won't do it. I'm not sure what the point of contention is here: I think it would be useful to have a way to manage shared libs solely for python packages to use -- and it would be useful for that to be part of the standard python ecosystem. Others may not think it would be useful enough to be worth the pain in the neck it would be. And that's what the nifty conda packages continuum (and others) have built could provide -- those shared libs that are built in a compatible way with a python binary. After all, pure python packages are no problem, compiled python packages without any dependencies are little problem. The hard part is those darn third party libs. conda also provides a way to manage all sorts of other stuff that has nothing to do with python, but I'm guessing that's not what continuum would like to contribute to pypi.... -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Sun May 17 03:05:51 2015 From: donald at stufft.io (Donald Stufft) Date: Sat, 16 May 2015 21:05:51 -0400 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: <0591E08A-9608-4D8E-9F39-7924E2593F79@stufft.io> <4871D671-A795-41F2-882B-AE43673F6E76@stufft.io> Message-ID: <0C03CE88-C94B-4189-8431-6DCCDA40E550@stufft.io> > On May 16, 2015, at 8:50 PM, Chris Barker wrote: > > On Sat, May 16, 2015 at 4:16 PM, Donald Stufft > wrote: >> On Sat, May 16, 2015 at 3:03 PM, Donald Stufft > wrote: >> There are a few other benefits, but that's not anything that is inherent in the two different approaches, it's just things that conda has that pip is planning on getting, >> >> Huh? I'm confused -- didn't we just have a big thread about how pip+wheel probably ISN'T going to handle shared libs -- that those are exactly what conda packages do provide -- aside from R and Erlang, anyway :-) >> >> but it's not the packages in this case that we need -- it's the environment -- and I can't see how pip is going to provide a conda environment.... > > I never said pip was going to provide an environment, I said the main benefit conda has over pip, which pip will most likely not get in any reasonable time frame, is that it handles things which are not Python packages. > > well, I got a bit distracted by Erlang and R -- i.e. things that have nothing to do with python packages. > > libxml, on the other hand, is a lib that one might want to use with a python package -- so a bit more apropos here. > > But my confusion was about: "things that conda has that pip is planning on getting" -- what are those things? Any of the stuff that conda has that's really useful, like handling shared libs, pip is NOT getting -- yes? The ability to resolve dependencies with static metadata is the major one that comes to my mind that's specific to pip. The ability to have better build systems besides distutils/setuptools is a more ecosystem level one but that's something we'll get too. As far as shared libs...
beyond what's already possible (sticking a shared lib inside of a python project and having libraries load that .dll explicitly) it's not currently on the road map and may never be. I hesitate to say never because it's obviously a problem that needs to be solved, and if the Python ecosystem solves it (specific to shared libraries, not whole runtimes or other languages or what have you) then that would be a useful thing. I think we have lower hanging fruit that we need to deal with before something like that is even possibly on the radar though (if we ever put it on the radar). > > A shared library is not a Python package so I'm not sure what this message is even saying? ``pip install lxml-from-conda`` is just going to flat out break because pip won't install the libxml2 shared library. > > exactly -- if you're going to install a shared lib, you need somewhere to put it -- and that's what a conda environment provides. > > Trying not to go around in circles, but python _could_ provide a standard place in which to put shared libs -- and then pip _could_ provide a way to manage them. That would require dealing with that whole binary API problem, so we probably won't do it. I'm not sure what the point of contention is here: > > I think it would be useful to have a way to manage shared libs solely for python packages to use -- and it would be useful for that to be part of the standard python ecosystem. Others may not think it would be useful enough to be worth the pain in the neck it would be. > > And that's what the nifty conda packages continuum (and others) have built could provide -- those shared libs that are built in a compatible way with a python binary. After all, pure python packages are no problem, compiled python packages without any dependencies are little problem. The hard part is those darn third party libs. > > conda also provides a way to manage all sorts of other stuff that has nothing to do with python, but I'm guessing that's not what continuum would like to contribute to pypi.... I guess I'm confused what the benefit of making pip able to install a conda package would be. If Python adds someplace for shared libs to go then we could just add shared lib support to Wheels, it's just another file type so that's not a big deal. The hardest part is dealing with ABI compatibility. However, given the current state of things, what's the benefit of being able to do ``pip install conda-lxml``? Either it's going to flat out break or you're going to have to do ``conda install libxml2`` first, and if you're doing ``conda install libxml2`` first then why not just do ``conda install lxml``? I view conda the same way I view apt-get, yum, Chocolatey, etc. It provides an environment and you can install a Python package into that environment, but pip shouldn't know how to install a .deb or a .rpm or a conda package because those packages rely on specifics of that environment and Python packages can't. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From ben+python at benfinney.id.au Sun May 17 03:31:53 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Sun, 17 May 2015 11:31:53 +1000 Subject: [Distutils] PyPI and Uploading Documentation References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Message-ID: <85617s0z1y.fsf@benfinney.id.au> Donald Stufft writes: > Ok, so unless someone comes out against this in the near future here are my > plans: > > 1. Implement the ability to delete documentation. +1. > 2. Implement the ability to add a (simple) redirect where we would > essentially just send //(.*) to $REDIRECT_BASE/$1. > > 3. Implement the ability to point the documentation URL to something > that isn't pythonhosted.org Both of these turn PyPI into a vector for arbitrary content, including (for example) illegal, misleading, or malicious content. Automatic redirects actively expose the visitor to any malicious or mistaken links set by the project owner. If you want to allow the documentation to be at some arbitrary location of the project owner's choice, then an explicit static link, which the visitor must click on (similar to the project home page link) is best. -- \ "I find the whole business of religion profoundly interesting. | `\ But it does mystify me that otherwise intelligent people take | _o__) it seriously." --Douglas Adams | Ben Finney From donald at stufft.io Sun May 17 03:33:49 2015 From: donald at stufft.io (Donald Stufft) Date: Sat, 16 May 2015 21:33:49 -0400 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: <85617s0z1y.fsf@benfinney.id.au> References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> <85617s0z1y.fsf@benfinney.id.au> Message-ID: > On May 16, 2015, at 9:31 PM, Ben Finney wrote: > > Donald Stufft writes: > >> Ok, so unless someone comes out against this in the near future here are my >> plans: >> >> 1. Implement the ability to delete documentation. > > +1. > >> 2. Implement the ability to add a (simple) redirect where we would >> essentially just send //(.*) to $REDIRECT_BASE/$1. >> >> 3. Implement the ability to point the documentation URL to something >> that isn't pythonhosted.org > > Both of these turn PyPI into a vector for arbitrary content, including > (for example) illegal, misleading, or malicious content. > > Automatic redirects actively expose the visitor to any malicious or > mistaken links set by the project owner. > > If you want to allow the documentation to be at some arbitrary location > of the project owner's choice, then an explicit static link, which the > visitor must click on (similar to the project home page link) is best. > To be clear, the documentation isn't hosted on PyPI, it's hosted on pythonhosted.org and we already allow people to upload arbitrary content to that domain, which can include JS based redirects. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed...
Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From ncoghlan at gmail.com Sun May 17 05:48:55 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 17 May 2015 13:48:55 +1000 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 17 May 2015 06:19, "Chris Barker" wrote: > indeed -- but it does have a bunch of python-specific features....it was built around the need to combine python with other systems. > >> That makes it an interesting alternative to pip on the package >> *consumption* side for data analysts, but it isn't currently a good >> fit for any of pip's other use cases (e.g. one of the scenarios I'm >> personally most interested in is that pip is now part of the >> Fedora/RHEL/CentOS build pipeline for Python based RPM packages - we >> universally recommend using "pip install" in the %install phase over >> using "setup.py install" directly) > > > hmm -- conda generally uses "setup.py install" in its build scripts. And it doesn't use pip install because it wants to handle the downloading and dependencies itself (in fact, turning OFF setuptools dependency handling is an annoyance..) > > So I'm not sure why pip is needed here -- would it be THAT much harder to build rpms of python packages if it didn't exist? (I do see why you wouldn't want to use conda to build rpms..) We switched to recommending pip to ensure that the Fedora (et al) build toolchain can be updated to emit & handle newer Python metadata standards just by upgrading pip. For example, it means that system installed packages on modern Fedora installations should (at least in theory) provide full PEP 376 installation metadata with the installer reported as the system package manager. The conda folks (wastefully, in my view) are still attempting to compete directly with pip upstream, instead of delegating to it from their build scripts as an abstraction layer that helps hide the complexity of the Python packaging ecosystem. > But while _maybe_ if conda had been around 5 years earlier we could have not bothered with wheel, No, we couldn't, as conda doesn't work as well for system integrators. > I'm not proposing that we drop it -- just that we push pip and wheel a bit farther to broaden the supported user-base. I can't stop you working on something I consider a deep rabbithole, but why not just recommend the use of conda, and only pubish sdists on PyPI? conda needs more users and contributors seeking better integration with the PyPA tooling, and minimising the non-productive competition. The web development folks targeting Linux will generally be in a position to build from source (caching the resulting wheel file, or perhaps an entire container image). Also, assuming Fedora's experiment with language specific repos goes well ( https://fedoraproject.org/wiki/Env_and_Stacks/Projects/LanguageSpecificRepositories), we may see other distros replicating that model of handling the wheel creation task on behalf of their users. It's also worth noting that one of my key intended use cases for metadata extensions is to publish platform specific external dependencies in the upstream project metadata, which would get us one step closer to fully automated repackaging into policy compliant redistributor packages. 
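To make the metadata-extension idea above a bit more concrete, here is a purely hypothetical sketch (written as a Python dict) of what declaring external binary dependencies in upstream metadata might look like. The extension name and field names are invented for illustration; nothing like this is defined in PEP 426 or any other accepted spec:

    metadata = {
        "name": "lxml",
        "version": "3.4.4",
        "extensions": {
            # hypothetical extension name and fields
            "example.external-libs": {
                "requires": [
                    {"library": "libxml2", "abi": ">= 2.9"},
                    {"library": "libxslt", "abi": ">= 1.1"},
                ],
            },
        },
    }

A redistributor's tooling (deb/rpm/conda) could map those library names onto its own packages when repackaging, even if pip itself never acted on them.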
>> Binary wheels already work for Python packages that have been >> developed with cross-platform maintainability and deployability taken >> into account as key design considerations (including pure Python >> wheels, where the binary format just serves as an installation >> accelerator). That category just happens to exclude almost all >> research and data analysis software, because it excludes the libraries >> at the bottom of that stack > > > It doesn't quite exclude those -- just makes it harder. And while depending on Fortran, etc, is pretty unique to the data analysis stack, stuff like libpng, libcurl, etc, etc, isn't -- non-system libs are not a rare thing. The rare thing is having two packages which are tightly coupled to the ABI of a given external dependency. That's a generally bad idea because it causes exactly these kinds of problems with independent distribution of prebuilt components. The existence of tight ABI coupling between components both gives the scientific Python stack a lot of its power, *and* makes it almost as hard to distribute in binary form as native GUI applications. >> It's also the case that when you *are* doing your own system >> integration, wheels are a powerful tool for caching builds, > conda does this nicely as well :-) I'm not trying to argue, at all, that binary wheels are useless, just that they could be a bit more useful. A PEP 426 metadata extension proposal for describing external binary dependencies would certainly be a welcome addition. That's going to be a common need for automated repackaging tools, even if we never find a practical way to take advantage of it upstream. >> > Ah -- here is a key point -- because of that, we DO support binary packages >> >> > on PyPi -- but only for Windows and OS-X.. I'm just suggesting we find a way >> > to extend that to packages that require a non-system non-python dependency. >> >> At the point you're managing arbitrary external binary dependencies, >> you've lost all the constraints that let us get away with doing this >> for extension modules without adequate metadata, and are back to >> trying to solve the same arbitrary ABI problem that exists on Linux. > > > I still don't get that -- any binary extension needs to match the ABI of the python it is used with -- a shared lib is the same problem. No, it's not, because we don't have python.org defining the binary ABI for anything except CPython itself. That's what constrains the target ABIs for extension modules on Windows & Mac OS X to a feasible number evolving at a feasible rate. A large part of what *defines* a platform is making decisions about the ABI to publish & target. Linux distros, nix, conda do that for everything they redistribute. I assume chocolatey does as well (I'm not sure if the Mac systems do prebuilt binaries) >> The line is drawn at ABI compatibility management. We're able to fuzz >> that line a little bit in the case of Windows and Mac OS X extension >> modules because we have the python.org CPython releases to act as an >> anchor for the ABI definition. >> >> We don't have that at all on other *nix platforms, and we don't have >> it on Windows and Mac OS X either once we move beyond the CPython C >> ABI (which encompasses the underlying platform ABI) > > > Showing my ignorance here -- what else is there we want to support (fortran ABI maybe?)
Every single external binary dependency where the memory layout may be exposed to other extension modules when loaded into the same process becomes an ABI compatibility concern, with structs changing sizes so they don't fit in the allocated space any more being one of the most common issues. The stable ABI PEP has some additional background on that: https://www.python.org/dev/peps/pep-0384/ >> We *might* be able to get to the point of being able to describe >> platform ABIs well enough to allow public wheels for arbitrary >> platforms, > > That would be cool -- but not what I'm talking about here. I'm only talking about the ABIs we already describe. Except you're not, as we don't currently describe the ABIs for arbitrary external dependencies at all. > Anyway -- I've take a lot of my time (and a bunch of others on this list). And where ai think we are at is: > > * No one else seems to think it's worth trying to extend the PyPa ecosystem a bit more to better support dynamic libs. (except _maybe_ Enthought?) I know Donald is keen to see this, and a lot of ideas become more feasible if(/when?) PyPI gets an integrated wheel build farm. At that point, we can use the "centre of gravity" approach by letting the build farm implicitly determine the "standard" version of particular external dependencies, even if we can't communicate those versions effectively in the metadata. I'm also interested in metadata extensions to describe external binary dependencies, but I'm not sure that can be done sensibly in a platform independent way (in which case, it would be better to let various communities define their own extensions). > * I still think it can be done with minimal changes, and hacked in to do the proof of concept I'm still not clear on what "it" is. I've been pointing out how hard it is to do this right in the general case, but I get the impression you're actually more interested in the narrower case of defining a "SciPy ABI" that encompasses selected third party binary dependencies. That's a more attainable goal, but NumFocus or the SciPy community would likely be a better audience for that discussion than distutils-sig. > * But I'm not sure it's something that's going to get to the top of my ToDo list anyway -- I can get my needs met with conda anyway. My real production work is deep in the SciPy stack. > * So I may or may not move my ideas forward -- if I do, I'll be back with questions and maybe a more concrete proposal some day.... If I'm correct that your underlying notion is "It would be nice if there was an agreed SciPy ABI we could all build wheels against, and a way of publishing the external dependency manifest in the wheel metadata so it could be checked for consistency at installation time", that's a far more tractable problem than the "arbitrary binary dependencies" one. Attempting to solve the latter is what I believe leads to reinventing something functionally equivalent to conda, while I expect attempting to solve the former is likely to be more of a political battle than a technical one :) Cheers, Nick. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ncoghlan at gmail.com Sun May 17 09:05:11 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 17 May 2015 17:05:11 +1000 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 17 May 2015 at 05:04, David Mertz wrote: > What would be better as a user experience would be to let users do this: > > % pip install --upgrade pip > % pip install some_conda_package This gets the respective role of the two tools reversed - it's like my asking for "pip install some_fedora_rpm" to be made to work. However, having conda use "pip install" in its build scripts so that it reliably generates pip compatible installation metadata would be a possibility worth discussing - that's what we've started doing in Fedora, so that runtime utilities like pkg_resources can work correctly. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From p.f.moore at gmail.com Sun May 17 13:36:39 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Sun, 17 May 2015 12:36:39 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 17 May 2015 at 04:48, Nick Coghlan wrote: > A large part of what *defines* a platform is making decisions about the ABI > to publish & target. Linux distros, nix, conda do that for everything they > redistribute. I assume chocolatey does as well I'm picking on this because it seems to be a common misconception about what Chocolatey provides on Windows. As far as I understand, Chocolatey does *not* provide a "platform" in this sense at all. The installers hosted by Chocolatey are typically nothing more than repackaged upstream installers (or maybe just scripting around downloading and running upstream installers directly), with a nice command line means of discovering and installing them. >From the Chocolatey FAQ: """ What does Chocolatey do? Are you redistributing software? Chocolatey does the same thing that you would do based on the package instructions. This usually means going out and downloading an installer from the official distribution point and then silently installing it on your machine. With most packages this means Chocolatey is not redistributing software because they are going to the same distribution point that you yourself would go get the software if you were performing this process manually. """ So AIUI, for example, if you install Python with Chocolatey, it just downloads and runs the python.org installer behind the scenes. Also, Chocolatey explicitly doesn't handle libraries - it is an application installer only. So there's no dependency management, or sharing of libraries beyond that which application installers do natively. Disclaimer: I haven't used Chocolatey much, except for some experimenting. This is precisely *because* it doesn't add much beyond application installs, which I'm pretty much happy handling myself. But it does mean that I could have missed some aspects of what it provides. 
Paul From robertc at robertcollins.net Sun May 17 15:41:24 2015 From: robertc at robertcollins.net (Robert Collins) Date: Mon, 18 May 2015 01:41:24 +1200 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 17 May 2015 at 19:05, Nick Coghlan wrote: > On 17 May 2015 at 05:04, David Mertz wrote: >> What would be better as a user experience would be to let users do this: >> >> % pip install --upgrade pip >> % pip install some_conda_package > > This gets the respective role of the two tools reversed - it's like my > asking for "pip install some_fedora_rpm" to be made to work. > > However, having conda use "pip install" in its build scripts so that > it reliably generates pip compatible installation metadata would be a > possibility worth discussing - that's what we've started doing in > Fedora, so that runtime utilities like pkg_resources can work > correctly. So you're generating a record file and then copying it into place in the rpm's? I'm not sure if that's an external-stable-thing yet, perhaps it is. The distutils data directory with the METADATA etc is what pkg_resources needs (and is preserved on Debian and derived systems already). I see no reason why we can't make this a part of the contract, but we should at least get it in a PEP first, no? -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From chris.barker at noaa.gov Sun May 17 23:31:45 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Sun, 17 May 2015 14:31:45 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Sun, May 17, 2015 at 12:05 AM, Nick Coghlan wrote: > > % pip install --upgrade pip > > % pip install some_conda_package > > This gets the respective role of the two tools reversed - it's like my > asking for "pip install some_fedora_rpm" to be made to work. > I agree here -- I was thinking there was some promise in a conda_package_to_wheel converter though. It would, of course, only work in a subset of conda packages, but would be nice. The trick is that conda packages for the hard-to-build python packages (the ones we care about) often (always?) depend on conda packages for dynamic libs, and pip+wheel have no support for that. And this is a trick, because while I have some ideas for supporting just-for-python dynamic libs, conda's are not just-for-python -- so that might be hard to mash together. Continuum has a bunch of smart people, though. However, having conda use "pip install" in its build scripts so that > it reliably generates pip compatible installation metadata would be a > possibility worth discussing - that's what we've started doing in > Fedora, so that runtime utilities like pkg_resources can work > correctly. > Hmm -- that's something to look into -- you can put essentially anything into a conda build script -- so this would be a matter of convention, rather than tooling. (of course the conventions used by Continuum for the "official" conda packages is the standard). But I'm confused as to the roles of pip vs setuptools, vs wheel, vs ??? I see pip as handling the dependency resolution, and finding and downloading of packages part of the problem -- conda does those already. So what would using pip inside a conda build script buy you that using setuptools does not? And would this be the right incantation to put in a build script: pip install --no-deps ./ (if you are in the package's main dir -- next to setup.py) -Chris -- Christopher Barker, Ph.D.
Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Mon May 18 00:50:30 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Sun, 17 May 2015 15:50:30 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: Trying to keep this brief, because the odds of my finding time to do much with this are slim.. > I'm not proposing that we drop it -- just that we push pip and wheel a bit farther to broaden the supported user-base. > I can't stop you working on something I consider a deep rabbithole, > no -- but I do appreciate your assessment of how deep that hole is -- you certainly have a whole lot more background with all this than I do -- I could well be being very naive here. > but why not just recommend the use of conda, and only pubish sdists on > PyPI? conda needs more users and contributors seeking better integration > with the PyPA tooling, and minimising the non-productive competition. > I essentially wear two hats here: 1) I produce software built on top of the scientific python stack, and I want my users to have an easy experience with installing and running my code. For that -- I am going the conda route. I'm not there yet, but close to being able to say: a) install Anaconda b) add my binstar channel to your conda environment c) conda install my_package The complication here is that we also have a web front end for our computational code, and it makes heavy use of all sorts of web-oriented packages that are not supported by Anaconda or, for the most part, the conda community (binstar). My solution is to make conda packages myself of those and put them in my binstar channel. The other option is to pip install those packages, but then you get pretty tangled up in dependencies and conda environments vs. virtual environments, etc... 2) Hat two is an instructor for the University of Washington Continuing Education Program's Python Certification. In that program, we do very little with the scipy stack, but have an entire course on web development. And the instructor of that class, quite rightly, pushes the standard of practice for web developers: heavy use of virtualenv and pip. Oh, and hat (3) is a long time pythonista, who, among other things, has been working for years to make python easier to use on the Mac for folks that don't know or care what the unix command line is.... I guess the key thing here for me is that I don't see pushing conda to budding web developers -- but what if web developers have the need for a bit of the scipy stack? or??? We really don't have a good solution for those folks. > The web development folks targeting Linux will generally be in a position > to build from source (caching the resulting wheel file, or perhaps an > entire container image). > again, I'm not concerned about linux -- it's an ABI nightmare, so we really don't want to go there, and its users are generally more "sophisticated" -- a little building is not a big deal.
> It's also worth noting that one of my key intended use cases for metadata > extensions is to publish platform specific external dependencies in the > upstream project metadata, which would get us one step closer to fully > automated repackaging into policy compliant redistributor packages. > Honestly, I don't follow this! -- but I'll keep an eye out for it - sounds useful. > The existence of tight ABI coupling between components both gives the > scientific Python stack a lot of its power, *and* makes it almost as hard > to distribute in binary form as native GUI applications. I think harder, actually :-) > * No one else seems to think it's worth trying to extend the PyPa ecosystem a bit more to better support dynamic libs. (except _maybe_ Enthought?) > > I know Donald is keen to see this, and a lot of ideas become more feasible > if(/when?) PyPI gets an integrated wheel build farm. At that point, we can > use the "centre of gravity" approach by letting the build farm implicitly > determine the "standard" version of particular external dependencies, even > if we can't communicate those versions effectively in the metadata. > That's more what I'm thinking, yes. > * I still think it can be done with minimal changes, and hacked in to do the proof of concept I'm still not clear on what "it" is. I've been pointing out how hard it is to do this right in the general case, but I get the impression you're actually more interested in the narrower case of defining a "SciPy ABI" that encompasses selected third party binary dependencies. I wouldn't say SciPyABI -- that, in a way, is already being done -- folks are coordinating the "official" binaries of at least the core "scipy stack" -- it's a pain -- no Windows wheels for numpy, for instance (though I think they are close) My interest is actually taking it beyond that -- honestly in my case there are only a handful of libs that I'm aware of that get common use, for instance libfreetype and libpng in wxPython, PIL, matplotlib, etc. If I were only SciPy focused -- conda would be the way to go. That's part of the problem I see -- there are split communities, but they DO overlap, and I think it's a disservice to punt these issues off to individual sub-communities to address on their own. > * But I'm not sure it's something that's going to get to the top of my ToDo list anyway -- I can get my needs met with conda anyway. My real production work is deep in the SciPy stack. > * So I may or may not move my ideas forward -- if I do, I'll be back with questions and maybe a more concrete proposal some day.... > If I'm correct that your underlying notion is "It would be nice if there > was an agreed SciPy ABI we could all build wheels against, and a way of > publishing the external dependency manifest in the wheel metadata so it > could be checked for consistency at installation time", that's a far more > tractable problem than the "arbitrary binary dependencies" one. > What I'm talking about is in-between -- not just SciPy, but not "arbitrary binary dependencies" either. But I think the trick is that the dependency really is binary: i.e.: wheelA depends on this particular wheel -- not this version of a lib, but this BUILD of a lib.
and what I envision is that "this build of a lib" would be, for instance ,libnnnx.ym "built to be compatible with the python.org 64 bit Windows build of python 3.4" Which is what we need to do now anyway -- if you are going to deliver a build of PIL (pillow), for instance, you need to deliver the libs it needs, built to match the python it's built for, whether you statically link or dump the dll in with the extension. All I'm suggesting is that we have a way of letting others use that same lib build -- this would be more a social thing than anything else. while I expect attempting to solve the former is likely to be more of a political battle than a technical one :) yes -- I think this is a social / political issue as much as anything else. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Mon May 18 02:05:27 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 18 May 2015 10:05:27 +1000 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 18 May 2015 07:32, "Chris Barker" wrote: > > On Sun, May 17, 2015 at 12:05 AM, Nick Coghlan wrote: >> >> > % pip install --upgrade pip >> > % pip install some_conda_package >> >> This gets the respective role of the two tools reversed - it's like my >> asking for "pip install some_fedora_rpm" to be made to work. > > > I agree here -- I was thinking there was some promise in a conda_package_to_wheel converter though. It would, of course, only work in a subset of conda packages, but would be nice. > > The trick is that conda packages for the hard-to-build python packages (the ones we care about) often (always?) depend on conda packages for dynamic libs, and pip+wheel have no support for that. > > And this is a trick, because while I have some ideas for supporting just-for-python dynamic libs, conda's are not just-for-python -- so that might be hard to mash together. > > Continuum has a bunch of smart people, though. > >> However, having conda use "pip install" in its build scripts so that >> it reliably generates pip compatible installation metadata would be a >> possibility worth discussing - that's what we've started doing in >> Fedora, so that runtime utilities like pkg_resources can work >> correctly. > > > Hmm -- that's something ot look into -- you can put essentially anything into a conda bulid script -- so this would be a matter of convention, rather than tooling. (of course the conventions used by Continuum for the "offical" conda packages is the standard). > > But I'm confused as to the roles of pip vs setuptools, vs wheel, vs ??? > > I see pip has handling the dependency resolution, and finding and downloading of packages part of the problem -- conda does those already. > > So what would using pip inside a conda build script buy you that using setuptools does not? Indirection via pip injects the usage of setuptools even for plain distutils projects, and generates https://www.python.org/dev/peps/pep-0376/ compliant metadata by default. However, looking at the current packaging policy, I think I misremembered the situation - it looks like we *discussed* recommending indirection via pip & attaining PEP 376 compliance, but haven't actually moved forward with the idea yet. 
That makes sense, since pursuing it would have been gated on ensurepip, and the Python 3 migration has been higher priority recently. Cheers, Nick. -------------- next part -------------- An HTML attachment was scrubbed... URL: From robertc at robertcollins.net Mon May 18 02:12:36 2015 From: robertc at robertcollins.net (Robert Collins) Date: Mon, 18 May 2015 12:12:36 +1200 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 17 May 2015 5:05 pm, "Nick Coghlan" wrote: > > > On 18 May 2015 07:32, "Chris Barker" wrote: > > > > On Sun, May 17, 2015 at 12:05 AM, Nick Coghlan wrote: > >> > >> > % pip install --upgrade pip > >> > % pip install some_conda_package > >> > >> This gets the respective role of the two tools reversed - it's like my > >> asking for "pip install some_fedora_rpm" to be made to work. > > > > > > I agree here -- I was thinking there was some promise in a conda_package_to_wheel converter though. It would, of course, only work in a subset of conda packages, but would be nice. > > > > The trick is that conda packages for the hard-to-build python packages (the ones we care about) often (always?) depend on conda packages for dynamic libs, and pip+wheel have no support for that. > > > > And this is a trick, because while I have some ideas for supporting just-for-python dynamic libs, conda's are not just-for-python -- so that might be hard to mash together. > > > > Continuum has a bunch of smart people, though. > > > >> However, having conda use "pip install" in its build scripts so that > >> it reliably generates pip compatible installation metadata would be a > >> possibility worth discussing - that's what we've started doing in > >> Fedora, so that runtime utilities like pkg_resources can work > >> correctly. > > > > > > Hmm -- that's something ot look into -- you can put essentially anything into a conda bulid script -- so this would be a matter of convention, rather than tooling. (of course the conventions used by Continuum for the "offical" conda packages is the standard). > > > > But I'm confused as to the roles of pip vs setuptools, vs wheel, vs ??? > > > > I see pip has handling the dependency resolution, and finding and downloading of packages part of the problem -- conda does those already. > > > > So what would using pip inside a conda build script buy you that using setuptools does not? > > Indirection via pip injects the usage of setuptools even for plain distutils projects, and generates https://www.python.org/dev/peps/pep-0376/ compliant metadata by default. > > However, looking at the current packaging policy, I think I misremembered the situation - it looks like we *discussed* recommending indirection via pip & attaining PEP 376 compliance, but haven't actually moved forward with the idea yet. That makes sense, since pursuing it would have been gated on ensurepip, and the Python 3 migration has been higher priority recently. That glue is actually very shallow...I think we should rip it out of pip and perhaps put it in setuptools. It's about building, not installing. Rob -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From donald at stufft.io Mon May 18 02:16:45 2015 From: donald at stufft.io (Donald Stufft) Date: Sun, 17 May 2015 20:16:45 -0400 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: <4729D716-C490-487B-8375-09D22BD3915B@stufft.io> > On May 17, 2015, at 8:12 PM, Robert Collins wrote: > > > On 17 May 2015 5:05 pm, "Nick Coghlan" > wrote: > > > > > > On 18 May 2015 07:32, "Chris Barker" > wrote: > > > > > > On Sun, May 17, 2015 at 12:05 AM, Nick Coghlan > wrote: > > >> > > >> > % pip install --upgrade pip > > >> > % pip install some_conda_package > > >> > > >> This gets the respective role of the two tools reversed - it's like my > > >> asking for "pip install some_fedora_rpm" to be made to work. > > > > > > > > > I agree here -- I was thinking there was some promise in a conda_package_to_wheel converter though. It would, of course, only work in a subset of conda packages, but would be nice. > > > > > > The trick is that conda packages for the hard-to-build python packages (the ones we care about) often (always?) depend on conda packages for dynamic libs, and pip+wheel have no support for that. > > > > > > And this is a trick, because while I have some ideas for supporting just-for-python dynamic libs, conda's are not just-for-python -- so that might be hard to mash together. > > > > > > Continuum has a bunch of smart people, though. > > > > > >> However, having conda use "pip install" in its build scripts so that > > >> it reliably generates pip compatible installation metadata would be a > > >> possibility worth discussing - that's what we've started doing in > > >> Fedora, so that runtime utilities like pkg_resources can work > > >> correctly. > > > > > > > > > Hmm -- that's something ot look into -- you can put essentially anything into a conda bulid script -- so this would be a matter of convention, rather than tooling. (of course the conventions used by Continuum for the "offical" conda packages is the standard). > > > > > > But I'm confused as to the roles of pip vs setuptools, vs wheel, vs ??? > > > > > > I see pip has handling the dependency resolution, and finding and downloading of packages part of the problem -- conda does those already. > > > > > > So what would using pip inside a conda build script buy you that using setuptools does not? > > > > Indirection via pip injects the usage of setuptools even for plain distutils projects, and generates https://www.python.org/dev/peps/pep-0376/ compliant metadata by default. > > > > However, looking at the current packaging policy, I think I misremembered the situation - it looks like we *discussed* recommending indirection via pip & attaining PEP 376 compliance, but haven't actually moved forward with the idea yet. That makes sense, since pursuing it would have been gated on ensurepip, and the Python 3 migration has been higher priority recently. > > That glue is actually very shallow...I think we should rip it out of pip and perhaps put it in setuptools. It's about building, not installing. > So a benefit of using pip instead of setuptools is that as we move to a pluggable build system pip can act as a unified fronted to multiple build systems, instead of every system having to implement each pluggable build system themselves. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From chris.barker at noaa.gov Mon May 18 06:03:21 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Sun, 17 May 2015 21:03:21 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Sun, May 17, 2015 at 5:12 PM, Robert Collins wrote: > > > But I'm confused as to the roles of pip vs setuptools, vs wheel, vs ??? > > > > > > I see pip as handling the dependency resolution, and finding and > downloading of packages part of the problem -- conda does those already. > > > > > > So what would using pip inside a conda build script buy you that using > setuptools does not? > > > > Indirection via pip injects the usage of setuptools even for plain > distutils projects, and generates > https://www.python.org/dev/peps/pep-0376/ compliant metadata by default. > > > > However, looking at the current packaging policy, I think I > misremembered the situation - it looks like we *discussed* recommending > indirection via pip & attaining PEP 376 compliance, but haven't actually > moved forward with the idea yet. That makes sense, since pursuing it would > have been gated on ensurepip, and the Python 3 migration has been higher > priority recently. > > That glue is actually very shallow...I think we should rip it out of pip > and perhaps put it in setuptools. It's about building, not installing. > +1 -- and rip out setuptools installing of dependencies, while we're at it :-) (OK, I know we can't do that...) But is the upshot that using pip to install won't buy anything over setuptools right now? (except for packages that aren't already using setuptools...) -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From cournape at gmail.com Mon May 18 06:32:54 2015 From: cournape at gmail.com (David Cournapeau) Date: Mon, 18 May 2015 13:32:54 +0900 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Mon, May 18, 2015 at 9:05 AM, Nick Coghlan wrote: > > On 18 May 2015 07:32, "Chris Barker" wrote: > > > > On Sun, May 17, 2015 at 12:05 AM, Nick Coghlan > wrote: > >> > >> > % pip install --upgrade pip > >> > % pip install some_conda_package > >> > >> This gets the respective role of the two tools reversed - it's like my > >> asking for "pip install some_fedora_rpm" to be made to work. > > > > > > I agree here -- I was thinking there was some promise in a > conda_package_to_wheel converter though. It would, of course, only work in > a subset of conda packages, but would be nice. > > > > The trick is that conda packages for the hard-to-build python packages > (the ones we care about) often (always?) depend on conda packages for > dynamic libs, and pip+wheel have no support for that. > > > > And this is a trick, because while I have some ideas for supporting > just-for-python dynamic libs, conda's are not just-for-python -- so that > might be hard to mash together. > > > > Continuum has a bunch of smart people, though.
> > > >> However, having conda use "pip install" in its build scripts so that > >> it reliably generates pip compatible installation metadata would be a > >> possibility worth discussing - that's what we've started doing in > >> Fedora, so that runtime utilities like pkg_resources can work > >> correctly. > > > > > > Hmm -- that's something ot look into -- you can put essentially anything > into a conda bulid script -- so this would be a matter of convention, > rather than tooling. (of course the conventions used by Continuum for the > "offical" conda packages is the standard). > > > > But I'm confused as to the roles of pip vs setuptools, vs wheel, vs ??? > > > > I see pip has handling the dependency resolution, and finding and > downloading of packages part of the problem -- conda does those already. > > > > So what would using pip inside a conda build script buy you that using > setuptools does not? > > Indirection via pip injects the usage of setuptools even for plain > distutils projects, and generates > https://www.python.org/dev/peps/pep-0376/ compliant metadata by default. > Note that some packages will push hard against injecting setuptools, at least until it does not offer a way to prevent from installing as an egg directory. Most of the core scientific packages avoid setuptools because of this. David However, looking at the current packaging policy, I think I misremembered > the situation - it looks like we *discussed* recommending indirection via > pip & attaining PEP 376 compliance, but haven't actually moved forward with > the idea yet. That makes sense, since pursuing it would have been gated on > ensurepip, and the Python 3 migration has been higher priority recently. > > Cheers, > Nick. > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Mon May 18 12:17:34 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Mon, 18 May 2015 11:17:34 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 17 May 2015 at 23:50, Chris Barker wrote: > I guess the key thing here for me is that I don't see pushing conda to > budding web developers -- but what if web developers have the need for a bit > of the scipy stack? or??? > > We really don't have a good solution for those folks. Agreed. My personal use case is as a general programmer (mostly sysadmin and automation type of work) with some strong interest in business data analysis and a side interest in stats. For that sort of scenario, some of the scipy stack (specifically matplotlib and pandas and their dependencies) is really useful. But conda is *not* what I'd use for day to day work, so being able to install via pip is important to me. It should be noted that installing via pip *is* possible - via some of the relevant projects having published wheels, and the rest being available via Christoph Gohlke's site either as wheels or as wininsts that I can convert. But that's not a seamless process, so it's not something I'd be too happy explaining to a colleague should I want to share the workload for that type of thing. 
Paul From p.f.moore at gmail.com Mon May 18 12:30:28 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Mon, 18 May 2015 11:30:28 +0100 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 18 May 2015 at 05:32, David Cournapeau wrote: > Note that some packages will push hard against injecting setuptools, at > least until it does not offer a way to prevent from installing as an egg > directory. Most of the core scientific packages avoid setuptools because of > this. One way forward in terms of building wheels is to use any build process you like to do an isolated build (I think it's --root that distutils uses for this sort of thing) and then use distlib to build a wheel from the resulting directory structure (or do it by hand, it's not much more than a bit of directory rearrangement and zipping things up). That process can be automated any way you like - although ideally via something general, so projects don't have to reinvent the wheel every time. If processes like conda then used wheels as their input for building packages, the wheels could *also* be published (either on PyPI or on a 3rd party site such as binstar) to provide support for people not using conda for their package maintenance. There may be technical issues with this process - not least, does the way conda uses shared libraries make going via wheels impossible (or at least make the wheels unusable without conda's support for installing non-Python shared libraries)? But it does remove a lot of the duplication of effort and competition that currently seems to be involved in the conda vs wheel situation. In this scenario, there are two binary formats - wheel and conda (the format). There are a number of packaging tools, notably pip and conda (the tool). Other tools and/or formats can be built on the wheel binary format to address specific communities' needs, much as conda does for the Scientific community. Here, wheel and pip are the bottom layer, the Python specific binary format and package manager. They are appropriate for people with generalised needs, and for communities with no need for a more advanced/specialised tool. Paul From dmertz at continuum.io Mon May 18 19:50:05 2015 From: dmertz at continuum.io (David Mertz) Date: Mon, 18 May 2015 10:50:05 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: This pertains more to the other thread I started, but I'm sort of becoming convinced--especially by Paul Moore's suggestion there--that the better approach is to grow conda (the tool) rather than shoehorn conda packages into pip. Getting pip to recognize the archive format of conda would be easy enough alone, but that really doesn't cover the fact that 'conda ~= pip+virtualenv', and pip alone simply should not try to grow that latter aspect itself. Plus pip is not going to be fully language agnostic, for various reasons, but including the fact that apt-get and yum and homebrew and ports already exist. So it might make sense to actually allow folks to push conda to budding web developers, if conda allowed installation (and environment management) of sdist packages on PyPI. 
So perhaps it would be good if *this* worked: % pip install conda % conda install scientific_stuff % conda install --sdist django_widget # we know to look on PyPI Maybe that flag is mis-named, or could be omitted altogether. But there's no conceptual reason that conda couldn't build an sdist fetched from PyPI into a platform specific binary matching the current user machine (and do all the metadata dependency and environment stuff the conda tool does). On Mon, May 18, 2015 at 3:17 AM, Paul Moore wrote: > On 17 May 2015 at 23:50, Chris Barker wrote: > > I guess the key thing here for me is that I don't see pushing conda to > > budding web developers -- but what if web developers have the need for a > bit > > of the scipy stack? or??? > > > > We really don't have a good solution for those folks. > > Agreed. My personal use case is as a general programmer (mostly > sysadmin and automation type of work) with some strong interest in > business data analysis and a side interest in stats. > > For that sort of scenario, some of the scipy stack (specifically > matplotlib and pandas and their dependencies) is really useful. But > conda is *not* what I'd use for day to day work, so being able to > install via pip is important to me. It should be noted that installing > via pip *is* possible - via some of the relevant projects having > published wheels, and the rest being available via Christoph Gohlke's > site either as wheels or as wininsts that I can convert. But that's > not a seamless process, so it's not something I'd be too happy > explaining to a colleague should I want to share the workload for that > type of thing. > > Paul > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -- The dead increasingly dominate and strangle both the living and the not-yet born. Vampiric capital and undead corporate persons abuse the lives and control the thoughts of homo faber. Ideas, once born, become abortifacients against new conceptions. -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Mon May 18 20:21:31 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Mon, 18 May 2015 19:21:31 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 18 May 2015 at 18:50, David Mertz wrote: > % pip install conda > % conda install scientific_stuff > % conda install --sdist django_widget # we know to look on PyPI But that doesn't give (Windows, mainly) users a solution for things that need a C compiler, but aren't provided as conda packages. My honest view is that unless conda is intending to replace pip and wheel totally, you cannot assume that people will be happy to use conda alongside pip (or indeed, use any pair of independent packaging tools together - people typically want one unified solution). And if the scientific community stops working towards providing wheels for people without compilers "because you can use conda", there is going to be a proportion of the Python community that will lose out on some great tools as a result. 
Paul From dmertz at continuum.io Mon May 18 23:58:35 2015 From: dmertz at continuum.io (David Mertz) Date: Mon, 18 May 2015 14:58:35 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: I don't really see any reason conda couldn't support bdist wheels also. But yes, basically the idea is that we'd like users to be able to rely entirely on conda as their packaging (and environment configuration) system if they choose to. It may be impolitic to say so, but I think conda can and should replace pip for a large class of users. That is, it should be possible for users to use pip exactly once (as in the line I show above), and use conda forever thereafter. Since conda does a lot more (programming language independence, environments), perhaps it really does make a lot more sense for conda to be "one package manager to rule them all" much more than trying to make a pip that does so. But y'know, the truth is I'm trying to figure out the best path here. I want to get better interoperability between conda packages and the rest of the Python ecosystem, but there are stakeholders involved both in the distutils community and within Continuum (where I now work). On Mon, May 18, 2015 at 11:21 AM, Paul Moore wrote: > On 18 May 2015 at 18:50, David Mertz wrote: > > % pip install conda > > % conda install scientific_stuff > > % conda install --sdist django_widget # we know to look on PyPI > > But that doesn't give (Windows, mainly) users a solution for things > that need a C compiler, but aren't provided as conda packages. > > My honest view is that unless conda is intending to replace pip and > wheel totally, you cannot assume that people will be happy to use > conda alongside pip (or indeed, use any pair of independent packaging > tools together - people typically want one unified solution). And if > the scientific community stops working towards providing wheels for > people without compilers "because you can use conda", there is going > to be a proportion of the Python community that will lose out on some > great tools as a result. > > Paul > -- The dead increasingly dominate and strangle both the living and the not-yet born. Vampiric capital and undead corporate persons abuse the lives and control the thoughts of homo faber. Ideas, once born, become abortifacients against new conceptions. -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue May 19 01:25:06 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 18 May 2015 16:25:06 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: A member of the conda dev team could answer this better than I, but I've used enough to _think_ I understand the basics: On Mon, May 18, 2015 at 3:30 AM, Paul Moore wrote: > One way forward in terms of building wheels is to use any build > process you like to do an isolated build (I think it's --root that > distutils uses for this sort of thing) and then use distlib to build a > wheel from the resulting directory structure (or do it by hand, it's > not much more than a bit of directory rearrangement and zipping things > up). > > That process can be automated any way you like - although ideally via > something general, so projects don't have to reinvent the wheel every > time. 
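The directory-to-wheel step quoted above really can be done by hand with nothing but the standard library, which is one way to see how thin the format is. A rough sketch, with placeholder project name, version and paths, and with a pure-Python tag hard-coded for simplicity (a real tool would also need to handle scripts, data files and platform/ABI tags):

    # Hand-rolled "directory tree -> wheel": a wheel is essentially a zip of
    # site-packages content plus a .dist-info directory (METADATA, WHEEL, RECORD).
    import base64
    import hashlib
    import os
    import zipfile

    name, version = "example_pkg", "1.0"        # placeholders
    tag = "py2-none-any"                        # pure-Python tag, for simplicity
    src = "/tmp/build-root/site-packages"       # tree from an isolated install
    dist_info = "%s-%s.dist-info" % (name, version)
    wheel_file = "%s-%s-%s.whl" % (name, version, tag)

    def record_row(arcname, data):
        # RECORD rows are "path,sha256=<urlsafe b64 digest>,size"
        digest = base64.urlsafe_b64encode(hashlib.sha256(data).digest())
        return "%s,sha256=%s,%d" % (arcname,
                                    digest.rstrip(b"=").decode("ascii"),
                                    len(data))

    rows = []
    with zipfile.ZipFile(wheel_file, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, dirs, files in os.walk(src):
            for fn in files:
                path = os.path.join(root, fn)
                arcname = os.path.relpath(path, src).replace(os.sep, "/")
                with open(path, "rb") as f:
                    data = f.read()
                zf.writestr(arcname, data)
                rows.append(record_row(arcname, data))
        metadata = "Metadata-Version: 2.0\nName: %s\nVersion: %s\n" % (name, version)
        wheel_meta = ("Wheel-Version: 1.0\nGenerator: by-hand\n"
                      "Root-Is-Purelib: true\nTag: %s\n" % tag)
        for arcname, text in [(dist_info + "/METADATA", metadata),
                              (dist_info + "/WHEEL", wheel_meta)]:
            zf.writestr(arcname, text)
            rows.append(record_row(arcname, text.encode("ascii")))
        rows.append(dist_info + "/RECORD,,")    # RECORD lists itself, no hash
        zf.writestr(dist_info + "/RECORD", "\n".join(rows) + "\n")

distlib's wheel support (as Paul suggests) wraps up much the same steps if rolling it by hand feels too fragile.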
> sure -- you can put virtually anything in a conda build script. what conda build does is more or less: * setup an isolated environment with some handy environment variables for things like the python interpreter, etc. * run your build script * package up whatever got built. If processes like conda then used wheels as their input for building > packages, the wheels could *also* be published I'm not sure it's any easier to build a wheel, then make a conda package out of it, than to build a conda package, and then make a wheel out of it. Or have your build scrit build a wheel, and then independently build a conda package. In any case, the resulting wheel would depend on an environment like the one set up by conda build -- and that is an environment with all the dependencies installed -- which is where this gets ugly. [remember, making the wheel itself it the easy part] > not least, does the > way conda uses shared libraries make going via wheels impossible (or > at least make the wheels unusable without conda's support for > installing non-Python shared libraries)? Pretty much, yes. conda provides a way to package up and manage arbitrary stuff -- in this case, that would be non-python dependencies -- i.e. shared libs. So you can say that my_python_package depends on this_c_lib, and as long as you, or someone else has made a conda package for this_c_lib, then all is well. But python, setuptools, pip, wheel, etc. don't have a way to handle that shared lib as a dependency -- no standard way where to put it, no way to package it as a wheel, etc. So the way to deal with this with wheels is to statically link everything. But that's not how conda pa cakges are built, so no way to leverage conda here. We need to remember what leveraging conda would buy us: conda doesn't actually make it any easier to build anything -- you need a platform-specific build script to build a conda package. conda does provide a way to manage non-python dependencies -- but that doesn't buy you anything unless you are using conda to manage your system anyway. conda DOES provide a community of people figuring out how to build complex packages, and building them, and putting them up for public dissemination. So the thing that leveraging conda can do is reduce the need for a lot of duplicated effort. And that effort is almost entirely about those third part libs -- after all, a compiled extension that has no dependencies is easy to build and put on PyPi. (OK, there is still a bit of duplicated effort in making the builds themselves on multiple platforms -- but with CI systems, that's not huge) An example: I have a complex package that not depends on all sorts of hard-to-build python packages, but also has its own C++ code that depends on the netcdf4 library. Which in turn, depends on the hdf5 lib, which depends on libcurl, and zlib, and (I think one or two others). Making binary wheels of this requires me to figure out how to build all those deps on at least two platforms (Windows being the nightmare, but OS-X is not trivial, too, if I want it to match the python.org build, and support older OS versions than I am running) Then I could have a nice binary wheel that my users can pip install and away they go. But: 1) They also need the Py_netCDF4 package, which may or may not be easy to find. If not -- they need to go through all that build hell. Then they have a package that is using a bunch of the same shared libs as mine -- and hopefully no version conflicts... 
2) my package is under development -- what I really want is for it to be easy for my users to build from source, so they can keep it all up to date from the repo. Now they need to get a development version of all those libs up and running on their machines -- a heavy lift for a lot of people. So now - "use Anaconda" is a pretty good solution -- it provides all the libs I need, and someone else has figured out how to build them on the platforms I care about. But it would be nice if we could find a way for the "standard" python toolchain could support this. NOTE: as someone suggested on this list was to provide (outside of PyPi + pip), a set of static libs all built and configured, so that I could say: "install these libs from some_other_place, then you can build and run my code" -- that may be doable with a community effort and no change in the tooling, but it has not so far. In short: what conda and the conda community provide is python-compatible third party libs. We can't take advantage of that with pip+PyPi unless we find a way to support third party libs with those tools. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue May 19 01:43:54 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 18 May 2015 16:43:54 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Mon, May 18, 2015 at 11:21 AM, Paul Moore wrote: > On 18 May 2015 at 18:50, David Mertz wrote: > > % pip install conda > > % conda install scientific_stuff > > % conda install --sdist django_widget # we know to look on PyPI > > But that doesn't give (Windows, mainly) users a solution for things > that need a C compiler, but aren't provided as conda packages. > Conda provides (or can) a C compiler (some versions of gcc). It was buggy last time I checked, but it's doable. > My honest view is that unless conda is intending to replace pip and > wheel totally, you cannot assume that people will be happy to use > conda alongside pip (or indeed, use any pair of independent packaging > tools together - people typically want one unified solution). And if > the scientific community stops working towards providing wheels for > people without compilers "because you can use conda", there is going > to be a proportion of the Python community that will lose out on some > great tools as a result. Exactly -- this idea that there are two (or more) non-overlapping communities is pretty destructive. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From chris.barker at noaa.gov Tue May 19 01:41:32 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 18 May 2015 16:41:32 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Mon, May 18, 2015 at 10:50 AM, David Mertz wrote: > This pertains more to the other thread I started, but I'm sort of becoming > convinced--especially by Paul Moore's suggestion there--that the better > approach is to grow conda (the tool) rather than shoehorn conda packages > into pip. > I agree -- in some sense conda is pip+more, you couldn't do it without growing pip (see the other thread...) > So it might make sense to actually allow folks to push conda to budding > web developers, if conda allowed installation (and environment management) > of sdist packages on PyPI. So perhaps it would be good if *this* worked: > > % pip install conda > % conda install scientific_stuff > % conda install --sdist django_widget # we know to look on PyPI > so a point / question here: you can, right now, run pip from inside a conda environment python, and for the most part, it works -- certainly for sdists. I'm actually doing that a lot, and so are others. But it gets messy when you have two systems trying to handle dependencies -- pip may not realize that conda has already installed something, and vice versa. So it's really nicer to have one package manager. But maybe all you really need to do is teach conda to understand pip meta-data, and/or make sure that conda write pip-compatible meta data. Then a user could do: conda install some_package and conda would look it all its normal places for some_package, and if it din't find it, it would try running "pip install" under the hood. The user wouldn't know, or have to know, where the package came from (though conda might want to add that to the meta-data for use come upgrade time, etc.) In short --make it easy for conda users to use pip / pypi packages. Note: there has been various threads about this on the Anaconda list lately. The current "plan" is to have a community binstar channel that mirrors as much of pypi as possible. Until we have an automated way to grab pypi packages for conda -- this isn't bad stop gap. Also note that conda can often (but not always) build a conda package from pypi automagically -- someone could potentially run a service that does that. On Mon, May 18, 2015 at 3:17 AM, Paul Moore wrote: > >> Agreed. My personal use case is as a general programmer (mostly >> sysadmin and automation type of work) with some strong interest in >> business data analysis and a side interest in stats. >> >> For that sort of scenario, some of the scipy stack (specifically >> matplotlib and pandas and their dependencies) is really useful. But >> conda is *not* what I'd use for day to day work, so being able to >> install via pip is important to me. >> > What if "conda install" did work for virtually all pypi packages? (one way or the other) -- would you use and recommend Anaconda (or miniconda) then? > It should be noted that installing >> via pip *is* possible - via some of the relevant projects having >> published wheels, and the rest being available via Christoph Gohlke's >> site either as wheels or as wininsts that I can convert. 
But that's >> not a seamless process, so it's not something I'd be too happy >> explaining to a colleague should I want to share the workload for that >> type of thing. >> > right -- that could be made better right now -- or soon. Gohlke's packages can't be simply put up on PyPi for licensing reasons (he's using the Intel math libs). But some folks are working really hard on getting a numpy wheel that will work virtually everywhere, and still give good performance for numerics. From there, the core SciPy stack should follow (it's already on PyPi for OS-X). Which is a GREAT move in the right direction, but doesn't get us quite to where PyPi can support the more complex packages. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From gksalil at gmail.com Tue May 19 04:24:02 2015 From: gksalil at gmail.com (salil GK) Date: Tue, 19 May 2015 07:54:02 +0530 Subject: [Distutils] Help required for setup.py Message-ID: Hello I was trying to create my package for distribution. I have a requirement that I need to check if one particular command is available in the system ( this command is not installed through a package - but a bundle is installed to get the command in the system ). I am using Ubuntu 14.04 Thanks in advance Salil -------------- next part -------------- An HTML attachment was scrubbed... URL: From madewokherd at gmail.com Tue May 19 05:24:07 2015 From: madewokherd at gmail.com (Vincent Povirk) Date: Mon, 18 May 2015 22:24:07 -0500 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: > But it gets messy when you have two systems trying to handle dependencies -- > pip may not realize that conda has already installed something, and vice > versa. So it's really nicer to have one package manager. > > But maybe all you really need to do is teach conda to understand pip > meta-data, and/or make sure that conda write pip-compatible meta data. Forgive me, I'm trying to follow as someone who is working with PyPI but hasn't really used conda or pip. Does a conda environment contain its own site-packages directory, and does pip correctly install packages to that directory? If so, I expect supporting PEP 376 would help with this. It doesn't help either package manager install dependencies from outside their repos, it just means that pip will work if the user installs dependencies from conda first. To be able to install dependencies, either conda needs to know enough about PyPI to find a package's dependencies itself (and at that point, I wonder how much value pip adds compared to 'wheel'), or pip needs to know that it can delegate to conda when run in this way. From p.f.moore at gmail.com Tue May 19 11:55:17 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 19 May 2015 10:55:17 +0100 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 19 May 2015 at 00:25, Chris Barker wrote: > Pretty much, yes. conda provides a way to package up and manage arbitrary > stuff -- in this case, that would be non-python dependencies -- i.e. shared > libs. 
> > So you can say that my_python_package depends on this_c_lib, and as long as > you, or someone else has made a conda package for this_c_lib, then all is > well. > > But python, setuptools, pip, wheel, etc. don't have a way to handle that > shared lib as a dependency -- no standard way where to put it, no way to > package it as a wheel, etc. > > So the way to deal with this with wheels is to statically link everything. > But that's not how conda pa cakges are built, so no way to leverage conda > here. Thanks for the explanation. So, in effect, conda-as-a-platform defines a (somewhat) incompatible platform for running Python, which can use wheels just as python.org Python can, but which uses conda-as-an-installer as its package manager (much like RPM or apt on Unix). The downside of this is that wheels built for conda (assuming that it's OK to link with shared libs) are not compatible with python.org builds (as those shared libs aren't available) and that difference isn't reflected in the wheel ABI tags (and it's not particularly clearly understood by the community, it seems). So publishing conda-based wheels on PyPI would be a bad idea, because they wouldn't work with python.org python (more precisely, only things that depend on shared libs are affected, but the point remains). > We need to remember what leveraging conda would buy us: > > conda doesn't actually make it any easier to build anything -- you need a > platform-specific build script to build a conda package. > > conda does provide a way to manage non-python dependencies -- but that > doesn't buy you anything unless you are using conda to manage your system > anyway. > > conda DOES provide a community of people figuring out how to build complex > packages, and building them, and putting them up for public dissemination. > > So the thing that leveraging conda can do is reduce the need for a lot of > duplicated effort. And that effort is almost entirely about those third part > libs -- after all, a compiled extension that has no dependencies is easy to > build and put on PyPi. (OK, there is still a bit of duplicated effort in > making the builds themselves on multiple platforms -- but with CI systems, > that's not huge) Agreed - that benefit would be significant (and not just in terms of Python packaging - I'd personally find it useful in a lot of non-Python projects), but it does rely on the information/scripts being accessible in non-conda contexts. For example, for python.org wheels, some way of modifying the build to create static libraries rather than shared ones. I'm not sure to what extent the conda folks would want to deal with handling the issues outside their own environment, though (after all, working within a closed environment is part of what makes the problem tractable for them). The whole area of collecting build processes for common libraries is an area that's always been badly managed on Windows, probably because no-one has ever looked at it in a wider context than their own project, and also because every library has its own "building on Windows" solution (configure, custom MSVC solution files, CMake, etc). And also because C/C++ has no concept anything like PyPI. But while this is hugely interesting to me, it's not clear how well it fits with Python packaging and distutils-sig (except as an unresolved C/C++ issue that we have to contend with). 
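The tag limitation mentioned above is easy to see by pulling a wheel filename apart: the compatibility tags record only the Python version, ABI and OS/architecture, so nothing distinguishes a build that expects conda-provided shared libraries from one that is self-contained. A small, pure-stdlib illustration (the filename is just an example):

    # Split a wheel filename into its PEP 427 components. Note that nothing
    # in the tags records which external shared libraries the build expects.
    def parse_wheel_filename(filename):
        parts = filename[:-len(".whl")].split("-")
        name, version = parts[0], parts[1]
        pyver, abi, platform = parts[-3:]   # the three compatibility tags
        return {"name": name, "version": version,
                "python": pyver, "abi": abi, "platform": platform}

    print(parse_wheel_filename("numpy-1.9.2-cp27-none-win_amd64.whl"))
    # {'name': 'numpy', 'version': '1.9.2', 'python': 'cp27',
    #  'abi': 'none', 'platform': 'win_amd64'}
    # A conda-targeted build of the same module would carry identical tags,
    # even though it needs shared libs a python.org install does not have.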
Paul From oscar.j.benjamin at gmail.com Tue May 19 13:03:29 2015 From: oscar.j.benjamin at gmail.com (Oscar Benjamin) Date: Tue, 19 May 2015 12:03:29 +0100 Subject: [Distutils] Help required for setup.py In-Reply-To: References: Message-ID: On 19 May 2015 at 03:24, salil GK wrote: > > I was trying to create my package for distribution. I have a requirement > that I need to check if one particular command is available in the system ( > this command is not installed through a package - but a bundle is installed > to get the command in the system ). I am using Ubuntu 14.04 Hi Salil, what is it that you actually want help with? -- Oscar From oscar.j.benjamin at gmail.com Tue May 19 13:27:34 2015 From: oscar.j.benjamin at gmail.com (Oscar Benjamin) Date: Tue, 19 May 2015 12:27:34 +0100 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 19 May 2015 at 10:55, Paul Moore wrote: >> >> But python, setuptools, pip, wheel, etc. don't have a way to handle that >> shared lib as a dependency -- no standard way where to put it, no way to >> package it as a wheel, etc. >> >> So the way to deal with this with wheels is to statically link everything. >> But that's not how conda pa cakges are built, so no way to leverage conda >> here. > > Thanks for the explanation. So, in effect, conda-as-a-platform defines > a (somewhat) incompatible platform for running Python, which can use > wheels just as python.org Python can, but which uses > conda-as-an-installer as its package manager (much like RPM or apt on > Unix). > > The downside of this is that wheels built for conda (assuming that > it's OK to link with shared libs) are not compatible with python.org > builds (as those shared libs aren't available) and that difference > isn't reflected in the wheel ABI tags (and it's not particularly > clearly understood by the community, it seems). So publishing > conda-based wheels on PyPI would be a bad idea, because they wouldn't > work with python.org python (more precisely, only things that depend > on shared libs are affected, but the point remains). I've been peripherally following this thread so I may be missing the point but it seems to me that Python already has a mature and flexible way of locating and loading shared libs through the module/import system. Surely the best way to manage non-Python shared libs is by exposing them as extension modules which can be packaged up on PyPI. Then you have dependency resolution for pip, you don't need to worry about the OS-specific shared library loading details and ABI information can be stored as metadata in the module. It would even be possible to load multiple versions or ABIs of the same library as differently named Python modules IIUC. As a case in point numpy packages up a load of C code and wraps a BLAS/Lapack library. Many other extension modules are written which can all take advantage of the non-Python shared libraries that embody numpy via its C API. Is there some reason that this is not considered a good solution? -- Oscar From wes.turner at gmail.com Tue May 19 14:17:13 2015 From: wes.turner at gmail.com (Wes Turner) Date: Tue, 19 May 2015 07:17:13 -0500 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On May 19, 2015 4:55 AM, "Paul Moore" wrote: > > On 19 May 2015 at 00:25, Chris Barker wrote: > > Pretty much, yes. conda provides a way to package up and manage arbitrary > > stuff -- in this case, that would be non-python dependencies -- i.e. 
shared > > libs. > > > > So you can say that my_python_package depends on this_c_lib, and as long as > > you, or someone else has made a conda package for this_c_lib, then all is > > well. > > > > But python, setuptools, pip, wheel, etc. don't have a way to handle that > > shared lib as a dependency -- no standard way where to put it, no way to > > package it as a wheel, etc. > > > > So the way to deal with this with wheels is to statically link everything. > > But that's not how conda pa cakges are built, so no way to leverage conda > > here. > > Thanks for the explanation. So, in effect, conda-as-a-platform defines > a (somewhat) incompatible platform for running Python, which can use > wheels just as python.org Python can, but which uses > conda-as-an-installer as its package manager (much like RPM or apt on > Unix). A repeatable, reproducible environment (GH:conda/conda-env) that does not require a C compiler to be installed for deployment. > > The downside of this is that wheels built for conda (assuming that > it's OK to link with shared libs) are not compatible with python.org > builds (as those shared libs aren't available) and that difference > isn't reflected in the wheel ABI tags (and it's not particularly > clearly understood by the community, it seems). So publishing > conda-based wheels on PyPI would be a bad idea, because they wouldn't > work with python.org python (more precisely, only things that depend > on shared libs are affected, but the point remains). system-site-packages may very well be for a different version of the python interpreter and stdlib. > > > We need to remember what leveraging conda would buy us: > > > > conda doesn't actually make it any easier to build anything -- you need a > > platform-specific build script to build a conda package. meta.yaml (w/ optional preprocessing # [selectors]) http://conda.pydata.org/docs/build.html calls build.sh or build.bat by default, at build time. > > > > conda does provide a way to manage non-python dependencies -- but that > > doesn't buy you anything unless you are using conda to manage your system > > anyway. NodeJS, R, Perl (GH:conda/conda-recipes) > > > > conda DOES provide a community of people figuring out how to build complex > > packages, and building them, and putting them up for public dissemination. > > > > So the thing that leveraging conda can do is reduce the need for a lot of > > duplicated effort. And that effort is almost entirely about those third part > > libs -- after all, a compiled extension that has no dependencies is easy to > > build and put on PyPi. (OK, there is still a bit of duplicated effort in > > making the builds themselves on multiple platforms -- but with CI systems, > > that's not huge) PPAs would be great > > Agreed - that benefit would be significant (and not just in terms of > Python packaging - I'd personally find it useful in a lot of > non-Python projects), but it does rely on the information/scripts > being accessible in non-conda contexts. For example, for python.org > wheels, some way of modifying the build to create static libraries > rather than shared ones. I'm not sure to what extent the conda folks > would want to deal with handling the issues outside their own > environment, though (after all, working within a closed environment is > part of what makes the problem tractable for them). 
> > The whole area of collecting build processes for common libraries is > an area that's always been badly managed on Windows, probably because > no-one has ever looked at it in a wider context than their own > project, and also because every library has its own "building on > Windows" solution (configure, custom MSVC solution files, CMake, etc). > And also because C/C++ has no concept anything like PyPI. But while > this is hugely interesting to me, it's not clear how well it fits with > Python packaging and distutils-sig (except as an unresolved C/C++ > issue that we have to contend with). > > Paul > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Tue May 19 14:21:26 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 19 May 2015 13:21:26 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 19 May 2015 at 00:41, Chris Barker wrote: >> On Mon, May 18, 2015 at 3:17 AM, Paul Moore wrote: >>> >>> Agreed. My personal use case is as a general programmer (mostly >>> sysadmin and automation type of work) with some strong interest in >>> business data analysis and a side interest in stats. >>> >>> For that sort of scenario, some of the scipy stack (specifically >>> matplotlib and pandas and their dependencies) is really useful. But >>> conda is *not* what I'd use for day to day work, so being able to >>> install via pip is important to me. > > > What if "conda install" did work for virtually all pypi packages? (one way > or the other) -- would you use and recommend Anaconda (or miniconda) then? If conda did everything pip did (and that includes consuming wheels from PyPI, not just sdists, and it includes caching of downloads, autobuilding of wheels etc, etc.) then I'd certainly consider how to switch to conda (*not* Anaconda - I'd use a different package manager, but not a different Python distribution) rather than pip. But "considering switching" would include getting PyPI supporting conda packages, getting ensurepip replaced with ensureconda, etc. A total replacement for pip, in other words. As a pip maintainer I'm obviously biased, but if conda is intending to replace pip as the official packaging solution for Python, then it needs to do so completely. If it doesn't do that, then we (PyPA and the Python core developers) need to be able to credibly say that pip is the official solution, and that means that we need to make sure that pip/wheel provides the best user experience possible. That includes persuading parts of the Python community (e.g. Scientific users) not to abandon the standard solution in favour of a custom one. My fear here is a split in the Python community, with some packages only being available via one ecosystem, and some via another. Most people won't mind, but people with cross-discipline interests will end up disadvantaged in such a situation. 
Paul From chris.barker at noaa.gov Tue May 19 17:05:02 2015 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Tue, 19 May 2015 08:05:02 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: <7938257622646142585@unknownmsgid> Maybe I wasn't very clear -- I was addressing what conda might provide in the context of using conda packages with pip/pipy. A conda environment provides a great deal more, yes. system-site-packages may very well be for a different version of the python interpreter and stdlib. Isn't that handled by the wheel meta-data? At least in practice -- Anaconda is delivering python.org compatible pythons. meta.yaml (w/ optional preprocessing # [selectors]) http://conda.pydata.org/docs/build.html calls build.sh or build.bat by default, at build time. Exactly -- you need to write those build scripts yourself. Conda does set up the environment for you (lib paths,etc) so it's often as easy as "./configure && make && make install", but if it's not (particularly on windows!) you've got to figure it out. Which brings in the community aspect -- a lot of stuff has been figured out. PPAs would be great Personal package archives? Can't we do that now with pip+wheel? Indeed, isn't that 1/2 of what binary wheels are for? And the only thing they are for on Linux? -Chris -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue May 19 17:22:19 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 08:22:19 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Tue, May 19, 2015 at 4:27 AM, Oscar Benjamin wrote: > Surely the best way to manage non-Python shared libs is by > exposing them as extension modules which can be packaged up on PyPI. > Then you have dependency resolution for pip, you don't need to worry > about the OS-specific shared library loading details and ABI > information can be stored as metadata in the module. It would even be > possible to load multiple versions or ABIs of the same library as > differently named Python modules IIUC. > yes, that's what I "proposed" earlier in this thread. I put proposed in quotes because it was buried in the discussion, and not the least bit fleshed out, but that was the point. As a case in point numpy packages up a load of C code and wraps a > BLAS/Lapack library. Many other extension modules are written which > can all take advantage of the non-Python shared libraries that embody > numpy via its C API. > Though to be fair -- managing that has been a challenge -- numpy may be built with different versions of BLAS, and there is no way to really know what you might be getting... but I think this is solved with the "curated packages" approach. > Is there some reason that this is not considered a good solution? > Well, I've had some issues with getting the linking worked out right so that modules could use each-other's libraries, at least on Windows -- but that can probably be worked out. The missing piece in terms of the PyPA infrastructure is that that is no way to specify a binary dependency in the meta data: we can specify that this python package depends on some other python package, but not that this particular binary wheel depends on some other binary wheel. 
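Purely to illustrate the gap being described: if wheels could declare binary dependencies, the metadata might look something like the sketch below. The "binary_requires" field, the project name and the ABI strings are invented for the example; only the python-level requirements have an equivalent in today's metadata.

    # HYPOTHETICAL metadata sketch -- "binary_requires" is an invented field,
    # not part of wheel metadata or any PEP. It only shows the kind of
    # information that currently has no standard home.
    some_imaging_wheel = {
        "name": "SomeImagingLib",                  # placeholder project
        "version": "1.0",
        "run_requires": ["some-python-dep"],       # python-level deps (expressible today)
        "binary_requires": [                       # per-build shared-lib deps (not expressible)
            {"lib": "libpng", "version": ">=1.6,<1.7", "abi": "msvc9-x86_64"},
            {"lib": "zlib",   "version": ">=1.2.8",    "abi": "msvc9-x86_64"},
        ],
    }

Without a field like that, an installer has no way to know that this particular binary wheel will only import successfully where those libraries are already present.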
For example, the same python package (say something like PIL) might use the system libs when built on Linux, some of the system libs when built for OS-X, and need all the third party libs when built for Windows. So the OS-X wheel has different dependencies than the Windows wheel, which has different dependencies than, say, a conda package of the same lib. Then there are wheels that might be built to use the homebrew libs on OS-X -- it gets messy! But that can be hacked around, so that we could give it a try and see how it works. The other issue is social: this would really only be a benefit if a wide variety of packages shared the same libs -- but each of those packages is maintained by different individuals and communities. So it's had to know if it would get used. I could put up a libpng wheel, for instance, and who knows if the Pillow folks would have any interest in using it? or the matplotlib folks, or, ... And this would be particularly difficult when the solution was hacked together... -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue May 19 17:36:50 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 08:36:50 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Mon, May 18, 2015 at 8:24 PM, Vincent Povirk wrote: > > But maybe all you really need to do is teach conda to understand pip > > meta-data, and/or make sure that conda write pip-compatible meta data. > > Forgive me, I'm trying to follow as someone who is working with PyPI > but hasn't really used conda or pip. Does a conda environment contain > its own site-packages directory, If python was installed by conda, yes. I get a bit confused here. For one I have only used conda with Anaconda, and Anaconda starts you off with a python environment (for one thing, conda itself is written in Python, so you need that...). So if one were to start from scratch with conda, I'm not entirely sure what you would get. but I _think_ you could run conda with some_random_python, then use it to set up a conda environment with it's own python... But for the current discussion, yes, a conda environment has it's own site-packages, etc, its own complete python install. and does pip correctly install > packages to that directory? yes. it does. > If so, I expect supporting PEP 376 would > help with this. yes, I think that would help. though conda is about more-than-python, so it would still need to manage dependency and meta-data its own way. But I would think it could duplicate the effort python packages. But I can't speak for the conda developers. It doesn't help either package manager install dependencies from > outside their repos, it just means that pip will work if the user > installs dependencies from conda first. and if we can get vice-versa to work, also -- things would be easier. 
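In practice "pip will work if the user installs dependencies from conda first" just means both tools leave metadata that the other can discover on sys.path (.dist-info directories for PEP 376-style installs, .egg-info otherwise). A quick way to check what the current environment exposes, whichever installer put it there:

    # List every distribution visible on sys.path and where its metadata lives.
    # pip's "requirement already satisfied" check is driven by this same data.
    import pkg_resources

    for dist in sorted(pkg_resources.working_set,
                       key=lambda d: d.project_name.lower()):
        print("%-25s %-12s %s" % (dist.project_name, dist.version, dist.location))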
> To be able to install > dependencies, either conda needs to know enough about PyPI to find a > package's dependencies itself (and at that point, I wonder how much > value pip adds compared to 'wheel'), good point -- and conda does know a fair bit about PyPi already -- there is a conda-skeleton pypi command that goes and looks on pypi fo ra package, and builds a conda build script for it automagically -- and it works without modification much of the time, including dependencies. So much of the logic is there. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Tue May 19 17:46:58 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 19 May 2015 16:46:58 +0100 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 19 May 2015 at 16:22, Chris Barker wrote: > The other issue is social: this would really only be a benefit if a wide > variety of packages shared the same libs -- but each of those packages is > maintained by different individuals and communities. So it's had to know if > it would get used. I could put up a libpng wheel, for instance, and who > knows if the Pillow folks would have any interest in using it? or the > matplotlib folks, or, ... And this would be particularly difficult when the > solution was hacked together... Honestly, I still haven't seen a solid explanation of why (at least on Windows) static linking isn't a viable option. If someone were to create and publish a Python compatible static ".lib" file for the various hard-to-build dependencies, extensions could specify that you link with it in setup.py, and all the person building the wheel has to do is download the needed libraries for the build. If there's a technical reason why dynamic linking at runtime is better than static linking (sufficiently better that it justifies all the effort needed to resolve the issues involved), then I've yet to see a good explanation of it. The only things I've seen are disk space, or maintenance (where this usually means "it's easier to release a new DLL with a security fix than get all the various statically linked packages updated - that's a valid point, but given how hard it is to get a working dynamic linking solution in this environment, I have to wonder whether that argument still holds). All of this applies to Windows only, of course - dynamic linking and system management of shared libraries is very much a per-platform issue, and I don't pretend to know the trade-offs on OSX or Linux. Paul From cournape at gmail.com Tue May 19 18:15:46 2015 From: cournape at gmail.com (David Cournapeau) Date: Wed, 20 May 2015 01:15:46 +0900 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Wed, May 20, 2015 at 12:46 AM, Paul Moore wrote: > On 19 May 2015 at 16:22, Chris Barker wrote: > > The other issue is social: this would really only be a benefit if a wide > > variety of packages shared the same libs -- but each of those packages is > > maintained by different individuals and communities. So it's had to know > if > > it would get used. I could put up a libpng wheel, for instance, and who > > knows if the Pillow folks would have any interest in using it? or the > > matplotlib folks, or, ... 
And this would be particularly difficult when > the > > solution was hacked together... > > Honestly, I still haven't seen a solid explanation of why (at least on > Windows) static linking isn't a viable option. Because some libraries simply don't work as static libraries, or are too big (MKL comes to mind). Also, we have been historically using static libs for our eggs at Enthought on windows, and it has been a nightmare to support. It just does not scale when you have 100s of packages. But really, once wheels support arbitrary file locations, this becomes fairly easy at the packaging level: the remaining issue is one of ABI / standards, but that's mostly a non technical issue. Gholke has 100s of packages using wheels, and we ourselves at Enthought have close to 500 packages for windows, all packaged as eggs, maybe 30-40 % of which are not python but libs, C/C++ programs, etc... It is more about agreeing about a common way of doing things than a real technical limitation. David If someone were to > create and publish a Python compatible static ".lib" file for the > various hard-to-build dependencies, extensions could specify that you > link with it in setup.py, and all the person building the wheel has to > do is download the needed libraries for the build. > > If there's a technical reason why dynamic linking at runtime is better > than static linking (sufficiently better that it justifies all the > effort needed to resolve the issues involved), then I've yet to see a > good explanation of it. The only things I've seen are disk space, or > maintenance (where this usually means "it's easier to release a new > DLL with a security fix than get all the various statically linked > packages updated - that's a valid point, but given how hard it is to > get a working dynamic linking solution in this environment, I have to > wonder whether that argument still holds). > > All of this applies to Windows only, of course - dynamic linking and > system management of shared libraries is very much a per-platform > issue, and I don't pretend to know the trade-offs on OSX or Linux. > > Paul > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue May 19 18:28:18 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 09:28:18 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Tue, May 19, 2015 at 5:21 AM, Paul Moore wrote: > If conda did everything pip did (and that includes consuming wheels > from PyPI, not just sdists, and it includes caching of downloads, > autobuilding of wheels etc, etc.) hmm...what about half-way -- conda does everything pip does, but not necessarily the same way -- i.e. you do a "conda install this_package", and it works for every package ( OK -- almost every ;-) ) that pip install works for. But maybe that's not going to cut it -- in a way, we are headed there now, with a contingent of people porting pypi packages to conda. So far it's various subsets of the scientific community, but if we could get a few web developers to join in... 
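Much of that porting can be scripted around the tooling Chris mentioned earlier in the thread: conda skeleton pypi generates a recipe from a PyPI package, and conda build turns the recipe into a conda package. A rough sketch, assuming conda-build is installed and the packages build cleanly without hand-editing the generated recipe (the names are placeholders):

    # Generate conda recipes from PyPI and build them for the current platform.
    # Assumes the "conda" command (with conda-build installed) is on PATH.
    import subprocess

    PYPI_NAMES = ["some-pure-python-package"]   # placeholder names

    for name in PYPI_NAMES:
        # Writes a recipe directory named after the package (meta.yaml etc.).
        subprocess.check_call(["conda", "skeleton", "pypi", name])
        # Builds a conda package from that recipe.
        subprocess.check_call(["conda", "build", name])

Anything with non-trivial compiled dependencies will still need the recipe edited by hand, which is exactly where the duplicated effort discussed above comes back in.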
then I'd certainly consider how to > switch to conda (*not* Anaconda - I'd use a different package manager, > but not a different Python distribution) rather than pip. > hmm -- that's the interesting technical question -- conda works at a higher level than pip -- it CAN manage python itself -- I'm not sure it is HAS to, but that's how it is usually used, and the idea is to provide a complete environment, which does include python itself. > But "considering switching" would include getting PyPI supporting > conda packages, uhm, why? if there is a community supported repo of packages -- why does it have to be PyPi? > getting ensurepip replaced with ensureconda, etc. A > total replacement for pip, in other words. > > As a pip maintainer I'm obviously biased, but if conda is intending to > replace pip as the official packaging solution for Python, then it > needs to do so completely. If it doesn't do that, then we (PyPA and > the Python core developers) need to be able to credibly say that pip > is the official solution, and that means that we need to make sure > that pip/wheel provides the best user experience possible. That > includes persuading parts of the Python community (e.g. Scientific > users) not to abandon the standard solution in favour of a custom one. > I agree here. Though we do have a problem -- as Nick has pointed out, the "full" scientific development process -- even if python-centered, requires non-python parts, even beyond shared libs -- a fortran compiler is a big one, but also maybe other languages, like R, or Julia, etc. Or LLVM (for numba), or... This is why Continuum build conda -- they wanted to provide a way to manage all that. So is it possible for PyPA to grow the features to mange all the python bits, and then have things like conda use pip inside of Anaconda, maybe? or SOME transition where you can add conda if and only if you need its unique features, and as a add-it-later-to-what-you-have solution, rather than, when you need R or some such, you need to * Toss out your current setup * Install Anaconda (or miniconda) * Switch from virtualenv to conda environments * re-install all your dependencies And for even that to work, we need a way for everything installable by pip to be installable within that conda environment -- which we could probably achieve. My fear here is a split in the Python community, with some packages > only being available via one ecosystem, and some via another. Exactly. While I'm not at all sure that we could get to a "one way to do it" that would meet every community's needs, I do think that we could push pip+pypi+wheel a little further to better support at least the python-centric stuff -- i.e. third party libs, which would get us a lot farther. And again, it's not just the scipy stack -- there is stuff like image manipulation packages, etc, that could be better handled. And the geospatial packages are a mess, too - is that "scientific"? -- I don't know, but it's the "new hotness" in web development. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From chris.barker at noaa.gov Tue May 19 18:40:34 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 09:40:34 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Tue, May 19, 2015 at 9:15 AM, David Cournapeau wrote: > Honestly, I still haven't seen a solid explanation of why (at least on >> Windows) static linking isn't a viable option. > > well - it does get us pretty far.... > Because some libraries simply don't work as static libraries, or are too > big (MKL comes to mind). Also, we have been historically using static libs > for our eggs at Enthought on windows, and it has been a nightmare to > support. It just does not scale when you have 100s of packages. > there is also the issue of semi-developers -- I want people to be able to easily build my package, that depends on a bunch of libs that I really want to be the same. I suppose I could deliver the static libs themselves, along with the headers, etc, but that does get ugly. > But really, once wheels support arbitrary file locations, this becomes > fairly easy at the packaging level: the remaining issue is one of ABI / > standards, but that's mostly a non technical issue. > yup -- social issues are the big one. > Gholke has 100s of packages using wheels, > doesn't he ship the dlls with the packages, even when not using static libs? so that multiple packages may have the same dll? which is almost the same as a static lib. > and we ourselves at Enthought have close to 500 packages for windows, all > packaged as eggs, maybe 30-40 % of which are not python but libs, C/C++ > programs, etc... It is more about agreeing about a common way of doing > things than a real technical limitation. > good to know -- I suspected as much but haven't tried it yet. If someone were to >> create and publish a Python compatible static ".lib" file for the >> various hard-to-build dependencies, extensions could specify that you >> link with it in setup.py, and all the person building the wheel has to >> do is download the needed libraries for the build. >> > OK, but from a "social" perspective, this is unlikely to happen (it hasn't yet!), without some "official" support on PyPi, with pip, or ??? So even if static is the way to go -- there is an infrastructure need. If there's a technical reason why dynamic linking at runtime is better >> than static linking (sufficiently better that it justifies all the >> effort needed to resolve the issues involved), > > I'm not sure the issues are that much greater -- we could build binary wheels that hold dynamic libs, and put them up on PyPi -- then other package maintainers would need to actually use those -- almost the same thing as getting a variety of package maintainers to use this mythical repository of static libs. (and by the way, gcc makes it remarkably had to force a static build) Another issue I've run into is nested static libs -- you have to change yoru setup.py in special ways for that to work: libnetcdf depends on libhdf5, depends on libcurl and libz and... getting all that statically linked into my extension is tricky. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From leorochael at gmail.com Tue May 19 19:18:14 2015 From: leorochael at gmail.com (Leonardo Rochael Almeida) Date: Tue, 19 May 2015 14:18:14 -0300 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 19 May 2015 at 13:28, Chris Barker wrote: > [...] > So is it possible for PyPA to grow the features to mange all the python > bits, and then have things like conda use pip inside of Anaconda, maybe? or > SOME transition where you can add conda if and only if you need its unique > features, and as a add-it-later-to-what-you-have solution, rather than, > when you need R or some such, you need to > > * Toss out your current setup > * Install Anaconda (or miniconda) > * Switch from virtualenv to conda environments > * re-install all your dependencies > > And for even that to work, we need a way for everything installable by pip > to be installable within that conda environment -- which we could probably > achieve. > What if instead of focusing on pip being able to install more than just python packages, we made sure that a virtualenv was as strict subset of, say, a conda environment? This way, going from virtualenv to, say, conda would not be a toss-out, but an upgrade. With all that was discussed here, ISTM it should be easy enough to make sure that a virtualenv contains *the place* where you install the DLLs/.so needed to make a certain pip package work, in a way that wouldn't pollute other virtualenvs or the whole system (that'd be /lib on nix and \Scritps or \Lib on Windows), even if pip itself is not responsible for installing the libraries in that place. Then, one could run something like pip install conda-enable Which would add a `conda` script to the virtualenv, which then could be used to install python packages and their non Python dependencies in a way that is pip compatible. But what if we don't want to teach people to use anything other than pip? Then perhaps instead of teaching pip to install non-python stuff, we could just add some hooks to pip. For instance, the `conda-enable` package above could install hooks that would be called by pip whenever pip is called to install some packages. The hook would receive the package name and its known metadata, allowing the `conda-enable` package to go and install whatever pre-requisites it knows about for the presented package, before pip tries to install the package itself. The `conda-enable` package could also add configuration to the virtualenv telling pip which alternative indexes to use to install wheels known to be compatible with the non-python dependencies the hook above installs. This would give us a route that doesn't force the PYPA stack to consider how to solve the non-python dependency issue, or even what metadata would be required to solve it. The installed hook would need to keep a mapping of distribution-name to non-python dependency. In time, we could think of extending the metadata that PYPI carries to contemplate non-python dependencies, but we could have a solution that works even without it. I used `conda` in the examples above, but with the hooks in place, anyone could write their own non-python dependency resolution system. 
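A concrete (and entirely hypothetical) version of that hook might look like the sketch below. pip has no such hook point today; the mapping from distribution names to non-Python prerequisites would be maintained by the conda-enable package itself, not by PyPI metadata.

    # HYPOTHETICAL pre-install hook for a "conda-enable" style package. The
    # hook point does not exist in pip; this only illustrates the proposal.
    import subprocess

    # Mapping maintained by the hook package itself.
    NON_PYTHON_PREREQS = {
        "lxml": ["libxml2", "libxslt"],
        "matplotlib": ["libpng", "freetype"],
    }

    def pre_install_hook(distribution_name, metadata=None):
        """Install known non-Python prerequisites before pip installs the package."""
        prereqs = NON_PYTHON_PREREQS.get(distribution_name.lower(), [])
        if prereqs:
            subprocess.check_call(["conda", "install", "--yes"] + prereqs)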
For instance, this could be a good way for corporations to provide for the use of standardized software for their python platforms that is not limited to python packages, without forcing a lot of feature creep onto the PYPA stack itself. Cheers, Leo -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Tue May 19 19:58:28 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 19 May 2015 18:58:28 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 19 May 2015 at 17:28, Chris Barker wrote: > On Tue, May 19, 2015 at 5:21 AM, Paul Moore wrote: >> >> If conda did everything pip did (and that includes consuming wheels >> from PyPI, not just sdists, and it includes caching of downloads, >> autobuilding of wheels etc, etc.) > > hmm...what about half-way -- conda does everything pip does, but not > necessarily the same way -- i.e. you do a "conda install this_package", and > it works for every package ( OK -- almost every ;-) ) that pip install works > for. Sure. Doesn't have to be the same way, but the user experience has to be the same. > But maybe that's not going to cut it -- in a way, we are headed there now, > with a contingent of people porting pypi packages to conda. So far it's > various subsets of the scientific community, but if we could get a few web > developers to join in... Unless project owners switch to providing conda packages, isn't there always going to be a lag? If a new version of lxml comes out, how long must I wait for "the conda folks" to release a package for it? >> then I'd certainly consider how to >> switch to conda (*not* Anaconda - I'd use a different package manager, >> but not a different Python distribution) rather than pip. > > hmm -- that's the interesting technical question -- conda works at a higher > level than pip -- it CAN manage python itself -- I'm not sure it is HAS to, > but that's how it is usually used, and the idea is to provide a complete > environment, which does include python itself. Yes. But I don't want to use Anaconda Python. Same reason - how long do I wait for the new release of Python to be available in Anaconda? There's currently no Python 3.5 alpha for example... >> But "considering switching" would include getting PyPI supporting >> conda packages, > > uhm, why? if there is a community supported repo of packages -- why does it > have to be PyPi? If conda/binstar is good enough to replace pip/PyPI, there's no reason for pip/PyPI to still exist. So in effect binstar *becomes* PyPI. There's an element of evangelisation going on here - you're (effectively) asking what it'd take to persuade me to use conda in place of pip. I'm playing hard to get, a little, because I see no specific benefits to me in using conda, so I don't see why I should accept any loss at all, in the absence of a benefit to justify it. My biggest worry is that at some point, "if you want numpy/scipy, you should use conda" becomes an explicit benefit of conda, and pip/PyPI users get abandoned by the scientific community. If that happens, I'd rather see the community rally behind conda than see a split. But I hope that's not the way things end up going. >> getting ensurepip replaced with ensureconda, etc. A >> total replacement for pip, in other words. This is the key point. 
The decision was made to "bless" pip as the official Python package manager. Should we revisit that decision? If not, then how do we ensure that pip (and the surrounding infrastructure) handles the needs of the *whole* Python community? If the authors of scientific extensions for Python abandon pip for conda, then pip isn't supporting that part of the community properly. But conversely, if the scientific community doesn't look to address their issues within the pip/wheel infrastructure, how can we do anything to avoid a rift? (end of doom and gloom section ;-)) >> As a pip maintainer I'm obviously biased, but if conda is intending to >> replace pip as the official packaging solution for Python, then it >> needs to do so completely. If it doesn't do that, then we (PyPA and >> the Python core developers) need to be able to credibly say that pip >> is the official solution, and that means that we need to make sure >> that pip/wheel provides the best user experience possible. That >> includes persuading parts of the Python community (e.g. Scientific >> users) not to abandon the standard solution in favour of a custom one. > > > I agree here. Though we do have a problem -- as Nick has pointed out, the > "full" scientific development process -- even if python-centered, requires > non-python parts, even beyond shared libs -- a fortran compiler is a big > one, but also maybe other languages, like R, or Julia, etc. Or LLVM (for > numba), or... > > This is why Continuum build conda -- they wanted to provide a way to manage > all that. > > So is it possible for PyPA to grow the features to mange all the python > bits, and then have things like conda use pip inside of Anaconda, maybe? I'd like to think so. The goal of pip is to be the baseline Python package manager. We're expecting Linux distributions to build their system packages via wheel, why can't Anaconda? Part of the problem here, to my mind, is that it's *very* hard for the outsider to separate out (Ana)conda-as-a-platform versus conda-as-a-tool, versus conda-as-a-distribution-format. > or > SOME transition where you can add conda if and only if you need its unique > features, and as a add-it-later-to-what-you-have solution, rather than, when > you need R or some such, you need to > > * Toss out your current setup > * Install Anaconda (or miniconda) > * Switch from virtualenv to conda environments > * re-install all your dependencies Yeah, that's hopeless. And worse still is the possibility that a pure Python user might have to do that just to gain access to a particular package. I keep saying this, and I ought to ask - is there *any* likelihood that a package would formally abandon any attempt to provide binary distributions for Windows (or OSX, or whatever) except in conda format? So PyPI users will be told "install from source yourself or switch to conda". If there's no intention for that to ever happen, a lot of the "conda vs pip" discussion is less relevant. At the moment, there are projects like numpy that don't distribute Windows wheels on PyPI, but Christoph Gohlke has most of them available, and in general such projects seem to be aiming to move to wheels. So there aren't any practical cases of "conda-only" packages. > And for even that to work, we need a way for everything installable by pip > to be installable within that conda environment -- which we could probably > achieve. See above - without some serious resource on the conda side, that "everything" is unlikely. 
>> My fear here is a split in the Python community, with some packages >> only being available via one ecosystem, and some via another. > > > Exactly. While I'm not at all sure that we could get to a "one way to do it" > that would meet every community's needs, I do think that we could push > pip+pypi+wheel a little further to better support at least the > python-centric stuff -- i.e. third party libs, which would get us a lot > farther. Agreed. Understanding the *actual* problem here is important though (see my other post about clarifying why dynamic linking is so important). > And again, it's not just the scipy stack -- there is stuff like image > manipulation packages, etc, that could be better handled. Well, (for example) Pillow provides wheels with no issue, so I'm not sure what you're thinking of here? > And the geospatial packages are a mess, too - is that "scientific"? -- I > don't know, but it's the "new hotness" in web development. I can't really comment on that one, as I've never used them. Does that make me untrendy? :-) Paul From p.f.moore at gmail.com Tue May 19 20:03:57 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 19 May 2015 19:03:57 +0100 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 19 May 2015 at 17:15, David Cournapeau wrote: > On Wed, May 20, 2015 at 12:46 AM, Paul Moore wrote: >> >> On 19 May 2015 at 16:22, Chris Barker wrote: >> > The other issue is social: this would really only be a benefit if a wide >> > variety of packages shared the same libs -- but each of those packages >> > is >> > maintained by different individuals and communities. So it's had to know >> > if >> > it would get used. I could put up a libpng wheel, for instance, and who >> > knows if the Pillow folks would have any interest in using it? or the >> > matplotlib folks, or, ... And this would be particularly difficult when >> > the >> > solution was hacked together... >> >> Honestly, I still haven't seen a solid explanation of why (at least on >> Windows) static linking isn't a viable option. > > > Because some libraries simply don't work as static libraries, or are too big > (MKL comes to mind). Also, we have been historically using static libs for > our eggs at Enthought on windows, and it has been a nightmare to support. It > just does not scale when you have 100s of packages. > > But really, once wheels support arbitrary file locations, this becomes > fairly easy at the packaging level: the remaining issue is one of ABI / > standards, but that's mostly a non technical issue. > > Gholke has 100s of packages using wheels, and we ourselves at Enthought have > close to 500 packages for windows, all packaged as eggs, maybe 30-40 % of > which are not python but libs, C/C++ programs, etc... It is more about > agreeing about a common way of doing things than a real technical > limitation. Thanks. I'm not going to try to argue against the voice of experience. If you were saying dynamic libs were impossible with wheels (which has been a definite impression I've got elsewhere in this thread) I might try to argue, but if you're saying that the technical issues are solvable, then that's great. We need to establish the details of the "common way of doing things" that needs to be agreed to move forward. Maybe an interoperability PEP would be useful here, with someone simply proposing a solution to give the debate a concrete starting point? 
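(As a concrete starting point for that kind of agreement: the ad hoc convention that already exists is simply to ship the DLL inside the package directory, next to the extension that needs it, so such a wheel looks roughly like this -- all names invented:)

    mypkg-1.0-cp27-none-win_amd64.whl
        mypkg/
            __init__.py
            _ext.pyd              # the compiled extension module
            libfoo.dll            # its dependency, bundled alongside
        mypkg-1.0.dist-info/
            METADATA
            WHEEL
            RECORD

The obvious cost is that every package bundling libfoo.dll carries (and loads) its own copy, which is exactly the duplication this thread is trying to design away.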
Paul From p.f.moore at gmail.com Tue May 19 20:10:15 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 19 May 2015 19:10:15 +0100 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 19 May 2015 at 17:40, Chris Barker wrote: >> Because some libraries simply don't work as static libraries, or are too >> big (MKL comes to mind). Also, we have been historically using static libs >> for our eggs at Enthought on windows, and it has been a nightmare to >> support. It just does not scale when you have 100s of packages. > > there is also the issue of semi-developers -- I want people to be able to > easily build my package, that depends on a bunch of libs that I really want > to be the same. I suppose I could deliver the static libs themselves, along > with the headers, etc, but that does get ugly. Hmm, that seems to me to be something of a non-goal. If you publish wheels, 99.999% of people should never need to build your software. I'm 100% in favour of repeatable, automated builds - I routinely rant at instructions that say "grab this bit from over here, install that bit (oh wait, it moved since last time I looked, try here)..." But we don't really need a standard infrastructure for them, you write a "setup_build_environment.py" script that ships with your project, run it once, and then run python setup.py bdist_wheel. If your build dependencies are simple, "setup_build_environment.py" could be short (or even non-existent!) or if not it could be many hundreds of lines of code. But it's specific to your project. Paul From chris.barker at noaa.gov Tue May 19 21:26:55 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 12:26:55 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Tue, May 19, 2015 at 11:10 AM, Paul Moore wrote: > > to be the same. I suppose I could deliver the static libs themselves, > along > > with the headers, etc, but that does get ugly. > > Hmm, that seems to me to be something of a non-goal. If you publish > wheels, 99.999% of people should never need to build your software. > I'm not so sure -- with a nice mature package, sure. But with something under active development, having a low barrier to entry to running the latest code is really helpful. I suppose I could use a CI system to push a new version of the binary wheel with every update - maybe that is the way to go. I'm 100% in favour of repeatable, automated builds - absolutely > But we > don't really need a standard infrastructure for them, you write a > "setup_build_environment.py" script that ships with your project, run > it once, and then run python setup.py bdist_wheel. If your build > dependencies are simple, "setup_build_environment.py" could be short > (or even non-existent!) or if not it could be many hundreds of lines > of code. But it's specific to your project. > This entire conversation is about when the build dependencies are NOT simple :-). And while it may be project specific, commonly used libs are not project specific, and they are where the time and pain are. So some shared infrastructure would be nice. And maybe all that needs to be is a gitHub project with build scripts. But I had little luck in getting any traction that way. That is, until we had Anaconda, conda and binstar --- an infrastructure that provides a way for folks to collaborate on this kind of ugly package building effort. -Chris -- Christopher Barker, Ph.D. 
Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue May 19 21:34:19 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 12:34:19 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Tue, May 19, 2015 at 10:18 AM, Leonardo Rochael Almeida < leorochael at gmail.com> wrote: > What if instead of focusing on pip being able to install more than just > python packages, we made sure that a virtualenv was as strict subset of, > say, a conda environment? This way, going from virtualenv to, say, conda > would not be a toss-out, but an upgrade. > cool idea -- though it's kind of backwards - i.e. conda installs stuff outside of the python environment. So I'm not sure if you could shoehorn this all together in that way. At least with a significant re-engineering of conda, in which case, you've kind of built a new plug-in or add-on to pip that does more. But it's be great to proven wrong here! -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Tue May 19 22:19:12 2015 From: wes.turner at gmail.com (Wes Turner) Date: Tue, 19 May 2015 15:19:12 -0500 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Mon, May 18, 2015 at 12:50 PM, David Mertz wrote: > This pertains more to the other thread I started, but I'm sort of becoming > convinced--especially by Paul Moore's suggestion there--that the better > approach is to grow conda (the tool) rather than shoehorn conda packages > into pip. Getting pip to recognize the archive format of conda would be > easy enough alone, but that really doesn't cover the fact that 'conda ~= > pip+virtualenv', and pip alone simply should not try to grow that latter > aspect itself. Plus pip is not going to be fully language agnostic, for > various reasons, but including the fact that apt-get and yum and homebrew > and ports already exist. > > So it might make sense to actually allow folks to push conda to budding > web developers, if conda allowed installation (and environment management) > of sdist packages on PyPI. So perhaps it would be good if *this* worked: > > % pip install conda > % conda install scientific_stuff > % conda install --sdist django_widget # we know to look on PyPI > > Maybe that flag is mis-named, or could be omitted altogether. But there's > no conceptual reason that conda couldn't build an sdist fetched from PyPI > into a platform specific binary matching the current user machine (and do > all the metadata dependency and environment stuff the conda tool does). 
> Would this be different than: # miniconda conda install pip conda install scientific_stuff pip install django_widget With gh:conda/conda-env, pip packages are in a pip: section of the environment.yml file For example: conda env export -n root Then, to install pip: packages with pip: conda create -n example -f ./environment.yml > > On Mon, May 18, 2015 at 3:17 AM, Paul Moore wrote: > >> On 17 May 2015 at 23:50, Chris Barker wrote: >> > I guess the key thing here for me is that I don't see pushing conda to >> > budding web developers -- but what if web developers have the need for >> a bit >> > of the scipy stack? or??? >> > >> > We really don't have a good solution for those folks. >> >> Agreed. My personal use case is as a general programmer (mostly >> sysadmin and automation type of work) with some strong interest in >> business data analysis and a side interest in stats. >> >> For that sort of scenario, some of the scipy stack (specifically >> matplotlib and pandas and their dependencies) is really useful. But >> conda is *not* what I'd use for day to day work, so being able to >> install via pip is important to me. It should be noted that installing >> via pip *is* possible - via some of the relevant projects having >> published wheels, and the rest being available via Christoph Gohlke's >> site either as wheels or as wininsts that I can convert. But that's >> not a seamless process, so it's not something I'd be too happy >> explaining to a colleague should I want to share the workload for that >> type of thing. >> >> Paul >> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> https://mail.python.org/mailman/listinfo/distutils-sig >> > > > > -- > The dead increasingly dominate and strangle both the living and the > not-yet born. Vampiric capital and undead corporate persons abuse > the lives and control the thoughts of homo faber. Ideas, once born, > become abortifacients against new conceptions. > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Tue May 19 22:25:51 2015 From: wes.turner at gmail.com (Wes Turner) Date: Tue, 19 May 2015 15:25:51 -0500 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Tue, May 19, 2015 at 2:34 PM, Chris Barker wrote: > On Tue, May 19, 2015 at 10:18 AM, Leonardo Rochael Almeida < > leorochael at gmail.com> wrote: > >> What if instead of focusing on pip being able to install more than just >> python packages, we made sure that a virtualenv was as strict subset of, >> say, a conda environment? This way, going from virtualenv to, say, conda >> would not be a toss-out, but an upgrade. >> > > cool idea -- though it's kind of backwards - i.e. conda installs stuff > outside of the python environment. So I'm not sure if you could shoehorn > this all together in that way. At least with a significant re-engineering > of conda, in which case, you've kind of built a new plug-in or add-on to > pip that does more. 
> I've tried to do something like this with my (admittedly opinionated) dotfiles: https://westurner.org/dotfiles/venv # WORKON_HOME ~= CONDA_ENVS_PATH workon workon_conda # wec lscondaenvs virtualenv: deactivate() condaenv: source deactivate ... https://github.com/westurner/dotfiles/blob/develop/etc/bash/08-bashrc.conda.sh#L95 It's still pretty verbose and rough around the edges. -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Tue May 19 22:27:33 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 19 May 2015 21:27:33 +0100 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 19 May 2015 at 20:26, Chris Barker wrote: > This entire conversation is about when the build dependencies are NOT simple > :-). And while it may be project specific, commonly used libs are not > project specific, and they are where the time and pain are. So some shared > infrastructure would be nice. Fair point. > And maybe all that needs to be is a gitHub project with build scripts. But I > had little luck in getting any traction that way. That is, until we had > Anaconda, conda and binstar --- an infrastructure that provides a way for > folks to collaborate on this kind of ugly package building effort. Yeah, it's all about getting people interested. I wish the github project model would work (it did for msys2) but just wishing doesn't help much. Let me ask a different question, then. If I wanted to get hold of (say) libraries for libyaml, libxml2, and maybe a few others (libxpm?) is the conda work of any use for me? I don't want to build conda packages, or depend on them, I just want to grab Python-compatible libraries that I can link into my extension wheel build, or into my application that embeds Python. Ideally, I'd want static libs, but I understand that conda doesn't go that route (at the moment, at least). Even if it's just for me to better understand how conda works, is there an easy to follow guide I could read to see how the conda build of libyaml works? (I assume there must *be* a conda build of libyaml, as conda includes pyyaml which uses libyaml...) I suspect the answer is "no, conda's not designed to support that use case". Which is fine - but a shame. The "github project with build scripts" approach could potentially allow *more* users of the builds and libraries, And that, in a nutshell is the problem (on Windows, at least) - the community of people developing builds of common Unix tools and libraries is completely fragmented - msys2, conda, mingw, ... Anyway, I feel like we're now going round in circles. It's a hard (social) issue, and I don't feel like I have any answers, really. Paul From chris.barker at noaa.gov Tue May 19 23:11:05 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 14:11:05 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Tue, May 19, 2015 at 10:58 AM, Paul Moore wrote: > Sure. Doesn't have to be the same way, but the user experience has to > be the same. absolutely. > But maybe that's not going to cut it -- in a way, we are headed there now, > > with a contingent of people porting pypi packages to conda. So far it's > > various subsets of the scientific community, but if we could get a few > web > > developers to join in... 
> > Unless project owners switch to providing conda packages, isn't there > always going to be a lag? If a new version of lxml comes out, how long > must I wait for "the conda folks" to release a package for it? > who knows? -- but it is currently a light lift to update a conda package to a new version, once the original is built -- and we've got handy scripts and CI systems that will push an updated version of the binaries as soon as an updated version of the build script is pushed. It's a short step to automate looking for new versions on PyPi and automatically updating the conda packages -- though there would need to be hand-intervention for whenever an update broke the build script... Of course, the ideal is for package maintainers to push conda packages themselves -- which is why the more-than-one-system to support is unfortunate. On the other hand, there is one plus side -- if the package maintainer doesn't push to PyPi, it's easier for a third party to take on that role -- see pychecker, or, for that matter numpy and scipy -- on PyPI, but not binaries for Windows. But you can get them on binstar (or Anaconda, or...) > hmm -- that's the interesting technical question -- conda works at a > higher > level than pip -- it CAN manage python itself -- I'm not sure it is HAS > to, > but that's how it is usually used, and the idea is to provide a complete > environment, which does include python itself. > > Yes. But I don't want to use Anaconda Python. Same reason - how long > do I wait for the new release of Python to be available in Anaconda? > There's currently no Python 3.5 alpha for example... you can grab and build the latest Python3.5 inside a conda environment just as well. Or are you using python.org builds for alpha versions, too? Oh, and as a conda environment sits at a higher level than python, it's actually easier to set up an environment specifically for a particular version of python. And anyone could put up a conda package of Python3.5 Alpha as well --- once the build script is written, it's pretty easy. But again -- the more than one way to do it problem. If conda/binstar is good enough to replace pip/PyPI, there's no reason > for pip/PyPI to still exist. So in effect binstar *becomes* PyPI. > yup. > There's an element of evangelisation going on here - you're > (effectively) asking what it'd take to persuade me to use conda in > place of pip. I'm playing hard to get, a little, because I see no > specific benefits to me in using conda, so I don't see why I should > accept any loss at all, in the absence of a benefit to justify it. > I take no position here -- I'm playing around with ideas as to how we can move the community toward a better future -- I'm not trying to advocate any particular solution, but trying to figure out what solution we may want to pursue -- quite specifically which solution I'm going to put my personal energy toward. We may want to look back at a thread on this list where Travis Oliphant talks about why he built conda, etc. (I can't find it now -- maybe someone with better google-fu than me can. I think it was a thread on this list, probably about a year ago) or read his Blog Post: http://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html One of the key points is that when they started building conda -- pip+wheel were not mature, and the core folks behind them didn't want to support what was needed (dynamic libs, etc) -- and still don't. 
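(To put the point above about per-version environments in concrete terms, it is a one-liner -- the package list is just an example:)

    conda create -n py34env python=3.4 numpy scipy
    source activate py34env      # plain "activate py34env" on Windows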
My biggest worry is that at some point, "if you want numpy/scipy, you > should use conda" becomes an explicit benefit of conda, That is EXACTLY what the explicit benefit of conda is. I think we'll get binary wheels for numpy and scipy up on PyPi before too long, but the rest of the more complex stuff is not going to be there. > and pip/PyPI > users get abandoned by the scientific community. They kind of already have -- it's been a long time, and a lot of work by only a couple folks to try to get binary wheels up on PyPi for Windows and OS-X > If that happens, I'd > rather see the community rally behind conda than see a split. But I > hope that's not the way things end up going. we'll see. But look at Travis' post -- pip+wheel simply does not support the full needs of the full scientific user. If we want a "one ring to rule them all", then it'll have to be conda -- or something a lot like it. On the other hand, I think pip+wheel+PyPi (or maybe just the community around it) can be extended a bit to at least support all the truly python focused stuff -- which I think would be pretty worthwhile it itself. This is the key point. The decision was made to "bless" pip as the > official Python package manager. Should we revisit that decision? I'm not sure I want to be the one to bring that up ;-) > If > not, then how do we ensure that pip (and the surrounding > infrastructure) handles the needs of the *whole* Python community? If > the authors of scientific extensions for Python abandon pip for conda, > then pip isn't supporting that part of the community properly. But > conversely, if the scientific community doesn't look to address their > issues within the pip/wheel infrastructure, how can we do anything to > avoid a rift? > well -- I think the problem is that while SOME of the needs of scientific community can be addressed within the pip-wheel infrastructure, they all can't all be addressed there. And (I wish I could find that thread), I'm pretty sure Travis said that before he started conda, he talked to the PyPa folks (before it was called that), and told that he'd be best off going off and building something new -- pip+wheel were just getting started, and were not going to support what he needed -- certainly not anytime soon. > I'd like to think so. The goal of pip is to be the baseline Python > package manager. We're expecting Linux distributions to build their > system packages via wheel, why can't Anaconda? Part of the problem > here, to my mind, is that it's *very* hard for the outsider to > separate out (Ana)conda-as-a-platform versus conda-as-a-tool, versus > conda-as-a-distribution-format. > absolutely. When you say "build their system packages via wheel" -- what does that mean? and why wheel, rather than, say, pip + setuptools? you can put whatever you want in a conda build script -- the current standard practice baseline for python packages is: $PYTHON setup.py install it's that simple. And I don't know if this is what it actually does, but essentially the conda package is all the stuff that that script added to the environment. So if changing that invocation to use pip would get us some better meta data or what have you, then by all means, we should change that standard of practice. (but it seems like an odd thing to have to use the package manager to build the package correctly -- shouldn't' that be distutils or setuptools' job? 
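(On the metadata question: a wheel is just a zip whose payload sits next to a .dist-info directory, so it is easy to see what extra information a bdist_wheel-based recipe would hand conda -- the project name and tag below are made up:)

    python setup.py bdist_wheel
    unzip -l dist/mypkg-1.0-cp27-none-linux_x86_64.whl
    #   mypkg/...                        the files to be installed
    #   mypkg-1.0.dist-info/METADATA     name, version, declared dependencies
    #   mypkg-1.0.dist-info/WHEEL        wheel-format version and tags
    #   mypkg-1.0.dist-info/RECORD       every file plus its hash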
> * Toss out your current setup > > * Install Anaconda (or miniconda) > > * Switch from virtualenv to conda environments > > * re-install all your dependencies > > Yeah, that's hopeless. And worse still is the possibility that a pure > Python user might have to do that just to gain access to a particular > package. > exactly. I keep saying this, and I ought to ask - is there *any* likelihood > that a package would formally abandon any attempt to provide binary > distributions for Windows (or OSX, or whatever) except in conda > format? absolutely -- probably not the core major packages -- but I think that's already the case with a number of more domain specific packages (including my stuff -- OK, I have maybe three users outside my organization now, but still..) And numpy and scipy don't yet have binary wheels for Windows up -- though that is being worked on. So PyPI users will be told "install from source yourself or > switch to conda". If there's no intention for that to ever happen, Sadly, we are already there for minor packages, at least. Oh wait, not so minor -- the geospatial stack is not well supported on PyPi. I don't think there are pynetcdf or pyhdf binaries, etc... On the other hand, some domain specifc stuff is being support, like scikit-leaern, for instance: http://scikit-learn.org/stable/install.html#install-official-release lot of the "conda vs pip" discussion is less relevant. At the moment, > there are projects like numpy that don't distribute Windows wheels on > PyPI, but Christoph Gohlke has most of them available, yes, but in a form that is not redistributable on PyPi... > and in general > such projects seem to be aiming to move to wheels. So there aren't any > practical cases of "conda-only" packages. I'm not sure about "conda-only", but not pip-installable is all too common. > > And for even that to work, we need a way for everything installable by > pip > > to be installable within that conda environment -- which we could > probably > > achieve. > > See above - without some serious resource on the conda side, that > "everything" is unlikely. conda can provide a full, pretty standard, python environment -- why do you think "everything" is unlikely? > I do think that we could push > > pip+pypi+wheel a little further to better support at least the > > python-centric stuff -- i.e. third party libs, which would get us a lot > > farther. > > Agreed. Understanding the *actual* problem here is important though > (see my other post about clarifying why dynamic linking is so > important). > yes -- I don't know that that's answered yet -- but the third party dependency problem is real -- whether it is addressed by supporting dynamic linking, or by making it easier to find, build, distribute compatible static libs for package maintainers to use is still up in the air. > And again, it's not just the scipy stack -- there is stuff like image > > manipulation packages, etc, that could be better handled. > > Well, (for example) Pillow provides wheels with no issue, so I'm not > sure what you're thinking of here? the PIllow folks have figured it out -- and are doing the work -- but it took years, and we had a lot of pain building binaries for OS-X of PIL during that time. I think dynamic libs would be a good thing for packages like PIL, but maybe static is fine (I presume they are doing static now...) > And the geospatial packages are a mess, too - is that "scientific"? -- I > > don't know, but it's the "new hotness" in web development. 
> > I can't really comment on that one, as I've never used them. Does that > make me untrendy? :-) > Absolutely! -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Tue May 19 23:14:47 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 19 May 2015 22:14:47 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 19 May 2015 at 21:19, Wes Turner wrote: > Would this be different than: > > # miniconda > conda install pip > conda install scientific_stuff > pip install django_widget Having tried that in the past, I can say that I *very* rapidly got completely lost as to which packages I'd installed with pip, and which with conda. Uninstalling and/or upgrading with the "wrong" package manager caused all sorts of fun. Paul From chris.barker at noaa.gov Tue May 19 23:29:20 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 14:29:20 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Tue, May 19, 2015 at 1:27 PM, Paul Moore wrote: > > And maybe all that needs to be is a gitHub project with build scripts. > But I > > had little luck in getting any traction that way. That is, until we had > > Anaconda, conda and binstar --- an infrastructure that provides a way > for > > folks to collaborate on this kind of ugly package building effort. > > Yeah, it's all about getting people interested. I wish the github > project model would work (it did for msys2) but just wishing doesn't > help much. > well, to be fair, critical mass is key -- and I did not get far in generating that critical mass myself. If a couple dedicated people were to start such a project, and get it far enough that is was really easy for new folks to jump in and grab what they need -- it might keep rolling. > Let me ask a different question, then. If I wanted to get hold of > (say) libraries for libyaml, libxml2, and maybe a few others (libxpm?) > is the conda work of any use for me? I don't want to build conda > packages, or depend on them, I just want to grab Python-compatible > libraries that I can link into my extension wheel build, or into my > application that embeds Python. Ideally, I'd want static libs, but I > understand that conda doesn't go that route (at the moment, at least). > Even if it's just for me to better understand how conda works, is > there an easy to follow guide I could read to see how the conda build > of libyaml works? (I assume there must *be* a conda build of libyaml, > as conda includes pyyaml which uses libyaml...) > Here's the trick -- as far as I know, a conda binary package itself does not include the recipe used to build it. So the question is: has someone published the conda recipe ( conda uses a build dir with a bunch of stuff in it, yaml, shell scripts, what have you ) for that package? if so, then yes, it's easy to go in a look at it and see how it's done. As far as I can tell, Continuum does not publish the build scripts used to build all the stuff in Anaconda. 
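(For anyone who has not looked inside one: a recipe is normally just a tiny directory holding a meta.yaml and a build script. A rough, illustrative sketch for a libyaml-style recipe -- not copied from any published recipe, so treat the fields, version, and URL as placeholders:)

    # meta.yaml
    package:
      name: libyaml
      version: 0.1.6
    source:
      url: http://pyyaml.org/download/libyaml/yaml-0.1.6.tar.gz

    # build.sh -- conda-build sets $PREFIX to the environment being packaged
    ./configure --prefix=$PREFIX
    make
    make install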
But they do host: https://github.com/conda/conda-recipes though it's not in the least bit complete or particularly maintained. (I don't see libxml in there). There is also a growning community of folks developing and maintaining conda recipes for all sorts of stuff -- much of it on gitHub: https://github.com/ioos/conda-recipes is one example. (those are all getting built and pushed to binstar too). I suspect the answer is "no, conda's not designed to support that use > case". Which is fine - but a shame. it would be nice if a conda package got a copy of its build recipe embedded in it. > The "github project with build > scripts" approach could potentially allow *more* users of the builds > and libraries, And that, in a nutshell is the problem (on Windows, at > least) - the community of people developing builds of common Unix > tools and libraries is completely fragmented - msys2, conda, mingw, > yes, it sure is. Anyway, I feel like we're now going round in circles. It's a hard > (social) issue, and I don't feel like I have any answers, really. > me neither -- come on -- someone give us the answer! -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue May 19 23:17:16 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 14:17:16 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Tue, May 19, 2015 at 1:19 PM, Wes Turner wrote: > So it might make sense to actually allow folks to push conda to budding >> web developers, if conda allowed installation (and environment management) >> of sdist packages on PyPI. So perhaps it would be good if *this* worked: >> >> % pip install conda >> % conda install scientific_stuff >> % conda install --sdist django_widget # we know to look on PyPI >> >> > Would this be different than: > > # miniconda > conda install pip > conda install scientific_stuff > pip install django_widget > yes -- in the later, you have to START with the conda environment. But yes -- that should be doable. If conda understands pip metadata for dependencies and conda provides pip-understandable metadata, then that could work fine. With gh:conda/conda-env, pip packages are in a pip: section of the > environment.yml file > For example: > > conda env export -n root > > Then, to install pip: packages with pip: > > conda create -n example -f ./environment.yml > good point --- you'd also want a way for the conda to easily re-create the environment, with the pip-installed stuff included. But again, do-able. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From wes.turner at gmail.com Tue May 19 23:50:25 2015 From: wes.turner at gmail.com (Wes Turner) Date: Tue, 19 May 2015 16:50:25 -0500 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Tue, May 19, 2015 at 4:17 PM, Chris Barker wrote: > On Tue, May 19, 2015 at 1:19 PM, Wes Turner wrote: > >> So it might make sense to actually allow folks to push conda to budding >>> web developers, if conda allowed installation (and environment management) >>> of sdist packages on PyPI. So perhaps it would be good if *this* worked: >>> >>> % pip install conda >>> % conda install scientific_stuff >>> % conda install --sdist django_widget # we know to look on PyPI >>> >>> > >> Would this be different than: >> >> # miniconda >> conda install pip >> conda install scientific_stuff >> pip install django_widget >> > > yes -- in the later, you have to START with the conda environment. But yes > -- that should be doable. If conda understands pip metadata for > dependencies and conda provides pip-understandable metadata, then that > could work fine. > > With gh:conda/conda-env, pip packages are in a pip: section of the >> environment.yml file >> For example: >> >> conda env export -n root >> >> Then, to install pip: packages with pip: >> >> conda create -n example -f ./environment.yml >> > > good point --- you'd also want a way for the conda to easily re-create the > environment, with the pip-installed stuff included. But again, do-able. > $ deactivate; conda install conda-env; conda env export -n root | tee environment.yml name: root dependencies: - binstar=0.10.3=py27_0 - clyent=0.3.4=py27_0 - conda=3.12.0=py27_0 - conda-build=1.12.1=py27_0 - conda-env=2.1.4=py27_0 - jinja2=2.7.3=py27_1 - markupsafe=0.23=py27_0 - openssl=1.0.1k=1 - pip=6.1.1=py27_0 - pycosat=0.6.1=py27_0 - python=2.7.9=1 - python-dateutil=2.4.2=py27_0 - pytz=2015.2=py27_0 - pyyaml=3.11=py27_0 - readline=6.2=2 - requests=2.7.0=py27_0 - setuptools=16.0=py27_0 - six=1.9.0=py27_0 - sqlite=3.8.4.1=1 - tk=8.5.18=0 - yaml=0.1.4=1 - zlib=1.2.8=0 - pip: - conda-build (/Users/W/-wrk/-ce27/pypfi/src/conda-build)==1.12.1+46.g2d17c7f - dulwich==0.9.7 - hg-git==0.6.1 > > -Chris > > > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R (206) 526-6959 voice > 7600 Sand Point Way NE (206) 526-6329 fax > Seattle, WA 98115 (206) 526-6317 main reception > > Chris.Barker at noaa.gov > -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Wed May 20 00:09:28 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 19 May 2015 23:09:28 +0100 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 19 May 2015 at 22:29, Chris Barker wrote: > As far as I can tell, Continuum does not publish the build scripts used to > build all the stuff in Anaconda. So, for example the process for building the pyyaml package available via conda is private? (I want to say "proprietary", but there's a lot of implications in that term that I don't intend...) That seems like a rather striking downside to conda that I wasn't aware of. 
Hopefully, I'm misunderstanding something :-) Paul From p.f.moore at gmail.com Wed May 20 00:04:49 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 19 May 2015 23:04:49 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: Sigh. I really am going to try to stop monopolising this thread - but you keep making good points I feel I "have" to respond to :-) I'll try to keep to essentials. On 19 May 2015 at 22:11, Chris Barker wrote: > On Tue, May 19, 2015 at 10:58 AM, Paul Moore wrote: > you can grab and build the latest Python3.5 inside a conda environment just > as well. Or are you using python.org builds for alpha versions, too? Yep, I basically never build Python myself, except when developing it. > Oh, and as a conda environment sits at a higher level than python, it's > actually easier to set up an environment specifically for a particular > version of python. > > And anyone could put up a conda package of Python3.5 Alpha as well --- once > the build script is written, it's pretty easy. But again -- teh more than > one way to do it problem. Yeah, I'm trying to never build anything for myself, just consume binaries. Having all binaries built by "the conda people" is a bottleneck. Having pip auto-build wheels once and reuse them (coming in the next version, yay!) is good enough. Having projects upload wheels to PyPI is ideal. Building wheels myself from wininsts provided by others or by doing the awkward work once and hosting them in a personal index is acceptable. I've spent too much of my life trying to build other people's C code. I'd like to stop now :-) My needs don't extend to highly specialised *and* hard to build stuff. The specialised stuff I care about is usually not hard to build, and the hard to build stuff is usually relatively mainstream (basic numpy and scipy as opposed to the specialised scientific stuff) > We may want to look back at a thread on this list where Travis Oliphant > talks about why he built conda, etc. (I can't find it now -- maybe someone > with better google-fu than me can. It think it was a thread on this list, > probably about a year ago) > > or read his Blog Post: > > http://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html Thanks for the link - I'll definitely read that. > One of the key points is that when they started building conda -- pip+wheel > where not mature, and the core folks behind them didn't want to support what > was needed (dynamic libs, etc) -- and still don't. Well, I'm not sure "don't want to" is accurate these days. "Think the problem is harder than you're making it sound" may be accurate, as might "have no resource to spare to do the work, but would like others to". >> My biggest worry is that at some point, "if you want numpy/scipy, you >> should use conda" becomes an explicit benefit of conda, > > That is EXACTLY what the explicit benefit of conda is. I think we'll get > binary wheels for numpy and scipy up on PyPi before too long, but the rest > of the more complex stuff is not going to be there. I did specifically mean numpy and scipy. People are using those, and pandas and matplotlib, for a lot of non-scientific things (business data analysis as a replacement for Excel, for example). 
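(The build-once-and-reuse workflow is already possible by hand today, even before pip caches wheels automatically -- roughly:)

    pip wheel --wheel-dir=C:\wheelhouse -r requirements.txt                  # slow: builds each wheel once
    pip install --no-index --find-links=C:\wheelhouse -r requirements.txt    # fast: reuses the built wheels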
Forcing such people to use Anaconda seems like a mistake - the scientific community and the business community have very different perspectives. Conceded, numpy/scipy are *currently* hard to get - we're in the middle of a transition from wininst (which numpy/scipy did supply) to wheel. But the *intention* is that wheels will be an acceptable replacement for wininst installers, so giving up before we reach that goal would be a shame. >> This is the key point. The decision was made to "bless" pip as the >> official Python package manager. Should we revisit that decision? > > I'm not sure I want to be the one to bring that up ;-) Well, Nick Coghlan is probably better placed to comment on that. I don't really understand his vision for pip and wheel as something that other tools build on - and whether or not the fact that the scientific community feels the need to build a completely independent infrastructure is a problem in that context. But if it is a problem, maybe that decision *should* be reviewed. > When you say "build their system packages via wheel" -- what does that mean? > and why wheel, rather than, say, pip + setuptools? Again, that's something for Nick to comment on - I don't know how wheel (it's wheel more than pip in this context, I believe) fits into RPM/deb building processes. But I do know that's how he's been talking about things working. I can't see why conda would be any different. > you can put whatever you want in a conda build script -- the current > standard practice baseline for python packages is: > > $PYTHON setup.py install That sounds like conda doesn't separate build from install - is there a conda equivalent of a "binary distribution" like a wheel? There are a lot of reasons why pip/wheel is working hard to move to a separation of build and install steps, and if conda isn't following that, I'd like to know its response to those issues. (I don't know what all the issues are - it'd take some searching of the list archives to find previous discussions. One obvious one is "we don't want end users to have to have a compiler"). > it's that simple. And I don't know if this is what it actually does, but > essentially the conda package is all the stuff that that script added to the > environment. So if changing that invocation to use pip would get us some > better meta data or what have you, then by all means, we should change that > standard of practice. Well, it sounds like using "setup.py bdist_wheel" and then repackaging up the contents of the wheel might be easier than playing "spot what changed" with a setup.py install invocation. But I'm no expert. (If the response to that is "wheel doesn't handle X" then I'd rather see someone offering to help *improve* wheel to handle that situation!) > (but it seems like an odd thing to have to use the package manager to build > the package correctly -- shouldn't' that be distutils or setuptools' job? Wheels can be built using setuptools (bdist_wheel) or just packaging up some files manually. All pip does in this context is orchestrate the running of setup.py bdist_wheel. You lot use "conda" to mean the format, the package manager, and the distribution channel - give me some slack if I occasionally use "pip" when I mean "wheel" or "setuptools" :-) :-) > Sadly, we are already there for minor packages, at least. Oh wait, not so > minor -- the geospatial stack is not well supported on PyPi. I don't think > there are pynetcdf or pyhdf binaries, etc... 
There is a point where "specialised enough to only matter to Acaconda's target audience" is OK. And Christoph Gohlke's distributions fill a huge gap (losing that because "you should use conda" would be a huge issue, IMO). >> lot of the "conda vs pip" discussion is less relevant. At the moment, >> there are projects like numpy that don't distribute Windows wheels on >> PyPI, but Christoph Gohlke has most of them available, > > yes, but in a form that is not redistributable on PyPi... Well, it's not clear to me how insurmountable that problem is - his page doesn't specifically mention redistribution limitations (I presume it's license issues?) Has anyone discussed the possibility of addressing the issues? If it is licensing (of MKL?) then could the ones without license implications be redistributed? Could the PSF assist with getting a redistribution license for the remainder? I've no idea what the issues are - I'm just asking if there are approaches no-one has considered. >> and in general >> such projects seem to be aiming to move to wheels. So there aren't any >> practical cases of "conda-only" packages. > > I'm not sure about "conda-only", but not pip-installable is all too common. My benchmark is everything that used to distribute wininst installers or binary eggs distributes wheels. I'd *like* to go beyond that, but we haven't got the resources to help projects that never offered binaries. Asking (and assisting) people to move from the old binary standards to the new one should be a reasonable goal, though. (I appreciate this is a less strict criterion than my previous comments about "losing access to the scientific stack" could be taken as implying. My apologies - some of my earlier comments probably included a bit too much rhetoric :-)) >> > And for even that to work, we need a way for everything installable by >> > pip >> > to be installable within that conda environment -- which we could >> > probably >> > achieve. >> >> See above - without some serious resource on the conda side, that >> "everything" is unlikely. > > conda can provide a full, pretty standard, python environment -- why do you > think "everything" is unlikely? Because the pip/wheel ecosystem relies on projects providing their own wheels. Conda relies on the "conda guys" building packages. It's the same reason that a PyPI build farm is still only a theory :-) Or were you assuming that package authors would start providing conda packages? Paul From dmertz at continuum.io Wed May 20 00:23:58 2015 From: dmertz at continuum.io (David Mertz) Date: Tue, 19 May 2015 15:23:58 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: It is certainly not our intention at Continuum to keep build recipes private. I have just come on board at the company, but I'll add it to my TODO list to work on making sure that those are better updated and maintained at https://github.com/conda/conda-recipes. On Tue, May 19, 2015 at 3:09 PM, Paul Moore wrote: > On 19 May 2015 at 22:29, Chris Barker wrote: > > As far as I can tell, Continuum does not publish the build scripts used > to > > build all the stuff in Anaconda. > > So, for example the process for building the pyyaml package available > via conda is private? (I want to say "proprietary", but there's a lot > of implications in that term that I don't intend...) That seems like a > rather striking downside to conda that I wasn't aware of. 
Hopefully, > I'm misunderstanding something :-) > > Paul > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > -- The dead increasingly dominate and strangle both the living and the not-yet born. Vampiric capital and undead corporate persons abuse the lives and control the thoughts of homo faber. Ideas, once born, become abortifacients against new conceptions. -------------- next part -------------- An HTML attachment was scrubbed... URL: From dmertz at continuum.io Wed May 20 00:37:06 2015 From: dmertz at continuum.io (David Mertz) Date: Tue, 19 May 2015 15:37:06 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: I will note that most recipes seem to consist of either 'python setup.py install' or './configure; make; make install'. So there is quite likely actually little significant work that has failed to have been published. But I'm not sure of pyyaml off the top of my head, and how that is built. On Tue, May 19, 2015 at 3:23 PM, David Mertz wrote: > It is certainly not our intention at Continuum to keep build recipes > private. I have just come on board at the company, but I'll add it to my > TODO list to work on making sure that those are better updated and > maintained at https://github.com/conda/conda-recipes. > > On Tue, May 19, 2015 at 3:09 PM, Paul Moore wrote: > >> On 19 May 2015 at 22:29, Chris Barker wrote: >> > As far as I can tell, Continuum does not publish the build scripts used >> to >> > build all the stuff in Anaconda. >> >> So, for example the process for building the pyyaml package available >> via conda is private? (I want to say "proprietary", but there's a lot >> of implications in that term that I don't intend...) That seems like a >> rather striking downside to conda that I wasn't aware of. Hopefully, >> I'm misunderstanding something :-) >> >> Paul >> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> https://mail.python.org/mailman/listinfo/distutils-sig >> > > > > -- > The dead increasingly dominate and strangle both the living and the > not-yet born. Vampiric capital and undead corporate persons abuse > the lives and control the thoughts of homo faber. Ideas, once born, > become abortifacients against new conceptions. > -- The dead increasingly dominate and strangle both the living and the not-yet born. Vampiric capital and undead corporate persons abuse the lives and control the thoughts of homo faber. Ideas, once born, become abortifacients against new conceptions. -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Wed May 20 00:32:24 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 15:32:24 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: lost track of where in the thred this was, but here's a conda recipe I found on gitHub: https://github.com/menpo/conda-recipes/tree/master/libxml2 don't know anything about it..... -Chris -- Christopher Barker, Ph.D. 
Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Wed May 20 01:04:38 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 16:04:38 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Tue, May 19, 2015 at 3:04 PM, Paul Moore wrote: > Yeah, I'm trying to never build anything for myself, just consume > binaries. Having all binaries built by "the conda people" is a > bottleneck. it is -- though the group of "conda people" is growing... > Having pip auto-build wheels once and reuse them (coming > in the next version, yay!) is good enough. Having projects upload > wheels to PyPI is ideal. Building wheels myself from wininsts provided > by others or by doing the awkward work once and hosting them in a > personal index is acceptable. > you can build conda packages from wininsts as well, actually. Haven't tried it myself yet. (I'm hoping form bdist_mpkgs on the Mac, too, though those are getting rare) I've spent too much of my life trying to build other people's C code. > I'd like to stop now :-) > no kidding -- and this whole thread is about how to help more an more people stop... > One of the key points is that when they started building conda -- pip+wheel > > where not mature, and the core folks behind them didn't want to support > what > > was needed (dynamic libs, etc) -- and still don't. > > Well, I'm not sure "don't want to" is accurate these days. "Think the > problem is harder than you're making it sound" may be accurate, as > might "have no resource to spare to do the work, but would like others > to". well, yeah, though I'm still not sure how much support there is. And I do think that no one want to extend pip to be able to install Perl, for instance ;-) >> My biggest worry is that at some point, "if you want numpy/scipy, you > >> should use conda" becomes an explicit benefit of conda, > > > > That is EXACTLY what the explicit benefit of conda is. I think we'll get > > binary wheels for numpy and scipy up on PyPi before too long, but the > rest > > of the more complex stuff is not going to be there. > > I did specifically mean numpy and scipy. People are using those, and > pandas and matplotlib, That would be the core "scipy stack", and you can't, as of today, pip install it on Windows (you can on the Mac), and you can get wheel from the Gohlke repo, or wininst from the scipy site. That is going to change, hopefully soon, and to be fair the technical hurdles have to do with building a good LAPACK without licensing issues, and figuring out if we need to support pre-SSE2 hardware, etc ... not a pip or pypi limitation. But the *intention* is that wheels will be an acceptable > replacement for wininst installers, so giving up before we reach that > goal would be a shame. I don't think we will -- some folks are doing some great work on that -- and it looks to be close. > Again, that's something for Nick to comment on - I don't know how > wheel (it's wheel more than pip in this context, I believe) fits into > RPM/deb building processes. But I do know that's how he's been talking > about things working. 
I can't see why conda would be any different.

yup -- it probably does make sense to do with conda what is done with rpm. Except that conda already has a bunch of python-aware stuff in it...

> $PYTHON setup.py install
>
> That sounds like conda doesn't separate build from install - is there
> a conda equivalent of a "binary distribution" like a wheel?

yes, a conda package is totally binary. but when you build one, it does this:

1) create a new, empty conda environment
2) install the build dependencies of the package at hand
3) download or unpack the source code
4) build and install the package (into this special, temporary, conda environment)
5) package up all the stuff that got installed into a conda package

I don't know how it does it, but it essentially finds all the files that were added by the install stage, and puts those in the package (a minimal sketch of that idea follows below). Remarkably simple, and I think, kind of elegant.

I think it installs, rather than simply building, so that conda itself doesn't need to know anything about what kind of package it is -- what it wants the final package to be is an archive of everything that you want installed, where you want it installed. so actually installing it is the easiest way to do that.

for instance, a C library build script might be:

./configure
make
make install

> There are
> a lot of reasons why pip/wheel is working hard to move to a separation
> of build and install steps, and if conda isn't following that,

I think wheel and conda are quite similar in that regard, actually.

> > it's that simple. And I don't know if this is what it actually does, but
> > essentially the conda package is all the stuff that that script added to the
> > environment. So if changing that invocation to use pip would get us some
> > better metadata or what have you, then by all means, we should change that
> > standard of practice.
>
> Well, it sounds like using "setup.py bdist_wheel" and then repackaging
> up the contents of the wheel might be easier than playing "spot what
> changed" with a setup.py install invocation. But I'm no expert. (If
> the response to that is "wheel doesn't handle X" then I'd rather see
> someone offering to help *improve* wheel to handle that situation!)

for python packages, maybe but the way they do it now is more generic -- why have a bunch of python-package-specific code you don't need?

> > (but it seems like an odd thing to have to use the package manager to build
> > the package correctly -- shouldn't that be distutils or setuptools' job?
>
> Wheels can be built using setuptools (bdist_wheel) or just packaging
> up some files manually. All pip does in this context is orchestrate
> the running of setup.py bdist_wheel. You lot use "conda" to mean the
> format, the package manager, and the distribution channel - give me
> some slack if I occasionally use "pip" when I mean "wheel" or
> "setuptools" :-) :-)

fair enough. but does making a wheel create some metadata, etc., that doing a raw install does not create? i.e. is there something in a wheel that conda maybe should use that it's not getting?

> There is a point where "specialised enough to only matter to
> Acaconda's target audience" is OK. And Christoph Gohlke's
> distributions fill a huge gap (losing that because "you should use
> conda" would be a huge issue, IMO).

His repo is fantastic, yes, but it's an awful lot of reliance on one really, really, productive and smart guy! As a note -- he is often the first to find and resolve a lot of Windows build issues on a variety of packages.
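The "find all the files that were added by the install stage" step described above is easy to picture with a small sketch. This is only an illustration of that approach, not conda-build's actual implementation; the prefix path and archive name are invented:

    import os
    import tarfile

    def snapshot(prefix):
        # record every file currently under the install prefix (relative paths)
        files = set()
        for root, _dirs, names in os.walk(prefix):
            for name in names:
                files.add(os.path.relpath(os.path.join(root, name), prefix))
        return files

    def archive_new_files(prefix, before, out_path="example-pkg.tar.bz2"):
        # archive only what appeared in the prefix since `before` was taken
        added = sorted(snapshot(prefix) - before)
        with tarfile.open(out_path, "w:bz2") as tar:
            for rel in added:
                tar.add(os.path.join(prefix, rel), arcname=rel)
        return added

    # usage sketch:
    #   before = snapshot("/tmp/build-env")
    #   ... run "./configure && make && make install" into that prefix ...
    #   archive_new_files("/tmp/build-env", before)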
AND I think some of the wheels on pypi are actually built by him, and re-distributed by the package maintainers. >> lot of the "conda vs pip" discussion is less relevant. At the moment, > >> there are projects like numpy that don't distribute Windows wheels on > >> PyPI, but Christoph Gohlke has most of them available, > > > > yes, but in a form that is not redistributable on PyPi... > > Well, it's not clear to me how insurmountable that problem is - his > page doesn't specifically mention redistribution limitations (I > presume it's license issues?) yes. > Has anyone discussed the possibility of > addressing the issues? If it is licensing (of MKL?) yes. > then could the > ones without license implications be redistributed? probably -- and, as above, some are (at least one I know of was, anyway :-) ) > Could the PSF > assist with getting a redistribution license for the remainder? I've > no idea what the issues are - I'm just asking if there are approaches > no-one has considered. > It's been talked about, and has recently -- the trick, as I understand it, is that while the Intel license allows re-distribution, you are supposed to have soem sort of info for the user about the license restrictions. when you "pip install" something, there is no way for the user to get that license info. Or something like that. Also -- the scipy devs in general really prefer a fully open source solution. But one is coming soon, so we're good. > I'm not sure about "conda-only", but not pip-installable is all too > common. > > My benchmark is everything that used to distribute wininst installers > or binary eggs distributes wheels. we're probably getting pretty close there. wxPython is an exception, but Robin is distributing the new and improved beta version as wheels. I think he's got enough on his plate that he doesn't want to wrestle with the old and crufty build system. I don't know what else there is of the major packages. > I'd *like* to go beyond that, but > we haven't got the resources to help projects that never offered > binaries. Asking (and assisting) people to move from the old binary > standards to the new one should be a reasonable goal, though. > yup. which makes me think -- maybe not that hard to do a wininst to wheel converter for wxPython -- that would be nice. We also need it for the Mac, and that would be harder -- he's got some trickery in placing the libs in that one... > >> > And for even that to work, we need a way for everything installable by > >> > pip > >> > to be installable within that conda environment -- which we could > >> > probably > >> > achieve. > >> > >> See above - without some serious resource on the conda side, that > >> "everything" is unlikely. > > > > conda can provide a full, pretty standard, python environment -- why do > you > > think "everything" is unlikely? > > Because the pip/wheel ecosystem relies on projects providing their own > wheels. Conda relies on the "conda guys" building packages. It's the > same reason that a PyPI build farm is still only a theory :-) Or were > you assuming that package authors would start providing conda > packages? no -- THAT is unlikely. I meant that if we can get the automatic build a conda package from a pypi sdist working, then we're good to go. Or maybe build-a-conda-package-from-a-binary-wheel. After all, if a package can be pip-installed, then it should be able to be pip-installed inside a conda environment, which means it should be able to be done automatically. I've done a bunch of these -- it's mostly boilerplate. 
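For the "automatically build a conda package from a PyPI sdist" idea, conda-build already ships a "conda skeleton pypi" command (it comes up again later in the thread). A minimal sketch of driving it from Python, assuming conda-build is installed and the generated recipe needs no hand editing:

    import subprocess

    def conda_package_from_pypi(project_name):
        # generate ./<project_name>/meta.yaml (and build scripts) from PyPI metadata
        subprocess.check_call(["conda", "skeleton", "pypi", project_name])
        # build a conda package from the generated recipe directory
        subprocess.check_call(["conda", "build", project_name])

    # conda_package_from_pypi("pyyaml")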
-Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Wed May 20 01:11:19 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 May 2015 16:11:19 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Tue, May 19, 2015 at 3:09 PM, Paul Moore wrote: > So, for example the process for building the pyyaml package available > via conda is private? well, I haven't been able to find them... I don't know if continuum keeps them "private" on purpose or, just haven't happened to publish them. > That seems like a > rather striking downside to conda that I wasn't aware of. We need to be careful here about what is: "conda" -- a fully open source package management system "Anaconda" -- a python and other stuff distribution produced by Continuum. How Continuum does or doesn't publish the recipes it used to build Anaconda doesn't really have anything to do with conda-the-technology. On Tue, May 19, 2015 at 3:23 PM, David Mertz wrote: > It is certainly not our intention at Continuum to keep build recipes > private. > > I'll add it to my TODO list to work on making sure that those are better > updated and maintained at https://github.com/conda/conda-recipes. > That would be great! I will note that most recipes seem to consist of either 'python setup.py > install' or './configure; make; make install'. > sure -- but those aren't the ones we want ;-) > So there is quite likely actually little significant work that has > failed to have been published. But I'm not sure of pyyaml off the top of > my head, and how that is built. > see if you can find the wxPython one, while you are at it :-) -- though I suspect that was built from the "official" executable, rather than re-built from scratch. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Wed May 20 02:21:06 2015 From: wes.turner at gmail.com (Wes Turner) Date: Tue, 19 May 2015 19:21:06 -0500 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Tue, May 19, 2015 at 6:04 PM, Chris Barker wrote: > On Tue, May 19, 2015 at 3:04 PM, Paul Moore wrote: > > >> Yeah, I'm trying to never build anything for myself, just consume >> binaries. Having all binaries built by "the conda people" is a >> bottleneck. > > > it is -- though the group of "conda people" is growing... > > >> Having pip auto-build wheels once and reuse them (coming >> in the next version, yay!) is good enough. Having projects upload >> wheels to PyPI is ideal. Building wheels myself from wininsts provided >> by others or by doing the awkward work once and hosting them in a >> personal index is acceptable. >> > > you can build conda packages from wininsts as well, actually. Haven't > tried it myself yet. 
(I'm hoping form bdist_mpkgs on the Mac, too, though > those are getting rare) > > I've spent too much of my life trying to build other people's C code. >> I'd like to stop now :-) >> > > no kidding -- and this whole thread is about how to help more an more > people stop... > > > One of the key points is that when they started building conda -- > pip+wheel > >> > where not mature, and the core folks behind them didn't want to support >> what >> > was needed (dynamic libs, etc) -- and still don't. >> >> Well, I'm not sure "don't want to" is accurate these days. "Think the >> problem is harder than you're making it sound" may be accurate, as >> might "have no resource to spare to do the work, but would like others >> to". > > > well, yeah, though I'm still not sure how much support there is. And I do > think that no one want to extend pip to be able to install Perl, for > instance ;-) > > >> My biggest worry is that at some point, "if you want numpy/scipy, you >> >> should use conda" becomes an explicit benefit of conda, >> > >> > That is EXACTLY what the explicit benefit of conda is. I think we'll get >> > binary wheels for numpy and scipy up on PyPi before too long, but the >> rest >> > of the more complex stuff is not going to be there. >> >> I did specifically mean numpy and scipy. People are using those, and >> pandas and matplotlib, > > > That would be the core "scipy stack", and you can't, as of today, pip > install it on Windows (you can on the Mac), and you can get wheel from the > Gohlke repo, or wininst from the scipy site. > > That is going to change, hopefully soon, and to be fair the technical > hurdles have to do with building a good LAPACK without licensing issues, > and figuring out if we need to support pre-SSE2 hardware, etc ... not a pip > or pypi limitation. > > But the *intention* is that wheels will be an acceptable >> replacement for wininst installers, so giving up before we reach that >> goal would be a shame. > > > I don't think we will -- some folks are doing some great work on that -- > and it looks to be close. > > >> Again, that's something for Nick to comment on - I don't know how >> wheel (it's wheel more than pip in this context, I believe) fits into >> RPM/deb building processes. But I do know that's how he's been talking >> about things working. I can't see why conda would be any different. >> > > yup -- it probably does make sense to do with conda what is done with rpm. > Except that conda already has a bunch of python-aware stuff in it... > > > $PYTHON setup.py install >> >> That sounds like conda doesn't separate build from install - is there >> a conda equivalent of a "binary distribution" like a wheel? > > > yes, a conda pacakge is totally binary. but when you build one, it does > this: > > 1) create a new, empty conda environment > 2) install the build dependencies of the package at hand > 3) download or unpack the source code > 4) build and install the package (into this special, temporary, conda > environment) > 5) package up all the stuff that got installed into a conda package > > I don't know how it does it, but it essentially finds all the files that > were added by the install stage, and puts those in the package. Remarkably > simple, and I think, kind of elegant. > > I think it installs, rather than simply building, so that conda itself > doesn't need to know anything about what kind of package it is -- what it > wants the final package to be is an archive of everything that you want > installed, where you want it installed. 
so actually installing it is the > easiest way to do that. > > for instance, a C library build script might be: > > ./configure > make > make install > > There are >> a lot of reasons why pip/wheel is working hard to move to a separation >> of build and install steps, and if conda isn't following that, > > > I think wheel and conda are quite similar in that regard, actually. > > > >> > it's that simple. And I don't know if this is what it actually does, but >> > essentially the conda package is all the stuff that that script added >> to the >> > environment. So if changing that invocation to use pip would get us some >> > better meta data or what have you, then by all means, we should change >> that >> > standard of practice. >> >> Well, it sounds like using "setup.py bdist_wheel" and then repackaging >> up the contents of the wheel might be easier than playing "spot what >> changed" with a setup.py install invocation. But I'm no expert. (If >> the response to that is "wheel doesn't handle X" then I'd rather see >> someone offering to help *improve* wheel to handle that situation!) >> > > for python packages, maybe but the way they do it now is more generic -- > why have a bunch of python package specific code you don't need? > > >> > (but it seems like an odd thing to have to use the package manager to >> build >> > the package correctly -- shouldn't' that be distutils or setuptools' >> job? >> >> Wheels can be built using setuptools (bdist_wheel) or just packaging >> up some files manually. All pip does in this context is orchestrate >> the running of setup.py bdist_wheel. You lot use "conda" to mean the >> format, the package manager, and the distribution channel - give me >> some slack if I occasionally use "pip" when I mean "wheel" or >> "setuptools" :-) :-) >> > > fair enough. but does making a wheel create some meta-data, etc. that > doing a raw install not create? i.e. is there something in a wheel that > conda maybe should use that it's not getting? > > There is a point where "specialised enough to only matter to >> Acaconda's target audience" is OK. And Christoph Gohlke's >> distributions fill a huge gap (losing that because "you should use >> conda" would be a huge issue, IMO). >> > > His repo is fantastic, yes, but it's an awful lot of reliance on one > really, really, productive and smart guy! As a note -- he is often the > first to find and resolve a lot of Windows build issues on a variety of > packages. AND I think some of the wheels on pypi are actually built by him, > and re-distributed by the package maintainers. > > >> lot of the "conda vs pip" discussion is less relevant. At the moment, >> >> there are projects like numpy that don't distribute Windows wheels on >> >> PyPI, but Christoph Gohlke has most of them available, >> > >> > yes, but in a form that is not redistributable on PyPi... >> >> Well, it's not clear to me how insurmountable that problem is - his >> page doesn't specifically mention redistribution limitations (I >> presume it's license issues?) > > > yes. > > >> Has anyone discussed the possibility of >> addressing the issues? If it is licensing (of MKL?) > > > yes. > > >> then could the >> ones without license implications be redistributed? > > > probably -- and, as above, some are (at least one I know of was, anyway > :-) ) > > >> Could the PSF >> assist with getting a redistribution license for the remainder? I've >> no idea what the issues are - I'm just asking if there are approaches >> no-one has considered. 
>> > > It's been talked about, and has recently -- the trick, as I understand it, > is that while the Intel license allows re-distribution, you are supposed to > have soem sort of info for the user about the license restrictions. > > when you "pip install" something, there is no way for the user to get that > license info. Or something like that. > > Also -- the scipy devs in general really prefer a fully open source > solution. But one is coming soon, so we're good. > > > I'm not sure about "conda-only", but not pip-installable is all too >> common. >> >> My benchmark is everything that used to distribute wininst installers >> or binary eggs distributes wheels. > > > we're probably getting pretty close there. wxPython is an exception, but > Robin is distributing the new and improved beta version as wheels. I think > he's got enough on his plate that he doesn't want to wrestle with the old > and crufty build system. I don't know what else there is of the major > packages. > > >> I'd *like* to go beyond that, but >> we haven't got the resources to help projects that never offered >> binaries. Asking (and assisting) people to move from the old binary >> standards to the new one should be a reasonable goal, though. >> > > yup. which makes me think -- maybe not that hard to do a wininst to wheel > converter for wxPython -- that would be nice. We also need it for the Mac, > and that would be harder -- he's got some trickery in placing the libs in > that one... > > >> >> > And for even that to work, we need a way for everything installable >> by >> >> > pip >> >> > to be installable within that conda environment -- which we could >> >> > probably >> >> > achieve. >> >> >> >> See above - without some serious resource on the conda side, that >> >> "everything" is unlikely. >> > >> > conda can provide a full, pretty standard, python environment -- why do >> you >> > think "everything" is unlikely? >> >> Because the pip/wheel ecosystem relies on projects providing their own >> wheels. Conda relies on the "conda guys" building packages. It's the >> same reason that a PyPI build farm is still only a theory :-) Or were >> you assuming that package authors would start providing conda >> packages? > > > no -- THAT is unlikely. I meant that if we can get the automatic build a > conda package from a pypi sdist working, then we're good to go. Or maybe > build-a-conda-package-from-a-binary-wheel. > conda skeleton pypi pyyaml https://github.com/conda/conda-build/issues/397 (adds a --version-compare to compare conda version w/ pypi version) > > After all, if a package can be pip-installed, then it should be able to be > pip-installed inside a conda environment, which means it should be able to > be done automatically. > > I've done a bunch of these -- it's mostly boilerplate. > > -Chris > > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R (206) 526-6959 voice > 7600 Sand Point Way NE (206) 526-6329 fax > Seattle, WA 98115 (206) 526-6317 main reception > > Chris.Barker at noaa.gov > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ncoghlan at gmail.com Wed May 20 07:25:38 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 20 May 2015 15:25:38 +1000 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 18 May 2015 at 14:32, David Cournapeau wrote: > On Mon, May 18, 2015 at 9:05 AM, Nick Coghlan wrote: >> Indirection via pip injects the usage of setuptools even for plain >> distutils projects, and generates https://www.python.org/dev/peps/pep-0376/ >> compliant metadata by default. > > > Note that some packages will push hard against injecting setuptools, at > least until it does not offer a way to prevent from installing as an egg > directory. Most of the core scientific packages avoid setuptools because of > this. pip changes the default behaviour of setuptools to be more in line with the behaviour of plain distutils (e.g. forcing "--single-version-externally-managed"). This means that "pip install X" consistently gives setuptools levels of capabilities in terms of dependency management, without defaulting to installing things as egg archives or directories (modulo the rough edges around setup-requires). As Donald notes, "pip install" also abstracts away any future invocation of a metadata based pluggable build system, while "./setup.py install" assumes the distutils-based status quo will remain in place forever. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From cournape at gmail.com Wed May 20 07:38:54 2015 From: cournape at gmail.com (David Cournapeau) Date: Wed, 20 May 2015 14:38:54 +0900 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Wed, May 20, 2015 at 2:25 PM, Nick Coghlan wrote: > On 18 May 2015 at 14:32, David Cournapeau wrote: > > On Mon, May 18, 2015 at 9:05 AM, Nick Coghlan > wrote: > >> Indirection via pip injects the usage of setuptools even for plain > >> distutils projects, and generates > https://www.python.org/dev/peps/pep-0376/ > >> compliant metadata by default. > > > > > > Note that some packages will push hard against injecting setuptools, at > > least until it does not offer a way to prevent from installing as an egg > > directory. Most of the core scientific packages avoid setuptools because > of > > this. > > pip changes the default behaviour of setuptools to be more in line > with the behaviour of plain distutils (e.g. forcing > "--single-version-externally-managed"). > > Yes, but that cannot be the only way to install the package. This is why I would like to see a way forward for https://bitbucket.org/pypa/setuptools/issue/371/setuptools-and-state-of-pep-376, to start pushing packages using setuptools in their setup.py in the scientific stack. Without this, it will be hard to move forward politically speaking. Lots of key contributors have a strong aversion to setuptools way of doing things (partly historical reasons, but it is hard to change minds). If every system (pip, "setup.py install", conda, enstaller) were to at least write the {dist/egg}-info directories correctly, it would be already a significant step toward interoperation. David This means that "pip install X" consistently gives setuptools levels > of capabilities in terms of dependency management, without defaulting > to installing things as egg archives or directories (modulo the rough > edges around setup-requires). 
> > As Donald notes, "pip install" also abstracts away any future > invocation of a metadata based pluggable build system, while > "./setup.py install" assumes the distutils-based status quo will > remain in place forever. > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Wed May 20 08:02:12 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 20 May 2015 16:02:12 +1000 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 20 May 2015 at 15:38, David Cournapeau wrote: > > > On Wed, May 20, 2015 at 2:25 PM, Nick Coghlan wrote: >> >> On 18 May 2015 at 14:32, David Cournapeau wrote: >> > On Mon, May 18, 2015 at 9:05 AM, Nick Coghlan >> > wrote: >> >> Indirection via pip injects the usage of setuptools even for plain >> >> distutils projects, and generates >> >> https://www.python.org/dev/peps/pep-0376/ >> >> compliant metadata by default. >> > >> > >> > Note that some packages will push hard against injecting setuptools, at >> > least until it does not offer a way to prevent from installing as an egg >> > directory. Most of the core scientific packages avoid setuptools because >> > of >> > this. >> >> pip changes the default behaviour of setuptools to be more in line >> with the behaviour of plain distutils (e.g. forcing >> "--single-version-externally-managed"). >> > > Yes, but that cannot be the only way to install the package. > > This is why I would like to see a way forward for > https://bitbucket.org/pypa/setuptools/issue/371/setuptools-and-state-of-pep-376, > to start pushing packages using setuptools in their setup.py in the > scientific stack. Without this, it will be hard to move forward politically > speaking. Lots of key contributors have a strong aversion to setuptools way > of doing things (partly historical reasons, but it is hard to change minds). As described in my last comment on that issue, I think the key is to offer a patch that generates PEP 376 metadata when "--single-version-externally-managed" is passed to setuptools. The PEP 376 metadata layout doesn't cover what should happen when using the parallel installation support in setuptools, so it doesn't make sense to try to abide by it in those cases. > If every system (pip, "setup.py install", conda, enstaller) were to at least > write the {dist/egg}-info directories correctly, it would be already a > significant step toward interoperation. Agreed. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Wed May 20 09:57:05 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 20 May 2015 17:57:05 +1000 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 19 May 2015 at 09:43, Chris Barker wrote: > On Mon, May 18, 2015 at 11:21 AM, Paul Moore wrote: >> My honest view is that unless conda is intending to replace pip and >> wheel totally, you cannot assume that people will be happy to use >> conda alongside pip (or indeed, use any pair of independent packaging >> tools together - people typically want one unified solution). 
And if >> the scientific community stops working towards providing wheels for >> people without compilers "because you can use conda", there is going >> to be a proportion of the Python community that will lose out on some >> great tools as a result. > > > Exactly -- this idea that there are two (or more) non-overlapping > communities is pretty destructive. There's a cornucopia of *overlapping* communities. We only rarely hear from system administrators upstream, for example, as they tend to be mainly invested in particular operating system or configuration management communities, leaving upstream mostly to developers and data analysts. For these admins, a package management system is only going to be potentially interesting if it is supported by their operating system or configuration management tool of choice (e.g. http://docs.ansible.com/list_of_packaging_modules.html for Ansible, or some of the options linked from Salt's package management abstraction layer: http://docs.saltstack.com/en/latest/ref/states/all/salt.states.pkg.html) This is why I'm such a big fan of richer upstream metadata with automated conversion to downstream formats as my preferred long term solution - this isn't a "pip vs conda" story, it's "pip vs conda vs yum vs apt vs MSI vs nix vs zypper vs zc.buildout vs enstaller vs PyPM vs ....". (in addition to the modules listed for Ansible and Salt, I discovered yet another one today: https://labix.org/smart) The main differences I see with conda relative to the other downstream package management systems is that it happened to be made by folks that are also heavily involved in development of Python based data analysis tools, and that some of its proponents want it to be the "one package management tool to rule them all". I consider the latter proposal to be as outlandish an idea as believing the world only needs one programming language - just as with programming languages, packaging system design involves making trade-offs between different priorities, so you can't optimise for everything at once. conda's an excellent cross-platform end user focused dependency management system. This is a good thing, but it does mean conda isn't a suitable candidate for use as an input format for other tools that compete with it. As far as the "we could use a better dynamic linking story for Windows and Mac OS X" story goes, now that I understand the general *nix case is considered out of scope for the situations Chris is interested in, I think there's a reasonable case to be made for being able to *bundle* redistributable dynamically linked libraries with a wheel file, and for the build process of *other* wheel files to be able to rely on those bundled external libraries. I originally thought the request was about being able to *describe* the external dependencies in sufficient detail that the general case on *nix could be handled, or that an appropriate Windows or Mac OS X binary could be obtained out of band, rather than by being bundled with the relevant wheel file. Getting a bundling based model to work reliably is still going to be difficult (and definitely more complicated than static linking in cases where data sharing isn't needed), but it's not intractable the way the general case is. Regards, Nick. 
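One possible shape for that bundling model, purely as a sketch: a "library-only" wheel ships the shared object inside its package directory and exposes a helper that consumers call before importing their own extension modules. The libfoo names below are invented for illustration:

    # hypothetical __init__.py of a "libfoo" wheel that bundles the shared library
    import ctypes
    import os
    import sys

    _HERE = os.path.dirname(os.path.abspath(__file__))
    _NAMES = {"win32": "foo.dll", "darwin": "libfoo.dylib"}
    _LIBNAME = _NAMES.get(sys.platform, "libfoo.so")

    def load():
        # load the bundled library with global symbol visibility where supported,
        # so extension modules in other wheels can resolve its symbols
        mode = getattr(ctypes, "RTLD_GLOBAL", 0)
        return ctypes.CDLL(os.path.join(_HERE, _LIBNAME), mode=mode)

    # a consuming package would then do, before importing its own extension:
    #   import libfoo; libfoo.load()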
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From p.f.moore at gmail.com Wed May 20 10:04:52 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 20 May 2015 09:04:52 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 19 May 2015 at 23:32, Chris Barker wrote: > lost track of where in the thred this was, but here's a conda recipe I found > on gitHub: > > https://github.com/menpo/conda-recipes/tree/master/libxml2 > > don't know anything about it..... OK, I'm still misunderstanding something, I think. As far as I can see, all that does is copy a published binary and repack it. There's no "build" instructions in there. Paul From p.f.moore at gmail.com Wed May 20 10:09:31 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 20 May 2015 09:09:31 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 20 May 2015 at 00:04, Chris Barker wrote: > yup. which makes me think -- maybe not that hard to do a wininst to wheel > converter for wxPython -- that would be nice. We also need it for the Mac, > and that would be harder -- he's got some trickery in placing the libs in > that one... "wheel convert " already does that. I wrote it, and use it a lot. It doesn't handle postinstall scripts (because wheels don't yet) but otherwise should be complete. Paul From dholth at gmail.com Wed May 20 15:30:41 2015 From: dholth at gmail.com (Daniel Holth) Date: Wed, 20 May 2015 09:30:41 -0400 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: The conda package specification is published at http://conda.pydata.org/docs/spec.html The file format is nice and simple. "A conda package is a bzipped tar archive (.tar.bz2) which contains metadata under the info/ directory, and a collection of files which are installed directly into an install prefix. The format is identical across platforms and operating systems. It is important to note that during the install process, all files are basically just extracted into the install prefix, with the exception of the ones in info/." (Compare to the Debian package format's embedded metadata and content archives.) It has concise metadata in info/index.json { "arch": "x86_64", "build": "py27_138_g4f40f08", "build_number": 138, "depends": [ "jinja2", "jsonpointer", "jsonschema", "mistune", "pandoc", "pygments", "python 2.7*", "pyzmq", "terminado", "tornado" ], "license": "MIT License", "name": "ipython-we", "platform": "linux", "subdir": "linux-64", "version": "3.1.0" } The package includes its build recipe in info/recipe This particular package has setuptools metadata in lib/python2.7/site-packages/ipython-3.1.0-py2.7.egg-info On the index, conda packages are organized by placing packages for a platform+architecture in their own (sub)directory, not by putting all that information in the filename. According to the docs it doesn't interpret the platform metadata. When conda installs a package, it gets unpacked into a common directory and then linked into each environment, so that it can be installed to lots of environments without taking up much extra disk space. 
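Because the format is just a bzipped tarball, the metadata described above is easy to inspect without installing anything; a short sketch (the package filename is only an example):

    import json
    import tarfile

    def read_conda_metadata(package_path):
        # return the parsed info/index.json from a .tar.bz2 conda package
        with tarfile.open(package_path, "r:bz2") as tar:
            member = tar.extractfile("info/index.json")
            return json.loads(member.read().decode("utf-8"))

    # meta = read_conda_metadata("ipython-3.1.0-py27_0.tar.bz2")
    # print(meta["name"], meta["version"], meta["depends"])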
Packages can have link and unlink scripts to provide custom behavior (perhaps fixing up some paths for files that can't just be linked) when this happens. It occurs to me that the setuptools packaging in general is more like a shared library format .so or .dll, aka libraries searched for along a path, than an OS level package manager. From dmertz at continuum.io Wed May 20 16:53:30 2015 From: dmertz at continuum.io (David Mertz) Date: Wed, 20 May 2015 07:53:30 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Tue, May 19, 2015 at 4:11 PM, Chris Barker wrote: > > On Tue, May 19, 2015 at 3:09 PM, Paul Moore wrote: > "conda" -- a fully open source package management system > "Anaconda" -- a python and other stuff distribution produced by Continuum. > > How Continuum does or doesn't publish the recipes it used to build Anaconda doesn't really have anything to do with conda-the-technology. True. Also though, in answer to the a question here, I asked a Continuum colleague on the conda team. It seems that Anaconda was built using a proprietary system before conda-build and conda-recipes was opened, so not all recipes have made it over to the Free side of the fence yet. But y'know, gh:conda-build *is* a public repository, anyone could add more. >> >> I will note that most recipes seem to consist of either 'python setup.py install' or './configure; make; make install'. > > > sure -- but those aren't the ones we want ;-) Understood. > > see if you can find the wxPython one, while you are at it :-) > -- though I suspect that was built from the "official" executable, rather than re-built from scratch. In the case of pyyaml, this is actually what's "behind the wall" > > #!/bin/bash > patch -p0 < --- setup.cfg~ 2011-05-29 22:31:18.000000000 -0500 > +++ setup.cfg 2012-07-10 20:33:50.000000000 -0500 > @@ -4,10 +4,10 @@ > [build_ext] > # List of directories to search for 'yaml.h' (separated by ':'). > -#include_dirs=/usr/local/include:../../include > +include_dirs=$PREFIX/include > # List of directories to search for 'libyaml.a' (separated by ':'). > -#library_dirs=/usr/local/lib:../../lib > +library_dirs=$PREFIX/lib > # An alternative compiler to build the extention. > #compiler=mingw32 > EOF > $PYTHON setup.py install It's not *only* the 'setup.py install', but it's not *that* much mystery either. wxPython I can't seem to find, not sure what I'm missing. -- The dead increasingly dominate and strangle both the living and the not-yet born. Vampiric capital and undead corporate persons abuse the lives and control the thoughts of homo faber. Ideas, once born, become abortifacients against new conceptions. -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Wed May 20 17:33:39 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 20 May 2015 16:33:39 +0100 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 20 May 2015 at 15:53, David Mertz wrote: > It's not *only* the 'setup.py install', but it's not *that* much mystery > either. wxPython I can't seem to find, not sure what I'm missing. Yeah, I had been under the impression that there was a lot of knowledge on how to build the dependencies (things like libyaml or whatever) in there. Looks like that's not the case... 
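For what it's worth, the pyyaml recipe quoted a little earlier boils down to pointing distutils' build_ext at the build environment's prefix so the libyaml headers and library are found there. A rough equivalent driven from Python rather than by patching setup.cfg, assuming the PREFIX variable that conda-build sets during a build:

    import os
    import subprocess
    import sys

    prefix = os.environ["PREFIX"]  # provided by conda-build while a recipe runs

    # build the C extension against $PREFIX's libyaml, then install
    subprocess.check_call([
        sys.executable, "setup.py",
        "build_ext",
        "--include-dirs", os.path.join(prefix, "include"),
        "--library-dirs", os.path.join(prefix, "lib"),
        "install",
    ])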
Paul From chris.barker at noaa.gov Wed May 20 17:54:35 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 20 May 2015 08:54:35 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Wed, May 20, 2015 at 1:04 AM, Paul Moore wrote: > > https://github.com/menpo/conda-recipes/tree/master/libxml2 > > > > don't know anything about it..... > > OK, I'm still misunderstanding something, I think. As far as I can > see, all that does is copy a published binary and repack it. There's > no "build" instructions in there. > indeed -- that is one way to buld a conda pacakge, as you well know! maybe no one has done a "proper" build from scratch recipe for that one -- or maybe continuum has, and we'll find out about it from David.... -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From gksalil at gmail.com Wed May 20 01:12:37 2015 From: gksalil at gmail.com (salil GK) Date: Wed, 20 May 2015 04:42:37 +0530 Subject: [Distutils] Help required for setup.py In-Reply-To: References: Message-ID: Hello I will provide more details about what I need to achieve I need to create a package for a tool that I create. Actually the tool that I created is a wrapper over ovftool which is provided by VMWare. ovftool install binary is provided as a bundle hence there is no package installed in the system ( `dpkg -l` will not list ovftool package ). ovftool will be installed in /usr/bin/ location. While creating the package I need to check if ovftool is available in the system and the version is 4.1.0. If it is not compatible, I need to fail the package installation with proper message. So how do I write setup.py for achieving the same. Thanks Salil On 19 May 2015 at 07:54, salil GK wrote: > Hello > > I was trying to create my package for distribution. I have a > requirement that I need to check if one particular command is available in > the system ( this command is not installed through a package - but a bundle > is installed to get the command in the system ). I am using Ubuntu 14.04 > > Thanks in advance > Salil > -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Wed May 20 18:34:54 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 20 May 2015 09:34:54 -0700 Subject: [Distutils] Help required for setup.py In-Reply-To: References: Message-ID: On Tue, May 19, 2015 at 4:12 PM, salil GK wrote: > I will provide more details about what I need to achieve > > I need to create a package for a tool that I create. Actually the tool > that I created is a wrapper over ovftool which is provided by VMWare. > ovftool install binary is provided as a bundle hence there is no package > installed in the system ( `dpkg -l` will not list ovftool package ). > ovftool will be installed in /usr/bin/ location. > > While creating the package I need to check if ovftool is available in > the system and the version is 4.1.0. If it is not compatible, I need to > fail the package installation with proper message. So how do I write > setup.py for achieving the same. > you can put arbitrary python code in setup.py. 
so before you call setup() in the file, put something like:

import subprocess
try:
    version = subprocess.check_output(['/usr/bin/ovftool','--version'])
except subprocess.CalledProcessError:
    print "ovftool is not properly installed"
    raise
if not is_this_the_right_version(version):
    raise ValueError("ovftool is not the right version")

of course, you'd probably want better error messages, etc, but hopefully you get the idea.

-CHB

> Thanks
> Salil
>
> On 19 May 2015 at 07:54, salil GK wrote:
>
>> Hello
>>
>> I was trying to create my package for distribution. I have a
>> requirement that I need to check if one particular command is available in
>> the system ( this command is not installed through a package - but a bundle
>> is installed to get the command in the system ). I am using Ubuntu 14.04
>>
>> Thanks in advance
>> Salil
>>
>
> _______________________________________________
> Distutils-SIG maillist - Distutils-SIG at python.org
> https://mail.python.org/mailman/listinfo/distutils-sig
>

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice
7600 Sand Point Way NE (206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From chris.barker at noaa.gov Wed May 20 18:25:05 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Wed, 20 May 2015 09:25:05 -0700
Subject: [Distutils] Making pip and PyPI work with conda packages
In-Reply-To:
References:
Message-ID:

On Wed, May 20, 2015 at 6:30 AM, Daniel Holth wrote:

> The package includes its build recipe in info/recipe

very cool -- I hadn't seen that -- I'll go take a look at some packages and see what I can find.

-CHB

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice
7600 Sand Point Way NE (206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From chris.barker at noaa.gov Wed May 20 19:37:48 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 20 May 2015 10:37:48 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Wed, May 20, 2015 at 12:57 AM, Nick Coghlan wrote: > This is why I'm such a big fan of richer upstream metadata with > automated conversion to downstream formats as my preferred long term > solution - this isn't a "pip vs conda" story, it's "pip vs conda vs > yum vs apt vs MSI vs nix vs zypper vs zc.buildout vs enstaller vs PyPM > vs ....". hopefully not "versus", but "working with" ;-) -- but very good point. If python can do things to make it easier for all these broader systems, that's a "good thing" > The main differences I see with conda relative to the other downstream > package management systems is that it happened to be made by folks > that are also heavily involved in development of Python based data > analysis tools, Which is to say Python itself. > and that some of its proponents want it to be the "one > package management tool to rule them all". I don't know about that -- though another key point is that it is cross platform (platform independent) -- it may be the only one that does that part well. > I consider the latter > proposal to be as outlandish an idea as believing the world only needs > one programming language - just as with programming languages, > packaging system design involves making trade-offs between different > priorities, so you can't optimise for everything at once. conda's an > excellent cross-platform end user focused dependency management > system. This is a good thing, but it does mean conda isn't a suitable > candidate for use as an input format for other tools that compete with > it. > Hmm -- that's true. But it is, as you said " cross-platform end user focused dependency management system" that handles python well, in addition to other things, including libs python may depend on. As such, it _could_ play the role that pip+wheel (secondarily pypi) play in the python ecosystem. You'd still need something like distutils and/or setuptools to actually handle the building, etc. And IF we wanted the "official" package manager for python to fully support dynamic libs, etc, as well as non-python associated software, then it would make sense to use conda, rather than keep growing pip_wheel until it duplicated conda's functionality. But I don't get the impression that that is an end-goal for PyPa, and I'm not sure it should be. As far as the "we could use a better dynamic linking story for Windows > and Mac OS X" story goes, now that I understand the general *nix case > is considered out of scope for the situations Chris is interested in, > exactly, -- just like it linux is out of scope for compiled wheels I think there's a reasonable case to be made for being able to > *bundle* redistributable dynamically linked libraries with a wheel > file, and for the build process of *other* wheel files to be able to > rely on those bundled external libraries. yup -- that's what I have in mind. 
> I originally thought the > request was about being able to *describe* the external dependencies > in sufficient detail that the general case on *nix could be handled, > or that an appropriate Windows or Mac OS X binary could be obtained > out of band, rather than by being bundled with the relevant wheel > file. > Sure would be nice, but no, -- I have no fantasies about that. Getting a bundling based model to work reliably is still going to be > difficult (and definitely more complicated than static linking in > cases where data sharing isn't needed), but it's not intractable the > way the general case is. > Glad you agree -- so the rabbit hole may not be that deep? There isn't much that should change in pip+wheel+metadata to enable this. So the way to proceed, if someone wants to do it, could be to simply hack together some binary wheels of a common dependency or two, build wheels for a package or two that depend on those, and see how it works. I dont know if/when I'll find the roundtoits to do that -- but I have some more detailed ideas if anyone wants to talk about it. Then it becomes a social issue -- package maintainers would have to actually use these new sharedlib wheels to build against. But that isn't really that different than the current case of deciding whether to include a copy of a dependent python package in your distribution -- and one we made it easy for users to get dependencies, folks have been happy to shift that burden elsewhere. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Wed May 20 21:05:27 2015 From: wes.turner at gmail.com (Wes Turner) Date: Wed, 20 May 2015 14:05:27 -0500 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Wed, May 20, 2015 at 12:13 PM, Chris Barker wrote: > The package includes its build recipe in info/recipe >>> >> >> very cool -- I hadn't seen that -- I'll go take a look at some packages >> and see what I can find. >> > > Darn -- the recipe is not there in most (all?) of the packages that came > from Anaconda -- probably due to the legacy issues David referred to. > The other day, I upgraded the version of conda-recipes/arrow to v0.5.4, and added ofxparse. I should probably create some sort of recurring cron task to show how far behind stable the version number in the meta.yaml is. (see: conda skeleton --version-compare issue/PR (GH:conda/conda-build)) > > And since a conda package is "just" a tar archive, you can presumably > build them in other ways than a conda build recipe. > > By the way -- libxml is one example of one without a recipe... > Hours of compilation time. * https://www.google.com/#q=inurl:libxml2+conda+meta.yaml (3 results) * https://pypi.python.org/pypi?%3Aaction=search&term=buildout+libxml * https://pypi.python.org/pypi/z3c.recipe.staticlxml/0.10 > > -Chris > > -- > > Christopher Barker, Ph.D. 
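If someone does run that experiment, the build-time half for a consuming project might look something like the setup.py below; the libfoo distribution, its file layout, and the consumer names are all hypothetical:

    # hypothetical consumer setup.py that compiles against a shared library
    # shipped inside an already-installed "libfoo" wheel
    import os
    from setuptools import setup, Extension

    import libfoo  # the imagined shared-library wheel, installed beforehand

    libdir = os.path.dirname(os.path.abspath(libfoo.__file__))

    ext = Extension(
        "consumer._speedups",
        sources=["consumer/_speedups.c"],
        include_dirs=[os.path.join(libdir, "include")],
        library_dirs=[libdir],
        runtime_library_dirs=[libdir],  # not used on Windows
        libraries=["foo"],
    )

    setup(name="consumer", version="0.1", packages=["consumer"], ext_modules=[ext])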
> Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R (206) 526-6959 voice > 7600 Sand Point Way NE (206) 526-6329 fax > Seattle, WA 98115 (206) 526-6317 main reception > > Chris.Barker at noaa.gov > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Thu May 21 00:46:32 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 21 May 2015 08:46:32 +1000 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 21 May 2015 at 03:37, Chris Barker wrote: > As such, it _could_ play the role that pip+wheel (secondarily pypi) play in > the python ecosystem. In practice, it can't, as conda is entirely inappropriate as an input format for yum/apt/enstaller/zc.buildout/pypm/MSI/etc. In many ways, the barriers that keep conda from being a viable competitor to pip from an upstream perspective are akin to those that felled the distutils2 project, while the compatible-with-the-existing-ecosystem d2to1 has seen far more success. Rather than being strictly technical, the reasons for this are mostly political (and partially user experience related) so it's not worth the futile effort of attempting to change them. When folks try anyway, it mainly serves to alienate people using (or working on) other integration platforms rather than achieving anything productive (hence my comment about the "one package manager to rule them all" attitude of some conda proponents, although I'll grant they haven't yet gone as far as the NixOS folks by creating an entirely conda based Linux distro). The core requirement for the upstream tooling is to be able to bridge the gap from publishers of software components implemented in Python to integrators of software applications and development environments (regardless of whether those integrators are themselves end users, redistributors or both). That way, Python developers can focus on learning one publication toolchain (anchored by pip & PyPI), while users of integrated platforms can use the appropriate tools for their platform. conda doesn't bridge that gap for Python in the general case, as it is itself an integrator tool managed independently of the PSF and designed to consume components from *multiple* language ecosystems and make them available to end users in a common format. Someone designing a *new* language ecosystem today could quite reasonably decide not to invent their own distribution infrastructure, and instead adopt conda as their *upstream* tooling, and have it be the publication toolchain that new contributors to that ecosystem are taught, and that downstream integrators are expected to interoperate with, but that's not the case for Python - Python's far too far down the distutils->setuptools->pip path to be readily amenable to alternatives (especially alternatives that are currently still fairly tightly coupled to the offerings of one particular commercial redistributor). Regards, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Thu May 21 02:20:57 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 21 May 2015 10:20:57 +1000 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On 21 May 2015 at 08:46, Nick Coghlan wrote: > On 21 May 2015 at 03:37, Chris Barker wrote: >> As such, it _could_ play the role that pip+wheel (secondarily pypi) play in >> the python ecosystem. > > In practice, it can't, as conda is entirely inappropriate as an input > format for yum/apt/enstaller/zc.buildout/pypm/MSI/etc. In many ways, > the barriers that keep conda from being a viable competitor to pip > from an upstream perspective are akin to those that felled the > distutils2 project, while the compatible-with-the-existing-ecosystem > d2to1 has seen far more success. I think I've finally figured out a short way of describing these "packaging ideas that simply won't work": if an ecosystem-wide packaging proposal doesn't work for entirely unmaintained PyPI packages, it's likely a bad proposal. This was not only the fatal flaw in the previous distribute/distutils2 approach, it's the reason we introduced so much additional complexity into PEP 440 in order to preserve compatibility with the vast majority of existing package versions on PyPI (over 98% of existing version numbers were still accepted), it's one of the key benefits of separating the PyPI-to-end-user TUF PEP from the dev-to-end-user one, and it's the reason why the "Impact assessment" section is one of the most important parts of the proposal in PEP 470 to migrate away from offering the current link spidering functionality (https://www.python.org/dev/peps/pep-0470/#id13). Coping with this problem is also why injecting setuptools when running vanilla distutils projects is one of the secrets of pip's success: by upgrading setuptools, and by tweaking the way pip invokes setup.py with it injected, we can change the way packages are built and installed *without* needing to change the packages themselves. Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Thu May 21 02:25:06 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 21 May 2015 10:25:06 +1000 Subject: [Distutils] PyPI and Uploading Documentation In-Reply-To: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> References: <3FF6FB8B-5DAE-4743-8F1A-6CEC324AD17A@stufft.io> Message-ID: On 15 May 2015 at 23:48, Donald Stufft wrote: > I think that it's time to retire this aspect of PyPI which has never been well > supported and instead focus on just the things that are core to PyPI. I don't > have a fully concrete proposal for doing this, but I wanted to reach out here > and figure out if anyone had any ideas. Re-reading the current draft of PEP 470, it's worth noting we currently mention pythonhosted.org in as a means of hosting a separate index page for a project that would like to host packages externally: https://www.python.org/dev/peps/pep-0470/#deprecation-and-removal-of-link-spidering That's not a reason to avoid deprecating the documentation hosting functionality in favour of ReadTheDocs and static HTML hosting services, just something to take into account. Regards, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Thu May 21 02:43:29 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 21 May 2015 10:43:29 +1000 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 21 May 2015 at 05:05, Wes Turner wrote: > > > On Wed, May 20, 2015 at 12:13 PM, Chris Barker > wrote: >>>> >>>> The package includes its build recipe in info/recipe >>> >>> >>> very cool -- I hadn't seen that -- I'll go take a look at some packages >>> and see what I can find. >> >> >> Darn -- the recipe is not there in most (all?) of the packages that came >> from Anaconda -- probably due to the legacy issues David referred to. > > The other day, I upgraded the version of conda-recipes/arrow to v0.5.4, and > added ofxparse. > > I should probably create some sort of recurring cron task to show how far > behind stable the version number in the meta.yaml is. (see: conda skeleton > --version-compare issue/PR (GH:conda/conda-build)) https://release-monitoring.org/ is a public service for doing that (more info on supported upstream backends at https://release-monitoring.org/about, more info on the federated messaging protocol used to publish alerts at http://www.fedmsg.com/en/latest/) Anitya (the project powering release-monitoring.org) was built as the "monitoring" part of Fedora's upstream release notification pipeline: https://fedoraproject.org/wiki/Upstream_release_monitoring One of my hopes for the metadata extension system in PEP 426 is that we'll be able to define extensions like "fedora.repackage", "debian.repackage" or "conda.repackage" which include whatever additional info is needed to automate creation of a policy compliant downstream package in a format that's a purely additive complement to the upstream metadata, rather than being somewhat duplicative as is the case today with things like spec files, deb control files, and conda recipes. Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From wes.turner at gmail.com Thu May 21 02:52:54 2015 From: wes.turner at gmail.com (Wes Turner) Date: Wed, 20 May 2015 19:52:54 -0500 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On May 20, 2015 7:43 PM, "Nick Coghlan" wrote: > > On 21 May 2015 at 05:05, Wes Turner wrote: > > > > > > On Wed, May 20, 2015 at 12:13 PM, Chris Barker > > wrote: > >>>> > >>>> The package includes its build recipe in info/recipe > >>> > >>> > >>> very cool -- I hadn't seen that -- I'll go take a look at some packages > >>> and see what I can find. > >> > >> > >> Darn -- the recipe is not there in most (all?) of the packages that came > >> from Anaconda -- probably due to the legacy issues David referred to. > > > > The other day, I upgraded the version of conda-recipes/arrow to v0.5.4, and > > added ofxparse. > > > > I should probably create some sort of recurring cron task to show how far > > behind stable the version number in the meta.yaml is. 
(see: conda skeleton > > --version-compare issue/PR (GH:conda/conda-build)) > > https://release-monitoring.org/ is a public service for doing that > (more info on supported upstream backends at > https://release-monitoring.org/about, more info on the federated > messaging protocol used to publish alerts at > http://www.fedmsg.com/en/latest/) > > Anitya (the project powering release-monitoring.org) was built as the > "monitoring" part of Fedora's upstream release notification pipeline: > https://fedoraproject.org/wiki/Upstream_release_monitoring Thanks! > > One of my hopes for the metadata extension system in PEP 426 is that > we'll be able to define extensions like "fedora.repackage", > "debian.repackage" or "conda.repackage" which include whatever > additional info is needed to automate creation of a policy compliant > downstream package in a format that's a purely additive complement to > the upstream metadata, rather than being somewhat duplicative as is > the case today with things like spec files, deb control files, and > conda recipes. http://conda.pydata.org/docs/bdist_conda.html bdist_conda? > > Regards, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Thu May 21 02:57:18 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 21 May 2015 10:57:18 +1000 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 20 May 2015 at 23:30, Daniel Holth wrote: > It occurs to me that the setuptools packaging in general is more like > a shared library format .so or .dll, aka libraries searched for along > a path, than an OS level package manager. Yep, that was what PJE was after for Chandler, so that's what he built. It was just useful enough for other folks that it was adopted well beyond that original use case. The key benefit it offered at the time was that pkg_resources could use sys.path + the assumption of directory or zip archive based installation to implement searching for metadata, which avoided the need to come up with an alternative cross-platform approach to metadata storage and retrieval. A related potentially interesting project I've never had time to pursue is an idea for a virtualenv friendly installation layout that doesn't quite lead to the same kind of version proliferation as setuptools did (or as something like NixOS does), while remaining compatible with sys.path based metadata discovery. The essential concept would be to assume semantic versioning for shared installations, and install the shared packages into directories named as "package/MAJOR_VERSION/". In each virtualenv that opts in to using the shared package rather than its own bundled copy, you'd then install a *.pth file that added "package/MAJOR_VERSION" to sys.path in that environment. This would be similar in principle to the way Nix user profiles work (http://nixos.org/releases/nix/nix-0.12/manual/#sec-profiles), but adapted to be compatible with Python's existing *.pth file mechanism. Cheers, Nick. 
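For reference, the *.pth mechanism that idea relies on can be shown in a couple of lines. This is only an illustration: the /opt/shared layout, the package name, and the .pth filename are invented here, nothing like this shared-install scheme exists yet.

# Hypothetical shared layout: one directory per package per major version,
#   /opt/shared/requests/2/requests/__init__.py ...
# A virtualenv opts in by dropping a *.pth file into its own site-packages;
# site.py reads that file at startup and appends each listed directory to sys.path:
$ echo "/opt/shared/requests/2" > myenv/lib/python3.4/site-packages/shared-requests.pth
$ myenv/bin/python -c "import requests; print(requests.__file__)"
/opt/shared/requests/2/requests/__init__.py

Environments without the .pth file keep using whatever copy they have installed locally, which is what keeps the opt-in per-environment.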
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Thu May 21 03:13:28 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 21 May 2015 11:13:28 +1000 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On 21 May 2015 at 10:52, Wes Turner wrote: > On May 20, 2015 7:43 PM, "Nick Coghlan" wrote: >> One of my hopes for the metadata extension system in PEP 426 is that >> we'll be able to define extensions like "fedora.repackage", >> "debian.repackage" or "conda.repackage" which include whatever >> additional info is needed to automate creation of a policy compliant >> downstream package in a format that's a purely additive complement to >> the upstream metadata, rather than being somewhat duplicative as is >> the case today with things like spec files, deb control files, and >> conda recipes. > > http://conda.pydata.org/docs/bdist_conda.html bdist_conda? conda has the benefit of *not* renaming Python packages in convoluted ways that interfere with automated identification of dependencies :) Both conda and Linux distros run into the "it's difficult/impossible to describe external binary dependencies in a cross-platform way" problem, though. While https://www.biicode.com/ is interesting in the context of CMake based projects, that still excludes a lot of software. (RYPPL is another I'd heard of, but it's GitHub repo hasn't seen much activity since 2013, and ryppl.org appears to be entirely dead) Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From chris.barker at noaa.gov Thu May 21 17:33:07 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 21 May 2015 08:33:07 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Wed, May 20, 2015 at 5:20 PM, Nick Coghlan wrote: > Coping with this problem is also why injecting setuptools when running > vanilla distutils projects is one of the secrets of pip's success: Ahh! THAT is the role pip plays in building. It's the way that you get setuptools features in a plain distutils-based package. So conda _could_ play the same trick, and inject setuptools into packages that don't use it already, but why bother -- pip does that for us. OK -- I'm going to try to find some time to play with this -- I do think it will solve some of the issues I've had, and if it works well, maybe we can move it toward a new standard of practice for conda-python-packages. Thanks -- clarity at last! -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From chris.barker at noaa.gov Thu May 21 17:35:05 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 21 May 2015 08:35:05 -0700 Subject: [Distutils] Making pip and PyPI work with conda packages In-Reply-To: References: Message-ID: On Wed, May 20, 2015 at 5:43 PM, Nick Coghlan wrote: > One of my hopes for the metadata extension system in PEP 426 is that > we'll be able to define extensions like "fedora.repackage", > "debian.repackage" or "conda.repackage" which include whatever > additional info is needed to automate creation of a policy compliant > downstream package in a format that's a purely additive complement to > the upstream metadata, rather than being somewhat duplicative as is > the case today with things like spec files, deb control files, and > conda recipes. > That would be great, yes! -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Thu May 21 17:28:44 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 21 May 2015 08:28:44 -0700 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> Message-ID: On Wed, May 20, 2015 at 3:46 PM, Nick Coghlan wrote: > On 21 May 2015 at 03:37, Chris Barker wrote: > > As such, it _could_ play the role that pip+wheel (secondarily pypi) play > in > > the python ecosystem. > > In practice, it can't, as conda is entirely inappropriate as an input > format for yum/apt/enstaller/zc.buildout/pypm/MSI/etc. well, I'm making a strong distinction between a build system and a dependency management / install system. conda is not any kind of replacement for distutils / setuptools. (kind of how rpm doesn't replace configure and make. at all.) I'm still confused as to why pip plays as big a role in building as it seems to, but I guess I trust that there are reasons. Maybe it's only wheel that conda duplicates. But this is all irrelevent, because: Rather than being strictly technical, the reasons for this are mostly > political (and partially user experience related) Exactly. setuptools+pip+wheel is, and should be, the "official" python distribution system. When folks try anyway, > it mainly serves to alienate people using (or working on) other > integration platforms rather than achieving anything productive (hence > my comment about the "one package manager to rule them all" attitude > of some conda proponents, well, sorry if I've contributed to that -- but I guess for my part, there is a core frustration -- I have only so much time (not much), and I want to support multiple communities, -- and I simply can't do that without doing twice as much work. Duplication of effort may be inevitable, but it is still frustrating. That way, Python developers can focus on > learning one publication toolchain (anchored by pip & PyPI), while > users of integrated platforms can use the appropriate tools for their > platform. > That's all very nice, and it works great for packages that don't have any external dependencies, but if I'm trying to publish my python package, pip+wheel simply doesn't support what I need -- I can't use only one publication toolchain. 
And indeed, even if it did (with my vaporware better support for shared libs), that would be incompatible with conda, which does, in fact, support everything I need.

> conda doesn't bridge that gap for Python in the general case, as it is
> itself an integrator tool managed independently of the PSF

Well, that is a political/social issue.

> and designed to consume components from *multiple* language ecosystems and
> make them available to end users in a common format.

Not sure why that precludes it being used for Python -- Python somehow HAS to use a system that is designed only for Python? Why?

> Python's far too far down
> the distutils->setuptools->pip path to be readily amenable to
> alternatives

Agreed. This is the key point -- it's gotten a bit blended in with the technical issues, but that really is the key point. I'll shut up now :-) (at least about this)

-CHB

--

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice
7600 Sand Point Way NE (206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception

Chris.Barker at noaa.gov

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From donald at stufft.io Thu May 21 17:37:39 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 21 May 2015 11:37:39 -0400
Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)
In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au>
Message-ID: <2BD32EE5-31ED-4E81-ACB6-3A468AA328D0@stufft.io>

> On May 21, 2015, at 11:33 AM, Chris Barker wrote:
>
> On Wed, May 20, 2015 at 5:20 PM, Nick Coghlan wrote:
> Coping with this problem is also why injecting setuptools when running
> vanilla distutils projects is one of the secrets of pip's success:
>
> Ahh! THAT is the role pip plays in building. It's the way that you get setuptools features in a plain distutils-based package. So conda _could_ play the same trick, and inject setuptools into packages that don't use it already, but why bother -- pip does that for us.
>
> OK -- I'm going to try to find some time to play with this -- I do think it will solve some of the issues I've had, and if it works well, maybe we can move it toward a new standard of practice for conda-python-packages.
>
> Thanks -- clarity at last!

Also, one of the goals a few of us have in the PyPA is to move to a future where the build systems are pluggable. So one package could be built using setuptools, another using some SciPy-specific build tool, another using a whole other one. They will all ideally have some sort of generic interface that they need to work with, but using pip means you get the details of abstracting out to the different build tools handled for you, for "free". At least, in theory that's how it'll work :)

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From p.f.moore at gmail.com Thu May 21 19:33:12 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 21 May 2015 18:33:12 +0100 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: <2BD32EE5-31ED-4E81-ACB6-3A468AA328D0@stufft.io> References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> <2BD32EE5-31ED-4E81-ACB6-3A468AA328D0@stufft.io> Message-ID: On 21 May 2015 at 16:37, Donald Stufft wrote: > Also, one of the goals a few of us has in the PyPA is that we move to a > future where the build systems are pluggable. So one package could be > building using setuptools, another building using some SciPy specific build > tool, another using a whole other one. They will all ideally have some sort > of generic interface that they need to work with, but using pip means you > get the details of abstracting out to the different build tools handled for > you, for ?free?. At least, in theory that?s how it?ll work :) Note that this is a key to why wheel is important to this discussion. The "build interface" in pip is "pip wheel foo", which will (in the "pluggable build" future) run whatever build tool the project specifies, and produce as output a wheel. That wheel is then the input for any packaging systems that want to build their own formats. So, we have: 1. sdist: The source format for packages. 2. distutils/setuptools/bdist_wheel: The only major "build tool" currently in existence. Our goal is to seamlessly allow others to fit here. 3. pip wheel: The build tool interface designed to convert sdist->wheel using the appropriate build tool. 4. wheel: The built format for Python packages. Acts as a common target for build tools and as a common source for distribution package builders. Also directly installable via pip. 5. pip install . The canonical Python installer, taking wheels as input. Pip can also combine this whole sequence, and install direct from sdist (via wheel in the next version, currently by direct install from sdist), but that;s not the important point for this discussion. In terms of tool interoperability, therefore, there are a couple of places things can hook in. 1. Anything that can take an sdist and build a wheel can be treated as a "build tool". You could run the appropriate build tool for your package manually, but it's a goal for pip to provide a unified interface to that process. 2. Any installer can use wheels as the source for building its install packages. This frees package build processes from needing to deal with compiling Python extensions, packaging up Python sources, etc. We (the PyPA) haven't really done a particularly good job of articulating this design, not least because a lot of it is still ideas in progress, rather than concrete plans. And as a result, it's hard for tools like conda to clearly understand how they could fit into this stack. And of course, the backward compatibility pressures on any change in Python packaging causes things to go pretty slowly, meaning that projects like conda have an additional pressure to come up with a solution *right now* rather than waiting for a standard solution that frankly is currently vapourware. 
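To make that pipeline concrete, here is a minimal command-line walk-through using pip's existing commands. The package name is only an example; pip wheel will reuse a published wheel if one exists, or build one from the sdist otherwise, and the exact filename will vary by version and platform.

# Steps 1-4: obtain or build a wheel from whatever the project publishes
$ pip wheel --wheel-dir ./wheels requests

# The result is a reusable binary artifact
$ ls ./wheels
requests-2.7.0-py2.py3-none-any.whl

# Step 5: install from the wheel alone -- no build tool needed on the target
$ pip install --no-index --find-links ./wheels requests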
Ideally, the scientific community's experiences with building complex Python packages can help us to improve the wheel spec to ensure that it can better act as that universal binary format (for repackaging or direct installation). But that does require ongoing effort to make sure we understand where the wheel format falls short, and how we can fix those issues. Doing this without getting sucked into trying to solve problems that the wheel format is *not* intended to cover (packaging and distribution of non-Python code) is hard - particularly where we need to express dependencies on such things.

Paul.

From jjhelmus at gmail.com Thu May 21 20:12:47 2015
From: jjhelmus at gmail.com (Jonathan Helmus)
Date: Thu, 21 May 2015 13:12:47 -0500
Subject: [Distutils] Dynamic linking between Python modules
In-Reply-To: References: <2BD32EE5-31ED-4E81-ACB6-3A468AA328D0@stufft.io>
Message-ID: <555E201F.3090706@gmail.com>

On 05/21/2015 12:33 PM, Paul Moore wrote:
> 1. Anything that can take an sdist and build a wheel can be treated as
> a "build tool". You could run the appropriate build tool for your
> package manually, but it's a goal for pip to provide a unified
> interface to that process.

Paul,

Thanks for the wonderfully clear email; it helped solidify a number of questions I had on pip, wheels and build tools. From the section above it sounds like a tool that is capable of converting between conda packages and wheel files (and possibly other binary Python formats) could bridge the two systems and help root out where the wheel format is lacking. Combined with conda's build system, such a converter could serve as a pip pluggable "build tool" to take an sdist and produce a wheel using a workflow more familiar to the segment of the scientific Python community who use conda. Obviously there are a number of technical challenges that would need to be overcome (shared libraries, Python entry points, etc.) and I'm guessing some political issues. Nonetheless, I'd be willing to try to prototype such a tool if others think it would be of use.

Cheers,

- Jonathan Helmus

From chris.barker at noaa.gov Thu May 21 21:26:16 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Thu, 21 May 2015 12:26:16 -0700
Subject: [Distutils] Dynamic linking between Python modules
In-Reply-To: <555E201F.3090706@gmail.com> References: <2BD32EE5-31ED-4E81-ACB6-3A468AA328D0@stufft.io> <555E201F.3090706@gmail.com>
Message-ID:

On Thu, May 21, 2015 at 11:12 AM, Jonathan Helmus wrote:
> it sounds like a tool that is capable of converting between conda
> packages and wheel files

Converting from a wheel to a conda package should be very doable (and may already be done). But the other way around is not -- conda packages can hold a superset of what wheels can hold -- i.e. stuff outside of Python.

Now that I think about it, it may seem kludgy, but conda packages can be built from wheels right now, with no special tools. A conda recipe that simply installs a wheel in its build script would do just that.

I'm still a bit confused about the role of wheel here. Why build a wheel, just so you can go install it, rather than simply install the package directly? I'm not sure it matters in this context, but I don't get the point. Though now that I think about it, that's exactly what conda does -- if you want to install a package from a conda build recipe -- you build a conda package, and then install that. That may be required to support some stuff conda supports -- conda install can re-write paths to shared libs, for instance.
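A rough sketch of that "recipe that just installs a wheel" idea might look like the following. This is an illustration only, not an existing recipe: the wheel filename is invented, the wheel is assumed to have been copied into the recipe's work directory, and the script relies on the $PYTHON variable that conda-build exports into build scripts.

# build.sh for a hypothetical conda recipe that repackages a pre-built wheel
$PYTHON -m pip install --no-deps --no-index ./lxml-3.4.4-cp27-none-macosx_10_6_intel.whl
# conda-build then snapshots whatever this command installed into the build
# environment and turns it into the .tar.bz2 conda package.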
-Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Thu May 21 22:12:18 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 21 May 2015 21:12:18 +0100 Subject: [Distutils] Dynamic linking between Python modules In-Reply-To: References: <2BD32EE5-31ED-4E81-ACB6-3A468AA328D0@stufft.io> <555E201F.3090706@gmail.com> Message-ID: On 21 May 2015 at 20:26, Chris Barker wrote: > Now that I think about it, it may seem kludgy, but conda packages can be > built from wheel right now, with no special tools. A conda recipe that > simply installs a wheel in it's build script would do just that. That sounds about right, from what I've seen of conda builds. You could probably do better (for example, by just repacking the wheel rather than going through the whole wheel install process) but you don't have to. Some possible problem areas - when you install a wheel, it will install executable wrappers for the entry points (like pip.exe) which are tied to the install location. You'd need to deal with that. But presumably conda already has to deal with that because setuptools does precisely the same. > I'm still a bit confused about the role of wheel here. Why build a wheel, > just so you can go install it, rather than simply install the package > directly? Basically, because you can't "simply install". You may not have a compiler, or you may not have the required libraries, etc etc. Don't forget, you can build a wheel once, then install it anywhere that has a compatible Python installation. I'm surprised this isn't obvious to you, as isn't that precisely what conda does as well? Also, installing from a wheel is a *lot* faster than installing from sdist - people who deploy lots of packages to multiple servers or virtualenvs greatly appreciate the extra speed of a wheel install. That's why pure-python wheels are worth having, even though you could install from source. Also, so you don't need any install-time only requirements (e.g. setuptools) on the target production system. Generally, pretty much all of the reasons people don't compile all their software on their production machines :-) Paul From chris.barker at noaa.gov Fri May 22 00:49:14 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 21 May 2015 15:49:14 -0700 Subject: [Distutils] Dynamic linking between Python modules In-Reply-To: References: <2BD32EE5-31ED-4E81-ACB6-3A468AA328D0@stufft.io> <555E201F.3090706@gmail.com> Message-ID: On Thu, May 21, 2015 at 1:12 PM, Paul Moore wrote: > > built from wheel right now, with no special tools. A conda recipe that > > simply installs a wheel in it's build script would do just that. > > That sounds about right, from what I've seen of conda builds. You > could probably do better (for example, by just repacking the wheel > rather than going through the whole wheel install process) but then conda would need to understand wheel -- now it doesn't have to. ony pip or whatever has to understand wheel. > Some possible problem areas - when you install a wheel, it will > install executable wrappers for the entry points (like pip.exe) which > are tied to the install location. You'd need to deal with that. But > presumably conda already has to deal with that because setuptools does > precisely the same. 
> indeed conda build (or is it install -- not sure!) does do path re-writing, etc. > > I'm still a bit confused about the role of wheel here. Why build a wheel, > > just so you can go install it, rather than simply install the package > > directly? > > Basically, because you can't "simply install". You may not have a > compiler, or you may not have the required libraries, etc etc. > I'm thinking of the context of building a conda package -- or rpm, etc. -- not the general, "you are using pip as your package manager" case. And in that case, you want to build it to match your environemnt -- which may not be what a wheel onPyPi matches, for instance. So you have to build the package -- not sure what that wheel step buys you. But it doesn't cost much either, so why not? > Generally, pretty much all of the reasons people don't compile all > their software on their production machines :-) > right, but you're not running conda build on a production machine, either. -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From madewokherd at gmail.com Fri May 22 01:40:25 2015 From: madewokherd at gmail.com (Vincent Povirk) Date: Thu, 21 May 2015 18:40:25 -0500 Subject: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more) In-Reply-To: References: <87AE23BF-FEA1-4A03-83AC-34BD4A241DA9@stufft.io> <8538319ezj.fsf_-_@benfinney.id.au> <2BD32EE5-31ED-4E81-ACB6-3A468AA328D0@stufft.io> Message-ID: On Thu, May 21, 2015 at 12:33 PM, Paul Moore wrote: > Doing this without getting sucked into trying to > solve problems that the wheel format is *not* intended to cover > (packaging and distribution of non-Python code) is hard - particularly > where we need to express dependencies on such things. It's not there yet (especially on non-Windows platforms), but the plan is for OneGet to provide a consistent way to install things from different package managers/formats. From donald at stufft.io Fri May 22 06:21:19 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 22 May 2015 00:21:19 -0400 Subject: [Distutils] Released: pip 7.0 and virtualenv 13.0 Message-ID: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> Hey, I'm happy to say that I've just cut the releases of pip 7.0 and virtualenv 13.0 and I have uploaded them to PyPI. For the full list of changes go visit the respective changelogs, however the biggest change here is that in pip 7.0 when pip finds and downloads a sdist, instead of installing that sdist directly it will instead build a wheel of that and cache it locally. From then on out it will use that cached wheel to install instead of downloading and building the sdist each time. This can have a profound impact upon installation speed. For instance, taking a look at the popular lxml library: # Without a locally cached wheel $ time pip install lxml ... pip install lxml 36.00s user 1.40s system 98% cpu 38.117 total # The next time, with a primed cache. $ time pip install lxml ... pip install lxml 0.61s user 0.10s system 94% cpu 0.750 total Some important notes about this new feature: * If the wheel project is not installed, then this feature will be disabled, however get-pip.py and virtualenv both will now install wheel by default. 
* If attempting to actually *build* the wheel fails for any reason, it will fall back to the older method of simply installing the sdist directly. * If a project cannot be installed correctly from a wheel, but it can successfully build a wheel, you can disable using wheels for that project by adding the flag --no-binary project1,project2,project3 to tell pip not to use binaries for those projects. You can use the :all: psuedo identifier to disable all wheels. I'm pretty excited about this release, caching built wheels is going to result in a tremendous speedup for a lot of common cases. As with any big change there is a pretty good change that this will cause breakages for some percentage of projects as well as have bugs within the system itself. As always, if you find a bug please feel free to open an issue up on the pip issue tracker at https://github.com/pypa/pip/issues. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From wes.turner at gmail.com Fri May 22 07:34:08 2015 From: wes.turner at gmail.com (Wes Turner) Date: Fri, 22 May 2015 00:34:08 -0500 Subject: [Distutils] Released: pip 7.0 and virtualenv 13.0 In-Reply-To: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> References: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> Message-ID: Thanks! Someone (or I) could also update these: * https://github.com/conda/conda-recipes/blob/master/pip/meta.yaml * https://github.com/conda/conda-recipes/blob/master/virtualenv/meta.yaml * https://github.com/conda/conda-recipes/blob/master/virtualenvwrapper/meta.yaml On May 21, 2015 11:21 PM, "Donald Stufft" wrote: > Hey, > > I'm happy to say that I've just cut the releases of pip 7.0 and virtualenv > 13.0 > and I have uploaded them to PyPI. For the full list of changes go visit the > respective changelogs, however the biggest change here is that in pip 7.0 > when > pip finds and downloads a sdist, instead of installing that sdist directly > it will instead build a wheel of that and cache it locally. From then on > out > it will use that cached wheel to install instead of downloading and > building > the sdist each time. This can have a profound impact upon installation > speed. > > > For instance, taking a look at the popular lxml library: > > # Without a locally cached wheel > $ time pip install lxml > ... > pip install lxml 36.00s user 1.40s system 98% cpu 38.117 total > > # The next time, with a primed cache. > $ time pip install lxml > ... > pip install lxml 0.61s user 0.10s system 94% cpu 0.750 total > > > Some important notes about this new feature: > > * If the wheel project is not installed, then this feature will be > disabled, > however get-pip.py and virtualenv both will now install wheel by default. > > * If attempting to actually *build* the wheel fails for any reason, it will > fall back to the older method of simply installing the sdist directly. > > * If a project cannot be installed correctly from a wheel, but it can > successfully build a wheel, you can disable using wheels for that project > by adding the flag --no-binary project1,project2,project3 to tell pip > not to > use binaries for those projects. You can use the :all: psuedo identifier > to > disable all wheels. > > > I'm pretty excited about this release, caching built wheels is going to > result > in a tremendous speedup for a lot of common cases. 
As with any big change > there > is a pretty good change that this will cause breakages for some percentage > of > projects as well as have bugs within the system itself. As always, if you > find > a bug please feel free to open an issue up on the pip issue tracker at > https://github.com/pypa/pip/issues. > > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > https://mail.python.org/mailman/listinfo/distutils-sig > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Fri May 22 09:07:09 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 22 May 2015 08:07:09 +0100 Subject: [Distutils] Dynamic linking between Python modules In-Reply-To: References: <2BD32EE5-31ED-4E81-ACB6-3A468AA328D0@stufft.io> <555E201F.3090706@gmail.com> Message-ID: On 21 May 2015 at 23:49, Chris Barker wrote: > On Thu, May 21, 2015 at 1:12 PM, Paul Moore wrote: >> Some possible problem areas - when you install a wheel, it will >> install executable wrappers for the entry points (like pip.exe) which >> are tied to the install location. You'd need to deal with that. But >> presumably conda already has to deal with that because setuptools does >> precisely the same. > > indeed conda build (or is it install -- not sure!) does do path re-writing, > etc. Note that script wrappers are executables, not scripts, (some are exes with scripts alongside, but not all - distlib and pip embed the script in the exe). So simply editing scripts in .py files isn't enough. >> > I'm still a bit confused about the role of wheel here. Why build a >> > wheel, >> > just so you can go install it, rather than simply install the package >> > directly? >> >> Basically, because you can't "simply install". You may not have a >> compiler, or you may not have the required libraries, etc etc. > > I'm thinking of the context of building a conda package -- or rpm, etc. -- > not the general, "you are using pip as your package manager" case. And in > that case, you want to build it to match your environemnt -- which may not > be what a wheel onPyPi matches, for instance. > > So you have to build the package -- not sure what that wheel step buys you. > But it doesn't cost much either, so why not? It avoids the need to embed the knowledge of how to build the package in conda. Remember that "setup.py install" is not necessarily the only way a package might be built once we get to the pluggable build state. >> Generally, pretty much all of the reasons people don't compile all >> their software on their production machines :-) > > right, but you're not running conda build on a production machine, either. I was answering your question "why install from wheels rather than just installing directly". Now that I understand that you're saying "... in the context of building conda packages" the answer is slightly different: - To prepare for a "pluggable build" situation. - To integrate better with the rest of the Python packaging world. - To let people without compilers, but with access to binary wheels, build their own conda packages should the official ones not exist. (possibly others, that's just off the top of my head). 
Paul From p.f.moore at gmail.com Fri May 22 09:10:22 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 22 May 2015 08:10:22 +0100 Subject: [Distutils] Released: pip 7.0 and virtualenv 13.0 In-Reply-To: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> References: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> Message-ID: On 22 May 2015 at 05:21, Donald Stufft wrote: > I'm happy to say that I've just cut the releases of pip 7.0 and virtualenv 13.0 > and I have uploaded them to PyPI. For the full list of changes go visit the > respective changelogs, however the biggest change here is that in pip 7.0 when > pip finds and downloads a sdist, instead of installing that sdist directly > it will instead build a wheel of that and cache it locally. From then on out > it will use that cached wheel to install instead of downloading and building > the sdist each time. This can have a profound impact upon installation speed. Nice! I've been looking forward to this for ages. Thanks to all who worked on the change - in particular Robert Collins who did the bulk of the implementation. Paul From wes.turner at gmail.com Fri May 22 10:13:12 2015 From: wes.turner at gmail.com (Wes Turner) Date: Fri, 22 May 2015 03:13:12 -0500 Subject: [Distutils] Released: pip 7.0 and virtualenv 13.0 In-Reply-To: References: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> Message-ID: On Fri, May 22, 2015 at 12:34 AM, Wes Turner wrote: > Thanks! > > Someone (or I) could also update these: > * https://github.com/conda/conda-recipes/blob/master/pip/meta.yaml > * https://github.com/conda/conda-recipes/blob/master/virtualenv/meta.yaml > > * > https://github.com/conda/conda-recipes/blob/master/virtualenvwrapper/meta.yaml > Here's this start at updating conda-recipes with the latest PyPA + [virtualenvwrapper, peep] packages https://github.com/conda/conda-recipes/compare/master...westurner:pip_7.0.0 * pip 7.0.0 * virtualenv 13.0 * wheel 0.24.0 * virtualenvwrapper 4.5.1 * peep 2.4.1 Blocking: * BUG: pip 7.0.0 errors with """OSError: [Errno 17] File exists: '${CONDAENV}/_build/lib/python2.7/site-packages/pip/_vendor'""" Thanks again! > On May 21, 2015 11:21 PM, "Donald Stufft" wrote: > >> Hey, >> >> I'm happy to say that I've just cut the releases of pip 7.0 and >> virtualenv 13.0 >> and I have uploaded them to PyPI. For the full list of changes go visit >> the >> respective changelogs, however the biggest change here is that in pip 7.0 >> when >> pip finds and downloads a sdist, instead of installing that sdist directly >> it will instead build a wheel of that and cache it locally. From then on >> out >> it will use that cached wheel to install instead of downloading and >> building >> the sdist each time. This can have a profound impact upon installation >> speed. >> >> >> For instance, taking a look at the popular lxml library: >> >> # Without a locally cached wheel >> $ time pip install lxml >> ... >> pip install lxml 36.00s user 1.40s system 98% cpu 38.117 total >> >> # The next time, with a primed cache. >> $ time pip install lxml >> ... >> pip install lxml 0.61s user 0.10s system 94% cpu 0.750 total >> >> >> Some important notes about this new feature: >> >> * If the wheel project is not installed, then this feature will be >> disabled, >> however get-pip.py and virtualenv both will now install wheel by >> default. >> >> * If attempting to actually *build* the wheel fails for any reason, it >> will >> fall back to the older method of simply installing the sdist directly. 
>> >> * If a project cannot be installed correctly from a wheel, but it can >> successfully build a wheel, you can disable using wheels for that >> project >> by adding the flag --no-binary project1,project2,project3 to tell pip >> not to >> use binaries for those projects. You can use the :all: psuedo >> identifier to >> disable all wheels. >> >> >> I'm pretty excited about this release, caching built wheels is going to >> result >> in a tremendous speedup for a lot of common cases. As with any big change >> there >> is a pretty good change that this will cause breakages for some >> percentage of >> projects as well as have bugs within the system itself. As always, if you >> find >> a bug please feel free to open an issue up on the pip issue tracker at >> https://github.com/pypa/pip/issues. >> >> --- >> Donald Stufft >> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA >> >> >> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> https://mail.python.org/mailman/listinfo/distutils-sig >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Fri May 22 11:22:34 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 22 May 2015 19:22:34 +1000 Subject: [Distutils] Released: pip 7.0 and virtualenv 13.0 In-Reply-To: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> References: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> Message-ID: On 22 May 2015 14:21, "Donald Stufft" wrote: > > Hey, > > I'm happy to say that I've just cut the releases of pip 7.0 and virtualenv 13.0 > and I have uploaded them to PyPI. For the full list of changes go visit the > respective changelogs, however the biggest change here is that in pip 7.0 when > pip finds and downloads a sdist, instead of installing that sdist directly > it will instead build a wheel of that and cache it locally. From then on out > it will use that cached wheel to install instead of downloading and building > the sdist each time. This can have a profound impact upon installation speed. Huzzah! Great work folks :) Cheers, Nick. -------------- next part -------------- An HTML attachment was scrubbed... URL: From olivier.grisel at ensta.org Fri May 22 13:10:52 2015 From: olivier.grisel at ensta.org (Olivier Grisel) Date: Fri, 22 May 2015 12:10:52 +0100 Subject: [Distutils] Released: pip 7.0 and virtualenv 13.0 In-Reply-To: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> References: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> Message-ID: Congrats on the release, wheel caching for sdist is a great new feature! -- Olivier From tseaver at palladion.com Fri May 22 13:59:46 2015 From: tseaver at palladion.com (Tres Seaver) Date: Fri, 22 May 2015 07:59:46 -0400 Subject: [Distutils] Released: pip 7.0 and virtualenv 13.0 In-Reply-To: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> References: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> Message-ID: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On 05/22/2015 12:21 AM, Donald Stufft wrote: > I'm happy to say that I've just cut the releases of pip 7.0 and > virtualenv 13.0 and I have uploaded them to PyPI. For the full list of > changes go visit the respective changelogs, however the biggest change > here is that in pip 7.0 when pip finds and downloads a sdist, instead > of installing that sdist directly it will instead build a wheel of > that and cache it locally. From then on out it will use that cached > wheel to install instead of downloading and building the sdist each > time. 
This can have a profound impact upon installation speed. Nice! I was hoping that this release would make ;tox -e jython` finally work: it's closer, but fails now due to the fact that the '.tox/jython/bin/pip' script isn't set to executable. I don't know whether that is an issue with the pip bundled in Jython 2.7.0, or some other weirdness. Tres. - -- =================================================================== Tres Seaver +1 540-429-0999 tseaver at palladion.com Palladion Software "Excellence by Design" http://palladion.com -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.4.11 (GNU/Linux) iQIcBAEBAgAGBQJVXxoyAAoJEPKpaDSJE9HYhjUP/RADiv/qM2xRg8UednYfxLub 1AUQk+GRYR4B1zbcNu/f8tcXeKePde7QaQ3si1EE123CBQxk85xqMH/2oYFqwCsw Dxb7QuQdHhhuImxpM4SIVJmb0Tm8YqYIkDNH0NAuRVWkgaWQDfQgHVYlNQWxog/e 8hEnEWs3HjF+9+7GcTL7ny+l6CAmlh32zJaPQi0j+hzYbS54iShYzcWzlkSltFmU N4LLIlel9SWyX0WclpeuP4hhygpPlkqAoMPmh73m1/74jk0My8qXC2UY5KwU/Iar ugK3KOTAohpeTvbgK9wdIzOuqYoX6k0IgAHFejcbXWYoDNFUvS7HAqCoyCHA7JJJ 9SZsWRf8rCfaGEzWrAR3PVOMGmpLO4DuQBZZ3sypIRC2stTOtXzQc5gg0CjvHZjH DVmjXF0/WBcngAnUdkDjIwZTxvQuR0NJEV7H6ODMNwO+UaBGTCojih1JDVG+GYW6 2uXUSAWBlX50zBph6U2GbouAwUbtzTlb8LWFdj/GIQ2UvNwoF4f84EougriXFAnl ODkPxjHt0l9N8WIGfq2v32aHl+VsNgRMptZMV38fGxCSOGM6Eo/eUQgY2Qfcx8g7 wOUHcQRyTAs8AT0Dy++eOZNrdndvQFiYKUMIUzJJaeHuEbGaeKruiOva7/+65GNr N90Y/+cw/feYl/eIOeP4 =Me4+ -----END PGP SIGNATURE----- From randy at thesyrings.us Tue May 26 21:02:58 2015 From: randy at thesyrings.us (Randy Syring) Date: Tue, 26 May 2015 15:02:58 -0400 Subject: [Distutils] Released: pip 7.0 and virtualenv 13.0 In-Reply-To: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> References: <0A947614-178E-4385-8C3D-36AD165CF687@stufft.io> Message-ID: <5564C362.7000105@thesyrings.us> On 05/22/2015 12:21 AM, Donald Stufft wrote: > From then on out > it will use that cached wheel to install instead of downloading and building > the sdist each time. Soooo great! Thanks to the pip team for their diligence! *Randy Syring* Husband | Father | Redeemed Sinner /"For what does it profit a man to gain the whole world and forfeit his soul?" (Mark 8:36 ESV)/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jrm at exa.com Wed May 27 22:18:40 2015 From: jrm at exa.com (jrm) Date: Wed, 27 May 2015 16:18:40 -0400 Subject: [Distutils] Embedding python (and added goodies) in a fundamentally relocatable product distribution Message-ID: <556626A0.3000906@exa.com> I work for a company that ships software products containing ready-to-run python The products contain both a command line python and embedded python used to script complex applications. Our distributions are *fundamentally relocatable*. What I mean by this, is that our install is nothing but an archive extraction. Location information is not modified or patched in anywhere in the tree of files representing our distribution. Our products can be extracted on an NFS or SMB share - mounted differently on various client systems - and run fine everywhere. To distinguish our approach from static/install-time relocation - I sometimes call it a just in time install. We would like to improve the capability of our installed python, to include some of the typical scientific and engineering extensions (numpy, scipy, and so on). Growing weary (and wary) of doing all of our own builds for various third party software - I've begun looking at alternatives that collect a compatible set of python and extensions together. 
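One common way to get that kind of "just in time install" behaviour is a launcher that derives every path from its own location at run time instead of having locations patched in at install time. A minimal sketch follows, with an invented directory layout; this is not Exa's actual mechanism.

#!/bin/sh
# run-app.sh -- sits at the top of the extracted tree, wherever that happens to be
HERE="$(cd "$(dirname "$0")" && pwd)"
export PYTHONHOME="$HERE/python"                 # bundled interpreter tree (assumed layout)
export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
exec "$HERE/python/bin/python" "$@"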
The most capable and easy to get collection of python - with scientific support - appears to be Conda. It's extremely good at pulling together a collection of extensions that play nicely together - and has an appropriately open license. Unfortunately, a Conda distribution must be statically located before use. Other distribution methods I've seen are similarly limited. The Conda distribution is also a bit of a space hog - using hard links that would expand to duplicate copies of files if contained within a distribution such as ours. I can imagine ways to patch up the Conda distribution with symbolic links and smarter start scripts - but I was wondering if any fundamentally better stuff was out there. Any ideas? -jrm James Mason Exa Corp Burlington, MA -------------- next part -------------- An HTML attachment was scrubbed... URL: From guy at rzn.co.il Wed May 27 23:35:36 2015 From: guy at rzn.co.il (Guy Rozendorn) Date: Wed, 27 May 2015 14:35:36 -0700 (PDT) Subject: [Distutils] Embedding python (and added goodies) in a fundamentally relocatable product distribution In-Reply-To: <556626A0.3000906@exa.com> References: <556626A0.3000906@exa.com> Message-ID: <1432762536060.58ce0ae1@Nodemailer> We are doing similar things at Infinidat.You should check ?infi.recipe.application_packager at github.com/Infinidat for building and shipping standalone Python projects, either in platform-specific packages (rpm, MSI, etc) or as standalone executables. ? Sent from Mailbox On Thu, May 28, 2015 at 12:10 AM, jrm wrote: > I work for a company that ships software products containing > ready-to-run python > The products contain both a command line python and embedded python used > to script complex applications. > Our distributions are *fundamentally relocatable*. What I mean by > this, is that our install is nothing but an archive extraction. > Location information is not modified or patched in anywhere in the tree > of files representing our distribution. Our products can be extracted > on an NFS or SMB share - mounted differently on various client systems - > and run fine everywhere. To distinguish our approach from > static/install-time relocation - I sometimes call it a just in time install. > We would like to improve the capability of our installed python, to > include some of the typical scientific and engineering extensions > (numpy, scipy, and so on). > Growing weary (and wary) of doing all of our own builds for various > third party software - I've begun looking at alternatives that collect a > compatible set of python and extensions together. > The most capable and easy to get collection of python - with scientific > support - appears to be Conda. It's extremely good at pulling together > a collection of extensions that play nicely together - and has an > appropriately open license. > Unfortunately, a Conda distribution must be statically located before > use. Other distribution methods I've seen are similarly limited. The > Conda distribution is also a bit of a space hog - using hard links that > would expand to duplicate copies of files if contained within a > distribution such as ours. > I can imagine ways to patch up the Conda distribution with symbolic > links and smarter start scripts - but I was wondering if any > fundamentally better stuff was out there. > Any ideas? > -jrm > James Mason > Exa Corp > Burlington, MA -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From domen at dev.si Thu May 28 14:02:20 2015 From: domen at dev.si (=?UTF-8?B?RG9tZW4gS2/FvmFy?=) Date: Thu, 28 May 2015 14:02:20 +0200 Subject: [Distutils] Embedding python (and added goodies) in a fundamentally relocatable product distribution Message-ID: You could achieve that using Nix package manager[0] (I won't go into a long story about Conda vs Nix). Once you build python interpreter including all dependencies using buildPythonPackage.buildEnv[1], you'll get a closure (runtime dependency graph[2]) that can be copied to any system with the same architecture that it was built on. The binary doesn't require anything to be installed separately (even not Nix). We have a number of Python packages[6], but there is an on-going effort (lot's of work to be done) to package the whole PyPI (or at least have the capability to do so) with system dependencies. Currently a plain Python interpreter with all runtime dependencies is about 250MB, but this could be reduced down to 50MB using multiple outputs[3] - meaning each package is split into /lib and everything else and only /lib is used for runtime dependency graph. There is a Pull Request open[4] to address the closure size, but it's not yet ready to be merged (I expect that to happen in 2015). If you have more questions, you're welcome to join nix-dev[5] mailing list or come to #nixos channel on Freenode IRC network. Domen [0] http://nixos.org/nix [1] http://nixos.org/nixpkgs/manual/#python-build-env [2] http://static.domenkozar.com/slides/pycon-us-2015/#/5 [3] http://static.domenkozar.com/slides/pycon-us-2015/#/24 [4] https://github.com/NixOS/nixpkgs/pull/7701 [5] http://lists.science.uu.nl/mailman/listinfo/nix-dev [6] http://nixos.org/nixos/packages.html -------------- next part -------------- An HTML attachment was scrubbed... URL: From phodges at id.iit.edu Thu May 28 23:49:40 2015 From: phodges at id.iit.edu (Peter Hodges) Date: Thu, 28 May 2015 16:49:40 -0500 Subject: [Distutils] setuptools Message-ID: Newbie to Python Goal try to use NLTK Mac OSX 10 I downloaded and installed python 3.5 Next I went to the NLTK site. It has instructions about easy_install and setup tools. I went to the Linux and Mac OS line: curl https://bootstrap.pypa.io/ez_setup.py -o - | python I entered this in a terminal window: curl https://bootstrap.pypa.io/ez_setup.py -o - | python % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 100 11432 100 11432 0 0 49709 0 --:--:-- --:--:-- --:--:-- 49921 Downloading https://pypi.python.org/packages/source/s/setuptools/setuptools-16.0.zip Extracting in /var/folders/fy/c6gf5sxx7px6p0tyj9ks3dw80000gn/T/tmpI86dYV Now working in /var/folders/fy/c6gf5sxx7px6p0tyj9ks3dw80000gn/T/tmpI86dYV/setuptools-16.0 Installing Setuptools running install Checking .pth file support in /Library/Python/2.7/site-packages/ error: can't create or remove files in install directory The following error occurred while trying to add or remove files in the installation directory: [Errno 13] Permission denied: '/Library/Python/2.7/site-packages/test-easy-install-702.pth' The installation directory you specified (via --install-dir, --prefix, or the distutils default setting) was: /Library/Python/2.7/site-packages/ Perhaps your account does not have write access to this directory? If the installation directory is a system-owned directory, you may need to sign in as the administrator or "root" account. 
If you do not have administrative
access to this machine, you may wish to choose a different installation
directory, preferably one that is listed in your PYTHONPATH environment
variable.

For information on other options, you may wish to consult the
documentation at:

https://pythonhosted.org/setuptools/easy_install.html

Please make the appropriate changes for your system and try again.

Something went wrong during the installation.
See the error message above.

I looked at the PATH variable:

dhcp127-219:~ phodges$ echo $PATH
/Library/Frameworks/Python.framework/Versions/3.5/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin
dhcp127-219:~ phodges$

I would appreciate any suggestions on how to proceed.
Thanks,
Peter
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From p.f.moore at gmail.com  Fri May 29 12:44:33 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 29 May 2015 11:44:33 +0100
Subject: [Distutils] setuptools
In-Reply-To: 
References: 
Message-ID: 

On 28 May 2015 at 22:49, Peter Hodges wrote:
> I would appreciate any suggestions on how to proceed.

If you downloaded Python 3.5, you should have pip available to you (I
use Windows, so I'm not 100% sure how the OSX installers handle it, but
it should be there). With pip, you should be able to simply run
"pip install nltk" and be up and running.

Paul

From jeads442 at gmail.com  Fri May 29 15:04:26 2015
From: jeads442 at gmail.com (Jason Eads)
Date: Fri, 29 May 2015 06:04:26 -0700
Subject: [Distutils] load module issue and fix
Message-ID: 

Hello,

Terje and I tweaked this package to correct this error:

IOError: [Errno 2] No such file or directory: 'README.md'

Added MANIFEST.in and migrated README.md and LICENSE from the github
project to fix an issue with an error during package install, with the
solution mentioned here:
https://github.com/wagnerrp/pytmdb3/issues/59

We tried to submit this to PyPI but were denied permission to change 'load'.

- Jason R. Eads
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From pmiscml at gmail.com  Fri May 29 16:23:16 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Fri, 29 May 2015 17:23:16 +0300
Subject: [Distutils] [Python-Dev] Single-file Python executables (including case of self-sufficient package manager)
In-Reply-To: 
References: <20150529083601.GK932@ando.pearwood.info>
Message-ID: <20150529172316.516fce6c@x230>

Hello,

On Fri, 29 May 2015 08:35:44 -0400
Donald Stufft wrote:

[]
> Another example is one that I personally worked on recently, where
> the company I worked for wanted to distribute a CLI to our customers
> which would "just work" that they could use to interact with the
[]
> particular piece I was feeling like I should be suggesting to my
> manager that we throw away the Python code and just write it in Go.

Please consider next time thinking about MicroPython for this usecase,
as that's exactly why there're people who think that MicroPython is
interesting for "desktop" systems, not just bare-metal microcontrollers.
There're too few such people so far, unfortunately, so progress is slow.

> An example of a product that does this is Chef, they install their
> own Ruby and everything but libc into /opt/chef to completely isolate
> themselves from the host system. I'm told this made things *much*
> easier for them as they don't really have to worry at all about
> what's available on the host system, Chef pretty much just works.
[]
> As folks may or may not know, I'm heavily involved in pip which is
> probably one of the most widely used CLIs written in Python. A single
> file executable won't help pip, however through my experience there I

It's interesting you bring up this case of pip (and Chef), as I had
similar concerns/issues when developing a self-hosted package manager
for MicroPython. MicroPython doesn't come out of the box with a standard
library - beyond a few builtin modules (the "core" library), every other
module/package needs to be installed individually (micropython-* modules
on PyPI). That makes a package manager a critical component, and means
that the package manager itself cannot rely on the standard library - on
its presence, absence, or specific version (or on the contents of
standard modules).

My initial idea was to write a single-file script, but we're not ready
for self-sufficiency yet anyway (no SSL support, have to rely on wget),
and it's a bit of a chore anyway. So instead, I made a semi-automated
"library subset package" (e.g. os renamed to upip_os), and all such
modules come packaged together with the main script:
https://github.com/micropython/micropython-lib/tree/master/upip ,
https://pypi.python.org/pypi/micropython-upip .

Then we have a script to bootstrap upip:
https://github.com/micropython/micropython/blob/master/tools/bootstrap_upip.sh ,
after which any package can be installed using upip proper.

--
Best regards,
Paul                          mailto:pmiscml at gmail.com

From jp at jamezpolley.com  Sat May 30 00:33:15 2015
From: jp at jamezpolley.com (James Polley)
Date: Fri, 29 May 2015 22:33:15 +0000
Subject: [Distutils] setuptools
In-Reply-To: 
References: 
Message-ID: 

After a quick spot of research, I'm assuming you got to this point by
reading http://www.nltk.org/install.html and following the link in step 1
to https://pypi.python.org/pypi/setuptools

If that's the case, the answer to your questions is in the document you
read. Under the "Unix and Mac OS" heading it tells you to "follow the wget
instructions but replace wget with curl"

If you follow the wget instructions, they note that you may get this
error, and tell you how to proceed:

Note that you will may need to invoke the command with superuser
privileges to install to the system Python:

> wget https://bootstrap.pypa.io/ez_setup.py -O - | sudo python

But don't do that. Those instructions are intended to work anywhere, from
a very old system that has a very very old version of python right through
to a system that already has everything installed.

You should be able to just go back to http://www.nltk.org/install.html and
start from step 3:

Install Numpy (optional): run sudo pip install -U numpy

On Fri, May 29, 2015 at 8:35 PM Peter Hodges wrote:

> Newbie to Python
> Goal try to use NLTK
>
> Mac OSX 10
> I downloaded and installed python 3.5
>
> Next I went to the NLTK site.
> It has instructions about easy_install and setup tools.
>
> I went to the Linux and Mac OS line:
>
> curl https://bootstrap.pypa.io/ez_setup.py -o - | python
>
>
> I entered this in a terminal window:
>
>
> curl https://bootstrap.pypa.io/ez_setup.py -o - | python
>
> % Total % Received % Xferd Average Speed Time Time Time
> Current
>
> Dload Upload Total Spent Left
> Speed
>
> 100 11432 100 11432 0 0 49709 0 --:--:-- --:--:-- --:--:--
> 49921
>
> Downloading
> https://pypi.python.org/packages/source/s/setuptools/setuptools-16.0.zip
>
> Extracting in /var/folders/fy/c6gf5sxx7px6p0tyj9ks3dw80000gn/T/tmpI86dYV
>
> Now working in
> /var/folders/fy/c6gf5sxx7px6p0tyj9ks3dw80000gn/T/tmpI86dYV/setuptools-16.0
>
> Installing Setuptools
>
> running install
>
> Checking .pth file support in /Library/Python/2.7/site-packages/
>
> error: can't create or remove files in install directory
>
>
> The following error occurred while trying to add or remove files in the
>
> installation directory:
>
>
> [Errno 13] Permission denied:
> '/Library/Python/2.7/site-packages/test-easy-install-702.pth'
>
> The installation directory you specified (via --install-dir, --prefix, or
>
> the distutils default setting) was:
>
>
> /Library/Python/2.7/site-packages/
>
>
> Perhaps your account does not have write access to this directory? If the
>
> installation directory is a system-owned directory, you may need to sign in
>
> as the administrator or "root" account. If you do not have administrative
>
> access to this machine, you may wish to choose a different installation
>
> directory, preferably one that is listed in your PYTHONPATH environment
>
> variable.
>
>
> For information on other options, you may wish to consult the
>
> documentation at:
>
>
> https://pythonhosted.org/setuptools/easy_install.html
>
>
> Please make the appropriate changes for your system and try again.
>
>
> Something went wrong during the installation.
>
> See the error message above.
>
>
> I looked at the PATH variable:
>
>
> dhcp127-219:~ phodges$ echo $PATH
>
> /Library/Frameworks/Python.framework/Versions/3.5/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin
>
> dhcp127-219:~ phodges$
>
> I would appreciate any suggestions on how to proceed.
> Thanks,
> Peter
> _______________________________________________
> Distutils-SIG maillist - Distutils-SIG at python.org
> https://mail.python.org/mailman/listinfo/distutils-sig
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From weegreenblobbie at yahoo.com  Sat May 30 01:27:18 2015
From: weegreenblobbie at yahoo.com (Nick Hilton)
Date: Fri, 29 May 2015 23:27:18 +0000 (UTC)
Subject: [Distutils] Cross Platform PIP install
Message-ID: <1453004677.1466147.1432942038365.JavaMail.yahoo@mail.yahoo.com>

Hello list! This is my first message to the list so I apologize in advance
if this was covered somewhere else but I'm having trouble figuring it out.

I have a C++ Python extension module that I configure and build. My goal
is to have the source package in the PyPI to simplify how users install my
package. Ultimately I wish it to be as simple as:

$ pip install nsound --user

How does one achieve this?

The way I currently build my Python extension is by using my build system
to generate setup.py, then the usual python setup.py install --user.

Are .whl files the way to go? How do I create nsound .whl for multiple
platforms? Can I have multiple flavors of packages in PyPI? I'm
considering having multiple Python extensions, some dynamically linked
with the Jack Audio Connection Kit (JACK), and others not. Is this
possible?

Any advice or links to pages would be much appreciated! :)

Nick
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
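One way to read the questions above: a single setup.py can cover both "flavors" by toggling the optional JACK link at build time, and running "python setup.py bdist_wheel" (with the wheel package installed) on each platform then produces a platform-tagged .whl that pip can install without a compiler on the user's machine, with the sdist as the fallback. The module name, source list and environment variable in this sketch are assumptions for illustration, not Nsound's real build:

# setup.py - hypothetical sketch of an extension with an optional JACK link
import os
from setuptools import setup, Extension

with_jack = os.environ.get("NSOUND_WITH_JACK") == "1"   # made-up build-time switch

ext = Extension(
    "_nsound",                               # assumed extension module name
    sources=["src/nsound_module.cpp"],       # assumed source layout
    libraries=["jack"] if with_jack else [],
    define_macros=[("NSOUND_WITH_JACK", "1")] if with_jack else [],
)

setup(
    name="nsound",
    version="0.0.0",
    ext_modules=[ext],
)

Each wheel built this way is specific to one platform and Python version, which is why one .whl per platform (plus a source distribution as a fallback) is the usual answer to the cross-platform question.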
From andrea at andreabedini.com  Sat May 30 04:38:23 2015
From: andrea at andreabedini.com (Andrea Bedini)
Date: Sat, 30 May 2015 12:38:23 +1000
Subject: [Distutils] setuptools
In-Reply-To: 
References: 
Message-ID: 

Hi Peter, my take is as follows.

On OS X, install python 3.5 using brew.sh (brew install python3 --devel)
then you are good to go. It will come with pip and it will install packages
in a folder you have write access to, so pip3 install nltk will work.

PS: a simple explanation of your error is that, apparently, `python` is
your system python (not the python you installed) and you don't have write
access to /Library/Python/2.7/site-packages/. You could solve this issue
but I recommend the brew way, it's the setup I use all the time. More
details on brew+python are here
https://github.com/Homebrew/homebrew/blob/master/share/doc/homebrew/Homebrew-and-Python.md

Andrea

> On 29 May 2015, at 7:49 am, Peter Hodges wrote:
>
> Newbie to Python
> Goal try to use NLTK
>
> Mac OSX 10
> I downloaded and installed python 3.5
>
> Next I went to the NLTK site.
> It has instructions about easy_install and setup tools.
>
> I went to the Linux and Mac OS line:
>
> curl https://bootstrap.pypa.io/ez_setup.py -o - | python
>
> I entered this in a terminal window:
>
> curl https://bootstrap.pypa.io/ez_setup.py -o - | python
> % Total % Received % Xferd Average Speed Time Time Time Current
> Dload Upload Total Spent Left Speed
> 100 11432 100 11432 0 0 49709 0 --:--:-- --:--:-- --:--:-- 49921
> Downloading https://pypi.python.org/packages/source/s/setuptools/setuptools-16.0.zip
> Extracting in /var/folders/fy/c6gf5sxx7px6p0tyj9ks3dw80000gn/T/tmpI86dYV
> Now working in /var/folders/fy/c6gf5sxx7px6p0tyj9ks3dw80000gn/T/tmpI86dYV/setuptools-16.0
> Installing Setuptools
> running install
> Checking .pth file support in /Library/Python/2.7/site-packages/
> error: can't create or remove files in install directory
>
> The following error occurred while trying to add or remove files in the
> installation directory:
>
> [Errno 13] Permission denied: '/Library/Python/2.7/site-packages/test-easy-install-702.pth'
>
> The installation directory you specified (via --install-dir, --prefix, or
> the distutils default setting) was:
>
> /Library/Python/2.7/site-packages/
>
> Perhaps your account does not have write access to this directory? If the
> installation directory is a system-owned directory, you may need to sign in
> as the administrator or "root" account. If you do not have administrative
> access to this machine, you may wish to choose a different installation
> directory, preferably one that is listed in your PYTHONPATH environment
> variable.
>
> For information on other options, you may wish to consult the
> documentation at:
>
> https://pythonhosted.org/setuptools/easy_install.html
>
> Please make the appropriate changes for your system and try again.
>
> Something went wrong during the installation.
> See the error message above.
>
> I looked at the PATH variable:
>
> dhcp127-219:~ phodges$ echo $PATH
> /Library/Frameworks/Python.framework/Versions/3.5/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin
> dhcp127-219:~ phodges$
>
> I would appreciate any suggestions on how to proceed.
> Thanks,
> Peter
> _______________________________________________
> Distutils-SIG maillist - Distutils-SIG at python.org
> https://mail.python.org/mailman/listinfo/distutils-sig

--
Andrea Bedini
@andreabedini, http://www.andreabedini.com

use https://keybase.io/andreabedini to send me private messages
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From chris.barker at noaa.gov  Sat May 30 05:43:58 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Fri, 29 May 2015 20:43:58 -0700
Subject: [Distutils] [Python-Dev] Single-file Python executables (including case of self-sufficient package manager)
In-Reply-To: <20150529172316.516fce6c@x230>
References: <20150529083601.GK932@ando.pearwood.info> <20150529172316.516fce6c@x230>
Message-ID: 

On Fri, May 29, 2015 at 7:23 AM, Paul Sokolovsky wrote:

> > An example of a product that does this is Chef, they install their
> > own Ruby and everything but libc into /opt/chef to completely isolate
> > themselves from the host system.

this sounds a bit like what conda does -- install miniconda, and a conda
environment set up with a yaml file, and away you go. not small, but quite
self contained, and gives you exactly what you want.

-CHB

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ncoghlan at gmail.com  Sat May 30 12:31:46 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 30 May 2015 20:31:46 +1000
Subject: [Distutils] setuptools
In-Reply-To: 
References: 
Message-ID: 

On 30 May 2015 at 08:33, James Polley wrote:
> After a quick spot of research, I'm assuming you got to this point by
> reading http://www.nltk.org/install.html

I filed https://github.com/nltk/nltk/issues/994 to suggest to the
NLTK folks that they may want to update that page, if anyone would
like to follow through with an actual pull request.

Cheers,
Nick.

--
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ajfriend at gmail.com  Sun May 31 23:07:01 2015
From: ajfriend at gmail.com (AJ Friend)
Date: Sun, 31 May 2015 14:07:01 -0700
Subject: [Distutils] pip can't find header file for extension module, but `python setup.py install` works fine
Message-ID: 

Hi,

I'm trying to write a new `setup.py` file for an extension module to wrap
a C library (https://github.com/cvxgrp/scs). The current `setup.py` file
imports numpy. I'm trying to delay that import statement until setuptools
has a chance to install numpy if it's not already installed. I'm trying to
do that with this bit of code:

from setuptools.command.build_ext import build_ext as _build_ext

class build_ext(_build_ext):
    def finalize_options(self):
        _build_ext.finalize_options(self)
        # Prevent numpy from thinking it is still in its setup process:
        __builtins__.__NUMPY_SETUP__ = False
        import numpy
        self.include_dirs += ext['include_dirs'] + [numpy.get_include()]

Running `python setup.py install` seems to work fine on my OSX machine,
but when I run `pip install .` in the directory with `setup.py`, I get a
clang error that it can't find one of the header files. Any idea why that
would be happening? Could it have anything to do with the relative path
I'm giving for the include directories?

Also, I had trouble finding good documentation on subclassing build_ext.
Does anyone know if setting self.include_dirs overwrites or appends to the
include_dirs attribute of an Extension object defined later in setup.py?

For the curious, my current attempt at setup.py is at
https://github.com/ajfriend/scs/blob/setup2/python/setup.py. The original
can be found in the same directory.

More generally, since I'm new to python packaging, I'm not sure how well or
correctly I've written my `setup.py` file. Any feedback on doing things
correctly would be appreciated.

Thanks,
AJ
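For reference, a commonly seen shape for this deferred numpy import is sketched below; the package and path names are placeholders rather than the actual scs layout. In distutils, the command-level self.include_dirs is handed to the compiler in addition to each Extension's own include_dirs, so appending there adds search paths rather than overwriting the Extension's list. A frequent cause of the "setup.py install works, pip install . fails" symptom is an include path that points outside the directory containing setup.py, because pip copies that directory elsewhere before building.

# setup.py - sketch of the deferred-numpy-import pattern (placeholder names)
from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext as _build_ext

class build_ext(_build_ext):
    def finalize_options(self):
        _build_ext.finalize_options(self)
        # numpy is importable here because setup_requires has fetched it by now
        import numpy
        self.include_dirs.append(numpy.get_include())

setup(
    name="example-extension",
    version="0.0.0",
    setup_requires=["numpy"],
    install_requires=["numpy"],
    cmdclass={"build_ext": build_ext},
    ext_modules=[
        Extension(
            "_example",
            sources=["src/example.c"],
            include_dirs=["include"],   # keep include paths inside the setup.py directory
        )
    ],
)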