[Distutils] Making pip and PyPI work with conda packages

Oscar Benjamin oscar.j.benjamin at gmail.com
Tue May 19 13:27:34 CEST 2015


On 19 May 2015 at 10:55, Paul Moore <p.f.moore at gmail.com> wrote:
>>
>> But python, setuptools, pip, wheel, etc. don't have a way to handle that
>> shared lib as a dependency -- no standard way where to put it, no way to
>> package it as a wheel, etc.
>>
>> So the way to deal with this with wheels is to statically link everything.
>> But that's not how conda packages are built, so there's no way to leverage
>> conda here.
>
> Thanks for the explanation. So, in effect, conda-as-a-platform defines
> a (somewhat) incompatible platform for running Python, which can use
> wheels just as python.org Python can, but which uses
> conda-as-an-installer as its package manager (much like RPM or apt on
> Unix).
>
> The downside of this is that wheels built for conda (assuming that
> it's OK to link with shared libs) are not compatible with python.org
> builds (as those shared libs aren't available) and that difference
> isn't reflected in the wheel ABI tags (and it's not particularly
> clearly understood by the community, it seems). So publishing
> conda-based wheels on PyPI would be a bad idea, because they wouldn't
> work with python.org python (more precisely, only things that depend
> on shared libs are affected, but the point remains).

I've been peripherally following this thread, so I may be missing the
point, but it seems to me that Python already has a mature and flexible
way of locating and loading shared libs through the module/import
system. Surely the best way to manage non-Python shared libs is to
expose them as extension modules that can be packaged up on PyPI. Then
pip handles dependency resolution, you don't need to worry about the
OS-specific details of shared library loading, and ABI information can
be stored as metadata in the module. It would even be possible to load
multiple versions or ABIs of the same library as differently named
Python modules, IIUC.
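
For example, a wrapper package could bundle the shared library inside
its own installed directory and hand out a handle via ctypes, so that
locating the binary is just a matter of importing the package. A rough
sketch of the idea (the names libfoo_wrapper and libfoo here are
hypothetical placeholders, not an existing project):

    # libfoo_wrapper/__init__.py -- hypothetical wrapper package whose
    # wheel ships the libfoo binary next to this module, so it is
    # located through the import system rather than the OS search path.
    import ctypes
    import os

    _HERE = os.path.dirname(os.path.abspath(__file__))

    def load():
        """Return a ctypes handle to the bundled shared library."""
        # Assumes the wheel placed the platform-appropriate binary
        # alongside this module.
        for name in ("libfoo.so", "libfoo.dylib", "foo.dll"):
            path = os.path.join(_HERE, name)
            if os.path.exists(path):
                return ctypes.CDLL(path)
        raise OSError("bundled libfoo not found in %s" % _HERE)

Packages that need the library would then import libfoo_wrapper and
declare it in their dependencies, so pip resolves it like any other
distribution.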

As a case in point, numpy packages up a large amount of C code and
wraps a BLAS/LAPACK library. Many other extension modules are written
that take advantage of the compiled, non-Python code shipped inside
numpy via its C API.
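
To illustrate, a downstream extension can build against the code numpy
ships by asking numpy where its headers live; get_include() is numpy's
documented helper for exactly this, while the extension name "myext"
below is just a placeholder:

    # setup.py -- minimal sketch of an extension that compiles against
    # numpy's C API headers located via numpy.get_include().
    from setuptools import setup, Extension
    import numpy

    setup(
        name="myext",
        ext_modules=[
            Extension(
                "myext",
                sources=["myext.c"],
                include_dirs=[numpy.get_include()],  # numpy C API headers
            )
        ],
    )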

Is there some reason that this is not considered a good solution?


--
Oscar

