From vitaly.krugl.numenta at gmail.com Wed Sep 7 17:00:05 2016 From: vitaly.krugl.numenta at gmail.com (vitaly numenta) Date: Wed, 7 Sep 2016 14:00:05 -0700 Subject: [Wheel-builders] pinned-down dependencies for building wheel versus loose dependencies for experimentation Message-ID: I would like to structure my project such that when I build a wheel for deployment to PyPi, the wheel uses the pinned-down versions of dependencies, but during development/experimental builds uses the abstract/loosely-versioned dependencies. I want developers to be able to use the abstract dependencies for experimenting with my package against other versions of the dependencies, but installs of my wheel from PyPi need to reference the concrete (pinned-down) dependencies with which my build system actually tested my wheel. Users that install my wheel from PyPi will not (and should not need to) have access to my package's requirements.txt. https://packaging.python.org/requirements/ talks about using `extras_require` (arg to `setup()` in setup.py) for the abstract or loosely-versioned dependencies, while placing the pinned-down or concrete dependencies in requirements.txt. However, when building a wheel, I don't see a way to build one that incorporates the pinned-down dependencies from requirements.txt instead of the loosely-versioned requirements from `extras_require`. For example: in setup.py: ``` setup( extras_require = { ":platform_system=='Linux' or platform_system=='Darwin'": ["pycapnp"] }, install_requires=[ "psutil>=1.0.1,<2" ], ... ) ``` and in requirements.txt: ``` pycapnp==0.5.8 ; platform_system=='Linux' or platform_system=='Darwin' psutil==1.0.1 ``` However, when I build my wheel via `python setup.py bdist_wheel`, the resulting wheel will reference only the abstract loosely-versioned dependencies from setup.py. `pip wheel -e .` appears to do the same. 
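One way to verify this behavior is to inspect the built wheel directly: a wheel is a zip archive, and the only dependency specifiers pip acts on at install time are the `Requires-Dist:` lines in its `*.dist-info/METADATA` file. A minimal sketch (the `requires_dist` helper is ad hoc, written just for this illustration):

```python
# Sketch: list the dependency specifiers baked into a built wheel.
# A wheel is a zip archive; its *.dist-info/METADATA carries one
# "Requires-Dist:" line per dependency, generated from setup.py's
# install_requires/extras_require -- requirements.txt is never consulted.
import zipfile

def requires_dist(wheel_path):
    """Return the Requires-Dist lines from a wheel's METADATA file."""
    with zipfile.ZipFile(wheel_path) as whl:
        meta_name = next(name for name in whl.namelist()
                         if name.endswith(".dist-info/METADATA"))
        metadata = whl.read(meta_name).decode("utf-8")
    return [line for line in metadata.splitlines()
            if line.startswith("Requires-Dist:")]
```

Run against the wheel produced by `bdist_wheel`, this shows only the loose specifiers from setup.py, which is exactly the behavior described above.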
Thus, when users install my wheel from PyPi, the wrong versions of dependencies would be installed along with it instead of the pinned-down ones that I have in requirements.txt. Can anyone suggest a clean way to solve this? Many thanks, Vitaly -------------- next part -------------- An HTML attachment was scrubbed... URL: From pombredanne at nexb.com Thu Sep 8 04:23:57 2016 From: pombredanne at nexb.com (Philippe Ombredanne) Date: Thu, 8 Sep 2016 10:23:57 +0200 Subject: [Wheel-builders] pinned-down dependencies for building wheel versus loose dependencies for experimentation In-Reply-To: References: Message-ID: On Wed, Sep 7, 2016 at 11:00 PM, vitaly numenta wrote: > I would like to structure my project such that when I build a wheel for > deployment to PyPi, the wheel uses the pinned-down versions of dependencies, > but during development/experimental builds uses the > abstract/loosely-versioned dependencies. > > I want developers to be able to use the abstract dependencies for > experimenting with my package against other versions of the dependencies, > but installs of my wheel from PyPi need to reference the concrete > (pinned-down) dependencies with which my build system actually tested my > wheel. > > Users that install my wheel from PyPi will not (and should not need to) have > access to my package's requirements.txt. Vitaly: you cannot treat setup.py as both a flexible set of version ranges and a pinned set at the same time. For a detailed explanation please read Donald's post here: https://caremad.io/2013/07/setup-vs-requirement/ > However, when building a wheel, I don't see a way to build one that incorporates the pinned-down dependencies from requirements.txt instead of the loosely-versioned requirements from `extras_require`. The requirements are not used by setup.py and THEY SHOULD NOT BE, IMHO. Use requirements for pinned versions, not setup.py. 
That said, here is what I would do if I had your requirements to fulfill, while still keeping with the spirit and intent of what setup.py (abstract) and requirements.txt (concrete) usage should be: 1. for my base library (that I will call library A), create a setup.py with install_requires with either no version ranges, a minimum version, or a permissive version range, and push that to Pypi 2. for that same library A, create a pinned requirements.txt with pip freeze for development usage 3. create a new library B that will have no code except for a setup.py. That setup.py will have install_requires listing all the pinned deps of library A with exact versions, plus the pinned version of library A itself. With this config: - your users can install library B and get exactly specified versions for library A and its deps. - your developers can install library A instead and have flexibility in which dep versions they want to use and explore, and can also use the exactly pinned-down requirements if needed. I am not using this kind of approach, but here are some pointers: For an (old) example of a setup.py-only project for library B you can check this simple one: https://github.com/pombredanne/anyreadline/blob/master/setup.py (just as an example; it is outdated, and predates extras_require in particular) For an example of a setup.py with some extras_requires you can check this one, where I use lax enough version ranges and in some cases very specific ones to work around some bugs in upstream lxml: https://github.com/nexB/scancode-toolkit/blob/develop/setup.py When built as an application, I eventually freeze requirements to get exact versions, but I rarely if ever freeze versions in setup.py. 
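The "codeless" library B above can be sketched concretely. All names, versions, and markers below are hypothetical placeholders (mirroring the earlier psutil/pycapnp example), and the package deliberately ships no modules of its own:

```python
# setup.py for "library B": a meta-package that only pins dependencies.
# Hypothetical names/versions -- substitute your real library A and the
# pins produced by `pip freeze` against your tested environment.
from setuptools import setup

setup(
    name="libraryA-pinned",   # hypothetical meta-package name
    version="1.0.0",
    description="Meta-package installing libraryA with tested, pinned deps",
    py_modules=[],            # no code of its own
    install_requires=[
        "libraryA==1.0.0",    # the real library, pinned exactly
        "psutil==1.0.1",
        "pycapnp==0.5.8; platform_system=='Linux' or platform_system=='Darwin'",
    ],
)
```

Users who `pip install libraryA-pinned` get the tested combination; developers install libraryA directly and keep the loose ranges.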
-- Cordially Philippe Ombredanne From pombredanne at nexb.com Thu Sep 8 04:35:32 2016 From: pombredanne at nexb.com (Philippe Ombredanne) Date: Thu, 8 Sep 2016 10:35:32 +0200 Subject: [Wheel-builders] pinned-down dependencies for building wheel versus loose dependencies for experimentation In-Reply-To: References: Message-ID: On Thu, Sep 8, 2016 at 10:23 AM, Philippe Ombredanne wrote: > On Wed, Sep 7, 2016 at 11:00 PM, vitaly numenta > wrote: >> I would like to structure my project such that when I build a wheel for >> deployment to PyPi, the wheel uses the pinned-down versions of dependencies, >> but during development/experimental builds uses the >> abstract/loosely-versioned dependencies. >> >> I want developers to be able to use the abstract dependencies for >> experimenting with my package against other versions of the dependencies, >> but installs of my wheel from PyPi need to reference the concrete >> (pinned-down) dependencies with which my build system actually tested my >> wheel. >> >> Users that install my wheel from PyPi will not (and should not need to) have >> access to my package's requirements.txt. > > Vitaly: > you cannot treat setup.py as a both a flexible set of version ranges > and a pinned set at the same time. > For a detailed explanation please read Donald's post here: > https://caremad.io/2013/07/setup-vs-requirement/ > >> However, when building a wheel, I don't see a way to build one that incorporates the pinned-down dependencies from requirements.txt instead of the loosely-versioned requirements from `extras_require`. > > The requirements are not used by setup.py and THEY SHOULD NOT, IMHO. > Use requirements for pinned versions, not setup.py. In particular, if this is the code your email was referring to, I would never do this: https://github.com/numenta/nupic/blob/d2aad354226bba6b554c706e74e18b3ed4415a0f/setup.py#L84 e.g. 
I would NEVER source install_requires from a requirements.txt because this would rob my users of any flexibility in versions. I find it quite unfortunate that there are so many setup.py files that do this and so many snippets and SO posts that recommend this. **sigh** This is the source of many problems, like the one you are facing. -- Cordially Philippe Ombredanne From vkruglikov at numenta.com Thu Sep 8 17:21:23 2016 From: vkruglikov at numenta.com (Vitaly Kruglikov) Date: Thu, 8 Sep 2016 21:21:23 +0000 Subject: [Wheel-builders] pinned-down dependencies for building wheel versus loose dependencies for experimentation In-Reply-To: References: Message-ID: On 9/8/16, 1:35 AM, "Philippe Ombredanne" wrote: >On Thu, Sep 8, 2016 at 10:23 AM, Philippe Ombredanne > wrote: >> On Wed, Sep 7, 2016 at 11:00 PM, vitaly numenta >> wrote: >>> I would like to structure my project such that when I build a wheel for >>> deployment to PyPi, the wheel uses the pinned-down versions of >>>dependencies, >>> but during development/experimental builds uses the >>> abstract/loosely-versioned dependencies. >>> >>> I want developers to be able to use the abstract dependencies for >>> experimenting with my package against other versions of the >>>dependencies, >>> but installs of my wheel from PyPi need to reference the concrete >>> (pinned-down) dependencies with which my build system actually tested >>>my >>> wheel. >>> >>> Users that install my wheel from PyPi will not (and should not need >>>to) have >>> access to my package's requirements.txt. >> >> Vitaly: >> you cannot treat setup.py as a both a flexible set of version ranges >> and a pinned set at the same time. 
>> For a detailed explanation please read Donald's post here: >> https://caremad.io/2013/07/setup-vs-requirement/ >> >>> However, when building a wheel, I don't see a way to build one that >>>incorporates the pinned-down dependencies from requirements.txt instead >>>of the loosely-versioned requirements from `extras_require`. >> >> The requirements are not used by setup.py and THEY SHOULD NOT, IMHO. >> Use requirements for pinned versions, not setup.py. > >In particular if based on your email this is some code you are talking >about I would never do this: >https://github.com/numenta/nupic/blob/d2aad354226bba6b554c706e74e18b3ed441 >5a0f/setup.py#L84 >e.g. I would NEVER source install_requires from a requirements.txt >because this would rob my users and flexibility from any flexibility >in versions. > >I find quite unfortunate that there are so many setup.py that do this >and so many snippets and SO posts that recommend this. **sigh** This >is the source of many problems, like the one you are facing. > >-- >Cordially >Philippe Ombredanne Dear Philippe, thank you for your detailed follow-up. I have received the same negative feedback concerning the population of install_requires with pinned-down versions (via requirements.txt or inline) from a number of developers, including several wheel members (and yes, good guess with the nupic link :) ). It's a problem that we run into occasionally with our users as well as internally, so I agree that something needs to be done to address it. My colleagues and I are struggling to reconcile this problem: if our CI system only tests our library with a specific set of pinned-down dependencies, how can we deploy a wheel to PyPi that references a looser set of dependencies? What guarantee would there be that it will work correctly with the looser set of dependencies, since we can't possibly test with **all** of their combinations? 
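One half of that reconciliation, recording exactly what CI tested, can be automated: snapshot the versions installed in the CI environment after the test run. A minimal sketch (assumes Python 3.8+ for `importlib.metadata`; `freeze_pins` is a hypothetical helper that mirrors what `pip freeze` reports):

```python
# Sketch: capture "name==version" pins for everything installed in the
# current environment, i.e. the concrete set CI actually tested against.
from importlib.metadata import distributions

def freeze_pins():
    """Return sorted exact pins for all installed distributions."""
    return sorted(
        f"{dist.metadata['Name']}=={dist.version}"
        for dist in distributions()
    )

# CI could then write the result to requirements.txt, e.g.:
#   open("requirements.txt", "w").write("\n".join(freeze_pins()) + "\n")
```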
The suggestion from your previous post for solving my problem by deploying library A with loose dependencies to PyPI as well as another "codeless" library B with only setup.py containing the pinned-down dependencies would likely work. However, the additional complexity of the solution (more moving parts) is telling me that the problem I am trying to solve may be going against the grain of what python packaging and deployment intended. If it's common practice for wheels deployed to PyPi to incorporate loosely-versioned dependencies, then how do developers mitigate the risk that their wheel might fail with some combination of those dependencies? Thank you, Vitaly From matthew.brett at gmail.com Thu Sep 8 18:00:59 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 8 Sep 2016 15:00:59 -0700 Subject: [Wheel-builders] pinned-down dependencies for building wheel versus loose dependencies for experimentation In-Reply-To: References: Message-ID: Hi, On Thu, Sep 8, 2016 at 2:21 PM, Vitaly Kruglikov wrote: > On 9/8/16, 1:35 AM, "Philippe Ombredanne" wrote: > >>On Thu, Sep 8, 2016 at 10:23 AM, Philippe Ombredanne >> wrote: >>> On Wed, Sep 7, 2016 at 11:00 PM, vitaly numenta >>> wrote: >>>> I would like to structure my project such that when I build a wheel for >>>> deployment to PyPi, the wheel uses the pinned-down versions of >>>>dependencies, >>>> but during development/experimental builds uses the >>>> abstract/loosely-versioned dependencies. >>>> >>>> I want developers to be able to use the abstract dependencies for >>>> experimenting with my package against other versions of the >>>>dependencies, >>>> but installs of my wheel from PyPi need to reference the concrete >>>> (pinned-down) dependencies with which my build system actually tested >>>>my >>>> wheel. >>>> >>>> Users that install my wheel from PyPi will not (and should not need >>>>to) have >>>> access to my package's requirements.txt. 
>>> >>> Vitaly: >>> you cannot treat setup.py as a both a flexible set of version ranges >>> and a pinned set at the same time. >>> For a detailed explanation please read Donald's post here: >>> https://caremad.io/2013/07/setup-vs-requirement/ >>> >>>> However, when building a wheel, I don't see a way to build one that >>>>incorporates the pinned-down dependencies from requirements.txt instead >>>>of the loosely-versioned requirements from `extras_require`. >>> >>> The requirements are not used by setup.py and THEY SHOULD NOT, IMHO. >>> Use requirements for pinned versions, not setup.py. >> >>In particular if based on your email this is some code you are talking >>about I would never do this: >>https://github.com/numenta/nupic/blob/d2aad354226bba6b554c706e74e18b3ed441 >>5a0f/setup.py#L84 >>e.g. I would NEVER source install_requires from a requirements.txt >>because this would rob my users and flexibility from any flexibility >>in versions. >> >>I find quite unfortunate that there are so many setup.py that do this >>and so many snippets and SO posts that recommend this. **sigh** This >>is the source of many problems, like the one you are facing. >> >>-- >>Cordially >>Philippe Ombredanne > > Dear Philippe, thank you for your detailed follow-up. I have received the > same negative feedback concerning the population of install_requires with > pinned-down versions (via requirements.txt or inline) from a number of > developers, including several wheel members (and yes, good guess with the > nupic link :) ). It's a problem that we run into occasionally with our > users as well as internally, so I agree that something needs to be done to > address it. > > My colleagues and I are struggling to reconcile this problem: if our CI > system only tests our library with a specific set of pinned-down > dependencies, how can we deploy a wheel to PyPi that references a looser > set of dependencies? 
What guarantee would there be that it will work > correctly with the looser set of dependencies, since we can?t possibly > test with **all** of their combinations? > > The suggestion from your previous post for solving my problem by deploying > library A with loose dependencies to PyPy as well as another ?codeless? > library B with only setup.py containing the pinned-down dependencies would > likely work. However, the additional complexity of the solution (more > moving parts) is telling me that the problem I am trying to solve may be > going against the grain of what python packaging and deployment intended. > > If it?s common practice for wheels deployed to PyPi to incorporate > loosely-versioned dependencies, then how do developers mitigate the risk > that their wheel might fail with some combination of those dependencies? I believe the most common route is to accept that risk, and mitigate by adding travis-ci matrix entries with different combinations of dependencies to test the common ones. For example: https://github.com/nipy/nibabel/blob/master/.travis.yml Best, Matthew From pombredanne at nexb.com Fri Sep 9 05:07:32 2016 From: pombredanne at nexb.com (Philippe Ombredanne) Date: Fri, 9 Sep 2016 11:07:32 +0200 Subject: [Wheel-builders] pinned-down dependencies for building wheel versus loose dependencies for experimentation In-Reply-To: References: Message-ID: On Thu, Sep 8, 2016 at 11:21 PM, Vitaly Kruglikov wrote: > My colleagues and I are struggling to reconcile this problem: if our CI > system only tests our library with a specific set of pinned-down > dependencies, how can we deploy a wheel to PyPi that references a looser > set of dependencies? What guarantee would there be that it will work > correctly with the looser set of dependencies, since we can?t possibly > test with **all** of their combinations? 
> > The suggestion from your previous post for solving my problem by deploying > library A with loose dependencies to PyPy as well as another ?codeless? > library B with only setup.py containing the pinned-down dependencies would > likely work. However, the additional complexity of the solution (more > moving parts) is telling me that the problem I am trying to solve may be > going against the grain of what python packaging and deployment intended. You can do this very simply IMHO by having a regular setup.py with flexible deps, a requirements.txt with pinned deps, and a setup_pinned.py that includes no code and sources your pinned requirements.txt. And you publish both wheels to Pypi. Quite simple, IMHO. > If it?s common practice for wheels deployed to PyPi to incorporate > loosely-versioned dependencies, then how do developers mitigate the risk > that their wheel might fail with some combination of those dependencies? This is not a matter of common practice; this is the intended usage and the sane way to deal with two contexts: - as a library or framework developer, I shall give flexibility to my users with their deps - as an application developer, I want to have something that works with a set of pinned deps that I have tested In your case, you want to fill both usages in one place... hence your challenge: this cannot be sanely handled with a single description of deps: you need both flexible and pinned. I have similar requirements for one of my projects, which is both a framework/library and an application. I solve it this way, which works for me: - my setup.py has flexible version ranges, mostly with a lower version limit, to support use as a lib - when used as an app, the distribution contains the whole recursive set (vendored) of pinned dependent wheels, and I use a small configuration script that pip installs only from these pinned deps. The benefits to me are: 1. 
when used from Pypi as a lib in some other project, my package has enough flexibility to avoid major conflicts with other package requirements 2. when used as an app, I have total control of which wheel and which file is installed exactly 3. when used as an app, the app is fully self-contained and does not depend on Pypi or network availability This is surely not the only way to do this: the suggested approach with two wheels (one flexible, one pinned) is another way. And there are surely other ways to handle this. But the added complexity is likely always needed if you want to support both a flexible and a pinned set of deps. -- Cordially Philippe Ombredanne From pombredanne at nexb.com Fri Sep 9 06:16:58 2016 From: pombredanne at nexb.com (Philippe Ombredanne) Date: Fri, 9 Sep 2016 12:16:58 +0200 Subject: [Wheel-builders] pinned-down dependencies for building wheel versus loose dependencies for experimentation In-Reply-To: References: Message-ID: On Thu, Sep 8, 2016 at 11:21 PM, Vitaly Kruglikov wrote: > The suggestion from your previous post for solving my problem by deploying > library A with loose dependencies to PyPy as well as another ?codeless? > library B with only setup.py containing the pinned-down dependencies would > likely work. However, the additional complexity of the solution (more > moving parts) is telling me that the problem I am trying to solve may be > going against the grain of what python packaging and deployment intended. See an example of this approach, based on an IRC chat excerpt from #pypa (10:45:14 AM) [Tritium]: if you pip install toga with no dependencies, nothing will be installed. 
The toga package is a meta-package - its just a setup.py file (10:45:40 AM) [Tritium]: its like a distro metapackage; existing only to depend on other packages The example can be found here: https://github.com/pybee/toga/blob/master/setup.py -- Cordially Philippe Ombredanne From amitsaha.in at gmail.com Fri Sep 16 00:42:20 2016 From: amitsaha.in at gmail.com (Amit Saha) Date: Fri, 16 Sep 2016 14:42:20 +1000 Subject: [Wheel-builders] Help with building/using MySQL-python-embedded Message-ID: Hi all, I am trying to build manylinux1 wheels for https://pypi.python.org/pypi/MySQL-python-embedded/1.2.5 Here is my script which I am running inside the build container: #!/bin/bash # Script modified from https://github.com/pypa/python-manylinux-demo set -e -x yum install -y make zlib-devel openssl-devel libaio libaio-devel wget http://dev.mysql.com/get/Downloads/MySQL-5.1/mysql-5.1.51.tar.gz/from/http://mysql.he.net/ tar -zxvf mysql-5.1.51.tar.gz cd /mysql-5.1.51 CFLAGS=-fPIC CXXFLAGS=-fPIC ./configure make install cd libmysqld make install cd / # Compile wheels for PYBIN in /opt/python/cp27*/bin; do LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql ${PYBIN}/pip install MySQL-Python==1.2.5 --no-index -f /mysql-python-wheels LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql ${PYBIN}/pip wheel /workspace/ -w wheelhouse/ done # Bundle external shared libraries into the wheels #ls wheelhouse/* for whl in wheelhouse/*linux*.whl; do LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql auditwheel repair $whl -w /workspace/wheelhouse/ done # Install packages and test for PYBIN in /opt/python/cp27*/bin/; do ${PYBIN}/pip install --no-index mysql-python-embedded -f /workspace/wheelhouse ${PYBIN}/python -c "import MySQLdb_embedded; MySQLdb_embedded.start_server()" || true done The wheels build fine (example): Building wheels for collected packages: MySQL-python-embedded, MySQL-python Running setup.py bdist_wheel for MySQL-python-embedded ... 
done Stored in directory: /wheelhouse Running setup.py bdist_wheel for MySQL-python ... done Stored in directory: /wheelhouse Successfully built MySQL-python-embedded MySQL-python + for whl in 'wheelhouse/*linux*.whl' + LD_LIBRARY_PATH=/opt/rh/devtoolset-2/root/usr/lib64:/opt/rh/devtoolset-2/root/usr/lib:/usr/local/lib64:/usr/local/lib:/usr/local/lib/mysql + auditwheel repair wheelhouse/MySQL_python-1.2.5-cp27-cp27m-linux_x86_64.whl -w /workspace/wheelhouse/ Repairing MySQL_python-1.2.5-cp27-cp27m-linux_x86_64.whl Grafting: /lib64/libz.so.1.2.3 -> .libs_mysql/libz-a147dcb0.so.1.2.3 Grafting: /usr/local/lib/mysql/libmysqlclient_r.so.16.0.0 -> .libs_mysql/libmysqlclient_r-0bea0d7c.so.16.0.0 Setting RPATH: _mysql.so to "$ORIGIN/.libs_mysql" Previous filename tags: linux_x86_64 New filename tags: manylinux1_x86_64 Previous WHEEL info tags: cp27-cp27m-linux_x86_64 New WHEEL info tags: cp27-cp27m-manylinux1_x86_64 Fixed-up wheel written to /workspace/wheelhouse/MySQL_python-1.2.5-cp27-cp27m-manylinux1_x86_64.whl Now when I import it inside the same build container: /opt/python/cp27-cp27mu/bin//python -c 'import MySQLdb_embedded; MySQLdb_embedded.start_server()' Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/opt/python/cp27-cp27mu/lib/python2.7/site-packages/MySQLdb_embedded/__init__.py", line 12, in <module> import _mysql_embedded ImportError: /opt/python/cp27-cp27mu/lib/python2.7/site-packages/_mysql_embedded.so: undefined symbol: __cxa_pure_virtual The reason I download and compile mysql 5.1.51 from source is that mysql python embedded needs to statically link to the libmysqld.a library, which on CentOS5 can only be done (as far as I have found out) by hand-compiling as above. This is also the reason I use CFLAGS=-fPIC CXXFLAGS=-fPIC when running configure. Not sure what I am doing wrong or what I should be looking at next. Any suggestions will be greatly appreciated. Thank you. Best Wishes, Amit. 
-- http://echorand.me From vkruglikov at numenta.com Fri Sep 16 02:36:56 2016 From: vkruglikov at numenta.com (Vitaly Kruglikov) Date: Fri, 16 Sep 2016 06:36:56 +0000 Subject: [Wheel-builders] Help with building/using MySQL-python-embedded In-Reply-To: References: Message-ID: On 9/15/16, 9:42 PM, "Wheel-builders on behalf of Amit Saha" wrote: >Hi all, > >I am trying to build manylinux1 wheels for >https://pypi.python.org/pypi/MySQL-python-embedded/1.2.5 > >Here is my script which I am running inside the build container: > >#!/bin/bash ># Script modified from https://github.com/pypa/python-manylinux-demo >set -e -x > >yum install -y make zlib-devel openssl-devel libaio libaio-devel > >wget >http://dev.mysql.com/get/Downloads/MySQL-5.1/mysql-5.1.51.tar.gz/from/http >://mysql.he.net/ >tar -zxvf mysql-5.1.51.tar.gz >cd /mysql-5.1.51 >CFLAGS=-fPIC CXXFLAGS=-fPIC ./configure >make install >cd libmysqld >make install >cd / > ># Compile wheels >for PYBIN in /opt/python/cp27*/bin; do > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql ${PYBIN}/pip >install MySQL-Python==1.2.5 --no-index -f /mysql-python-wheels > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql ${PYBIN}/pip >wheel /workspace/ -w wheelhouse/ >done > ># Bundle external shared libraries into the wheels >#ls wheelhouse/* >for whl in wheelhouse/*linux*.whl; do > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql auditwheel >repair $whl -w /workspace/wheelhouse/ >done > ># Install packages and test >for PYBIN in /opt/python/cp27*/bin/; do > ${PYBIN}/pip install --no-index mysql-python-embedded -f >/workspace/wheelhouse > ${PYBIN}/python -c "import MySQLdb_embedded; >MySQLdb_embedded.start_server()" || true >done > > > >The wheels build fine (example): > >Building wheels for collected packages: MySQL-python-embedded, >MySQL-python > Running setup.py bdist_wheel for MySQL-python-embedded ... done > Stored in directory: /wheelhouse > Running setup.py bdist_wheel for MySQL-python ... 
done > Stored in directory: /wheelhouse >Successfully built MySQL-python-embedded MySQL-python >+ for whl in 'wheelhouse/*linux*.whl' >+ >LD_LIBRARY_PATH=/opt/rh/devtoolset-2/root/usr/lib64:/opt/rh/devtoolset-2/r >oot/usr/lib:/usr/local/lib64:/usr/local/lib:/usr/local/lib/mysql >+ auditwheel repair >wheelhouse/MySQL_python-1.2.5-cp27-cp27m-linux_x86_64.whl -w >/workspace/wheelhouse/ >Repairing MySQL_python-1.2.5-cp27-cp27m-linux_x86_64.whl >Grafting: /lib64/libz.so.1.2.3 -> .libs_mysql/libz-a147dcb0.so.1.2.3 >Grafting: /usr/local/lib/mysql/libmysqlclient_r.so.16.0.0 -> >.libs_mysql/libmysqlclient_r-0bea0d7c.so.16.0.0 >Setting RPATH: _mysql.so to "$ORIGIN/.libs_mysql" >Previous filename tags: linux_x86_64 >New filename tags: manylinux1_x86_64 >Previous WHEEL info tags: cp27-cp27m-linux_x86_64 >New WHEEL info tags: cp27-cp27m-manylinux1_x86_64 > >Fixed-up wheel written to >/workspace/wheelhouse/MySQL_python-1.2.5-cp27-cp27m-manylinux1_x86_64.whl > > >Now when I import it inside the same build container: > > /opt/python/cp27-cp27mu/bin//python -c 'import MySQLdb_embedded; >MySQLdb_embedded.start_server()' >Traceback (most recent call last): > File "", line 1, in > File >"/opt/python/cp27-cp27mu/lib/python2.7/site-packages/MySQLdb_embedded/__in >it__.py", >line 12, in > import _mysql_embedded >ImportError: >/opt/python/cp27-cp27mu/lib/python2.7/site-packages/_mysql_embedded.so: >undefined symbol: __cxa_pure_virtual > > >The reason I download and compile mysql 5.151 from source is mysql >python embedded needs to statically link to the libmysqld.a library >which on CentOS5 can only be done as far as I have found out by hand >compiling above. This is also the reason I use CFLAGS=-fPIC >CXXFLAGS=-fPIC when running configure. > >Not sure what I am doing wrong or what I should be looking at next. >Any suggestions will be greatly appreciated. > >Thank you. > >Best Wishes, >Amit. 
> > >-- >http://echorand.me >_______________________________________________ >Wheel-builders mailing list >Wheel-builders at python.org >https://mail.python.org/mailman/listinfo/wheel-builders The explanation in http://stackoverflow.com/questions/920500/what-is-the-purpose-of-cxa-pure-virtual would suggest that you might not be linking with some necessary default libraries that came with your development environment. While you can build a shared library on a posix system that doesn't have all the referenced symbols defined, when such a library is loaded into an application, the runtime loader/linker will signal a failure if not all symbols can be resolved at that time. It seems like in your case, the default implementation of `__cxa_pure_virtual` was not linked into your extension .so. Not related to your problem, but critically important concepts to be aware of when building python extensions in general and manylinux wheels specifically are symbol visibility, symbol preemption, and ABI compatibility (the latter if you're compiling from C++ code). If you don't control symbol visibility, you may cause unpredictable behavior either by having your symbols preempted by a previously-loaded extension or preempting matching symbols in another subsequently-loaded extension. In a python extension .so, all symbols must be hidden, except the extension's init function. This may be accomplished with a combination of compile-time visibility flags (e.g., https://github.com/numenta/nupic.core/blob/a4a5d3fd62a11a7d9b9094122d49a9be6889efd0/CommonCompilerConfig.cmake#L258-L259) and an export map (e.g., https://github.com/numenta/nupic.core/blob/0777644933fb8dda5c7c015730cf3717bea8f724/src/CMakeLists.txt#L786-L815). Here is a summary of the measures that I took to solve the symbol and ABI compatibility issues that I encountered: https://discourse.numenta.org/t/segmentation-fault-while-running-basic-swarm/877/24?u=vkruglikov. 
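The load-time failure described above can be provoked deliberately, which makes it checkable at build time rather than on a user's machine. A minimal sketch for a POSIX system (the `check_loadable` helper is hypothetical, not part of auditwheel or pip):

```python
# Sketch: dlopen() a shared object with RTLD_NOW so the dynamic linker
# must resolve every symbol immediately.  An undefined symbol such as
# __cxa_pure_virtual then surfaces here instead of at first use.
import ctypes
import os

def check_loadable(so_path):
    """Return None if the .so loads cleanly, else the loader's error."""
    try:
        ctypes.CDLL(so_path, mode=os.RTLD_NOW)
        return None
    except OSError as exc:
        return str(exc)
```

Running this over every extension .so in the repaired wheel, inside the same build container, would have reported the missing `__cxa_pure_virtual` before the wheel was installed anywhere.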
You may also need to link with certain standard libraries statically - see comments in https://github.com/numenta/nupic.core/blob/a4a5d3fd62a11a7d9b9094122d49a9be6889efd0/CommonCompilerConfig.cmake#L185-L197. Good luck! Vitaly From amitsaha.in at gmail.com Fri Sep 16 03:57:48 2016 From: amitsaha.in at gmail.com (Amit Saha) Date: Fri, 16 Sep 2016 17:57:48 +1000 Subject: [Wheel-builders] Help with building/using MySQL-python-embedded In-Reply-To: References: Message-ID: On 16 Sep 2016 4:36 pm, "Vitaly Kruglikov" wrote: > > On 9/15/16, 9:42 PM, "Wheel-builders on behalf of Amit Saha" > amitsaha.in at gmail.com> wrote: > > >Hi all, > > > >I am trying to build manylinux1 wheels for > >https://pypi.python.org/pypi/MySQL-python-embedded/1.2.5 > > > >Here is my script which I am running inside the build container: > > > >#!/bin/bash > ># Script modified from https://github.com/pypa/python-manylinux-demo > >set -e -x > > > >yum install -y make zlib-devel openssl-devel libaio libaio-devel > > > >wget > > http://dev.mysql.com/get/Downloads/MySQL-5.1/mysql-5.1.51.tar.gz/from/http > >://mysql.he.net/ > >tar -zxvf mysql-5.1.51.tar.gz > >cd /mysql-5.1.51 > >CFLAGS=-fPIC CXXFLAGS=-fPIC ./configure > >make install > >cd libmysqld > >make install > >cd / > > > ># Compile wheels > >for PYBIN in /opt/python/cp27*/bin; do > > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql ${PYBIN}/pip > >install MySQL-Python==1.2.5 --no-index -f /mysql-python-wheels > > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql ${PYBIN}/pip > >wheel /workspace/ -w wheelhouse/ > >done > > > ># Bundle external shared libraries into the wheels > >#ls wheelhouse/* > >for whl in wheelhouse/*linux*.whl; do > > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql auditwheel > >repair $whl -w /workspace/wheelhouse/ > >done > > > ># Install packages and test > >for PYBIN in /opt/python/cp27*/bin/; do > > ${PYBIN}/pip install --no-index mysql-python-embedded -f > >/workspace/wheelhouse > > ${PYBIN}/python -c 
"import MySQLdb_embedded; > >MySQLdb_embedded.start_server()" || true > >done > > > > > > > >The wheels build fine (example): > > > >Building wheels for collected packages: MySQL-python-embedded, > >MySQL-python > > Running setup.py bdist_wheel for MySQL-python-embedded ... done > > Stored in directory: /wheelhouse > > Running setup.py bdist_wheel for MySQL-python ... done > > Stored in directory: /wheelhouse > >Successfully built MySQL-python-embedded MySQL-python > >+ for whl in 'wheelhouse/*linux*.whl' > >+ > >LD_LIBRARY_PATH=/opt/rh/devtoolset-2/root/usr/lib64:/opt/rh/devtoolset-2/r > >oot/usr/lib:/usr/local/lib64:/usr/local/lib:/usr/local/lib/mysql > >+ auditwheel repair > >wheelhouse/MySQL_python-1.2.5-cp27-cp27m-linux_x86_64.whl -w > >/workspace/wheelhouse/ > >Repairing MySQL_python-1.2.5-cp27-cp27m-linux_x86_64.whl > >Grafting: /lib64/libz.so.1.2.3 -> .libs_mysql/libz-a147dcb0.so.1.2.3 > >Grafting: /usr/local/lib/mysql/libmysqlclient_r.so.16.0.0 -> > >.libs_mysql/libmysqlclient_r-0bea0d7c.so.16.0.0 > >Setting RPATH: _mysql.so to "$ORIGIN/.libs_mysql" > >Previous filename tags: linux_x86_64 > >New filename tags: manylinux1_x86_64 > >Previous WHEEL info tags: cp27-cp27m-linux_x86_64 > >New WHEEL info tags: cp27-cp27m-manylinux1_x86_64 > > > >Fixed-up wheel written to > >/workspace/wheelhouse/MySQL_python-1.2.5-cp27-cp27m-manylinux1_x86_64.whl > > > > > >Now when I import it inside the same build container: > > > > /opt/python/cp27-cp27mu/bin//python -c 'import MySQLdb_embedded; > >MySQLdb_embedded.start_server()' > >Traceback (most recent call last): > > File "", line 1, in > > File > >"/opt/python/cp27-cp27mu/lib/python2.7/site-packages/MySQLdb_embedded/__in > >it__.py", > >line 12, in > > import _mysql_embedded > >ImportError: > >/opt/python/cp27-cp27mu/lib/python2.7/site-packages/_mysql_embedded.so: > >undefined symbol: __cxa_pure_virtual > > > > > >The reason I download and compile mysql 5.151 from source is mysql > >python embedded needs to statically 
link to the libmysqld.a library > >which on CentOS5 can only be done as far as I have found out by hand > >compiling above. This is also the reason I use CFLAGS=-fPIC > >CXXFLAGS=-fPIC when running configure. > > > >Not sure what I am doing wrong or what I should be looking at next. > >Any suggestions will be greatly appreciated. > > > >Thank you. > > > >Best Wishes, > >Amit. > > > > > >-- > >http://echorand.me > >_______________________________________________ > >Wheel-builders mailing list > >Wheel-builders at python.org > >https://mail.python.org/mailman/listinfo/wheel-builders > > > The explanation in > http://stackoverflow.com/questions/920500/what-is-the-purpose-of-cxa-pure-v > irtual would suggest that you might not be linking with some necessary > default libraries that came with your development environment. While you > can build a shared library on a POSIX system that doesn't have all the > referenced symbols defined, when such a library is loaded into an > application, the runtime loader/linker will signal a failure if not all > symbols could be resolved at that time. It seems like in your case, > the default implementation of `__cxa_pure_virtual` was not linked into > your extension .so. > > Not related to your problem, but critically important concepts to be aware > of when building python extensions in general and manylinux wheels > specifically are symbol visibility, symbol preemption, and ABI > compatibility (the latter if you're compiling from C++ code). If you don't > control symbol visibility, you may cause unpredictable behavior either by > having your symbols preempted by a previously-loaded extension or > preempting matching symbols in another subsequently-loaded extension. In a > python extension .so, all symbols must be hidden, except the extension's > init function. 
This may be accomplished with a combination of compile-time > visibility flags (e.g., > https://github.com/numenta/nupic.core/blob/a4a5d3fd62a11a7d9b9094122d49a9be > 6889efd0/CommonCompilerConfig.cmake#L258-L259) and an export map (e.g., > https://github.com/numenta/nupic.core/blob/0777644933fb8dda5c7c015730cf3717 > bea8f724/src/CMakeLists.txt#L786-L815). Here is a summary of the measures > that I took to solve the symbol and ABI compatibility issues that I > encountered: > https://discourse.numenta.org/t/segmentation-fault-while-running-basic-swar > m/877/24?u=vkruglikov. You may also need to link with certain standard > libraries statically - see comments in > https://github.com/numenta/nupic.core/blob/a4a5d3fd62a11a7d9b9094122d49a9be > 6889efd0/CommonCompilerConfig.cmake#L185-L197. > > Good luck! Thank you! This is going to be a long battle, I think. > > Vitaly > -------------- next part -------------- An HTML attachment was scrubbed... URL: From amitsaha.in at gmail.com Tue Sep 20 03:14:10 2016 From: amitsaha.in at gmail.com (Amit Saha) Date: Tue, 20 Sep 2016 17:14:10 +1000 Subject: [Wheel-builders] Help with building/using MySQL-python-embedded In-Reply-To: References: Message-ID: On Fri, Sep 16, 2016 at 2:42 PM, Amit Saha wrote: > Hi all, > > I am trying to build manylinux1 wheels for > https://pypi.python.org/pypi/MySQL-python-embedded/1.2.5 > > Here is my script which I am running inside the build container: > > #!/bin/bash > # Script modified from https://github.com/pypa/python-manylinux-demo > set -e -x > > yum install -y make zlib-devel openssl-devel libaio libaio-devel > > wget http://dev.mysql.com/get/Downloads/MySQL-5.1/mysql-5.1.51.tar.gz/from/http://mysql.he.net/ > tar -zxvf mysql-5.1.51.tar.gz > cd /mysql-5.1.51 > CFLAGS=-fPIC CXXFLAGS=-fPIC ./configure > make install > cd libmysqld > make install > cd / > > # Compile wheels > for PYBIN in /opt/python/cp27*/bin; do > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql ${PYBIN}/pip > install 
MySQL-Python==1.2.5 --no-index -f /mysql-python-wheels > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql ${PYBIN}/pip > wheel /workspace/ -w wheelhouse/ > done > > # Bundle external shared libraries into the wheels > #ls wheelhouse/* > for whl in wheelhouse/*linux*.whl; do > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/mysql auditwheel > repair $whl -w /workspace/wheelhouse/ > done > > # Install packages and test > for PYBIN in /opt/python/cp27*/bin/; do > ${PYBIN}/pip install --no-index mysql-python-embedded -f > /workspace/wheelhouse > ${PYBIN}/python -c "import MySQLdb_embedded; > MySQLdb_embedded.start_server()" || true > done > > > > The wheels build fine (example): > > Building wheels for collected packages: MySQL-python-embedded, MySQL-python > Running setup.py bdist_wheel for MySQL-python-embedded ... done > Stored in directory: /wheelhouse > Running setup.py bdist_wheel for MySQL-python ... done > Stored in directory: /wheelhouse > Successfully built MySQL-python-embedded MySQL-python > + for whl in 'wheelhouse/*linux*.whl' > + LD_LIBRARY_PATH=/opt/rh/devtoolset-2/root/usr/lib64:/opt/rh/devtoolset-2/root/usr/lib:/usr/local/lib64:/usr/local/lib:/usr/local/lib/mysql > + auditwheel repair > wheelhouse/MySQL_python-1.2.5-cp27-cp27m-linux_x86_64.whl -w > /workspace/wheelhouse/ > Repairing MySQL_python-1.2.5-cp27-cp27m-linux_x86_64.whl > Grafting: /lib64/libz.so.1.2.3 -> .libs_mysql/libz-a147dcb0.so.1.2.3 > Grafting: /usr/local/lib/mysql/libmysqlclient_r.so.16.0.0 -> > .libs_mysql/libmysqlclient_r-0bea0d7c.so.16.0.0 > Setting RPATH: _mysql.so to "$ORIGIN/.libs_mysql" > Previous filename tags: linux_x86_64 > New filename tags: manylinux1_x86_64 > Previous WHEEL info tags: cp27-cp27m-linux_x86_64 > New WHEEL info tags: cp27-cp27m-manylinux1_x86_64 > > Fixed-up wheel written to > /workspace/wheelhouse/MySQL_python-1.2.5-cp27-cp27m-manylinux1_x86_64.whl > > > Now when I import it inside the same build container: > > /opt/python/cp27-cp27mu/bin//python -c 
'import MySQLdb_embedded; > MySQLdb_embedded.start_server()' > Traceback (most recent call last): > File "<string>", line 1, in <module> > File "/opt/python/cp27-cp27mu/lib/python2.7/site-packages/MySQLdb_embedded/__init__.py", > line 12, in <module> > import _mysql_embedded > ImportError: /opt/python/cp27-cp27mu/lib/python2.7/site-packages/_mysql_embedded.so: > undefined symbol: __cxa_pure_virtual > > > The reason I download and compile mysql 5.1.51 from source is mysql > python embedded needs to statically link to the libmysqld.a library > which on CentOS5 can only be done as far as I have found out by hand > compiling above. This is also the reason I use CFLAGS=-fPIC > CXXFLAGS=-fPIC when running configure. > > Not sure what I am doing wrong or what I should be looking at next. > Any suggestions will be greatly appreciated. I looked into it again, and I can see (via readelf) that the created _mysql_embedded.so links to the following libraries: 0x0000000000000001 (NEEDED) Shared library: [libz-a147dcb0.so.1.2.3] 0x0000000000000001 (NEEDED) Shared library: [libm.so.6] 0x0000000000000001 (NEEDED) Shared library: [librt.so.1] 0x0000000000000001 (NEEDED) Shared library: [libcrypt.so.1] 0x0000000000000001 (NEEDED) Shared library: [libdl.so.2] 0x0000000000000001 (NEEDED) Shared library: [libaio-f5693f09.so.1.0.1] 0x0000000000000001 (NEEDED) Shared library: [libc.so.6] The libz-a1xx and libaio-xx libraries are not in the list here at https://www.python.org/dev/peps/pep-0513/. Does this mean I am creating incompatible wheels? > > Thank you. > > Best Wishes, > Amit. > > > -- > http://echorand.me -- http://echorand.me From olivier.grisel at ensta.org Tue Sep 20 03:35:18 2016 From: olivier.grisel at ensta.org (Olivier Grisel) Date: Tue, 20 Sep 2016 09:35:18 +0200 Subject: [Wheel-builders] Help with building/using MySQL-python-embedded In-Reply-To: References: Message-ID: > Does this mean I am creating incompatible wheels? 
libaio-f5693f09.so.1.0.1 and libz-a147dcb0.so.1.2.3 seem to have been grafted into the .libs_mysql folder of a python package embedded in the wheel package. The RPATH of the libraries that need them should have been edited by auditwheel to point to that folder. Can you check whether this is actually the case? -- Olivier From olivier.grisel at ensta.org Tue Sep 20 03:37:31 2016 From: olivier.grisel at ensta.org (Olivier Grisel) Date: Tue, 20 Sep 2016 09:37:31 +0200 Subject: [Wheel-builders] Help with building/using MySQL-python-embedded In-Reply-To: References: Message-ID: > The reason I download and compile mysql 5.1.51 from source is mysql python embedded needs to statically link to the libmysqld.a library. Maybe it's unrelated to your issue, but do you know why it isn't possible to let auditwheel embed libmysql.so into your wheel instead? -- Olivier From amitsaha.in at gmail.com Tue Sep 20 03:37:31 2016 From: amitsaha.in at gmail.com (Amit Saha) Date: Tue, 20 Sep 2016 17:37:31 +1000 Subject: [Wheel-builders] Help with building/using MySQL-python-embedded In-Reply-To: References: Message-ID: On Tue, Sep 20, 2016 at 5:35 PM, Olivier Grisel wrote: >> Does this mean I am creating incompatible wheels? > > libaio-f5693f09.so.1.0.1 and libz-a147dcb0.so.1.2.3 seem to have been > grafted into the .libs_mysql folder of a python package embedded in the > wheel package. The RPATH of the libraries that need them should have > been edited by auditwheel to point to that folder. Can you check > whether this is actually the case? Thanks. I see this: (RPATH) Library rpath: [$ORIGIN/.libs_mysql_embedded] So I think that's fine then? 
> > -- > Olivier -- http://echorand.me From amitsaha.in at gmail.com Tue Sep 20 04:38:56 2016 From: amitsaha.in at gmail.com (Amit Saha) Date: Tue, 20 Sep 2016 18:38:56 +1000 Subject: [Wheel-builders] Help with building/using MySQL-python-embedded In-Reply-To: References: Message-ID: On 20 Sep 2016 5:37 pm, "Olivier Grisel" wrote: > > > The reason I download and compile mysql 5.1.51 from source is mysql python embedded needs to statically link to the libmysqld.a library. > > Maybe it's unrelated to your issue, but do you know why it isn't > possible to let auditwheel embed libmysql.so into your wheel instead? Maybe I will try that. Thanks! > > -- > Olivier -------------- next part -------------- An HTML attachment was scrubbed... URL: