From benoit at marmelune.net Wed Jun 6 10:52:15 2012
From: benoit at marmelune.net (=?ISO-8859-1?Q?Beno=EEt_Bryon?=)
Date: Wed, 06 Jun 2012 10:52:15 +0200
Subject: [Distutils] conventions or best practice to choose package names?
In-Reply-To:
References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net>
Message-ID: <4FCF1A3F.1010604@marmelune.net>

Hi,

Updated
https://bitbucket.org/benoitbryon/cpython/src/doc-package-names/Doc/packaging/packagenames.rst

On 31/05/2012 18:00, PJ Eby wrote:
> You've definitely put in a lot of work on this, and most of the actual
> guidelines in your PEP are quite good. I think there's a core part of
> this that can and should be a good Informational-track PEP.

If it actually becomes a draft PEP, I guess we should:

* move the document to the PEPs repository
* add notes in the Python packaging documentation to reference the PEP.

> However, I do have a few comments regarding overall organization, and
> some regarding some of the specific recommendations and tone.
>
> In order of appearance in the PEP, they are:
>
> 1. I would strongly suggest striking the aside on eggs entirely --
> there's no point to bringing it up just to dismiss it, and there never
> was such a thing as an "egg name" - eggs are just another kind of
> distribution, so they don't need a special term.

Removed the text about eggs.

> 2. A project name isn't the name of a directory, it's the name of the
> thing you release distributions of. E.g., my "Importing" project on
> PyPI (http://pypi.python.org/pypi/Importing), which has the following
> names:
>
> a) Project name = Importing
> b) Release = 1.10
> c) Distribution = Importing-1.10.zip
> d) Module = peak.util.imports
>
> Following this conceptual breakdown will make the naming
> recommendations easier to follow.

Changed the "Terminology" section. However, I am still confused about
"project name" versus "distribution name".
From your comment, I understand that the name argument of
packaging.core.setup() is the name of the project, which is used to build
distribution names like "%(project_name)s-%(version)s.tar.gz". But PEP 345
says it is the name of the distribution:
http://www.python.org/dev/peps/pep-0345/#name

Also, I propose to shorten the "terminology" section or move it to another
document (another PEP or the packaging documentation):

* it would simplify the document
* we'd better share the terminology between several documents.

> 3. The rationale is unnecessary and could (and *should*) be cut
> entirely. Use PEP 8 as a model - it doesn't waste time explaining why
> coding guidelines are a good idea. The truth is, your PEP would be
> massively improved simply by deleting this entire section and not even
> trying to replace it with anything.

Changed the "Rationale" section:

* kept "relationship with other PEPs" and "other sources of inspiration".
* moved the "opportunity" section from "Rationale" to the "Transition
  plan" section. I kept it because opportunity should be a valuable reason
  to adopt the conventions. And unlike other statements in "Rationale",
  opportunity cannot be explained within the conventions. Would you remove
  it too?
* removed other text in the "Rationale" section.

> 4. The proposal section is also unnecessary, and likely to produce
> resistance in any event.

The "proposal" part of the "Rationale" section has been removed.

> There is no reason why every package should use your approach, and some
> developers will be opposed to your conventions.

The "Rationale" section used to point out that current (missing) naming
rules lead to problems, such as duplicate packages/modules or inconsistent
project/package names. Shouldn't we consider them issues?

But, whatever the value of those arguments, I understand some developers
will be opposed to the proposed conventions. That's why I tried to make
the document say:

* ultimately, in the namespace you own, you do what you want.
* at the very least (and first of all), you should document the
  conventions you are currently using.
* then, understand the issues that names can introduce.
* optionally follow the proposed conventions, which may lead to renaming a
  project or packages/modules.

Since the document is a bit directive, the ideas above appear in reverse
order: first you should apply the conventions, but in the end you do what
you want. Is that clear in the document?

Let's consider the "django-pipeline" vs. "pipeline" vs. "python-pipeline"
example:

* there is no obligation at all to rename these projects or packages.
* but one should admit there is an issue with names. Here the issue is a
  duplicate package name. See
  https://github.com/cyberdelia/django-pipeline/issues/101
* first of all, authors should be aware of the issue.
* then, if authors want to, they can apply the conventions, which could
  mean renaming the distributed package only.
* if django-pipeline changes its package or project name, it breaks the
  "most used" scheme (de facto standard) of the Django community. So it
  would be another homemade naming scheme...
* unless the Django community takes the naming issues into account, takes
  an explicit position on packaging, and optionally promotes migrations.

Notice that the Django community could state that "there is no real issue,
i.e. wontfix". But this would mean that the choice is documented in an
issue, a discussion or the documentation... so it would fit the guidelines
as an explicit community-specific convention.

> (I disagree with some of them myself, for that matter, as I will
> discuss below.) Some of the material from this section might make a
> good later section on "How to apply these guidelines to your project".

The "proposal" section has been removed as part of the "Rationale" one.
Were you talking about the "Transition plan" instead? I moved the
"transition plan" section to the end of the document, i.e. conventions
appear before it.
In fact, I changed the "transition plan" into a set of recipes for
existing projects.

> 5. The actual guidelines are pretty good, as I said. In particular, I
> like pretty much everything you say about package names. I disagree
> with some of your assertions regarding project names, however. It's
> true that a project name doesn't need to spell out what a project does
> (as with Celery, Nose, etc.), but it does in fact hurt discovery and
> use when people use package-based project names. It's a continuing
> source of frustration, for example, that buildout recipe projects have
> names like z3c.recipe.scripts and other names that I can't ever
> remember, and am forced to re-google or dig up previous buildout files
> to find the magic names to include in my buildouts. To me, the
> important thing about a project name is that it be *memorable*, and
> that argues against using namespaced package names as project names
> for contributions to a larger project, such as Plone portlets,
> Buildout recipes, etc.
>
> This isn't a huge point of contention for me, I just want to point out
> that the good naming choices for projects are more complicated than
> "just use the package name and let people search for it". I think
> there are some other points you made that are a little too
> cut-and-dried in the same way, but this is the only really big one.

I updated the conventions:

* added an overview of conventions,
* introduced the "memorable" value.

I agree that choosing a "good" name is not easy. I suppose the best we can
do is to provide guidelines such as "ownership", "memorable" or
"meaningful"... so that users can avoid some common pitfalls. Then, it's a
matter of priorities. I would say that:

1. first, we should avoid duplicates (project and packages). That's why
"top-level namespace for ownership" and "use a single name" seem so
important to me.
2. then, it's better if the project name is memorable and meaningful and
easy to discover...
but it's a less important issue than duplicates, mainly because other
metadata such as description, keywords and classifiers can help.

> 6. Drop the language about how things are supported or not supported.
> This just isn't true: Python the language supports you nesting things
> 6 layers deep if you feel like it, and AFAIK there are no plans to
> change that, nor should there be. Stick to talking here about what is
> or isn't a good idea, not what's supported or not supported. That's
> just FUD.

Removed "support" vocabulary. Changed "conventions" to "guidelines". Used
only lowercase "may", "should" and "can".

> 7. PyPI should be the only place people have to check for a
> registered distribution name; authors of projects that are hosted
> elsewhere can and should use "setup.py register" to claim the name, or
> log on and do it.

Added a "Register names to PyPI" guideline. Changed the "check for name
availability" recipe: PyPI is the only place.

> Anyway, as I said, I think you've got some really good stuff here,
> but it's got a lot of other stuff hiding it. Cut away the extras (and
> the parts implying these guidelines are mandatory) and you've got a
> good Informational-track PEP.

Does it sound better now?

Regards,
Benoit

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From alexis at notmyidea.org Wed Jun 6 11:07:36 2012
From: alexis at notmyidea.org (=?ISO-8859-1?Q?Alexis_M=E9taireau?=)
Date: Wed, 06 Jun 2012 11:07:36 +0200
Subject: [Distutils] conventions or best practice to choose package names?
In-Reply-To: <4FCF1A3F.1010604@marmelune.net>
References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net>
Message-ID: <4FCF1DD8.3030500@notmyidea.org>

> Also, I propose to shorten the "terminology" section or move it to
> another document (another PEP or the packaging documentation):
>
> * it would simplify the document
> * we'd better share the terminology between several documents.
+1

>> There is no reason why every package should use your approach, and
>> some developers will be opposed to your conventions.
>
> The "Rationale" section used to point out that current (missing) naming
> rules lead to problems, such as duplicate packages/modules or
> inconsistent project/package names. Shouldn't we consider them issues?
>
> But, whatever the value of those arguments, I understand some
> developers will be opposed to the proposed conventions.

These are conventions, not rules. PEP 8, for instance, says that the rules
may be broken when there is a real need to in your project. Having
conventions is not about making everyone agree, but about having everyone
use the same methods to avoid confusion, IMO. Maybe we should state this
clearly.

Alexis

From robhealey1 at gmail.com Sun Jun 10 08:39:18 2012
From: robhealey1 at gmail.com (Rob Healey)
Date: Sat, 9 Jun 2012 23:39:18 -0700
Subject: [Distutils] bug#14651
Message-ID:

Greetings:

I know that there are many, many bugs out there right now, but I must
sincerely and gratefully request that bug#14651's patch please be accepted
and applied before the python-3.3.0 freeze happens! The only reason I ask
at this point is that, without it, the work that others and I are doing on
our project completely stops...

-- Sincerely yours, Rob G. Healey

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From chris.jerdonek at gmail.com Tue Jun 12 21:24:33 2012
From: chris.jerdonek at gmail.com (Chris Jerdonek)
Date: Tue, 12 Jun 2012 12:24:33 -0700
Subject: [Distutils] building an application made from many internal packages
Message-ID:

Hi, is there a recommended way of building and deploying a private Python
application from many loosely coupled internal packages or modules as
opposed to a single large package?
Should this be done with a private PyPI server with each module/package having its own setup.py, or is zc.buildout (or some other tool) the more appropriate solution here? Thanks, --Chris From keshav.kini at gmail.com Tue Jun 12 22:40:23 2012 From: keshav.kini at gmail.com (Keshav Kini) Date: Tue, 12 Jun 2012 13:40:23 -0700 Subject: [Distutils] Packaging dependencies Message-ID: <86pq94z324.fsf@zhenghe.ntu.edu.sg> Hi, We're using distribute to package our software, but we'd like to do something different with the "sdist" command. We would like to recursively find all packages necessary to build our package from a base Python install, download these packages, and store them all in a directory inside our source distribution tarball; then, when the tarball is unpacked and setup.py is run, we want "install" to use those cached packages. (This is to support installation on machines that can't or shouldn't access the network.) One solution I found in the distribute docs was to use `easy_install -zmaxd dirname package` on our package to generate and store dependency eggs, and then `easy_install -H none -f dirname` when installing the package. Unfortunately this actually builds packages which have C extensions and stuff like that in them, whereas we would like to only ship the source and build the dependencies when our package is built (to support multiple platforms). Also I guess we'd like `setup.py install` to just work in the directory without the user having to care about the fact that these source packages were shipped inside the tarball they just extracted. So I'm wondering if I can actually do this by importing something from distribute itself and writing some nontrivial code in setup.py. Can someone point me to what would be the best way to do this? I'm guessing I'll need to start by subclassing Environment to create a fake "default environment" which looks like how a bare Python 2.7.3 (say) install would. 
Then I'd need to somehow modify "sdist" somewhere to make it download the dependency tarballs that would be necessary if you were installing our packages into that Environment, and also need to modify "install" to make it look in the correct directory. Is that about right? Thanks and sorry for the simple question. -Keshav From erik.m.bray at gmail.com Tue Jun 12 22:58:27 2012 From: erik.m.bray at gmail.com (Erik Bray) Date: Tue, 12 Jun 2012 16:58:27 -0400 Subject: [Distutils] Packaging dependencies In-Reply-To: <86pq94z324.fsf@zhenghe.ntu.edu.sg> References: <86pq94z324.fsf@zhenghe.ntu.edu.sg> Message-ID: On Tue, Jun 12, 2012 at 4:40 PM, Keshav Kini wrote: > Hi, > > We're using distribute to package our software, but we'd like to do > something different with the "sdist" command. We would like to > recursively find all packages necessary to build our package from a base > Python install, download these packages, and store them all in a > directory inside our source distribution tarball; then, when the tarball > is unpacked and setup.py is run, we want "install" to use those cached > packages. (This is to support installation on machines that can't or > shouldn't access the network.) > > One solution I found in the distribute docs was to use `easy_install > -zmaxd dirname package` on our package to generate and store dependency > eggs, and then `easy_install -H none -f dirname` when installing the > package. Unfortunately this actually builds packages which have C > extensions and stuff like that in them, whereas we would like to only > ship the source and build the dependencies when our package is built (to > support multiple platforms). Also I guess we'd like `setup.py install` > to just work in the directory without the user having to care about the > fact that these source packages were shipped inside the tarball they > just extracted. 
>
> So I'm wondering if I can actually do this by importing something from
> distribute itself and writing some nontrivial code in setup.py. Can
> someone point me to what would be the best way to do this?
>
> I'm guessing I'll need to start by subclassing Environment to create a
> fake "default environment" which looks like how a bare Python 2.7.3
> (say) install would. Then I'd need to somehow modify "sdist" somewhere
> to make it download the dependency tarballs that would be necessary if
> you were installing our packages into that Environment, and also need to
> modify "install" to make it look in the correct directory. Is that about
> right?
>
> Thanks and sorry for the simple question.

I've found that the find_links option works fine for this sort of thing.
For example, you can put source tarballs of your dependencies in a
directory under your source tree like `dependencies/`, and then in
setup.cfg add:

    [easy_install]
    find_links = dependencies

Something like this has worked for me in the past, I think.

As for automatically downloading the source, adding a custom subclass of
sdist seems like the way to go. The setuptools package contains the
required machinery to download dependencies from PyPI (or another package
index) with or without installing them.

Not sure I fully understand what the issue is regarding building C
extensions.

Erik

From setuptools at bugs.python.org Tue Jun 12 23:08:13 2012
From: setuptools at bugs.python.org (JimC)
Date: Tue, 12 Jun 2012 21:08:13 +0000
Subject: [Distutils] [issue138] setuptools.extension incompatible w/ Cython (patch included)
Message-ID: <1339535293.28.0.225201206294.issue138@psf.upfronthosting.co.za>

New submission from JimC:

We discovered a minor problem with setuptools.extension.py that prevents
incrementally building extension modules when working with Cython
(meaning, edits to source do NOT result in a call to the Cython compiler).
A patch is attached to this ticket.
The problem is the extension.Extension() class has logic that alters the
filenames in the sources[] list if Pyrex is not installed. The code has a
simple boolean test for whether Pyrex is installed, and if not, it remaps
the extension from ".pyx" -> ".c".

I believe the simple solution is to perform the same logic for Cython (set
a boolean if Cython is available, and do NOT remap the source extension if
either Pyrex OR Cython is installed).

Example to demonstrate the problem:

    setup('project', ...,
          ext_modules = [Extension('Prj', ['Prj/pr1.pyx'])],
          ...,
          cmdclass = {'build_ext': Cython.Distutils.build_ext},
          ...)

With a setup.py like this, the setuptools.extension.Extension class will
re-write the sources[] list BEFORE the Cython.Distutils.build_ext module
can get a shot at it. The result is that subsequent edits of 'Prj/pr1.pyx'
will cause build_ext to skip the Cython compile step. The only workaround
to force a recompile is to delete the contents of the 'build/' folder.

See the suggested patch for our request for modification.
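Stated as standalone code, the check being requested looks roughly like this (a sketch for illustration only; the attached patch is the actual proposed change, and the injectable `import_module` parameter is just an assumption made here to keep the sketch testable):

```python
import importlib

def have_pyx_compiler(import_module=importlib.import_module):
    """Return True when either Pyrex or Cython can provide build_ext,
    in which case Extension should NOT remap '.pyx' sources to '.c'."""
    for name in ("Pyrex.Distutils.build_ext", "Cython.Distutils.build_ext"):
        try:
            import_module(name)
            return True
        except ImportError:
            continue
    return False
```

With a helper like this, the remapping branch would run only when neither compiler is available, so a Cython-only environment keeps its .pyx sources intact and incremental rebuilds work.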
Thanks

----------
files: extension.patch
messages: 658
nosy: jim at carroll.com
priority: bug
status: unread
title: setuptools.extension incompatible w/ Cython (patch included)

Added file: http://bugs.python.org/setuptools/file80/extension.patch

_______________________________________________
Setuptools tracker
_______________________________________________
-------------- next part --------------
--- \ORIGINAL\setuptools\extension.py	Wed Sep 20 17:05:02 2006
+++ \REVISED\setuptools\extension.py	Tue Jun 12 16:35:08 2012
@@ -1,15 +1,17 @@
 from distutils.core import Extension as _Extension
 from dist import _get_unpatched
 _Extension = _get_unpatched(_Extension)
+have_pyrex = True
 try:
     from Pyrex.Distutils.build_ext import build_ext
 except ImportError:
-    have_pyrex = False
-else:
-    have_pyrex = True
+    try:
+        from Cython.Distutils.build_ext import build_ext
+    except ImportError, exc:
+        have_pyrex = False

 class Extension(_Extension):
     """Extension that uses '.c' files in place of '.pyx' files"""

From dholth at gmail.com Wed Jun 13 05:14:32 2012
From: dholth at gmail.com (Daniel Holth)
Date: Tue, 12 Jun 2012 23:14:32 -0400
Subject: [Distutils] RFC: package multiple platforms into a single binary package
Message-ID:

I've been trying to work out a filename convention and specification for
packaging multiple platforms' worth of Python into a single archive. The
feature would reduce the number of mostly-redundant archives required for
a package that is mostly Python but has a small extension module, or
express that a package contains fat binaries a la OS X.

Now that Python includes the __pycache__ directory and ABI tags for
extension modules, it will often be possible to copy installations of the
same package compiled for different Python ABIs on top of each other
without clobbering any files. For example, a directory could include .pyc
files for Python 3.2 and 3.3, or extension modules for win32 and Linux at
the same time.
When copied into the same location, the only files clobbered by the second
architecture should be the identical .py, .dist-info/METADATA and so
forth, while the second platform's shared libraries will have a unique
name.

The proposed filename is

    {distribution}-{version}-{python tag}-{platform tag}.{archive extension}

distribution: Distribution name, e.g. "django", "pyramid"
version: PEP 386-compliant version, e.g. 1.0
python tag: Python implementation tag
platform tag: Platform tag or 'noarch'

For example, package-1.0-py27-noarch.rar is compatible with Python 2.7
(any Python 2.7 implementation) on any CPU architecture.

The Python implementation is abbreviated to save space. Each
implementation has a two-letter code:

py: Generic Python
cp: CPython
ip: IronPython
pp: PyPy
jy: Jython

concatenated with the ABI tag: SOABI.split('-')[-1] ('33m'), or, for
"pure Python" packages, py_version_nodot ('27').

The platform tag is distutils.util.get_platform() with all periods and
hyphens replaced with underscores, or the string 'noarch'.

The {python tag} field and {platform tag} field are actually .-delimited
sets of tags. The filename indicates compatibility with the Cartesian
product of the two sets. For example, you would be able to combine a win32
and a linux_x86_64 archive into

    package-1.0-py33-linux_x86_64.win32.tar.bz2

The part I'm unsure about is that 'noarch' and 'py27' don't quite fit with
'cp33dmu' (which says something about both the compiled code and the
version of Python required by the .py files).

Thanks,
Daniel Holth

From dholth at gmail.com Wed Jun 13 05:33:59 2012
From: dholth at gmail.com (Daniel Holth)
Date: Tue, 12 Jun 2012 23:33:59 -0400
Subject: [Distutils] plugin mechanism for packaging itself?
Message-ID:

I would like to be able to add a command to packaging for all packages,
not just my own. This is a mechanism that would stay out of the way of
other packages.

1. Distribute a submodule of the namespace package 'packaging_hooks'.
'packaging_hooks.yourpackage' where 'yourpackage' is the pypi name of the package providing the hook 2. Include in the submodule a dictionary COMMANDS={'command_name':'class name' or class} and __all__ = ['COMMANDS', ...] Packaging calls pkgutil.iter_modules(prefix='packaging_hooks.') to import all the submodules of packaging_hooks, looking for COMMANDS= and resolving the dotted name of the command class if the value is not already the command class. This would only be a plugin mechanism for packaging itself, not a general-purpose replacement for entry_points. From marius at pov.lt Wed Jun 13 10:08:38 2012 From: marius at pov.lt (Marius Gedminas) Date: Wed, 13 Jun 2012 11:08:38 +0300 Subject: [Distutils] building an application made from many internal packages In-Reply-To: References: Message-ID: <20120613080837.GA1627@fridge.pov.lt> On Tue, Jun 12, 2012 at 12:24:33PM -0700, Chris Jerdonek wrote: > Hi, is there a recommended way of building and deploying a private > Python application from many loosely coupled internal packages or > modules as opposed to a single large package? I suspect there's more than one. ;) > Should this be done > with a private PyPI server with each module/package having its own > setup.py, or is zc.buildout (or some other tool) the more appropriate > solution here? I generally don't bother setting up private PyPI servers -- dropping the .tar.gz files of private package sdists in a directory and exporting it with Apache (with password protection if you wish) works just fine. There's generally a single version control checkout that contains the buildout configuration (with find-links = https://private-package-page-url) and the main app. I also use buildout-versions to ensure all the dependencies have version pins to avoid unpleasant surprises on deployments. 
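Concretely, the kind of setup Marius describes might look like the following buildout.cfg fragment (the package names, version pins, and private index URL are all hypothetical; the `buildout-versions` extension reports and enforces the pins in the `versions` part):

```ini
[buildout]
extensions = buildout-versions
find-links = https://private-package-page-url
parts = app
versions = versions

[versions]
# version pins for internal and upstream packages (made-up examples)
internal.app = 1.4
internal.lib = 2.0.1

[app]
recipe = zc.recipe.egg
eggs = internal.app
```

The find-links URL points at the password-protected Apache directory of private sdists, so no dedicated PyPI server is needed.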
Sometimes unpleasant surprises still happen, when an upstream package sdist disappears off the net and I have to go hunt it down from someone's buildout cache and drop it into the same private package page. Marius Gedminas -- What goes up, must come down. Ask any system administrator. -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 190 bytes Desc: Digital signature URL: From pje at telecommunity.com Wed Jun 13 17:06:57 2012 From: pje at telecommunity.com (PJ Eby) Date: Wed, 13 Jun 2012 11:06:57 -0400 Subject: [Distutils] plugin mechanism for packaging itself? In-Reply-To: References: Message-ID: On Tue, Jun 12, 2012 at 11:33 PM, Daniel Holth wrote: > I would like to be able to add a command to packaging for all > packages, not just my own. This is a mechanism that would stay out of > the way of other packages. > > 1. Distribute a submodule of the namespace package 'packaging_hooks'. > 'packaging_hooks.yourpackage' where 'yourpackage' is the pypi name of > the package providing the hook > 2. Include in the submodule a dictionary > COMMANDS={'command_name':'class name' or class} and __all__ = > ['COMMANDS', ...] > > Packaging calls pkgutil.iter_modules(prefix='packaging_hooks.') to > import all the submodules of packaging_hooks, looking for COMMANDS= > and resolving the dotted name of the command class if the value is not > already the command class. > > This would only be a plugin mechanism for packaging itself, not a > general-purpose replacement for entry_points. > Namespace packages are not meant to be a plugin mechanism, and having the stdlib violate this principle would be setting a horrible precedent. Please don't. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dholth at gmail.com Wed Jun 13 17:58:11 2012 From: dholth at gmail.com (Daniel Holth) Date: Wed, 13 Jun 2012 11:58:11 -0400 Subject: [Distutils] plugin mechanism for packaging itself? In-Reply-To: References: Message-ID: Horribly convenient? Entry points already import every __init__.py at the root. This would in most cases be looking in a single directory, instead of iterating over every distribution most of which aren't providing any kind of plugin at all. Why not, apart from the occasional name clashes and only supporting one version of each plugin provider? -------------- next part -------------- An HTML attachment was scrubbed... URL: From pje at telecommunity.com Wed Jun 13 18:15:28 2012 From: pje at telecommunity.com (PJ Eby) Date: Wed, 13 Jun 2012 12:15:28 -0400 Subject: [Distutils] plugin mechanism for packaging itself? In-Reply-To: References: Message-ID: On Wed, Jun 13, 2012 at 11:58 AM, Daniel Holth wrote: > Horribly convenient? Entry points already import every __init__.py at the > root. > What? > This would in most cases be looking in a single directory, instead of > iterating over every distribution most of which aren't providing any kind > of plugin at all. Why not, apart from the occasional name clashes and only > supporting one version of each plugin provider? > It turns the entire concept of a namespace package on its head: the point of the namespace is to reduce conflicts and clarify ownership, not the other way around. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf at systemexit.de Thu Jun 14 22:01:44 2012 From: ralf at systemexit.de (Ralf Schmitt) Date: Thu, 14 Jun 2012 22:01:44 +0200 Subject: [Distutils] [ANNOUNCE] pypiserver 0.6.0 - minimal private pypi server Message-ID: <87haudslw7.fsf@myhost.lan> Hi, I've just uploaded pypiserver 0.6.0 to the python package index. pypiserver is a minimal PyPI compatible server. It can be used to serve a set of packages and eggs to easy_install or pip. 
pypiserver is easy to install (i.e. just easy_install pypiserver). It
doesn't have any external dependencies.

http://pypi.python.org/pypi/pypiserver/ should contain enough information
to easily get you started running your own PyPI server in a few minutes.

The code is available on github: https://github.com/schmir/pypiserver

Changes in version 0.6.0
------------------------
- make pypiserver work with pip on Windows
- add support for password-protected uploads
- make pypiserver work with non-root paths
- make pypiserver 'paste compatible'
- allow serving multiple package directories using paste

--
Cheers,
Ralf

From benoit at marmelune.net Tue Jun 19 23:49:45 2012
From: benoit at marmelune.net (=?ISO-8859-1?Q?Beno=EEt_Bryon?=)
Date: Tue, 19 Jun 2012 23:49:45 +0200
Subject: [Distutils] conventions or best practice to choose package names?
In-Reply-To: <4FCF1A3F.1010604@marmelune.net>
References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net>
Message-ID: <4FE0F3F9.9080706@marmelune.net>

Hi,

On 06/06/2012 10:52, Benoît Bryon wrote:
> On 31/05/2012 18:00, PJ Eby wrote:
>> You've definitely put in a lot of work on this, and most of the
>> actual guidelines in your PEP are quite good. I think there's a core
>> part of this that can and should be a good Informational-track PEP.
>
> If it actually becomes a draft PEP, I guess we should:
>
> * move the document to the PEPs repository
> * add notes in the Python packaging documentation to reference the PEP.

Updated
https://bitbucket.org/benoitbryon/cpython/src/doc-package-names/Doc/packaging/packagenames.rst

Prepared migration to a PEP. I am going to submit the document for review
at peps at python.org. Feedback is welcome!
On 06/06/2012 10:52, Benoît Bryon wrote:
> Also, I propose to shorten the "terminology" section or move it to
> another document (another PEP or the packaging documentation):
>
> * it would simplify the document
> * we'd better share the terminology between several documents.

Moved the terminology section to packaging/introduction.rst. See
https://bitbucket.org/benoitbryon/cpython/changeset/5eca478e6804#chg-Doc/packaging/introduction.rst

Benoit

From dholth at gmail.com Sat Jun 23 08:20:45 2012
From: dholth at gmail.com (Daniel Holth)
Date: Sat, 23 Jun 2012 02:20:45 -0400
Subject: [Distutils] patch distutils to look into .dist-info directories
Message-ID:

https://bitbucket.org/dholth/distribute/changeset/0c805a76aa61

Or replacing entry_points for distutils2-style distributions with itself.
Is there a name for distutils2-style distributions (things with a
.dist-info directory)?

Daniel Holth

From pje at telecommunity.com Sat Jun 23 16:45:53 2012
From: pje at telecommunity.com (PJ Eby)
Date: Sat, 23 Jun 2012 10:45:53 -0400
Subject: [Distutils] patch distutils to look into .dist-info directories
In-Reply-To: References: Message-ID:

On Sat, Jun 23, 2012 at 2:20 AM, Daniel Holth wrote:
> https://bitbucket.org/dholth/distribute/changeset/0c805a76aa61

Just a suggestion: you might want to delay importing email.parser until
it's actually needed; i.e. import it inside the @property where you're
actually using it.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jim at zope.com Mon Jun 25 01:11:35 2012
From: jim at zope.com (Jim Fulton)
Date: Sun, 24 Jun 2012 19:11:35 -0400
Subject: [Distutils] distribute broken with Python 3.3
Message-ID:

https://bitbucket.org/tarek/distribute/issue/289/distribute-broken-with-python-33

I'm gonna try to work around it in buildout 2 by monkey-patching
distribute.

Jim

--
Jim Fulton
http://www.linkedin.com/in/jimfulton
Jerky is better than bacon!
http://zo.pe/Kqm

From nad at acm.org Mon Jun 25 04:28:56 2012
From: nad at acm.org (Ned Deily)
Date: Sun, 24 Jun 2012 19:28:56 -0700
Subject: [Distutils] distribute broken with Python 3.3
References: Message-ID:

In article , Jim Fulton wrote:
> https://bitbucket.org/tarek/distribute/issue/289/distribute-broken-with-python-33
>
> I'm gonna try to work around it in buildout 2 by monkey-patching
> distribute.

I think this is a duplicate of
https://bitbucket.org/tarek/distribute/issue/283/bdist_egg-issues-with-python-330ax
which has been open for a while. It really needs to be fixed soon; Python
3.3.0 beta 1 will be released in a few days.

--
Ned Deily, nad at acm.org

From tarek at ziade.org Mon Jun 25 08:53:03 2012
From: tarek at ziade.org (=?ISO-8859-1?Q?Tarek_Ziad=E9?=)
Date: Mon, 25 Jun 2012 08:53:03 +0200
Subject: [Distutils] distribute broken with Python 3.3
In-Reply-To: References: Message-ID: <4FE80ACF.6030604@ziade.org>

On 6/25/12 1:11 AM, Jim Fulton wrote:
> https://bitbucket.org/tarek/distribute/issue/289/distribute-broken-with-python-33
>
> I'm gonna try to work around it in buildout 2 by monkey-patching
> distribute.
>
> Jim

If you write the fix, maybe you can do it against Distribute? We can
release it right after.

Cheers
Tarek

From setuptools at bugs.python.org Mon Jun 25 09:53:53 2012
From: setuptools at bugs.python.org (mbogosian)
Date: Mon, 25 Jun 2012 07:53:53 +0000
Subject: [Distutils] [issue139] pkg_resources.require can't deal with multiple installed versions if one is present in easy-install.pth
Message-ID: <1340610833.79.0.645871548134.issue139@psf.upfronthosting.co.za>

New submission from mbogosian:

Perhaps I have misunderstood the intended behavior, but I thought one
should be able to do this:

- - - - - - - - %< - - - - - - - -
% easy_install BeautifulSoup==3.0.8
...
% easy_install -U BeautifulSoup [installs 3.1.0.1]
...
% python -c 'import pkg_resources ; print pkg_resources.require("BeautifulSoup==3.0.8")' Traceback (most recent call last): File "<string>", line 1, in <module> File ".../site-packages/distribute-0.6.10-py2.5.egg/pkg_resources.py", line 648, in require needed = self.resolve(parse_requirements(requirements)) File ".../site-packages/distribute-0.6.10-py2.5.egg/pkg_resources.py", line 550, in resolve raise VersionConflict(dist,req) # XXX put more info here pkg_resources.VersionConflict: (BeautifulSoup 3.1.0.1 (.../site-packages/BeautifulSoup-3.1.0.1-py2.5.egg), Requirement.parse('BeautifulSoup==3.0.8')) - - - - - - - - >% - - - - - - - - If I use the '-m' tag (to keep it out of easy_install.pth), I have a different problem: - - - - - - - - %< - - - - - - - - % easy_install -m BeautifulSoup==3.0.8 ... % easy_install -U -m BeautifulSoup [installs 3.1.0.1] ... % python -c 'import pkg_resources ; print pkg_resources.require("BeautifulSoup==3.0.8")' [BeautifulSoup 3.0.8 (.../site-packages/BeautifulSoup-3.0.8-py2.5.egg)] % python -c 'import BeautifulSoup' Traceback (most recent call last): File "<string>", line 1, in <module> ImportError: No module named BeautifulSoup - - - - - - - - >% - - - - - - - - Can I not have my cake and eat it too? Meaning, can I not have multiple installed versions of a package with a default (i.e., an entry in easy-install.pth), but still be able to explicitly specify a previously installed version?
---------- messages: 659 nosy: mbogosian priority: bug status: unread title: pkg_resources.require can't deal with multiple installed versions if one is present in easy-install.pth _______________________________________________ Setuptools tracker _______________________________________________ From jim at zope.com Mon Jun 25 12:57:54 2012 From: jim at zope.com (Jim Fulton) Date: Mon, 25 Jun 2012 06:57:54 -0400 Subject: [Distutils] distribute broken Python 3.3.3 In-Reply-To: <4FE80ACF.6030604@ziade.org> References: <4FE80ACF.6030604@ziade.org> Message-ID: On Mon, Jun 25, 2012 at 2:53 AM, Tarek Ziadé wrote: > On 6/25/12 1:11 AM, Jim Fulton wrote: >> >> >> https://bitbucket.org/tarek/distribute/issue/289/distribute-broken-with-python-33 >> >> I'm gonna try to work around it in buildout 2 by monkey-patching >> distribute. >> >> Jim >> > If you write the fix, maybe you can do it against Distribute ? we can > release it right after. I'm not interested in fixing the scanning of pyc files, because: 1. buildout2 always unzips, because almost no one wants zipped eggs, and 2. Because the zip_safe setup flag is ignored by distribute. https://bitbucket.org/tarek/distribute/issue/279/zip_safe-ignored-by-easy_install-command 3. Life is short. :) I would be willing to fix distribute to not scan if the --always-unzip, -Z option is passed to it. My monkey-patch simply replaces setuptools.command.bdist_egg.can_scan with lambda:False. My recommendation is to just stop scanning, and don't zip installed eggs unless the zip_safe flag is present and true (and the --always-unzip option isn't used). I'd be willing to help with this, but I probably couldn't get to it for a couple of weeks at the earliest. This is, of course, backward-incompatible, although I doubt anyone would really care. Jim -- Jim Fulton http://www.linkedin.com/in/jimfulton Jerky is better than bacon!
http://zo.pe/Kqm From reinout at vanrees.org Mon Jun 25 13:44:50 2012 From: reinout at vanrees.org (Reinout van Rees) Date: Mon, 25 Jun 2012 13:44:50 +0200 Subject: [Distutils] unneeded warning about missing readme in case of rst Message-ID: Hi, I often get warnings like: warning: sdist: standard file not found: should have one of README, README.txt The reason for the missing file is that it is called README.rst now because bitbucket and github pick up that extension. Is there something that can be done about this warning? (See also a zest.releaser ticket about this: https://github.com/zestsoftware/zest.releaser/issues/14 ) Reinout -- Reinout van Rees http://reinout.vanrees.org/ reinout at vanrees.org http://www.nelen-schuurmans.nl/ "If you're not sure what to do, make something. -- Paul Graham" From hanno at hannosch.eu Mon Jun 25 13:49:25 2012 From: hanno at hannosch.eu (Hanno Schlichting) Date: Mon, 25 Jun 2012 13:49:25 +0200 Subject: [Distutils] unneeded warning about missing readme in case of rst In-Reply-To: References: Message-ID: On Mon, Jun 25, 2012 at 1:44 PM, Reinout van Rees wrote: > warning: sdist: standard file not found: should have one of README, > README.txt > > The reason for the missing file is that it is called README.rst now because > bitbucket and github pick up that extension. Is there something that can be > done about this warning? I think this got changed in distribute for version 0.6.27. From the changelog: Distribute now recognizes README.rst as a standard, default readme file. Not sure if this needs changes on the PyPi server end. 
Hanno From reinout at vanrees.org Mon Jun 25 14:00:10 2012 From: reinout at vanrees.org (Reinout van Rees) Date: Mon, 25 Jun 2012 14:00:10 +0200 Subject: [Distutils] unneeded warning about missing readme in case of rst In-Reply-To: References: Message-ID: On 25-06-12 13:49, Hanno Schlichting wrote: > On Mon, Jun 25, 2012 at 1:44 PM, Reinout van Rees wrote: >> warning: sdist: standard file not found: should have one of README, >> README.txt >> >> The reason for the missing file is that it is called README.rst now because >> bitbucket and github pick up that extension. Is there something that can be >> done about this warning? > > I think this got changed in distribute for version 0.6.27. From the changelog: > > Distribute now recognizes README.rst as a standard, default readme file. Thanks! Yep, that's it. I somehow had an older distribute in my 'tools' virtualenv. Reinout -- Reinout van Rees http://reinout.vanrees.org/ reinout at vanrees.org http://www.nelen-schuurmans.nl/ "If you're not sure what to do, make something. -- Paul Graham" From pje at telecommunity.com Mon Jun 25 18:34:20 2012 From: pje at telecommunity.com (PJ Eby) Date: Mon, 25 Jun 2012 12:34:20 -0400 Subject: [Distutils] [issue139] pkg_resources.require can't deal with multiple installed versions if one is present in easy-install.pth In-Reply-To: <1340610833.79.0.645871548134.issue139@psf.upfronthosting.co.za> References: <1340610833.79.0.645871548134.issue139@psf.upfronthosting.co.za> Message-ID: On Mon, Jun 25, 2012 at 3:53 AM, mbogosian wrote: > > Perhaps I have misunderstood the intended behavior, but I thought one > should be able to do this: > > % easy_install BeautifulSoup==3.0.8 > ... > % easy_install -U BeautifulSoup > [installs 3.1.0.1] > ... 
> % python -c 'import pkg_resources ; print > pkg_resources.require("BeautifulSoup==3.0.8")' > Traceback (most recent call last): > pkg_resources.VersionConflict: (BeautifulSoup 3.1.0.1 > (.../site-packages/BeautifulSoup-3.1.0.1-py2.5.egg), > Requirement.parse('BeautifulSoup==3.0.8')) > Do this instead: python -c "__requires__=['BeautifulSoup==3.0.8']; import pkg_resources; import BeautifulSoup" That is, if the main script declares a __requires__ variable before importing pkg_resources, pkg_resources will allow you to override what would normally produce a VersionConflict due to a default version being present. This is an undocumented but supported workaround for the issue, as it's used internally by the scripts easy_install generates. -------------- next part -------------- An HTML attachment was scrubbed... URL: From tarek at ziade.org Mon Jun 25 18:53:27 2012 From: tarek at ziade.org (=?ISO-8859-1?Q?Tarek_Ziad=E9?=) Date: Mon, 25 Jun 2012 18:53:27 +0200 Subject: [Distutils] distribute broken Python 3.3.3 In-Reply-To: References: <4FE80ACF.6030604@ziade.org> Message-ID: <4FE89787.6050300@ziade.org> On 6/25/12 12:57 PM, Jim Fulton wrote: > On Mon, Jun 25, 2012 at 2:53 AM, Tarek Ziadé wrote: >> On 6/25/12 1:11 AM, Jim Fulton wrote: >>> >>> https://bitbucket.org/tarek/distribute/issue/289/distribute-broken-with-python-33 >>> >>> I'm gonna try to work around it in buildout 2 by monkey-patching >>> distribute. >>> >>> Jim >>> >> If you write the fix, maybe you can do it against Distribute ? we can >> release it right after. > I'm not interested in fixing the scanning of pyc files, because: > > 1. buildout2 always unzips, because almost no one wants zipped eggs, and > > 2. Because the zip_safe setup flag is ignored by distribute. > > https://bitbucket.org/tarek/distribute/issue/279/zip_safe-ignored-by-easy_install-command > > 3. Life is short.
:) > > I would be willing to fix distribute to not scan if the > --always-unzip, -Z option is passed to it. > > My monkey-patch simply replaces setuptools.command.bdist_egg.can_scan > with lambda:False. > > My recommendation is to just stop scanning, and don't zip installed > eggs unless > the zip_safe flag is present and true (and the --always-unzip option > isn't used). I'd be > willing to help with this, but I probably couldn't get to it for a > couple of weeks at the earliest. > This is, of course, backward-incompatible, although I doubt anyone > would really care. That sounds like a good plan to me. > > Jim > > From dholth at gmail.com Mon Jun 25 20:13:36 2012 From: dholth at gmail.com (Daniel Holth) Date: Mon, 25 Jun 2012 14:13:36 -0400 Subject: [Distutils] complete 'extras' proposal Message-ID: The prose isn't PEP-compliant yet, but here is my whole proposal for adding extras to .dist-info-style / Metadata 1.2 distributions. An 'extra' is an optional feature that can augment a distribution's requirements. An extra name must be a valid Python identifier and is referenced using subscript notation, e.g. distribution[extra]. As a shortcut, multiple extras can be referenced as a comma-separated list distribution[extra1,extra2,...] which is expanded to "distribution distribution[extra1] distribution[extra2]". (It is not legal to require an extra without also requiring the providing distribution). The name 'extra' is added to the list of variables in PKG-INFO environment markers. During marker execution 'extra' is either 'None' or the name of the extra that is being installed. (The markers must be evaluated once for each extra that is being installed.) Extras are declared with the new Provides-Extra: extra field, repeated for each extra that a distribution provides.
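The per-extra marker evaluation described in the proposal above can be sketched in a few lines of Python. This is only an illustration: the `requires_dist` pairs and the `requirements_for` helper are invented for this example, and a real implementation would use a proper marker parser (such as the 'markerlib' mentioned elsewhere in this thread) rather than `eval`.

```python
# Hypothetical sketch: compute the active requirements for one extra.
# Each entry pairs a requirement with an optional environment marker,
# mirroring PKG-INFO lines like "Requires-Dist: nose; extra == 'test'".
requires_dist = [
    ("nose", "extra == 'test'"),   # only pulled in by the 'test' extra
    ("lxml", None),                # unconditional requirement
]

def requirements_for(extra):
    """Return the requirements active when installing the given extra.

    extra=None means the base distribution; per the proposal, markers
    are evaluated once for each extra being installed.
    """
    active = []
    for requirement, marker in requires_dist:
        if marker is None or eval(marker, {"__builtins__": {}}, {"extra": extra}):
            active.append(requirement)
    return active

print(requirements_for(None))    # ['lxml']
print(requirements_for("test"))  # ['nose', 'lxml']
```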
A distribution need not reference a declared extra in any Requires-Dist: field; this case will most often occur when a once-optional dependency becomes standard, with the extra kept so that other distributions requiring it remain compatible. The extra names 'test' and 'doc' are reserved for requirements that are needed for running automated tests or building documentation, respectively. Example lines from PKG-INFO: Requires-Dist: nose; extra == 'test' Provides-Extra: test From dholth at gmail.com Mon Jun 25 20:35:11 2012 From: dholth at gmail.com (Daniel Holth) Date: Mon, 25 Jun 2012 14:35:11 -0400 Subject: [Distutils] patch distutils to look into .dist-info directories In-Reply-To: References: Message-ID: OK I'll do that. Of course it also means the code won't work on Python versions from before email.parser was introduced. What is the floor on Python versions for distutils? Would it work to require Python 2.6 just for the .dist-info bits? From hanno at hannosch.eu Mon Jun 25 21:52:31 2012 From: hanno at hannosch.eu (Hanno Schlichting) Date: Mon, 25 Jun 2012 21:52:31 +0200 Subject: [Distutils] patch distutils to look into .dist-info directories In-Reply-To: References: Message-ID: On Mon, Jun 25, 2012 at 8:35 PM, Daniel Holth wrote: > OK I'll do that. Of course it also means the code won't work on Python > versions from before email.parser was introduced. > > What is the floor on Python versions for distutils? distribute and setuptools try to support Python >= 2.3 - though the earlier versions have likely seen less testing as of late. Not sure if you really meant distutils, as that is part of Python itself. > Would it work to > require Python 2.6 just for the .dist-info bits? That's not ideal in any way, but I'd say practicality beats purity here. So if it's much simpler to write this code for Python 2.6, go for it.
Hanno From dholth at gmail.com Mon Jun 25 22:12:47 2012 From: dholth at gmail.com (Daniel Holth) Date: Mon, 25 Jun 2012 16:12:47 -0400 Subject: [Distutils] patch distutils to look into .dist-info directories In-Reply-To: References: Message-ID: > Not sure if you really meant distutils, as that is part of Python itself. Of course I meant setuptools / distribute From bkabrda at redhat.com Tue Jun 26 08:56:13 2012 From: bkabrda at redhat.com (Bohuslav Kabrda) Date: Tue, 26 Jun 2012 02:56:13 -0400 (EDT) Subject: [Distutils] Self Introduction and Getting Hands On In-Reply-To: Message-ID: <3b2aaf04-5f6d-4cc3-86b4-64234b50610c@zmail15.collab.prod.int.phx2.redhat.com> Hi all, I recently came upon Tarek's blogpost [1] about converting Python packages to rpm specfiles. I have pretty good knowledge of this, as I am a Fedora package maintainer and Python is one of my main responsibilities. I have always wanted to become a Python developer and becoming part of distutils-sig and trying to give a helping hand with RPM related stuff is probably a good entry point for me :) To introduce myself, I work at Red Hat as a package maintainer for dynamic languages, I am author of the new Fedora-used pyp2rpm [2], [3], which tries to do exactly what Py2RPM is supposed to be. (Well, it's not perfect, but it can do 90 % of the work, usually. So far, it doesn't work with distutils2, but I'll probably be working on that when Python 3.3 gets released.) I also have some minor not-yet-released projects in Python, I work with Django and I'm just a Python enthusiast. In Fedora, I am a maintainer/comaintainer of both Python and Python3, Django and another 15+ (and growing) set of packages. Thanks for reading this through :) I'm looking forward to working with - and learning from - all of you. -- Regards, Bohuslav "Slavek" Kabrda. 
[1] http://ziade.org/2011/03/25/bdist_rpm-is-dead-long-life-to-py2rpm/ [2] http://pypi.python.org/pypi/pyp2rpm [3] https://bitbucket.org/bkabrda/pyp2rpm From tarek at ziade.org Tue Jun 26 10:16:26 2012 From: tarek at ziade.org (=?ISO-8859-1?Q?Tarek_Ziad=E9?=) Date: Tue, 26 Jun 2012 10:16:26 +0200 Subject: [Distutils] Self Introduction and Getting Hands On In-Reply-To: <3b2aaf04-5f6d-4cc3-86b4-64234b50610c@zmail15.collab.prod.int.phx2.redhat.com> References: <3b2aaf04-5f6d-4cc3-86b4-64234b50610c@zmail15.collab.prod.int.phx2.redhat.com> Message-ID: <4FE96FDA.8030206@ziade.org> On 6/26/12 8:56 AM, Bohuslav Kabrda wrote: > Hi all, > I recently came upon Tarek's blogpost [1] about converting Python packages to rpm specfiles. I have pretty good knowledge of this, as I am a Fedora package maintainer and Python is one of my main responsibilities. I have always wanted to become a Python developer and becoming part of distutils-sig and trying to give a helping hand with RPM related stuff is probably a good entry point for me :) > > To introduce myself, I work at Red Hat as a package maintainer for dynamic languages, I am author of the new Fedora-used pyp2rpm [2], [3], which tries to do exactly what Py2RPM is supposed to be. (Well, it's not perfect, but it can do 90 % of the work, usually. So far, it doesn't work with distutils2, but I'll probably be working on that when Python 3.3 gets released.) I also have some minor not-yet-released projects in Python, I work with Django and I'm just a Python enthusiast. In Fedora, I am a maintainer/comaintainer of both Python and Python3, Django and another 15+ (and growing) set of packages. > > Thanks for reading this through :) I'm looking forward to working with - and learning from - all of you. > Welcome ! 
As I stated in the blog post, I think a separate project for all RPM-related stuff would make a lot of sense FWIW I have started a "pypi2rpm" project that creates RPM out of PyPI projects, which we use at Mozilla to deploy our apps, http://pypi.python.org/pypi/pypi2rpm It's just a glue script on top of an isolated bdist_rpm command and distutils2.version (for sorting versions) but you can also pass your spec file for a given project and have Fedora/RHEL specific options (like adding python26- prefixes etc) It sounds like your project is on the other side, e.g. generating spec files out of Python projects, so I'd be really interested to see how it may replace bdist_rpm in pypi2rpm Cheers Tarek From bkabrda at redhat.com Tue Jun 26 13:23:32 2012 From: bkabrda at redhat.com (Bohuslav Kabrda) Date: Tue, 26 Jun 2012 07:23:32 -0400 (EDT) Subject: [Distutils] Self Introduction and Getting Hands On In-Reply-To: <4FE96FDA.8030206@ziade.org> Message-ID: <6fa805f3-dd91-42c4-8b1d-92bb721b6352@zmail15.collab.prod.int.phx2.redhat.com> ----- Original Message ----- > On 6/26/12 8:56 AM, Bohuslav Kabrda wrote: > > Hi all, > > I recently came upon Tarek's blogpost [1] about converting Python > > packages to rpm specfiles. I have pretty good knowledge of this, > > as I am a Fedora package maintainer and Python is one of my main > > responsibilities. I have always wanted to become a Python > > developer and becoming part of distutils-sig and trying to give a > > helping hand with RPM related stuff is probably a good entry point > > for me :) > > > > To introduce myself, I work at Red Hat as a package maintainer for > > dynamic languages, I am author of the new Fedora-used pyp2rpm [2], > > [3], which tries to do exactly what Py2RPM is supposed to be. > > (Well, it's not perfect, but it can do 90 % of the work, usually. > > So far, it doesn't work with distutils2, but I'll probably be > > working on that when Python 3.3 gets released.)
I also have some > > minor not-yet-released projects in Python, I work with Django and > > I'm just a Python enthusiast. In Fedora, I am a > > maintainer/comaintainer of both Python and Python3, Django and > > another 15+ (and growing) set of packages. > > > > Thanks for reading this through :) I'm looking forward to working > > with - and learning from - all of you. > > > Welcome ! > > As I stated in the blog post, I think a separate project for all RPM > -related stuff would make a lot of sense > I agree. RPM (and distro-packaging in general) things have different lifecycle, so it may be nice to be able to reflect that. > FWIW I have started a "pypi2rpm" project that creates RPM out of PyPI > projects, we use at Mozilla to deploy our apps, > http://pypi.python.org/pypi/pypi2rpm > It's just a glue script on the top of an isolated bdist_rpm command > and > distutils2.version (for sorting versions) but you can also pass your > spec file for a given > project and have Fedora/RHEL specific options (like adding python26- > prefixes etc) > > It sounds like your project is on the other side, e.g. generating > spec > files out of Python projects, so I'd be really interested to see how > it > may replace bdist_rpm in pypi2rpm > Exactly, my project is aimed at creating just the specfiles. In Fedora (and other distros as well), there is a review process that each package must go through and having a specfile written according to the current guidelines is a necessary part of that. "pypi2rpm" solves a different use case than distro-packagers need. I have a pretty good idea about how distro-packagers would like things to work and combining it with the "pypi2rpm", I see it this way: 1) Starting point: we have a Python package. 
2) We want to: 2a) directly convert it to binary RPM => we use "pypi2rpm" - good for Mozilla to deploy apps, not useful for distro-packagers - Producing binary RPM probably could be left in distutils, but looking at the bigger picture, RPM has different lifecycle than python standard library, so it might be good to pull this functionality out. 2b) produce a specfile/whatever that is called in other distributions => use Fedora's "pyp2rpm"/is there something similar for other distributions? - This should definitely be present in a library outside of Python - we may want to provide templates/distro-specific stuff here and be able to react to new releases of different distros (when they change their way of packaging). So the best approach that currently comes to my mind is having a functionality in distutils, that can provide the package metadata (license, runtime/build requirements) and separate tool(s) that would be able to use these metadata to a) convert directly to binary packages (rpm, deb, etc.) or b) produce specfiles/whatever for the packagers' reviews Some questions that pop immediately are - Should a) and b) be one library or two? They handle packages in completely different way. - Should every distribution provide its very own library to handle a) and b) stuff or should these be a general libraries with distro-specific modules+templates? ... How does this sound? This is just a general idea that came to my mind some time ago - feel free to dump it, if you don't like it :) Have a nice day, Slavek. > Cheers > Tarek > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > http://mail.python.org/mailman/listinfo/distutils-sig > -- Regards, Bohuslav "Slavek" Kabrda. 
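Path 2b above — turning extracted metadata into a reviewable specfile — can be sketched roughly as below. The template and the metadata dict are a minimal invention for illustration; a guideline-compliant Fedora specfile needs much more (%prep/%build/%install sections, a changelog, proper macros), which is exactly the part tools like pyp2rpm would add.

```python
# Hypothetical sketch of "metadata in, draft specfile out".
SPEC_TEMPLATE = """\
Name:           python-%(name)s
Version:        %(version)s
Release:        1%%{?dist}
Summary:        %(summary)s
License:        %(license)s
URL:            %(url)s
Source0:        http://pypi.python.org/packages/source/%(initial)s/%(name)s/%(name)s-%(version)s.tar.gz
BuildRequires:  python-devel

%%description
%(description)s
"""

def render_spec(metadata):
    # 'initial' is the first letter of the project name, as used in
    # old-style PyPI source URLs.
    data = dict(metadata, initial=metadata["name"][0])
    return SPEC_TEMPLATE % data

draft = render_spec({
    "name": "example",
    "version": "1.0",
    "summary": "An example distribution",
    "license": "MIT",
    "url": "http://example.com",
    "description": "Longer text for the description section.",
})
print(draft)
```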
From tarek at ziade.org Tue Jun 26 13:44:48 2012 From: tarek at ziade.org (=?UTF-8?B?VGFyZWsgWmlhZMOp?=) Date: Tue, 26 Jun 2012 13:44:48 +0200 Subject: [Distutils] Self Introduction and Getting Hands On In-Reply-To: <6fa805f3-dd91-42c4-8b1d-92bb721b6352@zmail15.collab.prod.int.phx2.redhat.com> References: <6fa805f3-dd91-42c4-8b1d-92bb721b6352@zmail15.collab.prod.int.phx2.redhat.com> Message-ID: <4FE9A0B0.5040505@ziade.org> On 6/26/12 1:23 PM, Bohuslav Kabrda wrote: > [...] > Exactly, my project is aimed at creating just the specfiles. In Fedora (and other distros as well), there is a review process that each package must go through and having a specfile written according to the current guidelines is a necessary part of that. "pypi2rpm" solves a different use case than distro-packagers need. > > I have a pretty good idea about how distro-packagers would like things to work and combining it with the "pypi2rpm", I see it this way: > 1) Starting point: we have a Python package. > 2) We want to: > 2a) directly convert it to binary RPM => we use "pypi2rpm" - good for Mozilla to deploy apps, not useful for distro-packagers > - Producing binary RPM probably could be left in distutils, but looking at the bigger picture, RPM has different lifecycle than python standard library, so it might be good to pull this functionality out. Yeah -- I had no intent to keep bdist_rpm in the stdlib. We removed it from distutils2 > 2b) produce a specfile/whatever that is called in other distributions => use Fedora's "pyp2rpm"/is there something similar for other distributions? You mean for RPM that are not Fedora/RHEL ? I have no idea. > - This should definitely be present in a library outside of Python - we may want to provide templates/distro-specific stuff here and be able to react to new releases of different distros (when they change their way of packaging).
> > So the best approach that currently comes to my mind is having a functionality in distutils, that can provide the package metadata (license, runtime/build requirements) you can already have this, with the current Distutils code. > and separate tool(s) that would be able to use these metadata to > a) convert directly to binary packages (rpm, deb, etc.) > or > b) produce specfiles/whatever for the packagers' reviews > > Some questions that pop immediately are > - Should a) and b) be one library or two? They handle packages in completely different ways. In a different project I think. For debian there's 'stdeb' - http://pypi.python.org/pypi/stdeb > - Should every distribution provide its very own library to handle a) and b) stuff or should these be general libraries with distro-specific modules+templates? > ... As a user I want to pick up *any* project, and build a deb or a rpm, even if the author did not provide the deb/rpm integration, even if I eventually need to tweak the process. > > How does this sound? This is just a general idea that came to my mind some time ago - feel free to dump it, if you don't like it :) So what are your plans ? > Have a nice day, > Slavek. > >> Cheers >> Tarek >> _______________________________________________ >> Distutils-SIG maillist - Distutils-SIG at python.org >> http://mail.python.org/mailman/listinfo/distutils-sig >> From bkabrda at redhat.com Tue Jun 26 14:02:24 2012 From: bkabrda at redhat.com (Bohuslav Kabrda) Date: Tue, 26 Jun 2012 08:02:24 -0400 (EDT) Subject: [Distutils] Self Introduction and Getting Hands On In-Reply-To: <4FE9A0B0.5040505@ziade.org> Message-ID: <1ba18549-4b60-4d0c-8af1-19fc8bdec3b8@zmail15.collab.prod.int.phx2.redhat.com> ----- Original Message ----- > On 6/26/12 1:23 PM, Bohuslav Kabrda wrote: > > [...] > > Exactly, my project is aimed at creating just the specfiles.
In > > Fedora (and other distros as well), there is a review process that > > each package must go through and having a specfile written > > according to the current guidelines is a necessary part of that. > > "pypi2rpm" solves a different use case than distro-packagers need. > > > > I have a pretty good idea about how distro-packagers would like > > things to work and combining it with the "pypi2rpm", I see it this > > way: > > 1) Starting point: we have a Python package. > > 2) We want to: > > 2a) directly convert it to binary RPM => we use "pypi2rpm" - good > > for Mozilla to deploy apps, not useful for distro-packagers > > - Producing binary RPM probably could be left in distutils, but > > looking at the bigger picture, RPM has different lifecycle than > > python standard library, so it might be good to pull this > > functionality out. > Yeah -- I had no intent to keep bdist_rpm in the stdlib. We removed > it > from distiutils2 > Ah, I see. I didn't have that much time to get familiar with distutils2 yet - I only know them from the documentation :) > > 2b) produce a specfile/whatever that is called in other > > distributions => use Fedora's "pyp2rpm"/is there something similar > > for other distributions? > You mean for RPM that are not Fedora/RHEL ? I have no idea. > > - This should definitely be present in a library outside of > > Python - we may want to provide templates/distro-specific stuff > > here and be able to react to new releases of different distros > > (when they change their way of packaging). > > > > So the best approach that currently comes to my mind is having a > > functionality in distutils, that can provide the package metadata > > (license, runtime/build requirements) > you can already have this, with the current Distutils code. > True. What I meant by this is, that providing the metadata is the thing that distutils should do. In my opinion, they shouldn't do anything more (no distro-specific stuff). 
> > and separate tool(s) that would be able to use these metadata to > > a) convert directly to binary packages (rpm, deb, etc.) > > or > > b) produce specfiles/whatever for the packagers' reviews > > > > Some questions that pop immediately are > > - Should a) and b) be one library or two? They handle packages in > > completely different ways. > In a different project I think. For debian there's 'stdeb' - > http://pypi.python.org/pypi/stdeb > > > > - Should every distribution provide its very own library to handle > > a) and b) stuff or should these be general libraries with > > distro-specific modules+templates? > > ... > > As a user I want to pick up *any* project, and build a deb or a rpm, > even if the author did not provide the deb/rpm integration, even if I > eventually need to tweak the process. > What I meant here is - should there be one big tool producing the specfiles/debian stuff/... or one project for RPM specfiles, one project for debian stuff... > > > > How does this sound? This is just a general idea that came to my > > mind some time ago - feel free to dump it, if you don't like it :) > So what are your plans ? > Since I'm new here, I first want to watch and listen and contribute to the discussions. I would certainly like to discuss the distro-specific stuff in more detail and participate in a solution. That means first create a clear vision with more distutils people here and then actually take my part implementing it. It would certainly be nice to create something flexible that could do both the specfiles and binary packages for more distributions - for me, that's the goal here (and of course, Python rocks). > > Have a nice day, > > Slavek. > > > >> Cheers > >> Tarek > >> _______________________________________________ > >> Distutils-SIG maillist - Distutils-SIG at python.org > >> http://mail.python.org/mailman/listinfo/distutils-sig > >> > > > -- Regards, Bohuslav "Slavek" Kabrda.
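Tarek's remark above that "you can already have this, with the current Distutils code" can be illustrated with the stock `distutils.core.run_setup()` helper, which parses a setup.py and hands back its metadata without running any build command. The tiny project below is made up for the example (distutils was still in the stdlib at the time of this thread; on recent Pythons it comes from setuptools):

```python
import os
import tempfile
from distutils.core import run_setup

# Write a throwaway setup.py for a made-up project.
tmp = tempfile.mkdtemp()
script = os.path.join(tmp, "setup.py")
with open(script, "w") as f:
    f.write(
        "from distutils.core import setup\n"
        "setup(name='example', version='1.0', license='MIT',\n"
        "      requires=['docutils'])\n"
    )

# stop_after='init' parses the script but runs no commands.
dist = run_setup(script, stop_after="init")
print(dist.get_name())               # example
print(dist.get_version())            # 1.0
print(dist.metadata.get_license())   # MIT
print(dist.metadata.get_requires())  # ['docutils']
```

A distro-side tool could feed exactly these values into its spec or control-file templates.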
From dholth at gmail.com Tue Jun 26 15:51:15 2012 From: dholth at gmail.com (Daniel Holth) Date: Tue, 26 Jun 2012 09:51:15 -0400 Subject: [Distutils] patch distutils to look into .dist-info directories In-Reply-To: References: Message-ID: My markers implementation uses 'ast' from Python 2.6, so it would be a pain to take it further back than Python 2.5. I think it's probably a safe bet that Python 2.3 users do not want any .dist-info directories in their sys.path. (https://bitbucket.org/dholth/wheel/src/b400adaad0fe/wheel/markers.py) From benoit at marmelune.net Wed Jun 27 10:36:40 2012 From: benoit at marmelune.net (=?ISO-8859-1?Q?Beno=EEt_Bryon?=) Date: Wed, 27 Jun 2012 10:36:40 +0200 Subject: [Distutils] conventions or best practice to choose package names? In-Reply-To: <4FE0F3F9.9080706@marmelune.net> References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> Message-ID: <4FEAC618.7090004@marmelune.net> Hello, The proposal has been posted to peps at python.org : http://hg.python.org/peps/file/52767ab7e140/pep-0423.txt I am about to post it to python-dev at python.org for discussion and review. In order to complete the move, I plan to replace the document by references to PEP 423 in packaging section of cpython documentation. See also http://bugs.python.org/issue14899 Benoit Le 19/06/2012 23:49, Benoît Bryon a écrit : > Hi, > > Le 06/06/2012 10:52, Benoît Bryon a écrit : >> Le 31/05/2012 18:00, PJ Eby a écrit : >>> You've definitely put in a lot of work on this, and most of the >>> actual guidelines in your PEP are quite good. I think there's a >>> core part of this that can and should be a good Informational-track >>> PEP. >> >> If it actually becomes a draft PEP, I guess we should: >> >> * move the document in PEPs' repository >> * add notes in Python packaging documentation to reference the PEP.
> > Updated > https://bitbucket.org/benoitbryon/cpython/src/doc-package-names/Doc/packaging/packagenames.rst > Prepared migration to a PEP. > I am to submit the document for review at peps at python.org. Feedback is > welcome! > > > Le 06/06/2012 10:52, Benoît Bryon a écrit : >> Also, I propose to shorten or move the "terminology" section in >> another document (another pep or in packaging documentation): >> >> * it would simplify the document >> * we'd better share the terminology between several documents. > > Moved terminology section to packaging/introduction.rst. > See > https://bitbucket.org/benoitbryon/cpython/changeset/5eca478e6804#chg-Doc/packaging/introduction.rst > > Benoit > _______________________________________________ > Distutils-SIG maillist - Distutils-SIG at python.org > http://mail.python.org/mailman/listinfo/distutils-sig From ozarow at gmail.com Thu Jun 28 14:15:11 2012 From: ozarow at gmail.com (Piotr Ozarowski) Date: Thu, 28 Jun 2012 14:15:11 +0200 Subject: [Distutils] conventions or best practice to choose package names? In-Reply-To: <4FEAC618.7090004@marmelune.net> References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> <4FEAC618.7090004@marmelune.net> Message-ID: <20120628121511.GR5600@piotro.eu> [Benoît Bryon, 2012-06-27] > http://hg.python.org/peps/file/52767ab7e140/pep-0423.txt I think PEP 386 (about versions) should be mentioned in "Relationship with other PEPs" section. About "Use a single name" section: sometimes one needs to add "python-" or "py" prefix to project name (f.e. if the name is already used in other programming languages and you cannot register it in your favourite forge hosting service)... but please don't add the prefix also in namespace. All these PyQts, pyudevs, pygames and pyopencls sound stupid to me (isn't it obvious that when I do `import foo` in Python, "foo" is a *Python* library?)
About ".contrib.": flask uses flask.ext.foo (namespace) and Flask-Foo (project name) schema for Flask extensions - maybe it's worth mentioning in this PEP as well. -- Piotr O?arowski Debian GNU/Linux Developer www.ozarowski.pl www.griffith.cc www.debian.org GPG Fingerprint: 1D2F A898 58DA AF62 1786 2DF7 AEF6 F1A2 A745 7645 From dholth at gmail.com Fri Jun 29 10:29:40 2012 From: dholth at gmail.com (Daniel Holth) Date: Fri, 29 Jun 2012 04:29:40 -0400 Subject: [Distutils] patch distutils to look into .dist-info directories In-Reply-To: References: Message-ID: Got it. https://bitbucket.org/dholth/distribute now includes the lazy email.Parser() import and lazily imports 'markerlib' to implement environment markers. BW compat should be easy as long as you don't use .dist-info directories in your Python 2.3. It implements the Provides-Extra: proposal to compute dependencies for extras, important if you are converting existing packages to the 1.2 version of the metadata. Hopefully this is enough to convert a bunch of existing packages over to .dist-info and Metadata 1.2 to see how it goes, without having to affect setuptools users that much. From benoit at marmelune.net Fri Jun 29 09:42:35 2012 From: benoit at marmelune.net (=?UTF-8?B?QmVub8OudCBCcnlvbg==?=) Date: Fri, 29 Jun 2012 09:42:35 +0200 Subject: [Distutils] conventions or best practice to choose package names? In-Reply-To: <20120628121511.GR5600@piotro.eu> References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> <4FEAC618.7090004@marmelune.net> <20120628121511.GR5600@piotro.eu> Message-ID: <4FED5C6B.4070200@marmelune.net> Le 28/06/2012 14:15, Piotr Ozarowski a ?crit : > [Beno?t Bryon, 2012-06-27] >> http://hg.python.org/peps/file/52767ab7e140/pep-0423.txt > I think PEP 386 (about versions) should be mentioned in "Relationship > with other PEPs" section. Why? 
Right now, I can't see the relationship between project/package/module names and versions. > About "Use a single name" section: sometimes one needs to add "python-" > or "py" prefix to project name (f.e. if the name is already used in > other programming languages and you cannot register it in your favourite > forge hosting service)... but please don't add the prefix also in namespace. > All these PyQts, pyudevs, pygames and pyopencls sound stupid to me (isn't > it obvious that when I do `import foo` in Python, "foo" is a *Python* > library?) Do you mean: * "pip install pyqt" => yes, * "import pyqt" => no, * "import qt" => yes? Currently, if the project name is "pyqt", then the PEP proposal recommends "pip install pyqt" and "import pyqt", to make the name consistent. If one is not a good choice, change both. It is convention over configuration. IMHO, a "py" prefix for the package name would not be a problem in this case. You are not importing Qt itself, but Python bindings for Qt. I mean, yes, it is obvious that when I do `import foo` in Python, then "foo" is a *Python* library. But does the reverse hold: if I see "import qt" in Python code, is it obvious that Qt itself is a Python library? The name could have been "qtbindings"; would it have been a problem then? The "py" prefix has the advantage of being clear: these are *Python* bindings for Qt. And it's a really short prefix. > About ".contrib.": flask uses flask.ext.foo (namespace) and Flask-Foo > (project name) schema for Flask extensions - maybe it's worth mentioning > in this PEP as well. Maybe it will be used in discussions or in examples, but I suppose the PEP is not the place to list all project-specific conventions. Currently the draft PEP says: "search for specific conventions in the main project's (Flask) documentation".
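(The "use a single name" rule being debated can be stated as a one-line check. A hypothetical helper for illustration only, not part of the draft PEP:)

```python
def follows_single_name_rule(project_name, top_level_package):
    """Sketch of the draft PEP's "use a single name" rule: the name you
    register and install should equal the name you import (compared
    case-insensitively, since PyPI treats project names case-insensitively)."""
    return project_name.lower() == top_level_package.lower()

print(follows_single_name_rule("pyqt", "pyqt"))  # True: consistent
print(follows_single_name_rule("PyQt", "qt"))    # False: the case discussed above
```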
Benoit From ozarow at gmail.com Fri Jun 29 12:50:55 2012 From: ozarow at gmail.com (Piotr Ozarowski) Date: Fri, 29 Jun 2012 12:50:55 +0200 Subject: [Distutils] conventions or best practice to choose package names? In-Reply-To: <4FED5C6B.4070200@marmelune.net> References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> <4FEAC618.7090004@marmelune.net> <20120628121511.GR5600@piotro.eu> <4FED5C6B.4070200@marmelune.net> Message-ID: <20120629105055.GS5600@piotro.eu> [Benoît Bryon, 2012-06-29] > On 28/06/2012 14:15, Piotr Ozarowski wrote: >> [Benoît Bryon, 2012-06-27] >>> http://hg.python.org/peps/file/52767ab7e140/pep-0423.txt >> I think PEP 386 (about versions) should be mentioned in "Relationship >> with other PEPs" section. > Why? Right now, I can't see the relationship between > project/package/module names and versions. The PEP name is "Naming conventions and recipes related to packaging" and version numbers are related to packaging (and naming) for sure. How about PEP 396 (Module Version Numbers)? >> About "Use a single name" section: sometimes one needs to add "python-" >> or "py" prefix to project name (f.e. if the name is already used in >> other programming languages and you cannot register it in your favourite >> forge hosting service)... but please don't add the prefix also in namespace. >> All these PyQts, pyudevs, pygames and pyopencls sound stupid to me (isn't >> it obvious that when I do `import foo` in Python, "foo" is a *Python* >> library?) > Do you mean: > > * "pip install pyqt" => yes, > * "import pyqt" => no, > * "import qt" => yes? right > Currently, if the project name is "pyqt", then the PEP > proposal recommends "pip install pyqt" and "import pyqt", > to make the name consistent. If one is not a good choice, > change both. > It is convention over configuration.
that's my point, the PyQt guys cannot use Qt as the project name, but "qt" is a better name for this namespace, IMO > IMHO, a "py" prefix for the package name would not be a problem > in this case. You are not importing Qt itself, but Python > bindings for Qt. it's not "a problem", I just think the PEP should not recommend such prefixes > I mean, yes, it is obvious that when I do `import foo` in > Python, then "foo" is a *Python* library. But would the > reverse assertion be that if I see "import qt" in Python > code, then it is obvious that Qt itself is a Python library? There's a "readline" package in Python's stdlib; does it imply readline is something Python-related or Python-specific? > The name could have been "qtbindings"; would it have been a > problem then? The "py" prefix has the advantage of being clear: > these are *Python* bindings for Qt. And it's a really short > prefix. qtbindings, qtlib or other similar prefixes/suffixes should not be recommended, IMHO >> About ".contrib.": flask uses flask.ext.foo (namespace) and Flask-Foo >> (project name) schema for Flask extensions - maybe it's worth mentioning >> in this PEP as well. > Maybe it will be used in discussions or in examples, but > I suppose the PEP is not the place to list all project-specific > conventions. Currently the draft PEP tells: "search for > specific conventions in main project (Flask) documentation". I just thought "foo.ext.bar" is a nice convention worth mentioning (next to, not instead of, "contrib", which is more general than "ext") -- Piotr Ożarowski Debian GNU/Linux Developer www.ozarowski.pl www.griffith.cc www.debian.org GPG Fingerprint: 1D2F A898 58DA AF62 1786 2DF7 AEF6 F1A2 A745 7645 From benoit at marmelune.net Fri Jun 29 15:05:01 2012 From: benoit at marmelune.net (=?UTF-8?B?QmVub8OudCBCcnlvbg==?=) Date: Fri, 29 Jun 2012 15:05:01 +0200 Subject: [Distutils] conventions or best practice to choose package names?
In-Reply-To: <20120629105055.GS5600@piotro.eu> References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> <4FEAC618.7090004@marmelune.net> <20120628121511.GR5600@piotro.eu> <4FED5C6B.4070200@marmelune.net> <20120629105055.GS5600@piotro.eu> Message-ID: <4FEDA7FD.2090202@marmelune.net> On 29/06/2012 12:50, Piotr Ozarowski wrote: > [Benoît Bryon, 2012-06-29] >> On 28/06/2012 14:15, Piotr Ozarowski wrote: >>> [Benoît Bryon, 2012-06-27] >>>> http://hg.python.org/peps/file/52767ab7e140/pep-0423.txt >>> I think PEP 386 (about versions) should be mentioned in "Relationship >>> with other PEPs" section. >> Why? Right now, I can't see the relationship between >> project/package/module names and versions. > The PEP name is "Naming conventions and recipes related to packaging" > and version numbers are related to packaging (and naming) for sure. Ok. Maybe the title of the PEP is not clear. It is related to naming conventions only, and only in packaging (i.e. projects and distributed packages or modules, not variable, function or class names). Would "naming conventions related to packaging" be better? > How about PEP 396 (Module Version Numbers)? How are version numbers related to naming? >>> About "Use a single name" section: sometimes one needs to add "python-" >>> or "py" prefix to project name (f.e. if the name is already used in >>> other programming languages and you cannot register it in your favourite >>> forge hosting service)... but please don't add the prefix also in namespace. >>> All these PyQts, pyudevs, pygames and pyopencls sound stupid to me (isn't >>> it obvious that when I do `import foo` in Python, "foo" is a *Python* >>> library?) >> Do you mean: >> >> * "pip install pyqt" => yes, >> * "import pyqt" => no, >> * "import qt" => yes?
> right >> Currently, if the project name is "pyqt", then the PEP >> proposal recommends "pip install pyqt" and "import pyqt", >> to make the name consistent. If one is not a good choice, >> change both. >> It is convention over configuration. > that's my point, the PyQt guys cannot use Qt as the project name, but > "qt" is a better name for this namespace, IMO >> IMHO, a "py" prefix for the package name would not be a problem >> in this case. You are not importing Qt itself, but Python >> bindings for Qt. > it's not "a problem", I just think the PEP should not recommend such > prefixes By now, I feel that the "py" prefix use case is not special enough to reconsider or break the "use a single name" rule. I would accept "import pyqt" instead of "import qt". But that's my own viewpoint, and I understand you don't subscribe to it. Could someone else join the discussion and give their opinion on this point? >> I mean, yes, it is obvious that when I do `import foo` in >> Python, then "foo" is a *Python* library. But would the >> reverse assertion be that if I see "import qt" in Python >> code, then it is obvious that Qt itself is a Python library? > There's a "readline" package in Python's stdlib, does it imply readline > is something Python-related or Python-specific? You are right. My bad :) >> The name could have been "qtbindings"; would it have been a >> problem then? The "py" prefix has the advantage of being clear: >> these are *Python* bindings for Qt. And it's a really short >> prefix. > qtbindings, qtlib or other similar prefixes/suffixes should not be > recommended, IMHO >>> About ".contrib.": flask uses flask.ext.foo (namespace) and Flask-Foo >>> (project name) schema for Flask extensions - maybe it's worth mentioning >>> in this PEP as well. >> Maybe it will be used in discussions or in examples, but >> I suppose the PEP is not the place to list all project-specific >> conventions.
>> Currently the draft PEP tells: "search for >> specific conventions in main project (Flask) documentation". > I just thought "foo.ext.bar" is a nice convention worth mentioning > (next to, not instead of "contrib" which is more general than "ext") I agree that "foo.ext.bar" seems nice... but it uses 3 namespace levels. It breaks the "avoid deep nesting" rule, which recommends no more than two levels. So the PEP can't recommend this pattern, or we will also have to reconsider the "avoid deep nesting" rule. Notice that, from current feedback about the PEP, I understand that the "avoid deep nesting" rule sounds more valuable than the "standard pattern for contributions". About the naming pattern for contribs, the draft PEP proposes: * to use "foocontrib.bar" by default * if there is a need to separate "extensions" from "contributions" (supposing the difference is clearly documented), the project can tell its users to: * publish contributions in foocontrib.* * publish extensions in fooext.* That said, in that particular case, I would publish everything related to foo (except foo itself) in the same namespace (i.e. only foocontrib.*) and use metadata to categorize projects. As an example, with something like keywords="foo extension". So that: * there is only one namespace package * everything that is "related to foo" is under foocontrib.* * if the "contribution" vs "extension" terminology changes, there is no need to move projects from one namespace to another. * a project can be both an extension and a contribution. * it's possible to filter the list of packages at PyPI to get the list of extensions: name starts with "foocontrib." and has the keyword "extension". Benoit From dholth at gmail.com Fri Jun 29 16:34:14 2012 From: dholth at gmail.com (Daniel Holth) Date: Fri, 29 Jun 2012 10:34:14 -0400 Subject: [Distutils] conventions or best practice to choose package names?
In-Reply-To: <4FEDA7FD.2090202@marmelune.net> References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> <4FEAC618.7090004@marmelune.net> <20120628121511.GR5600@piotro.eu> <4FED5C6B.4070200@marmelune.net> <20120629105055.GS5600@piotro.eu> <4FEDA7FD.2090202@marmelune.net> Message-ID: <90DFD55F-D1F1-4BC9-84E9-92D384F28B84@gmail.com> May package names contain a hyphen (-)? From ozarow at gmail.com Fri Jun 29 16:51:22 2012 From: ozarow at gmail.com (Piotr Ozarowski) Date: Fri, 29 Jun 2012 16:51:22 +0200 Subject: [Distutils] conventions or best practice to choose package names? In-Reply-To: <4FEDA7FD.2090202@marmelune.net> References: <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> <4FEAC618.7090004@marmelune.net> <20120628121511.GR5600@piotro.eu> <4FED5C6B.4070200@marmelune.net> <20120629105055.GS5600@piotro.eu> <4FEDA7FD.2090202@marmelune.net> Message-ID: <20120629145121.GT5600@piotro.eu> [Benoît Bryon, 2012-06-29] > On 29/06/2012 12:50, Piotr Ozarowski wrote: >> [Benoît Bryon, 2012-06-29] >>> On 28/06/2012 14:15, Piotr Ozarowski wrote: >>>> [Benoît Bryon, 2012-06-27] >>>>> http://hg.python.org/peps/file/52767ab7e140/pep-0423.txt >>>> I think PEP 386 (about versions) should be mentioned in "Relationship >>>> with other PEPs" section. >>> Why? Right now, I can't see the relationship between >>> project/package/module names and versions. >> The PEP name is "Naming conventions and recipes related to packaging" >> and version numbers are related to packaging (and naming) for sure. > Ok. Maybe the title of the PEP is not clear. > It is related to naming conventions only, and only in > packaging (i.e. projects and distributed packages or > modules, not variable, function or class names). > > Would "naming conventions related to packaging" be better? >> How about PEP 396 (Module Version Numbers)?
> How are version numbers related to naming? OK, I admit I shamelessly wanted to promote these two PEPs in as many places as possible ;-) "The great thing about standards is that there are so many to choose from." and there are way too many Python versioning conventions (and places where these versions are stored) >> I just thought "foo.ext.bar" is a nice convention worth mentioning >> (next to, not instead of "contrib" which is more general than "ext") > I agree that "foo.ext.bar" seems nice... but it uses 3 > namespace levels. It breaks the "avoid deep nesting" never mind then, it was just a suggestion. I feel strongly about the py/python/lib/bindings/etc. prefix/suffix issue, though (but will not bother you about it anymore) -- Piotr Ożarowski Debian GNU/Linux Developer www.ozarowski.pl www.griffith.cc www.debian.org GPG Fingerprint: 1D2F A898 58DA AF62 1786 2DF7 AEF6 F1A2 A745 7645 From jim at zope.com Fri Jun 29 16:59:02 2012 From: jim at zope.com (Jim Fulton) Date: Fri, 29 Jun 2012 10:59:02 -0400 Subject: [Distutils] zc.buildoutsftp Message-ID: If you don't use zc.buildoutsftp on Windows, please ignore :) Does anyone use zc.buildoutsftp on Windows? There is some Windows support that I added several years ago, presumably because someone sent me a patch. I don't use zc.buildoutsftp on Windows myself. When I wrote zc.buildoutsftp, it had no automated tests, because I couldn't figure out a sane way to write automated tests for it. Recently, I've written mock-based tests, but these tests are unix-ish-specific. I know they'll fail on Windows. I can't say I really care. :) I'd rather have half-way decent tests that fail on Windows than no tests anywhere. I've endeavored to write the code in a platform-independent manner, but without tests, it's easy to screw up. If anyone is using Windows and is willing to help at least make sure the new revision works on Windows, or is willing to help refactor the tests to run on Windows, then I'm happy to work with them.
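(A low-cost way to keep unix-ish mock-based tests from failing outright on Windows, short of porting them, is to skip them by platform. An illustrative sketch only, not zc.buildoutsftp's actual tests; it assumes the Python 2.7-style unittest skip decorators, or the unittest2 backport, are available:)

```python
import sys
import unittest

class MockSFTPTests(unittest.TestCase):
    # Hypothetical test case standing in for the real mock-based tests.
    @unittest.skipIf(sys.platform == 'win32',
                     "mock setup is unix-ish; needs a Windows port")
    def test_posix_path_handling(self):
        # Placeholder assertion; a real test would exercise the SFTP mocks.
        self.assertEqual('/'.join(['remote', 'dir']), 'remote/dir')

# Run the case; on win32 the test is reported as skipped, not failed.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(MockSFTPTests))
```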
Jim -- Jim Fulton http://www.linkedin.com/in/jimfulton Jerky is better than bacon! http://zo.pe/Kqm From p.f.moore at gmail.com Fri Jun 29 17:59:52 2012 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 29 Jun 2012 16:59:52 +0100 Subject: [Distutils] conventions or best practice to choose package names? In-Reply-To: <4FEDA7FD.2090202@marmelune.net> References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> <4FEAC618.7090004@marmelune.net> <20120628121511.GR5600@piotro.eu> <4FED5C6B.4070200@marmelune.net> <20120629105055.GS5600@piotro.eu> <4FEDA7FD.2090202@marmelune.net> Message-ID: On 29 June 2012 14:05, Benoît Bryon wrote: > I agree that "foo.ext.bar" seems nice... but it uses 3 > namespace levels. It breaks the "avoid deep nesting" > rule, which recommends not more than two levels. > So the PEP can't recommend this pattern, or we will also have > to reconsider the "avoid deep nesting" rule. ??? I thought the PEP was about distribution naming (i.e., things published on PyPI). If the package "foo" is published under that name on PyPI, I see no reason it can't (good taste permitting) use multi-level names like foo.bar.baz internally. That's a completely different question from whether a distribution named foo.bar.baz should be available for download from PyPI (presumably alongside foo.bar.bip and foo.bop.boop...) Maybe I'm misunderstanding the conversation here. We need to be *extremely* precise about the difference between distributions (things published to somewhere like PyPI, which can be installed independently, have a version number, etc.) and packages (which are things you import in Python). I've been just as vague as everyone else in this thread, so I'm not trying to blame anyone. But things are definitely getting muddled. Paul.
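(Paul's distinction matches the names in PJ Eby's example from earlier in the thread: project "Importing", distribution file Importing-1.10.zip, importable package peak.util.imports. A naive sketch of pulling a project name and version back out of an sdist filename — illustrative only, not a real packaging API, and it ignores the many corner cases real tools must handle:)

```python
import re

def split_dist_filename(filename):
    """Naive sketch: split 'Name-1.0.zip' into (project name, version).
    The project name is a PyPI-level name, not an importable package."""
    stem = re.sub(r'\.(zip|tar\.gz|tgz|tar\.bz2)$', '', filename)
    project, _, version = stem.rpartition('-')
    return project, version

print(split_dist_filename('Importing-1.10.zip'))  # ('Importing', '1.10')
```

Note how little the result says about imports: installing Importing-1.10.zip gives you `peak.util.imports`, not `import Importing`.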
From p.f.moore at gmail.com Fri Jun 29 18:01:31 2012 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 29 Jun 2012 17:01:31 +0100 Subject: [Distutils] conventions or best practice to choose package names? In-Reply-To: <90DFD55F-D1F1-4BC9-84E9-92D384F28B84@gmail.com> References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> <4FEAC618.7090004@marmelune.net> <20120628121511.GR5600@piotro.eu> <4FED5C6B.4070200@marmelune.net> <20120629105055.GS5600@piotro.eu> <4FEDA7FD.2090202@marmelune.net> <90DFD55F-D1F1-4BC9-84E9-92D384F28B84@gmail.com> Message-ID: On 29 June 2012 15:34, Daniel Holth wrote: > May package names contain hyphen (-)? Package or distribution? Package: no, because it has to be a valid Python identifier. Distribution: I'm not sure - there are funny rules to allow for things like parsing dist-info directory names, or egg filenames, unambiguously. One of the packaging PEPs would probably answer this one. Paul. From benoit at marmelune.net Fri Jun 29 18:29:32 2012 From: benoit at marmelune.net (=?ISO-8859-1?Q?Beno=EEt_Bryon?=) Date: Fri, 29 Jun 2012 18:29:32 +0200 Subject: [Distutils] conventions or best practice to choose package names? In-Reply-To: References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> <4FEAC618.7090004@marmelune.net> <20120628121511.GR5600@piotro.eu> <4FED5C6B.4070200@marmelune.net> <20120629105055.GS5600@piotro.eu> <4FEDA7FD.2090202@marmelune.net> Message-ID: <4FEDD7EC.1070807@marmelune.net> On 29/06/2012 17:59, Paul Moore wrote: > On 29 June 2012 14:05, Benoît Bryon wrote: >> I agree that "foo.ext.bar" seems nice... but it uses 3 >> namespace levels. It breaks the "avoid deep nesting" >> rule, which recommends not more than two levels. >> So the PEP can't recommend this pattern, or we will also have >> to reconsider the "avoid deep nesting" rule. > ???
> I thought the PEP was about distribution naming (i.e., things > published on PyPI). It is about naming, limited to packaging, but not limited to things published on PyPI. > If the package "foo" is published under that name > on PyPI, I see no reason it can't (good taste permitting) use > multi-level names like foo.bar.baz internally. That's a completely > different question than whether a distribution named foo.bar.baz > should be available for download from PyPI (presumably alongside > foo.bar.bip and foo.bop.boop...) I understood the foo.ext.bar example as: * foo is a namespace package * foo.ext is a namespace package * the foo project invites users to publish contributions as foo.ext.* If foo is a project without namespace packages, then, yes, the PEP proposal doesn't cover its internal code organization. > Maybe I'm misunderstanding the conversation here. We need to be > *extremely* precise about the differences between distributions > (things published to somewhere like PyPI, which can be installed > independently, and have a version number, etc) and packages (which are > things you import in Python). I've been just as vague as everyone else > in this thread, so I'm not trying to blame anyone. But things are > definitely getting muddled. The terminology reference used in the PEP proposal is https://bitbucket.org/benoitbryon/cpython/src/doc-package-names/Doc/packaging/introduction.rst (which is http://docs.python.org/dev/packaging/introduction.html + some changes related to the PEP proposal). The PEP proposal deals with: * the "name" argument passed to packaging.core.setup() or setuptools.setup(), i.e. the project name * names of packages or modules being distributed, as a consequence of the "use a single name" rule Benoit From benoit at marmelune.net Fri Jun 29 18:36:53 2012 From: benoit at marmelune.net (=?ISO-8859-1?Q?Beno=EEt_Bryon?=) Date: Fri, 29 Jun 2012 18:36:53 +0200 Subject: [Distutils] conventions or best practice to choose package names?
In-Reply-To: References: <4FB02B36.8040504@marmelune.net> <4FC69AE4.10904@marmelune.net> <4FCF1A3F.1010604@marmelune.net> <4FE0F3F9.9080706@marmelune.net> <4FEAC618.7090004@marmelune.net> <20120628121511.GR5600@piotro.eu> <4FED5C6B.4070200@marmelune.net> <20120629105055.GS5600@piotro.eu> <4FEDA7FD.2090202@marmelune.net> <90DFD55F-D1F1-4BC9-84E9-92D384F28B84@gmail.com> Message-ID: <4FEDD9A5.3030700@marmelune.net> On 29/06/2012 18:01, Paul Moore wrote: > On 29 June 2012 15:34, Daniel Holth wrote: >> May package names contain hyphen (-)? > Package or distribution? Package, no because it has to be a valid > Python identifier. Distribution, I'm not sure - there are funny rules > to allow for things like parsing dist-info directory names, or egg > filenames, unambiguously. One of the packaging PEPs would probably > answer this one. > > Paul. In package/module names, as Paul said, you can't use hyphens. About project names: currently, the draft PEP recommends using the same value for the distributed module/package name and for the project name. The consequence is that PEP 8 applies to project names too, which means "no hyphens" in project names (and many other rules). The exception is the dot symbol, required to create namespace packages.
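(The rule can be checked mechanically: every dot-separated part of an importable name must be a valid Python identifier. A hypothetical helper for illustration; it ignores reserved words such as "class" for brevity:)

```python
import re

_IDENTIFIER = re.compile(r'^[A-Za-z_][A-Za-z0-9_]*$')

def importable_name(name):
    """True if each dot-separated part of `name` is a valid identifier
    (sketch; does not reject Python keywords)."""
    return all(_IDENTIFIER.match(part) for part in name.split('.'))

print(importable_name('zc.buildoutsftp'))  # True
print(importable_name('Flask-Foo'))        # False: fine as a project name,
                                           # not as an import name
```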
Benoit From chris at simplistix.co.uk Sat Jun 30 14:54:27 2012 From: chris at simplistix.co.uk (Chris Withers) Date: Sat, 30 Jun 2012 13:54:27 +0100 Subject: [Distutils] working with different requirements depending on python version Message-ID: <4FEEF703.2020001@simplistix.co.uk> Hi All, While trying to debug another problem, I re-ran some Jenkins jobs for two of my packages, which promptly failed under Python 2.5: http://jenkins.simplistix.co.uk/job/testfixtures-buildout/ http://jenkins.simplistix.co.uk/job/checker-buildout/ Now, the issue here is that zope.interface 4.0 and above no longer supports Python 2.5: http://jenkins.simplistix.co.uk/job/checker-buildout/PYTHON=2.5,label=linux/15/console http://pypi.python.org/pypi/zope.interface/4.0.0 That's fine, but I'd like the above projects to continue 2.5 support for the time being. The simple solution would be to add a versions section to the buildout here: https://github.com/Simplistix/checker/blob/master/buildout.cfg ...and pin the versions in the tox.ini here: https://github.com/Simplistix/checker/blob/master/tox.ini ...however, that doesn't give the correct result. The version requirement is "use the latest version of all packages that work with the Python version under test" and the changes I've suggested would mean that on Python 2.6 and 2.7, the latest version of zope.interface wouldn't be used. What makes this even more "interesting" is that zope.interface isn't a direct requirement of either of these projects, it's a dependency of a dependency in both cases. How have other people solved this? 
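(One way to express "latest that works on this interpreter", at least for direct requirements, is to compute the pins in setup.py from sys.version_info. The cap below is inferred from the report above that zope.interface 4.0 dropped Python 2.5; a sketch only, and note it does not reach a dependency of a dependency:)

```python
import sys

def install_requires_for(version_info=sys.version_info):
    """Sketch: compute version-dependent pins; in setup.py one would
    pass the result to setup(install_requires=...)."""
    if version_info < (2, 6):
        # zope.interface 4.0 and above no longer supports Python 2.5
        return ['zope.interface < 4.0']
    return ['zope.interface']

print(install_requires_for((2, 5, 0)))  # ['zope.interface < 4.0']
print(install_requires_for((2, 7, 3)))  # ['zope.interface']
```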
Chris -- Simplistix - Content Management, Batch Processing & Python Consulting - http://www.simplistix.co.uk From chris.jerdonek at gmail.com Sat Jun 30 16:49:59 2012 From: chris.jerdonek at gmail.com (Chris Jerdonek) Date: Sat, 30 Jun 2012 07:49:59 -0700 Subject: [Distutils] working with different requirements depending on python version In-Reply-To: <4FEEF703.2020001@simplistix.co.uk> References: <4FEEF703.2020001@simplistix.co.uk> Message-ID: On Sat, Jun 30, 2012 at 5:54 AM, Chris Withers wrote: > > The version requirement is "use the latest version of all packages that work with the Python version under test" and the changes I've suggested would mean that on Python 2.6 and 2.7, the latest version of zope.interface wouldn't be used. > > What makes this even more "interesting" is that zope.interface isn't a direct requirement of either of these projects, it's a dependency of a dependency in both cases. > > How have other people solved this? This might not be applicable because I haven't used Jenkins or buildout (but I do use tox). In cases where I've had different requirements for different Python versions, I've done the determination in setup.py. There I determined the 'install_requires' argument to setup() differently depending on the Python version. --Chris From jim at zope.com Sat Jun 30 23:13:31 2012 From: jim at zope.com (Jim Fulton) Date: Sat, 30 Jun 2012 17:13:31 -0400 Subject: [Distutils] distribute broken Python 3.3.3 In-Reply-To: <4FE89787.6050300@ziade.org> References: <4FE80ACF.6030604@ziade.org> <4FE89787.6050300@ziade.org> Message-ID: On Mon, Jun 25, 2012 at 12:53 PM, Tarek Ziadé wrote: > On 6/25/12 12:57 PM, Jim Fulton wrote: >> >> On Mon, Jun 25, 2012 at 2:53 AM, Tarek Ziadé wrote: >>> >>> On 6/25/12 1:11 AM, Jim Fulton wrote: >>>> >>>> >>>> >>>> https://bitbucket.org/tarek/distribute/issue/289/distribute-broken-with-python-33 >>>> >>>> I'm gonna try to work around it in buildout 2 by monkey-patching >>>> distribute.
>>>> Jim >>>> >>> If you write the fix, maybe you can do it against Distribute? We can >>> release it right after. >> >> I'm not interested in fixing the scanning of pyc files, because: >> >> 1. buildout2 always unzips, because almost no one wants zipped eggs, and >> >> 2. the zip_safe setup flag is ignored by distribute: >> >> https://bitbucket.org/tarek/distribute/issue/279/zip_safe-ignored-by-easy_install-command >> >> 3. Life is short. :) >> >> I would be willing to fix distribute to not scan if the >> --always-unzip, -Z option is passed to it. >> >> My monkey-patch simply replaces setuptools.command.bdist_egg.can_scan >> with lambda:False. >> >> My recommendation is to just stop scanning, and not zip installed >> eggs unless >> the zip_safe flag is present and true (and the --always-unzip option >> isn't used). I'd be >> willing to help with this, but I probably couldn't get to it for a >> couple of weeks at the earliest. >> This is, of course, backward-incompatible, although I doubt anyone >> would really care. > > > That sounds like a good plan to me. OK. So, getting started, I cloned the project and ran the tests. On Python 2.4-2.7 and 3.2, the test test_bad_url_double_scheme fails for me. The reason it fails is that it "scrapes" an exception looking for a string, and the scraping fails. Does distribute raise this exception? If so, couldn't it include some actual data in the exception that could be tested? I'm gonna ignore this failure. :/ Jim -- Jim Fulton http://www.linkedin.com/in/jimfulton Jerky is better than bacon! http://zo.pe/Kqm