From gward@cnri.reston.va.us Tue Jul 6 21:13:33 1999 From: gward@cnri.reston.va.us (Greg Ward) Date: Tue, 6 Jul 1999 16:13:33 -0400 Subject: [Distutils] Anyone home? In-Reply-To: <3777738E.DA38BB40@interet.com>; from James C. Ahlstrom on Mon, Jun 28, 1999 at 09:07:26AM -0400 References: <3777738E.DA38BB40@interet.com> Message-ID: <19990706161332.A17531@cnri.reston.va.us> On 28 June 1999, James C. Ahlstrom said: > Is this sig active? Is anyone out there? It's not exactly piping hot, but it's certainly not dead. A bit sleepy, perhaps. I regard topics such as freeze and the "how to install extensions in packages under a multi-architecture installation" debate as important, worth considering, and very much on topic in the Distutils SIG. But the main purpose of the SIG is the design and implementation of the Distutils themselves, which (due to my being endlessly distracted by other things) are still not mature enough that other people can really sit down and hack on them. The good news is, I finally dug myself out of the hole of other distractions and got back to coding the Distutils proper on the weekend. Should have something to show for it in a few days. Greg -- Greg Ward - software developer gward@cnri.reston.va.us Corporation for National Research Initiatives 1895 Preston White Drive voice: +1-703-620-8990 Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913 From gward@cnri.reston.va.us Sat Jul 10 03:31:33 1999 From: gward@cnri.reston.va.us (gward@cnri.reston.va.us) Date: Fri, 9 Jul 1999 22:31:33 -0400 Subject: [Distutils] Beginnings of a C/C++ compiler interface Message-ID: <19990709223132.A6291@cnri.reston.va.us> Hi all -- as promised earlier in the week, I have completed the beginning steps along the road to a 'build_ext' command that will work under Unix. The legions of eager developers who watch every movement on the distutils-checkins list will know that I've just added two modules, ccompiler and unixccompiler, which provide the classes CCompiler and UnixCCompiler. The basic idea is this: CCompiler defines the interface to a generic C/C++ compiler, and UnixCCompiler implements an interface to the traditional Unix "cc -Dmacro -Iincludedir -Umacro -c foo.c -o foo.o" compiler invocation (and -l/-L linker invocation). So far all it does is generate and print out command lines, but that's enough to convince me that it works on my Linux/gcc system, i.e. it generates the command lines I intended it to generate, and gets the right preprocessor/compiler/linker flags from Python's Makefile (so that the basic compile/link steps are essentially the same as would be done by a Makefile.pre.in-generated Makefile). Please, take a look at the code. To encourage this, you'll find the bulk of ccompiler.py below: it's mostly comments and docstrings, since after all it mostly exists to define an interface. It's crucial that this interface be capable of what we need to build Python extensions on Unix, Windows, and Mac OS, and I'm relying on you folks to tell me what needs to be added to support Windows and Mac OS. (Well, if I missed something for Unix, be sure to tell me about that too. That's less likely, though, and I'll have a clue what you're talking about when politely inform me of my errors.) In particular: is this interface sufficient to handle Windows .def files? Is it enough for the weird case of using Oracle's libraries that Greg Stein mentioned (or other libraries with hairy interdependencies)? What Mac C compiler is supported, and is there a way to drive it programmatically? (Ie. 
is this even *possible* on the Mac?) Oh, if you're looking for some example code: see test/test_cc.py. Gives whatever CCompiler class applies on your platform a run for its money. (Currently only works when os.name == 'posix', since so far only UnixCCompiler is implemented.) Anyways, here's that hunk of ccompiler.py: class CCompiler: """Abstract base class to define the interface that must be implemented by real compiler abstraction classes. Might have some use as a place for shared code, but it's not yet clear what code can be shared between compiler abstraction models for different platforms. The basic idea behind a compiler abstraction class is that each instance can be used for all the compile/link steps in building a single project. Thus, attributes common to all of those compile and link steps -- include directories, macros to define, libraries to link against, etc. -- are attributes of the compiler instance. To allow for variability in how individual files are treated, most (all?) of those attributes may be varied on a per-compilation or per-link basis.""" # XXX things not handled by this compiler abstraction model: # * client can't provide additional options for a compiler, # e.g. warning, optimization, debugging flags. Perhaps this # should be the domain of concrete compiler abstraction classes # (UnixCCompiler, MSVCCompiler, etc.) -- or perhaps the base # class should have methods for the common ones. # * can't put output files (object files, libraries, whatever) # into a separate directory from their inputs. Should this be # handled by an 'output_dir' attribute of the whole object, or a # parameter to the compile/link_* methods, or both? # * can't completely override the include or library searchg # path, ie. no "cc -I -Idir1 -Idir2" or "cc -L -Ldir1 -Ldir2". # I'm not sure how widely supported this is even by POSIX # compilers, much less on other platforms. And I'm even less # sure how useful it is; probably for cross-compiling, but I # have no intention of supporting that. # * can't do really freaky things with the library list/library # dirs, e.g. "-Ldir1 -lfoo -Ldir2 -lfoo" to link against # different versions of libfoo.a in different locations. I # think this is useless without the ability to null out the # library search path anyways. # * don't deal with verbose and dry-run flags -- probably a # CCompiler object should just drag them around the way the # Distribution object does (either that or we have to drag # around a Distribution object, which is what Command objects # do... but might be kind of annoying) [...] # -- Bookkeeping methods ------------------------------------------- def define_macro (self, name, value=None): """Define a preprocessor macro for all compilations driven by this compiler object. The optional parameter 'value' should be a string; if it is not supplied, then the macro will be defined without an explicit value and the exact outcome depends on the compiler used (XXX true? does ANSI say anything about this?)""" def undefine_macro (self, name): """Undefine a preprocessor macro for all compilations driven by this compiler object. If the same macro is defined by 'define_macro()' and undefined by 'undefine_macro()' the last call takes precedence (including multiple redefinitions or undefinitions). If the macro is redefined/undefined on a per-compilation basis (ie. in the call to 'compile()'), then that takes precedence.""" def add_include_dir (self, dir): """Add 'dir' to the list of directories that will be searched for header files. 
The compiler is instructed to search directories in the order in which they are supplied by successive calls to 'add_include_dir()'.""" def set_include_dirs (self, dirs): """Set the list of directories that will be searched to 'dirs' (a list of strings). Overrides any preceding calls to 'add_include_dir()'; subsequent calls to 'add_include_dir()' add to the list passed to 'set_include_dirs()'. This does not affect any list of standard include directories that the compiler may search by default.""" def add_library (self, libname): """Add 'libname' to the list of libraries that will be included in all links driven by this compiler object. Note that 'libname' should *not* be the name of a file containing a library, but the name of the library itself: the actual filename will be inferred by the linker, the compiler, or the compiler abstraction class (depending on the platform). The linker will be instructed to link against libraries in the order they were supplied to 'add_library()' and/or 'set_libraries()'. It is perfectly valid to duplicate library names; the linker will be instructed to link against libraries as many times as they are mentioned.""" def set_libraries (self, libnames): """Set the list of libraries to be included in all links driven by this compiler object to 'libnames' (a list of strings). This does not affect any standard system libraries that the linker may include by default.""" def add_library_dir (self, dir): """Add 'dir' to the list of directories that will be searched for libraries specified to 'add_library()' and 'set_libraries()'. The linker will be instructed to search for libraries in the order they are supplied to 'add_library_dir()' and/or 'set_library_dirs()'.""" def set_library_dirs (self, dirs): """Set the list of library search directories to 'dirs' (a list of strings). This does not affect any standard library search path that the linker may search by default.""" def add_link_object (self, object): """Add 'object' to the list of object files (or analogues, such as explicitly named library files or the output of "resource compilers") to be included in every link driven by this compiler object.""" def set_link_objects (self, objects): """Set the list of object files (or analogues) to be included in every link to 'objects'. This does not affect any standard object files that the linker may include by default (such as system libraries).""" # -- Worker methods ------------------------------------------------ # (must be implemented by subclasses) def compile (self, sources, macros=None, includes=None): """Compile one or more C/C++ source files. 'sources' must be a list of strings, each one the name of a C/C++ source file. Return a list of the object filenames generated (one for each source filename in 'sources'). 'macros', if given, must be a list of macro definitions. A macro definition is either a (name, value) 2-tuple or a (name,) 1-tuple. The former defines a macro; if the value is None, the macro is defined without an explicit value. The 1-tuple case undefines a macro. Later definitions/redefinitions/ undefinitions take precedence. 'includes', if given, must be a list of strings, the directories to add to the default include file search path for this compilation only.""" pass # XXX this is kind of useless without 'link_binary()' or # 'link_executable()' or something -- or maybe 'link_static_lib()' # should not exist at all, and we just have 'link_binary()'?
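    # Illustration only (not part of the posted ccompiler.py): a
    # hypothetical caller might drive this interface roughly like so --
    #
    #     cc = UnixCCompiler ()
    #     cc.define_macro ('NDEBUG')
    #     cc.add_include_dir ('/usr/local/include')
    #     cc.add_library ('tcl8.0')
    #     cc.add_library_dir ('/usr/local/lib')
    #     objects = cc.compile (['foomodule.c'])
    #     cc.link_shared_object (objects, 'foomodule.so')
    #
    # i.e. set the project-wide attributes once via the bookkeeping
    # methods, then call the worker methods for each compile/link step.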
def link_static_lib (self, objects, output_libname, libraries=None, library_dirs=None): """Link a bunch of stuff together to create a static library file. The "bunch of stuff" consists of the list of object files supplied as 'objects', the extra object files supplied to 'add_link_object()' and/or 'set_link_objects()', the libraries supplied to 'add_library()' and/or 'set_libraries()', and the libraries supplied as 'libraries' (if any). 'output_libname' should be a library name, not a filename; the filename will be inferred from the library name. 'library_dirs', if supplied, should be a list of additional directories to search on top of the system default and those supplied to 'add_library_dir()' and/or 'set_library_dirs()'.""" pass # XXX what's better/more consistent/more universally understood # terminology: "shared library" or "dynamic library"? def link_shared_lib (self, objects, output_libname, libraries=None, library_dirs=None): """Link a bunch of stuff together to create a shared library file. Has the same effect as 'link_static_lib()' except that the filename inferred from 'output_libname' will most likely be different, and the type of file generated will almost certainly be different.""" pass def link_shared_object (self, objects, output_filename, libraries=None, library_dirs=None): """Link a bunch of stuff together to create a shared object file. Much like 'link_shared_lib()', except the output filename is explicitly supplied as 'output_filename'.""" pass # class CCompiler Hope you enjoyed that as much as I did. ;-) Greg -- Greg Ward - software developer gward@cnri.reston.va.us Corporation for National Research Initiatives 1895 Preston White Drive voice: +1-703-620-8990 Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913 From mal@lemburg.com Sat Jul 10 10:09:53 1999 From: mal@lemburg.com (M.-A. Lemburg) Date: Sat, 10 Jul 1999 11:09:53 +0200 Subject: [Distutils] Beginnings of a C/C++ compiler interface References: <19990709223132.A6291@cnri.reston.va.us> Message-ID: <37870DE1.796987C5@lemburg.com> [ccompiler.py] Looks nice. Somethings that might be of use for the build process writer: · a way to test compiler features a la configure, e.g. a simple way to pass a small C program and evaluate the compiler error message (error or no error should suffice) · a way to test the availability of (shared) libs in much the same way, e.g. try to link an empty object file against a set of given libs to see if the linker finds the libs or not · a way to access the compiler/linker name and version Apart from this, I would also need a standard way to figure out the platform. os.uname() does help a bit, but it usually does not include e.g. the Linux distribution name or libc version. Is there a way to add APIs for these to the distutils ? [I need the distribution name to be able to preset paths to libs.] If these things are already included in the distutils please ignore this message: I'm only following this list every now and then... you're doing a great job, BTW :-) Cheers, -- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 174 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From guido@CNRI.Reston.VA.US Sat Jul 10 14:42:50 1999 From: guido@CNRI.Reston.VA.US (Guido van Rossum) Date: Sat, 10 Jul 1999 09:42:50 -0400 Subject: [Distutils] Re: Distutils-SIG digest, Vol 1 #79 - 1 msg In-Reply-To: Your message of "Sat, 10 Jul 1999 01:05:15 EDT." 
<199907100505.BAA27449@python.org> References: <199907100505.BAA27449@python.org> Message-ID: <199907101342.JAA04194@eric.cnri.reston.va.us> Greg, Only a few comments: 1) There's no command line interface on the Mac, but the compiler they use (Metrowerks) can be controlled through an external "event" interface (Apple's response to COM, except Apple had it long before COM even existed :-). So there's no reason why it couldn't be done on the Mac. 2) I'm not sure if you're trying to add a method for each of the typical cc options (I recognized -I, -D, -U, etc.). Looks like you're missing -R, which is like -L but works at runtime, and is needed if you're using shared libraries that live in non-standard places. 3) I strongly prefer shared lib over dynamic lib (on Windows everyone calls them "DLL" anyway). 4) Do you really not want to support cross-compiling? The Windows/ce folks will regret that ;-) But I don't think you need to support -I (without a directory) for that; typically the compiler has a different name and all is well. I bet -I without args is intended for people who want to experiment with a hacked set of system includes; they can add it to the compiler command name if they want to. 5) When creating a shared lib from libraries only, you need to specify an entry point. Is this useful to support? 6) I haven't been following distutils very closely, so forgive me if the above makes no sense :-) --Guido van Rossum (home page: http://www.python.org/~guido/) From gmcm@hypernet.com Sat Jul 10 17:28:01 1999 From: gmcm@hypernet.com (Gordon McMillan) Date: Sat, 10 Jul 1999 11:28:01 -0500 Subject: [Distutils] Re: Add multiple frozen modules Message-ID: <1280502155-85895598@hypernet.com> (Doing my bi-monthly perusal of Distutils activity, I find...) [Mark] > .... Specifically, Greg Stein and Gordon McMillan (and plenty of > people before them :-) seem to have what is considered "state of the > art" in where this is heading. [Jim Ahlstrom] I don't know of techniques which aren't Windows specific. Greg?? Gordon?? Mark is referring to Greg's imputil.py (http://www.lyra.org/greg/small/) and some of the places I've taken it. My Win32-specific installer (ftp://ftp.python.org/pub/python/contrib/System/Installer_r_01.exe) makes use of this, but not all of it is Windows specific. (I originally packaged it so the cross-platform stuff was available separately, but there was no apparent interest from non-Windows users.) Greg's imputil.py basically makes it possible to create a chain of importers, with the standard mechanism pushed to the end. Writing an importer is easy. You set up the chain in site.py. I created a way of building archives of .pyc's (or .pyo's, though I've never worked with them). These are compressed with zlib. The standard library fits in about 500K and (subjectively) it is no slower and perhaps faster than the regular method. The mechanism handles modules and packages, and the building of an archive can be done in all kinds of ways (including using modulefinder from freeze). I also created another kind of archive that can package up arbitrary stuff, and is fairly easily unpacked from C code. All of the above is cross-platform.
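(As a rough illustration of the archive idea -- this is not Gordon's actual .pyz format or importer, and every name below is made up -- such an archive can be built and read back with nothing beyond the standard library:

    import marshal, zlib

    def build_toy_archive(py_files):
        # py_files maps module name -> path of its .py source
        archive = {}
        for name, path in py_files.items():
            code = compile(open(path).read(), path, 'exec')
            archive[name] = zlib.compress(marshal.dumps(code))
        return archive

    def exec_from_toy_archive(archive, name):
        # an imputil-style importer hook would do roughly this step
        code = marshal.loads(zlib.decompress(archive[name]))
        namespace = {'__name__': name}
        exec code in namespace
        return namespace

A real importer would of course create a proper module object and handle packages; this only shows why zlib plus marshal is enough machinery.)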
On Windows, this means you can have a complete Python installation (independent of any other Python installation) in a single directory: myPython/ python.exe (and/or pythonw.exe) python15.dll (completely vanilla) py_lib.pyz exceptions.py (from the std distr) site.py (hacked to load all the .pyz's) [any other .pyd's or .dll's you want] [more .pyz's if you want] [more .py's if you want] (The Window's installation / standalone stuff goes further than this). These .pyz's do work on Linux, but the steps Python goes through to determine where it lives is simpler on Windows. I'm not sure what it would take to get a single directory Python installation (independent of any other installed Pythons) working on *nix. - Gordon From ovidiu@cup.hp.com Mon Jul 12 18:40:26 1999 From: ovidiu@cup.hp.com (Ovidiu Predescu) Date: Mon, 12 Jul 1999 10:40:26 -0700 Subject: [Distutils] Beginnings of a C/C++ compiler interface In-Reply-To: Your message of "Sat, 10 Jul 1999 11:09:53 +0200." <37870DE1.796987C5@lemburg.com> Message-ID: <199907121740.KAA28242@hpcll563.cup.hp.com> On Sat, 10 Jul 1999 11:09:53 +0200, "M.-A. Lemburg" wrote: > [ccompiler.py] > > Looks nice. Somethings that might be of use for the build process > writer: > > · a way to test compiler features a la configure, e.g. a simple > way to pass a small C program and evaluate the compiler error > message (error or no error should suffice) > > · a way to test the availability of (shared) libs in much > the same way, e.g. try to link an empty object file against a set > of given libs to see if the linker finds the libs or not > > · a way to access the compiler/linker name and version This sounds a lot like GNU autoconf so why we are not using it? I agree that some things are still missing from autoconf, things like how to build shared libraries and dynamically loadable object files, but these could be defined for each system in a configuration file, perhaps written in Python. Look for example how libtool works, it's very simple to write a new makefile. Or take a look at the GNUstep's makefile package, is a combination of autoconf and makefiles, although it requires GNUmake (see http://www.gnustep.org). The advantage of using autoconf is that we use something that already works very well with C code, so building C extensions is very simple. Plus developes are used to how autoconf/configure works. Not to mention we don't have to rewrite what configure does in Python, there are lots of ugly details we have to figure out in order for this to work really well. In my view, I see autoconf plus some configuration files that define the flags used for building shared libraries (Python files, makefiles, shell scripts, doesn't matter) as being the right tools for building C extensions. For installing Python files, a simple Python tool that checks for package dependencies and installs the files in the appropriate places should be enough. Am I missing something? Best regards, -- Ovidiu Predescu http://www.geocities.com/SiliconValley/Monitor/7464/ From Fred L. Drake, Jr." References: <37870DE1.796987C5@lemburg.com> <199907121740.KAA28242@hpcll563.cup.hp.com> Message-ID: <14218.10660.374152.34831@weyr.cnri.reston.va.us> Ovidiu Predescu writes: > This sounds a lot like GNU autoconf so why we are not using it? Yes, it does. > Am I missing something? Macintosh, Windows, and other operating systems. We cannot assume that GNUish tools are installed on those systems. For a number of people, there's a licensing issue with GNUish tools as well. -Fred -- Fred L. Drake, Jr. 
Corporation for National Research Initiatives From ovidiu@cup.hp.com Mon Jul 12 19:14:31 1999 From: ovidiu@cup.hp.com (Ovidiu Predescu) Date: Mon, 12 Jul 1999 11:14:31 -0700 Subject: [Distutils] Beginnings of a C/C++ compiler interface In-Reply-To: Your message of "Mon, 12 Jul 1999 13:45:08 EDT." <14218.10660.374152.34831@weyr.cnri.reston.va.us> Message-ID: <199907121814.LAA28524@hpcll563.cup.hp.com> On Mon, 12 Jul 1999 13:45:08 -0400 (EDT), "Fred L. Drake" wrote: > > Am I missing something? > > Macintosh, Windows, and other operating systems. We cannot assume > that GNUish tools are installed on those systems. The way I saw things happening with other free software packages is to provide already configured files for Windows and Mac OS. You simply don't run the configure tools at all but come up with some assumptions on those platforms. > For a number of people, there's a licensing issue with GNUish tools > as well. The output generated by autoconf, aka the configure script, is not covered by the GPL, so there's no licensing issue here. Only the autoconf package itself is covered by the GPL, but not its result. The autoconf manual also specifies this very clearly: What are the restrictions on distributing `configure' scripts that Autoconf generates? How does that affect my programs that use them? There are no restrictions on how the configuration scripts that Autoconf produces may be distributed or used. In Autoconf version 1, they were covered by the GNU General Public License. We still encourage software authors to distribute their work under terms like those of the GPL, but doing so is not required to use Autoconf. Of the other files that might be used with `configure', `config.h.in' is under whatever copyright you use for your `configure.in', since it is derived from that file and from the public domain file `acconfig.h'. `config.sub' and `config.guess' have an exception to the GPL when they are used with an Autoconf-generated `configure' script, which permits you to distribute them under the same terms as the rest of your package. `install-sh' is from the X Consortium and is not copyrighted. Using autoconf does not present any problem since we would distribute the generated configure script and not the autoconf package itself. And all the extension packages could use a configure script even if the licensing of the package is not under the GPL or LGPL. Greetings, -- Ovidiu Predescu http://www.geocities.com/SiliconValley/Monitor/7464/ From Fred L. Drake, Jr." References: <14218.10660.374152.34831@weyr.cnri.reston.va.us> <199907121814.LAA28524@hpcll563.cup.hp.com> Message-ID: <14218.13928.272010.165288@weyr.cnri.reston.va.us> Ovidiu Predescu writes: > The way I saw things happening with other free software packages is to provide > already configured files for Windows and Mac OS. You simple don't run the > configure tools at all but come up with some assumptions on those platforms. For a finished, end-user package, this is true. Based on the developer's day discussion at IPC7, distutils will also be used to help the developers of those packages. Many authors are not able to provide binary distributions for all platforms. The ideal would be for me to put together a source distribution as a distutils-based package; people with access to various platforms can pull that down, build any platform-dependent components, and then create an installable package for others to use. Hopefully this can be reduced to one or two commands.
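(To make "one or two commands" concrete: the hoped-for shape of such a package is a tiny setup script plus a couple of stock commands. The argument names and commands below are purely illustrative -- the Distutils interface was still being designed at this point:

    # setup.py -- illustrative sketch, not a finished Distutils interface
    from distutils.core import setup

    setup(name = "mypkg",
          version = "0.1",
          py_modules = ["mypkg"])

after which a developer or packager would run something like "python setup.py build" followed by "python setup.py install", or a bdist-style command to produce an installable package for others.)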
There are also people who will want to build from sources regardless of platform; having a Python-only system makes this a lot easier; only the compiler issues become platform-dependent, and that can be isolated within the distutils package. > The output generated by autoconf, aka the configure script, is not covered by > GPL, so there's no licensing issue here. Only the autoconf package itself is > covered by GPL, but not its result.The autoconf manual also specifies this very I agree. This issue is not entirely a matter of legal interpretation, unfortunately; some organizations will (reportedly; I don't know of any documented cases) fire people for installing un-approved software, and the GPL or GNU label can make corporate software managers very leery. Whether or not it should is *not* the issue. Having the system entirely in Python also helps when creating packages on non-Unix systems; autoconf may not do the right thing on a Macintosh! -Fred -- Fred L. Drake, Jr. Corporation for National Research Initiatives From arcege@shore.net Mon Jul 12 19:56:21 1999 From: arcege@shore.net (Michael P. Reilly) Date: Mon, 12 Jul 1999 14:56:21 -0400 (EDT) Subject: [Distutils] Beginnings of a C/C++ compiler interface In-Reply-To: <199907121740.KAA28242@hpcll563.cup.hp.com> from Ovidiu Predescu at "Jul 12, 99 10:40:26 am" Message-ID: <199907121856.OAA05492@northshore.shore.net> [Charset iso-8859-1 unsupported, filtering to ASCII...] > On Sat, 10 Jul 1999 11:09:53 +0200, "M.-A. Lemburg" wrote: > > > [ccompiler.py] > > > > Looks nice. Somethings that might be of use for the build process > > writer: > > > > _ a way to test compiler features a la configure, e.g. a simple > > way to pass a small C program and evaluate the compiler error > > message (error or no error should suffice) > > > > _ a way to test the availability of (shared) libs in much > > the same way, e.g. try to link an empty object file against a set > > of given libs to see if the linker finds the libs or not > > > > _ a way to access the compiler/linker name and version > > This sounds a lot like GNU autoconf so why we are not using it? > > I agree that some things are still missing from autoconf, things like how to > build shared libraries and dynamically loadable object files, but these could > be defined for each system in a configuration file, perhaps written in Python. > Look for example how libtool works, it's very simple to write a new makefile. > Or take a look at the GNUstep's makefile package, is a combination of autoconf > and makefiles, although it requires GNUmake (see http://www.gnustep.org). > > The advantage of using autoconf is that we use something that already works > very well with C code, so building C extensions is very simple. Plus developes > are used to how autoconf/configure works. Not to mention we don't have to > rewrite what configure does in Python, there are lots of ugly details we have > to figure out in order for this to work really well. > > In my view, I see autoconf plus some configuration files that define the flags > used for building shared libraries (Python files, makefiles, shell scripts, > doesn't matter) as being the right tools for building C extensions. For > installing Python files, a simple Python tool that checks for package > dependencies and installs the files in the appropriate places should be enough. > > Am I missing something? > Please read the charter and requirements doc for the Distutils-SIG. 
You will find that the tools being discussed and developed are for the developers, not necessarily for the installers/end-users. Autoconf is a tool for the developer, but does not test anything.. it generates a ./configure script, for UNIX-based systems only. Autoconf requires m4 to be compiled and understood. M4 is very nice, but it is a language that not everyone can understand easily. Asking everyone to learn autoconf and m4 is as bad as the Perl crew requiring that Perl/C extensions be written in XS. So the point of the Distutils SIG was to develop some standards and some (Python or C based) tools to aid the aspiring module developers out there in creating easily packagable distributions. Autoconf doesn't cut the mustard (tho I like it :), especially for non-UNIX developing. -Arcege -- ------------------------------------------------------------------------ | Michael P. Reilly, Release Engineer | Email: arcege@shore.net | | Salem, Mass. USA 01970 | | ------------------------------------------------------------------------ From ovidiu@cup.hp.com Mon Jul 12 21:34:48 1999 From: ovidiu@cup.hp.com (Ovidiu Predescu) Date: Mon, 12 Jul 1999 13:34:48 -0700 Subject: [Distutils] Beginnings of a C/C++ compiler interface In-Reply-To: Your message of "Mon, 12 Jul 1999 14:39:36 EDT." <14218.13928.272010.165288@weyr.cnri.reston.va.us> Message-ID: <199907122034.NAA29785@hpcll563.cup.hp.com> On Mon, 12 Jul 1999 14:39:36 -0400 (EDT), "Fred L. Drake" wrote: > > Ovidiu Predescu writes: > > The way I saw things happening with other free software packages is to provide > > already configured files for Windows and Mac OS. You simple don't run the > > configure tools at all but come up with some assumptions on those platforms. > > For a finished, end-user package, this is true. Based on the > developer's day discussion at IPC7, distutils will also be used to > help the developers of those packages. Many authors are not able to > provide binary distributions for all platforms. The ideal would be > for me to put together a source distribution as a distutils-based > package; people with access to various platforms can pull that down, > build any platform-dependent components, and then create an > installable package for others to use. Hopefully this can be reduced > to one or two commands. There are also people who will want to build > from sources regardless of platform; having a Python-only system makes > this a lot easier; only the compiler issues become platform-dependent, > and that can be isolated within the distutils package. OK, I think I can understand that. What I'm saying is that there are two different things: - a tool to help the compilation of C extensions - a tool to install a binary package on a Python distribution The distutils package could help both developers and packagers accomplish the second thing. The autoconf package however helps both crowds deal with platform dependent issues. There are too many details we need to take care of that has already been taken care of in autoconf. There is a lot of work we need to put in figuring out all the things that autoconf solves, header files, libraries, behaviors of various function libraries and system calls. All these may be important for a C extension that does heavy use them. In my mind a combination of these two tools is the best. Maybe a Python configure tool would have its merits, however I would take a pragmatic approach. 
I think we first need to focus on a packaging tool and then on a distribution site for Python packages (sort of CPAN) and then worry about the rest. > > The output generated by autoconf, aka the configure script, is not covered by > > GPL, so there's no licensing issue here. Only the autoconf package itself is > > covered by GPL, but not its result.The autoconf manual also specifies this very > > I agree. This issue is not entirely a matter of legal > interpretation, unfortunately; some organizations will (reportedly; I > don't know of any documented cases) fire people for installing > un-approved software, and the GPL or GNU label can make corporate > software managers very leery. Whether or not it should is *not* the > issue. I wouldn't worry about such people. I think these days there are less people that think like this. In the future, with big companies like HP, SGI, Sun and others actively supporting free software there will be even less people thinking like that. > Having the system entirely in Python also helps when creating > packages on non-Unix systems; autoconf may not do the right thing on a > Macintosh! As I pointed out above, we are talking about two different things. The packaging is one thing while identifying system specific issues is a different one. I don't argue about building a Python packaging tool, I think this is a very good thing, just about the merits of creating a new tool for determining system specific things. Greetings, -- Ovidiu Predescu http://www.geocities.com/SiliconValley/Monitor/7464/ From jim@interet.com Mon Jul 12 21:53:05 1999 From: jim@interet.com (James C. Ahlstrom) Date: Mon, 12 Jul 1999 16:53:05 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <1280502155-85895598@hypernet.com> Message-ID: <378A55B1.6B78F41A@interet.com> Gordon McMillan wrote: > On Windows, this means you can have a complete Python installation > (independent of any other Python installation) in a single directory: > > myPython/ > python.exe (and/or pythonw.exe) > python15.dll (completely vanilla) > py_lib.pyz > exceptions.py (from the std distr) > site.py (hacked to load all the .pyz's) > [any other .pyd's or .dll's you want] > [more .pyz's if you want] > [more .py's if you want] But the normal PYTHON/Registry stuff is used to find site.py, no? Is there any guarantee that the correct site.py and exceptions.py will be found? The myPython/ directory may not be on the path. Jim Ahlstrom From ovidiu@cup.hp.com Mon Jul 12 22:00:47 1999 From: ovidiu@cup.hp.com (Ovidiu Predescu) Date: Mon, 12 Jul 1999 14:00:47 -0700 Subject: [Distutils] Beginnings of a C/C++ compiler interface In-Reply-To: Your message of "Mon, 12 Jul 1999 14:56:21 EDT." <199907121856.OAA05492@northshore.shore.net> Message-ID: <199907122100.OAA29970@hpcll563.cup.hp.com> On Mon, 12 Jul 1999 14:56:21 -0400 (EDT), "Michael P. Reilly" wrote: > Please read the charter and requirements doc for the Distutils-SIG. > You will find that the tools being discussed and developed are for the > developers, not necessarily for the installers/end-users. > > Autoconf is a tool for the developer, but does not test anything.. it > generates a ./configure script, for UNIX-based systems only. > > Autoconf requires m4 to be compiled and understood. M4 is very nice, > but it is a language that not everyone can understand easily. Asking > everyone to learn autoconf and m4 is as bad as the Perl crew requiring > that Perl/C extensions be written in XS. 
> > So the point of the Distutils SIG was to develop some standards and some > (Python or C based) tools to aid the aspiring module developers out there > in creating easily packagable distributions. Autoconf doesn't cut the > mustard (tho I like it :), especially for non-UNIX developing. Yes, I read the requirements doc for this SIG when I joined it, that's exactly why I joined this SIG. I need a tool that helps me package the Python code that I'm writing. I used autoconf to write the configuration process for all the free or proprietary software that I wrote. I see no problem in learning how to write a configure.in script. m4 and autoconf are small little tools that are very easy to learn compared to the amount of software an experienced developer has to learn. As I already wrote in a previous post, I don't argue about the utility of a Python packager tool but about how useful would be to work on a configuration tool right now. Some of the extensions that I wrote use configure to check for header files and library versions. These things are very well done by autoconf, and my point is that we probably don't need another tool to do this. What we need though is standard way to integrate configure in the process of building a package. In a normal C package that uses configure, this is done by running configure and then make. The question is how configure could be integrated with the distutils setup.py script? How are the configure options passed to it and what is the order of execution, first configure and then setup.py? Just wondering if this is reasonable... Greetings, -- Ovidiu Predescu http://www.geocities.com/SiliconValley/Monitor/7464/ From gstein@lyra.org Mon Jul 12 22:24:03 1999 From: gstein@lyra.org (Greg Stein) Date: Mon, 12 Jul 1999 14:24:03 -0700 Subject: [Distutils] Re: Add multiple frozen modules References: <1280502155-85895598@hypernet.com> <378A55B1.6B78F41A@interet.com> Message-ID: <378A5CF3.7010FC6@lyra.org> James C. Ahlstrom wrote: > > Gordon McMillan wrote: > > On Windows, this means you can have a complete Python installation > > (independent of any other Python installation) in a single directory: > > > > myPython/ > > python.exe (and/or pythonw.exe) > > python15.dll (completely vanilla) > > py_lib.pyz > > exceptions.py (from the std distr) > > site.py (hacked to load all the .pyz's) > > [any other .pyd's or .dll's you want] > > [more .pyz's if you want] > > [more .py's if you want] > > But the normal PYTHON/Registry stuff is used to find > site.py, no? Is there any guarantee that the correct > site.py and exceptions.py will be found? The myPython/ > directory may not be on the path. In my "small" distribution, I was able to take advantage of the fact that python.exe looks in the current directory for site.py if there is nothing in the registry. The site.py then proceeds to bootstrap the import system. Gordon has an excellent writeup of the process on his website and/or in the README in his distribution (forget which). My small distribution, links to some discussion about it, and a reference to Gordon's distro is available at: http://www.lyra.org/greg/small/ Cheers, -g -- Greg Stein, http://www.lyra.org/ From gmcm@hypernet.com Mon Jul 12 23:29:09 1999 From: gmcm@hypernet.com (Gordon McMillan) Date: Mon, 12 Jul 1999 17:29:09 -0500 Subject: [Distutils] Re: Add multiple frozen modules In-Reply-To: <378A55B1.6B78F41A@interet.com> Message-ID: <1280307761-2059840@hypernet.com> James C. 
Ahlstrom asks: > Gordon McMillan wrote: > > On Windows, this means you can have a complete Python installation > > (independent of any other Python installation) in a single directory: > > > > myPython/ > > python.exe (and/or pythonw.exe) > > python15.dll (completely vanilla) > > py_lib.pyz > > exceptions.py (from the std distr) > > site.py (hacked to load all the .pyz's) > > [any other .pyd's or .dll's you want] > > [more .pyz's if you want] > > [more .py's if you want] > > But the normal PYTHON/Registry stuff is used to find > site.py, no? Is there any guarantee that the correct > site.py and exceptions.py will be found? The myPython/ > directory may not be on the path. Not if you deliberately SET PYTHONPATH=. which cuts the registry out entirely. For my installer / standalone, I deliberately muck with the environment before loading python and executing the "main" script. (I forgot imputil.py, archive.py and zlib.pyd in the above list). - Gordon From tismer@appliedbiometrics.com Tue Jul 13 10:33:39 1999 From: tismer@appliedbiometrics.com (Christian Tismer) Date: Tue, 13 Jul 1999 11:33:39 +0200 Subject: [Distutils] Re: Add multiple frozen modules References: <1280307761-2059840@hypernet.com> Message-ID: <378B07F3.D0B166D6@appliedbiometrics.com> Gordon McMillan wrote: > > James C. Ahlstrom asks: > > > Gordon McMillan wrote: > > > On Windows, this means you can have a complete Python installation > > > (independent of any other Python installation) in a single directory: ... > > But the normal PYTHON/Registry stuff is used to find > > site.py, no? Is there any guarantee that the correct > > site.py and exceptions.py will be found? The myPython/ > > directory may not be on the path. > > Not if you deliberately > SET PYTHONPATH=. > which cuts the registry out entirely. This is what I use for standalone apps. We just drop the whole tree into the application. site.py lives in the Python directory, as a special version. This suffices, since then no other path than "." is needed to boot. > For my installer / standalone, I deliberately muck with the > environment before loading python and executing the "main" script. And that's the only annoyance: You need to take care of environment space, which is unfortunately not big enough very often. I think a special python.exe for this purpose would be handy, which just ignores all registry, sets the path to the executable's directory, and done. ciao - chris (who really hates to muck with anyone's registry) -- Christian Tismer :^) Applied Biometrics GmbH : Have a break! Take a ride on Python's Kaiserin-Augusta-Allee 101 : *Starship* http://starship.python.net 10553 Berlin : PGP key -> http://wwwkeys.pgp.net PGP Fingerprint E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF we're tired of banana software - shipped green, ripens at home From jim@interet.com Tue Jul 13 15:50:57 1999 From: jim@interet.com (James C. Ahlstrom) Date: Tue, 13 Jul 1999 10:50:57 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <1280502155-85895598@hypernet.com> <378A55B1.6B78F41A@interet.com> <378A5CF3.7010FC6@lyra.org> Message-ID: <378B5251.D4F84197@interet.com> Greg Stein wrote: > > In my "small" distribution, I was able to take advantage of the fact > that python.exe looks in the current directory for site.py if there is > nothing in the registry. The site.py then proceeds to bootstrap the > import system. Gordon: > Not if you deliberately > SET PYTHONPATH=. > which cuts the registry out entirely. 
> For my installer / standalone, I deliberately muck with the > environment before loading python and executing the "main" script. Christian: > This is what I use for standalone apps. We just drop the whole > tree into the application. site.py lives in the Python directory, > as a special version. This suffices, since then no other path > than "." is needed to boot. > And that's the only annoyance: You need to take care of > environment space, which is unfortunately not big enough > very often. I think a special python.exe for this purpose > would be handy, which just ignores all registry, sets > the path to the executable's directory, and done. A Windows user normally sets the "Start in" directory in the shortcut to something other than the directory of the binary. In other words, the current directory "." is unreliable. Users can right-click their icon and change it. I don't like changing other people's registry either. In fact our bank customers hate it when we do. I don't trust the environment PYTHONPATH either. So we can't be sure the site.py bootstrap is bullet-proof. Please stand by while I try to clarify my thoughts... Jim Ahlstrom From tismer@appliedbiometrics.com Tue Jul 13 16:08:50 1999 From: tismer@appliedbiometrics.com (Christian Tismer) Date: Tue, 13 Jul 1999 17:08:50 +0200 Subject: [Distutils] Re: Add multiple frozen modules References: <1280502155-85895598@hypernet.com> <378A55B1.6B78F41A@interet.com> <378A5CF3.7010FC6@lyra.org> <378B5251.D4F84197@interet.com> Message-ID: <378B5682.E7A88E79@appliedbiometrics.com> "James C. Ahlstrom" wrote: ... > A Windows user normally sets the "Start in" directory in > the shortcut to something other than the directory of the > binary. In other words, the current directory "." is unreliable. > Users can right-click their icon and change it. > > I don't like changing other people's registry either. In > fact our bank customers hate it when we do. I don't trust > the environment PYTHONPATH either. So we can't be sure > the site.py bootstrap is bullet-proof. Please stand by > while I try to clarify my thoughts... I'd say we write a startup program which sits in the same directory as the python15.dll (which is the Python directory for my customers! I don't want to interfere with anything.) This extracts it's executable's program path, chdirs to there, sets the environment internally to "." (which holds now), and the site.py trick always should work. We (PNS) go even further (but at the moment without the extra .exe which we will do soon). Site.py does some fiddling with the Python path, tries to find ./lib and so on, and then looks for ../app.py which has to be there. From there, the bootstrap loads the application. By this, we can replace the whole Python installation by moving a single directory tree. It does not contain any application dependant stuff and just knows that the app.py sits on top. -chris -- Christian Tismer :^) Applied Biometrics GmbH : Have a break! Take a ride on Python's Kaiserin-Augusta-Allee 101 : *Starship* http://starship.python.net 10553 Berlin : PGP key -> http://wwwkeys.pgp.net PGP Fingerprint E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF we're tired of banana software - shipped green, ripens at home From jim@interet.com Tue Jul 13 16:13:43 1999 From: jim@interet.com (James C. 
Ahlstrom) Date: Tue, 13 Jul 1999 11:13:43 -0400 Subject: [Distutils] Re: Add multiple frozen modules Message-ID: <378B57A7.37C7D454@interet.com> I think I finally understand the work Greg, Gordon and the distutils sig folks have done with building the Python library into the Python binary. But I am still having problems. I still think we may need to be able to have multiple frozen module arrays. When Python starts up (Py_Initialize()) it: Performs an import of exceptions.py. Calls initsigs(). The implication is that someday something_else.py may be imported here. Calls initmain(). Performs an import of site.py. Greg's (wonderful) imputil.py must be "turned on" in site.py. So the conclusion is that the default Python import logic can not be replaced until after it has been used to find exceptions.py and site.py. This seems to be unfortunate. It would be nice to replace it for all modules. It would be nice if all this were identical on Unix and Windows, and that no C compiler were required. And that Python versions could be changed by replacing python15.dll/.so. And that having a frozen "__main__" still worked... Here are a few ideas: 1) Use freeze to include exceptions.py, site.py and imputil.py in the binary. Frozen modules are always (???) found before identical modules on PYTHONPATH/Registry. There is no chicken-and-egg problem because site.py has already been hacked to turn on imputil.py and to provide for custom imports. This works now. But there can be only one frozen module array, thus my suggestion to allow multiple arrays. OTOH, maybe this is OK since imports are now customized. But we are taking away the freeze feature including frozen "__main__". 2) Declare that imputil.py is part of the distribution, and hack Py_Initialize() to turn it on before anything is imported. I am not sure how to solve the chicken-and-egg problem. Perhaps sys.executable could be used as the initial search path in Py_Initialize(), and the regular path be used iff sys.executable fails. At least on Windows 95 and later, sys.executable is the highly reliable path to the Python interpreter binary. I am not sure how reliable it is on Unix. The idea is that site.py (etc.) is always in the same directory as the binary. For this, we need to guarantee that *.py in the directory of sys.executable is always found first. This is not currently the case. 3) Somehow have the user create special built-in modules which actually contain the *.pyc code, and hack Py_Initialize() to load and initialize them first. 4) Use pickle to create the magic file "python.py0" which contains site.py etc. in a standard format, I guess a dictionary. Hack Py_Initialize() to load its modules if it is located in the directory of sys.executable. So the start-up Python code lives in "python.py0" with its binary. Maybe we could automagically load python.py1, 2, ... too thus providing a hook for the user to add frozen code without a compiler. sys.executable must be bullet-proof. Does the executable mean python.exe or python15.dll?? Both?? Another problem is that tools/Freeze/*.py seems to require one of the frozen modules to be named "__main__" and thus be executed on start up. This means we can't put all this into python15.dll. It seems that (on Windows) exceptions.py and imputils.py should go into python15.dll, and not into python.exe. Site.py might go into either python15.dll or python.exe. FWIW, I am currently using three frozen module arrays. Site.py, exceptions.py and the Python library goes in python15.dll. 
My WPY modules go in python.exe, and the application main modules go in another DLL. This enables replacing python15.dll to update Python. But I am not that happy with this scheme. Jim Ahlstrom From gmcm@hypernet.com Tue Jul 13 21:12:54 1999 From: gmcm@hypernet.com (Gordon McMillan) Date: Tue, 13 Jul 1999 15:12:54 -0500 Subject: [Distutils] Re: Add multiple frozen modules In-Reply-To: <378B5251.D4F84197@interet.com> Message-ID: <1280229508-6766468@hypernet.com> Jim Ahlstrom wrote: > Gordon: > > Not if you deliberately > > SET PYTHONPATH=. > > which cuts the registry out entirely. > > For my installer / standalone, I deliberately muck with the > > environment before loading python and executing the "main" script. > A Windows user normally sets the "Start in" directory in > the shortcut to something other than the directory of the > binary. In other words, the current directory "." is unreliable. > Users can right-click their icon and change it. OK - os.path.dirname(sys.argv[0]). In my case it doesn't matter. I have just concatenated my archive (of python15.dll, any .pyzs, exceptions.py, site.py etc.) onto the end of the executable. When run, it opens itself as a file and unpacks the archive into the current directory. Then it does a _putenv("PYTHONPATH=.") and dynamically loads python15.dll. If you're using Run.exe, it even cleans up whatever it unpacked at end of run. More advanced would be to make the archive a legit resource section of the exe, but I can't see that as worth the effort (esp. since I don't expect my users to have compilers, and I've already gone to the work of finding the import sections in dll headers so I can find binary dependencies...) - Gordon From jim@interet.com Wed Jul 14 14:21:46 1999 From: jim@interet.com (James C. Ahlstrom) Date: Wed, 14 Jul 1999 09:21:46 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <1280502155-85895598@hypernet.com> <378A55B1.6B78F41A@interet.com> <378A5CF3.7010FC6@lyra.org> <378B5251.D4F84197@interet.com> <378B5682.E7A88E79@appliedbiometrics.com> Message-ID: <378C8EEA.47BF4D9C@interet.com> Christian Tismer wrote: > > I'd say we write a startup program which sits in the same > directory as the python15.dll (which is the Python > directory for my customers! I don't want to interfere with > anything.) This extracts it's executable's program path, > chdirs to there, sets the environment internally to "." > (which holds now), and the site.py trick always should work. We would need to remember the original getcwd() and chdir back so the user's "Start in" setting still works. Also, I think it would be better to use the directory of python15.dll, since all of Python is in it, and that is where imports are done. It is not really a feature of the app. Jim Ahlstrom From jim@interet.com Wed Jul 14 14:52:17 1999 From: jim@interet.com (James C. Ahlstrom) Date: Wed, 14 Jul 1999 09:52:17 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <1280229508-6766468@hypernet.com> Message-ID: <378C9611.F94AB321@interet.com> Gordon McMillan wrote: > OK - os.path.dirname(sys.argv[0]). Oh, I see. Your main program is given as a complete path in the icon target. So "python D:/my/path/myprog.py". I start my main as the frozen module named "__main__", so sys.argv[0] is useless. > In my case it doesn't matter. I have just concatenated my archive (of > python15.dll, any .pyzs, exceptions.py, site.py etc.) onto the end of > the executable. When run, it opens itself as a file and unpacks the > archive into the current directory. 
Then it does a > _putenv("PYTHONPATH=.") and dynamically loads python15.dll. If > you're using Run.exe, it even cleans up whatever it unpacked at end > of run. Cool. I didn't realize you could append to an .exe and still have it run. How do you know where your appended data starts? > More advanced would be to make the archive a legit resource section > of the exe, but I can't see that as worth the effort (esp. since I > don't expect my users to have compilers, and I've already gone to the > work of finding the import sections in dll headers so I can find > binary dependencies...) I think this may be a good idea. We could use a Windows user-defined resource named with a magic name such as "PythonPyc", and all resource modules could be loaded in Py_Initialize(). The .rc resource file names the .pyc file so there is no need to convert the bytes to C and compile. The .rc file syntax is the line: string PythonPyc C:/lib/string.pyc You use FindResource(), SizeofResource(), LoadResource() and LockResource() to access these resources. It is possible to replace resources in an exe or dll using BeginUpdateResource(), UpdateResource() and EndUpdateResource(). If we wrote a Python interface into a .pyd, then users don't even need a compiler or a resource compiler. Note that there is Mac code in the distribution which accesses Mac resources. The only reason I have not written all this is that it is not usable on Unix. Jim Ahlstrom From tismer@appliedbiometrics.com Wed Jul 14 15:04:35 1999 From: tismer@appliedbiometrics.com (Christian Tismer) Date: Wed, 14 Jul 1999 16:04:35 +0200 Subject: [Distutils] Re: Add multiple frozen modules References: <1280502155-85895598@hypernet.com> <378A55B1.6B78F41A@interet.com> <378A5CF3.7010FC6@lyra.org> <378B5251.D4F84197@interet.com> <378B5682.E7A88E79@appliedbiometrics.com> <378C8EEA.47BF4D9C@interet.com> Message-ID: <378C98F3.93D9636B@appliedbiometrics.com> "James C. Ahlstrom" wrote: > > Christian Tismer wrote: > > > > I'd say we write a startup program which sits in the same > > directory as the python15.dll (which is the Python > > directory for my customers! I don't want to interfere with > > anything.) This extracts it's executable's program path, > > chdirs to there, sets the environment internally to "." > > (which holds now), and the site.py trick always should work. > > We would need to remember the original getcwd() and chdir > back so the user's "Start in" setting still works. Well, that's easy. Whatever directory the user "was in", the path of the executable (i.e. what's in sys.executable) will always be enough to figure out where you are now. This might not be a full path but a relative one, and it might (will) be the "short" version under Win9X, but anyway it was enough for the exec launcher to launch the .exe. > Also, I think it would be better to use the directory of > python15.dll, since all of Python is in it, and that is > where imports are done. It is not really a feature of the > app. In my case, python15.dll and python.exe *are* in the same directory. This is not the app; app.py is a file on top of this python directory, which is like a plug-in (drop-in) and always the same. *In* the same python drop-in-dir I also have the simplified site.py which just takes care that the path settings are completed, that application scripts are found (by reaching out, off the directory, into app.py) and that imports are prepended to ./lib stuff. Works fine, regardless of whether there is another Python installation, even if another one is running.
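(A rough sketch of that kind of bootstrap site.py -- illustrative only, not Christian's actual code; only the directory layout is taken from his description:

    # site.py, drop-in version: derive every path from the location of
    # the interpreter binary, then reach out of the Python directory
    # for the application entry point app.py
    import sys, os

    home = os.path.dirname(os.path.abspath(sys.executable))
    sys.path[:0] = [home, os.path.join(home, 'lib')]

    app = os.path.join(os.path.dirname(home), 'app.py')
    if os.path.exists(app):
        execfile(app)

A real version would also have to cope with a relative or "short"-form sys.executable, as noted above.)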
ciao - chris -- Christian Tismer :^) Applied Biometrics GmbH : Have a break! Take a ride on Python's Kaiserin-Augusta-Allee 101 : *Starship* http://starship.python.net 10553 Berlin : PGP key -> http://wwwkeys.pgp.net PGP Fingerprint E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF we're tired of banana software - shipped green, ripens at home From tismer@appliedbiometrics.com Wed Jul 14 15:10:57 1999 From: tismer@appliedbiometrics.com (Christian Tismer) Date: Wed, 14 Jul 1999 16:10:57 +0200 Subject: [Distutils] Re: Add multiple frozen modules References: <1280229508-6766468@hypernet.com> <378C9611.F94AB321@interet.com> Message-ID: <378C9A71.AF4BBD53@appliedbiometrics.com> "James C. Ahlstrom" wrote: > > Gordon McMillan wrote: > > > OK - os.path.dirname(sys.argv[0]). > > Oh, I see. Your main program is given as a complete path > in the icon target. So "python D:/my/path/myprog.py". > I start my main as the frozen module named "__main__", so > sys.argv[0] is useless. Freezing is not so handy as Gordon's trick. Which came partially out of my trick, which came from Fredrik's squeeze trick, or so. [the trick] > Cool. I didn't realize you could append to an .exe and still > have it run. How do you know where your appended data starts? Is it still so? look at the end for the magic cookie, use its info to find the start of a cookie, and there's the data? > > More advanced would be to make the archive a legit resource section > > of the exe, but I can't see that as worth the effort (esp. since I > > don't expect my users to have compilers, and I've already gone to the > > work of finding the import sections in dll headers so I can find > > binary dependencies...) > > I think this may be a good idea. We could use a Windows user-defined > resource named with a magic name such as "PythonPyc", and all resource > modules could be loaded in Py_Initialize(). The .rc resource file > names the .pyc file so there is no need to convert the bytes to C > and compile. The .rc file syntax is the line: > string PythonPyc C:/lib/string.pyc > You use FindResource(), SizeofResource(), LoadResource() and > LockResource() to access these resources. Yes, this is still an open issue. Greg proposed a similar way. I never came to it, since the drawback of the simple approach is that it works, and it isn't so obvious how. By a resource, I open the app to Joe Hacker and invite him to play with resources. So why should I, if everything can be done in a copy /b style? :-) > It is possible to replace resources in an exe or dll using > BeginUpdateResource(), UpdateResource() and EndUpdateResource(). If > we wrote a Python interface into a .pyd, then users don't even need > a compiler nor resource compiler. Note that there is Mac code in the > distribution which accesses Mac resources. The only reason I have > not written all this is it is not usable on Unix. Yes, and to complete it: The append trick *does* work on Unix. And a resource like management of things could be written with Python, defining our own resources. Why care about an OS? Ahem :-) ciao - chris -- Christian Tismer :^) Applied Biometrics GmbH : Have a break! Take a ride on Python's Kaiserin-Augusta-Allee 101 : *Starship* http://starship.python.net 10553 Berlin : PGP key -> http://wwwkeys.pgp.net PGP Fingerprint E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF we're tired of banana software - shipped green, ripens at home From mal@lemburg.com Thu Jul 15 11:52:38 1999 From: mal@lemburg.com (M.-A. 
Lemburg) Date: Thu, 15 Jul 1999 12:52:38 +0200 Subject: [Distutils] Beginnings of a C/C++ compiler interface References: <199907122100.OAA29970@hpcll563.cup.hp.com> Message-ID: <378DBD76.7CD80B43@lemburg.com> Ovidiu Predescu wrote: > > As I already wrote in a previous post, I don't argue about the utility of a > Python packager tool but about how useful would be to work on a configuration > tool right now. > > Some of the extensions that I wrote use configure to check for header files and > library versions. These things are very well done by autoconf, and my point is > that we probably don't need another tool to do this. What we need though is > standard way to integrate configure in the process of building a package. In a > normal C package that uses configure, this is done by running configure and > then make. The question is how configure could be integrated with the distutils > setup.py script? How are the configure options passed to it and what is the > order of execution, first configure and then setup.py? Just wondering if this > is reasonable... Well, you are certainly right in saying that autoconf already handles what I proposed in an earlier mail. OTOH, the distutils package is intended to be cross-platform and the autoconf output does not run on non-Unix platforms. Given that the compiler interface provides an abstract interface to the compiler (+ the linker) and its installation, it only seems natural to use the interface to "try out some things" much like the generated configure scripts do. I don't want to reinvent a wheel here: most config setups are very primitive and don't need the full-blown autoconf mechanisms. Some more methods on the compiler interface would enable this for those who want to use the feature, something like .testcompile(program_string[,options]) .testcompileandlink(program_string[,options,libs]) which return 1/0 to state "success" / "failure" and implement proper cleanup of the intermediate files. Very simple, really no need to argue here, IMHO... Cheers, -- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 169 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From mal@lemburg.com Thu Jul 15 12:18:17 1999 From: mal@lemburg.com (M.-A. Lemburg) Date: Thu, 15 Jul 1999 13:18:17 +0200 Subject: [Distutils] Re: Add multiple frozen modules References: <378B57A7.37C7D454@interet.com> Message-ID: <378DC379.65AEB215@lemburg.com> Sorry to jump into this discussion this late... Jim kindly made me take a closer look at the thread. As far as I have skimmed the discussion, you are mostly talking about freeze'ing on Windows platforms. Well, then I'm the wrong guy to say anything, because my knowledge of that platform doesn't go too deep. All I can say is that Florent Heyworth has been able to create a cgipython binary (see the mxCGIPython project on my Python Pages) for NT which includes the Python lib as frozen modules in the EXE. Here are some instructions which I copied from a mail by Florent (hope this is ok, Florent): """ > BTW: What did you have to do to compile the thingie on WinNT ? > I read that freeze.py works on WinNT too, but have never tried it. 
Well , it works but I needed to make a few changes as the whole build process on windows is geared to producing dlls - what i did was this (I had to do this by hand - maybe I'll try to find the time to automate this): - - removed the dllmain routine in the python15 classes - - undefined the USE_DL_IMPORT macro in config.h - - ran freeze.py from the command line as defined im your makefile - - added the *.c files created into the python15 project - - replaced the standard frozen.c with the newly created frozen.c in CGIPython - the "main" method here calls Py_FrozenMain - - added frozen_dllmain.c from the "PC" directory because that's where PyWinFreeze_ExeInit and PyWinFreeze_ExeTerm are defined - - changed the static struct _frozen _PyImport_FrozenModules[] definition to extern struct _frozen _PyImport_FrozenModules[] - - changed the preprocessor definitions to: NDEBUG,WIN32,_CONSOLE,_MBCS - - changed the link options to /subsystem:console And that was it. """ More information on installing such a binary is on the mxCGIPython web-page. Hope this helps a bit... Florent is the one to ask for details ;-) -- Something else which might also be related: The problem with this binary is that it fails to load shared libs (DLLs on WinNT). This fails for both Unix and Windows platforms. I haven't dug any deeper into this, but its either a linker problem (Python's symbols are not found) or some problem with the interaction of the import mechanism and frozen modules (I had some reports about the loader not finding the initXXX() functions of the shared extension modules). Does anybody have a clue on this one ? Maybe some tips on how to set the linker options ? Cheers, -- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 169 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From jim@interet.com Thu Jul 15 14:50:48 1999 From: jim@interet.com (James C. Ahlstrom) Date: Thu, 15 Jul 1999 09:50:48 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <378B57A7.37C7D454@interet.com> <378DC379.65AEB215@lemburg.com> Message-ID: <378DE738.997BF41F@interet.com> "M.-A. Lemburg" wrote: > > As far as I have skimmed the discussion, you are mostly talking > about freeze'ing on Windows platforms. Actually, we are trying to get something which works identically on Windows and Unix if possible. > The problem with this binary is that it fails to load shared > libs (DLLs on WinNT). This fails for both Unix and Windows I would guess the problem on Windows is that the shared libs have been compiled to need Python, and so the only way they can get Python is in a DLL. If the main.exe and a shared lib both need Python (the usual case) then Python MUST be in a shared lib too. Making an exe will not work. On Windows, the solution is to put the frozen Python libs into python15.dll. Then everything will work. Jim Ahlstrom From mal@lemburg.com Thu Jul 15 17:12:39 1999 From: mal@lemburg.com (M.-A. Lemburg) Date: Thu, 15 Jul 1999 18:12:39 +0200 Subject: [Distutils] Re: Add multiple frozen modules References: <378B57A7.37C7D454@interet.com> <378DC379.65AEB215@lemburg.com> <378DE738.997BF41F@interet.com> Message-ID: <378E0877.7CC5BF41@lemburg.com> James C. Ahlstrom wrote: > > "M.-A. Lemburg" wrote: > > > > As far as I have skimmed the discussion, you are mostly talking > > about freeze'ing on Windows platforms. > > Actually, we are trying to get something which works identically > on Windows and Unix if possible. Ehh... 
I thought this was only a problem on Windows where the default is to build a Python DLL instead of an EXE. On Unix, freezing multiple modules into one binary is not much of a problem (except the shared lib thing below): this is how all the cgipython binaries for the mxCGIPython project where created. > > The problem with this binary is that it fails to load shared > > libs (DLLs on WinNT). This fails for both Unix and Windows > > I would guess the problem on Windows is that the shared libs > have been compiled to need Python, and so the only way they > can get Python is in a DLL. If the main.exe and a shared lib > both need Python (the usual case) then Python MUST be in a > shared lib too. Making an exe will not work. On Windows, the > solution is to put the frozen Python libs into python15.dll. > Then everything will work. Hmm, but that wouldn't be all that elegant :-) since the code is included twice. Even worse, I think this will mess up the memory allocation, if I remember old mails on c.l.p regarding malloc and DLLs on Windows correctly. Cheers, -- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 169 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From jim@interet.com Thu Jul 15 19:41:50 1999 From: jim@interet.com (James C. Ahlstrom) Date: Thu, 15 Jul 1999 14:41:50 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <378B57A7.37C7D454@interet.com> <378DC379.65AEB215@lemburg.com> <378DE738.997BF41F@interet.com> <378E0877.7CC5BF41@lemburg.com> Message-ID: <378E2B6E.A4B18A01@interet.com> "M.-A. Lemburg" wrote: > > Ehh... I thought this was only a problem on Windows where the > default is to build a Python DLL instead of an EXE. On Unix, > freezing multiple modules into one binary is not much of a problem > (except the shared lib thing below): this is how all the cgipython > binaries for the mxCGIPython project where created. I think you are right that this topic is mostly of interest to Windows people. On Unix you usually use freeze to include everything in the binary as you say. On Windows the interpreter is in python15.dll, a shared library. The advantage is that you can add other DLL's without recompiling anything. So "built-in" modules become dynamic. There is no reason Unix could not do this too if shared modules are available. For those of us distributing applications, the executable programs could be 1) main1.exe, main2.exe, etc. with a frozen "__main__" module included; 2) or a single executable "python.exe" but with a specified program on the command line "python.exe C:/path/myprog.py" as specified in a Windows icon (shortcut). This exposes the myprog source. What I am trying to do is put the Python library into python15.dll but still allow other *.pyc files to be in main1.exe. Thus the subject line, add multiple frozen modules. Putting the library into python15.dll guarantees version match, and means you can upgrade Python versions just by replacing python15.dll. You upgrade your application by replacing main1.exe. Simple. > > > The problem with this binary is that it fails to load shared > > > libs (DLLs on WinNT). This fails for both Unix and Windows > > > > I would guess the problem on Windows is that the shared libs > > have been compiled to need Python, and so the only way they > > can get Python is in a DLL. If the main.exe and a shared lib > > both need Python (the usual case) then Python MUST be in a > > shared lib too. Making an exe will not work. 
On Windows, the > > solution is to put the frozen Python libs into python15.dll. > > Then everything will work. > > Hmm, but that wouldn't be all that elegant :-) since the code > is included twice. Even worse, I think this will mess up > the memory allocation, if I remember old mails on c.l.p regarding > malloc and DLLs on Windows correctly. I do not understand. Nothing is included twice. The only Python interpreter is in python15.dll, and all clients such as main1.exe, cool_stuff.pyd, etc. use it. Jim Ahlstrom From jim@interet.com Thu Jul 15 21:17:48 1999 From: jim@interet.com (James C. Ahlstrom) Date: Thu, 15 Jul 1999 16:17:48 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <1280229508-6766468@hypernet.com> <378C9611.F94AB321@interet.com> <378C9A71.AF4BBD53@appliedbiometrics.com> Message-ID: <378E41EC.FB27A8FC@interet.com> Christian Tismer wrote: > Yes, and to complete it: The append trick *does* work on Unix. > And a resource like management of things could be written > with Python, defining our own resources. Why care about an OS? [Jim ponders the append trick...] OK, a really simple idea is to do this: cat python.exe string.pyc site.pyc > myprog.exe That is, just append a bunch of *.pyc to the executable. Anybody can do that. Then make the import mechanism be able to find them. So, look for the *.pyc magic numbers. To recover the module names, use the base file name of the code object attribute co.co_filename. Same with Python15.dll, append *.pyc to it too. Jim Ahlstrom From mal@lemburg.com Thu Jul 15 21:14:08 1999 From: mal@lemburg.com (M.-A. Lemburg) Date: Thu, 15 Jul 1999 22:14:08 +0200 Subject: [Distutils] Re: Add multiple frozen modules References: <378B57A7.37C7D454@interet.com> <378DC379.65AEB215@lemburg.com> <378DE738.997BF41F@interet.com> <378E0877.7CC5BF41@lemburg.com> <378E2B6E.A4B18A01@interet.com> Message-ID: <378E4110.256D8544@lemburg.com> James C. Ahlstrom wrote: > > "M.-A. Lemburg" wrote: > > > > Ehh... I thought this was only a problem on Windows where the > > default is to build a Python DLL instead of an EXE. On Unix, > > freezing multiple modules into one binary is not much of a problem > > (except the shared lib thing below): this is how all the cgipython > > binaries for the mxCGIPython project where created. > > I think you are right that this topic is mostly of interest to > Windows people. On Unix you usually use freeze to include everything > in the binary as you say. > > On Windows the interpreter is in python15.dll, a shared library. > The advantage is that you can add other DLL's without > recompiling anything. So "built-in" modules become dynamic. There > is no reason Unix could not do this too if shared modules are available. > > For those of us distributing applications, the executable programs could > be > 1) main1.exe, main2.exe, etc. with a frozen "__main__" module > included; > 2) or a single executable "python.exe" but with a specified program > on the command line "python.exe C:/path/myprog.py" as specified > in a Windows icon (shortcut). This exposes the myprog source. > > What I am trying to do is put the Python library into python15.dll > but still allow other *.pyc files to be in main1.exe. Thus the > subject line, add multiple frozen modules. Putting the library into > python15.dll guarantees version match, and means you can upgrade Python > versions just by replacing python15.dll. You upgrade your application > by replacing main1.exe. Simple. 
Just a note here: the new DLL should *not* be named python15.dll, since this is likely to cause incompatible setups -- why not call it pylib15.dll or something similar. This DLL would then only include the frozen modules and live happily alongside of the standard python15.dll. Only the EXE would know about the frozen modules in pylib15.dll (with the extra magic to include them as frozen modules in sys). The same could be done to your Python files. As a result you'd have these files: main.exe -- special main for frozen app which links to the DLLs myapp.dll -- frozen application modules python15.dll -- the standard python15.dll pylib15.dll -- the standard lib in form of frozen modules This setup should work on all platforms providing shared libs since the hard part of linking in the different DLLs is handled by the linker (*). main.exe would only need to know the names of all the frozen modules. (*) Using the lib in this form is also likely to reduce startup times once the DLL is loaded. > > > > The problem with this binary is that it fails to load shared > > > > libs (DLLs on WinNT). This fails for both Unix and Windows > > > > > > I would guess the problem on Windows is that the shared libs > > > have been compiled to need Python, and so the only way they > > > can get Python is in a DLL. If the main.exe and a shared lib > > > both need Python (the usual case) then Python MUST be in a > > > shared lib too. Making an exe will not work. On Windows, the > > > solution is to put the frozen Python libs into python15.dll. > > > Then everything will work. > > > > Hmm, but that wouldn't be all that elegant :-) since the code > > is included twice. Even worse, I think this will mess up > > the memory allocation, if I remember old mails on c.l.p regarding > > malloc and DLLs on Windows correctly. > > I do not understand. Nothing is included twice. The only Python > interpreter is in python15.dll, and all clients such as main1.exe, > cool_stuff.pyd, etc. use it. Sorry, I misread you reply and was still thinking of how the cgipython EXE works (it includes the lib as frozen modules). Adding the python15.dll to cgipython would probably lead to the malloc problems. If the lib were included in python15.dll that would be different, I suppose. -- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 169 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From gstein@lyra.org Thu Jul 15 22:37:58 1999 From: gstein@lyra.org (Greg Stein) Date: Thu, 15 Jul 1999 14:37:58 -0700 Subject: [Distutils] Re: Add multiple frozen modules References: <1280229508-6766468@hypernet.com> <378C9611.F94AB321@interet.com> <378C9A71.AF4BBD53@appliedbiometrics.com> <378E41EC.FB27A8FC@interet.com> Message-ID: <378E54B6.BDE4450@lyra.org> James C. Ahlstrom wrote: > Christian Tismer wrote: > > Yes, and to complete it: The append trick *does* work on Unix. > > And a resource like management of things could be written > > with Python, defining our own resources. Why care about an OS? > > [Jim ponders the append trick...] > > OK, a really simple idea is to do this: > cat python.exe string.pyc site.pyc > myprog.exe > That is, just append a bunch of *.pyc to the executable. > Anybody can do that. Then make the import mechanism be > able to find them. So, look for the *.pyc magic numbers. > To recover the module names, use the base file name of the > code object attribute co.co_filename. > > Same with Python15.dll, append *.pyc to it too. 
You can make it a lot easier on yourself if you remember the byte offset of each of those .pyc files. Then, you append a marshalled dictionary mapping the names to the offsets (and remember the byte offset of this marshalled dict). Finally, you append the byte offset of the dict. Recovery is then very easy: seek to (-4) from the end and read the offset. Seek there and unmarshall the dict. Seek to whichever module you want and unmarshal that. I used this technique in constructing my Python library of modules in the small distribution. The offset was at the beginning cuz it was a self-defined format so I just seeked back to the start to write out the file's magic value + offset. But the technique works for appending, too. Personally, I think it is a hack to simply append stuff to an executable (Unix or Windows). I would much rather see a proper COFF or ELF section inserted in there which contains the modules. Cheers, -g -- Greg Stein, http://www.lyra.org/ From s341625@student.uq.edu.au Fri Jul 16 04:03:04 1999 From: s341625@student.uq.edu.au (Anthony Pfrunder) Date: Fri, 16 Jul 1999 13:03:04 +1000 (GMT+1000) Subject: [Distutils] Zmake build system Message-ID: Hi, As part of the Zope project I've constructed a python build system which uses python itself and works cross-platform. Currently, only Visual C is implemented but in the next few weeks plug-ins for cygwin and mingw32 should come online. I would love to merge this in with dist-util so I can go back to playing with ZClasses ;) DESCRIPTION The system consists of: Make.py Python script which reads makefiles and 'runs' can import Setup files (for building extensions) and generic makefiles. Will soon be able to read ./configure output to get the variables Makefile.zmake The platform generic makefile, recursively loads Setup files and then builds a python console, lib and pyd's (choose your targets) msvc\msvc.zmake Platform stuff for visual c. Makefile hooks into generic one by adding stuff to dep lines and defing new rules. Also includes custom rc, ico files (ie all the stuff that is in the PC build dir) A simple rule (building the parser module): parser: os.path.join(PythonHome, 'Modules', 'parsermodule.obj') parser.pyd Note: you can recursively import Setup files and normal unix makefiles so you don't need to change anything (eventually...) This will compile parsermodule.c (dumping temp stuff including def files into directory pointed to by TMP) and link it to generate a pyd. The pyd will be placed in OUTDIRE for you. See the python-lib, python-lib-sources rules for a more complex example. (note, the PC specific stuff is added when msvc.zmake is loaded) Beware that it is pre-alpha and is release only to provide a tool to build Zope Extensions in VC. You can grab a snapshot at: http://student.uq.edu.au/~s341625/ZwinG2.zip Currently, it can build python-lin, console and extensions under Visual C Cheers, Anthony Pfrunder From jim@interet.com Fri Jul 16 14:49:41 1999 From: jim@interet.com (James C. Ahlstrom) Date: Fri, 16 Jul 1999 09:49:41 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <378B57A7.37C7D454@interet.com> <378DC379.65AEB215@lemburg.com> <378DE738.997BF41F@interet.com> <378E0877.7CC5BF41@lemburg.com> <378E2B6E.A4B18A01@interet.com> <378E4110.256D8544@lemburg.com> Message-ID: <378F3875.4737DD29@interet.com> "M.-A. Lemburg" wrote: > Just a note here: the new DLL should *not* be named python15.dll, > since this is likely to cause incompatible setups -- why not call > it pylib15.dll or something similar. 
This DLL would then only > include the frozen modules and live happily alongside of the > standard python15.dll. Only the EXE would know about the frozen > modules in pylib15.dll (with the extra magic to include them > as frozen modules in sys). I don't have a problem with including the frozen libs as a seperate DLL. But I don't think the main.exe needs to be involved at all on Windows. Remember on Windows all of Python including import.c is in python15.dll. We may be having Unix/Windows mind set problems. I am saying that the frozen libs (however they are packaged) are controlled by the module containing Python and import.c. On Windows that is python15.dll, on Unix it is often the main. By the way, if you can test a Windows CGI setup, I would be happy to send you a hacked python15.dll which might solve your module load problems. Jim Ahlstrom From jim@interet.com Fri Jul 16 14:55:51 1999 From: jim@interet.com (James C. Ahlstrom) Date: Fri, 16 Jul 1999 09:55:51 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <1280229508-6766468@hypernet.com> <378C9611.F94AB321@interet.com> <378C9A71.AF4BBD53@appliedbiometrics.com> <378E41EC.FB27A8FC@interet.com> <378E54B6.BDE4450@lyra.org> Message-ID: <378F39E7.CB440912@interet.com> Greg Stein wrote: > You can make it a lot easier on yourself if you remember the byte offset > of each of those .pyc files. Then, you append a marshalled dictionary Yes, I see your point. I am looking into this now. But it does require a special Python set up program. I was trying to make it easier on the user. If a Lib module turns up missing you just cat it onto python15.dll. > Personally, I think it is a hack to simply append stuff to an executable > (Unix or Windows). I would much rather see a proper COFF or ELF section > inserted in there which contains the modules. Yes, I agree. But presumably every different system has its own executable format. I do not know how standard they are even just on Unix. I am currently writing some experimental code and will report back. Jim Ahlstrom From mal@lemburg.com Fri Jul 16 15:31:29 1999 From: mal@lemburg.com (M.-A. Lemburg) Date: Fri, 16 Jul 1999 16:31:29 +0200 Subject: [Distutils] Re: Add multiple frozen modules References: <378B57A7.37C7D454@interet.com> <378DC379.65AEB215@lemburg.com> <378DE738.997BF41F@interet.com> <378E0877.7CC5BF41@lemburg.com> <378E2B6E.A4B18A01@interet.com> <378E4110.256D8544@lemburg.com> <378F3875.4737DD29@interet.com> Message-ID: <378F4241.3E5B9D13@lemburg.com> James C. Ahlstrom wrote: > > "M.-A. Lemburg" wrote: > > Just a note here: the new DLL should *not* be named python15.dll, > > since this is likely to cause incompatible setups -- why not call > > it pylib15.dll or something similar. This DLL would then only > > include the frozen modules and live happily alongside of the > > standard python15.dll. Only the EXE would know about the frozen > > modules in pylib15.dll (with the extra magic to include them > > as frozen modules in sys). > > I don't have a problem with including the frozen libs as a > seperate DLL. But I don't think the main.exe needs to be > involved at all on Windows. Remember on Windows all of Python > including import.c is in python15.dll. AFAIK (and that only refers to Unix), freeze.py builds a custom main program which includes the frozen mods as static data. I'm not sure whether it also includes a custom import.c -- I think the hooks are there for the modified main.c to use... have to look into this a little closer sometime. 
> We may be having Unix/Windows mind set problems. I am saying > that the frozen libs (however they are packaged) are controlled > by the module containing Python and import.c. On Windows that > is python15.dll, on Unix it is often the main. > > By the way, if you can test a Windows CGI setup, I would be happy > to send you a hacked python15.dll which might solve your module > load problems. Thanks for the offer, but I'm running Unix for CGI things. There are some people out on c.l.p which would like to use cgipython + DLLs though, so you might want to publish your modified DLL somewhere for them to download. -- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 168 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From jim@interet.com Mon Jul 19 16:22:41 1999 From: jim@interet.com (James C. Ahlstrom) Date: Mon, 19 Jul 1999 11:22:41 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <1280229508-6766468@hypernet.com> <378C9611.F94AB321@interet.com> <378C9A71.AF4BBD53@appliedbiometrics.com> <378E41EC.FB27A8FC@interet.com> Message-ID: <379342C1.541C0F98@interet.com> I got a little time to think about this over the weekend and propose this design. It is a way to package *.pyc files into a single-file portable archive for distribution with commercial, CGI (like Marc is doing) or otherwise simplified distributions. It is dumb-stupid-simple, my personal favorite and a requirement for commercial software. This is not concerned with an "installer", which is a separate problem. Few of these ideas are mine. 1. There is a Python library file format (PYL) which can hold *.pyc files. The format is: a.pyc, b.pyc, marshalled dict, offset of dict, magic number. The a.pyc, b.pyc,... are the bytes of the *.pyc files including the eight byte header. The dict has keys "name", values are the seek offset of the start of the *.pyc files. The "offset of dict" is the seek offset for finding the dict. The magic number identifies the file as a PYL file. The "names" can be normal names such as "os", or dotted names such as "foo.bar". Although initially devoted to *.pyc files, we note that it is possible to include other resources with non-module names such as "more*data". A PYL file is portable to any computer, Unix, Windows, etc. Compression is not included, and should be done at the "installer" level if desired. 2. The PYL file always has the same directory and file name as the binary module containing import.c. but with a .pyl ending: Python Binary PYL File Name /usr/local/bin/python /usr/local/bin/python.pyl /my/so/dir/python.so /my/so/dir/python.pyl C:/python/python15.dll C:/python/python15.pyl These Python binary names are available as sys.executable (the main) and sys.dllfullpath (the shared library or DLL). 3) Since the PYL file can be efficiently read backwards, it can, if desired, be appended the the Python binary itself: cat python15.pyl >> python15.dll 4) The PYL file is created with the Python program makepyl.py and no C compiler is necessary. 5) There is a new optional built-in module "importer" which may be included by editing Setup. It is imported in Py_Initialize() after "sys" is set up, but before any other imports such as "exceptions". It is not an error if it is absent. If present, it replaces __import__ just like imputils.py does. The replacement importer searches for PYL files as a .pyl file, in the current sys.executable, and in the current sys.dllfullpath (name of DLL). 
Note that importer can use multiple PYL files. Importer is able to import the modules exceptions, site and sitecustomize, and probably most other modules. Importer has methods to print out the names of PYL modules available. You could still override importer using sitecustomize and imputils if desired, in which case it may be convenient to use importer's methods. 6) Alternative to (5): Modules exceptions, site, sitecustomize and imputils are frozen (using frozen modules) into the interpreter binary, and sitecustomize boots imputils. Thereafter the Python logic in sitecustomize and imputils implements the logic of (5). Sitecustomize has methods to print out the names of PYL modules available. 7) The Python main program is enhanced to start "__main__" as the main program if it can be imported, in which case all command line arguments are sent to __main__. This enables you to create a Python program "myprog", and start it with the command "myprog -a arg1 ..." just like a normal program. 8) The Python main can start any module "foo" (which may be in a PYL file) as the main program in response to a new command line argument. This enables you to ship multiple main programs in the PYL file. 9) The current frozenmain.c is eliminated and the enhanced main is used instead. This (I hope) results in a net decrease in code. Where this is going: It enables simplified distribution of Python programs. Greg has demonstrated that import.c can be replaced with a Python module, and that in general at least some of Python should be written in itself. This heads that way. Please send in your "hell no" comments now. Jim Ahlstrom From mal@lemburg.com Mon Jul 19 16:58:05 1999 From: mal@lemburg.com (M.-A. Lemburg) Date: Mon, 19 Jul 1999 17:58:05 +0200 Subject: [Distutils] Re: Add multiple frozen modules References: <1280229508-6766468@hypernet.com> <378C9611.F94AB321@interet.com> <378C9A71.AF4BBD53@appliedbiometrics.com> <378E41EC.FB27A8FC@interet.com> <379342C1.541C0F98@interet.com> Message-ID: <37934B0D.48A40B4F@lemburg.com> [Jim talking about a Python Lib format] You may find the attached DictFile.py useful. It does pretty much what you describe (and even includes a PYCFile subclass). Be warned that it will not work out of the box though, since it references some other modules I normally use. You can safely comment them out though.
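A minimal sketch of the archive layout from points 1 and 3 of Jim's proposal above (concatenated .pyc files, a marshalled name-to-offset dict, and the dict's offset at the very end), using Gordon's ordering of magic before offset; the magic value and function names here are invented for illustration and are not anyone's actual code:

    # sketch only: write and read a PYL-style archive
    import marshal, struct

    MAGIC = 'PYL1'    # invented 4-byte marker, not a real format

    def write_pyl(out_name, pyc_paths):
        # pyc_paths maps module names ("os", "foo.bar") to .pyc file names
        out = open(out_name, 'wb')
        offsets = {}
        for name, path in pyc_paths.items():
            offsets[name] = out.tell()
            out.write(open(path, 'rb').read())   # includes the 8-byte header
        dict_offset = out.tell()
        marshal.dump(offsets, out)
        out.write(MAGIC)
        out.write(struct.pack('<l', dict_offset))
        out.close()

    def load_code(pyl_name, module_name):
        # return the code object for 'module_name'
        f = open(pyl_name, 'rb')
        f.seek(-8, 2)                        # 4-byte magic + 4-byte dict offset
        if f.read(4) != MAGIC:
            raise ValueError('not a PYL file')
        dict_offset = struct.unpack('<l', f.read(4))[0]
        f.seek(dict_offset)
        offsets = marshal.load(f)
        f.seek(offsets[module_name])
        f.read(8)                            # skip the .pyc magic + mtime
        return marshal.load(f)

Because everything is located relative to the end of the file, the same reading code would keep working when the archive is simply appended to an executable or DLL, which is point 3 of the proposal.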
-- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 165 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ [base64-encoded attachment "DictFile.py.gz" omitted] From gmcm@hypernet.com Mon Jul 19 18:39:42 1999 From: gmcm@hypernet.com (Gordon McMillan) Date: Mon, 19 Jul 1999 12:39:42 -0500 Subject: [Distutils] Re: Add multiple frozen modules Message-ID: <1279720329-5732174@hypernet.com> Jim Ahlstrom wrote: > I got a little time to think about this over the weekend > and propose this design. It is a way to package *.pyc files > into a single-file portable archive for distribution with > commercial, CGI (like Marc is doing) or otherwise simplified > distributions. It is dumb-stupid-simple, my personal favorite and a > requirement for commercial software. This is not concerned with an > "installer", which is a separate problem. Few of these ideas are > mine. Sigh. Most of this is in my installer. > 1. There is a Python library file format (PYL) which can > hold *.pyc files. The format is: > a.pyc, b.pyc, marshalled dict, offset of dict, magic number. Right (except I put magic before offset). I've got 2. One is just like this, except everything is zlib-ed. There's another where you've got aribtrary chunks of bytes chunk1, chunk2, ... table-of-contents, magic, offset where table-of-contents is fairly easily read and written in either Python or C, and contains additional information (like whether the chunk needs compressing / decompressing, and a filename...). > The a.pyc, b.pyc,... are the bytes of the *.pyc files including the > eight byte header. The dict has keys "name", values are the seek > offset of the start of the *.pyc files. The "offset of dict" is the > seek offset for finding the dict. The magic number identifies the > file as a PYL file. > > The "names" can be normal names such as "os", or dotted names > such as "foo.bar". Although initially devoted to *.pyc files, > we note that it is possible to include other resources with > non-module names such as "more*data". My first kind is confined to compiled python. The 2nd can have anything. > A PYL file is portable to any computer, Unix, Windows, etc. Yup. Both kinds. > Compression is not included, and should be done at the > "installer" level if desired. Actually, because the .pyz file is always open and you don't have to stat anything, I find that it's faster even with decompression than the normal import stuff. > 2. The PYL file always has the same directory and file name > as the binary module containing import.c.
but with a .pyl ending: > > Python Binary PYL File Name > /usr/local/bin/python /usr/local/bin/python.pyl > /my/so/dir/python.so /my/so/dir/python.pyl > C:/python/python15.dll C:/python/python15.pyl You put anything in one of my .pyz's, (packages or modules) and they just look like they're on your pythonpath. > These Python binary names are available as sys.executable (the main) > and sys.dllfullpath (the shared library or DLL). Not sure what you're getting at. You can find the name of the .pyz by asking the importer (an attribute stuck onto the imported module). > 3) Since the PYL file can be efficiently read backwards, it > can, if desired, be appended the the Python binary itself: > cat python15.pyl >> python15.dll Never tried it on the dll. I do it with the .exe. > 4) The PYL file is created with the Python program makepyl.py > and no C compiler is necessary. Right. It's called archivebuilder. Pass it module names, package names, directory names... > 5) There is a new optional built-in module "importer" which may be > included by editing Setup. It is imported in Py_Initialize() after > "sys" is set up, but before any other imports such as "exceptions". > It is not an error if it is absent. If present, it replaces > __import__ just like imputils.py does. The replacement importer > searches for PYL files as a .pyl file, in the current > sys.executable, and in the current sys.dllfullpath (name of DLL). > Note that importer can use multiple PYL files. Importer is able to > import the modules exceptions, site and sitecustomize, and probably > most other modules. Importer has methods to print out the names of > PYL modules available. You could still override importer using > sitecustomize and imputils if desired, in which case it may be > convenient to use importer's methods. > > 6) Alternative to (5): Modules exceptions, site, sitecustomize and > imputils are frozen (using frozen modules) into the interpreter > binary, and sitecustomize boots imputils. Thereafter the Python > logic in sitecustomize and imputils implements the logic of (5). > Sitecustomize has methods to print out the names of PYL modules > available. I don't muck with core python at all. That means you need exceptions and site, with site using imputil to load the .pyzs. I'm hoping imputil attains sanctified status, so this happens in only one step. > 7) The Python main program is enhanced to start "__main__" as > the main program if it can be imported, in which case all command > line arguments are sent to __main__. This enables you to create a > Python program "myprog", and start it with the command "myprog -a > arg1 ..." just like a normal program. Yup. For Windows, I include a bunch of different .exes (all from one source) which are just python.exe / pythonw.exe with some added smarts about archives (the 2nd kind). Also not linked against python's import lib, so they don't have to have python.dll in place before they start. > 8) The Python main can start any module "foo" (which may be in a > PYL file) as the main program in response to a new command line > argument. This enables you to ship multiple main programs in the PYL > file. Not really necessary to build that in. Your __main__ script can do it. > 9) The current frozenmain.c is eliminated and the enhanced main is > used instead. This (I hope) results in a net decrease in code. A full python/Lib .pyz occupies less than 500K. Incidentally, I packed up your demo05.py (from wpy). For some reason, on my NT system, I end up using the Tk version of wpy. 
At any rate, with all the Tcl/Tk, it still comes out to about 1.1Meg. Runs on a system without any Python / Tcl / Tk, but not perfectly - you seem to do some funny things in wpy. - Gordon From arcege@shore.net Mon Jul 19 19:38:09 1999 From: arcege@shore.net (Michael P. Reilly) Date: Mon, 19 Jul 1999 14:38:09 -0400 (EDT) Subject: [Distutils] Re: Add multiple frozen modules In-Reply-To: <1279720329-5732174@hypernet.com> from Gordon McMillan at "Jul 19, 99 12:39:42 pm" Message-ID: <199907191838.OAA08856@northshore.shore.net> > Jim Ahlstrom wrote: > > > I got a little time to think about this over the weekend > > and propose this design. It is a way to package *.pyc files > > into a single-file portable archive for distribution with > > commercial, CGI (like Marc is doing) or otherwise simplified > > distributions. It is dumb-stupid-simple, my personal favorite and a > > requirement for commercial software. This is not concerned with an > > "installer", which is a separate problem. Few of these ideas are > > mine. > > Sigh. Most of this is in my installer. > Gordon, I haven't looked at your own installer (I've been meaning to). But the one issue I have with it is about the compression. When compiling the Python distribution, zlib is not standard. This means that pyz files aren't altogether portable. BTW, what again is the URL of your installer? I have my own that I had written (spamcan) and wanted to compare it. -Arcege -- ------------------------------------------------------------------------ | Michael P. Reilly, Release Engineer | Email: arcege@shore.net | | Salem, Mass. USA 01970 | | ------------------------------------------------------------------------ From gmcm@hypernet.com Tue Jul 20 00:27:07 1999 From: gmcm@hypernet.com (Gordon McMillan) Date: Mon, 19 Jul 1999 18:27:07 -0500 Subject: [Distutils] Re: Add multiple frozen modules Message-ID: <1279699484-6985999@hypernet.com> Arcege wrote: > Gordon, > I haven't looked at your own installer (I've been meaning to). But > the one issue I have with it is about the compression. When > compiling the Python distribution, zlib is not standard. This > means that pyz files aren't altogether portable. > BTW, what again is the URL of your installer? I have my own that I > had written (spamcan) and wanted to compare it. zlib is standard on Windows and comes in the RH RPMs. That's close enough for me . It's on the contrib site under System, but that's in the Windows-only form (an exe). It will soon have a homepage on starship. I'll again offer the inner pieces as tar.gz, but when I did that before (on my now defunct corporate site) I didn't get any takers. [BTW, I'm not on this SIG, I just check it periodically]. - Gordon From gward@cnri.reston.va.us Tue Jul 20 02:01:19 1999 From: gward@cnri.reston.va.us (gward@cnri.reston.va.us) Date: Mon, 19 Jul 1999 21:01:19 -0400 Subject: [Distutils] autoconf in distutils? Message-ID: <19990719210118.A14986@cnri.reston.va.us> Hmmm, just as well I was away last week: I *think* I agree with almost all the points-of-view expressed concerning accessing autoconf-like features from distutils. I would have had a hard time participating in that discussion, agreeing with *everyone*. For my next trick, I'll try to figure out a position that's not inherently self-contradictory. ;-) But seriously; I think Marc-Andre's initial post had two points: * testing compiler/header/library features, existence, etc. 
(the bread and butter of autoconf, but only the beginning of what it is used for) * getting a more detailed platform description (compiler/linker name and version, and an OS/hardware description more detailed than os.name) The autoconf-ish stuff would be fun to do in Python, and a good test of how general my compiler framework is. It wouldn't be as hairy as autoconf itself, because all the horrible things about shell programming magically disappear. As Fred pointed out, it would also be more portable than the M4-generated shell script approach. But it would probably be trickier than I like to admit, and not necessary in most cases. Consider: many (most?) Python extensions are probably thin glue layers to larger C libraries. It is the large C libraries that have configure scripts to adapt themselves to a particular platform; given the existence of Python, a C compiler and library, and the big C library being wrapped, the Python glue extension should build trouble-free. I suspect a similar situation in Perl-land; I've just posted a question to perl-xs@perl.org to see if the experts over there agree with my assessment that autoconf-like features are occasionally handy, but not necessary for most extensions. (I further suspect that most Perl extension developers who need something autoconf-ish have rolled their own in Makefile.PL; if we do nothing autoconf-ish in the distutils, then I suspect that Python extension developers will do similarly in setup.py. It would be nice to have some basic "Does this header file exist?" "Does this library define that symbol?" type functionality, though.) Bottom line is, I suspect it's not essential for building basic Python extensions, and would be a layer on top of the platform-specific CCompiler subclasses... so it's not an immediate concern. Regarding Marc-Andre's other wish: yes, this is important! Sooner or later (probably sooner), somebody is going to want to say: if compiler == 'gcc' and platform == 'linux-x86': compiler_options.append ('-funroll-loops') ...and that's probably only the beginning. The CCompiler framework only has a barebones notion of what's needed to compile and link Python extensions (and will know about binary executables, in order to build a new static Python). I'm not even sure of how to handle optimization/debugging options in a portable way, much less something like -Wall or -funroll-loops (considering only one popular compiler). Ultimately some sort of "Oh, here, just add your own damn compiler options" cop-out will be needed, and the only way to make that workable is to allow the extension developer to make decisions based on the platform and compiler. Perhaps a distutils.platform module is called for, which could expose all these things in a standard way. How's this sound for a start: * OS class ('posix', 'windows') (aka os.name) * OS ('linux', 'solaris', 'irix', 'winnt') * OS version ('2.2.10', '2.7', '6.1', '4.0') * OS vendor ('redhat', 'suse', 'sun', 'sgi', 'microsoft') * architecture ('x86', 'sparc', 'mips', 'alpha', 'ppc') * processor ('586', '686', ... ???) * compiler ('gcc', 'mipspro', 'sunpro', 'msvc') * compiler version ('2.8.1', '7.5', '??', '5.0') (does Sun even have version numbers on their compiler?) OK, flame away. I'm sure someone will tell me that this is a good start, but not nearly detailed enough, and Greg Stein will tell me to stop being so obsessive about finicky little details. I'm inclined to the latter... anyone want to try their hand at hacking up such a module?
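A rough sketch of what such a distutils.platform module might look like; the field names follow the list above, but the detection logic is only a guess (os.uname() is Unix-only, and compiler identification would presumably come from whichever CCompiler subclass is in use):

    # sketch of a distutils.platform-style description (fields from the
    # list above; the detection logic is a guess, not a worked-out design)
    import os, sys

    def describe_platform():
        info = {'os_class':         os.name,        # 'posix', 'nt', 'mac', ...
                'os':               sys.platform,   # 'linux2', 'sunos5', 'win32', ...
                'os_version':       None,
                'os_vendor':        None,           # no portable way to detect this
                'architecture':     None,
                'processor':        None,
                'compiler':         None,           # would come from the CCompiler class
                'compiler_version': None}
        if hasattr(os, 'uname'):                    # Unix only
            sysname, node, release, version, machine = os.uname()
            info['os_version'] = release
            info['architecture'] = machine
        return info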
Greg -- Greg Ward - software developer gward@cnri.reston.va.us Corporation for National Research Initiatives 1895 Preston White Drive voice: +1-703-620-8990 Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913 From gward@cnri.reston.va.us Tue Jul 20 02:25:57 1999 From: gward@cnri.reston.va.us (gward@cnri.reston.va.us) Date: Mon, 19 Jul 1999 21:25:57 -0400 Subject: [Distutils] Re: Distutils-SIG digest, Vol 1 #79 - 1 msg In-Reply-To: <199907101342.JAA04194@eric.cnri.reston.va.us>; from Guido van Rossum on Sat, Jul 10, 1999 at 09:42:50AM -0400 References: <199907100505.BAA27449@python.org> <199907101342.JAA04194@eric.cnri.reston.va.us> Message-ID: <19990719212557.A14972@cnri.reston.va.us> On 10 July 1999, Guido van Rossum said: > 2) I'm not sure if you're trying to add a method for each of the > typical cc options (I recognized -I, -D, -U, etc.). Looks like you're > missing -R, which is like -L but works at runtime, and is needed if > you're using shared libraries that live in non-standard places. Oops, forgot about that one because I always forget about it myself when writing Makefiles, building binaries, etc. How do 'add_runtime_library_dir()' and 'set_runtime_library_dirs()' sound, apart from being rather windy? > 3) I stringly prefer shared lib over dynamic lib (on Windows everyone > calls them "DLL" anyway). Yeah, I prefer that too. Done, "shared lib" it is. > 4) Do you really not want to support cross-compiling? The Windows/ce > folks will regret that ;-) Nothing against cross-compiling in principle; I just don't need any more distractions. I'd love to see someone who knows about it hack it in, but right now it's kind of academic since there's no mechanism to select which concrete CCompiler subclass is used. > But I don't think you need to support -I > (without a directory) for that; typically the compiler has a different > name and all is well. I bet -I without args is intended for people > who want to experiment with a hacked set of system includes; they can > add it to the compiler command name if they want to. Actually, that won't work, since I intend the command name to be *just* the command name, and options to be supplied as a list of strings. (The best way to avoid shell quoting hell is to avoid the shell!) If someone wants "-I" on the command line, they'll have to use the as-yet-fictitious "Add these compiler options dammit!" interface. A more serious problem occurred to me sometime in the last week: there are not two, but *three* levels of include directories (and library directories, and libraries, and link objects) included in the compile/link steps: those mandated by the system (/usr/include, libc.so), those coming from Python's Makefile (/usr/local/include/python1.5), and those supplied by the various CCompiler set/add methods. Currently the first two are both carved in stone; the user of CCompiler can only add to or replace his own list of include/library directories, libraries, etc. I think this is another one of those things that won't matter for most Python extensions, but will be a pain for those people who really need it. Hmm... > 5) When creating a shared lib from libraries only, you need to specify > an entry point. Is this useful to support? Well, you learn something every day... I didn't even know this was possible! Can you point me at some Fine Manual that explains this? 
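For what it's worth, the two runtime-library-path methods floated above could simply mirror the existing add_*/set_* bookkeeping pattern of the CCompiler interface; the method names come from this mail, but the bodies below are only a guess at the obvious implementation:

    # sketch: runtime library search path bookkeeping for the CCompiler
    # interface (not the actual distutils code)
    class CCompilerSketch:
        def __init__(self):
            self.runtime_library_dirs = []

        def add_runtime_library_dir(self, dir):
            """Add 'dir' to the directories searched for shared libraries
            at *run* time (e.g. -R or -rpath for Unix linkers)."""
            self.runtime_library_dirs.append(dir)

        def set_runtime_library_dirs(self, dirs):
            """Replace the list of runtime library directories."""
            self.runtime_library_dirs = dirs[:]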
Greg -- Greg Ward - software developer gward@cnri.reston.va.us Corporation for National Research Initiatives 1895 Preston White Drive voice: +1-703-620-8990 Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913 From gstein@lyra.org Tue Jul 20 08:53:06 1999 From: gstein@lyra.org (Greg Stein) Date: Tue, 20 Jul 1999 00:53:06 -0700 (PDT) Subject: [Distutils] Re: Add multiple frozen modules In-Reply-To: <37934B0D.48A40B4F@lemburg.com> Message-ID: The "Python Lib format" that Jim discusses is exactly what is used in my "small" distribution. Gordon used a slight variant and even created a nice base class to create similar files. In other words, the code already exists in to match the exact requirements. I strongly agree with Jim's thoughts on moving all Python importing out to the Python level (minus a minimalist set of C functions to dynamically load module M from directory D). Freezing that into the interpreter is a great Step 2. IMO, I would also like to see a frozen Python parser, compiler, and interactive loop, but that is a separate discussion :-) Cheers, -g -- Greg Stein, http://www.lyra.org/ On Mon, 19 Jul 1999, M.-A. Lemburg wrote: > [Jim talking about a Python Lib format] > > You may find the attached DictFile.py useful. It does pretty > much what you (and even includes a PYCFile subclass). Be warned > that it will not work out of the box though, since it references > some other modules I normally use. You can safely comment them > out though. > > -- > Marc-Andre Lemburg > ______________________________________________________________________ > Y2000: 165 days left > Business: http://www.lemburg.com/ > Python Pages: http://www.lemburg.com/python/ From s341625@student.uq.edu.au Tue Jul 20 10:47:57 1999 From: s341625@student.uq.edu.au (Anthony Pfrunder) Date: Tue, 20 Jul 1999 19:47:57 +1000 (GMT+1000) Subject: [Distutils] RE: autoconf in distutils Message-ID: Hi, I've been following distutils with interest and would like to share a few thoughts: * Some people may wish to compile blind or get feedback as it goes ie a known working product ppl probably just want to know when the build is finished. For an in-development one they may want compile time feedback * During install stage, people may want feedback or a "quiet" install. * People may wish to import makefiles and output them in a platform specific format (ie Visual C for debugging). This could be done by merging David Aschers compy program into distutil. Distutil could have a load and save option to perform this My concern is, how do we abstract the "interaction" between the user and distutil? This is assuming that we have a class which is over-ridden platform by platform to provide an appropiate interface Cheers, Anthony Pfrunder Computer Systems Engineering University of Queensland From gward@cnri.reston.va.us Tue Jul 20 15:49:32 1999 From: gward@cnri.reston.va.us (gward@cnri.reston.va.us) Date: Tue, 20 Jul 1999 10:49:32 -0400 Subject: [Distutils] RE: autoconf in distutils In-Reply-To: ; from Anthony Pfrunder on Tue, Jul 20, 1999 at 07:47:57PM +1000 References: Message-ID: <19990720104932.A15372@cnri.reston.va.us> On 20 July 1999, Anthony Pfrunder said: > I've been following distutils with interest and would like to share a few > thoughts: > * Some people may wish to compile blind or get feedback as it goes > ie a known working product ppl probably just want to know when > the build is finished. 
For an in-development one they may want > compile-time feedback > > * During install stage, people may want feedback or a "quiet" > install. Currently handled by the '-v/--verbose' flag, which affects any Distutils command class that plays by "the rules". See distutils/command/{build,install}_py.py for examples, or download the Distutils and run python setup.py build versus python setup.py -v build to see the difference. In summary, -v is quite verbose: every individual filesystem-affecting operation (copy a file, compile a file, etc.) prints a line of text. Quiet mode is the default, and is very quiet; only error messages are printed. (Or at least that's the idea: if you see different behaviour, that's a bug so let me know. Or it might be an inadvertent feature, in which case I'll have to change the definition of "verbose". ;-) A middle ground would be nice, e.g. just print "copying Python source", "compiling extensions", "installing", etc. but don't show the gory details. This would be a sensible default. Anyone have a nice way to implement multi-level verbosity without making -v take a cryptic numeric "verbosity level"? (Actually, the code has support for verbosity levels, but I'm not sure how best to expose this to the user on the command line.) I've never been a big fan of either "--verbose=37" or "-vvvvv". > * People may wish to import makefiles and output them in a > platform-specific format (e.g. Visual C for debugging). This > could be done by merging David Ascher's compy program into > distutils. Distutils could have a load and save option to perform > this. Exporting makefiles is an option which I planned initially. Since I've gone and implemented the easy part of make's functionality in Python (i.e. compare datestamps and run commands to generate out-of-date files -- no makefile parsing, no DAG building, and no topological sorting), this may not be as necessary/useful as I initially thought. But it could always be revived. By importing, I certainly hope you don't propose writing a complete makefile parser! Were you thinking of parsing the particular flavour of Makefile generated from Makefile.pre.in? I'm not sure I see the utility in that... please explain! Greg -- Greg Ward - software developer gward@cnri.reston.va.us Corporation for National Research Initiatives 1895 Preston White Drive voice: +1-703-620-8990 Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913 From lannert@uni-duesseldorf.de Tue Jul 20 16:03:07 1999 From: lannert@uni-duesseldorf.de (lannert@uni-duesseldorf.de) Date: Tue, 20 Jul 1999 17:03:07 +0200 (MEST) Subject: [Distutils] autoconf in distutils? In-Reply-To: <199907200126.VAA20211@python.org> from "distutils-sig-admin@python.org" at "Jul 19, 99 09:26:26 pm" Message-ID: <19990720150307.20416.qmail@lannert.rz.uni-duesseldorf.de> [I'm only subscribed to the digest; please bear with me if I'm out of sync with the current status of the discussion] Greg Ward wrote: > Sooner or > later (probably sooner), somebody is going to want to say: > > if compiler == 'gcc' and platform == 'linux-x86': > compiler_options.append ('-funroll-loops') > > ...and that's probably only the beginning. ... and, later on, suggested variables like OS class, OS name, OS version, OS vendor, architecture, etc. etc. IIRC, Marc-Andre's suggestion pointed in a somewhat different direction: that the properties of a certain [instance of an] OS would be described by a set of variables indicating system "behaviour", rather than the formal vendor/version/architecture description.
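Purely for illustration, here is one way such a set of "behaviour" variables might be read by a Python tool. The file name, the variable names and the simple name=value format are all invented for the example (only the most trivial quoting is handled; no $var expansion or backslash escapes):

    import string

    def read_behaviour_file(filename):
        """Parse simple name=value or name="value with blanks" lines
        into a dictionary, skipping comments and blank lines."""
        vars = {}
        f = open(filename)
        for line in f.readlines():
            line = string.strip(line)
            if not line or line[0] == '#':
                continue
            eq = string.find(line, '=')
            if eq < 0:
                continue                    # not an assignment; ignore it
            name = string.strip(line[:eq])
            value = string.strip(line[eq+1:])
            if len(value) >= 2 and value[0] == value[-1] and value[0] in '"\'':
                value = value[1:-1]         # strip one level of simple quoting
            vars[name] = value
        f.close()
        return vars

    # A file containing, say,
    #   init_style=sysv
    #   initd_dir=/etc/rc.d/init.d
    #   ps_flavour=bsd
    # would come back as {'init_style': 'sysv',
    #   'initd_dir': '/etc/rc.d/init.d', 'ps_flavour': 'bsd'}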
I would second this idea (and, IIDNRC, first it); maybe there are some situations where I'd like to know if a certain system is a SuSE or RedHat Linux, but usually I'd rather be interested in whether it has SysV or Simple init (and where the init.d directory is), if ps and friends want BSD or SysV style operands, if we have cc or gcc or egcs, and the like. The "intelligence" about the peculiarities of a vendor-x/version-y installation should IMO not be built into a tool like distutils; it would fail anyway for mangled or personalized systems (e.g. a SuSE Linux updated with a few RH and some home-grown rpm's). All the vendor-, installation- and configuration-specific data should be specified as individual variable assignments, probably in several files which are processed in a hierarchical order (like /etc/system.inst: /var/lib/inst/*.inst:~/.inst/*.inst:.instrc, just to give a silly example). They could be provided by (contributors to) distutils as a first start, but ideally by the vendors or distribution builders themselves <0.99 dream>. I'd even go so far as to suggest shell variable assignment format for these files (name=value or name="value with blanks" etc.); it's far from ideal, but since it could be used from non-pythonic tools there would be an incentive for others to support and provide these files. Python programs can, of course, parse these files, although it's not trivial. I've written a ShellAssignment parser which understands "", '', $var, ${var}, \ escapes, at least in the most common cases; I wish there was a string.parsequotedstring() function ... Detlef From M.Faassen@vet.uu.nl Tue Jul 20 16:32:56 1999 From: M.Faassen@vet.uu.nl (Martijn Faassen) Date: Tue, 20 Jul 1999 17:32:56 +0200 Subject: [Distutils] Distributing 'external packages' Message-ID: <379496A8.99D1DEA6@pop.vet.uu.nl> Hi there, Recently in the discussion on autoconf I've seen it mentioned that lots of Python extensions build on external non-Python packages (that may be configured by autoconf or in many other ways). A developer using these external packages will obviously have them installed, and it's not too strange to expect co-developers to download these packages and install them as well. However, now we get to the two other audiences of distutils: the distributor/packager, who prepackages everything for one or more particular platforms, and most importantly the user who doesn't want to know about anything, just wants to run the program. If I'm a plain user and want to try out Python-Powered XML-database Warpdrive Enhancer version 4.2, I don't want to be required to download and install the Warpdrive package, and the XML package, etc., if I can avoid it. I just want to install and go. For many systems of course the separate install is unavoidable; we can't package Oracle for download, for instance. But for many smaller packages (such as libraries) that may or may not be on the user's system it becomes more important. So, as a reminder, how are we going to handle this? Some random requirements and ramblings: * The packager/distributor would like a standard way to somehow include these packages (or at least check for them). This doesn't mean *building* these packages, but it does mean a standard way to pack them up and install them. * We don't have any control over these external packages. Still, we don't want everybody to grow their own way to deal with particular packages. * If two Python packages are installed that both use FooLib, we don't want the distutils to install FooLib twice.
Same if FooLib is already there. That's also why we need a standard way to handle these. * An idea I aired previously was to provide some kind of distutils wrapper for common packages. For instance, if FooLib is often used by software written in Python, we make a wrapper for it. The wrapper calls FooLib's installation/configuration methods (which may for instance be rpm commands, or a Windows installer program) where necessary: * Install FooLib * If this is not possible without manual intervention, some way to tell the user what to do in easy steps. * Check if FooLib is already there (and what properties it has) * Also check if it is already registered with a Distutils Wrapper; if not, some way to try to add the wrapper so that the next pyapps that get installed won't need to go through this process again. * Possibly also uninstall FooLib, though this may be far too tricky. Eventually if this gets popular a Distutils Wrapper can even be distributed with FooLib itself, but the Distutils Archive should also offer some consistent way to get at known external package wrappers. All of this may be simply too hard, as packages can vary in many ways, but I think there are at least *some* common things one can standardize. The idea is to hide all the nonstandard ways behind some standard interface, as far as possible, to help packagers and developers. It occurs to me that all of this is somewhat analogous to autoconf again; autoconf checks out capabilities of the system, in particular of the C compiler, and libraries. But as far as I know autoconf isn't as modular as this package approach would be. Now you all start shooting this down. :) Regards, Martijn From mal@lemburg.com Tue Jul 20 17:42:34 1999 From: mal@lemburg.com (M.-A. Lemburg) Date: Tue, 20 Jul 1999 18:42:34 +0200 Subject: [Distutils] autoconf in distutils? References: <19990720150307.20416.qmail@lannert.rz.uni-duesseldorf.de> Message-ID: <3794A6FA.6F0C0703@lemburg.com> lannert@lannert.rz.uni-duesseldorf.de wrote: > > [I'm only subscribed to the digest; please bear with me if I'm out > of sync with the current status of the discussion] > > Greg Ward wrote: > > Sooner or > > later (probably sooner), somebody is going to want to say: > > > > if compiler == 'gcc' and platform == 'linux-x86': > > compiler_options.append ('-funroll-loops') > > > > ...and that's probably only the beginning. ... > > and, later on, suggested variables like OS class, OS name, OS version, > OS vendor, architecture, etc. etc. > > IIRC, Marc-Andre's suggestion pointed in a somewhat different > direction: that the properties of a certain [instance of an] OS > would be described by a set of variables indicating system > "behaviour", rather than the formal vendor/version/architecture > description. Well, to make things complete, here are the three suggestions again: · a way to test compiler features a la configure, e.g. a simple way to pass a small C program and evaluate the compiler error message (error or no error should suffice) · a way to test the availability of (shared) libs in much the same way, e.g. try to link an empty object file against a set of given libs to see if the linker finds the libs or not · a way to access the compiler/linker name and version More specifically, the methods: · testcompile(program_string[,options]) · testcompileandlink(program_string[,options,libs]) which return 1/0 to state "success" / "failure" and implement proper cleanup of the intermediate files.
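To give a feel for what such a method might look like, here is a minimal sketch of testcompile() for a Unix-ish system. It is not an actual implementation: the bare "cc" command, the temp-file handling and the output redirection are all assumptions made only for the example.

    import os, string, tempfile

    def testcompile(program_string, options=None):
        """Return 1 if 'program_string' compiles cleanly, 0 otherwise.
        Sketch only: assumes a Unix cc on the PATH and throws away
        all compiler output."""
        src = tempfile.mktemp() + ".c"
        obj = tempfile.mktemp() + ".o"
        f = open(src, "w")
        f.write(program_string)
        f.close()
        cmd = "cc -c %s -o %s" % (src, obj)
        if options:
            cmd = cmd + " " + string.join(options)
        status = os.system(cmd + " >/dev/null 2>&1")
        # clean up the intermediate files whether or not the compile worked
        for name in (src, obj):
            try:
                os.remove(name)
            except os.error:
                pass
        return status == 0

    # e.g. testcompile("#include <zlib.h>\nint main() { return 0; }\n")
    # could stand in for "are the zlib headers available?"

testcompileandlink() would follow the same pattern, with an extra link step and a list of libraries turned into -l options.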
And an interface much like the one Greg proposed in his last mail for the compiler/linker/platform names. Note that I only mention abstract interfaces. It's up to the distribution author to use these and do something useful with them. On the practical side: does anyone know of ways to figure out those names ? E.g. how would one test for RedHat vs. SuSE, libc5 vs. libc6, MSVC vs. Borland C++ ? -- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 164 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From lannert@uni-duesseldorf.de Tue Jul 20 20:06:53 1999 From: lannert@uni-duesseldorf.de (lannert@uni-duesseldorf.de) Date: Tue, 20 Jul 1999 21:06:53 +0200 (MEST) Subject: [Distutils] autoconf in distutils? In-Reply-To: <3794A6FA.6F0C0703@lemburg.com> from "M.-A. Lemburg" at "Jul 20, 99 06:42:34 pm" Message-ID: <19990720190653.20755.qmail@lannert.rz.uni-duesseldorf.de> "M.-A. Lemburg" wrote: > > On the practical side: does anyone know of ways to figure out > those names ? E.g. how would one test for RedHat vs. SuSE, Again, should one really? I mean, a package [author] is interested in the _features_ of the current system, more than in its brand. One example: init scripts are expected in /etc/rc.d/{rc?,init}.d/ by RedHat, /sbin/init.d/[rc?.d/] by SuSE. How would I inform the install routines if I used RedHat with /sbin/init.d? This is particularly an issue for Linux systems which actually are based on the _same_ OS; but I could also configure most other Unices beyond recognition. Therefore all relevant parameters (which are not easily testable, like presence/absence of header files in a known include path) should be assigned individually. A predefined collection for, say, SuSE or RedHat Linux would be nice, but should be overrideable by the admin and the user. Detlef From bwinton@tor.dhs.org Wed Jul 21 02:16:39 1999 From: bwinton@tor.dhs.org (Blake Winton) Date: Tue, 20 Jul 1999 21:16:39 -0400 (EDT) Subject: [Distutils] RE: autoconf in distutils In-Reply-To: <19990720104932.A15372@cnri.reston.va.us> from "gward@cnri.reston.va.us" at Jul 20, 1999 10:49:32 AM Message-ID: <199907210116.VAA30147@tor.dhs.org> > On 20 July 1999, Anthony Pfrunder said: > > * During install stage, people may want feedback or a "quiet" > > install. > A middle ground would be nice, eg. just print "copying Python source", > "compiling extensions", "installing", etc. but don't show the gory > details. This would be a sensible default. > > Anyone have a nice way to implement multi-level verbosity this without > making -v take a cryptic numeric "verbosity level"? (Actually, the code > has support for verbosity levels, but I'm not sure how best to expose > this to the user on the command line.) I've never been a big fan of > either "--verbose=37" or "-vvvvv". One idea I saw was a concept of classes of messages. (I forget whether it was in the Linux /etc/syslog.conf, or NT's Event Viewer, but now that I think about it a little more, most compilers have a similar feature, where you can turn warnings on or off. Well, it's similar if you think of the warnings as always happening, and what you're turning on and off is the reporting.) So you could have "-v all" for every message, or "-v synopsis" for a high-level overview. 
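Just to make the "classes of messages" idea concrete, here is one possible shape for it. The class names and the announce() helper are made up for the example; nothing like this exists in the Distutils yet.

    import string, sys

    # message classes the user can switch on; errors are always reported
    enabled = {'error': 1, 'synopsis': 1}

    def set_verbosity(spec):
        """Handle something like "-v all" or "-v synopsis,compile"."""
        if spec == 'all':
            for cls in ('synopsis', 'copy', 'compile', 'install'):
                enabled[cls] = 1
        else:
            for cls in string.split(spec, ','):
                enabled[string.strip(cls)] = 1

    def announce(msg_class, msg):
        """Print 'msg' only if its class has been switched on."""
        if enabled.get(msg_class):
            sys.stdout.write(msg + "\n")

    # announce('synopsis', "compiling extensions")     # high-level overview
    # announce('compile',  "cc -c foo.c -o foo.o")     # gory details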
It could also go two ways: either we could mandate specific classes, and then the package makers could fit their messages into our classes, or we could let everyone determine their own classes, which would result in a little more work for the user, although we could let them know which classes were available if we decided to set it up that way. I think either way could work, with a little forethought, and possibly some input from the people who would be using the distutils. Any thoughts? Later, Blake. From mal@lemburg.com Wed Jul 21 09:18:26 1999 From: mal@lemburg.com (M.-A. Lemburg) Date: Wed, 21 Jul 1999 10:18:26 +0200 Subject: [Distutils] autoconf in distutils? References: <19990720190653.20755.qmail@lannert.rz.uni-duesseldorf.de> Message-ID: <37958252.597595DA@lemburg.com> lannert@lannert.rz.uni-duesseldorf.de wrote: > > "M.-A. Lemburg" wrote: > > > > On the practical side: does anyone know of ways to figure out > > those names ? E.g. how would one test for RedHat vs. SuSE, > > Again, should one really? I mean, a package [author] is interested > in the _features_ of the current system, more than in its brand. > > One example: init scripts are expected in /etc/rc.d/{rc?,init}.d/ > by RedHat, /sbin/init.d/[rc?.d/] by SuSE. How would I inform the > install routines if I used RedHat with /sbin/init.d? This is > particularly an issue for Linux systems which actually are based > on the _same_ OS; but I could also configure most other Unices > beyond recognition. > > Therefore all relevant parameters (which are not easily testable, > like presence/absence of header files in a known include path) > should be assigned individually. A predefined collection for, say, > SuSE or RedHat Linux would be nice, but should be overrideable > by the admin and the user. Ehm, yes, that's what I was aiming at, basically. The OS vendor name should not be used to hard-code features, but rather to make proper preselection of parameters possible. Adjustments can then be done by the installing user (if she wishes). One thing that might also be of interest is an abstract mechanism for checking the system's installation, e.g. on RedHat and SuSE one could use RPM to get information about which tools are available. Don't know about other systems though (on Windows one could probably query the registry). That way one could create scripts of the form: if system.provides('kde') and system.getversion('kde') >= '1.2': # The system has unixODBC installed, so we default # to that configuration ... elif system.provides('iodbc'): # Ok, then use iODBC ... else: # No ODBC manager ? Ask the user... yn = raw_input('Is an ODBC manager installed on your system ?') ... if system.provides('adabas'): # Auto-configure the ADABAS subpackage if system.provides('informix'): # Same for Informix SE Some handy directory searching tools would also help, e.g. if you look for a lib "by hand". -- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 163 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From mmuller@enduden.com Wed Jul 21 14:20:59 1999 From: mmuller@enduden.com (Michael Muller) Date: Wed, 21 Jul 1999 09:20:59 -0400 (EDT) Subject: [Distutils] autoconf in distutils? In-Reply-To: <37958252.597595DA@lemburg.com> Message-ID: On Wed, 21 Jul 1999, M.-A. Lemburg wrote: [snip] > One thing that might also be of interest is an abstract mechanism > for checking the system's installation, e.g. 
on RedHat and SuSE > one could use RPM to get information about which tools are available. > Don't know about other systems though (on Windows one could probably > query the registry). That way one could create scripts of the > form: [snip] I'm kind of partial to using (and updating) the RPM databases on any system in which they are found to exist. If the Windows registry has the same information, it could be used instead. For systems in which neither exists, I recommend the use of a pure Python version of such a thing. I'm more concerned with this as a means of guaranteeing package dependencies than of checking for installation/compilation tools. ============================================================================= michaelMuller = proteus@cloud9.net | http://www.cloud9.net/~proteus ----------------------------------------------------------------------------- We are explorers in the further reaches of experience: demons to some, angels to others. - "Pinhead" from "Hellraiser" ============================================================================= From gmcm@hypernet.com Thu Jul 22 04:15:28 1999 From: gmcm@hypernet.com (Gordon McMillan) Date: Wed, 21 Jul 1999 22:15:28 -0500 Subject: [Distutils] Re: Add multiple frozen modules Message-ID: <1279512972-18204842@hypernet.com> My installer package now has a homepage on my starship site: http://starship.python.net/crew/gmcm/install.html This page gives a pretty complete writeup. The package itself has not yet been updated (either with Robin Dunn's enhancements, or a couple of bug fixes I have planned). Also from this page you can download a tar.gz of just the archiving stuff. For you Unix-weenies, I've even stripped the Windows line endings! - Gordon From jim@interet.com Thu Jul 22 21:43:19 1999 From: jim@interet.com (James C. Ahlstrom) Date: Thu, 22 Jul 1999 16:43:19 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <1280229508-6766468@hypernet.com> <378C9611.F94AB321@interet.com> <378C9A71.AF4BBD53@appliedbiometrics.com> <378E41EC.FB27A8FC@interet.com> <379342C1.541C0F98@interet.com> Message-ID: <37978267.E577C5E6@interet.com> After writing some code, I see that the library file needs to be a little different. We need to add the file size so all fseek()'s can be done relative to the end. That enables the file to be appended to the executable. The offsets written to the file are from ftell() and are relative to the beginning. The length is subtracted on access. And I introduced a main dictionary TOC and a different dictionary for the *.pyc files, pyc_dict. Although that may seem a little fancy, it is actually easier. And it is much more extensible. Data may be freely written to the main dict TOC without damaging the code for reading *.pyc using pyc_dict. And another section (I'm thinking of Gordon's installer) can be added as yet another dict. In another variation, core Python files are in pyc_dict, but encoded *.pyc files are in a different dict, and imputil is used to import either. The Python library file format now is: Data... consisting of *.pyc files including their 8-byte header; Possibly other data... TOC, a marshal'd dictionary table of contents; a twelve byte ascii decimal seek offset to the start of TOC, a NULL byte; a twelve byte ascii decimal total file size, a NULL byte; a 16-byte magic number to identify the file as a valid Python library file. The TOC (table of contents) is a dictionary containing the following keys: "TIME" The time the library was created, time.time().
"ASCTIME" The ascii time the library was created. "VERSION" The version of the library file format, "1". "PYC*VERSION" The ascii version number of the PYC section, "19990720". "PYC" A dictionary with keys "name" and values three-tuples "tup". The "name" is the module name. This is a plain name such as "os" or a dotted name such as "foo.bar". The "tup[0]" is the seek offset to the data, which is the bytes of the module .pyc file including the 8-byte header. The "tup[1]" is the full path of the .pyc file. The "tup[2]" is an integer used as flag bits. Flag bit 0x01 is one if the module name refers to a package, in which case the file is package.__init__.pyc. This format is designed to be extensible. Just add atoms of data to TOC. To add another whole section, add another name like "PYC". I have almost finished makepyl.py, a Python program which creates a PYL file or lists its contents. And C-code changes to import.c to be able to import from a PYL file. However, I am a little short of time because I am leaving for two weeks vacation this Saturday. Jim Ahlstrom From mhammond@bigpond.net.au Thu Jul 22 23:03:18 1999 From: mhammond@bigpond.net.au (Mark Hammond) Date: Fri, 23 Jul 1999 08:03:18 +1000 Subject: [Distutils] Re: Add multiple frozen modules In-Reply-To: <37978267.E577C5E6@interet.com> Message-ID: <00d101bed48e$0235fe10$0801a8c0@bobcat> > After writing some code, I see that the library file needs to > be a little different. We need to add the file size so all fseek()'s > can be done relative to the end. That enables the file to be appended > to the executable. The offsets written to the file are from ftell() > and are relative to the beginning. The length is subtracted on > access. I haven't been following this thread as closely as I would like, but I am a little confused. Are we building yet another packager here, or is this work building on the work Gordon currently "owns"? > I have almost finished makepyl.py, a Python program which creates a > PYL file or lists its contents. And C-code changes to import.c to > be able to import from a PYL file. However, I am a little short of If I recall correctly, when Greg posted his version of what grew into Gordon's, there was discussion over the direction import.c should take. Are your patches in line with that? I'm just thinking that poor Guido has already discussed this once, so he may be reluctant to accept new hacks when the correct direction has been mapped out. As far as I can see, Gordon's code works fine without C level changes, so I'm a little unsure what direction this is going, and exactly what this is solving that Gordon's doesn't. Mark. From jim@interet.com Fri Jul 23 15:56:09 1999 From: jim@interet.com (James C. Ahlstrom) Date: Fri, 23 Jul 1999 10:56:09 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <00d101bed48e$0235fe10$0801a8c0@bobcat> Message-ID: <37988289.3BFA6EEF@interet.com> Mark Hammond wrote: > I haven't been following this thread as closely as I would like, That's OK, I missed out on the prior distutils-sig discussions anyway. > but I am a > little confused. Are we building yet another packager here, or is this > work building on the work Gordon currently "owns"? It is not another packager. It provides a way to bootstrap Gordon's installer without requiring exceptions.py etc. to be found with the usual sys.path mechanism. All Python files can be in the library files. This has to be done with C code.
> If I recall correctly, when Greg posted his version of what grew into > Gordon's, there was discussion over the direction import.c should take. > Are your patches in line with that? I hope so. The "makepyl.py" program for making library files is almost identical to Greg's easygen.py, and I wrote his name at the top. But I did alter the file format so that it could be appended to python15.dll, and so source names could be listed. I am open to any file format that works. I was hoping that we could finalize the format in this sig. It doesn't have to be my format. I read all prior distutils-sig archives, but in case I missed something, could you provide a reference to the discussion you are referring to? > I'm just thinking that poor Guido has > already discussed this once, so he may be reluctant to accept new hacks when > the correct direction has been mapped out. As far as I can see, Gordon's > code works fine without C level changes, so I'm a little unsure what > direction this is going, and exactly what this is solving that Gordon's > doesn't. It solves the bootstrap problem. Any Python-only solution has to read several Python *.pyc files before it can get started. This requires using the regular sys.path mechanism which is what I am trying to replace. My C code can be used to boot up Gordon's installer. Also note Marc's mxCGIPython project. He has valiantly collected dozens of binary frozen library versions just so someone can install Python on their server without a hassle. A portable Python library file is a good idea. I got serious about this because of the recent thread started by Mikael Lyngvig in c.l.p. We really need a way to enable bullet-proof Python program distribution on machines where Python is not installed, on machines which have an invalid Python distribution, and on machines which have a valid Python but the wrong version. It is my perception that we are not there yet. Jim Ahlstrom From jim@interet.com Fri Jul 23 17:26:21 1999 From: jim@interet.com (James C. Ahlstrom) Date: Fri, 23 Jul 1999 12:26:21 -0400 Subject: [Distutils] Re: Add multiple frozen modules References: <1280229508-6766468@hypernet.com> <378C9611.F94AB321@interet.com> <378C9A71.AF4BBD53@appliedbiometrics.com> <378E41EC.FB27A8FC@interet.com> <379342C1.541C0F98@interet.com> <37978267.E577C5E6@interet.com> Message-ID: <379897AD.A2D82A85@interet.com> OK, I put my Python library import code on ftp://ftp.interet.com/pub/pylib.zip and ftp://ftp.interet.com/pub/pylib.tar.gz. The contents are identical. Uncompress in a temp directory. Try listing the contents of the library file makepyl.pyl using "python makepyl.py -t", or make a new one with "python makepyl.py" but be aware that this OVERWRITES makepyl.pyl. You will need to change the config file makepyl.cfg. Copy makepyl.pyl to python15.pyl (in the same directory) to make it active. Then import string, and enter "print string" to see that the module comes from python15.pyl. There are two issues: the library file format and the C-code changes to import.c. I published a proposed library file format here on distutils and it is described in the doc strings. But I didn't describe why I didn't use Greg's or Gordon's format. Both formats remove the 8-byte .pyc header. But import.c validates this header for .pyc files and it should be retained. There was no file source information, but a user will want to print out the file names in the library. Greg's format could not be appended to another file.
Gordon can build the library onto another file with a program, but cannot append the same file to different programs. The new format can append the same library file to any other executable or dll using "cat library.pyl >> AnyPythonBinary[|.exe|.so|.dll]". It can also support multiple appends if desired. The changes to import.c add another PY_LIBRARY import method (like PY_SOURCE, PY_COMPILED, etc.) and it is programmed in a parallel fashion. The code may be a bit rough but it is sufficient to get an idea. I am going away on vacation tomorrow, so I really hope all this works OK. I will catch up on all your comments when I get back. Jim Ahlstrom From gmcm@hypernet.com Fri Jul 23 22:45:34 1999 From: gmcm@hypernet.com (Gordon McMillan) Date: Fri, 23 Jul 1999 16:45:34 -0500 Subject: [Distutils] Re: Add multiple frozen modules In-Reply-To: <379897AD.A2D82A85@interet.com> Message-ID: <1279359964-27408194@hypernet.com> James C. Ahlstrom wrote: > I published a proposed library file format here on distutils > and it is described in the doc strings. But I didn't describe > why I didn't use Greg's or Gordon's format. Both formats > remove the 8-byte .pyc header. But import.c validates this > header for .pyc files and it should be retained. Why? So import.c can try to recompile from non-existent source? If you let Greg or me build the archive, you'll find that the compiles get done if they're needed. Then the package contains a Python and .pyc's that match that version. With or without a check, a mismatch will produce an ugly mess. > There was no > file source information, but a user will want to print out the > file names in the library. If they want to, they can get the name from the module's __importer__ attribute. They have everything they need to call archive.contents(), too (if you're letting them get to an interpreter prompt). > Greg's format could not be appended to > another file. Gordon can build the library onto another file with a > program, but cannot append the same file to different programs. Um, you mean because there's an "os.remove()" in Builder.py? Builder.py is mostly concerned with parsing the config file and determining dependencies. The actual archive building is done by the archive class, with some utility functions in archivebuilder.py. > The changes to import.c add another PY_LIBRARY import method > (like PY_SOURCE, PY_COMPILED, etc.) and it is programmed in a > parallel fashion. The code may be a bit rough but it is > sufficient to get an idea. I realize you don't like having to have four .py files in the current directory. I don't like messing with Python internals, especially when (among other things) you can use this to distribute a stripped-down but otherwise standard Python. In addition, both Greg & I are hoping (with some encouraging signs) that his imputil will attain sanctified status, which would probably mean that something like this could be done with just one 'extra' file. But in the meantime, are you sure you can't get the same effect by using normal embedding techniques in the .exe (i.e., instead of just calling Py_Main?). (Well, not quite normal techniques, since you have to use GetProcAddress). > I am going away on vacation tomorrow, so I really hope all > this works OK. I will catch up on all your comments when I > get back. Have a good time. - Gordon
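As a closing aside, here is a rough sketch of the kind of introspection Gordon is describing. The __importer__ attribute is the one he mentions; the where_from() helper, the hasattr() check and the contents() call as used here are assumptions about the interface, not a documented API.

    def where_from(module):
        """Report whether 'module' came from an archive and, if the
        importer exposes it, what else that archive contains."""
        imp = getattr(module, '__importer__', None)
        if imp is None:
            print module.__name__, "was imported the ordinary way (via sys.path)"
            return
        print module.__name__, "was imported by", imp
        if hasattr(imp, 'contents'):
            print "the archive also contains:", imp.contents()

    # e.g., at an interpreter prompt:
    #   import string
    #   where_from(string)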