[Python-Dev] advice needed: best approach to enabling "metamodules"?

Chris Angelico rosuav at gmail.com
Sat Nov 29 04:45:11 CET 2014


On Sat, Nov 29, 2014 at 12:59 PM, Nathaniel Smith <njs at pobox.com> wrote:
> Option 4: Add a new function sys.swap_module_internals, which takes
> two module objects and swaps their __dict__ and other attributes. By
> making the operation a swap instead of an assignment, we avoid the
> lifecycle pitfalls from Option 3. By making it a builtin, we can make
> sure it always handles all the module fields that matter, not just
> __dict__. Usage:
>
>    new_module = MyModuleSubclass(...)
>    sys.swap_module_internals(new_module, sys.modules[__name__])
>    sys.modules[__name__] = new_module
>
> Option 4 downside: Obviously a hack.

This one corresponds to what I've seen in quite a number of C APIs.
It's not ideal, but nothing is; and at least this way, it's clear that
you're fiddling with internals. Letting the interpreter do the
grunt work for you is *definitely* preferable to having recipes out
there saying "swap in a new __dict__, then don't forget to clear the
old module's __dict__", which would have massive versioning issues as
soon as a new best practice comes along; making it a function, like
this, means its implementation can change smoothly between versions
(even in a bug-fix release).
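To make the contrast concrete, here's a rough sketch of the hand-rolled recipe such a builtin would replace (the module name "demo" and subclass MyModule are invented for illustration; real code would run inside the module itself and use sys.modules[__name__]). Note the easy-to-forget clear step:

```python
import sys
import types

class MyModule(types.ModuleType):
    """Illustrative module subclass (invented for this sketch)."""
    def __getattr__(self, name):
        # One reason to want a subclass: custom attribute fallback.
        raise AttributeError("module %r has no attribute %r"
                             % (self.__name__, name))

# Stand-in for an already-imported module.
old = types.ModuleType("demo")
old.x = 1
sys.modules["demo"] = old

new = MyModule("demo")
new.__dict__.update(old.__dict__)  # copy the contents across
old.__dict__.clear()               # the fragile step a builtin would own
sys.modules["demo"] = new
```

If a future release changes what needs to happen here, every copy of this recipe in the wild is wrong; a builtin keeps that knowledge in one place.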

Would it be better to make that function also switch out the entry in
sys.modules? That way, it's 100% dedicated to this job of "I want to
make a subclass of module and use that for myself", and could then be
made atomic against other imports. I've no idea whether there are any
other weird shenanigans that could be deployed with this kind of
module switch, nor whether cutting them out would be a good or bad
thing!
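A pure-Python approximation of that combined operation might look like the sketch below (the function name swap_module and the module/class names are invented; since a module's __dict__ attribute is read-only, the swap has to mutate both dicts in place, and only a real builtin could also cover the C-level module fields and be made atomic against concurrent imports):

```python
import sys
import types

def swap_module(name, new_module):
    """Hypothetical helper: swap the internals of sys.modules[name]
    with new_module and install the replacement in one step.  A builtin
    could additionally swap C-level fields and hold the import lock."""
    old_module = sys.modules[name]
    old_contents = dict(old_module.__dict__)
    new_contents = dict(new_module.__dict__)
    # A module's __dict__ can't be reassigned, so swap contents in place.
    old_module.__dict__.clear()
    old_module.__dict__.update(new_contents)
    new_module.__dict__.clear()
    new_module.__dict__.update(old_contents)
    sys.modules[name] = new_module
    return old_module

# Usage sketch, again with invented names:
class MyModuleSubclass(types.ModuleType):
    pass

mod = types.ModuleType("demo2")
mod.answer = 42
sys.modules["demo2"] = mod

replacement = MyModuleSubclass("demo2")
old = swap_module("demo2", replacement)
```

Because it's a swap rather than an assignment, the old module object stays alive with a coherent (if mostly empty) namespace, so stale references to it don't explode the way they would after a bare __dict__.clear().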

ChrisA

