Why an object changes its "address" between adjacent calls?

Steven D'Aprano steve+comp.lang.python at pearwood.info
Sun Jun 17 07:06:19 EDT 2018


On Sun, 17 Jun 2018 17:39:42 +0800, sales at caprilion.com.tw wrote:

[...]
>> Every time you call nametofont(), you're creating a new instance of the
>> Font class.
> 
>      hmm... It means every time I set a widget's font to
> "TkDefaultFont", a new object was created.

Correct.


> Why python do things this way? Can't it use
> this same object again and again?

So if you create a font object, and then delete it, do you expect the 
interpreter to secretly hold onto that object forever just in case you 
need it again? If Python did that for everything, you would quickly run 
out of memory.

The Python interpreter may cache certain objects. CPython caches some 
integers and small strings that look like they could be identifiers. PyPy 
caches floats. Other implementations may or may not cache anything.

The interpreter may re-use immutable objects if doing so is a pure 
optimization with no effect on semantics, but it is unpredictable when it 
does this unless you study the interpreter source code.
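For example, here is a quick demonstration of CPython's small-integer cache. This is an implementation detail of CPython, not a language guarantee, and `int()` is used to defeat constant folding (which can otherwise make two literals in the same code object share one object):

```python
# CPython caches small integers in the range -5 to 256.
x = int('100')
y = int('100')
print(x is y)   # True in CPython: both names refer to the cached object

x = int('500')
y = int('500')
print(x is y)   # False in CPython: 500 is outside the small-int cache
```

Other interpreters are free to behave differently, which is exactly why you should never rely on object identity for values like these.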

But caching *everything* would be a terrible idea. Mutable objects like 
lists and dicts cannot safely be cached or re-used, and in general the 
interpreter cannot tell which objects are mutable and which are not.

So don't expect Python to magically optimize the number of objects you 
create. If you want to use the same object over and over again, use the 
same object, don't create a new one:

# creates multiple font objects
from tkinter import font
a = font.nametofont('TkDefaultFont')
b = font.nametofont('TkDefaultFont')

# creates only one font object
from tkinter import font
a = font.nametofont('TkDefaultFont')
b = a


You can write your own cache function like this:


from tkinter import font

_fontcache = {}

def cached_font(name):
    # Return the cached Font object for this name, creating it on first use.
    try:
        return _fontcache[name]
    except KeyError:
        obj = font.nametofont(name)
        _fontcache[name] = obj
        return obj


Or this might be better:

https://docs.python.org/3/library/functools.html#functools.lru_cache
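A minimal sketch of the lru_cache approach. Since creating a real Font object requires a running Tk instance, a plain placeholder stands in for font.nametofont here; the caching behaviour is the point:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def cached_font(name):
    # In real code this body would be: return font.nametofont(name)
    # A dict stands in so the sketch runs without a Tk display.
    return {'name': name}

a = cached_font('TkDefaultFont')
b = cached_font('TkDefaultFont')
print(a is b)   # True: the second call returns the same cached object
```

Note that lru_cache requires the arguments to be hashable, which font names (strings) are.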



But 95% of the time, the right answer is, don't worry about it.

Memory is cheap, and chances are that you won't even notice that you 
saved a few bytes by using a cache. Sometimes, the extra time and memory 
used by the cache code will be greater than the time and memory you save.

Python is designed to be used when memory is cheap, programmer's time is 
expensive, and it is better to throw extra memory at a problem than to 
try to solve it in the smallest amount of memory possible. If you think 
about programming in terms of saving bytes, you will probably hate 
Python. If you remember that even a cheap, bottom-of-the-range computer 
these days has approximately two BILLION bytes of memory, and most 
machines have much more than that, you won't stress so much about reusing 
objects.



    The Rules of Optimization are simple. Rule 1: Don’t do it. 
    Rule 2 (for experts only): Don’t do it yet.
    -- Michael A. Jackson, "Principles of Program Design"



-- 
Steven D'Aprano
"Ever since I learned about confirmation bias, I've been seeing
it everywhere." -- Jon Ronson
