[Python-checkins] r83770 - python/branches/py3k/Doc/whatsnew/3.2.rst

raymond.hettinger python-checkins at python.org
Sat Aug 7 01:23:49 CEST 2010


Author: raymond.hettinger
Date: Sat Aug  7 01:23:49 2010
New Revision: 83770

Log:
Improve the whatsnew article on the lru/lfu cache decorators.

Modified:
   python/branches/py3k/Doc/whatsnew/3.2.rst

Modified: python/branches/py3k/Doc/whatsnew/3.2.rst
==============================================================================
--- python/branches/py3k/Doc/whatsnew/3.2.rst	(original)
+++ python/branches/py3k/Doc/whatsnew/3.2.rst	Sat Aug  7 01:23:49 2010
@@ -71,8 +71,8 @@
   save repeated queries to an external resource whenever the results are
   expected to be the same.
 
-  For example, adding an LFU decorator to a database query function can save
-  database accesses for the most popular searches::
+  For example, adding a caching decorator to a database query function can save
+  database accesses for popular searches::
 
       @functools.lfu_cache(maxsize=50)
       def get_phone_number(name):
@@ -80,21 +80,32 @@
           c.execute('SELECT phonenumber FROM phonelist WHERE name=?', (name,))
           return c.fetchone()[0]
 
-  The LFU (least-frequently-used) cache gives best results when the distribution
-  of popular queries tends to remain the same over time. In contrast, the LRU
-  (least-recently-used) cache gives best results when the distribution changes
-  over time (for example, the most popular news articles change each day as
-  newer articles are added).
+  The caches support two strategies for limiting their size to *maxsize*.  The
+  LFU (least-frequently-used) cache works best when popular queries remain the
+  same over time.  In contrast, the LRU (least-recently-used) cache works best
+  when query popularity changes over time (for example, the most popular news
+  articles change each day as newer articles are added).
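+
+  As a rough illustration of the difference, an LRU cache can be modeled as an
+  ordered mapping that evicts the entry that has gone longest without being
+  used, whereas an LFU cache keeps a use counter and evicts the least used
+  entry.  The following sketch is only a simplified model, not the actual
+  implementation::
+
+      from collections import OrderedDict
+
+      class SimpleLRU:
+          "Illustrative least-recently-used mapping with a size limit."
+          def __init__(self, maxsize):
+              self.maxsize = maxsize
+              self.data = OrderedDict()
+          def __getitem__(self, key):
+              value = self.data.pop(key)        # re-insert the key so that it
+              self.data[key] = value            # becomes the most recently used
+              return value
+          def __setitem__(self, key, value):
+              if key in self.data:
+                  del self.data[key]
+              elif len(self.data) >= self.maxsize:
+                  self.data.popitem(last=False) # evict the least recently used
+              self.data[key] = value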
 
-  The two caching decorators can be composed (nested) to handle hybrid cases
-  that have both long-term access patterns and some short-term access trends.
+  The two caching decorators can be composed (nested) to handle hybrid cases.
   For example, music searches can reflect both long-term patterns (popular
   classics) and short-term trends (new releases)::
 
-      @functools.lfu_cache(maxsize=500)
-      @functools.lru_cache(maxsize=100)
-      def find_music(song):
-          ...
+      @functools.lfu_cache(maxsize=500)
+      @functools.lru_cache(maxsize=100)
+      def find_lyrics(song):
+          query = 'http://www.example.com/songlist/%s' % urllib.parse.quote(song)
+          page = urllib.request.urlopen(query).read()
+          return parse_lyrics(page)
+
+  To help with choosing an effective cache size, the wrapped function is
+  instrumented with two attributes, 'hits' and 'misses'::
+
+      >>> for song in user_requests:
+      ...     find_lyrics(song)
+      >>> print(find_lyrics.hits)
+      4805
+      >>> print(find_lyrics.misses)
+      980
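+
+  For instance, the hit rate (the fraction of calls answered from the cache)
+  can be computed directly from those two counters, and a low rate suggests
+  that *maxsize* is too small for the workload.  Continuing the counts shown
+  above::
+
+      >>> hit_rate = find_lyrics.hits / (find_lyrics.hits + find_lyrics.misses)
+      >>> print(round(hit_rate, 2))
+      0.83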
 
   (Contributed by Raymond Hettinger)
 

