[Python-checkins] peps: PEP 456: typo and grammar fixes.

georg.brandl python-checkins at python.org
Tue Oct 8 15:32:59 CEST 2013


http://hg.python.org/peps/rev/784562675896
changeset:   5177:784562675896
user:        gbrandl
date:        Tue Oct 08 15:32:43 2013 +0200
summary:
  PEP 456: typo and grammar fixes.

files:
  pep-0456.txt |  24 ++++++++++++------------
  1 files changed, 12 insertions(+), 12 deletions(-)


diff --git a/pep-0456.txt b/pep-0456.txt
--- a/pep-0456.txt
+++ b/pep-0456.txt
@@ -426,7 +426,7 @@
 ``generic_hash`` acts as a wrapper around ``_Py_HashBytes`` for the tp_hash
 slots of date, time and datetime types. timedelta objects are hashed by their
 state (days, seconds, microseconds) and tzinfo objects are not hashable. The
-data members of date, time and datetime types' struct are not void* aligned.
+data members of date, time and datetime types' struct are not ``void*`` aligned.
 This can easily be fixed by memcpy()ing four to ten bytes to an aligned
 buffer.
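
A minimal C sketch of the memcpy() approach described above (an illustration
under assumed names, not the PEP's code; ``hash_bytes`` is a hypothetical
stand-in for a ``_Py_HashBytes``-style helper and ``long`` for
``Py_hash_t``)::

    #include <assert.h>
    #include <stddef.h>
    #include <string.h>

    /* Hash a small, possibly unaligned region (e.g. the 4 to 10 byte state
       of a date/time object) by first copying it into an aligned buffer. */
    static long
    hash_unaligned(const unsigned char *data, size_t len,
                   long (*hash_bytes)(const void *, size_t))
    {
        union {
            void *align;            /* forces pointer alignment of buf */
            unsigned char buf[16];  /* large enough for 4 to 10 bytes */
        } u;

        assert(len <= sizeof(u.buf));
        memcpy(u.buf, data, len);       /* copy to the aligned buffer */
        return hash_bytes(u.buf, len);  /* hash the aligned copy */
    }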
 
@@ -444,7 +444,7 @@
 
 for ASCII string and ASCII bytes. Equal hash values result in a hash collision
 and therefore cause a minor speed penalty for dicts and sets with mixed keys.
-The cause of the collision could be removed by e.g. subtraction ``-2`` from
+The cause of the collision could be removed by e.g. subtracting ``2`` from
 the hash value of bytes. (``-2`` because ``hash(b"") == 0`` and ``-1`` is
 reserved.)
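
A hedged sketch of that adjustment (illustration only, not the PEP's code;
``hash_bytes`` is again a hypothetical stand-in for a ``_Py_HashBytes``-style
helper and ``long`` for ``Py_hash_t``)::

    #include <stddef.h>

    /* Offset the bytes hash by -2 so ASCII bytes no longer collide with the
       equal ASCII str; -2 because hash(b"") == 0 and -1 is reserved. */
    static long
    bytes_hash_offset(const void *data, size_t len,
                      long (*hash_bytes)(const void *, size_t))
    {
        long h = hash_bytes(data, len) - 2;
        if (h == -1)        /* keep -1 free as the error marker */
            h = -2;
        return h;
    }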
 
@@ -455,8 +455,8 @@
 TBD
 
 First tests suggest that SipHash performs a bit faster on 64-bit CPUs when
-it is feed with medium size byte strings as well as ASCII and UCS2 Unicode
-strings. For very short strings the setup costs for SipHash dominates its
+it is fed with medium size byte strings as well as ASCII and UCS2 Unicode
+strings. For very short strings the setup cost for SipHash dominates its
 speed but it is still in the same order of magnitude as the current FNV code.
 
 It is not yet known how the new distribution of hash values affects collisions
@@ -491,26 +491,26 @@
 
 The modifications don't alter any existing API.
 
-The output of `hash()` for strings and bytes are going to be different. The
+The output of ``hash()`` for strings and bytes is going to be different. The
 hash values for ASCII Unicode and ASCII bytes will stay equal.
 
 
 Alternative counter measures against hash collision DoS
 =======================================================
 
-Three alternative counter measures against hash collisions were discussed in
+Three alternative countermeasures against hash collisions were discussed in
 the past, but are not the subject of this PEP.
 
-1. Marc-Andre Lemburg has suggested that dicts shall count hash collision. In
+1. Marc-Andre Lemburg has suggested that dicts shall count hash collisions. In
    case an insert operation causes too many collisions, an exception shall be
    raised.
 
-2. Some application (e.g. PHP) have limit the amount of keys for GET and POST
-   HTTP request. The approach effectively leverages the impact of a hash
+2. Some applications (e.g. PHP) limit the number of keys for GET and POST
+   HTTP requests. The approach effectively mitigates the impact of a hash
    collision attack. (XXX citation needed)
 
 3. Hash maps have a worst case of O(n) for insertion and lookup of keys. This
-   results in an quadratic runtime during a hash collision attack. The
+   results in a quadratic runtime during a hash collision attack. The
    introduction of a new and additional data structure with O(log n)
    worst case behavior would eliminate the root cause. Data structures like
    red-black trees or prefix trees (trie [trie]_) would have other benefits,
@@ -531,8 +531,8 @@
 versions of the PEP aim for compile time configuration.
 
 
-Reference
-=========
+References
+==========
 
 * Issue 19183 [issue19183]_ contains a reference implementation.
 

-- 
Repository URL: http://hg.python.org/peps

