[Python-checkins] distutils2: merged Alexis/Konrad branches

tarek.ziade python-checkins at python.org
Sun Jul 4 11:48:41 CEST 2010


tarek.ziade pushed c9708f8ad60c to distutils2:

http://hg.python.org/distutils2/rev/c9708f8ad60c
changeset:   348:c9708f8ad60c
parent:      256:f696d39e9e86
parent:      347:a569f25a2038
user:        Tarek Ziade <tarek at ziade.org>
date:        Sun Jul 04 11:34:47 2010 +0200
summary:     merged Alexis/Konrad branches
files:       src/distutils2/dist.py

diff --git a/docs/design/pep-0376.txt b/docs/design/pep-0376.txt
--- a/docs/design/pep-0376.txt
+++ b/docs/design/pep-0376.txt
@@ -633,7 +633,7 @@
 
 Distributions installed using existing, pre-standardization formats do not have
 the necessary metadata available for the new API, and thus will be
-ignored. Third-party tools may of course to continue to support previous
+ignored. Third-party tools may of course continue to support previous
 formats in addition to the new format, in order to ease the transition.
 
 
diff --git a/docs/design/wiki.rst b/docs/design/wiki.rst
--- a/docs/design/wiki.rst
+++ b/docs/design/wiki.rst
@@ -282,7 +282,7 @@
   mailman/etc/*               = {config}                # 8
   mailman/foo/**/bar/*.cfg    = {config}/baz            # 9
   mailman/foo/**/*.cfg        = {config}/hmm            # 9, 10
-  some-new-semantic.txt       = {funky-crazy-category}  # 11
+  some-new-semantic.sns       = {funky-crazy-category}  # 11
 
 The glob definitions are relative paths that match files from the top
 of the source tree (the location of ``setup.cfg``). Forward slashes
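The glob-to-category mapping shown in the hunk above can be sketched with the stdlib. This is a hedged illustration, not distutils2's implementation: `fnmatch` covers the single-`*` patterns shown (not the recursive `**` form), and the `categorize` helper is hypothetical:

```python
import fnmatch

# Hypothetical mapping mirroring the setup.cfg excerpt above:
# (glob pattern, install category) pairs, checked in order.
RESOURCES = [
    ("mailman/etc/*",         "{config}"),
    ("some-new-semantic.sns", "{funky-crazy-category}"),
]

def categorize(path, resources=RESOURCES):
    """Return the install category of the first glob matching `path`."""
    for pattern, category in resources:
        if fnmatch.fnmatch(path, pattern):
            return category
    return None

print(categorize("mailman/etc/mailman.cfg"))  # -> {config}
```

A file matching no pattern simply gets no category here; the real scheme reports such files as errors or leaves them uninstalled depending on configuration.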
diff --git a/docs/source/index.rst b/docs/source/index.rst
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -14,6 +14,10 @@
    metadata
    pkgutil
    depgraph
+   new_commands
+   test_framework
+   pypi
+   version
 
 Indices and tables
 ==================
diff --git a/docs/source/metadata.rst b/docs/source/metadata.rst
--- a/docs/source/metadata.rst
+++ b/docs/source/metadata.rst
@@ -17,7 +17,7 @@
 Reading metadata
 ================
 
-The :class:`DistributionMetadata` class can be instantiated with the path of
+The :class:`DistributionMetadata` class can be instanciated with the path of
 the metadata file, and provides a dict-like interface to the values::
 
     >>> from distutils2.metadata import DistributionMetadata
@@ -32,14 +32,14 @@
     ["pywin32; sys.platform == 'win32'", "Sphinx"]
 
 The fields that supports environment markers can be automatically ignored if
-the object is instantiated using the ``platform_dependent`` option.
+the object is instantiated using the ``platform_dependant`` option.
 :class:`DistributionMetadata` will interpret in the case the markers and will
 automatically remove the fields that are not compliant with the running
 environment. Here's an example under Mac OS X. The win32 dependency
 we saw earlier is ignored::
 
     >>> from distutils2.metadata import DistributionMetadata
-    >>> metadata = DistributionMetadata('PKG-INFO', platform_dependent=True)
+    >>> metadata = DistributionMetadata('PKG-INFO', platform_dependant=True)
     >>> metadata['Requires-Dist']
     ['bar']
 
@@ -53,7 +53,7 @@
 
     >>> from distutils2.metadata import DistributionMetadata
     >>> context = {'sys.platform': 'win32'}
-    >>> metadata = DistributionMetadata('PKG-INFO', platform_dependent=True,
+    >>> metadata = DistributionMetadata('PKG-INFO', platform_dependant=True,
     ...                                 execution_context=context)
     ...
     >>> metadata['Requires-Dist'] = ["pywin32; sys.platform == 'win32'",
@@ -71,7 +71,7 @@
     >>> metadata.write('/to/my/PKG-INFO')
 
 The class will pick the best version for the metadata, depending on the values
-provided. If all the values provided exist in all versions, the class will
+provided. If all the values provided exists in all versions, the class will
 use :attr:`metadata.PKG_INFO_PREFERRED_VERSION`. It is set by default to 1.0.
 
 
@@ -79,7 +79,7 @@
 ==================================
 
 Some fields in :pep:`345` have to follow a version scheme in their versions
-predicate. When the scheme is violated, a warning is emitted::
+predicate. When the scheme is violated, a warning is emited::
 
     >>> from distutils2.metadata import DistributionMetadata
     >>> metadata = DistributionMetadata()
@@ -90,3 +90,6 @@
 
 
 .. TODO talk about check()
+
+
+
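The ``platform_dependant`` filtering described in the hunks above boils down to evaluating each field's environment marker against the running platform. A minimal sketch, assuming a reduced marker grammar of a single `==` comparison (the real `distutils2.metadata` logic supports more operators); the `keep_requirement` helper is hypothetical:

```python
# Reduced environment-marker check: a requirement is either unconditional
# ("Sphinx") or carries a marker after ';' ("pywin32; sys.platform == 'win32'").
def keep_requirement(requirement, context):
    """Return True if `requirement` has no marker, or its marker matches."""
    if ';' not in requirement:
        return True
    _, marker = requirement.split(';', 1)
    lhs, rhs = [part.strip() for part in marker.split('==', 1)]
    return context.get(lhs) == rhs.strip("'\"")

# Under Mac OS X the win32-only dependency is dropped, as in the doc example:
context = {'sys.platform': 'darwin'}
reqs = ["pywin32; sys.platform == 'win32'", "Sphinx"]
print([r for r in reqs if keep_requirement(r, context)])  # ['Sphinx']
```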
diff --git a/docs/source/new_commands.rst b/docs/source/new_commands.rst
new file mode 100644
--- /dev/null
+++ b/docs/source/new_commands.rst
@@ -0,0 +1,65 @@
+========
+Commands
+========
+
+Distutils2 provides a set of commands that are not present in distutils itself.
+You might recognize some of them from other projects, like Distribute or
+Setuptools.
+
+``upload_docs`` - Upload package documentation to PyPI
+======================================================
+
+PyPI now supports uploading project documentation to the dedicated URL
+http://packages.python.org/<project>/.
+
+The ``upload_docs`` command will create the necessary zip file out of a
+documentation directory and will post it to the repository.
+
+Note that to upload the documentation of a project, the corresponding version
+must already be registered with PyPI, using the distutils ``register``
+command -- just like the ``upload`` command.
+
+Assuming there is an ``Example`` project with documentation in the
+subdirectory ``docs``, e.g.::
+
+  Example/
+  |-- example.py
+  |-- setup.cfg
+  |-- setup.py
+  |-- docs
+  |   |-- build
+  |   |   `-- html
+  |   |       |-- index.html
+  |   |       `-- tips_tricks.html
+  |   |-- conf.py
+  |   |-- index.txt
+  |   `-- tips_tricks.txt
+
+You can simply pass the documentation directory path to the ``upload_docs``
+command::
+
+    python setup.py upload_docs --upload-dir=docs/build/html
+
+As with any other ``setuptools``-based command, you can define useful
+defaults in the ``setup.cfg`` of your Python project, e.g.:
+
+.. code-block:: ini
+
+    [upload_docs]
+    upload-dir = docs/build/html
+
+The ``upload_docs`` command has the following options:
+
+``--upload-dir``
+    The directory to be uploaded to the repository. The default value is
+    ``docs`` in the project root.
+
+``--show-response``
+    Display the full response text from the server; this is useful for
+    debugging PyPI problems.
+
+``--repository=URL, -r URL``
+    The URL of the repository to upload to.  Defaults to
+    http://pypi.python.org/pypi (i.e., the main PyPI installation).
+
+
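Under the hood, ``upload_docs`` has to turn the ``--upload-dir`` tree into a zip archive before posting it. A hedged sketch of that packing step, in modern Python (the `zip_docs` helper is illustrative, not the command's actual code):

```python
import os
import zipfile
from io import BytesIO

def zip_docs(upload_dir):
    """Walk `upload_dir` and return its zipped contents as bytes.

    Archive member names are made relative to `upload_dir`, so an
    uploaded docs/build/html tree unpacks with index.html at its root.
    """
    buffer = BytesIO()
    with zipfile.ZipFile(buffer, 'w', zipfile.ZIP_DEFLATED) as archive:
        for root, _, files in os.walk(upload_dir):
            for name in files:
                path = os.path.join(root, name)
                archive.write(path, os.path.relpath(path, upload_dir))
    return buffer.getvalue()
```

The command then submits this payload as a multipart POST to the repository URL, alongside the project name and version it registered earlier.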
diff --git a/docs/source/pypi.rst b/docs/source/pypi.rst
new file mode 100644
--- /dev/null
+++ b/docs/source/pypi.rst
@@ -0,0 +1,195 @@
+=========================================
+Tools to query PyPI: the PyPI package
+=========================================
+
+Distutils2 comes with a module (namely, `distutils2.pypi`) which contains
+facilities to access the Python Package Index (known as "PyPI", available at
+the URL `http://pypi.python.org`).
+
+There are two ways to retrieve data from PyPI: using the *simple* API, and
+using *XML-RPC*. The first one is in fact a set of HTML pages available at
+`http://pypi.python.org/simple/`, and the second one exposes a set of XML-RPC
+methods. In order to reduce the load caused by running remote methods on
+the PyPI server (by using the XML-RPC methods), the best way to retrieve
+information is to use the simple API whenever it contains the information you
+need.
+
+Distutils2 provides two Python modules to ease the work with those two APIs:
+`distutils2.pypi.simple` and `distutils2.pypi.xmlrpc`. Both of them depend on
+another Python module: `distutils2.pypi.dist`.
+
+
+Requesting information via the "simple" API `distutils2.pypi.simple`
+====================================================================
+
+`distutils2.pypi.simple` can process the Python Package Index and return
+download URLs of distributions, for specific or latest versions, but it also
+can process external HTML pages, with the goal of finding versions of Python
+distributions that are *not hosted on PyPI*.
+
+You should use `distutils2.pypi.simple` for:
+
+    * Searching distributions by name and version.
+    * Processing PyPI external pages.
+    * Downloading distributions by name and version.
+
+And it should not be used for:
+
+    * Tasks that would require overly long index processing (like finding all
+      distributions with a specific version, no matter the name).
+
+API
+----
+
+Here is a complete overview of the API of the `SimpleIndex` class.
+
+.. autoclass:: distutils2.pypi.simple.SimpleIndex
+    :members:
+
+Usage Examples
+--------------
+
+To help you understand how to use the `SimpleIndex` class, here are some basic
+usage examples.
+
+Request PyPI to get a specific distribution
+++++++++++++++++++++++++++++++++++++++++++++
+
+Suppose you want to scan the PyPI index to get a list of distributions for
+the "foobar" project. You can use the ``find`` method for that::
+
+    >>> from distutils2.pypi import SimpleIndex
+    >>> client = SimpleIndex()
+    >>> client.find("foobar")
+    [<PyPIDistribution "Foobar 1.1">, <PyPIDistribution "Foobar 1.2">]
+    
+Note that you can also query the client for specific versions, using version
+specifiers (described in `PEP 345
+<http://www.python.org/dev/peps/pep-0345/#version-specifiers>`_)::
+
+    >>> client.find("foobar < 1.2")
+    [<PyPIDistribution "foobar 1.1">, ]
+
+`find` returns a list of distributions, but you can also get the latest
+distribution (the most up to date) that fulfills your requirements, like this::
+    
+    >>> client.get("foobar < 1.2")
+    <PyPIDistribution "foobar 1.1">
+
+Download distributions
++++++++++++++++++++++++
+
+As it can get the URLs of distributions provided by PyPI, the `SimpleIndex`
+client can also download the distributions and put them in a temporary
+destination for you::
+
+    >>> client.download("foobar")
+    /tmp/temp_dir/foobar-1.2.tar.gz
+
+You can also specify the directory you want to download to::
+    
+    >>> client.download("foobar", "/path/to/my/dir")
+    /path/to/my/dir/foobar-1.2.tar.gz
+
+While downloading, the MD5 hash of the archive is checked; if it does not
+match, the download is tried another time, and if it fails again,
+`MD5HashDoesNotMatchError` is raised.
+
+Internally, it is not `SimpleIndex` that downloads the distributions, but the
+`PyPIDistribution` class. Please refer to its documentation for more details.
+
+Following PyPI external links
+++++++++++++++++++++++++++++++
+
+The default behavior for distutils2 is to *not* follow the links provided
+by HTML pages in the "simple index" when looking for downloads related to
+distributions.
+
+It's possible to tell the PyPIClient to follow external links by setting the
+`follow_externals` attribute, at instantiation time or afterwards::
+
+    >>> client = SimpleIndex(follow_externals=True)
+
+or ::
+
+    >>> client = SimpleIndex()
+    >>> client.follow_externals = True
+
+Working with external indexes, and mirrors
++++++++++++++++++++++++++++++++++++++++++++
+
+The default `SimpleIndex` behavior is to rely on the Python Package Index served
+on PyPI (http://pypi.python.org/simple).
+
+As you may need to work with a local index, or with private indexes, you can
+specify one using the ``index_url`` parameter::
+
+    >>> client = SimpleIndex(index_url="file://filesystem/path/")
+
+or ::
+
+    >>> client = SimpleIndex(index_url="http://some.specific.url/")
+
+You can also specify mirrors to fall back on in case the first ``index_url``
+you provided does not respond, or does not respond correctly. The default
+behavior for `SimpleIndex` is to use the list provided by Python.org DNS
+records, as described in :pep:`381` about the mirroring infrastructure.
+
+If you don't want to rely on these, you can specify the list of mirrors to
+try via the `mirrors` attribute. It's a simple iterable::
+
+    >>> mirrors = ["http://first.mirror","http://second.mirror"]
+    >>> client = SimpleIndex(mirrors=mirrors)
+
+
+Requesting information via XML-RPC (`distutils2.pypi.XmlRpcIndex`)
+==========================================================================
+
+The other way to query the Python Package Index is to use the XML-RPC
+methods. Distutils2 provides a simple wrapper around `xmlrpclib
+<http://docs.python.org/library/xmlrpclib.html>`_, that can return
+`PyPIDistribution` objects.
+
+::
+
+    >>> from distutils2.pypi import XmlRpcIndex
+    >>> client = XmlRpcIndex()
+
+
+PyPI Distributions
+==================
+
+Both the `SimpleIndex` and `XmlRpcIndex` classes work with the classes provided
+in the `pypi.dist` package.
+
+`PyPIDistribution`
+------------------
+
+`PyPIDistribution` is a simple class that defines the following attributes:
+
+:name:
+    The name of the package (`foobar` in our examples here).
+:version:
+    The version of the package
+:location:
+    If the files from the archive have been downloaded, this is the path
+    where you can find them.
+:url:
+    The URL of the distribution.
+
+.. autoclass:: distutils2.pypi.dist.PyPIDistribution
+    :members:
+
+`PyPIDistributions`
+-------------------
+
+The `dist` module also provides another class, to work with lists of
+`PyPIDistribution` objects. It allows filtering of results and is used as a
+container of `PyPIDistribution` instances.
+
+.. autoclass:: distutils2.pypi.dist.PyPIDistributions
+    :members:
+
+At a higher level
+=================
+
+XXX : A description of a wrapper around the PyPI simple and XML-RPC indexes
+(PyPIIndex ?)
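To make the "simple" API concrete: each `/simple/<project>/` page is plain HTML whose ``<a href>`` links point at downloadable archives. This modern-Python sketch extracts those links with the stdlib parser; the real `SimpleIndex` does much more (rel handling, external pages, MD5 fragments), so `LinkCollector` is purely illustrative:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href attribute from the <a> tags of a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            self.links.extend(v for k, v in attrs if k == 'href')

# A miniature stand-in for http://pypi.python.org/simple/foobar/
page = """<html><body>
<a href="../../packages/source/f/foobar/foobar-1.1.tar.gz">foobar-1.1</a>
<a href="../../packages/source/f/foobar/foobar-1.2.tar.gz">foobar-1.2</a>
</body></html>"""

collector = LinkCollector()
collector.feed(page)
print(collector.links)
```

Filtering those links by archive name and version is then enough to implement a basic `find`.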
diff --git a/docs/source/test_framework.rst b/docs/source/test_framework.rst
new file mode 100644
--- /dev/null
+++ b/docs/source/test_framework.rst
@@ -0,0 +1,88 @@
+==============
+Test Framework
+==============
+
+When you are testing code that works with distutils, you might find these tools
+useful.
+
+``PyPIServer``
+==============
+
+PyPIServer is a class that implements an HTTP server running in a separate
+thread. All it does is record the requests for further inspection. The recorded
+data is available under the ``requests`` attribute. The default
+HTTP response can be overridden with the ``default_response_status``,
+``default_response_headers`` and ``default_response_data`` attributes.
+
+By default, when accessed with URLs beginning with `/simple/`,
+the server records your requests, but will also look for files under
+the `/tests/pypiserver/simple/` path.
+
+You can tell the server to serve static files for other paths. This can be
+accomplished by using the `static_uri_paths` parameter, as below::
+
+    server = PyPIServer(static_uri_paths=["first_path", "second_path"])
+
+You need to create the content that will be served under the
+`/tests/pypiserver/default` path. If you want to serve content from another
+place, you can also specify another filesystem path (which needs to be under
+`tests/pypiserver/`). This will replace the default behavior of the server, and
+it will not serve content from the `default` dir ::
+
+    server = PyPIServer(static_filesystem_paths=["path/to/your/dir"])
+
+If you just need to add some paths to the existing ones, you can do as shown,
+keeping in mind that the server will always try to load paths in reverse order
+(e.g. here, try "another/super/path" then the default one) ::
+
+    server = PyPIServer(test_static_path="another/super/path")
+    server = PyPIServer("another/super/path")
+    # or 
+    server.static_filesystem_paths.append("another/super/path")
+
+As a result, when your tests need to use the PyPIServer, the best practice,
+in order to isolate the test cases, is to place the common files in the
+`default` folder, and to create a directory for each specific test case::
+
+    server = PyPIServer(static_filesystem_paths = ["default", "test_pypi_server"],
+        static_uri_paths=["simple", "external"])
+
+``PyPIServerTestCase``
+======================
+
+``PyPIServerTestCase`` is a test case class with setUp and tearDown methods that
+take care of a single PyPIServer instance attached as a ``pypi`` attribute on
+the test class. Use it as one of the base classes in your test case::
+
+  class UploadTestCase(PyPIServerTestCase):
+      def test_something(self):
+          cmd = self.prepare_command()
+          cmd.ensure_finalized()
+          cmd.repository = self.pypi.full_address
+          cmd.run()
+
+          environ, request_data = self.pypi.requests[-1]
+          self.assertEqual(request_data, EXPECTED_REQUEST_DATA)
+
+The ``use_pypi_server`` decorator
+=================================
+
+You can also use a decorator for your tests, if you do not need the same server
+instance across your whole test case. This way, you can specify, for each test
+method, some initialization parameters for the server.
+
+For this, you need to add a `server` parameter to your method, like this::
+
+    class SampleTestCase(TestCase):
+        @use_pypi_server()
+        def test_something(self, server):
+            # your tests go here
+
+The decorator will instantiate the server for you, and start and stop it just
+before and after your method call. You can also pass parameters to the server
+initializer, just like this::
+
+    class SampleTestCase(TestCase):
+        @use_pypi_server("test_case_name")
+        def test_something(self, server):
+            # something
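The core idea of `PyPIServer` — an HTTP server on a background thread that records requests for later assertions — can be sketched in a few lines. This is a modern-Python illustration, not the distutils2 class itself (which also serves static fixture directories):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class RecordingHandler(BaseHTTPRequestHandler):
    """Record each request path on the server, then answer 200."""
    def do_GET(self):
        self.server.requests.append(self.path)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello")

    def log_message(self, *args):  # keep test output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), RecordingHandler)  # port 0: pick a free port
server.requests = []
thread = threading.Thread(target=server.serve_forever)
thread.daemon = True
thread.start()

url = "http://127.0.0.1:%d/simple/foobar/" % server.server_port
body = urlopen(url).read()
server.shutdown()
server.server_close()
```

A test can then assert on ``server.requests[-1]`` exactly as ``PyPIServerTestCase`` does with ``self.pypi.requests``.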
diff --git a/docs/source/version.rst b/docs/source/version.rst
new file mode 100644
--- /dev/null
+++ b/docs/source/version.rst
@@ -0,0 +1,64 @@
+======================
+Working with versions
+======================
+
+Distutils2 ships with a Python package capable of working with version numbers.
+It's an implementation of the version specifiers `as defined in PEP 345
+<http://www.python.org/dev/peps/pep-0345/#version-specifiers>`_ about
+Metadata.
+
+`distutils2.version.NormalizedVersion`
+======================================
+
+A normalized version corresponds to a specific version of a distribution, as
+described in PEP 345. You can work with the `NormalizedVersion` class like
+this::
+
+    >>> NormalizedVersion("1.2b1")
+    NormalizedVersion('1.2b1')
+
+If you try to use irrational version specifiers, an `IrrationalVersionError`
+will be raised::
+
+    >>> NormalizedVersion("irrational_version_number")
+    ...
+    IrrationalVersionError: irrational_version_number
+
+You can compare NormalizedVersion objects, like this::
+
+    >>> NormalizedVersion("1.2b1") < NormalizedVersion("1.2")
+    True
+
+NormalizedVersion is used internally by `VersionPredicate` to do its work.
+
+`distutils2.version.suggest_normalized_version`
+-----------------------------------------------
+
+You can also let the normalized version be suggested to you, using the
+`suggest_normalized_version` function::
+
+    >>> suggest_normalized_version('2.1-rc1') 
+    2.1c1
+
+If `suggest_normalized_version` can't actually suggest a version, it will
+return `None`::
+
+    >>> print suggest_normalized_version('not a version')
+    None
+
+`distutils2.version.VersionPredicate`
+=====================================
+
+`VersionPredicate` knows how to parse predicates like "ProjectName (>=version)".
+The class also provides a `match` method to test whether a version number
+matches the version predicate::
+
+    >>> version = VersionPredicate("ProjectName (<1.2,>1.0)")
+    >>> version.match("1.2.1")
+    False
+    >>> version.match("1.1.1")
+    True
+
+`is_valid_predicate`
+--------------------
+
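The ordering shown above ("1.2b1" < "1.2") can be illustrated with a toy key function: releases compare as integer tuples, and a missing pre-release marker ranks highest. This is a simplification for intuition only, not `NormalizedVersion`'s actual algorithm:

```python
def toy_key(version):
    """Split '1.2b1' into ((1, 2), ('b', 1)); final releases get ('z', 0).

    'z' sorts after the pre-release markers 'a', 'b' and 'c', so a final
    release always ranks above its own pre-releases.
    """
    for marker in ('a', 'b', 'c'):
        if marker in version:
            release, pre = version.split(marker, 1)
            return (tuple(int(p) for p in release.rstrip('.').split('.')),
                    (marker, int(pre)))
    return (tuple(int(p) for p in version.split('.')), ('z', 0))

print(toy_key("1.2b1") < toy_key("1.2"))  # True
```

The real class additionally normalizes spellings like "2.1-rc1" (hence `suggest_normalized_version`) and rejects irrational inputs outright.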
diff --git a/src/CONTRIBUTORS.txt b/src/CONTRIBUTORS.txt
--- a/src/CONTRIBUTORS.txt
+++ b/src/CONTRIBUTORS.txt
@@ -18,6 +18,7 @@
 - Jeremy Kloth
 - Martin von Löwis
 - Carl Meyer
+- Alexis Metaireau
 - Michael Mulich
 - George Peris
 - Sean Reifschneider
diff --git a/src/Modules/_hashopenssl.c b/src/Modules/_hashopenssl.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/_hashopenssl.c
@@ -0,0 +1,524 @@
+/* Module that wraps all OpenSSL hash algorithms */
+
+/*
+ * Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
+ * Licensed to PSF under a Contributor Agreement.
+ *
+ * Derived from a skeleton of shamodule.c containing work performed by:
+ *
+ * Andrew Kuchling (amk at amk.ca)
+ * Greg Stein (gstein at lyra.org)
+ *
+ */
+
+#define PY_SSIZE_T_CLEAN
+
+#include "Python.h"
+#include "structmember.h"
+
+#if (PY_VERSION_HEX < 0x02050000)
+#define Py_ssize_t      int
+#endif
+
+/* EVP is the preferred interface to hashing in OpenSSL */
+#include <openssl/evp.h>
+
+#define MUNCH_SIZE INT_MAX
+
+
+#ifndef HASH_OBJ_CONSTRUCTOR
+#define HASH_OBJ_CONSTRUCTOR 0
+#endif
+
+typedef struct {
+    PyObject_HEAD
+    PyObject            *name;  /* name of this hash algorithm */
+    EVP_MD_CTX          ctx;    /* OpenSSL message digest context */
+} EVPobject;
+
+
+static PyTypeObject EVPtype;
+
+
+#define DEFINE_CONSTS_FOR_NEW(Name)  \
+    static PyObject *CONST_ ## Name ## _name_obj; \
+    static EVP_MD_CTX CONST_new_ ## Name ## _ctx; \
+    static EVP_MD_CTX *CONST_new_ ## Name ## _ctx_p = NULL;
+
+DEFINE_CONSTS_FOR_NEW(md5)
+DEFINE_CONSTS_FOR_NEW(sha1)
+DEFINE_CONSTS_FOR_NEW(sha224)
+DEFINE_CONSTS_FOR_NEW(sha256)
+DEFINE_CONSTS_FOR_NEW(sha384)
+DEFINE_CONSTS_FOR_NEW(sha512)
+
+
+static EVPobject *
+newEVPobject(PyObject *name)
+{
+    EVPobject *retval = (EVPobject *)PyObject_New(EVPobject, &EVPtype);
+
+    /* save the name for .name to return */
+    if (retval != NULL) {
+        Py_INCREF(name);
+        retval->name = name;
+    }
+
+    return retval;
+}
+
+/* Internal methods for a hash object */
+
+static void
+EVP_dealloc(PyObject *ptr)
+{
+    EVP_MD_CTX_cleanup(&((EVPobject *)ptr)->ctx);
+    Py_XDECREF(((EVPobject *)ptr)->name);
+    PyObject_Del(ptr);
+}
+
+
+/* External methods for a hash object */
+
+PyDoc_STRVAR(EVP_copy__doc__, "Return a copy of the hash object.");
+
+static PyObject *
+EVP_copy(EVPobject *self, PyObject *unused)
+{
+    EVPobject *newobj;
+
+    if ( (newobj = newEVPobject(self->name))==NULL)
+        return NULL;
+
+    EVP_MD_CTX_copy(&newobj->ctx, &self->ctx);
+    return (PyObject *)newobj;
+}
+
+PyDoc_STRVAR(EVP_digest__doc__,
+"Return the digest value as a string of binary data.");
+
+static PyObject *
+EVP_digest(EVPobject *self, PyObject *unused)
+{
+    unsigned char digest[EVP_MAX_MD_SIZE];
+    EVP_MD_CTX temp_ctx;
+    PyObject *retval;
+    unsigned int digest_size;
+
+    EVP_MD_CTX_copy(&temp_ctx, &self->ctx);
+    digest_size = EVP_MD_CTX_size(&temp_ctx);
+    EVP_DigestFinal(&temp_ctx, digest, NULL);
+
+    retval = PyString_FromStringAndSize((const char *)digest, digest_size);
+    EVP_MD_CTX_cleanup(&temp_ctx);
+    return retval;
+}
+
+PyDoc_STRVAR(EVP_hexdigest__doc__,
+"Return the digest value as a string of hexadecimal digits.");
+
+static PyObject *
+EVP_hexdigest(EVPobject *self, PyObject *unused)
+{
+    unsigned char digest[EVP_MAX_MD_SIZE];
+    EVP_MD_CTX temp_ctx;
+    PyObject *retval;
+    char *hex_digest;
+    unsigned int i, j, digest_size;
+
+    /* Get the raw (binary) digest value */
+    EVP_MD_CTX_copy(&temp_ctx, &self->ctx);
+    digest_size = EVP_MD_CTX_size(&temp_ctx);
+    EVP_DigestFinal(&temp_ctx, digest, NULL);
+
+    EVP_MD_CTX_cleanup(&temp_ctx);
+
+    /* Create a new string */
+    /* NOTE: not thread safe! modifying an already created string object */
+    /* (not a problem because we hold the GIL by default) */
+    retval = PyString_FromStringAndSize(NULL, digest_size * 2);
+    if (!retval)
+        return NULL;
+    hex_digest = PyString_AsString(retval);
+    if (!hex_digest) {
+        Py_DECREF(retval);
+        return NULL;
+    }
+
+    /* Make hex version of the digest */
+    for (i = j = 0; i < digest_size; i++) {
+        char c;
+        c = (digest[i] >> 4) & 0xf;
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+        c = (digest[i] & 0xf);
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+    }
+    return retval;
+}
+
+PyDoc_STRVAR(EVP_update__doc__,
+"Update this hash object's state with the provided string.");
+
+static PyObject *
+EVP_update(EVPobject *self, PyObject *args)
+{
+    unsigned char *cp;
+    Py_ssize_t len;
+
+    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
+        return NULL;
+
+    if (len > 0 && len <= MUNCH_SIZE) {
+        EVP_DigestUpdate(&self->ctx, cp, Py_SAFE_DOWNCAST(len, Py_ssize_t,
+                                                          unsigned int));
+    } else {
+        Py_ssize_t offset = 0;
+        while (len) {
+            unsigned int process = len > MUNCH_SIZE ? MUNCH_SIZE : len;
+            EVP_DigestUpdate(&self->ctx, cp + offset, process);
+            len -= process;
+            offset += process;
+        }
+    }
+    Py_INCREF(Py_None);
+    return Py_None;
+}
+
+static PyMethodDef EVP_methods[] = {
+    {"update",	  (PyCFunction)EVP_update,    METH_VARARGS, EVP_update__doc__},
+    {"digest",	  (PyCFunction)EVP_digest,    METH_NOARGS,  EVP_digest__doc__},
+    {"hexdigest", (PyCFunction)EVP_hexdigest, METH_NOARGS,  EVP_hexdigest__doc__},
+    {"copy",	  (PyCFunction)EVP_copy,      METH_NOARGS,  EVP_copy__doc__},
+    {NULL,	  NULL}		/* sentinel */
+};
+
+static PyObject *
+EVP_get_block_size(EVPobject *self, void *closure)
+{
+    return PyInt_FromLong(EVP_MD_CTX_block_size(&((EVPobject *)self)->ctx));
+}
+
+static PyObject *
+EVP_get_digest_size(EVPobject *self, void *closure)
+{
+    return PyInt_FromLong(EVP_MD_CTX_size(&((EVPobject *)self)->ctx));
+}
+
+static PyMemberDef EVP_members[] = {
+    {"name", T_OBJECT, offsetof(EVPobject, name), READONLY, PyDoc_STR("algorithm name.")},
+    {NULL}  /* Sentinel */
+};
+
+static PyGetSetDef EVP_getseters[] = {
+    {"digest_size",
+     (getter)EVP_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {"block_size",
+     (getter)EVP_get_block_size, NULL,
+     NULL,
+     NULL},
+    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
+     * the old sha module also supported 'digestsize'.  ugh. */
+    {"digestsize",
+     (getter)EVP_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {NULL}  /* Sentinel */
+};
+
+
+static PyObject *
+EVP_repr(PyObject *self)
+{
+    char buf[100];
+    PyOS_snprintf(buf, sizeof(buf), "<%s HASH object @ %p>",
+            PyString_AsString(((EVPobject *)self)->name), self);
+    return PyString_FromString(buf);
+}
+
+#if HASH_OBJ_CONSTRUCTOR
+static int
+EVP_tp_init(EVPobject *self, PyObject *args, PyObject *kwds)
+{
+    static char *kwlist[] = {"name", "string", NULL};
+    PyObject *name_obj = NULL;
+    char *nameStr;
+    unsigned char *cp = NULL;
+    Py_ssize_t len = 0;
+    const EVP_MD *digest;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|s#:HASH", kwlist,
+                                     &name_obj, &cp, &len)) {
+        return -1;
+    }
+
+    if (!PyArg_Parse(name_obj, "s", &nameStr)) {
+        PyErr_SetString(PyExc_TypeError, "name must be a string");
+        return -1;
+    }
+
+    digest = EVP_get_digestbyname(nameStr);
+    if (!digest) {
+        PyErr_SetString(PyExc_ValueError, "unknown hash function");
+        return -1;
+    }
+    EVP_DigestInit(&self->ctx, digest);
+
+    self->name = name_obj;
+    Py_INCREF(self->name);
+
+    if (cp && len) {
+        if (len > 0 && len <= MUNCH_SIZE) {
+            EVP_DigestUpdate(&self->ctx, cp, Py_SAFE_DOWNCAST(len, Py_ssize_t,
+                                                              unsigned int));
+        } else {
+            Py_ssize_t offset = 0;
+            while (len) {
+                unsigned int process = len > MUNCH_SIZE ? MUNCH_SIZE : len;
+                EVP_DigestUpdate(&self->ctx, cp + offset, process);
+                len -= process;
+                offset += process;
+            }
+        }
+    }
+    
+    return 0;
+}
+#endif
+
+
+PyDoc_STRVAR(hashtype_doc,
+"A hash represents the object used to calculate a checksum of a\n\
+string of information.\n\
+\n\
+Methods:\n\
+\n\
+update() -- updates the current digest with an additional string\n\
+digest() -- return the current digest value\n\
+hexdigest() -- return the current digest as a string of hexadecimal digits\n\
+copy() -- return a copy of the current hash object\n\
+\n\
+Attributes:\n\
+\n\
+name -- the hash algorithm being used by this object\n\
+digest_size -- number of bytes in this hash's output\n");
+
+static PyTypeObject EVPtype = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_hashlib.HASH",    /*tp_name*/
+    sizeof(EVPobject),	/*tp_basicsize*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    EVP_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,                  /*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    EVP_repr,           /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, /*tp_flags*/
+    hashtype_doc,       /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    EVP_methods,	/* tp_methods */
+    EVP_members,	/* tp_members */
+    EVP_getseters,      /* tp_getset */
+#if 1
+    0,                  /* tp_base */
+    0,                  /* tp_dict */
+    0,                  /* tp_descr_get */
+    0,                  /* tp_descr_set */
+    0,                  /* tp_dictoffset */
+#endif
+#if HASH_OBJ_CONSTRUCTOR
+    (initproc)EVP_tp_init, /* tp_init */
+#endif
+};
+
+static PyObject *
+EVPnew(PyObject *name_obj,
+       const EVP_MD *digest, const EVP_MD_CTX *initial_ctx,
+       const unsigned char *cp, Py_ssize_t len)
+{
+    EVPobject *self;
+
+    if (!digest && !initial_ctx) {
+        PyErr_SetString(PyExc_ValueError, "unsupported hash type");
+        return NULL;
+    }
+
+    if ((self = newEVPobject(name_obj)) == NULL)
+        return NULL;
+
+    if (initial_ctx) {
+        EVP_MD_CTX_copy(&self->ctx, initial_ctx);
+    } else {
+        EVP_DigestInit(&self->ctx, digest);
+    }
+
+    if (cp && len) {
+        if (len > 0 && len <= MUNCH_SIZE) {
+            EVP_DigestUpdate(&self->ctx, cp, Py_SAFE_DOWNCAST(len, Py_ssize_t,
+                                                              unsigned int));
+        } else {
+            Py_ssize_t offset = 0;
+            while (len) {
+                unsigned int process = len > MUNCH_SIZE ? MUNCH_SIZE : len;
+                EVP_DigestUpdate(&self->ctx, cp + offset, process);
+                len -= process;
+                offset += process;
+            }
+        }
+    }
+
+    return (PyObject *)self;
+}
+
+
+/* The module-level function: new() */
+
+PyDoc_STRVAR(EVP_new__doc__,
+"Return a new hash object using the named algorithm.\n\
+An optional string argument may be provided and will be\n\
+automatically hashed.\n\
+\n\
+The MD5 and SHA1 algorithms are always supported.\n");
+
+static PyObject *
+EVP_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"name", "string", NULL};
+    PyObject *name_obj = NULL;
+    char *name;
+    const EVP_MD *digest;
+    unsigned char *cp = NULL;
+    Py_ssize_t len = 0;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "O|s#:new", kwlist,
+                                     &name_obj, &cp, &len)) {
+        return NULL;
+    }
+
+    if (!PyArg_Parse(name_obj, "s", &name)) {
+        PyErr_SetString(PyExc_TypeError, "name must be a string");
+        return NULL;
+    }
+
+    digest = EVP_get_digestbyname(name);
+
+    return EVPnew(name_obj, digest, NULL, cp, len);
+}
+
+/*
+ *  This macro generates constructor function definitions for specific
+ *  hash algorithms.  These constructors are much faster than calling
+ *  the generic one passing it a Python string, and are noticeably
+ *  faster than calling a Python new() wrapper.  That's important for
+ *  code that wants to make hashes of a bunch of small strings.
+ */
+#define GEN_CONSTRUCTOR(NAME)  \
+    static PyObject * \
+    EVP_new_ ## NAME (PyObject *self, PyObject *args) \
+    { \
+        unsigned char *cp = NULL; \
+        Py_ssize_t len = 0; \
+     \
+        if (!PyArg_ParseTuple(args, "|s#:" #NAME , &cp, &len)) { \
+            return NULL; \
+        } \
+     \
+        return EVPnew( \
+                CONST_ ## NAME ## _name_obj, \
+                NULL, \
+                CONST_new_ ## NAME ## _ctx_p, \
+                cp, len); \
+    }
+
+/* a PyMethodDef structure for the constructor */
+#define CONSTRUCTOR_METH_DEF(NAME)  \
+    {"openssl_" #NAME, (PyCFunction)EVP_new_ ## NAME, METH_VARARGS, \
+        PyDoc_STR("Returns a " #NAME \
+                  " hash object; optionally initialized with a string") \
+    }
+
+/* used in the init function to set up a constructor */
+#define INIT_CONSTRUCTOR_CONSTANTS(NAME)  do { \
+    CONST_ ## NAME ## _name_obj = PyString_FromString(#NAME); \
+    if (EVP_get_digestbyname(#NAME)) { \
+        CONST_new_ ## NAME ## _ctx_p = &CONST_new_ ## NAME ## _ctx; \
+        EVP_DigestInit(CONST_new_ ## NAME ## _ctx_p, EVP_get_digestbyname(#NAME)); \
+    } \
+} while (0);
+
+GEN_CONSTRUCTOR(md5)
+GEN_CONSTRUCTOR(sha1)
+GEN_CONSTRUCTOR(sha224)
+GEN_CONSTRUCTOR(sha256)
+GEN_CONSTRUCTOR(sha384)
+GEN_CONSTRUCTOR(sha512)
+
+/* List of functions exported by this module */
+
+static struct PyMethodDef EVP_functions[] = {
+    {"new", (PyCFunction)EVP_new, METH_VARARGS|METH_KEYWORDS, EVP_new__doc__},
+    CONSTRUCTOR_METH_DEF(md5),
+    CONSTRUCTOR_METH_DEF(sha1),
+    CONSTRUCTOR_METH_DEF(sha224),
+    CONSTRUCTOR_METH_DEF(sha256),
+    CONSTRUCTOR_METH_DEF(sha384),
+    CONSTRUCTOR_METH_DEF(sha512),
+    {NULL,	NULL}		 /* Sentinel */
+};
+
+
+/* Initialize this module. */
+
+PyMODINIT_FUNC
+init_hashlib(void)
+{
+    PyObject *m;
+
+    OpenSSL_add_all_digests();
+
+    /* TODO build EVP_functions openssl_* entries dynamically based
+     * on what hashes are supported rather than listing many
+     * but having some be unsupported.  Only init appropriate
+     * constants. */
+
+    EVPtype.ob_type = &PyType_Type;
+    if (PyType_Ready(&EVPtype) < 0)
+        return;
+
+    m = Py_InitModule("_hashlib", EVP_functions);
+    if (m == NULL)
+        return;
+
+#if HASH_OBJ_CONSTRUCTOR
+    Py_INCREF(&EVPtype);
+    PyModule_AddObject(m, "HASH", (PyObject *)&EVPtype);
+#endif
+
+    /* these constants are used by the convenience constructors */
+    INIT_CONSTRUCTOR_CONSTANTS(md5);
+    INIT_CONSTRUCTOR_CONSTANTS(sha1);
+    INIT_CONSTRUCTOR_CONSTANTS(sha224);
+    INIT_CONSTRUCTOR_CONSTANTS(sha256);
+    INIT_CONSTRUCTOR_CONSTANTS(sha384);
+    INIT_CONSTRUCTOR_CONSTANTS(sha512);
+}
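An aside for reviewers: the `GEN_CONSTRUCTOR` macros above generate the fast `openssl_*` entry points that back the public `hashlib` module, while `EVP_new()` implements the name-based lookup. A minimal sketch of how this surfaces in Python, assuming a standard `hashlib` build with MD5 and SHA-256 available:

```python
import hashlib

# The generic constructor looks the algorithm up by name,
# mirroring EVP_new() / EVP_get_digestbyname() in the C code.
h = hashlib.new("md5")
h.update(b"abc")
print(h.hexdigest())  # 900150983cd24fb0d6963f7d28e17f72 (RFC 1321 test vector)

# The named constructors (generated by GEN_CONSTRUCTOR) skip the
# name lookup, which is why the comment calls them noticeably faster.
print(hashlib.sha256(b"abc").hexdigest())
```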
diff --git a/src/Modules/md5.c b/src/Modules/md5.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/md5.c
@@ -0,0 +1,381 @@
+/*
+  Copyright (C) 1999, 2000, 2002 Aladdin Enterprises.  All rights reserved.
+
+  This software is provided 'as-is', without any express or implied
+  warranty.  In no event will the authors be held liable for any damages
+  arising from the use of this software.
+
+  Permission is granted to anyone to use this software for any purpose,
+  including commercial applications, and to alter it and redistribute it
+  freely, subject to the following restrictions:
+
+  1. The origin of this software must not be misrepresented; you must not
+     claim that you wrote the original software. If you use this software
+     in a product, an acknowledgment in the product documentation would be
+     appreciated but is not required.
+  2. Altered source versions must be plainly marked as such, and must not be
+     misrepresented as being the original software.
+  3. This notice may not be removed or altered from any source distribution.
+
+  L. Peter Deutsch
+  ghost at aladdin.com
+
+ */
+/* $Id: md5.c,v 1.6 2002/04/13 19:20:28 lpd Exp $ */
+/*
+  Independent implementation of MD5 (RFC 1321).
+
+  This code implements the MD5 Algorithm defined in RFC 1321, whose
+  text is available at
+	http://www.ietf.org/rfc/rfc1321.txt
+  The code is derived from the text of the RFC, including the test suite
+  (section A.5) but excluding the rest of Appendix A.  It does not include
+  any code or documentation that is identified in the RFC as being
+  copyrighted.
+
+  The original and principal author of md5.c is L. Peter Deutsch
+  <ghost at aladdin.com>.  Other authors are noted in the change history
+  that follows (in reverse chronological order):
+
+  2002-04-13 lpd Clarified derivation from RFC 1321; now handles byte order
+	either statically or dynamically; added missing #include <string.h>
+	in library.
+  2002-03-11 lpd Corrected argument list for main(), and added int return
+	type, in test program and T value program.
+  2002-02-21 lpd Added missing #include <stdio.h> in test program.
+  2000-07-03 lpd Patched to eliminate warnings about "constant is
+	unsigned in ANSI C, signed in traditional"; made test program
+	self-checking.
+  1999-11-04 lpd Edited comments slightly for automatic TOC extraction.
+  1999-10-18 lpd Fixed typo in header comment (ansi2knr rather than md5).
+  1999-05-03 lpd Original version.
+ */
+
+#include "md5.h"
+#include <string.h>
+
+#undef BYTE_ORDER	/* 1 = big-endian, -1 = little-endian, 0 = unknown */
+#ifdef ARCH_IS_BIG_ENDIAN
+#  define BYTE_ORDER (ARCH_IS_BIG_ENDIAN ? 1 : -1)
+#else
+#  define BYTE_ORDER 0
+#endif
+
+#define T_MASK ((md5_word_t)~0)
+#define T1 /* 0xd76aa478 */ (T_MASK ^ 0x28955b87)
+#define T2 /* 0xe8c7b756 */ (T_MASK ^ 0x173848a9)
+#define T3    0x242070db
+#define T4 /* 0xc1bdceee */ (T_MASK ^ 0x3e423111)
+#define T5 /* 0xf57c0faf */ (T_MASK ^ 0x0a83f050)
+#define T6    0x4787c62a
+#define T7 /* 0xa8304613 */ (T_MASK ^ 0x57cfb9ec)
+#define T8 /* 0xfd469501 */ (T_MASK ^ 0x02b96afe)
+#define T9    0x698098d8
+#define T10 /* 0x8b44f7af */ (T_MASK ^ 0x74bb0850)
+#define T11 /* 0xffff5bb1 */ (T_MASK ^ 0x0000a44e)
+#define T12 /* 0x895cd7be */ (T_MASK ^ 0x76a32841)
+#define T13    0x6b901122
+#define T14 /* 0xfd987193 */ (T_MASK ^ 0x02678e6c)
+#define T15 /* 0xa679438e */ (T_MASK ^ 0x5986bc71)
+#define T16    0x49b40821
+#define T17 /* 0xf61e2562 */ (T_MASK ^ 0x09e1da9d)
+#define T18 /* 0xc040b340 */ (T_MASK ^ 0x3fbf4cbf)
+#define T19    0x265e5a51
+#define T20 /* 0xe9b6c7aa */ (T_MASK ^ 0x16493855)
+#define T21 /* 0xd62f105d */ (T_MASK ^ 0x29d0efa2)
+#define T22    0x02441453
+#define T23 /* 0xd8a1e681 */ (T_MASK ^ 0x275e197e)
+#define T24 /* 0xe7d3fbc8 */ (T_MASK ^ 0x182c0437)
+#define T25    0x21e1cde6
+#define T26 /* 0xc33707d6 */ (T_MASK ^ 0x3cc8f829)
+#define T27 /* 0xf4d50d87 */ (T_MASK ^ 0x0b2af278)
+#define T28    0x455a14ed
+#define T29 /* 0xa9e3e905 */ (T_MASK ^ 0x561c16fa)
+#define T30 /* 0xfcefa3f8 */ (T_MASK ^ 0x03105c07)
+#define T31    0x676f02d9
+#define T32 /* 0x8d2a4c8a */ (T_MASK ^ 0x72d5b375)
+#define T33 /* 0xfffa3942 */ (T_MASK ^ 0x0005c6bd)
+#define T34 /* 0x8771f681 */ (T_MASK ^ 0x788e097e)
+#define T35    0x6d9d6122
+#define T36 /* 0xfde5380c */ (T_MASK ^ 0x021ac7f3)
+#define T37 /* 0xa4beea44 */ (T_MASK ^ 0x5b4115bb)
+#define T38    0x4bdecfa9
+#define T39 /* 0xf6bb4b60 */ (T_MASK ^ 0x0944b49f)
+#define T40 /* 0xbebfbc70 */ (T_MASK ^ 0x4140438f)
+#define T41    0x289b7ec6
+#define T42 /* 0xeaa127fa */ (T_MASK ^ 0x155ed805)
+#define T43 /* 0xd4ef3085 */ (T_MASK ^ 0x2b10cf7a)
+#define T44    0x04881d05
+#define T45 /* 0xd9d4d039 */ (T_MASK ^ 0x262b2fc6)
+#define T46 /* 0xe6db99e5 */ (T_MASK ^ 0x1924661a)
+#define T47    0x1fa27cf8
+#define T48 /* 0xc4ac5665 */ (T_MASK ^ 0x3b53a99a)
+#define T49 /* 0xf4292244 */ (T_MASK ^ 0x0bd6ddbb)
+#define T50    0x432aff97
+#define T51 /* 0xab9423a7 */ (T_MASK ^ 0x546bdc58)
+#define T52 /* 0xfc93a039 */ (T_MASK ^ 0x036c5fc6)
+#define T53    0x655b59c3
+#define T54 /* 0x8f0ccc92 */ (T_MASK ^ 0x70f3336d)
+#define T55 /* 0xffeff47d */ (T_MASK ^ 0x00100b82)
+#define T56 /* 0x85845dd1 */ (T_MASK ^ 0x7a7ba22e)
+#define T57    0x6fa87e4f
+#define T58 /* 0xfe2ce6e0 */ (T_MASK ^ 0x01d3191f)
+#define T59 /* 0xa3014314 */ (T_MASK ^ 0x5cfebceb)
+#define T60    0x4e0811a1
+#define T61 /* 0xf7537e82 */ (T_MASK ^ 0x08ac817d)
+#define T62 /* 0xbd3af235 */ (T_MASK ^ 0x42c50dca)
+#define T63    0x2ad7d2bb
+#define T64 /* 0xeb86d391 */ (T_MASK ^ 0x14792c6e)
+
+
+static void
+md5_process(md5_state_t *pms, const md5_byte_t *data /*[64]*/)
+{
+    md5_word_t
+	a = pms->abcd[0], b = pms->abcd[1],
+	c = pms->abcd[2], d = pms->abcd[3];
+    md5_word_t t;
+#if BYTE_ORDER > 0
+    /* Define storage only for big-endian CPUs. */
+    md5_word_t X[16];
+#else
+    /* Define storage for little-endian or both types of CPUs. */
+    md5_word_t xbuf[16];
+    const md5_word_t *X;
+#endif
+
+    {
+#if BYTE_ORDER == 0
+	/*
+	 * Determine dynamically whether this is a big-endian or
+	 * little-endian machine, since we can use a more efficient
+	 * algorithm on the latter.
+	 */
+	static const int w = 1;
+
+	if (*((const md5_byte_t *)&w)) /* dynamic little-endian */
+#endif
+#if BYTE_ORDER <= 0		/* little-endian */
+	{
+	    /*
+	     * On little-endian machines, we can process properly aligned
+	     * data without copying it.
+	     */
+	    if (!((data - (const md5_byte_t *)0) & 3)) {
+		/* data are properly aligned */
+		X = (const md5_word_t *)data;
+	    } else {
+		/* not aligned */
+		memcpy(xbuf, data, 64);
+		X = xbuf;
+	    }
+	}
+#endif
+#if BYTE_ORDER == 0
+	else			/* dynamic big-endian */
+#endif
+#if BYTE_ORDER >= 0		/* big-endian */
+	{
+	    /*
+	     * On big-endian machines, we must arrange the bytes in the
+	     * right order.
+	     */
+	    const md5_byte_t *xp = data;
+	    int i;
+
+#  if BYTE_ORDER == 0
+	    X = xbuf;		/* (dynamic only) */
+#  else
+#    define xbuf X		/* (static only) */
+#  endif
+	    for (i = 0; i < 16; ++i, xp += 4)
+		xbuf[i] = xp[0] + (xp[1] << 8) + (xp[2] << 16) + (xp[3] << 24);
+	}
+#endif
+    }
+
+#define ROTATE_LEFT(x, n) (((x) << (n)) | ((x) >> (32 - (n))))
+
+    /* Round 1. */
+    /* Let [abcd k s i] denote the operation
+       a = b + ((a + F(b,c,d) + X[k] + T[i]) <<< s). */
+#define F(x, y, z) (((x) & (y)) | (~(x) & (z)))
+#define SET(a, b, c, d, k, s, Ti)\
+  t = a + F(b,c,d) + X[k] + Ti;\
+  a = ROTATE_LEFT(t, s) + b
+    /* Do the following 16 operations. */
+    SET(a, b, c, d,  0,  7,  T1);
+    SET(d, a, b, c,  1, 12,  T2);
+    SET(c, d, a, b,  2, 17,  T3);
+    SET(b, c, d, a,  3, 22,  T4);
+    SET(a, b, c, d,  4,  7,  T5);
+    SET(d, a, b, c,  5, 12,  T6);
+    SET(c, d, a, b,  6, 17,  T7);
+    SET(b, c, d, a,  7, 22,  T8);
+    SET(a, b, c, d,  8,  7,  T9);
+    SET(d, a, b, c,  9, 12, T10);
+    SET(c, d, a, b, 10, 17, T11);
+    SET(b, c, d, a, 11, 22, T12);
+    SET(a, b, c, d, 12,  7, T13);
+    SET(d, a, b, c, 13, 12, T14);
+    SET(c, d, a, b, 14, 17, T15);
+    SET(b, c, d, a, 15, 22, T16);
+#undef SET
+
+     /* Round 2. */
+     /* Let [abcd k s i] denote the operation
+          a = b + ((a + G(b,c,d) + X[k] + T[i]) <<< s). */
+#define G(x, y, z) (((x) & (z)) | ((y) & ~(z)))
+#define SET(a, b, c, d, k, s, Ti)\
+  t = a + G(b,c,d) + X[k] + Ti;\
+  a = ROTATE_LEFT(t, s) + b
+     /* Do the following 16 operations. */
+    SET(a, b, c, d,  1,  5, T17);
+    SET(d, a, b, c,  6,  9, T18);
+    SET(c, d, a, b, 11, 14, T19);
+    SET(b, c, d, a,  0, 20, T20);
+    SET(a, b, c, d,  5,  5, T21);
+    SET(d, a, b, c, 10,  9, T22);
+    SET(c, d, a, b, 15, 14, T23);
+    SET(b, c, d, a,  4, 20, T24);
+    SET(a, b, c, d,  9,  5, T25);
+    SET(d, a, b, c, 14,  9, T26);
+    SET(c, d, a, b,  3, 14, T27);
+    SET(b, c, d, a,  8, 20, T28);
+    SET(a, b, c, d, 13,  5, T29);
+    SET(d, a, b, c,  2,  9, T30);
+    SET(c, d, a, b,  7, 14, T31);
+    SET(b, c, d, a, 12, 20, T32);
+#undef SET
+
+     /* Round 3. */
+     /* Let [abcd k s i] denote the operation
+          a = b + ((a + H(b,c,d) + X[k] + T[i]) <<< s). */
+#define H(x, y, z) ((x) ^ (y) ^ (z))
+#define SET(a, b, c, d, k, s, Ti)\
+  t = a + H(b,c,d) + X[k] + Ti;\
+  a = ROTATE_LEFT(t, s) + b
+     /* Do the following 16 operations. */
+    SET(a, b, c, d,  5,  4, T33);
+    SET(d, a, b, c,  8, 11, T34);
+    SET(c, d, a, b, 11, 16, T35);
+    SET(b, c, d, a, 14, 23, T36);
+    SET(a, b, c, d,  1,  4, T37);
+    SET(d, a, b, c,  4, 11, T38);
+    SET(c, d, a, b,  7, 16, T39);
+    SET(b, c, d, a, 10, 23, T40);
+    SET(a, b, c, d, 13,  4, T41);
+    SET(d, a, b, c,  0, 11, T42);
+    SET(c, d, a, b,  3, 16, T43);
+    SET(b, c, d, a,  6, 23, T44);
+    SET(a, b, c, d,  9,  4, T45);
+    SET(d, a, b, c, 12, 11, T46);
+    SET(c, d, a, b, 15, 16, T47);
+    SET(b, c, d, a,  2, 23, T48);
+#undef SET
+
+     /* Round 4. */
+     /* Let [abcd k s i] denote the operation
+          a = b + ((a + I(b,c,d) + X[k] + T[i]) <<< s). */
+#define I(x, y, z) ((y) ^ ((x) | ~(z)))
+#define SET(a, b, c, d, k, s, Ti)\
+  t = a + I(b,c,d) + X[k] + Ti;\
+  a = ROTATE_LEFT(t, s) + b
+     /* Do the following 16 operations. */
+    SET(a, b, c, d,  0,  6, T49);
+    SET(d, a, b, c,  7, 10, T50);
+    SET(c, d, a, b, 14, 15, T51);
+    SET(b, c, d, a,  5, 21, T52);
+    SET(a, b, c, d, 12,  6, T53);
+    SET(d, a, b, c,  3, 10, T54);
+    SET(c, d, a, b, 10, 15, T55);
+    SET(b, c, d, a,  1, 21, T56);
+    SET(a, b, c, d,  8,  6, T57);
+    SET(d, a, b, c, 15, 10, T58);
+    SET(c, d, a, b,  6, 15, T59);
+    SET(b, c, d, a, 13, 21, T60);
+    SET(a, b, c, d,  4,  6, T61);
+    SET(d, a, b, c, 11, 10, T62);
+    SET(c, d, a, b,  2, 15, T63);
+    SET(b, c, d, a,  9, 21, T64);
+#undef SET
+
+     /* Then perform the following additions. (That is increment each
+        of the four registers by the value it had before this block
+        was started.) */
+    pms->abcd[0] += a;
+    pms->abcd[1] += b;
+    pms->abcd[2] += c;
+    pms->abcd[3] += d;
+}
+
+void
+md5_init(md5_state_t *pms)
+{
+    pms->count[0] = pms->count[1] = 0;
+    pms->abcd[0] = 0x67452301;
+    pms->abcd[1] = /*0xefcdab89*/ T_MASK ^ 0x10325476;
+    pms->abcd[2] = /*0x98badcfe*/ T_MASK ^ 0x67452301;
+    pms->abcd[3] = 0x10325476;
+}
+
+void
+md5_append(md5_state_t *pms, const md5_byte_t *data, int nbytes)
+{
+    const md5_byte_t *p = data;
+    int left = nbytes;
+    int offset = (pms->count[0] >> 3) & 63;
+    md5_word_t nbits = (md5_word_t)(nbytes << 3);
+
+    if (nbytes <= 0)
+	return;
+
+    /* Update the message length. */
+    pms->count[1] += nbytes >> 29;
+    pms->count[0] += nbits;
+    if (pms->count[0] < nbits)
+	pms->count[1]++;
+
+    /* Process an initial partial block. */
+    if (offset) {
+	int copy = (offset + nbytes > 64 ? 64 - offset : nbytes);
+
+	memcpy(pms->buf + offset, p, copy);
+	if (offset + copy < 64)
+	    return;
+	p += copy;
+	left -= copy;
+	md5_process(pms, pms->buf);
+    }
+
+    /* Process full blocks. */
+    for (; left >= 64; p += 64, left -= 64)
+	md5_process(pms, p);
+
+    /* Process a final partial block. */
+    if (left)
+	memcpy(pms->buf, p, left);
+}
+
+void
+md5_finish(md5_state_t *pms, md5_byte_t digest[16])
+{
+    static const md5_byte_t pad[64] = {
+	0x80, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
+	0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
+	0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
+	0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
+    };
+    md5_byte_t data[8];
+    int i;
+
+    /* Save the length before padding. */
+    for (i = 0; i < 8; ++i)
+	data[i] = (md5_byte_t)(pms->count[i >> 2] >> ((i & 3) << 3));
+    /* Pad to 56 bytes mod 64. */
+    md5_append(pms, pad, ((55 - (pms->count[0] >> 3)) & 63) + 1);
+    /* Append the length. */
+    md5_append(pms, data, 8);
+    for (i = 0; i < 16; ++i)
+	digest[i] = (md5_byte_t)(pms->abcd[i >> 2] >> ((i & 3) << 3));
+}
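Since this file's header notes it was derived from RFC 1321 including the section A.5 test suite, a quick cross-check of the implementation against those vectors can be done through Python's own `hashlib` wrapper (a verification sketch, not part of the patch):

```python
import hashlib

# RFC 1321, section A.5 test vectors (the suite this file derives from).
vectors = {
    b"": "d41d8cd98f00b204e9800998ecf8427e",
    b"a": "0cc175b9c0f1b6a831c399e269772661",
    b"abc": "900150983cd24fb0d6963f7d28e17f72",
    b"message digest": "f96b697d7cb7938d525a2f31aaf161d0",
}
for msg, want in vectors.items():
    assert hashlib.md5(msg).hexdigest() == want
print("all RFC 1321 vectors pass")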
diff --git a/src/Modules/md5.h b/src/Modules/md5.h
new file mode 100644
--- /dev/null
+++ b/src/Modules/md5.h
@@ -0,0 +1,91 @@
+/*
+  Copyright (C) 1999, 2002 Aladdin Enterprises.  All rights reserved.
+
+  This software is provided 'as-is', without any express or implied
+  warranty.  In no event will the authors be held liable for any damages
+  arising from the use of this software.
+
+  Permission is granted to anyone to use this software for any purpose,
+  including commercial applications, and to alter it and redistribute it
+  freely, subject to the following restrictions:
+
+  1. The origin of this software must not be misrepresented; you must not
+     claim that you wrote the original software. If you use this software
+     in a product, an acknowledgment in the product documentation would be
+     appreciated but is not required.
+  2. Altered source versions must be plainly marked as such, and must not be
+     misrepresented as being the original software.
+  3. This notice may not be removed or altered from any source distribution.
+
+  L. Peter Deutsch
+  ghost at aladdin.com
+
+ */
+/* $Id: md5.h 43594 2006-04-03 16:27:50Z matthias.klose $ */
+/*
+  Independent implementation of MD5 (RFC 1321).
+
+  This code implements the MD5 Algorithm defined in RFC 1321, whose
+  text is available at
+	http://www.ietf.org/rfc/rfc1321.txt
+  The code is derived from the text of the RFC, including the test suite
+  (section A.5) but excluding the rest of Appendix A.  It does not include
+  any code or documentation that is identified in the RFC as being
+  copyrighted.
+
+  The original and principal author of md5.h is L. Peter Deutsch
+  <ghost at aladdin.com>.  Other authors are noted in the change history
+  that follows (in reverse chronological order):
+
+  2002-04-13 lpd Removed support for non-ANSI compilers; removed
+	references to Ghostscript; clarified derivation from RFC 1321;
+	now handles byte order either statically or dynamically.
+  1999-11-04 lpd Edited comments slightly for automatic TOC extraction.
+  1999-10-18 lpd Fixed typo in header comment (ansi2knr rather than md5);
+	added conditionalization for C++ compilation from Martin
+	Purschke <purschke at bnl.gov>.
+  1999-05-03 lpd Original version.
+ */
+
+#ifndef md5_INCLUDED
+#  define md5_INCLUDED
+
+/*
+ * This package supports both compile-time and run-time determination of CPU
+ * byte order.  If ARCH_IS_BIG_ENDIAN is defined as 0, the code will be
+ * compiled to run only on little-endian CPUs; if ARCH_IS_BIG_ENDIAN is
+ * defined as non-zero, the code will be compiled to run only on big-endian
+ * CPUs; if ARCH_IS_BIG_ENDIAN is not defined, the code will be compiled to
+ * run on either big- or little-endian CPUs, but will run slightly less
+ * efficiently on either one than if ARCH_IS_BIG_ENDIAN is defined.
+ */
+
+typedef unsigned char md5_byte_t; /* 8-bit byte */
+typedef unsigned int md5_word_t; /* 32-bit word */
+
+/* Define the state of the MD5 Algorithm. */
+typedef struct md5_state_s {
+    md5_word_t count[2];	/* message length in bits, lsw first */
+    md5_word_t abcd[4];		/* digest buffer */
+    md5_byte_t buf[64];		/* accumulate block */
+} md5_state_t;
+
+#ifdef __cplusplus
+extern "C" 
+{
+#endif
+
+/* Initialize the algorithm. */
+void md5_init(md5_state_t *pms);
+
+/* Append a string to the message. */
+void md5_append(md5_state_t *pms, const md5_byte_t *data, int nbytes);
+
+/* Finish the message and return the digest. */
+void md5_finish(md5_state_t *pms, md5_byte_t digest[16]);
+
+#ifdef __cplusplus
+}  /* end extern "C" */
+#endif
+
+#endif /* md5_INCLUDED */
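The header above exposes a three-call streaming API (`md5_init` / `md5_append` / `md5_finish`). The same incremental contract is what the Python-level `update()` method promises: repeated appends are equivalent to hashing the concatenation in one shot. A sketch via `hashlib`:

```python
import hashlib

# Incremental hashing: two update() calls (md5_append in C) produce
# the same state as a single call with the concatenated input.
h1 = hashlib.md5()
h1.update(b"hello ")
h1.update(b"world")

h2 = hashlib.md5(b"hello world")
assert h1.hexdigest() == h2.hexdigest()
```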
diff --git a/src/Modules/md5module.c b/src/Modules/md5module.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/md5module.c
@@ -0,0 +1,312 @@
+
+/* MD5 module */
+
+/* This module provides an interface to the RSA Data Security,
+   Inc. MD5 Message-Digest Algorithm, described in RFC 1321.
+   It requires the files md5.c and md5.h (which are slightly changed
+   from the versions in the RFC to avoid the "global.h" file.) */
+
+
+/* MD5 objects */
+
+#include "Python.h"
+#include "structmember.h"
+#include "md5.h"
+
+typedef struct {
+	PyObject_HEAD
+        md5_state_t	md5;		/* the context holder */
+} md5object;
+
+static PyTypeObject MD5type;
+
+#define is_md5object(v)		((v)->ob_type == &MD5type)
+
+static md5object *
+newmd5object(void)
+{
+	md5object *md5p;
+
+	md5p = PyObject_New(md5object, &MD5type);
+	if (md5p == NULL)
+		return NULL;
+
+	md5_init(&md5p->md5);	/* actual initialisation */
+	return md5p;
+}
+
+
+/* MD5 methods */
+
+static void
+md5_dealloc(md5object *md5p)
+{
+	PyObject_Del(md5p);
+}
+
+
+/* MD5 methods-as-attributes */
+
+static PyObject *
+md5_update(md5object *self, PyObject *args)
+{
+	unsigned char *cp;
+	int len;
+
+	if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
+		return NULL;
+
+	md5_append(&self->md5, cp, len);
+
+	Py_INCREF(Py_None);
+	return Py_None;
+}
+
+PyDoc_STRVAR(update_doc,
+"update (arg)\n\
+\n\
+Update the md5 object with the string arg. Repeated calls are\n\
+equivalent to a single call with the concatenation of all the\n\
+arguments.");
+
+
+static PyObject *
+md5_digest(md5object *self)
+{
+ 	md5_state_t mdContext;
+	unsigned char aDigest[16];
+
+	/* make a temporary copy, and perform the final */
+	mdContext = self->md5;
+	md5_finish(&mdContext, aDigest);
+
+	return PyString_FromStringAndSize((char *)aDigest, 16);
+}
+
+PyDoc_STRVAR(digest_doc,
+"digest() -> string\n\
+\n\
+Return the digest of the strings passed to the update() method so\n\
+far. This is a 16-byte string which may contain non-ASCII characters,\n\
+including null bytes.");
+
+
+static PyObject *
+md5_hexdigest(md5object *self)
+{
+ 	md5_state_t mdContext;
+	unsigned char digest[16];
+	unsigned char hexdigest[32];
+	int i, j;
+
+	/* make a temporary copy, and perform the final */
+	mdContext = self->md5;
+	md5_finish(&mdContext, digest);
+
+	/* Make hex version of the digest */
+	for(i=j=0; i<16; i++) {
+		char c;
+		c = (digest[i] >> 4) & 0xf;
+		c = (c>9) ? c+'a'-10 : c + '0';
+		hexdigest[j++] = c;
+		c = (digest[i] & 0xf);
+		c = (c>9) ? c+'a'-10 : c + '0';
+		hexdigest[j++] = c;
+	}
+	return PyString_FromStringAndSize((char*)hexdigest, 32);
+}
+
+
+PyDoc_STRVAR(hexdigest_doc,
+"hexdigest() -> string\n\
+\n\
+Like digest(), but returns the digest as a string of hexadecimal digits.");
+
+
+static PyObject *
+md5_copy(md5object *self)
+{
+	md5object *md5p;
+
+	if ((md5p = newmd5object()) == NULL)
+		return NULL;
+
+	md5p->md5 = self->md5;
+
+	return (PyObject *)md5p;
+}
+
+PyDoc_STRVAR(copy_doc,
+"copy() -> md5 object\n\
+\n\
+Return a copy (``clone'') of the md5 object.");
+
+
+static PyMethodDef md5_methods[] = {
+	{"update",    (PyCFunction)md5_update,    METH_VARARGS, update_doc},
+	{"digest",    (PyCFunction)md5_digest,    METH_NOARGS,  digest_doc},
+	{"hexdigest", (PyCFunction)md5_hexdigest, METH_NOARGS,  hexdigest_doc},
+	{"copy",      (PyCFunction)md5_copy,      METH_NOARGS,  copy_doc},
+	{NULL, NULL}			     /* sentinel */
+};
+
+static PyObject *
+md5_get_block_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(64);
+}
+
+static PyObject *
+md5_get_digest_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(16);
+}
+
+static PyObject *
+md5_get_name(PyObject *self, void *closure)
+{
+    return PyString_FromStringAndSize("MD5", 3);
+}
+
+static PyGetSetDef md5_getseters[] = {
+    {"digest_size",
+     (getter)md5_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {"block_size",
+     (getter)md5_get_block_size, NULL,
+     NULL,
+     NULL},
+    {"name",
+     (getter)md5_get_name, NULL,
+     NULL,
+     NULL},
+    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
+     * the old sha module also supported 'digestsize'.  ugh. */
+    {"digestsize",
+     (getter)md5_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {NULL}  /* Sentinel */
+};
+
+
+PyDoc_STRVAR(module_doc,
+"This module implements the interface to RSA's MD5 message digest\n\
+algorithm (see also Internet RFC 1321). Its use is quite\n\
+straightforward: use new() to create an md5 object. You can now\n\
+feed this object with arbitrary strings using the update() method, and\n\
+at any point you can ask it for the digest (a strong kind of 128-bit\n\
+checksum, a.k.a. ``fingerprint'') of the concatenation of the strings\n\
+fed to it so far using the digest() method.\n\
+\n\
+Functions:\n\
+\n\
+new([arg]) -- return a new md5 object, initialized with arg if provided\n\
+md5([arg]) -- DEPRECATED, same as new, but for compatibility\n\
+\n\
+Special Objects:\n\
+\n\
+MD5Type -- type object for md5 objects");
+
+PyDoc_STRVAR(md5type_doc,
+"An md5 represents the object used to calculate the MD5 checksum of a\n\
+string of information.\n\
+\n\
+Methods:\n\
+\n\
+update() -- updates the current digest with an additional string\n\
+digest() -- return the current digest value\n\
+hexdigest() -- return the current digest as a string of hexadecimal digits\n\
+copy() -- return a copy of the current md5 object");
+
+static PyTypeObject MD5type = {
+	PyObject_HEAD_INIT(NULL)
+	0,			  /*ob_size*/
+	"_md5.md5",		  /*tp_name*/
+	sizeof(md5object),	  /*tp_size*/
+	0,			  /*tp_itemsize*/
+	/* methods */
+	(destructor)md5_dealloc,  /*tp_dealloc*/
+	0,			  /*tp_print*/
+	0,                        /*tp_getattr*/
+	0,			  /*tp_setattr*/
+	0,			  /*tp_compare*/
+	0,			  /*tp_repr*/
+        0,			  /*tp_as_number*/
+	0,                        /*tp_as_sequence*/
+	0,			  /*tp_as_mapping*/
+	0, 			  /*tp_hash*/
+	0,			  /*tp_call*/
+	0,			  /*tp_str*/
+	0,			  /*tp_getattro*/
+	0,			  /*tp_setattro*/
+	0,	                  /*tp_as_buffer*/
+	Py_TPFLAGS_DEFAULT,	  /*tp_flags*/
+	md5type_doc,		  /*tp_doc*/
+        0,                        /*tp_traverse*/
+        0,			  /*tp_clear*/
+        0,			  /*tp_richcompare*/
+        0,			  /*tp_weaklistoffset*/
+        0,			  /*tp_iter*/
+        0,			  /*tp_iternext*/
+        md5_methods,	          /*tp_methods*/
+        0,      	          /*tp_members*/
+        md5_getseters,            /*tp_getset*/
+};
+
+
+/* MD5 functions */
+
+static PyObject *
+MD5_new(PyObject *self, PyObject *args)
+{
+	md5object *md5p;
+	unsigned char *cp = NULL;
+	int len = 0;
+
+	if (!PyArg_ParseTuple(args, "|s#:new", &cp, &len))
+		return NULL;
+
+	if ((md5p = newmd5object()) == NULL)
+		return NULL;
+
+	if (cp)
+		md5_append(&md5p->md5, cp, len);
+
+	return (PyObject *)md5p;
+}
+
+PyDoc_STRVAR(new_doc,
+"new([arg]) -> md5 object\n\
+\n\
+Return a new md5 object. If arg is present, the method call update(arg)\n\
+is made.");
+
+
+/* List of functions exported by this module */
+
+static PyMethodDef md5_functions[] = {
+	{"new",		(PyCFunction)MD5_new, METH_VARARGS, new_doc},
+	{NULL,		NULL}	/* Sentinel */
+};
+
+
+/* Initialize this module. */
+
+PyMODINIT_FUNC
+init_md5(void)
+{
+	PyObject *m, *d;
+
+        MD5type.ob_type = &PyType_Type;
+        if (PyType_Ready(&MD5type) < 0)
+            return;
+	m = Py_InitModule3("_md5", md5_functions, module_doc);
+	if (m == NULL)
+	    return;
+	d = PyModule_GetDict(m);
+	PyDict_SetItemString(d, "MD5Type", (PyObject *)&MD5type);
+	PyModule_AddIntConstant(m, "digest_size", 16);
+	/* No need to check the error here, the caller will do that */
+}
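The `copy()` method above works by struct assignment of the embedded `md5_state_t` (`md5p->md5 = self->md5`), so a partially hashed prefix can be snapshotted and forked cheaply. A usage sketch through `hashlib` (illustrative names, not from the patch):

```python
import hashlib

# copy() snapshots the internal digest state, so a common prefix
# is hashed once and the state forked for different suffixes.
prefix = hashlib.md5(b"common prefix|")
a = prefix.copy()
b = prefix.copy()
a.update(b"first")
b.update(b"second")

# Forked states diverge, and match hashing the full message directly.
assert a.hexdigest() == hashlib.md5(b"common prefix|first").hexdigest()
assert a.hexdigest() != b.hexdigest()
```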
diff --git a/src/Modules/sha256module.c b/src/Modules/sha256module.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/sha256module.c
@@ -0,0 +1,701 @@
+/* SHA256 module */
+
+/* This module provides an interface to NIST's SHA-256 and SHA-224 Algorithms */
+
+/* See below for information about the original code this module was
+   based upon. Additional work performed by:
+
+   Andrew Kuchling (amk at amk.ca)
+   Greg Stein (gstein at lyra.org)
+   Trevor Perrin (trevp at trevp.net)
+
+   Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
+   Licensed to PSF under a Contributor Agreement.
+
+*/
+
+/* SHA objects */
+
+#include "Python.h"
+#include "structmember.h"
+
+
+/* Endianness testing and definitions */
+#define TestEndianness(variable) {int i=1; variable=PCT_BIG_ENDIAN;\
+	if (*((char*)&i)==1) variable=PCT_LITTLE_ENDIAN;}
+
+#define PCT_LITTLE_ENDIAN 1
+#define PCT_BIG_ENDIAN 0
+
+/* Some useful types */
+
+typedef unsigned char SHA_BYTE;
+
+#if SIZEOF_INT == 4
+typedef unsigned int SHA_INT32;	/* 32-bit integer */
+#else
+/* not defined. compilation will die. */
+#endif
+
+/* The SHA block size and message digest sizes, in bytes */
+
+#define SHA_BLOCKSIZE    64
+#define SHA_DIGESTSIZE  32
+
+/* The structure for storing SHA info */
+
+typedef struct {
+    PyObject_HEAD
+    SHA_INT32 digest[8];		/* Message digest */
+    SHA_INT32 count_lo, count_hi;	/* 64-bit bit count */
+    SHA_BYTE data[SHA_BLOCKSIZE];	/* SHA data buffer */
+    int Endianness;
+    int local;				/* unprocessed amount in data */
+    int digestsize;
+} SHAobject;
+
+/* When run on a little-endian CPU we need to perform byte reversal on an
+   array of longwords. */
+
+static void longReverse(SHA_INT32 *buffer, int byteCount, int Endianness)
+{
+    SHA_INT32 value;
+
+    if ( Endianness == PCT_BIG_ENDIAN )
+	return;
+
+    byteCount /= sizeof(*buffer);
+    while (byteCount--) {
+        value = *buffer;
+        value = ( ( value & 0xFF00FF00L ) >> 8  ) | \
+                ( ( value & 0x00FF00FFL ) << 8 );
+        *buffer++ = ( value << 16 ) | ( value >> 16 );
+    }
+}
+
+static void SHAcopy(SHAobject *src, SHAobject *dest)
+{
+    dest->Endianness = src->Endianness;
+    dest->local = src->local;
+    dest->digestsize = src->digestsize;
+    dest->count_lo = src->count_lo;
+    dest->count_hi = src->count_hi;
+    memcpy(dest->digest, src->digest, sizeof(src->digest));
+    memcpy(dest->data, src->data, sizeof(src->data));
+}
+
+
+/* ------------------------------------------------------------------------
+ *
+ * This code for the SHA-256 algorithm was noted as public domain. The
+ * original headers are pasted below.
+ *
+ * Several changes have been made to make it more compatible with the
+ * Python environment and desired interface.
+ *
+ */
+
+/* LibTomCrypt, modular cryptographic library -- Tom St Denis
+ *
+ * LibTomCrypt is a library that provides various cryptographic
+ * algorithms in a highly modular and flexible manner.
+ *
+ * The library is free for all purposes without any express
+ * guarantee it works.
+ *
+ * Tom St Denis, tomstdenis at iahu.ca, http://libtomcrypt.org
+ */
+
+
+/* SHA256 by Tom St Denis */
+
+/* Various logical functions */
+#define ROR(x, y)\
+( ((((unsigned long)(x)&0xFFFFFFFFUL)>>(unsigned long)((y)&31)) | \
+((unsigned long)(x)<<(unsigned long)(32-((y)&31)))) & 0xFFFFFFFFUL)
+#define Ch(x,y,z)       (z ^ (x & (y ^ z)))
+#define Maj(x,y,z)      (((x | y) & z) | (x & y)) 
+#define S(x, n)         ROR((x),(n))
+#define R(x, n)         (((x)&0xFFFFFFFFUL)>>(n))
+#define Sigma0(x)       (S(x, 2) ^ S(x, 13) ^ S(x, 22))
+#define Sigma1(x)       (S(x, 6) ^ S(x, 11) ^ S(x, 25))
+#define Gamma0(x)       (S(x, 7) ^ S(x, 18) ^ R(x, 3))
+#define Gamma1(x)       (S(x, 17) ^ S(x, 19) ^ R(x, 10))
+
+
+static void
+sha_transform(SHAobject *sha_info)
+{
+    int i;
+	SHA_INT32 S[8], W[64], t0, t1;
+
+    memcpy(W, sha_info->data, sizeof(sha_info->data));
+    longReverse(W, (int)sizeof(sha_info->data), sha_info->Endianness);
+
+    for (i = 16; i < 64; ++i) {
+		W[i] = Gamma1(W[i - 2]) + W[i - 7] + Gamma0(W[i - 15]) + W[i - 16];
+    }
+    for (i = 0; i < 8; ++i) {
+        S[i] = sha_info->digest[i];
+    }
+
+    /* Compress */
+#define RND(a,b,c,d,e,f,g,h,i,ki)                    \
+     t0 = h + Sigma1(e) + Ch(e, f, g) + ki + W[i];   \
+     t1 = Sigma0(a) + Maj(a, b, c);                  \
+     d += t0;                                        \
+     h  = t0 + t1;
+
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],0,0x428a2f98);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],1,0x71374491);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],2,0xb5c0fbcf);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],3,0xe9b5dba5);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],4,0x3956c25b);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],5,0x59f111f1);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],6,0x923f82a4);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],7,0xab1c5ed5);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],8,0xd807aa98);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],9,0x12835b01);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],10,0x243185be);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],11,0x550c7dc3);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],12,0x72be5d74);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],13,0x80deb1fe);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],14,0x9bdc06a7);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],15,0xc19bf174);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],16,0xe49b69c1);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],17,0xefbe4786);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],18,0x0fc19dc6);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],19,0x240ca1cc);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],20,0x2de92c6f);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],21,0x4a7484aa);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],22,0x5cb0a9dc);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],23,0x76f988da);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],24,0x983e5152);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],25,0xa831c66d);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],26,0xb00327c8);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],27,0xbf597fc7);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],28,0xc6e00bf3);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],29,0xd5a79147);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],30,0x06ca6351);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],31,0x14292967);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],32,0x27b70a85);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],33,0x2e1b2138);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],34,0x4d2c6dfc);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],35,0x53380d13);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],36,0x650a7354);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],37,0x766a0abb);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],38,0x81c2c92e);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],39,0x92722c85);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],40,0xa2bfe8a1);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],41,0xa81a664b);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],42,0xc24b8b70);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],43,0xc76c51a3);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],44,0xd192e819);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],45,0xd6990624);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],46,0xf40e3585);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],47,0x106aa070);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],48,0x19a4c116);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],49,0x1e376c08);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],50,0x2748774c);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],51,0x34b0bcb5);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],52,0x391c0cb3);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],53,0x4ed8aa4a);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],54,0x5b9cca4f);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],55,0x682e6ff3);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],56,0x748f82ee);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],57,0x78a5636f);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],58,0x84c87814);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],59,0x8cc70208);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],60,0x90befffa);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],61,0xa4506ceb);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],62,0xbef9a3f7);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],63,0xc67178f2);
+
+#undef RND
+
+    /* feedback */
+    for (i = 0; i < 8; i++) {
+        sha_info->digest[i] = sha_info->digest[i] + S[i];
+    }
+
+}
+
+
+
+/* initialize the SHA digest */
+
+static void
+sha_init(SHAobject *sha_info)
+{
+    TestEndianness(sha_info->Endianness)
+    sha_info->digest[0] = 0x6A09E667L;
+    sha_info->digest[1] = 0xBB67AE85L;
+    sha_info->digest[2] = 0x3C6EF372L;
+    sha_info->digest[3] = 0xA54FF53AL;
+    sha_info->digest[4] = 0x510E527FL;
+    sha_info->digest[5] = 0x9B05688CL;
+    sha_info->digest[6] = 0x1F83D9ABL;
+    sha_info->digest[7] = 0x5BE0CD19L;
+    sha_info->count_lo = 0L;
+    sha_info->count_hi = 0L;
+    sha_info->local = 0;
+    sha_info->digestsize = 32;
+}
+
+static void
+sha224_init(SHAobject *sha_info)
+{
+    TestEndianness(sha_info->Endianness)
+    sha_info->digest[0] = 0xc1059ed8L;
+    sha_info->digest[1] = 0x367cd507L;
+    sha_info->digest[2] = 0x3070dd17L;
+    sha_info->digest[3] = 0xf70e5939L;
+    sha_info->digest[4] = 0xffc00b31L;
+    sha_info->digest[5] = 0x68581511L;
+    sha_info->digest[6] = 0x64f98fa7L;
+    sha_info->digest[7] = 0xbefa4fa4L;
+    sha_info->count_lo = 0L;
+    sha_info->count_hi = 0L;
+    sha_info->local = 0;
+    sha_info->digestsize = 28;
+}
+
+
+/* update the SHA digest */
+
+static void
+sha_update(SHAobject *sha_info, SHA_BYTE *buffer, int count)
+{
+    int i;
+    SHA_INT32 clo;
+
+    clo = sha_info->count_lo + ((SHA_INT32) count << 3);
+    if (clo < sha_info->count_lo) {
+        ++sha_info->count_hi;
+    }
+    sha_info->count_lo = clo;
+    sha_info->count_hi += (SHA_INT32) count >> 29;
+    if (sha_info->local) {
+        i = SHA_BLOCKSIZE - sha_info->local;
+        if (i > count) {
+            i = count;
+        }
+        memcpy(((SHA_BYTE *) sha_info->data) + sha_info->local, buffer, i);
+        count -= i;
+        buffer += i;
+        sha_info->local += i;
+        if (sha_info->local == SHA_BLOCKSIZE) {
+            sha_transform(sha_info);
+        }
+        else {
+            return;
+        }
+    }
+    while (count >= SHA_BLOCKSIZE) {
+        memcpy(sha_info->data, buffer, SHA_BLOCKSIZE);
+        buffer += SHA_BLOCKSIZE;
+        count -= SHA_BLOCKSIZE;
+        sha_transform(sha_info);
+    }
+    memcpy(sha_info->data, buffer, count);
+    sha_info->local = count;
+}
+
+/* finish computing the SHA digest */
+
+static void
+sha_final(unsigned char digest[SHA_DIGESTSIZE], SHAobject *sha_info)
+{
+    int count;
+    SHA_INT32 lo_bit_count, hi_bit_count;
+
+    lo_bit_count = sha_info->count_lo;
+    hi_bit_count = sha_info->count_hi;
+    count = (int) ((lo_bit_count >> 3) & 0x3f);
+    ((SHA_BYTE *) sha_info->data)[count++] = 0x80;
+    if (count > SHA_BLOCKSIZE - 8) {
+        memset(((SHA_BYTE *) sha_info->data) + count, 0,
+               SHA_BLOCKSIZE - count);
+        sha_transform(sha_info);
+        memset((SHA_BYTE *) sha_info->data, 0, SHA_BLOCKSIZE - 8);
+    }
+    else {
+        memset(((SHA_BYTE *) sha_info->data) + count, 0,
+               SHA_BLOCKSIZE - 8 - count);
+    }
+
+    /* GJS: note that we add the hi/lo in big-endian. sha_transform will
+       swap these values into host-order. */
+    sha_info->data[56] = (hi_bit_count >> 24) & 0xff;
+    sha_info->data[57] = (hi_bit_count >> 16) & 0xff;
+    sha_info->data[58] = (hi_bit_count >>  8) & 0xff;
+    sha_info->data[59] = (hi_bit_count >>  0) & 0xff;
+    sha_info->data[60] = (lo_bit_count >> 24) & 0xff;
+    sha_info->data[61] = (lo_bit_count >> 16) & 0xff;
+    sha_info->data[62] = (lo_bit_count >>  8) & 0xff;
+    sha_info->data[63] = (lo_bit_count >>  0) & 0xff;
+    sha_transform(sha_info);
+    digest[ 0] = (unsigned char) ((sha_info->digest[0] >> 24) & 0xff);
+    digest[ 1] = (unsigned char) ((sha_info->digest[0] >> 16) & 0xff);
+    digest[ 2] = (unsigned char) ((sha_info->digest[0] >>  8) & 0xff);
+    digest[ 3] = (unsigned char) ((sha_info->digest[0]      ) & 0xff);
+    digest[ 4] = (unsigned char) ((sha_info->digest[1] >> 24) & 0xff);
+    digest[ 5] = (unsigned char) ((sha_info->digest[1] >> 16) & 0xff);
+    digest[ 6] = (unsigned char) ((sha_info->digest[1] >>  8) & 0xff);
+    digest[ 7] = (unsigned char) ((sha_info->digest[1]      ) & 0xff);
+    digest[ 8] = (unsigned char) ((sha_info->digest[2] >> 24) & 0xff);
+    digest[ 9] = (unsigned char) ((sha_info->digest[2] >> 16) & 0xff);
+    digest[10] = (unsigned char) ((sha_info->digest[2] >>  8) & 0xff);
+    digest[11] = (unsigned char) ((sha_info->digest[2]      ) & 0xff);
+    digest[12] = (unsigned char) ((sha_info->digest[3] >> 24) & 0xff);
+    digest[13] = (unsigned char) ((sha_info->digest[3] >> 16) & 0xff);
+    digest[14] = (unsigned char) ((sha_info->digest[3] >>  8) & 0xff);
+    digest[15] = (unsigned char) ((sha_info->digest[3]      ) & 0xff);
+    digest[16] = (unsigned char) ((sha_info->digest[4] >> 24) & 0xff);
+    digest[17] = (unsigned char) ((sha_info->digest[4] >> 16) & 0xff);
+    digest[18] = (unsigned char) ((sha_info->digest[4] >>  8) & 0xff);
+    digest[19] = (unsigned char) ((sha_info->digest[4]      ) & 0xff);
+    digest[20] = (unsigned char) ((sha_info->digest[5] >> 24) & 0xff);
+    digest[21] = (unsigned char) ((sha_info->digest[5] >> 16) & 0xff);
+    digest[22] = (unsigned char) ((sha_info->digest[5] >>  8) & 0xff);
+    digest[23] = (unsigned char) ((sha_info->digest[5]      ) & 0xff);
+    digest[24] = (unsigned char) ((sha_info->digest[6] >> 24) & 0xff);
+    digest[25] = (unsigned char) ((sha_info->digest[6] >> 16) & 0xff);
+    digest[26] = (unsigned char) ((sha_info->digest[6] >>  8) & 0xff);
+    digest[27] = (unsigned char) ((sha_info->digest[6]      ) & 0xff);
+    digest[28] = (unsigned char) ((sha_info->digest[7] >> 24) & 0xff);
+    digest[29] = (unsigned char) ((sha_info->digest[7] >> 16) & 0xff);
+    digest[30] = (unsigned char) ((sha_info->digest[7] >>  8) & 0xff);
+    digest[31] = (unsigned char) ((sha_info->digest[7]      ) & 0xff);
+}
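For reference, the message padding that `sha_final()` performs above (a 0x80 terminator byte, zero fill to 8 bytes short of a block boundary, then the 64-bit big-endian bit count) can be sketched in pure Python. `sha_pad` is an illustrative helper for SHA-256's 64-byte blocks, not part of this module:

```python
def sha_pad(message: bytes) -> bytes:
    # Mirror sha_final(): append 0x80, zero-fill so the total length is
    # congruent to 56 mod 64, then append the bit count big-endian.
    bit_len = len(message) * 8
    padded = message + b"\x80"
    padded += b"\x00" * ((56 - len(padded) % 64) % 64)
    padded += bit_len.to_bytes(8, "big")
    return padded
```

When the terminator would not leave room for the 8-byte count (the `count > SHA_BLOCKSIZE - 8` branch in the C code), the zero fill spills into a second block, which is why a 56-byte message pads out to two full blocks.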
+
+/*
+ * End of copied SHA code.
+ *
+ * ------------------------------------------------------------------------
+ */
+
+static PyTypeObject SHA224type;
+static PyTypeObject SHA256type;
+
+
+static SHAobject *
+newSHA224object(void)
+{
+    return (SHAobject *)PyObject_New(SHAobject, &SHA224type);
+}
+
+static SHAobject *
+newSHA256object(void)
+{
+    return (SHAobject *)PyObject_New(SHAobject, &SHA256type);
+}
+
+/* Internal methods for a hash object */
+
+static void
+SHA_dealloc(PyObject *ptr)
+{
+    PyObject_Del(ptr);
+}
+
+
+/* External methods for a hash object */
+
+PyDoc_STRVAR(SHA256_copy__doc__, "Return a copy of the hash object.");
+
+static PyObject *
+SHA256_copy(SHAobject *self, PyObject *unused)
+{
+    SHAobject *newobj;
+
+    if (((PyObject*)self)->ob_type == &SHA256type) {
+        if ( (newobj = newSHA256object())==NULL)
+            return NULL;
+    } else {
+        if ( (newobj = newSHA224object())==NULL)
+            return NULL;
+    }
+
+    SHAcopy(self, newobj);
+    return (PyObject *)newobj;
+}
+
+PyDoc_STRVAR(SHA256_digest__doc__,
+"Return the digest value as a string of binary data.");
+
+static PyObject *
+SHA256_digest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+
+    SHAcopy(self, &temp);
+    sha_final(digest, &temp);
+    return PyString_FromStringAndSize((const char *)digest, self->digestsize);
+}
+
+PyDoc_STRVAR(SHA256_hexdigest__doc__,
+"Return the digest value as a string of hexadecimal digits.");
+
+static PyObject *
+SHA256_hexdigest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+    PyObject *retval;
+    char *hex_digest;
+    int i, j;
+
+    /* Get the raw (binary) digest value */
+    SHAcopy(self, &temp);
+    sha_final(digest, &temp);
+
+    /* Create a new string */
+    retval = PyString_FromStringAndSize(NULL, self->digestsize * 2);
+    if (!retval)
+        return NULL;
+    hex_digest = PyString_AsString(retval);
+    if (!hex_digest) {
+        Py_DECREF(retval);
+        return NULL;
+    }
+
+    /* Make hex version of the digest */
+    for (i = j = 0; i < self->digestsize; i++) {
+        char c;
+        c = (digest[i] >> 4) & 0xf;
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+        c = (digest[i] & 0xf);
+        c = (c > 9) ? c + 'a' - 10 : c + '0';
+        hex_digest[j++] = c;
+    }
+    return retval;
+}
+
+PyDoc_STRVAR(SHA256_update__doc__,
+"Update this hash object's state with the provided string.");
+
+static PyObject *
+SHA256_update(SHAobject *self, PyObject *args)
+{
+    unsigned char *cp;
+    int len;
+
+    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
+        return NULL;
+
+    sha_update(self, cp, len);
+
+    Py_INCREF(Py_None);
+    return Py_None;
+}
+
+static PyMethodDef SHA_methods[] = {
+    {"copy",	  (PyCFunction)SHA256_copy,      METH_NOARGS,  SHA256_copy__doc__},
+    {"digest",	  (PyCFunction)SHA256_digest,    METH_NOARGS,  SHA256_digest__doc__},
+    {"hexdigest", (PyCFunction)SHA256_hexdigest, METH_NOARGS,  SHA256_hexdigest__doc__},
+    {"update",	  (PyCFunction)SHA256_update,    METH_VARARGS, SHA256_update__doc__},
+    {NULL,	  NULL}		/* sentinel */
+};
+
+static PyObject *
+SHA256_get_block_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(SHA_BLOCKSIZE);
+}
+
+static PyObject *
+SHA256_get_name(PyObject *self, void *closure)
+{
+    if (((SHAobject *)self)->digestsize == 32)
+        return PyString_FromStringAndSize("SHA256", 6);
+    else
+        return PyString_FromStringAndSize("SHA224", 6);
+}
+
+static PyGetSetDef SHA_getseters[] = {
+    {"block_size",
+     (getter)SHA256_get_block_size, NULL,
+     NULL,
+     NULL},
+    {"name",
+     (getter)SHA256_get_name, NULL,
+     NULL,
+     NULL},
+    {NULL}  /* Sentinel */
+};
+
+static PyMemberDef SHA_members[] = {
+    {"digest_size", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
+    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
+     * the old sha module also supported 'digestsize'.  ugh. */
+    {"digestsize", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
+    {NULL}  /* Sentinel */
+};
+
+static PyTypeObject SHA224type = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_sha256.sha224",	/*tp_name*/
+    sizeof(SHAobject),	/*tp_size*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    SHA_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,          	/*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    0,                  /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT, /*tp_flags*/
+    0,                  /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    SHA_methods,	/* tp_methods */
+    SHA_members,	/* tp_members */
+    SHA_getseters,      /* tp_getset */
+};
+
+static PyTypeObject SHA256type = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_sha256.sha256",	/*tp_name*/
+    sizeof(SHAobject),	/*tp_size*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    SHA_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,          	/*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    0,                  /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT, /*tp_flags*/
+    0,                  /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    SHA_methods,	/* tp_methods */
+    SHA_members,	/* tp_members */
+    SHA_getseters,      /* tp_getset */
+};
+
+
+/* The single module-level function: new() */
+
+PyDoc_STRVAR(SHA256_new__doc__,
+"Return a new SHA-256 hash object; optionally initialized with a string.");
+
+static PyObject *
+SHA256_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"string", NULL};
+    SHAobject *new;
+    unsigned char *cp = NULL;
+    int len;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
+                                     &cp, &len)) {
+        return NULL;
+    }
+
+    if ((new = newSHA256object()) == NULL)
+        return NULL;
+
+    sha_init(new);
+
+    if (PyErr_Occurred()) {
+        Py_DECREF(new);
+        return NULL;
+    }
+    if (cp)
+        sha_update(new, cp, len);
+
+    return (PyObject *)new;
+}
+
+PyDoc_STRVAR(SHA224_new__doc__,
+"Return a new SHA-224 hash object; optionally initialized with a string.");
+
+static PyObject *
+SHA224_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"string", NULL};
+    SHAobject *new;
+    unsigned char *cp = NULL;
+    int len;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
+                                     &cp, &len)) {
+        return NULL;
+    }
+
+    if ((new = newSHA224object()) == NULL)
+        return NULL;
+
+    sha224_init(new);
+
+    if (PyErr_Occurred()) {
+        Py_DECREF(new);
+        return NULL;
+    }
+    if (cp)
+        sha_update(new, cp, len);
+
+    return (PyObject *)new;
+}
+
+
+/* List of functions exported by this module */
+
+static struct PyMethodDef SHA_functions[] = {
+    {"sha256", (PyCFunction)SHA256_new, METH_VARARGS|METH_KEYWORDS, SHA256_new__doc__},
+    {"sha224", (PyCFunction)SHA224_new, METH_VARARGS|METH_KEYWORDS, SHA224_new__doc__},
+    {NULL,	NULL}		 /* Sentinel */
+};
+
+
+/* Initialize this module. */
+
+#define insint(n,v) { PyModule_AddIntConstant(m,n,v); }
+
+PyMODINIT_FUNC
+init_sha256(void)
+{
+    PyObject *m;
+
+    SHA224type.ob_type = &PyType_Type;
+    if (PyType_Ready(&SHA224type) < 0)
+        return;
+    SHA256type.ob_type = &PyType_Type;
+    if (PyType_Ready(&SHA256type) < 0)
+        return;
+    m = Py_InitModule("_sha256", SHA_functions);
+    if (m == NULL)
+        return;
+}
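The C modules added by this changeset back the standard hashing interface: `hashlib` falls back to `_sha256`/`_sha512` when OpenSSL's implementations are unavailable, and these sources appear to be vendored here so that works on older Pythons. A minimal usage sketch through the public `hashlib` front end:

```python
import hashlib

# SHA-256 and SHA-224 share the 64-byte block size implemented above;
# they differ only in initial constants and truncated digest length.
h = hashlib.sha256(b"abc")
print(h.hexdigest())  # FIPS 180-2 test vector for "abc"
print(h.digest_size, hashlib.sha224().digest_size, h.block_size)
```

The `digest_size`/`block_size` attributes exposed by the getters and members defined above follow PEP 247's hashing API conventions.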
diff --git a/src/Modules/sha512module.c b/src/Modules/sha512module.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/sha512module.c
@@ -0,0 +1,769 @@
+/* SHA512 module */
+
+/* This module provides an interface to NIST's SHA-512 and SHA-384 Algorithms */
+
+/* See below for information about the original code this module was
+   based upon. Additional work performed by:
+
+   Andrew Kuchling (amk at amk.ca)
+   Greg Stein (gstein at lyra.org)
+   Trevor Perrin (trevp at trevp.net)
+
+   Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
+   Licensed to PSF under a Contributor Agreement.
+
+*/
+
+/* SHA objects */
+
+#include "Python.h"
+#include "structmember.h"
+
+#ifdef PY_LONG_LONG /* If no PY_LONG_LONG, don't compile anything! */
+
+/* Endianness testing and definitions */
+#define TestEndianness(variable) {int i=1; variable=PCT_BIG_ENDIAN;\
+	if (*((char*)&i)==1) variable=PCT_LITTLE_ENDIAN;}
+
+#define PCT_LITTLE_ENDIAN 1
+#define PCT_BIG_ENDIAN 0
+
+/* Some useful types */
+
+typedef unsigned char SHA_BYTE;
+
+#if SIZEOF_INT == 4
+typedef unsigned int SHA_INT32;	/* 32-bit integer */
+typedef unsigned PY_LONG_LONG SHA_INT64;	/* 64-bit integer */
+#else
+/* not defined. compilation will die. */
+#endif
+
+/* The SHA block size and message digest sizes, in bytes */
+
+#define SHA_BLOCKSIZE   128
+#define SHA_DIGESTSIZE  64
+
+/* The structure for storing SHA info */
+
+typedef struct {
+    PyObject_HEAD
+    SHA_INT64 digest[8];		/* Message digest */
+    SHA_INT32 count_lo, count_hi;	/* 64-bit bit count */
+    SHA_BYTE data[SHA_BLOCKSIZE];	/* SHA data buffer */
+    int Endianness;
+    int local;				/* unprocessed amount in data */
+    int digestsize;
+} SHAobject;
+
+/* When run on a little-endian CPU we need to perform byte reversal on an
+   array of longwords. */
+
+static void longReverse(SHA_INT64 *buffer, int byteCount, int Endianness)
+{
+    SHA_INT64 value;
+
+    if (Endianness == PCT_BIG_ENDIAN)
+        return;
+
+    byteCount /= sizeof(*buffer);
+    while (byteCount--) {
+        value = *buffer;
+
+        ((unsigned char*)buffer)[0] = (unsigned char)(value >> 56) & 0xff;
+        ((unsigned char*)buffer)[1] = (unsigned char)(value >> 48) & 0xff;
+        ((unsigned char*)buffer)[2] = (unsigned char)(value >> 40) & 0xff;
+        ((unsigned char*)buffer)[3] = (unsigned char)(value >> 32) & 0xff;
+        ((unsigned char*)buffer)[4] = (unsigned char)(value >> 24) & 0xff;
+        ((unsigned char*)buffer)[5] = (unsigned char)(value >> 16) & 0xff;
+        ((unsigned char*)buffer)[6] = (unsigned char)(value >>  8) & 0xff;
+        ((unsigned char*)buffer)[7] = (unsigned char)(value      ) & 0xff;
+
+        buffer++;
+    }
+}
+
+static void SHAcopy(SHAobject *src, SHAobject *dest)
+{
+    dest->Endianness = src->Endianness;
+    dest->local = src->local;
+    dest->digestsize = src->digestsize;
+    dest->count_lo = src->count_lo;
+    dest->count_hi = src->count_hi;
+    memcpy(dest->digest, src->digest, sizeof(src->digest));
+    memcpy(dest->data, src->data, sizeof(src->data));
+}
+
+
+/* ------------------------------------------------------------------------
+ *
+ * This code for the SHA-512 algorithm was noted as public domain. The
+ * original headers are pasted below.
+ *
+ * Several changes have been made to make it more compatible with the
+ * Python environment and desired interface.
+ *
+ */
+
+/* LibTomCrypt, modular cryptographic library -- Tom St Denis
+ *
+ * LibTomCrypt is a library that provides various cryptographic
+ * algorithms in a highly modular and flexible manner.
+ *
+ * The library is free for all purposes without any express
+ * guarantee it works.
+ *
+ * Tom St Denis, tomstdenis at iahu.ca, http://libtomcrypt.org
+ */
+
+
+/* SHA512 by Tom St Denis */
+
+/* Various logical functions */
+#define ROR64(x, y) \
+    ( ((((x) & 0xFFFFFFFFFFFFFFFFULL)>>((unsigned PY_LONG_LONG)(y) & 63)) | \
+      ((x)<<((unsigned PY_LONG_LONG)(64-((y) & 63))))) & 0xFFFFFFFFFFFFFFFFULL)
+#define Ch(x,y,z)       (z ^ (x & (y ^ z)))
+#define Maj(x,y,z)      (((x | y) & z) | (x & y)) 
+#define S(x, n)         ROR64((x),(n))
+#define R(x, n)         (((x) & 0xFFFFFFFFFFFFFFFFULL) >> ((unsigned PY_LONG_LONG)(n)))
+#define Sigma0(x)       (S(x, 28) ^ S(x, 34) ^ S(x, 39))
+#define Sigma1(x)       (S(x, 14) ^ S(x, 18) ^ S(x, 41))
+#define Gamma0(x)       (S(x, 1) ^ S(x, 8) ^ R(x, 7))
+#define Gamma1(x)       (S(x, 19) ^ S(x, 61) ^ R(x, 6))
+
+
+static void
+sha512_transform(SHAobject *sha_info)
+{
+    int i;
+    SHA_INT64 S[8], W[80], t0, t1;
+
+    memcpy(W, sha_info->data, sizeof(sha_info->data));
+    longReverse(W, (int)sizeof(sha_info->data), sha_info->Endianness);
+
+    for (i = 16; i < 80; ++i) {
+        W[i] = Gamma1(W[i - 2]) + W[i - 7] + Gamma0(W[i - 15]) + W[i - 16];
+    }
+    for (i = 0; i < 8; ++i) {
+        S[i] = sha_info->digest[i];
+    }
+
+    /* Compress */
+#define RND(a,b,c,d,e,f,g,h,i,ki)                    \
+     t0 = h + Sigma1(e) + Ch(e, f, g) + ki + W[i];   \
+     t1 = Sigma0(a) + Maj(a, b, c);                  \
+     d += t0;                                        \
+     h  = t0 + t1;
+
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],0,0x428a2f98d728ae22ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],1,0x7137449123ef65cdULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],2,0xb5c0fbcfec4d3b2fULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],3,0xe9b5dba58189dbbcULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],4,0x3956c25bf348b538ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],5,0x59f111f1b605d019ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],6,0x923f82a4af194f9bULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],7,0xab1c5ed5da6d8118ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],8,0xd807aa98a3030242ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],9,0x12835b0145706fbeULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],10,0x243185be4ee4b28cULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],11,0x550c7dc3d5ffb4e2ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],12,0x72be5d74f27b896fULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],13,0x80deb1fe3b1696b1ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],14,0x9bdc06a725c71235ULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],15,0xc19bf174cf692694ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],16,0xe49b69c19ef14ad2ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],17,0xefbe4786384f25e3ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],18,0x0fc19dc68b8cd5b5ULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],19,0x240ca1cc77ac9c65ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],20,0x2de92c6f592b0275ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],21,0x4a7484aa6ea6e483ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],22,0x5cb0a9dcbd41fbd4ULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],23,0x76f988da831153b5ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],24,0x983e5152ee66dfabULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],25,0xa831c66d2db43210ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],26,0xb00327c898fb213fULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],27,0xbf597fc7beef0ee4ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],28,0xc6e00bf33da88fc2ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],29,0xd5a79147930aa725ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],30,0x06ca6351e003826fULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],31,0x142929670a0e6e70ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],32,0x27b70a8546d22ffcULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],33,0x2e1b21385c26c926ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],34,0x4d2c6dfc5ac42aedULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],35,0x53380d139d95b3dfULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],36,0x650a73548baf63deULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],37,0x766a0abb3c77b2a8ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],38,0x81c2c92e47edaee6ULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],39,0x92722c851482353bULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],40,0xa2bfe8a14cf10364ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],41,0xa81a664bbc423001ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],42,0xc24b8b70d0f89791ULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],43,0xc76c51a30654be30ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],44,0xd192e819d6ef5218ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],45,0xd69906245565a910ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],46,0xf40e35855771202aULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],47,0x106aa07032bbd1b8ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],48,0x19a4c116b8d2d0c8ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],49,0x1e376c085141ab53ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],50,0x2748774cdf8eeb99ULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],51,0x34b0bcb5e19b48a8ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],52,0x391c0cb3c5c95a63ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],53,0x4ed8aa4ae3418acbULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],54,0x5b9cca4f7763e373ULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],55,0x682e6ff3d6b2b8a3ULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],56,0x748f82ee5defb2fcULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],57,0x78a5636f43172f60ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],58,0x84c87814a1f0ab72ULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],59,0x8cc702081a6439ecULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],60,0x90befffa23631e28ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],61,0xa4506cebde82bde9ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],62,0xbef9a3f7b2c67915ULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],63,0xc67178f2e372532bULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],64,0xca273eceea26619cULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],65,0xd186b8c721c0c207ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],66,0xeada7dd6cde0eb1eULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],67,0xf57d4f7fee6ed178ULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],68,0x06f067aa72176fbaULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],69,0x0a637dc5a2c898a6ULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],70,0x113f9804bef90daeULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],71,0x1b710b35131c471bULL);
+    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],72,0x28db77f523047d84ULL);
+    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],73,0x32caab7b40c72493ULL);
+    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],74,0x3c9ebe0a15c9bebcULL);
+    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],75,0x431d67c49c100d4cULL);
+    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],76,0x4cc5d4becb3e42b6ULL);
+    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],77,0x597f299cfc657e2aULL);
+    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],78,0x5fcb6fab3ad6faecULL);
+    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],79,0x6c44198c4a475817ULL);
+
+#undef RND
+
+    /* feedback */
+    for (i = 0; i < 8; i++) {
+        sha_info->digest[i] = sha_info->digest[i] + S[i];
+    }
+
+}
+
+
+
+/* initialize the SHA digest */
+
+static void
+sha512_init(SHAobject *sha_info)
+{
+    TestEndianness(sha_info->Endianness)
+    sha_info->digest[0] = 0x6a09e667f3bcc908ULL;
+    sha_info->digest[1] = 0xbb67ae8584caa73bULL;
+    sha_info->digest[2] = 0x3c6ef372fe94f82bULL;
+    sha_info->digest[3] = 0xa54ff53a5f1d36f1ULL;
+    sha_info->digest[4] = 0x510e527fade682d1ULL;
+    sha_info->digest[5] = 0x9b05688c2b3e6c1fULL;
+    sha_info->digest[6] = 0x1f83d9abfb41bd6bULL;
+    sha_info->digest[7] = 0x5be0cd19137e2179ULL;
+    sha_info->count_lo = 0L;
+    sha_info->count_hi = 0L;
+    sha_info->local = 0;
+    sha_info->digestsize = 64;
+}
+
+static void
+sha384_init(SHAobject *sha_info)
+{
+    TestEndianness(sha_info->Endianness)
+    sha_info->digest[0] = 0xcbbb9d5dc1059ed8ULL;
+    sha_info->digest[1] = 0x629a292a367cd507ULL;
+    sha_info->digest[2] = 0x9159015a3070dd17ULL;
+    sha_info->digest[3] = 0x152fecd8f70e5939ULL;
+    sha_info->digest[4] = 0x67332667ffc00b31ULL;
+    sha_info->digest[5] = 0x8eb44a8768581511ULL;
+    sha_info->digest[6] = 0xdb0c2e0d64f98fa7ULL;
+    sha_info->digest[7] = 0x47b5481dbefa4fa4ULL;
+    sha_info->count_lo = 0L;
+    sha_info->count_hi = 0L;
+    sha_info->local = 0;
+    sha_info->digestsize = 48;
+}
+
+
+/* update the SHA digest */
+
+static void
+sha512_update(SHAobject *sha_info, SHA_BYTE *buffer, int count)
+{
+    int i;
+    SHA_INT32 clo;
+
+    clo = sha_info->count_lo + ((SHA_INT32) count << 3);
+    if (clo < sha_info->count_lo) {
+        ++sha_info->count_hi;
+    }
+    sha_info->count_lo = clo;
+    sha_info->count_hi += (SHA_INT32) count >> 29;
+    if (sha_info->local) {
+        i = SHA_BLOCKSIZE - sha_info->local;
+        if (i > count) {
+            i = count;
+        }
+        memcpy(((SHA_BYTE *) sha_info->data) + sha_info->local, buffer, i);
+        count -= i;
+        buffer += i;
+        sha_info->local += i;
+        if (sha_info->local == SHA_BLOCKSIZE) {
+            sha512_transform(sha_info);
+        }
+        else {
+            return;
+        }
+    }
+    while (count >= SHA_BLOCKSIZE) {
+        memcpy(sha_info->data, buffer, SHA_BLOCKSIZE);
+        buffer += SHA_BLOCKSIZE;
+        count -= SHA_BLOCKSIZE;
+        sha512_transform(sha_info);
+    }
+    memcpy(sha_info->data, buffer, count);
+    sha_info->local = count;
+}
+
+/* finish computing the SHA digest */
+
+static void
+sha512_final(unsigned char digest[SHA_DIGESTSIZE], SHAobject *sha_info)
+{
+    int count;
+    SHA_INT32 lo_bit_count, hi_bit_count;
+
+    lo_bit_count = sha_info->count_lo;
+    hi_bit_count = sha_info->count_hi;
+    count = (int) ((lo_bit_count >> 3) & 0x7f);
+    ((SHA_BYTE *) sha_info->data)[count++] = 0x80;
+    if (count > SHA_BLOCKSIZE - 16) {
+	memset(((SHA_BYTE *) sha_info->data) + count, 0,
+	       SHA_BLOCKSIZE - count);
+	sha512_transform(sha_info);
+	memset((SHA_BYTE *) sha_info->data, 0, SHA_BLOCKSIZE - 16);
+    }
+    else {
+	memset(((SHA_BYTE *) sha_info->data) + count, 0,
+	       SHA_BLOCKSIZE - 16 - count);
+    }
+
+    /* GJS: note that we add the hi/lo in big-endian. sha512_transform will
+       swap these values into host-order. */
+    sha_info->data[112] = 0;
+    sha_info->data[113] = 0;
+    sha_info->data[114] = 0;
+    sha_info->data[115] = 0;
+    sha_info->data[116] = 0;
+    sha_info->data[117] = 0;
+    sha_info->data[118] = 0;
+    sha_info->data[119] = 0;
+    sha_info->data[120] = (hi_bit_count >> 24) & 0xff;
+    sha_info->data[121] = (hi_bit_count >> 16) & 0xff;
+    sha_info->data[122] = (hi_bit_count >>  8) & 0xff;
+    sha_info->data[123] = (hi_bit_count >>  0) & 0xff;
+    sha_info->data[124] = (lo_bit_count >> 24) & 0xff;
+    sha_info->data[125] = (lo_bit_count >> 16) & 0xff;
+    sha_info->data[126] = (lo_bit_count >>  8) & 0xff;
+    sha_info->data[127] = (lo_bit_count >>  0) & 0xff;
+    sha512_transform(sha_info);
+    digest[ 0] = (unsigned char) ((sha_info->digest[0] >> 56) & 0xff);
+    digest[ 1] = (unsigned char) ((sha_info->digest[0] >> 48) & 0xff);
+    digest[ 2] = (unsigned char) ((sha_info->digest[0] >> 40) & 0xff);
+    digest[ 3] = (unsigned char) ((sha_info->digest[0] >> 32) & 0xff);
+    digest[ 4] = (unsigned char) ((sha_info->digest[0] >> 24) & 0xff);
+    digest[ 5] = (unsigned char) ((sha_info->digest[0] >> 16) & 0xff);
+    digest[ 6] = (unsigned char) ((sha_info->digest[0] >>  8) & 0xff);
+    digest[ 7] = (unsigned char) ((sha_info->digest[0]      ) & 0xff);
+    digest[ 8] = (unsigned char) ((sha_info->digest[1] >> 56) & 0xff);
+    digest[ 9] = (unsigned char) ((sha_info->digest[1] >> 48) & 0xff);
+    digest[10] = (unsigned char) ((sha_info->digest[1] >> 40) & 0xff);
+    digest[11] = (unsigned char) ((sha_info->digest[1] >> 32) & 0xff);
+    digest[12] = (unsigned char) ((sha_info->digest[1] >> 24) & 0xff);
+    digest[13] = (unsigned char) ((sha_info->digest[1] >> 16) & 0xff);
+    digest[14] = (unsigned char) ((sha_info->digest[1] >>  8) & 0xff);
+    digest[15] = (unsigned char) ((sha_info->digest[1]      ) & 0xff);
+    digest[16] = (unsigned char) ((sha_info->digest[2] >> 56) & 0xff);
+    digest[17] = (unsigned char) ((sha_info->digest[2] >> 48) & 0xff);
+    digest[18] = (unsigned char) ((sha_info->digest[2] >> 40) & 0xff);
+    digest[19] = (unsigned char) ((sha_info->digest[2] >> 32) & 0xff);
+    digest[20] = (unsigned char) ((sha_info->digest[2] >> 24) & 0xff);
+    digest[21] = (unsigned char) ((sha_info->digest[2] >> 16) & 0xff);
+    digest[22] = (unsigned char) ((sha_info->digest[2] >>  8) & 0xff);
+    digest[23] = (unsigned char) ((sha_info->digest[2]      ) & 0xff);
+    digest[24] = (unsigned char) ((sha_info->digest[3] >> 56) & 0xff);
+    digest[25] = (unsigned char) ((sha_info->digest[3] >> 48) & 0xff);
+    digest[26] = (unsigned char) ((sha_info->digest[3] >> 40) & 0xff);
+    digest[27] = (unsigned char) ((sha_info->digest[3] >> 32) & 0xff);
+    digest[28] = (unsigned char) ((sha_info->digest[3] >> 24) & 0xff);
+    digest[29] = (unsigned char) ((sha_info->digest[3] >> 16) & 0xff);
+    digest[30] = (unsigned char) ((sha_info->digest[3] >>  8) & 0xff);
+    digest[31] = (unsigned char) ((sha_info->digest[3]      ) & 0xff);
+    digest[32] = (unsigned char) ((sha_info->digest[4] >> 56) & 0xff);
+    digest[33] = (unsigned char) ((sha_info->digest[4] >> 48) & 0xff);
+    digest[34] = (unsigned char) ((sha_info->digest[4] >> 40) & 0xff);
+    digest[35] = (unsigned char) ((sha_info->digest[4] >> 32) & 0xff);
+    digest[36] = (unsigned char) ((sha_info->digest[4] >> 24) & 0xff);
+    digest[37] = (unsigned char) ((sha_info->digest[4] >> 16) & 0xff);
+    digest[38] = (unsigned char) ((sha_info->digest[4] >>  8) & 0xff);
+    digest[39] = (unsigned char) ((sha_info->digest[4]      ) & 0xff);
+    digest[40] = (unsigned char) ((sha_info->digest[5] >> 56) & 0xff);
+    digest[41] = (unsigned char) ((sha_info->digest[5] >> 48) & 0xff);
+    digest[42] = (unsigned char) ((sha_info->digest[5] >> 40) & 0xff);
+    digest[43] = (unsigned char) ((sha_info->digest[5] >> 32) & 0xff);
+    digest[44] = (unsigned char) ((sha_info->digest[5] >> 24) & 0xff);
+    digest[45] = (unsigned char) ((sha_info->digest[5] >> 16) & 0xff);
+    digest[46] = (unsigned char) ((sha_info->digest[5] >>  8) & 0xff);
+    digest[47] = (unsigned char) ((sha_info->digest[5]      ) & 0xff);
+    digest[48] = (unsigned char) ((sha_info->digest[6] >> 56) & 0xff);
+    digest[49] = (unsigned char) ((sha_info->digest[6] >> 48) & 0xff);
+    digest[50] = (unsigned char) ((sha_info->digest[6] >> 40) & 0xff);
+    digest[51] = (unsigned char) ((sha_info->digest[6] >> 32) & 0xff);
+    digest[52] = (unsigned char) ((sha_info->digest[6] >> 24) & 0xff);
+    digest[53] = (unsigned char) ((sha_info->digest[6] >> 16) & 0xff);
+    digest[54] = (unsigned char) ((sha_info->digest[6] >>  8) & 0xff);
+    digest[55] = (unsigned char) ((sha_info->digest[6]      ) & 0xff);
+    digest[56] = (unsigned char) ((sha_info->digest[7] >> 56) & 0xff);
+    digest[57] = (unsigned char) ((sha_info->digest[7] >> 48) & 0xff);
+    digest[58] = (unsigned char) ((sha_info->digest[7] >> 40) & 0xff);
+    digest[59] = (unsigned char) ((sha_info->digest[7] >> 32) & 0xff);
+    digest[60] = (unsigned char) ((sha_info->digest[7] >> 24) & 0xff);
+    digest[61] = (unsigned char) ((sha_info->digest[7] >> 16) & 0xff);
+    digest[62] = (unsigned char) ((sha_info->digest[7] >>  8) & 0xff);
+    digest[63] = (unsigned char) ((sha_info->digest[7]      ) & 0xff);
+}
+
+/*
+ * End of copied SHA code.
+ *
+ * ------------------------------------------------------------------------
+ */
+
+static PyTypeObject SHA384type;
+static PyTypeObject SHA512type;
+
+
+static SHAobject *
+newSHA384object(void)
+{
+    return (SHAobject *)PyObject_New(SHAobject, &SHA384type);
+}
+
+static SHAobject *
+newSHA512object(void)
+{
+    return (SHAobject *)PyObject_New(SHAobject, &SHA512type);
+}
+
+/* Internal methods for a hash object */
+
+static void
+SHA512_dealloc(PyObject *ptr)
+{
+    PyObject_Del(ptr);
+}
+
+
+/* External methods for a hash object */
+
+PyDoc_STRVAR(SHA512_copy__doc__, "Return a copy of the hash object.");
+
+static PyObject *
+SHA512_copy(SHAobject *self, PyObject *unused)
+{
+    SHAobject *newobj;
+
+    if (((PyObject*)self)->ob_type == &SHA512type) {
+        if ( (newobj = newSHA512object())==NULL)
+            return NULL;
+    } else {
+        if ( (newobj = newSHA384object())==NULL)
+            return NULL;
+    }
+
+    SHAcopy(self, newobj);
+    return (PyObject *)newobj;
+}
+
+PyDoc_STRVAR(SHA512_digest__doc__,
+"Return the digest value as a string of binary data.");
+
+static PyObject *
+SHA512_digest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+
+    SHAcopy(self, &temp);
+    sha512_final(digest, &temp);
+    return PyString_FromStringAndSize((const char *)digest, self->digestsize);
+}
+
+PyDoc_STRVAR(SHA512_hexdigest__doc__,
+"Return the digest value as a string of hexadecimal digits.");
+
+static PyObject *
+SHA512_hexdigest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+    PyObject *retval;
+    char *hex_digest;
+    int i, j;
+
+    /* Get the raw (binary) digest value */
+    SHAcopy(self, &temp);
+    sha512_final(digest, &temp);
+
+    /* Create a new string */
+    retval = PyString_FromStringAndSize(NULL, self->digestsize * 2);
+    if (!retval)
+	    return NULL;
+    hex_digest = PyString_AsString(retval);
+    if (!hex_digest) {
+	    Py_DECREF(retval);
+	    return NULL;
+    }
+
+    /* Make hex version of the digest */
+    for (i=j=0; i<self->digestsize; i++) {
+        char c;
+        c = (digest[i] >> 4) & 0xf;
+	c = (c>9) ? c+'a'-10 : c + '0';
+        hex_digest[j++] = c;
+        c = (digest[i] & 0xf);
+	c = (c>9) ? c+'a'-10 : c + '0';
+        hex_digest[j++] = c;
+    }
+    return retval;
+}
+
+PyDoc_STRVAR(SHA512_update__doc__,
+"Update this hash object's state with the provided string.");
+
+static PyObject *
+SHA512_update(SHAobject *self, PyObject *args)
+{
+    unsigned char *cp;
+    int len;
+
+    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
+        return NULL;
+
+    sha512_update(self, cp, len);
+
+    Py_INCREF(Py_None);
+    return Py_None;
+}
+
+static PyMethodDef SHA_methods[] = {
+    {"copy",	  (PyCFunction)SHA512_copy,      METH_NOARGS, SHA512_copy__doc__},
+    {"digest",	  (PyCFunction)SHA512_digest,    METH_NOARGS, SHA512_digest__doc__},
+    {"hexdigest", (PyCFunction)SHA512_hexdigest, METH_NOARGS, SHA512_hexdigest__doc__},
+    {"update",	  (PyCFunction)SHA512_update,    METH_VARARGS, SHA512_update__doc__},
+    {NULL,	  NULL}		/* sentinel */
+};
+
+static PyObject *
+SHA512_get_block_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(SHA_BLOCKSIZE);
+}
+
+static PyObject *
+SHA512_get_name(PyObject *self, void *closure)
+{
+    if (((SHAobject *)self)->digestsize == 64)
+        return PyString_FromStringAndSize("SHA512", 6);
+    else
+        return PyString_FromStringAndSize("SHA384", 6);
+}
+
+static PyGetSetDef SHA_getseters[] = {
+    {"block_size",
+     (getter)SHA512_get_block_size, NULL,
+     NULL,
+     NULL},
+    {"name",
+     (getter)SHA512_get_name, NULL,
+     NULL,
+     NULL},
+    {NULL}  /* Sentinel */
+};
+
+static PyMemberDef SHA_members[] = {
+    {"digest_size", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
+    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
+     * the old sha module also supported 'digestsize'.  ugh. */
+    {"digestsize", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
+    {NULL}  /* Sentinel */
+};
+
+static PyTypeObject SHA384type = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_sha512.sha384",	/*tp_name*/
+    sizeof(SHAobject),	/*tp_size*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    SHA512_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,          	/*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    0,                  /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT, /*tp_flags*/
+    0,                  /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    SHA_methods,	/* tp_methods */
+    SHA_members,	/* tp_members */
+    SHA_getseters,      /* tp_getset */
+};
+
+static PyTypeObject SHA512type = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_sha512.sha512",	/*tp_name*/
+    sizeof(SHAobject),	/*tp_size*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    SHA512_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,          	/*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    0,                  /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT, /*tp_flags*/
+    0,                  /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    SHA_methods,	/* tp_methods */
+    SHA_members,	/* tp_members */
+    SHA_getseters,      /* tp_getset */
+};
+
+
+/* The single module-level function: new() */
+
+PyDoc_STRVAR(SHA512_new__doc__,
+"Return a new SHA-512 hash object; optionally initialized with a string.");
+
+static PyObject *
+SHA512_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"string", NULL};
+    SHAobject *new;
+    unsigned char *cp = NULL;
+    int len;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
+                                     &cp, &len)) {
+        return NULL;
+    }
+
+    if ((new = newSHA512object()) == NULL)
+        return NULL;
+
+    sha512_init(new);
+
+    if (PyErr_Occurred()) {
+        Py_DECREF(new);
+        return NULL;
+    }
+    if (cp)
+        sha512_update(new, cp, len);
+
+    return (PyObject *)new;
+}
+
+PyDoc_STRVAR(SHA384_new__doc__,
+"Return a new SHA-384 hash object; optionally initialized with a string.");
+
+static PyObject *
+SHA384_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"string", NULL};
+    SHAobject *new;
+    unsigned char *cp = NULL;
+    int len;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
+                                     &cp, &len)) {
+        return NULL;
+    }
+
+    if ((new = newSHA384object()) == NULL)
+        return NULL;
+
+    sha384_init(new);
+
+    if (PyErr_Occurred()) {
+        Py_DECREF(new);
+        return NULL;
+    }
+    if (cp)
+        sha512_update(new, cp, len);
+
+    return (PyObject *)new;
+}
+
+
+/* List of functions exported by this module */
+
+static struct PyMethodDef SHA_functions[] = {
+    {"sha512", (PyCFunction)SHA512_new, METH_VARARGS|METH_KEYWORDS, SHA512_new__doc__},
+    {"sha384", (PyCFunction)SHA384_new, METH_VARARGS|METH_KEYWORDS, SHA384_new__doc__},
+    {NULL,	NULL}		 /* Sentinel */
+};
+
+
+/* Initialize this module. */
+
+#define insint(n,v) { PyModule_AddIntConstant(m,n,v); }
+
+PyMODINIT_FUNC
+init_sha512(void)
+{
+    PyObject *m;
+
+    SHA384type.ob_type = &PyType_Type;
+    if (PyType_Ready(&SHA384type) < 0)
+        return;
+    SHA512type.ob_type = &PyType_Type;
+    if (PyType_Ready(&SHA512type) < 0)
+        return;
+    m = Py_InitModule("_sha512", SHA_functions);
+    if (m == NULL)
+	return;
+}
+
+#endif
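For reference, once this extension is built it is normally reached through the hashlib front end rather than imported directly. A hedged usage sketch in Python 3 (assuming a standard hashlib is available; the sizes correspond to the `digestsize` and `SHA_BLOCKSIZE` values in the C code above):

```python
# Illustrative usage only (not part of the patch): the _sha512 extension
# is normally reached through hashlib rather than imported directly.
import hashlib

h512 = hashlib.sha512(b'hello')
h384 = hashlib.sha384(b'hello')

# digestsize is 64/48 and SHA_BLOCKSIZE is 128 in the C code above.
assert h512.digest_size == 64 and h512.block_size == 128
assert h384.digest_size == 48
assert len(h512.hexdigest()) == 128 and len(h384.hexdigest()) == 96
```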
diff --git a/src/Modules/shamodule.c b/src/Modules/shamodule.c
new file mode 100644
--- /dev/null
+++ b/src/Modules/shamodule.c
@@ -0,0 +1,593 @@
+/* SHA module */
+
+/* This module provides an interface to NIST's Secure Hash Algorithm */
+
+/* See below for information about the original code this module was
+   based upon. Additional work performed by:
+
+   Andrew Kuchling (amk at amk.ca)
+   Greg Stein (gstein at lyra.org)
+
+   Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
+   Licensed to PSF under a Contributor Agreement.
+
+*/
+
+/* SHA objects */
+
+#include "Python.h"
+#include "structmember.h"
+
+
+/* Endianness testing and definitions */
+#define TestEndianness(variable) {int i=1; variable=PCT_BIG_ENDIAN;\
+	if (*((char*)&i)==1) variable=PCT_LITTLE_ENDIAN;}
+
+#define PCT_LITTLE_ENDIAN 1
+#define PCT_BIG_ENDIAN 0
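The `TestEndianness` macro probes byte order at run time by storing 1 in an int and inspecting its first byte. The same probe can be sketched in Python 3 (illustrative only, not part of the patch):

```python
# Illustrative only (not part of the patch): run-time endianness probe,
# mirroring the TestEndianness macro above.
import struct
import sys

# Pack the int 1 in native byte order; if the first byte is 1, the low
# byte sits at the lowest address, i.e. the machine is little-endian.
is_little = struct.pack('=i', 1)[0:1] == b'\x01'
assert is_little == (sys.byteorder == 'little')
```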
+
+/* Some useful types */
+
+typedef unsigned char SHA_BYTE;
+
+#if SIZEOF_INT == 4
+typedef unsigned int SHA_INT32;	/* 32-bit integer */
+#else
+/* not defined. compilation will die. */
+#endif
+
+/* The SHA block size and message digest sizes, in bytes */
+
+#define SHA_BLOCKSIZE    64
+#define SHA_DIGESTSIZE  20
+
+/* The structure for storing SHS info */
+
+typedef struct {
+    PyObject_HEAD
+    SHA_INT32 digest[5];		/* Message digest */
+    SHA_INT32 count_lo, count_hi;	/* 64-bit bit count */
+    SHA_BYTE data[SHA_BLOCKSIZE];	/* SHA data buffer */
+    int Endianness;
+    int local;				/* unprocessed amount in data */
+} SHAobject;
+
+/* When run on a little-endian CPU we need to perform byte reversal on an
+   array of longwords. */
+
+static void longReverse(SHA_INT32 *buffer, int byteCount, int Endianness)
+{
+    SHA_INT32 value;
+
+    if ( Endianness == PCT_BIG_ENDIAN )
+	return;
+
+    byteCount /= sizeof(*buffer);
+    while (byteCount--) {
+        value = *buffer;
+        value = ( ( value & 0xFF00FF00L ) >> 8  ) | \
+                ( ( value & 0x00FF00FFL ) << 8 );
+        *buffer++ = ( value << 16 ) | ( value >> 16 );
+    }
+}
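`longReverse` swaps each 32-bit word by exchanging adjacent bytes and then the two 16-bit halves, which amounts to a full byte reversal. A Python 3 sketch of the same swap (illustrative only, not part of the patch):

```python
# Illustrative only (not part of the patch): the 32-bit byte swap that
# longReverse applies on little-endian hosts.
def swap32(value):
    # Swap adjacent bytes, then swap the 16-bit halves.
    value = ((value & 0xFF00FF00) >> 8) | ((value & 0x00FF00FF) << 8)
    return ((value << 16) | (value >> 16)) & 0xFFFFFFFF

assert swap32(0x12345678) == 0x78563412
assert swap32(swap32(0xDEADBEEF)) == 0xDEADBEEF   # self-inverse
```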
+
+static void SHAcopy(SHAobject *src, SHAobject *dest)
+{
+    dest->Endianness = src->Endianness;
+    dest->local = src->local;
+    dest->count_lo = src->count_lo;
+    dest->count_hi = src->count_hi;
+    memcpy(dest->digest, src->digest, sizeof(src->digest));
+    memcpy(dest->data, src->data, sizeof(src->data));
+}
+
+
+/* ------------------------------------------------------------------------
+ *
+ * This code for the SHA algorithm was noted as public domain. The original
+ * headers are pasted below.
+ *
+ * Several changes have been made to make it more compatible with the
+ * Python environment and desired interface.
+ *
+ */
+
+/* NIST Secure Hash Algorithm */
+/* heavily modified by Uwe Hollerbach <uh at alumni.caltech edu> */
+/* from Peter C. Gutmann's implementation as found in */
+/* Applied Cryptography by Bruce Schneier */
+/* Further modifications to include the "UNRAVEL" stuff, below */
+
+/* This code is in the public domain */
+
+/* UNRAVEL should be fastest & biggest */
+/* UNROLL_LOOPS should be just as big, but slightly slower */
+/* both undefined should be smallest and slowest */
+
+#define UNRAVEL
+/* #define UNROLL_LOOPS */
+
+/* The SHA f()-functions.  The f1 and f3 functions can be optimized to
+   save one boolean operation each - thanks to Rich Schroeppel,
+   rcs at cs.arizona.edu for discovering this */
+
+/*#define f1(x,y,z)	((x & y) | (~x & z))		// Rounds  0-19 */
+#define f1(x,y,z)	(z ^ (x & (y ^ z)))		/* Rounds  0-19 */
+#define f2(x,y,z)	(x ^ y ^ z)			/* Rounds 20-39 */
+/*#define f3(x,y,z)	((x & y) | (x & z) | (y & z))	// Rounds 40-59 */
+#define f3(x,y,z)	((x & y) | (z & (x | y)))	/* Rounds 40-59 */
+#define f4(x,y,z)	(x ^ y ^ z)			/* Rounds 60-79 */
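As the comment above notes, the optimized f1/f3 forms save one boolean operation each while remaining equivalent to the canonical choose/majority definitions. A quick Python 3 equivalence check (illustrative only, not part of the patch):

```python
# Illustrative only (not part of the patch): the optimized f1/f3 forms
# agree with the canonical SHA-1 choose/majority functions.
M = 0xFFFFFFFF

def f1_canonical(x, y, z): return ((x & y) | (~x & z)) & M
def f1_optimized(x, y, z): return (z ^ (x & (y ^ z))) & M
def f3_canonical(x, y, z): return ((x & y) | (x & z) | (y & z)) & M
def f3_optimized(x, y, z): return ((x & y) | (z & (x | y))) & M

for x in (0, 1, 0xDEADBEEF, M):
    for y in (0, 1, 0xCAFEBABE, M):
        for z in (0, 1, 0x12345678, M):
            assert f1_canonical(x, y, z) == f1_optimized(x, y, z)
            assert f3_canonical(x, y, z) == f3_optimized(x, y, z)
```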
+
+/* SHA constants */
+
+#define CONST1		0x5a827999L			/* Rounds  0-19 */
+#define CONST2		0x6ed9eba1L			/* Rounds 20-39 */
+#define CONST3		0x8f1bbcdcL			/* Rounds 40-59 */
+#define CONST4		0xca62c1d6L			/* Rounds 60-79 */
+
+/* 32-bit rotate */
+
+#define R32(x,n)	((x << n) | (x >> (32 - n)))
+
+/* the generic case, for when the overall rotation is not unraveled */
+
+#define FG(n)	\
+    T = R32(A,5) + f##n(B,C,D) + E + *WP++ + CONST##n;	\
+    E = D; D = C; C = R32(B,30); B = A; A = T
+
+/* specific cases, for when the overall rotation is unraveled */
+
+#define FA(n)	\
+    T = R32(A,5) + f##n(B,C,D) + E + *WP++ + CONST##n; B = R32(B,30)
+
+#define FB(n)	\
+    E = R32(T,5) + f##n(A,B,C) + D + *WP++ + CONST##n; A = R32(A,30)
+
+#define FC(n)	\
+    D = R32(E,5) + f##n(T,A,B) + C + *WP++ + CONST##n; T = R32(T,30)
+
+#define FD(n)	\
+    C = R32(D,5) + f##n(E,T,A) + B + *WP++ + CONST##n; E = R32(E,30)
+
+#define FE(n)	\
+    B = R32(C,5) + f##n(D,E,T) + A + *WP++ + CONST##n; D = R32(D,30)
+
+#define FT(n)	\
+    A = R32(B,5) + f##n(C,D,E) + T + *WP++ + CONST##n; C = R32(C,30)
+
+/* do SHA transformation */
+
+static void
+sha_transform(SHAobject *sha_info)
+{
+    int i;
+    SHA_INT32 T, A, B, C, D, E, W[80], *WP;
+
+    memcpy(W, sha_info->data, sizeof(sha_info->data));
+    longReverse(W, (int)sizeof(sha_info->data), sha_info->Endianness);
+
+    for (i = 16; i < 80; ++i) {
+	W[i] = W[i-3] ^ W[i-8] ^ W[i-14] ^ W[i-16];
+
+	/* extra rotation fix */
+	W[i] = R32(W[i], 1);
+    }
+    A = sha_info->digest[0];
+    B = sha_info->digest[1];
+    C = sha_info->digest[2];
+    D = sha_info->digest[3];
+    E = sha_info->digest[4];
+    WP = W;
+#ifdef UNRAVEL
+    FA(1); FB(1); FC(1); FD(1); FE(1); FT(1); FA(1); FB(1); FC(1); FD(1);
+    FE(1); FT(1); FA(1); FB(1); FC(1); FD(1); FE(1); FT(1); FA(1); FB(1);
+    FC(2); FD(2); FE(2); FT(2); FA(2); FB(2); FC(2); FD(2); FE(2); FT(2);
+    FA(2); FB(2); FC(2); FD(2); FE(2); FT(2); FA(2); FB(2); FC(2); FD(2);
+    FE(3); FT(3); FA(3); FB(3); FC(3); FD(3); FE(3); FT(3); FA(3); FB(3);
+    FC(3); FD(3); FE(3); FT(3); FA(3); FB(3); FC(3); FD(3); FE(3); FT(3);
+    FA(4); FB(4); FC(4); FD(4); FE(4); FT(4); FA(4); FB(4); FC(4); FD(4);
+    FE(4); FT(4); FA(4); FB(4); FC(4); FD(4); FE(4); FT(4); FA(4); FB(4);
+    sha_info->digest[0] += E;
+    sha_info->digest[1] += T;
+    sha_info->digest[2] += A;
+    sha_info->digest[3] += B;
+    sha_info->digest[4] += C;
+#else /* !UNRAVEL */
+#ifdef UNROLL_LOOPS
+    FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1);
+    FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1);
+    FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2);
+    FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2);
+    FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3);
+    FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3);
+    FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4);
+    FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4);
+#else /* !UNROLL_LOOPS */
+    for (i =  0; i < 20; ++i) { FG(1); }
+    for (i = 20; i < 40; ++i) { FG(2); }
+    for (i = 40; i < 60; ++i) { FG(3); }
+    for (i = 60; i < 80; ++i) { FG(4); }
+#endif /* !UNROLL_LOOPS */
+    sha_info->digest[0] += A;
+    sha_info->digest[1] += B;
+    sha_info->digest[2] += C;
+    sha_info->digest[3] += D;
+    sha_info->digest[4] += E;
+#endif /* !UNRAVEL */
+}
+
+/* initialize the SHA digest */
+
+static void
+sha_init(SHAobject *sha_info)
+{
+    TestEndianness(sha_info->Endianness)
+
+    sha_info->digest[0] = 0x67452301L;
+    sha_info->digest[1] = 0xefcdab89L;
+    sha_info->digest[2] = 0x98badcfeL;
+    sha_info->digest[3] = 0x10325476L;
+    sha_info->digest[4] = 0xc3d2e1f0L;
+    sha_info->count_lo = 0L;
+    sha_info->count_hi = 0L;
+    sha_info->local = 0;
+}
+
+/* update the SHA digest */
+
+static void
+sha_update(SHAobject *sha_info, SHA_BYTE *buffer, int count)
+{
+    int i;
+    SHA_INT32 clo;
+
+    clo = sha_info->count_lo + ((SHA_INT32) count << 3);
+    if (clo < sha_info->count_lo) {
+        ++sha_info->count_hi;
+    }
+    sha_info->count_lo = clo;
+    sha_info->count_hi += (SHA_INT32) count >> 29;
+    if (sha_info->local) {
+        i = SHA_BLOCKSIZE - sha_info->local;
+        if (i > count) {
+            i = count;
+        }
+        memcpy(((SHA_BYTE *) sha_info->data) + sha_info->local, buffer, i);
+        count -= i;
+        buffer += i;
+        sha_info->local += i;
+        if (sha_info->local == SHA_BLOCKSIZE) {
+            sha_transform(sha_info);
+        }
+        else {
+            return;
+        }
+    }
+    while (count >= SHA_BLOCKSIZE) {
+        memcpy(sha_info->data, buffer, SHA_BLOCKSIZE);
+        buffer += SHA_BLOCKSIZE;
+        count -= SHA_BLOCKSIZE;
+        sha_transform(sha_info);
+    }
+    memcpy(sha_info->data, buffer, count);
+    sha_info->local = count;
+}
+
+/* finish computing the SHA digest */
+
+static void
+sha_final(unsigned char digest[20], SHAobject *sha_info)
+{
+    int count;
+    SHA_INT32 lo_bit_count, hi_bit_count;
+
+    lo_bit_count = sha_info->count_lo;
+    hi_bit_count = sha_info->count_hi;
+    count = (int) ((lo_bit_count >> 3) & 0x3f);
+    ((SHA_BYTE *) sha_info->data)[count++] = 0x80;
+    if (count > SHA_BLOCKSIZE - 8) {
+	memset(((SHA_BYTE *) sha_info->data) + count, 0,
+	       SHA_BLOCKSIZE - count);
+	sha_transform(sha_info);
+	memset((SHA_BYTE *) sha_info->data, 0, SHA_BLOCKSIZE - 8);
+    }
+    else {
+	memset(((SHA_BYTE *) sha_info->data) + count, 0,
+	       SHA_BLOCKSIZE - 8 - count);
+    }
+
+    /* GJS: note that we add the hi/lo in big-endian. sha_transform will
+       swap these values into host-order. */
+    sha_info->data[56] = (hi_bit_count >> 24) & 0xff;
+    sha_info->data[57] = (hi_bit_count >> 16) & 0xff;
+    sha_info->data[58] = (hi_bit_count >>  8) & 0xff;
+    sha_info->data[59] = (hi_bit_count >>  0) & 0xff;
+    sha_info->data[60] = (lo_bit_count >> 24) & 0xff;
+    sha_info->data[61] = (lo_bit_count >> 16) & 0xff;
+    sha_info->data[62] = (lo_bit_count >>  8) & 0xff;
+    sha_info->data[63] = (lo_bit_count >>  0) & 0xff;
+    sha_transform(sha_info);
+    digest[ 0] = (unsigned char) ((sha_info->digest[0] >> 24) & 0xff);
+    digest[ 1] = (unsigned char) ((sha_info->digest[0] >> 16) & 0xff);
+    digest[ 2] = (unsigned char) ((sha_info->digest[0] >>  8) & 0xff);
+    digest[ 3] = (unsigned char) ((sha_info->digest[0]      ) & 0xff);
+    digest[ 4] = (unsigned char) ((sha_info->digest[1] >> 24) & 0xff);
+    digest[ 5] = (unsigned char) ((sha_info->digest[1] >> 16) & 0xff);
+    digest[ 6] = (unsigned char) ((sha_info->digest[1] >>  8) & 0xff);
+    digest[ 7] = (unsigned char) ((sha_info->digest[1]      ) & 0xff);
+    digest[ 8] = (unsigned char) ((sha_info->digest[2] >> 24) & 0xff);
+    digest[ 9] = (unsigned char) ((sha_info->digest[2] >> 16) & 0xff);
+    digest[10] = (unsigned char) ((sha_info->digest[2] >>  8) & 0xff);
+    digest[11] = (unsigned char) ((sha_info->digest[2]      ) & 0xff);
+    digest[12] = (unsigned char) ((sha_info->digest[3] >> 24) & 0xff);
+    digest[13] = (unsigned char) ((sha_info->digest[3] >> 16) & 0xff);
+    digest[14] = (unsigned char) ((sha_info->digest[3] >>  8) & 0xff);
+    digest[15] = (unsigned char) ((sha_info->digest[3]      ) & 0xff);
+    digest[16] = (unsigned char) ((sha_info->digest[4] >> 24) & 0xff);
+    digest[17] = (unsigned char) ((sha_info->digest[4] >> 16) & 0xff);
+    digest[18] = (unsigned char) ((sha_info->digest[4] >>  8) & 0xff);
+    digest[19] = (unsigned char) ((sha_info->digest[4]      ) & 0xff);
+}
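`sha_final` pads by appending a 0x80 byte, zero-filling to 56 bytes mod 64 (using an extra block when the count would not fit), and writing the 64-bit big-endian bit count before the last transform. A Python 3 sketch of that padding step (illustrative only, not part of the patch):

```python
# Illustrative only (not part of the patch): the padding scheme applied
# by sha_final -- append 0x80, zero-fill to 56 bytes mod 64, then the
# 64-bit big-endian bit count.
def sha1_pad(message):
    bit_count = len(message) * 8
    padded = message + b'\x80'
    padded += b'\x00' * ((56 - len(padded)) % 64)
    return padded + bit_count.to_bytes(8, 'big')

assert len(sha1_pad(b'abc')) == 64 and sha1_pad(b'abc')[-1] == 24
assert len(sha1_pad(b'a' * 56)) == 128   # count would not fit: extra block
```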
+
+/*
+ * End of copied SHA code.
+ *
+ * ------------------------------------------------------------------------
+ */
+
+static PyTypeObject SHAtype;
+
+
+static SHAobject *
+newSHAobject(void)
+{
+    return (SHAobject *)PyObject_New(SHAobject, &SHAtype);
+}
+
+/* Internal methods for a hashing object */
+
+static void
+SHA_dealloc(PyObject *ptr)
+{
+    PyObject_Del(ptr);
+}
+
+
+/* External methods for a hashing object */
+
+PyDoc_STRVAR(SHA_copy__doc__, "Return a copy of the hashing object.");
+
+static PyObject *
+SHA_copy(SHAobject *self, PyObject *unused)
+{
+    SHAobject *newobj;
+
+    if ( (newobj = newSHAobject())==NULL)
+        return NULL;
+
+    SHAcopy(self, newobj);
+    return (PyObject *)newobj;
+}
+
+PyDoc_STRVAR(SHA_digest__doc__,
+"Return the digest value as a string of binary data.");
+
+static PyObject *
+SHA_digest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+
+    SHAcopy(self, &temp);
+    sha_final(digest, &temp);
+    return PyString_FromStringAndSize((const char *)digest, sizeof(digest));
+}
+
+PyDoc_STRVAR(SHA_hexdigest__doc__,
+"Return the digest value as a string of hexadecimal digits.");
+
+static PyObject *
+SHA_hexdigest(SHAobject *self, PyObject *unused)
+{
+    unsigned char digest[SHA_DIGESTSIZE];
+    SHAobject temp;
+    PyObject *retval;
+    char *hex_digest;
+    int i, j;
+
+    /* Get the raw (binary) digest value */
+    SHAcopy(self, &temp);
+    sha_final(digest, &temp);
+
+    /* Create a new string */
+    retval = PyString_FromStringAndSize(NULL, sizeof(digest) * 2);
+    if (!retval)
+	    return NULL;
+    hex_digest = PyString_AsString(retval);
+    if (!hex_digest) {
+	    Py_DECREF(retval);
+	    return NULL;
+    }
+
+    /* Make hex version of the digest */
+    for(i=j=0; i<sizeof(digest); i++) {
+        char c;
+        c = (digest[i] >> 4) & 0xf;
+	c = (c>9) ? c+'a'-10 : c + '0';
+        hex_digest[j++] = c;
+        c = (digest[i] & 0xf);
+	c = (c>9) ? c+'a'-10 : c + '0';
+        hex_digest[j++] = c;
+    }
+    return retval;
+}
+
+PyDoc_STRVAR(SHA_update__doc__,
+"Update this hashing object's state with the provided string.");
+
+static PyObject *
+SHA_update(SHAobject *self, PyObject *args)
+{
+    unsigned char *cp;
+    int len;
+
+    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
+        return NULL;
+
+    sha_update(self, cp, len);
+
+    Py_INCREF(Py_None);
+    return Py_None;
+}
+
+static PyMethodDef SHA_methods[] = {
+    {"copy",	  (PyCFunction)SHA_copy,      METH_NOARGS,  SHA_copy__doc__},
+    {"digest",	  (PyCFunction)SHA_digest,    METH_NOARGS,  SHA_digest__doc__},
+    {"hexdigest", (PyCFunction)SHA_hexdigest, METH_NOARGS,  SHA_hexdigest__doc__},
+    {"update",	  (PyCFunction)SHA_update,    METH_VARARGS, SHA_update__doc__},
+    {NULL,	  NULL}		/* sentinel */
+};
+
+static PyObject *
+SHA_get_block_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(SHA_BLOCKSIZE);
+}
+
+static PyObject *
+SHA_get_digest_size(PyObject *self, void *closure)
+{
+    return PyInt_FromLong(SHA_DIGESTSIZE);
+}
+
+static PyObject *
+SHA_get_name(PyObject *self, void *closure)
+{
+    return PyString_FromStringAndSize("SHA1", 4);
+}
+
+static PyGetSetDef SHA_getseters[] = {
+    {"digest_size",
+     (getter)SHA_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {"block_size",
+     (getter)SHA_get_block_size, NULL,
+     NULL,
+     NULL},
+    {"name",
+     (getter)SHA_get_name, NULL,
+     NULL,
+     NULL},
+    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
+     * the old sha module also supported 'digestsize'.  ugh. */
+    {"digestsize",
+     (getter)SHA_get_digest_size, NULL,
+     NULL,
+     NULL},
+    {NULL}  /* Sentinel */
+};
+
+static PyTypeObject SHAtype = {
+    PyObject_HEAD_INIT(NULL)
+    0,			/*ob_size*/
+    "_sha.sha",		/*tp_name*/
+    sizeof(SHAobject),	/*tp_size*/
+    0,			/*tp_itemsize*/
+    /* methods */
+    SHA_dealloc,	/*tp_dealloc*/
+    0,			/*tp_print*/
+    0,                  /*tp_getattr*/
+    0,                  /*tp_setattr*/
+    0,                  /*tp_compare*/
+    0,                  /*tp_repr*/
+    0,                  /*tp_as_number*/
+    0,                  /*tp_as_sequence*/
+    0,                  /*tp_as_mapping*/
+    0,                  /*tp_hash*/
+    0,                  /*tp_call*/
+    0,                  /*tp_str*/
+    0,                  /*tp_getattro*/
+    0,                  /*tp_setattro*/
+    0,                  /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT, /*tp_flags*/
+    0,                  /*tp_doc*/
+    0,                  /*tp_traverse*/
+    0,			/*tp_clear*/
+    0,			/*tp_richcompare*/
+    0,			/*tp_weaklistoffset*/
+    0,			/*tp_iter*/
+    0,			/*tp_iternext*/
+    SHA_methods,	/* tp_methods */
+    0,                  /* tp_members */
+    SHA_getseters,      /* tp_getset */
+};
+
+
+/* The single module-level function: new() */
+
+PyDoc_STRVAR(SHA_new__doc__,
+"Return a new SHA hashing object.  An optional string argument\n\
+may be provided; if present, this string will be automatically\n\
+hashed.");
+
+static PyObject *
+SHA_new(PyObject *self, PyObject *args, PyObject *kwdict)
+{
+    static char *kwlist[] = {"string", NULL};
+    SHAobject *new;
+    unsigned char *cp = NULL;
+    int len;
+
+    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
+                                     &cp, &len)) {
+        return NULL;
+    }
+
+    if ((new = newSHAobject()) == NULL)
+        return NULL;
+
+    sha_init(new);
+
+    if (PyErr_Occurred()) {
+        Py_DECREF(new);
+        return NULL;
+    }
+    if (cp)
+        sha_update(new, cp, len);
+
+    return (PyObject *)new;
+}
+
+
+/* List of functions exported by this module */
+
+static struct PyMethodDef SHA_functions[] = {
+    {"new", (PyCFunction)SHA_new, METH_VARARGS|METH_KEYWORDS, SHA_new__doc__},
+    {NULL,	NULL}		 /* Sentinel */
+};
+
+
+/* Initialize this module. */
+
+#define insint(n,v) { PyModule_AddIntConstant(m,n,v); }
+
+PyMODINIT_FUNC
+init_sha(void)
+{
+    PyObject *m;
+
+    SHAtype.ob_type = &PyType_Type;
+    if (PyType_Ready(&SHAtype) < 0)
+        return;
+    m = Py_InitModule("_sha", SHA_functions);
+    if (m == NULL)
+	return;
+
+    /* Add some symbolic constants to the module */
+    insint("blocksize", 1);  /* For future use, in case some hash
+                                functions require an integral number of
+                                blocks */ 
+    insint("digestsize", 20);
+    insint("digest_size", 20);
+}
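The backported `_sha` module mirrors the interface of the standard `hashlib` SHA-1 constructor (new/update/digest/hexdigest, `digest_size` of 20). A minimal sketch of the equivalent usage with the stdlib `hashlib`, assuming the backport behaves identically:

```python
import hashlib

# equivalent to _sha.new("Nobody inspects") in the backport
h = hashlib.sha1(b"Nobody inspects")
h.update(b" the spammish repetition")  # repeated update() calls concatenate
digest = h.hexdigest()

# SHA-1 digests are 20 bytes (40 hex characters), matching the
# digest_size constant added by init_sha() above
assert h.digest_size == 20
assert len(digest) == 40
```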
diff --git a/src/distutils2/_backport/hashlib.py b/src/distutils2/_backport/hashlib.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/_backport/hashlib.py
@@ -0,0 +1,143 @@
+# $Id$
+#
+#  Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
+#  Licensed to PSF under a Contributor Agreement.
+#
+
+__doc__ = """hashlib module - A common interface to many hash functions.
+
+new(name, string='') - returns a new hash object implementing the
+                       given hash function; initializing the hash
+                       using the given string data.
+
+Named constructor functions are also available, these are much faster
+than using new():
+
+md5(), sha1(), sha224(), sha256(), sha384(), and sha512()
+
+More algorithms may be available on your platform but the above are
+guaranteed to exist.
+
+NOTE: If you want the adler32 or crc32 hash functions they are available in
+the zlib module.
+
+Choose your hash function wisely.  Some have known collision weaknesses.
+sha384 and sha512 will be slow on 32 bit platforms.
+
+Hash objects have these methods:
+ - update(arg): Update the hash object with the string arg. Repeated calls
+                are equivalent to a single call with the concatenation of all
+                the arguments.
+ - digest():    Return the digest of the strings passed to the update() method
+                so far. This may contain non-ASCII characters, including
+                NUL bytes.
+ - hexdigest(): Like digest() except the digest is returned as a string of
+                double length, containing only hexadecimal digits.
+ - copy():      Return a copy (clone) of the hash object. This can be used to
+                efficiently compute the digests of strings that share a common
+                initial substring.
+
+For example, to obtain the digest of the string 'Nobody inspects the
+spammish repetition':
+
+    >>> import hashlib
+    >>> m = hashlib.md5()
+    >>> m.update("Nobody inspects")
+    >>> m.update(" the spammish repetition")
+    >>> m.digest()
+    '\\xbbd\\x9c\\x83\\xdd\\x1e\\xa5\\xc9\\xd9\\xde\\xc9\\xa1\\x8d\\xf0\\xff\\xe9'
+
+More condensed:
+
+    >>> hashlib.sha224("Nobody inspects the spammish repetition").hexdigest()
+    'a4337bc45a8fc544c03f52dc550cd6e1e87021bc896588bd79e901e2'
+
+"""
+
+# This tuple and __get_builtin_constructor() must be modified if a new
+# always available algorithm is added.
+__always_supported = ('md5', 'sha1', 'sha224', 'sha256', 'sha384', 'sha512')
+
+algorithms = __always_supported
+
+__all__ = __always_supported + ('new', 'algorithms')
+
+
+def __get_builtin_constructor(name):
+    if name in ('SHA1', 'sha1'):
+        import _sha
+        return _sha.new
+    elif name in ('MD5', 'md5'):
+        import _md5
+        return _md5.new
+    elif name in ('SHA256', 'sha256', 'SHA224', 'sha224'):
+        import _sha256
+        bs = name[3:]
+        if bs == '256':
+            return _sha256.sha256
+        elif bs == '224':
+            return _sha256.sha224
+    elif name in ('SHA512', 'sha512', 'SHA384', 'sha384'):
+        import _sha512
+        bs = name[3:]
+        if bs == '512':
+            return _sha512.sha512
+        elif bs == '384':
+            return _sha512.sha384
+
+    raise ValueError('unsupported hash type %s' % name)
+
+
+def __get_openssl_constructor(name):
+    try:
+        f = getattr(_hashlib, 'openssl_' + name)
+        # Allow the C module to raise ValueError.  The function will be
+        # defined but the hash not actually available thanks to OpenSSL.
+        f()
+        # Use the C function directly (very fast)
+        return f
+    except (AttributeError, ValueError):
+        return __get_builtin_constructor(name)
+
+
+def __py_new(name, string=''):
+    """new(name, string='') - Return a new hashing object using the named algorithm;
+    optionally initialized with a string.
+    """
+    return __get_builtin_constructor(name)(string)
+
+
+def __hash_new(name, string=''):
+    """new(name, string='') - Return a new hashing object using the named algorithm;
+    optionally initialized with a string.
+    """
+    try:
+        return _hashlib.new(name, string)
+    except ValueError:
+        # If the _hashlib module (OpenSSL) doesn't support the named
+        # hash, try using our builtin implementations.
+        # This allows for SHA224/256 and SHA384/512 support even though
+        # the OpenSSL library prior to 0.9.8 doesn't provide them.
+        return __get_builtin_constructor(name)(string)
+
+
+try:
+    import _hashlib
+    new = __hash_new
+    __get_hash = __get_openssl_constructor
+except ImportError:
+    new = __py_new
+    __get_hash = __get_builtin_constructor
+
+for __func_name in __always_supported:
+    # try them all, some may not work due to the OpenSSL
+    # version not supporting that algorithm.
+    try:
+        globals()[__func_name] = __get_hash(__func_name)
+    except ValueError:
+        import logging
+        logging.exception('code for hash %s was not found.', __func_name)
+
+# Cleanup locals()
+del __always_supported, __func_name, __get_hash
+del __py_new, __hash_new, __get_openssl_constructor
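The fallback logic above means `new(name)` and the named constructors must agree on their output; a quick sketch against the stdlib `hashlib` (the backport is assumed to match its behavior):

```python
import hashlib

# new() dispatches on the algorithm name at call time...
d1 = hashlib.new('sha256', b'abc').hexdigest()
# ...while the named constructor skips the lookup, which is why the
# docstring above calls the named constructors "much faster"
d2 = hashlib.sha256(b'abc').hexdigest()

assert d1 == d2
assert len(d1) == 64  # SHA-256 produces a 32-byte (64 hex char) digest
```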
diff --git a/src/distutils2/_backport/tests/test_pkgutil.py b/src/distutils2/_backport/tests/test_pkgutil.py
--- a/src/distutils2/_backport/tests/test_pkgutil.py
+++ b/src/distutils2/_backport/tests/test_pkgutil.py
@@ -1,5 +1,7 @@
 # -*- coding: utf-8 -*-
 """Tests for PEP 376 pkgutil functionality"""
+import unittest2
+import unittest2.compatibility
 import sys
 import os
 import csv
@@ -13,7 +15,6 @@
     from md5 import md5
 
 from test.test_support import run_unittest, TESTFN
-from distutils2.tests.support import unittest
 
 from distutils2._backport import pkgutil
 
@@ -23,15 +24,13 @@
 # TODO Add a test for absolute pathed RECORD items (e.g. /etc/myapp/config.ini)
 
 # Adapted from Python 2.7's trunk
-class TestPkgUtilData(unittest.TestCase):
+class TestPkgUtilData(unittest2.TestCase):
 
     def setUp(self):
-        super(TestPkgUtilData, self).setUp()
         self.dirname = tempfile.mkdtemp()
         sys.path.insert(0, self.dirname)
 
     def tearDown(self):
-        super(TestPkgUtilData, self).tearDown()
         del sys.path[0]
         shutil.rmtree(self.dirname)
 
@@ -46,22 +45,15 @@
         os.mkdir(package_dir)
         # Empty init.py
         f = open(os.path.join(package_dir, '__init__.py'), "wb")
-        try:
-            pass
-        finally:
-            f.close()
+        f.close()
         # Resource files, res.txt, sub/res.txt
         f = open(os.path.join(package_dir, 'res.txt'), "wb")
-        try:
-            f.write(RESOURCE_DATA)
-        finally:
-            f.close()
+        f.write(RESOURCE_DATA)
+        f.close()
         os.mkdir(os.path.join(package_dir, 'sub'))
         f = open(os.path.join(package_dir, 'sub', 'res.txt'), "wb")
-        try:
-            f.write(RESOURCE_DATA)
-        finally:
-            f.close()
+        f.write(RESOURCE_DATA)
+        f.close()
 
         # Check we can read the resources
         res1 = pkgutil.get_data(pkg, 'res.txt')
@@ -81,14 +73,13 @@
         # Make a package with some resources
         zip_file = os.path.join(self.dirname, zip)
         z = zipfile.ZipFile(zip_file, 'w')
-        try:
-            # Empty init.py
-            z.writestr(pkg + '/__init__.py', "")
-            # Resource files, res.txt, sub/res.txt
-            z.writestr(pkg + '/res.txt', RESOURCE_DATA)
-            z.writestr(pkg + '/sub/res.txt', RESOURCE_DATA)
-        finally:
-            z.close()
+
+        # Empty init.py
+        z.writestr(pkg + '/__init__.py', "")
+        # Resource files, res.txt, sub/res.txt
+        z.writestr(pkg + '/res.txt', RESOURCE_DATA)
+        z.writestr(pkg + '/sub/res.txt', RESOURCE_DATA)
+        z.close()
 
         # Check we can read the resources
         sys.path.insert(0, zip_file)
@@ -101,7 +92,7 @@
         del sys.modules[pkg]
 
 # Adapted from Python 2.7's trunk
-class TestPkgUtilPEP302(unittest.TestCase):
+class TestPkgUtilPEP302(unittest2.TestCase):
 
     class MyTestLoader(object):
         def load_module(self, fullname):
@@ -123,12 +114,10 @@
             return TestPkgUtilPEP302.MyTestLoader()
 
     def setUp(self):
-        super(TestPkgUtilPEP302, self).setUp()
         sys.meta_path.insert(0, self.MyTestImporter())
 
     def tearDown(self):
         del sys.meta_path[0]
-        super(TestPkgUtilPEP302, self).tearDown()
 
     def test_getdata_pep302(self):
         # Use a dummy importer/loader
@@ -146,11 +135,10 @@
         del sys.modules['foo']
 
 
-class TestPkgUtilDistribution(unittest.TestCase):
+class TestPkgUtilDistribution(unittest2.TestCase):
     """Tests the pkgutil.Distribution class"""
 
     def setUp(self):
-        super(TestPkgUtilDistribution, self).setUp()
         self.fake_dists_path = os.path.abspath(
             os.path.join(os.path.dirname(__file__), 'fake_dists'))
 
@@ -199,7 +187,6 @@
         for distinfo_dir in self.distinfo_dirs:
             record_file = os.path.join(distinfo_dir, 'RECORD')
             open(record_file, 'w').close()
-        super(TestPkgUtilDistribution, self).tearDown()
 
     def test_instantiation(self):
         """Test the Distribution class's instantiation provides us with usable
@@ -302,11 +289,10 @@
         self.assertEqual(sorted(found), sorted(distinfo_record_paths))
 
 
-class TestPkgUtilPEP376(unittest.TestCase):
+class TestPkgUtilPEP376(unittest2.TestCase):
     """Tests for the new functionality added in PEP 376."""
 
     def setUp(self):
-        super(TestPkgUtilPEP376, self).setUp()
         # Setup the path environment with our fake distributions
         current_path = os.path.abspath(os.path.dirname(__file__))
         self.sys_path = sys.path[:]
@@ -315,7 +301,6 @@
 
     def tearDown(self):
         sys.path[:] = self.sys_path
-        super(TestPkgUtilPEP376, self).tearDown()
 
     def test_distinfo_dirname(self):
         """Given a name and a version, we expect the distinfo_dirname function
@@ -543,8 +528,8 @@
 
 
 def test_suite():
-    suite = unittest.TestSuite()
-    testcase_loader = unittest.loader.defaultTestLoader.loadTestsFromTestCase
+    suite = unittest2.TestSuite()
+    testcase_loader = unittest2.loader.defaultTestLoader.loadTestsFromTestCase
     suite.addTest(testcase_loader(TestPkgUtilData))
     suite.addTest(testcase_loader(TestPkgUtilDistribution))
     suite.addTest(testcase_loader(TestPkgUtilPEP302))
diff --git a/src/distutils2/command/cmd.py b/src/distutils2/command/cmd.py
--- a/src/distutils2/command/cmd.py
+++ b/src/distutils2/command/cmd.py
@@ -57,8 +57,7 @@
     def __init__(self, dist):
         """Create and initialize a new Command object.  Most importantly,
         invokes the 'initialize_options()' method, which is the real
-        initializer and depends on the actual command being
-        instantiated.
+        initializer and depends on the actual command being instantiated.
         """
         # late import because of mutual dependence between these classes
         from distutils2.dist import Distribution
diff --git a/src/distutils2/command/upload.py b/src/distutils2/command/upload.py
--- a/src/distutils2/command/upload.py
+++ b/src/distutils2/command/upload.py
@@ -160,7 +160,7 @@
         # send the data
         try:
             result = urlopen(request)
-            status = result.getcode()
+            status = result.code
             reason = result.msg
         except socket.error, e:
             self.announce(str(e), log.ERROR)
diff --git a/src/distutils2/command/upload_docs.py b/src/distutils2/command/upload_docs.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/command/upload_docs.py
@@ -0,0 +1,135 @@
+import base64, httplib, os.path, socket, tempfile, urlparse, zipfile
+from cStringIO import StringIO
+from distutils2 import log
+from distutils2.command.upload import upload
+from distutils2.core import PyPIRCCommand
+from distutils2.errors import DistutilsFileError
+
+def zip_dir(directory):
+    """Compresses recursively contents of directory into a StringIO object"""
+    destination = StringIO()
+    zip_file = zipfile.ZipFile(destination, "w")
+    for root, dirs, files in os.walk(directory):
+        for name in files:
+            full = os.path.join(root, name)
+            relative = root[len(directory):].lstrip(os.path.sep)
+            dest = os.path.join(relative, name)
+            zip_file.write(full, dest)
+    zip_file.close()
+    return destination
+
+# grabbed from
+#    http://code.activestate.com/recipes/146306-http-client-to-post-using-multipartform-data/
+def encode_multipart(fields, files, boundary=None):
+    """
+    fields is a sequence of (name, value) elements for regular form fields.
+    files is a sequence of (name, filename, value) elements for data to be uploaded as files
+    Return (content_type, body) ready for httplib.HTTP instance
+    """
+    if boundary is None:
+        boundary = '--------------GHSKFJDLGDS7543FJKLFHRE75642756743254'
+    l = []
+    for (key, value) in fields:
+        l.extend([
+            '--' + boundary,
+            'Content-Disposition: form-data; name="%s"' % key,
+            '',
+            value])
+    for (key, filename, value) in files:
+        l.extend([
+            '--' + boundary,
+            'Content-Disposition: form-data; name="%s"; filename="%s"' % (key, filename),
+            '',
+            value])
+    l.append('--' + boundary + '--')
+    l.append('')
+    body =  '\r\n'.join(l)
+    content_type = 'multipart/form-data; boundary=%s' % boundary
+    return content_type, body
+
+class upload_docs(PyPIRCCommand):
+
+    user_options = [
+        ('repository=', 'r', "url of repository [default: %s]" % upload.DEFAULT_REPOSITORY),
+        ('show-response', None, 'display full response text from server'),
+        ('upload-dir=', None, 'directory to upload'),
+        ]
+
+    def initialize_options(self):
+        PyPIRCCommand.initialize_options(self)
+        self.upload_dir = "build/docs"
+
+    def finalize_options(self):
+        PyPIRCCommand.finalize_options(self)
+        if self.upload_dir is None:
+            build = self.get_finalized_command('build')
+            self.upload_dir = os.path.join(build.build_base, "docs")
+        self.announce('Using upload directory %s' % self.upload_dir)
+        self.verify_upload_dir(self.upload_dir)
+        config = self._read_pypirc()
+        if config != {}:
+            self.username = config['username']
+            self.password = config['password']
+            self.repository = config['repository']
+            self.realm = config['realm']
+
+    def verify_upload_dir(self, upload_dir):
+        self.ensure_dirname('upload_dir')
+        index_location = os.path.join(upload_dir, "index.html")
+        if not os.path.exists(index_location):
+            mesg = "No 'index.html found in docs directory (%s)"
+            raise DistutilsFileError(mesg % upload_dir)
+
+    def run(self):
+        tmp_dir = tempfile.mkdtemp()
+        name = self.distribution.metadata['Name']
+        zip_file = zip_dir(self.upload_dir)
+
+        fields = {':action': 'doc_upload', 'name': name}.items()
+        files = [('content', name, zip_file.getvalue())]
+        content_type, body = encode_multipart(fields, files)
+
+        credentials = self.username + ':' + self.password
+        auth = "Basic " + base64.encodestring(credentials).strip()
+
+        self.announce("Submitting documentation to %s" % (self.repository),
+                      log.INFO)
+
+        schema, netloc, url, params, query, fragments = \
+            urlparse.urlparse(self.repository)
+        if schema == "http":
+            conn = httplib.HTTPConnection(netloc)
+        elif schema == "https":
+            conn = httplib.HTTPSConnection(netloc)
+        else:
+            raise AssertionError("unsupported schema "+schema)
+
+        try:
+            conn.connect()
+            conn.putrequest("POST", url)
+            conn.putheader('Content-type', content_type)
+            conn.putheader('Content-length', str(len(body)))
+            conn.putheader('Authorization', auth)
+            conn.endheaders()
+            conn.send(body)
+        except socket.error, e:
+            self.announce(str(e), log.ERROR)
+            return
+
+        r = conn.getresponse()
+
+        if r.status == 200:
+            self.announce('Server response (%s): %s' % (r.status, r.reason),
+                          log.INFO)
+        elif r.status == 301:
+            location = r.getheader('Location')
+            if location is None:
+                location = 'http://packages.python.org/%s/' % name
+            self.announce('Upload successful. Visit %s' % location,
+                          log.INFO)
+        else:
+            self.announce('Upload failed (%s): %s' % (r.status, r.reason),
+                          log.ERROR)
+
+        if self.show_response:
+            print "\n".join(['-'*75, r.read(), '-'*75])
diff --git a/src/distutils2/dist.py b/src/distutils2/dist.py
--- a/src/distutils2/dist.py
+++ b/src/distutils2/dist.py
@@ -150,8 +150,8 @@
         # information here (and enough command-line options) that it's
         # worth it.  Also delegate 'get_XXX()' methods to the 'metadata'
         # object in a sneaky and underhanded (but efficient!) way.
+        self.metadata = DistributionMetadata()
 
-        self.metadata = DistributionMetadata()
         #for basename in self.metadata._METHOD_BASENAMES:
         #    method_name = "get_" + basename
         #    setattr(self, method_name, getattr(self.metadata, method_name))
diff --git a/src/distutils2/pypi/__init__.py b/src/distutils2/pypi/__init__.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/pypi/__init__.py
@@ -0,0 +1,8 @@
+"""distutils2.pypi
+
+Package containing ways to interact with the PyPI APIs.
+""" 
+
+__all__ = ['simple',
+           'dist',
+]
diff --git a/src/distutils2/pypi/dist.py b/src/distutils2/pypi/dist.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/pypi/dist.py
@@ -0,0 +1,315 @@
+"""distutils2.pypi.dist
+
+Provides the PyPIDistribution class that represents a distribution retrieved
+from PyPI.
+"""
+import re
+import urlparse
+import urllib
+import tempfile
+from operator import attrgetter
+
+try:
+    import hashlib
+except ImportError:
+    from distutils2._backport import hashlib
+
+from distutils2.version import suggest_normalized_version, NormalizedVersion
+from distutils2.pypi.errors import (HashDoesNotMatch, UnsupportedHashName,
+                                    CantParseArchiveName)
+
+EXTENSIONS = ".tar.gz .tar.bz2 .tar .zip .tgz .egg".split()
+MD5_HASH = re.compile(r'^.*#md5=([a-f0-9]+)$')
+
+
+class PyPIDistribution(object):
+    """Represents a distribution retrieved from PyPI.
+
+    This is a simple container for various attributes such as name, version,
+    downloaded_location, url etc.
+
+    The PyPIDistribution class is used by the pypi.*Index class to return
+    information about distributions.
+    """
+
+    @classmethod
+    def from_url(cls, url, probable_dist_name=None, is_external=True):
+        """Build a Distribution from a url archive (egg or zip or tgz).
+
+        :param url: complete url of the distribution
+        :param probable_dist_name: A probable name of the distribution.
+        :param is_external: Tell whether the url comes from an index or from
+                            an external URL.
+        """
+        # if the url contains a md5 hash, get it.
+        md5_hash = None
+        match = MD5_HASH.match(url)
+        if match is not None:
+            md5_hash = match.group(1)
+            # remove the hash
+            url = url.replace("#md5=%s" % md5_hash, "")
+
+        # parse the archive name to find dist name and version
+        archive_name = urlparse.urlparse(url)[2].split('/')[-1]
+        extension_matched = False
+        # remove the extension from the name
+        for ext in EXTENSIONS:
+            if archive_name.endswith(ext):
+                archive_name = archive_name[:-len(ext)]
+                extension_matched = True
+
+        name, version = split_archive_name(archive_name)
+        if extension_matched is True:
+            return PyPIDistribution(name, version, url=url, url_hashname="md5",
+                                    url_hashval=md5_hash, url_is_external=is_external)
+
+    def __init__(self, name, version, type=None, url=None, url_hashname=None,
+                 url_hashval=None, url_is_external=True):
+        """Create a new instance of PyPIDistribution.
+
+        :param name: the name of the distribution
+        :param version: the version of the distribution
+        :param type: the type of the dist (eg. source, bin-*, etc.)
+        :param url: URL where we found this distribution
+        :param url_hashname: the name of the hash we want to use. Refer to the
+                         hashlib.new documentation for more information.
+        :param url_hashval: the hash value.
+        :param url_is_external: we need to know if the provided url comes from an
+                            index browsing, or from an external resource.
+
+        """
+        self.name = name
+        self.version = NormalizedVersion(version)
+        self.type = type
+        # set the downloaded path to None by default, so that
+        # distributions are not downloaded multiple times
+        self.downloaded_location = None
+        # We store urls in dicts, because we need a bit more information
+        # than the simple URL. It will be used later to find the right url
+        # to use.
+        # We have two _url* attributes: _urls contains a list of dicts for
+        # the different urls, and _url caches the chosen url, so that the
+        # selection process is not run multiple times.
+        self._urls = []
+        self._url = None
+        self.add_url(url, url_hashname, url_hashval, url_is_external)
+
+    def add_url(self, url, hashname=None, hashval=None, is_external=True):
+        """Add a new url to the list of urls"""
+        if hashname is not None:
+            try:
+                hashlib.new(hashname)
+            except ValueError:
+                raise UnsupportedHashName(hashname)
+
+        self._urls.append({
+            'url': url,
+            'hashname': hashname,
+            'hashval': hashval,
+            'is_external': is_external,
+        })
+        # reset the url selection process
+        self._url = None
+
+    @property
+    def url(self):
+        """Pick up the right url for the list of urls in self.urls"""
+        # We return internal urls over externals.
+        # If there is more than one internal or external, return the first
+        # one.
+        if self._url is None:
+            if len(self._urls) > 1:
+                internals_urls = [u for u in self._urls
+                                  if not u['is_external']]
+                if len(internals_urls) >= 1:
+                    self._url = internals_urls[0]
+            if self._url is None:
+                self._url = self._urls[0]
+        return self._url
+
+    @property
+    def is_source(self):
+        """return if the distribution is a source one or not"""
+        return self.type == 'source'
+
+    @property
+    def is_final(self):
+        """proxy to version.is_final"""
+        return self.version.is_final
+
+    def download(self, path=None):
+        """Download the distribution to a path, and return it.
+
+        If a path is given, use it; otherwise, generate a new temporary one.
+        """
+        if path is None:
+            path = tempfile.mkdtemp()
+
+        # if we have not downloaded it yet, do it.
+        if self.downloaded_location is None:
+            url = self.url['url']
+            archive_name = urlparse.urlparse(url)[2].split('/')[-1]
+            filename, headers = urllib.urlretrieve(url,
+                                                   path + "/" + archive_name)
+            self.downloaded_location = filename
+            self._check_md5(filename)
+        return self.downloaded_location
+
+    def _check_md5(self, filename):
+        """Check that the md5 checksum of the given file matches the one in
+        url param"""
+        hashname = self.url['hashname']
+        expected_hashval = self.url['hashval']
+        if None not in (expected_hashval, hashname):
+            f = open(filename, 'rb')
+            hashval = hashlib.new(hashname)
+            hashval.update(f.read())
+            if hashval.hexdigest() != expected_hashval:
+                raise HashDoesNotMatch("got %s instead of %s"
+                    % (hashval.hexdigest(), expected_hashval))
+
+    def __repr__(self):
+        return "%s %s %s %s" \
+            % (self.__class__.__name__, self.name, self.version,
+               self.type or "")
+
+    def _check_is_comparable(self, other):
+        if not isinstance(other, PyPIDistribution):
+            raise TypeError("cannot compare %s and %s"
+                % (type(self).__name__, type(other).__name__))
+        elif self.name != other.name:
+            raise TypeError("cannot compare %s and %s"
+                % (self.name, other.name))
+
+    def __eq__(self, other):
+        self._check_is_comparable(other)
+        return self.version == other.version
+
+    def __lt__(self, other):
+        self._check_is_comparable(other)
+        return self.version < other.version
+
+    def __ne__(self, other):
+        return not self.__eq__(other)
+
+    def __gt__(self, other):
+        return not (self.__lt__(other) or self.__eq__(other))
+
+    def __le__(self, other):
+        return self.__eq__(other) or self.__lt__(other)
+
+    def __ge__(self, other):
+        return self.__eq__(other) or self.__gt__(other)
+
+    # See http://docs.python.org/reference/datamodel#object.__hash__
+    __hash__ = object.__hash__
+
+
+class PyPIDistributions(list):
+    """A container of PyPIDistribution objects.
+
+    Contains methods and facilities to sort and filter distributions.
+    """
+    def __init__(self, iterable=()):
+        # Route every item through self.append on instantiation, so that
+        # the deduplication logic below is not bypassed.
+        super(PyPIDistributions, self).__init__()
+        for item in iterable:
+            self.append(item)
+
+    def filter(self, predicate):
+        """Filter the distributions and return a subset of distributions that
+        match the given predicate
+        """
+        return PyPIDistributions(
+            [dist for dist in self if dist.name == predicate.name and
+            predicate.match(dist.version)])
+
+    def get_last(self, predicate, prefer_source=None, prefer_final=None):
+        """Return the most up to date version, that satisfy the given
+        predicate
+        """
+        distributions = self.filter(predicate)
+        distributions.sort_distributions(prefer_source, prefer_final, reverse=True)
+        return distributions[0]
+
+    def get_same_name_and_version(self):
+        """Return lists of PyPIDistribution objects that refer to the same
+        name and version number. This does not consider the type (source, binary,
+        etc.)"""
+        processed = []
+        duplicates = []
+        for dist in self:
+            if (dist.name, dist.version) not in processed:
+                processed.append((dist.name, dist.version))
+                found_duplicates = [d for d in self if d.name == dist.name and
+                                    d.version == dist.version]
+                if len(found_duplicates) > 1:
+                    duplicates.append(found_duplicates)
+        return duplicates
+
+    def append(self, o):
+        """Append a new distribution to the list.
+
+        If a distribution with the same name and version already exists, just
+        grab the URL information and add a new url to the existing one.
+        """
+        similar_dists = [d for d in self if d.name == o.name and
+                         d.version == o.version and d.type == o.type]
+        if len(similar_dists) > 0:
+            dist = similar_dists[0]
+            dist.add_url(**o.url)
+        else:
+            super(PyPIDistributions, self).append(o)
+
+    def sort_distributions(self, prefer_source=None, prefer_final=None,
+                           reverse=True, *args, **kwargs):
+        """order the results with the given properties"""
+
+        sort_by = []
+        if prefer_final is not None:
+            if prefer_final is True:
+                sort_by.append("is_final")
+        sort_by.append("version")
+
+        if prefer_source is not None:
+            if prefer_source is True:
+                sort_by.append("is_source")
+
+        super(PyPIDistributions, self).sort(
+            key=lambda i: [getattr(i, arg) for arg in sort_by],
+            reverse=reverse, *args, **kwargs)
+
+def split_archive_name(archive_name, probable_name=None):
+    """Split an archive name into two parts: name and version.
+
+    Return the tuple (name, version)
+    """
+    # Try to determine which part is the name and which is the version using
+    # the "-" separator. Take the larger part to be the version number, then
+    # reduce it if that does not work.
+    def eager_split(str, maxsplit=2):
+        # split using the "-" separator
+        splits = str.rsplit("-", maxsplit)
+        name = splits[0]
+        version = "-".join(splits[1:])
+        if version.startswith("-"):
+            version = version[1:]
+        if suggest_normalized_version(version) is None and maxsplit >= 0:
+            # we don't get a good version number: recurse!
+            return eager_split(str, maxsplit - 1)
+        else:
+            return (name, version)
+    if probable_name is not None:
+        probable_name = probable_name.lower()
+    name = None
+    if probable_name is not None and probable_name in archive_name:
+        # we get the name from probable_name, if given.
+        name = probable_name
+        version = archive_name[len(name):].lstrip('-')
+    else:
+        name, version = eager_split(archive_name)
+
+    version = suggest_normalized_version(version)
+    if version != "" and name != "":
+        return (name.lower(), version)
+    else:
+        raise CantParseArchiveName(archive_name)
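The core heuristic of `split_archive_name` is the right-to-left split on "-"; a simplified sketch that omits the `suggest_normalized_version` validation and the recursive widening of the split (function name here is illustrative only):

```python
def naive_split(archive_name):
    # take everything after the last "-" as the version; the real code
    # above starts with a wider split and re-checks with
    # suggest_normalized_version when the result is not a valid version
    name, _, version = archive_name.rpartition('-')
    return name.lower(), version

assert naive_split('FooBar-1.2.3') == ('foobar', '1.2.3')
assert naive_split('distribute-0.6.4') == ('distribute', '0.6.4')
```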
diff --git a/src/distutils2/pypi/errors.py b/src/distutils2/pypi/errors.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/pypi/errors.py
@@ -0,0 +1,33 @@
+"""distutils2.pypi.errors
+
+All errors and exceptions raised by PyPiIndex classes.
+"""
+from distutils2.errors import DistutilsError
+
+
+class PyPIError(DistutilsError):
+    """The base class for errors of the pypi python package."""
+
+
+class DistributionNotFound(PyPIError):
+    """No distribution match the given requirements."""
+
+
+class CantParseArchiveName(PyPIError):
+    """An archive name can't be parsed to find distribution name and version"""
+
+
+class DownloadError(PyPIError):
+    """An error has occurs while downloading"""
+
+
+class HashDoesNotMatch(DownloadError):
+    """Compared hashes does not match"""
+
+
+class UnsupportedHashName(PyPIError):
+    """A unsupported hashname has been used"""
+
+
+class UnableToDownload(PyPIError):
+    """All mirrors have been tried, without success"""
diff --git a/src/distutils2/pypi/simple.py b/src/distutils2/pypi/simple.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/pypi/simple.py
@@ -0,0 +1,395 @@
+"""pypi.simple
+
+Contains the class "SimpleIndex", a simple spider to find and retrieve
+distributions on the Python Package Index, using its "simple" API,
+available at http://pypi.python.org/simple/
+"""
+from fnmatch import translate
+import urlparse
+import sys
+import re
+import urllib2
+import httplib
+import socket
+
+from distutils2.version import VersionPredicate
+from distutils2.pypi.dist import PyPIDistribution, PyPIDistributions, \
+    EXTENSIONS
+from distutils2.pypi.errors import PyPIError, DistributionNotFound, \
+    DownloadError, UnableToDownload
+from distutils2 import __version__ as __distutils2_version__
+
+# -- Constants -----------------------------------------------
+PYPI_DEFAULT_INDEX_URL = "http://pypi.python.org/simple/"
+PYPI_DEFAULT_MIRROR_URL = "mirrors.pypi.python.org"
+DEFAULT_HOSTS = ("*",)
+SOCKET_TIMEOUT = 15
+USER_AGENT = "Python-urllib/%s distutils2/%s" % (
+    sys.version[:3], __distutils2_version__)
+
+# -- Regexps -------------------------------------------------
+EGG_FRAGMENT = re.compile(r'^egg=([-A-Za-z0-9_.]+)$')
+HREF = re.compile("""href\\s*=\\s*['"]?([^'"> ]+)""", re.I)
+PYPI_MD5 = re.compile(
+    '<a href="([^"#]+)">([^<]+)</a>\n\s+\\(<a (?:title="MD5 hash"\n\s+)'
+    'href="[^?]+\?:action=show_md5&amp;digest=([0-9a-f]{32})">md5</a>\\)')
+URL_SCHEME = re.compile('([-+.a-z0-9]{2,}):', re.I).match
+
+# This pattern matches a character entity reference (a decimal numeric
+# references, a hexadecimal numeric reference, or a named reference).
+ENTITY_SUB = re.compile(r'&(#(\d+|x[\da-fA-F]+)|[\w.:-]+);?').sub
+REL = re.compile("""<([^>]*\srel\s*=\s*['"]?([^'">]+)[^>]*)>""", re.I)
+
+
+def socket_timeout(timeout=SOCKET_TIMEOUT):
+    """Decorator to add a socket timeout when requesting pages on PyPI.
+    """
+    def _socket_timeout(func):
+        def _socket_timeout(self, *args, **kwargs):
+            old_timeout = socket.getdefaulttimeout()
+            if hasattr(self, "_timeout"):
+                timeout = self._timeout
+            socket.setdefaulttimeout(timeout)
+            try:
+                return func(self, *args, **kwargs)
+            finally:
+                socket.setdefaulttimeout(old_timeout)
+        return _socket_timeout
+    return _socket_timeout
+
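The `socket_timeout` decorator above temporarily swaps the process-wide default socket timeout around a method call, preferring a per-instance `_timeout` attribute when one exists. A self-contained sketch of the same pattern (the `Client` class and `current_timeout` method are made up for illustration):

```python
import socket

SOCKET_TIMEOUT = 15

def socket_timeout(timeout=SOCKET_TIMEOUT):
    """Temporarily install a default socket timeout around a method call,
    preferring the instance's _timeout attribute when present."""
    def _decorator(func):
        def _wrapper(self, *args, **kwargs):
            old_timeout = socket.getdefaulttimeout()
            # use the per-instance timeout if the object defines one
            socket.setdefaulttimeout(getattr(self, "_timeout", timeout))
            try:
                return func(self, *args, **kwargs)
            finally:
                # always restore the previous process-wide default
                socket.setdefaulttimeout(old_timeout)
        return _wrapper
    return _decorator

class Client(object):
    _timeout = 3

    @socket_timeout()
    def current_timeout(self):
        # observe the timeout in effect while the method runs
        return socket.getdefaulttimeout()

before = socket.getdefaulttimeout()
observed = Client().current_timeout()
restored = socket.getdefaulttimeout()
```

Because the default timeout is global to the process, the `try`/`finally` restore is what keeps the decorator safe to nest or combine with other socket users.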
+
+class SimpleIndex(object):
+    """Provides useful tools to request the Python Package Index simple API
+    """
+
+    def __init__(self, index_url=PYPI_DEFAULT_INDEX_URL, hosts=DEFAULT_HOSTS,
+                 follow_externals=False, prefer_source=True,
+                 prefer_final=False, mirrors_url=PYPI_DEFAULT_MIRROR_URL,
+                 mirrors=None, timeout=SOCKET_TIMEOUT):
+        """Class constructor.
+
+        :param index_url: the url of the simple index to search on.
+        :param follow_externals: tell if following external links is needed or
+                                 not. Default is False.
+        :param hosts: a list of hosts allowed to be processed while using
+                      follow_externals=True. Default behavior is to follow all
+                      hosts.
+        :param follow_externals: tell if following external links is needed or
+                                 not. Default is False.
+        :param prefer_source: if there is binary and source distributions, the
+                              source prevails.
+        :param prefer_final: if the version is not mentioned, and the last
+                             version is not a "final" one (alpha, beta, etc.),
+                             pick up the last final version.
+        :param mirrors_url: the url to look on for DNS records giving mirror
+                            adresses.
+        :param mirrors: a list of mirrors to check out if problems
+                             occurs while working with the one given in "url"
+        :param timeout: time in seconds to consider a url has timeouted.
+        """
+        self.follow_externals = follow_externals
+
+        if not index_url.endswith("/"):
+            index_url += "/"
+        self._index_urls = [index_url]
+        # if no mirrors are defined, use the method described in PEP 381.
+        if mirrors is None:
+            try:
+                mirrors = socket.gethostbyname_ex(mirrors_url)[-1]
+            except socket.gaierror:
+                mirrors = []
+        self._index_urls.extend(mirrors)
+        self._current_index_url = 0
+        self._timeout = timeout
+        self._prefer_source = prefer_source
+        self._prefer_final = prefer_final
+
+        # create a regexp to match all given hosts
+        self._allowed_hosts = re.compile('|'.join(map(translate, hosts))).match
+
+        # we keep an index of pages we have processed, in order to avoid
+        # scanning them multiple times (e.g. if several pages point to the
+        # same one)
+        self._processed_urls = []
+        self._distributions = {}
+
+    def find(self, requirements, prefer_source=None, prefer_final=None):
+        """Browse the PyPI to find distributions that fullfil the given
+        requirements.
+
+        :param requirements: A project name and it's distribution, using
+                             version specifiers, as described in PEP345.
+        :type requirements:  You can pass either a version.VersionPredicate
+                             or a string.
+        :param prefer_source: if there is binary and source distributions, the
+                              source prevails.
+        :param prefer_final: if the version is not mentioned, and the last
+                             version is not a "final" one (alpha, beta, etc.),
+                             pick up the last final version.
+        """
+        requirements = self._get_version_predicate(requirements)
+        if prefer_source is None:
+            prefer_source = self._prefer_source
+        if prefer_final is None:
+            prefer_final = self._prefer_final
+
+        # process the index for this project
+        self._process_pypi_page(requirements.name)
+
+        # filter with requirements and return the results
+        if requirements.name in self._distributions:
+            dists = self._distributions[requirements.name].filter(requirements)
+            dists.sort_distributions(prefer_source=prefer_source,
+                                     prefer_final=prefer_final)
+        else:
+            dists = []
+
+        return dists
+
+    def get(self, requirements, *args, **kwargs):
+        """Browse the PyPI index to find distributions that fullfil the
+        given requirements, and return the most recent one.
+
+        You can specify prefer_final and prefer_source arguments here.
+        If not, the default one will be used.
+        """
+        predicate = self._get_version_predicate(requirements)
+        dists = self.find(predicate, *args, **kwargs)
+
+        if len(dists) == 0:
+            raise DistributionNotFound(requirements)
+
+        return dists.get_last(predicate)
+
+    def download(self, requirements, temp_path=None, *args, **kwargs):
+        """Download the distribution, using the requirements.
+
+        If more than one distribution match the requirements, use the last
+        version.
+        Download the distribution, and put it in the temp_path. If no temp_path
+        is given, creates and return one.
+
+        Returns the complete absolute path to the downloaded archive.
+
+        :param requirements: The same as the find attribute of `find`.
+
+        You can specify prefer_final and prefer_source arguments here.
+        If not, the default one will be used.
+        """
+        return self.get(requirements, *args, **kwargs)\
+                   .download(path=temp_path)
+
+    def _get_version_predicate(self, requirements):
+        """Return a VersionPredicate object, from a string or an already
+        existing object.
+        """
+        if isinstance(requirements, str):
+            requirements = VersionPredicate(requirements)
+        return requirements
+
+    @property
+    def index_url(self):
+        return self._index_urls[self._current_index_url]
+
+    def _switch_to_next_mirror(self):
+        """Switch to the next mirror (eg. point self.index_url to the next
+        url.
+        """
+        # Internally, iter over the _index_url iterable, if we have read all
+        # of the available indexes, raise an exception.
+        if self._current_index_url < len(self._index_urls):
+            self._current_index_url = self._current_index_url + 1
+        else:
+            raise UnableToDownload("All mirrors fails")
+
+    def _is_browsable(self, url):
+        """Tell if the given URL can be browsed or not.
+
+        It uses the follow_externals and the hosts list to tell if the given
+        url is browsable or not.
+        """
+        # if _index_url is contained in the given URL, we are browsing the
+        # index, and it's always "browsable".
+        # local files are always considered browsable resources
+        if self.index_url in url or urlparse.urlparse(url)[0] == "file":
+            return True
+        elif self.follow_externals is True:
+            if self._allowed_hosts(urlparse.urlparse(url)[1]):  # 1 is netloc
+                return True
+            else:
+                return False
+        return False
+
+    def _is_distribution(self, link):
+        """Tell if the given URL matches to a distribution name or not.
+        """
+        #XXX find a better way to check that links are distributions
+        # Using a regexp ?
+        for ext in EXTENSIONS:
+            if ext in link:
+                return True
+        return False
+
+    def _register_dist(self, dist):
+        """Register a distribution as a part of fetched distributions for
+        SimpleIndex.
+
+        Return the PyPIDistributions object for the specified project name
+        """
+        # Internally, check if an entry exists with the project name; if not,
+        # create a new one, and if exists, add the dist to the pool.
+        if not dist.name in self._distributions:
+            self._distributions[dist.name] = PyPIDistributions()
+        self._distributions[dist.name].append(dist)
+        return self._distributions[dist.name]
+
+    def _process_url(self, url, project_name=None, follow_links=True):
+        """Process an url and search for distributions packages.
+
+        For each URL found, if it's a download, creates a PyPIdistribution
+        object. If it's a homepage and we can follow links, process it too.
+
+        :param url: the url to process
+        :param project_name: the project name we are searching for.
+        :param follow_links: Do not want to follow links more than from one
+                             level. This parameter tells if we want to follow
+                             the links we find (eg. run recursively this
+                             method on it)
+        """
+        f = self._open_url(url)
+        base_url = f.url
+        if url not in self._processed_urls:
+            self._processed_urls.append(url)
+            link_matcher = self._get_link_matcher(url)
+            for link, is_download in link_matcher(f.read(), base_url):
+                if link not in self._processed_urls:
+                    if self._is_distribution(link) or is_download:
+                        self._processed_urls.append(link)
+                        # it's a distribution, so create a dist object
+                        dist = PyPIDistribution.from_url(link, project_name,
+                                    is_external=not self.index_url in url)
+                        self._register_dist(dist)
+                    else:
+                        if self._is_browsable(link) and follow_links:
+                            self._process_url(link, project_name,
+                                follow_links=False)
+
+    def _get_link_matcher(self, url):
+        """Returns the right link matcher function of the given url
+        """
+        if self.index_url in url:
+            return self._simple_link_matcher
+        else:
+            return self._default_link_matcher
+
+    def _simple_link_matcher(self, content, base_url):
+        """Yield all links with a rel="download" or rel="homepage".
+
+        This matches the simple index requirements for matching links.
+        If follow_externals is set to False, dont yeld the external
+        urls.
+        """
+        for match in REL.finditer(content):
+            tag, rel = match.groups()
+            rels = map(str.strip, rel.lower().split(','))
+            if 'homepage' in rels or 'download' in rels:
+                for match in HREF.finditer(tag):
+                    url = urlparse.urljoin(base_url,
+                                           self._htmldecode(match.group(1)))
+                    if 'download' in rels or self._is_browsable(url):
+                        # yield (url, is_download) tuples
+                        yield (urlparse.urljoin(base_url, url),
+                               'download' in rels)
+
+    def _default_link_matcher(self, content, base_url):
+        """Yield all links found on the page.
+        """
+        for match in HREF.finditer(content):
+            url = urlparse.urljoin(base_url, self._htmldecode(match.group(1)))
+            if self._is_browsable(url):
+                yield (url, False)
+
+    def _process_pypi_page(self, name):
+        """Find and process a PyPI page for the given project name.
+
+        :param name: the name of the project to find the page
+        """
+        try:
+            # Browse and index the content of the given PyPI page.
+            url = self.index_url + name + "/"
+            self._process_url(url, name)
+        except DownloadError:
+            # if an error occurs, try with the next index_url
+            # (provided by the mirrors)
+            self._switch_to_next_mirror()
+            self._distributions.clear()
+            self._process_pypi_page(name)
+
+    @socket_timeout()
+    def _open_url(self, url):
+        """Open a urllib2 request, handling HTTP authentication, and local
+        files support.
+
+        """
+        try:
+            scheme, netloc, path, params, query, frag = urlparse.urlparse(url)
+
+            if scheme in ('http', 'https'):
+                auth, host = urllib2.splituser(netloc)
+            else:
+                auth = None
+
+            # add index.html automatically for filesystem paths
+            if scheme == 'file':
+                if url.endswith('/'):
+                    url += "index.html"
+
+            if auth:
+                auth = "Basic " + \
+                    urllib2.unquote(auth).encode('base64').strip()
+                new_url = urlparse.urlunparse((
+                    scheme, host, path, params, query, frag))
+                request = urllib2.Request(new_url)
+                request.add_header("Authorization", auth)
+            else:
+                request = urllib2.Request(url)
+            request.add_header('User-Agent', USER_AGENT)
+            fp = urllib2.urlopen(request)
+
+            if auth:
+                # Put authentication info back into request URL if same host,
+                # so that links found on the page will work
+                s2, h2, path2, param2, query2, frag2 = \
+                    urlparse.urlparse(fp.url)
+                if s2 == scheme and h2 == host:
+                    fp.url = urlparse.urlunparse(
+                        (s2, netloc, path2, param2, query2, frag2))
+
+            return fp
+        except (ValueError, httplib.InvalidURL), v:
+            msg = ' '.join([str(arg) for arg in v.args])
+            raise PyPIError('%s %s' % (url, msg))
+        except urllib2.HTTPError, v:
+            return v
+        except urllib2.URLError, v:
+            raise DownloadError("Download error for %s: %s" % (url, v.reason))
+        except httplib.BadStatusLine, v:
+            raise DownloadError('%s returned a bad status line. '
+                'The server might be down, %s' % (url, v.line))
+        except httplib.HTTPException, v:
+            raise DownloadError("Download error for %s: %s" % (url, v))
+
+    def _decode_entity(self, match):
+        what = match.group(1)
+        if what.startswith('#x'):
+            what = int(what[2:], 16)
+        elif what.startswith('#'):
+            what = int(what[1:])
+        else:
+            from htmlentitydefs import name2codepoint
+            what = name2codepoint.get(what, match.group(0))
+        return unichr(what)
+
+    def _htmldecode(self, text):
+        """Decode HTML entities in the given text."""
+        return ENTITY_SUB(self._decode_entity, text)
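The `REL` and `HREF` regexes are the core of `_simple_link_matcher`: `REL` finds tags carrying a `rel` attribute, and `HREF` extracts the link target from the matched tag. A self-contained sketch of that matching logic, using the same two patterns on a made-up page fragment:

```python
import re

# The two patterns SimpleIndex uses to extract links from simple-index pages
HREF = re.compile(r"""href\s*=\s*['"]?([^'"> ]+)""", re.I)
REL = re.compile(r"""<([^>]*\srel\s*=\s*['"]?([^'">]+)[^>]*)>""", re.I)

def match_simple_links(content):
    """Yield (url, is_download) for every rel="download"/"homepage" link."""
    for match in REL.finditer(content):
        tag, rel = match.groups()
        rels = [r.strip() for r in rel.lower().split(',')]
        if 'homepage' in rels or 'download' in rels:
            for href in HREF.finditer(tag):
                yield (href.group(1), 'download' in rels)

# made-up sample content: one download link, one unrelated link
page = ('<a rel="download" href="/foobar-0.1.tar.gz">foobar-0.1.tar.gz</a>'
        '<a href="unrelated.html">unrelated</a>')
links = list(match_simple_links(page))
```

The unrelated anchor is skipped because `REL` only matches tags that carry a `rel` attribute; the download link survives with `is_download` set to `True`.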
diff --git a/src/distutils2/tests/pypi_server.py b/src/distutils2/tests/pypi_server.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypi_server.py
@@ -0,0 +1,195 @@
+"""Mocked PyPI Server implementation, to use in tests.
+
+This module also provides a simple test case to extend if you need to use
+the PyPIServer for the whole duration of your test case. Be sure to read
+the documentation before any use.
+"""
+
+import Queue
+import threading
+import time
+import urllib2
+from BaseHTTPServer import HTTPServer
+from SimpleHTTPServer import SimpleHTTPRequestHandler
+import os.path
+import select
+
+from distutils2.tests.support import unittest
+
+PYPI_DEFAULT_STATIC_PATH = os.path.dirname(os.path.abspath(__file__)) + "/pypiserver"
+
+def use_pypi_server(*server_args, **server_kwargs):
+    """Decorator to make use of the PyPIServer for test methods, 
+    just when needed, and not for the entire duration of the testcase.
+    """
+    def wrapper(func):
+        def wrapped(*args, **kwargs):
+            server = PyPIServer(*server_args, **server_kwargs)
+            server.start()
+            try:
+                func(server=server, *args, **kwargs)
+            finally:
+                server.stop()
+        return wrapped
+    return wrapper
+
+class PyPIServerTestCase(unittest.TestCase):
+
+    def setUp(self):
+        super(PyPIServerTestCase, self).setUp()
+        self.pypi = PyPIServer()
+        self.pypi.start()
+
+    def tearDown(self):
+        super(PyPIServerTestCase, self).tearDown()
+        self.pypi.stop()
+
+class PyPIServer(threading.Thread):
+    """PyPI Mocked server.
+    Provides a mocked version of the PyPI APIs, to ease tests.
+
+    Support serving static content and serving previously given text.
+    """
+
+    def __init__(self, test_static_path=None,
+                 static_filesystem_paths=None, static_uri_paths=["simple"]):
+        """Initialize the server.
+
+        static_uri_paths and static_filesystem_paths provide, respectively,
+        the HTTP paths to serve statically and where to find the matching
+        files on the filesystem.
+        """
+        threading.Thread.__init__(self)
+        self._run = True
+        self.httpd = HTTPServer(('', 0), PyPIRequestHandler)
+        self.httpd.RequestHandlerClass.log_request = lambda *_: None
+        self.httpd.RequestHandlerClass.pypi_server = self
+        self.address = (self.httpd.server_name, self.httpd.server_port)
+        self.request_queue = Queue.Queue()
+        self._requests = []
+        self.default_response_status = 200
+        self.default_response_headers = [('Content-type', 'text/plain')]
+        self.default_response_data = "hello"
+
+        # initialize static paths / filesystems
+        self.static_uri_paths = static_uri_paths
+        if static_filesystem_paths is None:
+            # avoid sharing a mutable default argument between instances
+            static_filesystem_paths = ["default"]
+        if test_static_path is not None:
+            static_filesystem_paths.append(test_static_path)
+        self.static_filesystem_paths = [PYPI_DEFAULT_STATIC_PATH + "/" + path
+            for path in static_filesystem_paths]
+
+    def run(self):
+        # loop because we can't stop it otherwise, for python < 2.6
+        while self._run:
+            r, w, e = select.select([self.httpd], [], [], 0.5)
+            if r:
+                self.httpd.handle_request()
+
+    def stop(self):
+        """Stop the server (self.shutdown is not supported for python < 2.6)."""
+        self._run = False
+
+    def get_next_response(self):
+        return (self.default_response_status,
+                self.default_response_headers,
+                self.default_response_data)
+
+    @property
+    def requests(self):
+        """Use this property to get all requests that have been made
+        to the server
+        """
+        while True:
+            try:
+                self._requests.append(self.request_queue.get_nowait())
+            except Queue.Empty:
+                break
+        return self._requests
+
+    @property
+    def full_address(self):
+        return "http://%s:%s" % self.address
+
+
+class PyPIRequestHandler(SimpleHTTPRequestHandler):
+    # we need to access the pypi server while serving the content
+    pypi_server = None
+
+    def do_POST(self):
+        return self.serve_request()
+    def do_GET(self):
+        return self.serve_request()
+    def do_DELETE(self):
+        return self.serve_request()
+    def do_PUT(self):
+        return self.serve_request()
+
+    def serve_request(self):
+        """Serve the content.
+
+        Also record the requests to be accessed later. If the requested URL
+        matches a static URI, serve static content; otherwise serve
+        what is provided by the `get_next_response` method.
+        """
+        # record the request. Read the input only on PUT or POST requests
+        if self.command in ("PUT", "POST"):
+            if 'content-length' in self.headers.dict:
+                request_data = self.rfile.read(
+                    int(self.headers['content-length']))
+            else:
+                request_data = self.rfile.read()
+        elif self.command in ("GET", "DELETE"):
+            request_data = ''
+
+        self.pypi_server.request_queue.put((self, request_data))
+
+        # serve the content from local disk if the requested URL begins
+        # with a pattern defined in `static_uri_paths`
+        url_parts = self.path.split("/")
+        if (len(url_parts) > 1 and 
+                url_parts[1] in self.pypi_server.static_uri_paths):
+            data = None
+            # try the most recently added path first.
+            fs_paths = []
+            fs_paths.extend(self.pypi_server.static_filesystem_paths)
+            fs_paths.reverse()
+            relative_path = self.path
+            if self.path.endswith("/"):
+                relative_path += "index.html"
+            for fs_path in fs_paths:
+                try:
+                    fp = open(fs_path + relative_path)
+                    try:
+                        data = fp.read()
+                    finally:
+                        fp.close()
+                    if relative_path.endswith('.tar.gz'):
+                        headers = [('Content-type', 'application/x-gtar')]
+                    else:
+                        headers = [('Content-type', 'text/html')]
+                    self.make_response(data, headers=headers)
+                    break
+                except IOError:
+                    pass
+
+            if data is None:
+                self.make_response("Not found", 404)
+
+        # otherwise serve the content from get_next_response
+        else:
+            # send back a response
+            status, headers, data = self.pypi_server.get_next_response()
+            self.make_response(data, status, headers)
+
+    def make_response(self, data, status=200,
+                      headers=[('Content-type', 'text/html')]):
+        """Send the response to the HTTP client"""
+        if not isinstance(status, int):
+            try:
+                status = int(status)
+            except ValueError:
+                # we probably got something like "404 Not Found";
+                # just keep the first 3 digits
+                status = int(status[:3])
+
+        self.send_response(status)
+        for header, value in headers:
+            self.send_header(header, value)
+        self.end_headers()
+        self.wfile.write(data)
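The stoppable `select()` loop in `PyPIServer.run` is what lets the serving thread exit cleanly on Python < 2.6, where `HTTPServer.shutdown` is unavailable: the thread polls the listening socket with a short timeout and rechecks its `_run` flag between polls. A minimal runnable sketch of the same idea, written against Python 3's `http.server` here for self-containment (the `StubServer`/`StubHandler` names are made up):

```python
import select
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/plain')
        self.end_headers()
        self.wfile.write(b"hello")

    def log_request(self, *args):
        pass  # keep test output quiet

class StubServer(threading.Thread):
    """Same stoppable-loop idea as PyPIServer: poll with select() so the
    serving thread can exit once the _run flag is cleared."""

    def __init__(self):
        threading.Thread.__init__(self)
        self._run = True
        self.httpd = HTTPServer(('127.0.0.1', 0), StubHandler)
        self.full_address = "http://127.0.0.1:%d" % self.httpd.server_port

    def run(self):
        while self._run:
            # wake up twice per second to recheck the _run flag
            r, _, _ = select.select([self.httpd], [], [], 0.5)
            if r:
                self.httpd.handle_request()

    def stop(self):
        self._run = False
        self.join()

server = StubServer()
server.start()
try:
    body = urllib.request.urlopen(server.full_address + "/").read()
finally:
    server.stop()
```

Binding to port 0 lets the OS pick a free port, which is why the tests can run several servers without collisions.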
diff --git a/src/distutils2/tests/pypiserver/downloads_with_md5/simple/badmd5/badmd5-0.1.tar.gz b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/badmd5/badmd5-0.1.tar.gz
new file mode 100644
diff --git a/src/distutils2/tests/pypiserver/downloads_with_md5/simple/badmd5/index.html b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/badmd5/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/badmd5/index.html
@@ -0,0 +1,3 @@
+<html><body>
+<a href="badmd5-0.1.tar.gz#md5=3e3d86693d6564c807272b11b3069dfe" rel="download">badmd5-0.1.tar.gz</a><br/>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/foobar-0.1.tar.gz b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/foobar-0.1.tar.gz
new file mode 100644
diff --git a/src/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/index.html b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/index.html
@@ -0,0 +1,3 @@
+<html><body>
+<a href="foobar-0.1.tar.gz#md5=d41d8cd98f00b204e9800998ecf8427e" rel="download">foobar-0.1.tar.gz</a><br/>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/downloads_with_md5/simple/index.html b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/downloads_with_md5/simple/index.html
@@ -0,0 +1,2 @@
+<a href="foobar/">foobar/</a> 
+<a href="badmd5/">badmd5/</a> 
diff --git a/src/distutils2/tests/pypiserver/foo_bar_baz/simple/bar/index.html b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/bar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/bar/index.html
@@ -0,0 +1,6 @@
+<html><head><title>Links for bar</title></head><body><h1>Links for bar</h1>
+<a rel="download" href="../../packages/source/F/bar/bar-1.0.tar.gz">bar-1.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/bar/bar-1.0.1.tar.gz">bar-1.0.1.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/bar/bar-2.0.tar.gz">bar-2.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/bar/bar-2.0.1.tar.gz">bar-2.0.1.tar.gz</a><br/> 
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/foo_bar_baz/simple/baz/index.html b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/baz/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/baz/index.html
@@ -0,0 +1,6 @@
+<html><head><title>Links for baz</title></head><body><h1>Links for baz</h1>
+<a rel="download" href="../../packages/source/F/baz/baz-1.0.tar.gz">baz-1.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/baz/baz-1.0.1.tar.gz">baz-1.0.1.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/baz/baz-2.0.tar.gz">baz-2.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/baz/baz-2.0.1.tar.gz">baz-2.0.1.tar.gz</a><br/> 
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/foo_bar_baz/simple/foo/index.html b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/foo/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/foo/index.html
@@ -0,0 +1,6 @@
+<html><head><title>Links for foo</title></head><body><h1>Links for foo</h1>
+<a rel="download" href="../../packages/source/F/foo/foo-1.0.tar.gz">foo-1.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/foo/foo-1.0.1.tar.gz">foo-1.0.1.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/foo/foo-2.0.tar.gz">foo-2.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/foo/foo-2.0.1.tar.gz">foo-2.0.1.tar.gz</a><br/> 
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/foo_bar_baz/simple/index.html b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/foo_bar_baz/simple/index.html
@@ -0,0 +1,3 @@
+<a href="foo/">foo/</a> 
+<a href="bar/">bar/</a> 
+<a href="baz/">baz/</a> 
diff --git a/src/distutils2/tests/pypiserver/test_found_links/simple/foobar/index.html b/src/distutils2/tests/pypiserver/test_found_links/simple/foobar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/test_found_links/simple/foobar/index.html
@@ -0,0 +1,6 @@
+<html><head><title>Links for Foobar</title></head><body><h1>Links for Foobar</h1>
+<a rel="download" href="../../packages/source/F/Foobar/Foobar-1.0.tar.gz#md5=98fa833fdabcdd78d00245aead66c174">Foobar-1.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/Foobar/Foobar-1.0.1.tar.gz#md5=2351efb20f6b7b5d9ce80fa4cb1bd9ca">Foobar-1.0.1.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/Foobar/Foobar-2.0.tar.gz#md5=98fa833fdabcdd78d00245aead66c274">Foobar-2.0.tar.gz</a><br/> 
+<a rel="download" href="../../packages/source/F/Foobar/Foobar-2.0.1.tar.gz#md5=2352efb20f6b7b5d9ce80fa4cb2bd9ca">Foobar-2.0.1.tar.gz</a><br/> 
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/test_found_links/simple/index.html b/src/distutils2/tests/pypiserver/test_found_links/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/test_found_links/simple/index.html
@@ -0,0 +1,1 @@
+<a href="foobar/">foobar/</a> 
diff --git a/src/distutils2/tests/pypiserver/test_pypi_server/external/index.html b/src/distutils2/tests/pypiserver/test_pypi_server/external/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/test_pypi_server/external/index.html
@@ -0,0 +1,1 @@
+index.html from external server
diff --git a/src/distutils2/tests/pypiserver/test_pypi_server/simple/index.html b/src/distutils2/tests/pypiserver/test_pypi_server/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/test_pypi_server/simple/index.html
@@ -0,0 +1,1 @@
+Yeah
diff --git a/src/distutils2/tests/pypiserver/with_externals/external/external.html b/src/distutils2/tests/pypiserver/with_externals/external/external.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_externals/external/external.html
@@ -0,0 +1,3 @@
+<html><body>
+<a href="/foobar-0.1.tar.gz#md5=1__bad_md5___">bad old link</a>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/with_externals/simple/foobar/index.html b/src/distutils2/tests/pypiserver/with_externals/simple/foobar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_externals/simple/foobar/index.html
@@ -0,0 +1,4 @@
+<html><body>
+<a rel ="download" href="/foobar-0.1.tar.gz#md5=12345678901234567">foobar-0.1.tar.gz</a><br/>
+<a href="../../external/external.html" rel="homepage">external homepage</a><br/>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/with_externals/simple/index.html b/src/distutils2/tests/pypiserver/with_externals/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_externals/simple/index.html
@@ -0,0 +1,1 @@
+<a href="foobar/">foobar/</a> 
diff --git a/src/distutils2/tests/pypiserver/with_norel_links/external/homepage.html b/src/distutils2/tests/pypiserver/with_norel_links/external/homepage.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_norel_links/external/homepage.html
@@ -0,0 +1,7 @@
+<html>
+<body>
+<p>a rel=homepage HTML page</p>
+<a href="/foobar-2.0.tar.gz">foobar 2.0</a>
+</body>
+</html>
+
diff --git a/src/distutils2/tests/pypiserver/with_norel_links/external/nonrel.html b/src/distutils2/tests/pypiserver/with_norel_links/external/nonrel.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_norel_links/external/nonrel.html
@@ -0,0 +1,1 @@
+A page linked without rel="download" or rel="homepage" link.
diff --git a/src/distutils2/tests/pypiserver/with_norel_links/simple/foobar/index.html b/src/distutils2/tests/pypiserver/with_norel_links/simple/foobar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_norel_links/simple/foobar/index.html
@@ -0,0 +1,6 @@
+<html><body>
+<a href="/foobar-0.1.tar.gz" rel="download">foobar-0.1.tar.gz</a><br/>
+<a href="../../external/homepage.html" rel="homepage">external homepage</a><br/>
+<a href="../../external/nonrel.html">unrelated link</a><br/>
+<a href="/unrelated-0.2.tar.gz">unrelated download</a><br/>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/with_norel_links/simple/index.html b/src/distutils2/tests/pypiserver/with_norel_links/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_norel_links/simple/index.html
@@ -0,0 +1,1 @@
+<a href="foobar/">foobar/</a> 
diff --git a/src/distutils2/tests/pypiserver/with_real_externals/simple/foobar/index.html b/src/distutils2/tests/pypiserver/with_real_externals/simple/foobar/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_real_externals/simple/foobar/index.html
@@ -0,0 +1,4 @@
+<html><body>
+<a rel="download" href="/foobar-0.1.tar.gz#md5=0_correct_md5">foobar-0.1.tar.gz</a><br/>
+<a href="http://a-really-external-website/external/external.html" rel="homepage">external homepage</a><br/>
+</body></html>
diff --git a/src/distutils2/tests/pypiserver/with_real_externals/simple/index.html b/src/distutils2/tests/pypiserver/with_real_externals/simple/index.html
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/pypiserver/with_real_externals/simple/index.html
@@ -0,0 +1,1 @@
+<a href="foobar/">foobar/</a> 
diff --git a/src/distutils2/tests/test_pypi_dist.py b/src/distutils2/tests/test_pypi_dist.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/test_pypi_dist.py
@@ -0,0 +1,251 @@
+"""Tests for the distutils2.pypi.dist module."""
+
+import os
+import shutil
+import tempfile
+
+from distutils2.tests.pypi_server import use_pypi_server
+from distutils2.tests import support
+from distutils2.tests.support import unittest
+from distutils2.version import VersionPredicate
+from distutils2.pypi.errors import HashDoesNotMatch, UnsupportedHashName
+from distutils2.pypi.dist import (PyPIDistribution as Dist,
+                                  PyPIDistributions as Dists,
+                                  split_archive_name)
+
+
+class TestPyPIDistribution(support.TempdirManager,
+                           unittest.TestCase):
+    """Tests the pypi.dist.PyPIDistribution class"""
+
+    def test_instantiation(self):
+        """Test that the Distribution class exposes the attributes it is
+        given at construction"""
+        dist = Dist("FooBar", "1.1")
+        self.assertEqual("FooBar", dist.name)
+        self.assertEqual("1.1", "%s" % dist.version)
+
+    def test_from_url(self):
+        """Test that the Distribution object can be built from a single URL"""
+        url_list = {
+            'FooBar-1.1.0.tar.gz': {
+                'name': 'foobar',  # lowercase the name
+                'version': '1.1',
+            },
+            'Foo-Bar-1.1.0.zip': {
+                'name': 'foo-bar',  # keep the dash
+                'version': '1.1',
+            },
+            'foobar-1.1b2.tar.gz#md5=123123123123123': {
+                'name': 'foobar',
+                'version': '1.1b2',
+                'url': {
+                    'url': 'http://test.tld/foobar-1.1b2.tar.gz',  # hash stripped
+                    'hashval': '123123123123123',
+                    'hashname': 'md5',
+                }
+            },
+            'foobar-1.1-rc2.tar.gz': {  # use suggested name
+                'name': 'foobar',
+                'version': '1.1c2',
+                'url': {
+                    'url': 'http://test.tld/foobar-1.1-rc2.tar.gz',
+                }
+            }
+        }
+
+        for url, attributes in url_list.items():
+            dist = Dist.from_url("http://test.tld/" + url)
+            for attribute, value in attributes.items():
+                if isinstance(value, dict):
+                    mylist = getattr(dist, attribute)
+                    for val in value.keys():
+                        self.assertEqual(value[val], mylist[val])
+                else:
+                    if attribute == "version":
+                        self.assertEqual("%s" % getattr(dist, "version"), value)
+                    else:
+                        self.assertEqual(getattr(dist, attribute), value)
+
+    def test_get_url(self):
+        """Test that the url property works well"""
+
+        d = Dist("foobar", "1.1", url="test_url")
+        self.assertDictEqual(d.url, {
+            "url": "test_url",
+            "is_external": True,
+            "hashname": None,
+            "hashval": None,
+        })
+
+        # add a new url
+        d.add_url(url="internal_url", is_external=False)
+        self.assertEqual(d._url, None)
+        self.assertDictEqual(d.url, {
+            "url": "internal_url",
+            "is_external": False,
+            "hashname": None,
+            "hashval": None,
+        })
+        self.assertEqual(2, len(d._urls))
+
+    def test_comparison(self):
+        """Test that we can compare PyPIDistributions"""
+        foo1 = Dist("foo", "1.0")
+        foo2 = Dist("foo", "2.0")
+        bar = Dist("bar", "2.0")
+        # assert we use the version to compare
+        self.assertTrue(foo1 < foo2)
+        self.assertFalse(foo1 > foo2)
+        self.assertFalse(foo1 == foo2)
+
+        # assert we can't compare dists with different names
+        self.assertRaises(TypeError, foo1.__eq__, bar)
+
+    def test_split_archive_name(self):
+        """Test we can split the archive names"""
+        names = {
+            'foo-bar-baz-1.0-rc2': ('foo-bar-baz', '1.0c2'),
+            'foo-bar-baz-1.0': ('foo-bar-baz', '1.0'),
+            'foobarbaz-1.0': ('foobarbaz', '1.0'),
+        }
+        for name, results in names.items():
+            self.assertEqual(results, split_archive_name(name))
+
+    @use_pypi_server("downloads_with_md5")
+    def test_download(self, server):
+        """Download is possible, and the md5 is checked if given"""
+
+        url = "%s/simple/foobar/foobar-0.1.tar.gz" % server.full_address
+        # check md5 if given
+        dist = Dist("FooBar", "0.1", url=url,
+            url_hashname="md5", url_hashval="d41d8cd98f00b204e9800998ecf8427e")
+        dist.download()
+
+        # a wrong md5 fails
+        dist2 = Dist("FooBar", "0.1", url=url,
+            url_hashname="md5", url_hashval="wrongmd5")
+        self.assertRaises(HashDoesNotMatch, dist2.download)
+
+        # we can omit the md5 hash
+        dist3 = Dist("FooBar", "0.1", url=url)
+        dist3.download()
+
+        # and specify a temporary location
+        # for an already downloaded dist
+        path1 = tempfile.mkdtemp()
+        dist3.download(path=path1)
+        # and for a new one
+        path2_base = tempfile.mkdtemp()
+        dist4 = Dist("FooBar", "0.1", url=url)
+        path2 = dist4.download(path=path2_base)
+        self.assertTrue(path2_base in path2)
+
+        # remove the temp folders
+        shutil.rmtree(path1)
+        shutil.rmtree(os.path.dirname(path2))
+
+    def test_hashname(self):
+        """An invalid hashname raises an exception on assignment"""
+        # should be ok
+        Dist("FooBar", "0.1", url_hashname="md5", url_hashval="value")
+
+        self.assertRaises(UnsupportedHashName, Dist, "FooBar", "0.1",
+                          url_hashname="invalid_hashname", url_hashval="value")
+
+
+class TestPyPIDistributions(unittest.TestCase):
+    """Tests the pypi.dist.PyPIDistributions class"""
+
+    def test_filter(self):
+        """Test that we filter the distributions the right way, using the
+        version predicate match method"""
+        dists = Dists((
+            Dist("FooBar", "1.1"),
+            Dist("FooBar", "1.1.1"),
+            Dist("FooBar", "1.2"),
+            Dist("FooBar", "1.2.1"),
+        ))
+        filtered = dists.filter(VersionPredicate("FooBar (<1.2)"))
+        self.assertNotIn(dists[2], filtered)
+        self.assertNotIn(dists[3], filtered)
+        self.assertIn(dists[0], filtered)
+        self.assertIn(dists[1], filtered)
+
+    def test_append(self):
+        """Test the append method of PyPIDistributions"""
+        # When adding a new item to the list, the behavior is to check
+        # whether a distribution with the same name and version number
+        # already exists; if so, the URL information is added to the
+        # existing PyPIDistribution object.
+        # If no existing object matches, the new object is simply
+        # appended to the list.
+
+        dists = Dists([
+            Dist("FooBar", "1.1", url="external_url", type="source"),
+        ])
+        self.assertEqual(1, len(dists))
+        dists.append(Dist("FooBar", "1.1", url="internal_url",
+                          url_is_external=False, type="source"))
+        self.assertEqual(1, len(dists))
+        self.assertEqual(2, len(dists[0]._urls))
+
+        dists.append(Dist("Foobar", "1.1.1", type="source"))
+        self.assertEqual(2, len(dists))
+
+        # when adding a distribution with a different type, a new distribution
+        # has to be added.
+        dists.append(Dist("Foobar", "1.1.1", type="binary"))
+        self.assertEqual(3, len(dists))
+
+    def test_prefer_final(self):
+        """Ordering supports prefer_final"""
+
+        fb10 = Dist("FooBar", "1.0")  # final distribution
+        fb11a = Dist("FooBar", "1.1a1")  # alpha
+        fb12a = Dist("FooBar", "1.2a1")  # alpha
+        fb12b = Dist("FooBar", "1.2b1")  # beta
+        dists = Dists([fb10, fb11a, fb12a, fb12b])
+
+        dists.sort_distributions(prefer_final=True)
+        self.assertEqual(fb10, dists[0])
+
+        dists.sort_distributions(prefer_final=False)
+        self.assertEqual(fb12b, dists[0])
+
+    def test_prefer_source(self):
+        """Ordering supports prefer_source"""
+        fb_source = Dist("FooBar", "1.0", type="source")
+        fb_binary = Dist("FooBar", "1.0", type="binary")
+        fb2_binary = Dist("FooBar", "2.0", type="binary")
+        dists = Dists([fb_binary, fb_source])
+
+        dists.sort_distributions(prefer_source=True)
+        self.assertEqual(fb_source, dists[0])
+
+        dists.sort_distributions(prefer_source=False)
+        self.assertEqual(fb_binary, dists[0])
+
+        dists.append(fb2_binary)
+        dists.sort_distributions(prefer_source=True)
+        self.assertEqual(fb2_binary, dists[0])
+
+    def test_get_same_name_and_version(self):
+        """PyPIDistributions can return a list of "duplicates"
+        """
+        fb_source = Dist("FooBar", "1.0", type="source")
+        fb_binary = Dist("FooBar", "1.0", type="binary")
+        fb2_binary = Dist("FooBar", "2.0", type="binary")
+        dists = Dists([fb_binary, fb_source, fb2_binary])
+        duplicates = dists.get_same_name_and_version()
+        self.assertEqual(1, len(duplicates))
+        self.assertIn(fb_source, duplicates[0])
+
+
+def test_suite():
+    suite = unittest.TestSuite()
+    suite.addTest(unittest.makeSuite(TestPyPIDistribution))
+    suite.addTest(unittest.makeSuite(TestPyPIDistributions))
+    return suite
+
+if __name__ == '__main__':
+    unittest.main(defaultTest="test_suite")
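For reference, the behaviour exercised by test_split_archive_name above can be sketched in plain Python. This is an illustrative re-implementation, not distutils2's actual split_archive_name; the "-rcN" to "cN" normalization is inferred from the expected results in the test:

```python
def split_archive_name(archive_name):
    # Split at the first dash-separated part that starts with a digit;
    # everything before it is the name, the rest is the version.
    parts = archive_name.split('-')
    for i, part in enumerate(parts):
        if part[:1].isdigit():
            name = '-'.join(parts[:i])
            # normalize "-rcN" pre-releases to "cN" (inferred from the test)
            version = '-'.join(parts[i:]).replace('-rc', 'c')
            return name, version
    raise ValueError("no version found in %r" % archive_name)
```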
diff --git a/src/distutils2/tests/test_pypi_server.py b/src/distutils2/tests/test_pypi_server.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/test_pypi_server.py
@@ -0,0 +1,70 @@
+"""Tests for distutils2.tests.pypi_server."""
+import urllib
+import urllib2
+import os.path
+
+from distutils2.tests.pypi_server import PyPIServer, PYPI_DEFAULT_STATIC_PATH
+from distutils2.tests.support import unittest
+
+
+class PyPIServerTest(unittest.TestCase):
+
+    def test_records_requests(self):
+        """We expect that PyPIServer can log our requests"""
+        server = PyPIServer()
+        server.start()
+        self.assertEqual(len(server.requests), 0)
+
+        data = "Rock Around The Bunker"
+        headers = {"X-test-header": "Mister Iceberg"}
+
+        request = urllib2.Request(server.full_address, data, headers)
+        urllib2.urlopen(request)
+        self.assertEqual(len(server.requests), 1)
+        handler, request_data = server.requests[-1]
+        self.assertIn("Rock Around The Bunker", request_data)
+        self.assertIn("x-test-header", handler.headers.dict)
+        self.assertEqual(handler.headers.dict["x-test-header"],
+                         "Mister Iceberg")
+        server.stop()
+
+    def test_serve_static_content(self):
+        """The mocked PyPI server can serve static content from disk.
+        """
+
+        def uses_local_files_for(server, url_path):
+            """Check that files are served statically (i.e. the output from
+            the server is the same as a direct read of the file on disk).
+            """
+            url = server.full_address + url_path
+            request = urllib2.Request(url)
+            response = urllib2.urlopen(request)
+            file = open(PYPI_DEFAULT_STATIC_PATH + "/test_pypi_server" +
+               url_path)
+            return response.read() == file.read()
+
+        server = PyPIServer(static_uri_paths=["simple", "external"],
+            static_filesystem_paths=["test_pypi_server"])
+        server.start()
+
+        # the file does not exist on disk, so the server must return a 404
+        url = server.full_address + "/simple/unexisting_page"
+        request = urllib2.Request(url)
+        try:
+            urllib2.urlopen(request)
+        except urllib2.HTTPError, e:
+            self.assertEqual(e.code, 404)
+        else:
+            self.fail("a 404 error was expected for a missing file")
+
+        # now try serving content that does exist
+        self.assertTrue(uses_local_files_for(server, "/simple/index.html"))
+
+        # and another one in another root path
+        self.assertTrue(uses_local_files_for(server, "/external/index.html"))
+        server.stop()
+
+
+def test_suite():
+    return unittest.makeSuite(PyPIServerTest)
+
+if __name__ == '__main__':
+    unittest.main(defaultTest="test_suite")
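The PyPIServer fixture used by these tests records incoming requests so assertions can be made afterwards. The core mechanism can be sketched with the standard library alone (modern Python 3 shown for brevity, unlike the Python 2 code in the patch; RecordingHandler is our name, not the project's):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class RecordingHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # remember the request body so tests can assert on it afterwards
        length = int(self.headers.get("Content-Length", 0))
        self.server.requests.append(self.rfile.read(length))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep test output quiet

server = HTTPServer(("127.0.0.1", 0), RecordingHandler)
server.requests = []
thread = threading.Thread(target=server.serve_forever)
thread.daemon = True
thread.start()

full_address = "http://127.0.0.1:%d" % server.server_address[1]
urllib.request.urlopen(full_address + "/", data=b"Rock Around The Bunker")
server.shutdown()
```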
diff --git a/src/distutils2/tests/test_pypi_simple.py b/src/distutils2/tests/test_pypi_simple.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/test_pypi_simple.py
@@ -0,0 +1,290 @@
+"""Tests for the pypi.simple module.
+
+"""
+import sys
+import os
+import shutil
+import tempfile
+import urllib2
+
+from distutils2.tests import support
+from distutils2.tests.support import unittest
+from distutils2.tests.pypi_server import use_pypi_server, PyPIServer, \
+                                         PYPI_DEFAULT_STATIC_PATH
+from distutils2.pypi import simple
+
+from distutils2.errors import DistutilsError
+
+
+class PyPISimpleTestCase(support.TempdirManager,
+                         unittest.TestCase):
+
+    def _get_simple_index(self, server, base_url="/simple/", hosts=None,
+        *args, **kwargs):
+        """Build and return a SimpleIndex instance pointing at the test
+        server's urls
+        """
+        if hosts is None:
+            hosts = (server.full_address[len("http://"):],)  # host:port only
+        kwargs['hosts'] = hosts
+        return simple.SimpleIndex(server.full_address + base_url, *args,
+            **kwargs)
+
+    def test_bad_urls(self):
+        index = simple.SimpleIndex()
+        url = 'http://127.0.0.1:0/nonesuch/test_simple'
+        try:
+            v = index._open_url(url)
+        except Exception, v:
+            self.assertTrue(url in str(v))
+        else:
+            self.assertTrue(isinstance(v, urllib2.HTTPError))
+
+        # issue 16
+        # easy_install inquant.contentmirror.plone breaks because of a typo
+        # in its home URL
+        index = simple.SimpleIndex(
+            hosts=('www.example.com',))
+
+        url = 'url:%20https://svn.plone.org/svn/collective/inquant.contentmirror.plone/trunk'
+        try:
+            v = index._open_url(url)
+        except Exception, v:
+            self.assertTrue(url in str(v))
+        else:
+            self.assertTrue(isinstance(v, urllib2.HTTPError))
+
+        def _urlopen(*args):
+            import httplib
+            raise httplib.BadStatusLine('line')
+
+        old_urlopen = urllib2.urlopen
+        urllib2.urlopen = _urlopen
+        url = 'http://example.com'
+        try:
+            try:
+                v = index._open_url(url)
+            except Exception, v:
+                self.assertTrue('line' in str(v))
+            else:
+                raise AssertionError('Should have raised here!')
+        finally:
+            urllib2.urlopen = old_urlopen
+
+        # issue 20
+        url = 'http://http://svn.pythonpaste.org/Paste/wphp/trunk'
+        try:
+            index._open_url(url)
+        except Exception, v:
+            self.assertTrue('nonnumeric port' in str(v))
+
+        # issue #160
+        if sys.version_info[0] == 2 and sys.version_info[1] == 7:
+            # this should not fail
+            url = 'http://example.com'
+            page = ('<a href="http://www.famfamfam.com]('
+                    'http://www.famfamfam.com/">')
+            index.process_index(url, page)
+
+    @use_pypi_server("test_found_links")
+    def test_found_links(self, server):
+        """Browse the index, asking for a specified distribution version
+        """
+        # The PyPI index contains links for version 1.0, 1.1, 2.0 and 2.0.1
+        index = self._get_simple_index(server)
+        last_distribution = index.get("foobar")
+
+        # we have scanned the index page
+        self.assertIn(server.full_address + "/simple/foobar/",
+            index._processed_urls)
+
+        # we have found 4 distributions in this page
+        self.assertEqual(len(index._distributions["foobar"]), 4)
+
+        # and returned the most recent one
+        self.assertEqual("%s" % last_distribution.version, '2.0.1')
+
+    def test_is_browsable(self):
+        index = simple.SimpleIndex(follow_externals=False)
+        self.assertTrue(index._is_browsable(index.index_url + "test"))
+
+        # Now, when following externals, we can have a list of hosts to trust.
+        # and don't follow other external links than the one described here.
+        index = simple.SimpleIndex(hosts=["pypi.python.org", "test.org"],
+                                   follow_externals=True)
+        good_urls = (
+            "http://pypi.python.org/foo/bar",
+            "http://pypi.python.org/simple/foobar",
+            "http://test.org",
+            "http://test.org/",
+            "http://test.org/simple/",
+        )
+        bad_urls = (
+            "http://python.org",
+            "http://test.tld",
+        )
+
+        for url in good_urls:
+            self.assertTrue(index._is_browsable(url))
+
+        for url in bad_urls:
+            self.assertFalse(index._is_browsable(url))
+
+        # allow all hosts
+        index = simple.SimpleIndex(follow_externals=True, hosts=("*",))
+        self.assertTrue(index._is_browsable("http://an-external.link/path"))
+        self.assertTrue(index._is_browsable("pypi.test.tld/a/path"))
+
+        # specify a list of hosts we want to allow
+        index = simple.SimpleIndex(follow_externals=True,
+                                   hosts=("*.test.tld",))
+        self.assertFalse(index._is_browsable("http://an-external.link/path"))
+        self.assertTrue(index._is_browsable("http://pypi.test.tld/a/path"))
+
+    @use_pypi_server("with_externals")
+    def test_follow_externals(self, server):
+        """External pages are followed when follow_externals is set
+        """
+        # Request the package index, which contains links to "external"
+        # resources. They have to be scanned too.
+        index = self._get_simple_index(server, follow_externals=True)
+        index.get("foobar")
+        self.assertIn(server.full_address + "/external/external.html",
+            index._processed_urls)
+
+    @use_pypi_server("with_real_externals")
+    def test_restrict_hosts(self, server):
+        """Restricting to a list of allowed hosts is possible
+        """
+        # Test that telling the simple PyPI client not to retrieve
+        # external pages works
+        index = self._get_simple_index(server, follow_externals=False)
+        index.get("foobar")
+        self.assertNotIn(server.full_address + "/external/external.html",
+            index._processed_urls)
+
+    @use_pypi_server("with_egg_files")
+    def test_scan_egg_files(self, server):
+        """Assert that egg files are indexed as well"""
+        pass
+
+    @use_pypi_server(static_filesystem_paths=["with_externals"],
+        static_uri_paths=["simple", "external"])
+    def test_links_priority(self, server):
+        """
+        Download links from the pypi simple index should be used before
+        external download links.
+        http://bitbucket.org/tarek/distribute/issue/163/md5-validation-error
+
+        Use case:
+        - someone uploads a package on pypi, and an md5 is generated
+        - someone manually copies this link (with the md5 in the url) onto
+          an external page accessible from the package page.
+        - someone reuploads the package (with a different md5)
+        - while easy_installing, an MD5 error occurs because the external link
+          is used
+        -> The index should use the link from pypi, not the external one.
+        """
+        # start an index server
+        index_url = server.full_address + '/simple/'
+
+        # scan a test index
+        index = simple.SimpleIndex(index_url, follow_externals=True)
+        dists = index.find("foobar")
+        server.stop()
+
+        # we have only one link, because links are compared without md5
+        self.assertEqual(len(dists), 1)
+        # the link should be from the index
+        self.assertEqual('12345678901234567', dists[0].url['hashval'])
+        self.assertEqual('md5', dists[0].url['hashname'])
+
+    @use_pypi_server(static_filesystem_paths=["with_norel_links"],
+        static_uri_paths=["simple", "external"])
+    def test_not_scan_all_links(self, server):
+        """Do not follow all index page links.
+        Links tagged with neither rel="download" nor rel="homepage" must
+        not be processed by the package index while processing "pages".
+        """
+        # process the pages
+        index = self._get_simple_index(server, follow_externals=True)
+        index.find("foobar")
+        # now it should have processed only pages with links rel="download"
+        # and rel="homepage"
+        self.assertIn("%s/simple/foobar/" % server.full_address,
+            index._processed_urls)  # it's the simple index page
+        self.assertIn("%s/external/homepage.html" % server.full_address,
+            index._processed_urls)  # the external homepage is rel="homepage"
+        self.assertNotIn("%s/external/nonrel.html" % server.full_address,
+            index._processed_urls)  # this link contains no rel=*
+        self.assertNotIn("%s/unrelated-0.2.tar.gz" % server.full_address,
+            index._processed_urls)  # linked from simple index (no rel)
+        self.assertIn("%s/foobar-0.1.tar.gz" % server.full_address,
+            index._processed_urls)  # linked from simple index (rel)
+        self.assertIn("%s/foobar-2.0.tar.gz" % server.full_address,
+            index._processed_urls)  # linked from external homepage (rel)
+
+    def test_uses_mirrors(self):
+        """When the main repository seems down, try using the given mirrors"""
+        server = PyPIServer("foo_bar_baz")
+        mirror = PyPIServer("foo_bar_baz")
+        mirror.start()  # only the mirror is started; the main server stays down
+
+        try:
+            # create the index using both servers
+            index = simple.SimpleIndex(server.full_address + "/simple/",
+                hosts=('*',), timeout=1,  # set the timeout to 1s for the tests
+                mirrors=[mirror.full_address + "/simple/",])
+
+            # this should not raise a timeout
+            self.assertEqual(4, len(index.find("foo")))
+        finally:
+            mirror.stop()
+
+    def test_simple_link_matcher(self):
+        """Test that the simple link matcher yields the right links"""
+        index = simple.SimpleIndex(follow_externals=False)
+
+        # Here, we define:
+        #   1. one link that must be followed, because it's a download link
+        #   2. one link that must *not* be followed, because _is_browsable
+        #      returns false for it
+        #   3. one link that must be followed, because it's a homepage that
+        #      is browsable
+        self.assertTrue(index._is_browsable("%stest" % index.index_url))
+        self.assertFalse(index._is_browsable("http://dl-link2"))
+        content = """
+        <a href="http://dl-link1" rel="download">download_link1</a>
+        <a href="http://dl-link2" rel="homepage">homepage_link1</a>
+        <a href="%stest" rel="homepage">homepage_link2</a>
+        """ % index.index_url
+
+        # Test that the simple link matcher yields the right links.
+        generator = index._simple_link_matcher(content, index.index_url)
+        self.assertEqual(('http://dl-link1', True), generator.next())
+        self.assertEqual(('%stest' % index.index_url, False),
+                         generator.next())
+        self.assertRaises(StopIteration, generator.next)
+
+        # Following external links is possible
+        index.follow_externals = True
+        generator = index._simple_link_matcher(content, index.index_url)
+        self.assertEqual(('http://dl-link1', True), generator.next())
+        self.assertEqual(('http://dl-link2', False), generator.next())
+        self.assertEqual(('%stest' % index.index_url, False),
+                         generator.next())
+        self.assertRaises(StopIteration, generator.next)
+
+    def test_browse_local_files(self):
+        """Test that we can browse local files"""
+        index_path = os.sep.join(["file://" + PYPI_DEFAULT_STATIC_PATH,
+                                  "test_found_links", "simple"])
+        index = simple.SimpleIndex(index_path)
+        dists = index.find("foobar")
+        self.assertEqual(4, len(dists))
+
+def test_suite():
+    return unittest.makeSuite(PyPISimpleTestCase)
+
+if __name__ == '__main__':
+    unittest.main(defaultTest="test_suite")
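The rel="download" / rel="homepage" link filtering these tests exercise can be sketched with the standard library's HTMLParser. This is an illustrative matcher only, not SimpleIndex._simple_link_matcher itself (Python 3 shown):

```python
from html.parser import HTMLParser

class RelLinkCollector(HTMLParser):
    """Collect (href, is_download) pairs for links carrying
    rel="download" or rel="homepage"; other links are ignored."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("rel") in ("download", "homepage"):
            self.links.append((attrs["href"], attrs["rel"] == "download"))

content = """
<a href="http://dl-link1" rel="download">download_link1</a>
<a href="http://dl-link2" rel="homepage">homepage_link1</a>
<a href="http://no-rel">unrelated link</a>
"""
collector = RelLinkCollector()
collector.feed(content)
```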
diff --git a/src/distutils2/tests/test_upload.py b/src/distutils2/tests/test_upload.py
--- a/src/distutils2/tests/test_upload.py
+++ b/src/distutils2/tests/test_upload.py
@@ -1,34 +1,15 @@
 """Tests for distutils.command.upload."""
 # -*- encoding: utf8 -*-
-import sys
-import os
+import os, sys
 
-from distutils2.command import upload as upload_mod
 from distutils2.command.upload import upload
 from distutils2.core import Distribution
 
+from distutils2.tests.pypi_server import PyPIServer, PyPIServerTestCase
 from distutils2.tests import support
 from distutils2.tests.support import unittest
 from distutils2.tests.test_config import PYPIRC, PyPIRCCommandTestCase
 
-PYPIRC_LONG_PASSWORD = """\
-[distutils]
-
-index-servers =
-    server1
-    server2
-
-[server1]
-username:me
-password:aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
-
-[server2]
-username:meagain
-password: secret
-realm:acme
-repository:http://another.pypi/
-"""
-
 
 PYPIRC_NOPASSWORD = """\
 [distutils]
@@ -40,47 +21,18 @@
 username:me
 """
 
-class FakeOpen(object):
-
-    def __init__(self, url):
-        self.url = url
-        if not isinstance(url, str):
-            self.req = url
-        else:
-            self.req = None
-        self.msg = 'OK'
-
-    def getcode(self):
-        return 200
-
-
-class uploadTestCase(PyPIRCCommandTestCase):
-
-    def setUp(self):
-        super(uploadTestCase, self).setUp()
-        self.old_open = upload_mod.urlopen
-        upload_mod.urlopen = self._urlopen
-        self.last_open = None
-
-    def tearDown(self):
-        upload_mod.urlopen = self.old_open
-        super(uploadTestCase, self).tearDown()
-
-    def _urlopen(self, url):
-        self.last_open = FakeOpen(url)
-        return self.last_open
+class UploadTestCase(PyPIServerTestCase, PyPIRCCommandTestCase):
 
     def test_finalize_options(self):
-
         # new format
         self.write_file(self.rc, PYPIRC)
         dist = Distribution()
         cmd = upload(dist)
         cmd.finalize_options()
-        for attr, expected in (('username', 'me'), ('password', 'secret'),
-                               ('realm', 'pypi'),
-                               ('repository', 'http://pypi.python.org/pypi')):
-            self.assertEqual(getattr(cmd, attr), expected)
+        for attr, expected in (('username', 'me'), ('password', 'secret'),
+                               ('realm', 'pypi'),
+                               ('repository', 'http://pypi.python.org/pypi')):
+            self.assertEqual(getattr(cmd, attr), expected)
 
     def test_saved_password(self):
         # file with no password
@@ -89,7 +41,7 @@
         # make sure it passes
         dist = Distribution()
         cmd = upload(dist)
-        cmd.finalize_options()
+        cmd.ensure_finalized()
         self.assertEqual(cmd.password, None)
 
         # make sure we get it as well, if another command
@@ -100,33 +52,31 @@
         self.assertEqual(cmd.password, 'xxx')
 
     def test_upload(self):
-        tmp = self.mkdtemp()
-        path = os.path.join(tmp, 'xxx')
+        path = os.path.join(self.tmp_dir, 'xxx')
         self.write_file(path)
         command, pyversion, filename = 'xxx', '2.6', path
         dist_files = [(command, pyversion, filename)]
-        self.write_file(self.rc, PYPIRC_LONG_PASSWORD)
 
         # lets run it
         pkg_dir, dist = self.create_dist(dist_files=dist_files, author=u'dédé')
         cmd = upload(dist)
         cmd.ensure_finalized()
+        cmd.repository = self.pypi.full_address
         cmd.run()
 
         # what did we send ?
-        self.assertIn('dédé', self.last_open.req.data)
-        headers = dict(self.last_open.req.headers)
-        self.assertTrue(int(headers['Content-length']) < 2000)
-        self.assertTrue(headers['Content-type'].startswith('multipart/form-data'))
-        self.assertEqual(self.last_open.req.get_method(), 'POST')
-        self.assertEqual(self.last_open.req.get_full_url(),
-                          'http://pypi.python.org/pypi')
-        self.assertTrue('xxx' in self.last_open.req.data)
-        auth = self.last_open.req.headers['Authorization']
-        self.assertFalse('\n' in auth)
+        handler, request_data = self.pypi.requests[-1]
+        headers = handler.headers.dict
+        self.assertIn('dédé', request_data)
+        self.assertIn('xxx', request_data)
+        self.assertEqual(int(headers['content-length']), len(request_data))
+        self.assertTrue(int(headers['content-length']) < 2000)
+        self.assertTrue(headers['content-type'].startswith('multipart/form-data'))
+        self.assertEqual(handler.command, 'POST')
+        self.assertNotIn('\n', headers['authorization'])
 
 def test_suite():
-    return unittest.makeSuite(uploadTestCase)
+    return unittest.makeSuite(UploadTestCase)
 
 if __name__ == "__main__":
     unittest.main(defaultTest="test_suite")
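The assertNotIn('\n', headers['authorization']) check above guards against a classic pitfall: base64 encoders that wrap long output with newlines, which corrupts the HTTP Authorization header for long passwords (the motivation behind the removed PYPIRC_LONG_PASSWORD fixture). A quick stdlib illustration (Python 3):

```python
import base64

# credentials long enough to push base64 output past its 76-char wrap point
creds = ("me:" + "a" * 64).encode("ascii")

wrapped = base64.encodebytes(creds).decode("ascii")  # inserts "\n" line breaks
clean = base64.b64encode(creds).decode("ascii")      # always a single line
```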
diff --git a/src/distutils2/tests/test_upload_docs.py b/src/distutils2/tests/test_upload_docs.py
new file mode 100644
--- /dev/null
+++ b/src/distutils2/tests/test_upload_docs.py
@@ -0,0 +1,198 @@
+"""Tests for distutils.command.upload_docs."""
+# -*- encoding: utf8 -*-
+import httplib, os, os.path, shutil, sys, tempfile, zipfile
+from cStringIO import StringIO
+
+from distutils2.command import upload_docs as upload_docs_mod
+from distutils2.command.upload_docs import (upload_docs, zip_dir,
+                                    encode_multipart)
+from distutils2.core import Distribution
+
+from distutils2.errors import DistutilsFileError, DistutilsOptionError
+
+from distutils2.tests import support
+from distutils2.tests.pypi_server import PyPIServer, PyPIServerTestCase
+from distutils2.tests.test_config import PyPIRCCommandTestCase
+from distutils2.tests.support import unittest
+
+
+EXPECTED_MULTIPART_OUTPUT = "\r\n".join([
+'---x',
+'Content-Disposition: form-data; name="a"',
+'',
+'b',
+'---x',
+'Content-Disposition: form-data; name="c"',
+'',
+'d',
+'---x',
+'Content-Disposition: form-data; name="e"; filename="f"',
+'',
+'g',
+'---x',
+'Content-Disposition: form-data; name="h"; filename="i"',
+'',
+'j',
+'---x--',
+'',
+])
+
+PYPIRC = """\
+[distutils]
+index-servers = server1
+
+[server1]
+repository = %s
+username = real_slim_shady
+password = long_island
+"""
+
+class UploadDocsTestCase(PyPIServerTestCase, PyPIRCCommandTestCase):
+
+    def setUp(self):
+        super(UploadDocsTestCase, self).setUp()
+        self.dist = Distribution()
+        self.dist.metadata['Name'] = "distr-name"
+        self.cmd = upload_docs(self.dist)
+
+    def test_default_uploaddir(self):
+        sandbox = tempfile.mkdtemp()
+        previous = os.getcwd()
+        os.chdir(sandbox)
+        try:
+            os.mkdir("build")
+            self.prepare_sample_dir("build")
+            self.cmd.ensure_finalized()
+            self.assertEqual(self.cmd.upload_dir, os.path.join("build", "docs"))
+        finally:
+            os.chdir(previous)
+
+    def prepare_sample_dir(self, sample_dir=None):
+        if sample_dir is None:
+            sample_dir = tempfile.mkdtemp()
+        os.mkdir(os.path.join(sample_dir, "docs"))
+        self.write_file(os.path.join(sample_dir, "docs", "index.html"), "Ce mortel ennui")
+        self.write_file(os.path.join(sample_dir, "index.html"), "Oh la la")
+        return sample_dir
+
+    def test_zip_dir(self):
+        source_dir = self.prepare_sample_dir()
+        compressed = zip_dir(source_dir)
+
+        zip_f = zipfile.ZipFile(compressed)
+        self.assertEqual(zip_f.namelist(), ['index.html', 'docs/index.html'])
+
+    def test_encode_multipart(self):
+        fields = [("a", "b"), ("c", "d")]
+        files = [("e", "f", "g"), ("h", "i", "j")]
+        content_type, body = encode_multipart(fields, files, "-x")
+        self.assertEqual(content_type, "multipart/form-data; boundary=-x")
+        self.assertEqual(body, EXPECTED_MULTIPART_OUTPUT)
+
+    def prepare_command(self):
+        self.cmd.upload_dir = self.prepare_sample_dir()
+        self.cmd.ensure_finalized()
+        self.cmd.repository = self.pypi.full_address
+        self.cmd.username = "username"
+        self.cmd.password = "password"
+
+    def test_upload(self):
+        self.prepare_command()
+        self.cmd.run()
+
+        self.assertEqual(len(self.pypi.requests), 1)
+        handler, request_data = self.pypi.requests[-1]
+        self.assertIn("content", request_data)
+        self.assertIn("Basic", handler.headers.dict['authorization'])
+        self.assertTrue(handler.headers.dict['content-type']
+            .startswith('multipart/form-data;'))
+
+        action, name, content =\
+            request_data.split("----------------GHSKFJDLGDS7543FJKLFHRE75642756743254")[1:4]
+
+        # check that we picked the right chunks
+        self.assertIn('name=":action"', action)
+        self.assertIn('name="name"', name)
+        self.assertIn('name="content"', content)
+
+        # check their contents
+        self.assertIn("doc_upload", action)
+        self.assertIn("distr-name", name)
+        self.assertIn("docs/index.html", content)
+        self.assertIn("Ce mortel ennui", content)
+
+    def test_https_connection(self):
+        https_called = []
+        orig_https = upload_docs_mod.httplib.HTTPSConnection
+        def https_conn_wrapper(*args):
+            # record the call in a mutable container; a bare assignment
+            # here would only create a local and never reach this flag
+            https_called.append(True)
+            return upload_docs_mod.httplib.HTTPConnection(*args) # the testing server is http
+        upload_docs_mod.httplib.HTTPSConnection = https_conn_wrapper
+        try:
+            self.prepare_command()
+            self.cmd.run()
+            self.assertFalse(https_called)
+
+            self.cmd.repository = self.cmd.repository.replace("http", "https")
+            self.cmd.run()
+            self.assertTrue(https_called)
+        finally:
+            upload_docs_mod.httplib.HTTPSConnection = orig_https
+
+    def test_handling_response(self):
+        calls = []
+        def aggr(*args):
+            calls.append(args)
+        self.pypi.default_response_status = '403 Forbidden'
+        self.prepare_command()
+        self.cmd.announce = aggr
+        self.cmd.run()
+        message, _ = calls[-1]
+        self.assertIn('Upload failed (403): Forbidden', message)
+
+        calls = []
+        self.pypi.default_response_status = '301 Moved Permanently'
+        self.pypi.default_response_headers.append(("Location", "brand_new_location"))
+        self.cmd.run()
+        message, _ = calls[-1]
+        self.assertIn('brand_new_location', message)
+
+    def test_reads_pypirc_data(self):
+        self.write_file(self.rc, PYPIRC % self.pypi.full_address)
+        self.cmd.repository = self.pypi.full_address
+        self.cmd.upload_dir = self.prepare_sample_dir()
+        self.cmd.ensure_finalized()
+        self.assertEqual(self.cmd.username, "real_slim_shady")
+        self.assertEqual(self.cmd.password, "long_island")
+
+    def test_checks_index_html_presence(self):
+        self.cmd.upload_dir = self.prepare_sample_dir()
+        os.remove(os.path.join(self.cmd.upload_dir, "index.html"))
+        self.assertRaises(DistutilsFileError, self.cmd.ensure_finalized)
+
+    def test_checks_upload_dir(self):
+        self.cmd.upload_dir = self.prepare_sample_dir()
+        shutil.rmtree(os.path.join(self.cmd.upload_dir))
+        self.assertRaises(DistutilsOptionError, self.cmd.ensure_finalized)
+
+    def test_show_response(self):
+        orig_stdout = sys.stdout
+        write_args = []
+        class MockStdout(object):
+            def write(self, arg):
+                write_args.append(arg)
+        sys.stdout = MockStdout()
+        try:
+            self.prepare_command()
+            self.cmd.show_response = True
+            self.cmd.run()
+        finally:
+            sys.stdout = orig_stdout
+        self.assertTrue(write_args[0], "should report the response")
+        self.assertIn(self.pypi.default_response_data + "\n", write_args[0])
+
+def test_suite():
+    return unittest.makeSuite(UploadDocsTestCase)
+
+if __name__ == "__main__":
+    unittest.main(defaultTest="test_suite")
diff --git a/src/distutils2/tests/test_version.py b/src/distutils2/tests/test_version.py
--- a/src/distutils2/tests/test_version.py
+++ b/src/distutils2/tests/test_version.py
@@ -154,6 +154,17 @@
         # XXX need to silent the micro version in this case
         #assert not VersionPredicate('Ho (<3.0,!=2.6)').match('2.6.3')
 
+    def test_is_final(self):
+        # NormalizedVersion knows whether a version is a final release or not.
+        final_versions = ('1.0', '1.0.post456')
+        other_versions = ('1.0.dev1', '1.0a2', '1.0c3')
+
+        for version in final_versions:
+            self.assertTrue(V(version).is_final)
+        for version in other_versions:
+            self.assertFalse(V(version).is_final)
+
+
 def test_suite():
     #README = os.path.join(os.path.dirname(__file__), 'README.txt')
     #suite = [doctest.DocFileSuite(README), unittest.makeSuite(VersionTestCase)]
@@ -162,4 +173,3 @@
 
 if __name__ == "__main__":
     unittest.main(defaultTest="test_suite")
-
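The ``is_final`` semantics exercised by ``test_is_final`` above can be approximated with a standalone check (a hedged sketch, not the distutils2 code): a version is final unless it carries an alpha/beta/rc or ``.dev`` marker, while a ``.post`` suffix alone keeps it final.

```python
import re

# matches pre-release markers (1.0a2, 1.0c3) and dev markers (1.0.dev1);
# '.post' alone does not match, so post-releases stay final
_NON_FINAL = re.compile(r'(a|b|c|rc)\d+|\.dev\d+')

def looks_final(version):
    """Return True if the version string has no pre-release or dev marker."""
    return _NON_FINAL.search(version) is None
```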
diff --git a/src/distutils2/version.py b/src/distutils2/version.py
--- a/src/distutils2/version.py
+++ b/src/distutils2/version.py
@@ -36,6 +36,7 @@
     (?P<postdev>(\.post(?P<post>\d+))?(\.dev(?P<dev>\d+))?)?
     $''', re.VERBOSE)
 
+
 class NormalizedVersion(object):
     """A rational version.
 
@@ -61,7 +62,8 @@
         @param error_on_huge_major_num {bool} Whether to consider an
             apparent use of a year or full date as the major version number
             an error. Default True. One of the observed patterns on PyPI before
-            the introduction of `NormalizedVersion` was version numbers like this:
+            the introduction of `NormalizedVersion` was version numbers like
+            this:
                 2009.01.03
                 20040603
                 2005.01
@@ -71,6 +73,7 @@
             the possibility of using a version number like "1.0" (i.e.
             where the major number is less than that huge major number).
         """
+        self.is_final = True  # by default, consider a version as final.
         self._parse(s, error_on_huge_major_num)
 
     @classmethod
@@ -101,6 +104,7 @@
             block += self._parse_numdots(groups.get('prerelversion'), s,
                                          pad_zeros_length=1)
             parts.append(tuple(block))
+            self.is_final = False
         else:
             parts.append(_FINAL_MARKER)
 
@@ -115,6 +119,7 @@
                     postdev.append(_FINAL_MARKER[0])
             if dev is not None:
                 postdev.extend(['dev', int(dev)])
+                self.is_final = False
             parts.append(tuple(postdev))
         else:
             parts.append(_FINAL_MARKER)
@@ -204,6 +209,7 @@
     # See http://docs.python.org/reference/datamodel#object.__hash__
     __hash__ = object.__hash__
 
+
 def suggest_normalized_version(s):
     """Suggest a normalized version close to the given version string.
 
@@ -215,7 +221,7 @@
     on observation of versions currently in use on PyPI. Given a dump of
     those version during PyCon 2009, 4287 of them:
     - 2312 (53.93%) match NormalizedVersion without change
-    - with the automatic suggestion
+      with the automatic suggestion
     - 3474 (81.04%) match when using this suggestion method
 
     @param s {str} An irrational version string.
@@ -305,7 +311,6 @@
     # PyPI stats: ~21 (0.62%) better
     rs = re.sub(r"\.?(pre|preview|-c)(\d+)$", r"c\g<2>", rs)
 
-
     # Tcl/Tk uses "px" for their post release markers
     rs = re.sub(r"p(\d+)$", r".post\1", rs)
 
@@ -322,6 +327,7 @@
 _PLAIN_VERSIONS = re.compile(r"^\s*(.*)\s*$")
 _SPLIT_CMP = re.compile(r"^\s*(<=|>=|<|>|!=|==)\s*([^\s,]+)\s*$")
 
+
 def _split_predicate(predicate):
     match = _SPLIT_CMP.match(predicate)
     if match is None:
@@ -368,6 +374,7 @@
                 return False
         return True
 
+
 class _Versions(VersionPredicate):
     def __init__(self, predicate):
         predicate = predicate.strip()
@@ -379,6 +386,7 @@
         self.predicates = [_split_predicate(pred.strip())
                            for pred in predicates.split(',')]
 
+
 class _Version(VersionPredicate):
     def __init__(self, predicate):
         predicate = predicate.strip()
@@ -388,6 +396,7 @@
         self.name = None
         self.predicates = _split_predicate(match.groups()[0])
 
+
 def is_valid_predicate(predicate):
     try:
         VersionPredicate(predicate)
@@ -396,6 +405,7 @@
     else:
         return True
 
+
 def is_valid_versions(predicate):
     try:
         _Versions(predicate)
@@ -404,6 +414,7 @@
     else:
         return True
 
+
 def is_valid_version(predicate):
     try:
         _Version(predicate)
@@ -411,4 +422,3 @@
         return False
     else:
         return True
-
diff --git a/src/setup.py b/src/setup.py
--- a/src/setup.py
+++ b/src/setup.py
@@ -3,8 +3,12 @@
 __revision__ = "$Id$"
 import sys
 import os
+import re
 
-from distutils2.core import setup
+from distutils2 import log
+from distutils2.core import setup, Extension
+from distutils2.compiler.ccompiler import new_compiler
 from distutils2.command.sdist import sdist
 from distutils2.command.install import install
 from distutils2 import __version__ as VERSION
@@ -31,6 +35,7 @@
 
 DEV_SUFFIX = '.dev%d' % get_tip_revision('..')
 
+
 class install_hg(install):
 
     user_options = install.user_options + [
@@ -62,10 +67,141 @@
             self.distribution.metadata.version += DEV_SUFFIX
         sdist.run(self)
 
+
+# additional paths to check, set from the command line
+SSL_INCDIR = ''   # --openssl-incdir=
+SSL_LIBDIR = ''   # --openssl-libdir=
+SSL_DIR = ''      # --openssl-prefix=
+
+def add_dir_to_list(dirlist, dir):
+    """Add the directory 'dir' to the list 'dirlist' (at the front) if
+    'dir' actually exists and is a directory.  If 'dir' is already in
+    'dirlist' it is moved to the front.
+    """
+    if dir is not None and os.path.isdir(dir):
+        if dir in dirlist:
+            dirlist.remove(dir)
+        dirlist.insert(0, dir)
+
+
+def prepare_hashlib_extensions():
+    """Decide which C extensions to build and create the appropriate
+    Extension objects to build them.  Return a list of Extensions.
+    """
+    # this CCompiler object is only used to locate include files
+    compiler = new_compiler()
+
+    # Ensure that these paths are always checked
+    if os.name == 'posix':
+        add_dir_to_list(compiler.library_dirs, '/usr/local/lib')
+        add_dir_to_list(compiler.include_dirs, '/usr/local/include')
+
+        add_dir_to_list(compiler.library_dirs, '/usr/local/ssl/lib')
+        add_dir_to_list(compiler.include_dirs, '/usr/local/ssl/include')
+
+        add_dir_to_list(compiler.library_dirs, '/usr/contrib/ssl/lib')
+        add_dir_to_list(compiler.include_dirs, '/usr/contrib/ssl/include')
+
+        add_dir_to_list(compiler.library_dirs, '/usr/lib')
+        add_dir_to_list(compiler.include_dirs, '/usr/include')
+
+    # look in command line supplied paths
+    if SSL_LIBDIR:
+        add_dir_to_list(compiler.library_dirs, SSL_LIBDIR)
+    if SSL_INCDIR:
+        add_dir_to_list(compiler.include_dirs, SSL_INCDIR)
+    if SSL_DIR:
+        if os.name == 'nt':
+            add_dir_to_list(compiler.library_dirs, os.path.join(SSL_DIR, 'out32dll'))
+            # prefer the static library
+            add_dir_to_list(compiler.library_dirs, os.path.join(SSL_DIR, 'out32'))
+        else:
+            add_dir_to_list(compiler.library_dirs, os.path.join(SSL_DIR, 'lib'))
+        add_dir_to_list(compiler.include_dirs, os.path.join(SSL_DIR, 'include'))
+
+    oslibs = {'posix': ['ssl', 'crypto'],
+              'nt': ['libeay32',  'gdi32', 'advapi32', 'user32']}
+
+    if os.name not in oslibs:
+        sys.stderr.write(
+            'unknown operating system, impossible to compile _hashlib\n')
+        sys.exit(1)
+
+    exts = []
+
+    ssl_inc_dirs = []
+    ssl_incs = []
+    for inc_dir in compiler.include_dirs:
+        f = os.path.join(inc_dir, 'openssl', 'ssl.h')
+        if os.path.exists(f):
+            ssl_incs.append(f)
+            ssl_inc_dirs.append(inc_dir)
+
+    ssl_lib = compiler.find_library_file(compiler.library_dirs, oslibs[os.name][0])
+
+    # find out which version of OpenSSL we have
+    openssl_ver = 0
+    openssl_ver_re = re.compile(
+        r'^\s*#\s*define\s+OPENSSL_VERSION_NUMBER\s+(0x[0-9a-fA-F]+)')
+    ssl_inc_dir = ''
+    for ssl_inc_dir in ssl_inc_dirs:
+        name = os.path.join(ssl_inc_dir, 'openssl', 'opensslv.h')
+        if os.path.isfile(name):
+            try:
+                incfile = open(name, 'r')
+                try:
+                    for line in incfile:
+                        m = openssl_ver_re.match(line)
+                        if m:
+                            openssl_ver = int(m.group(1), 16)
+                            break
+                finally:
+                    incfile.close()
+            except IOError:
+                pass
+
+        # first version found is what we'll use
+        if openssl_ver:
+            break
+
+    if (ssl_inc_dir and ssl_lib is not None and openssl_ver >= 0x00907000):
+
+        log.info('Using OpenSSL version 0x%08x from', openssl_ver)
+        log.info(' Headers:\t%s', ssl_inc_dir)
+        log.info(' Library:\t%s', ssl_lib)
+
+        # The _hashlib module wraps optimized implementations
+        # of hash functions from the OpenSSL library.
+        exts.append(Extension('_hashlib', ['_hashopenssl.c'],
+                              include_dirs = [ssl_inc_dir],
+                              library_dirs = [os.path.dirname(ssl_lib)],
+                              libraries = oslibs[os.name]))
+    else:
+        exts.append(Extension('_sha', ['shamodule.c']))
+        exts.append(Extension('_md5',
+                              sources=['md5module.c', 'md5.c'],
+                              depends=['md5.h']))
+
+    if (not ssl_lib or openssl_ver < 0x00908000):
+        # OpenSSL doesn't do these until 0.9.8 so we'll bring our own
+        exts.append(Extension('_sha256', ['sha256module.c']))
+        exts.append(Extension('_sha512', ['sha512module.c']))
+
+    def prepend_modules(filename):
+        return os.path.join('Modules', filename)
+
+    # all the C code is in the Modules subdirectory, prepend the path
+    for ext in exts:
+        ext.sources = [prepend_modules(fn) for fn in ext.sources]
+        if hasattr(ext, 'depends') and ext.depends is not None:
+            ext.depends = [prepend_modules(fn) for fn in ext.depends]
+
+    return exts
+
 setup_kwargs = {}
 if sys.version < '2.6':
     setup_kwargs['scripts'] = ['distutils2/mkpkg.py']
 
+if sys.version < '2.5':
+    setup_kwargs['ext_modules'] = prepare_hashlib_extensions()
+
 _CLASSIFIERS = """\
 Development Status :: 3 - Alpha
 Intended Audience :: Developers
@@ -77,26 +213,23 @@
 Topic :: System :: Systems Administration
 Topic :: Utilities"""
 
-setup (name="Distutils2",
-       version=VERSION,
-       summary="Python Distribution Utilities",
-       keywords=['packaging', 'distutils'],
-       author="Tarek Ziade",
-       author_email="tarek at ziade.org",
-       home_page="http://bitbucket.org/tarek/distutils2/wiki/Home",
-       license="PSF",
-       description=README,
-       classifier=_CLASSIFIERS.split('\n'),
-       packages=find_packages(),
-       cmdclass={'sdist': sdist_hg, 'install': install_hg},
-       package_data={'distutils2._backport': ['sysconfig.cfg']},
-       project_url=[('Mailing list',
+setup(name="Distutils2",
+      version=VERSION,
+      summary="Python Distribution Utilities",
+      keywords=['packaging', 'distutils'],
+      author="Tarek Ziade",
+      author_email="tarek at ziade.org",
+      home_page="http://bitbucket.org/tarek/distutils2/wiki/Home",
+      license="PSF",
+      description=README,
+      classifier=_CLASSIFIERS.split('\n'),
+      packages=find_packages(),
+      cmdclass={'sdist': sdist_hg, 'install': install_hg},
+      package_data={'distutils2._backport': ['sysconfig.cfg']},
+      project_url=[('Mailing-list',
                     'http://mail.python.org/mailman/listinfo/distutils-sig/'),
-                    ('Documentation',
-                     'http://packages.python.org/Distutils2'),
-                    ('Repository', 'http://hg.python.org/distutils2'),
-                    ('Bug tracker', 'http://bugs.python.org')],
-       **setup_kwargs
-       )
-
-
+                   ('Documentation',
+                    'http://packages.python.org/Distutils2'),
+                   ('Repository', 'http://hg.python.org/distutils2'),
+                   ('Bug tracker', 'http://bugs.python.org')],
+      **setup_kwargs)
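The OpenSSL probe in ``prepare_hashlib_extensions()`` boils down to scanning ``opensslv.h`` for the ``OPENSSL_VERSION_NUMBER`` define, a hex constant in ``MNNFFPPS`` form (so ``0x00907000`` is OpenSSL 0.9.7). A standalone sketch of that step (function name is illustrative):

```python
import re

OPENSSL_VER_RE = re.compile(
    r'^\s*#\s*define\s+OPENSSL_VERSION_NUMBER\s+(0x[0-9a-fA-F]+)')

def parse_openssl_version(header_text):
    """Return OPENSSL_VERSION_NUMBER as an int, or 0 if absent."""
    for line in header_text.splitlines():
        m = OPENSSL_VER_RE.match(line)
        if m:
            # group(1) keeps the '0x' prefix; int(..., 16) accepts it
            return int(m.group(1), 16)
    return 0
```

The setup script then compares the result against thresholds such as ``0x00907000`` to decide whether ``_hashlib`` can be built against the installed library.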

--
Repository URL: http://hg.python.org/distutils2

