[Python-checkins] distutils2 (python3): Start a branch to provide Distutils2 for Python 3.

eric.araujo python-checkins at python.org
Sun Sep 25 01:42:30 CEST 2011


http://hg.python.org/distutils2/rev/d9ce66d22bb5
changeset:   1194:d9ce66d22bb5
branch:      python3
user:        Éric Araujo <merwok at netwok.org>
date:        Sat Sep 24 01:06:28 2011 +0200
summary:
  Start a branch to provide Distutils2 for Python 3.

This codebase is compatible with 3.1, 3.2 and 3.3.  It was converted with 2to3
and a semi-automated diff/merge with packaging in 3.3 to fix some idioms.
We’ve now come full circle from 2.x to 3.x to 2.x to 3.x again :)
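As a rough illustration (the snippets below are the editor's examples, not code taken from this changeset), these are the kinds of idiom rewrites 2to3 applies when converting a 2.x codebase like distutils2:

```python
# Editor's sketch of typical 2to3 rewrites, for illustration only.

def read_or_report(path):
    # Python 2 wrote: except IOError, exc:
    # 2to3 rewrites the comma form to the 'as' form.
    try:
        with open(path) as f:
            return f.read()
    except IOError as exc:
        return "error: %s" % exc

d = {"name": "distutils2"}
# Python 2 wrote: d.has_key("name"); 2to3 rewrites it to the 'in' operator.
present = "name" in d

# Python 2 wrote: print present; 2to3 adds the call parentheses.
print(present)
```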

Starting from now, contributors can make patches for packaging (preferred, as
the stdlib’s regrtest is very useful), distutils2 or distutils-python3, and
we’ll make patches flow between versions.

files:
  PC/build_ssl.py                                   |  282 ---
  distutils2/_backport/_hashopenssl.c               |  524 ------
  distutils2/_backport/hashlib.py                   |  143 -
  distutils2/_backport/md5.c                        |  381 ----
  distutils2/_backport/md5.h                        |   91 -
  distutils2/_backport/md5module.c                  |  312 ----
  distutils2/_backport/sha256module.c               |  701 ---------
  distutils2/_backport/sha512module.c               |  769 ----------
  distutils2/_backport/shamodule.c                  |  593 -------
  distutils2/_backport/shutil.py                    |   39 +-
  distutils2/_backport/sysconfig.cfg                |    4 +-
  distutils2/_backport/sysconfig.py                 |   46 +-
  distutils2/_backport/tarfile.py                   |  304 +--
  distutils2/_backport/tests/test_shutil.py         |   17 +-
  distutils2/_backport/tests/test_sysconfig.py      |   54 +-
  distutils2/command/bdist_msi.py                   |   10 +-
  distutils2/command/bdist_wininst.py               |   36 +-
  distutils2/command/build_clib.py                  |    4 +-
  distutils2/command/build_ext.py                   |   35 +-
  distutils2/command/build_py.py                    |    4 +-
  distutils2/command/build_scripts.py               |   18 +-
  distutils2/command/cmd.py                         |   12 +-
  distutils2/command/config.py                      |   23 +-
  distutils2/command/install_data.py                |    2 +-
  distutils2/command/install_dist.py                |   56 +-
  distutils2/command/install_distinfo.py            |   26 +-
  distutils2/command/install_lib.py                 |    2 +-
  distutils2/command/install_scripts.py             |    2 +-
  distutils2/command/register.py                    |   37 +-
  distutils2/command/sdist.py                       |    8 +-
  distutils2/command/upload.py                      |   29 +-
  distutils2/command/upload_docs.py                 |   18 +-
  distutils2/compat.py                              |  126 +-
  distutils2/compiler/__init__.py                   |    4 +-
  distutils2/compiler/bcppcompiler.py               |   12 +-
  distutils2/compiler/ccompiler.py                  |   21 +-
  distutils2/compiler/cygwinccompiler.py            |   11 +-
  distutils2/compiler/extension.py                  |    6 +-
  distutils2/compiler/msvc9compiler.py              |   16 +-
  distutils2/compiler/msvccompiler.py               |   12 +-
  distutils2/compiler/unixccompiler.py              |    8 +-
  distutils2/config.py                              |   31 +-
  distutils2/create.py                              |  120 +-
  distutils2/database.py                            |   38 +-
  distutils2/depgraph.py                            |   49 +-
  distutils2/dist.py                                |   36 +-
  distutils2/fancy_getopt.py                        |   10 +-
  distutils2/install.py                             |   21 +-
  distutils2/manifest.py                            |   18 +-
  distutils2/markers.py                             |   18 +-
  distutils2/metadata.py                            |   25 +-
  distutils2/pypi/base.py                           |    2 +-
  distutils2/pypi/dist.py                           |   24 +-
  distutils2/pypi/simple.py                         |   73 +-
  distutils2/pypi/wrapper.py                        |    8 +-
  distutils2/pypi/xmlrpc.py                         |    6 +-
  distutils2/run.py                                 |   67 +-
  distutils2/tests/__init__.py                      |   10 +-
  distutils2/tests/__main__.py                      |    2 +-
  distutils2/tests/pypi_server.py                   |   36 +-
  distutils2/tests/pypi_test_server.py              |    2 +-
  distutils2/tests/support.py                       |   31 +-
  distutils2/tests/test_command_build_clib.py       |    2 +-
  distutils2/tests/test_command_build_ext.py        |   19 +-
  distutils2/tests/test_command_build_py.py         |    4 +-
  distutils2/tests/test_command_build_scripts.py    |    5 +-
  distutils2/tests/test_command_config.py           |    5 +-
  distutils2/tests/test_command_install_dist.py     |   26 +-
  distutils2/tests/test_command_install_distinfo.py |   45 +-
  distutils2/tests/test_command_install_lib.py      |    5 +-
  distutils2/tests/test_command_install_scripts.py  |    5 +-
  distutils2/tests/test_command_register.py         |   39 +-
  distutils2/tests/test_command_sdist.py            |   30 +-
  distutils2/tests/test_command_test.py             |    6 +-
  distutils2/tests/test_command_upload.py           |   14 +-
  distutils2/tests/test_command_upload_docs.py      |   32 +-
  distutils2/tests/test_compiler.py                 |    4 +-
  distutils2/tests/test_config.py                   |   14 +-
  distutils2/tests/test_create.py                   |   29 +-
  distutils2/tests/test_database.py                 |   63 +-
  distutils2/tests/test_depgraph.py                 |    2 +-
  distutils2/tests/test_dist.py                     |   16 +-
  distutils2/tests/test_install.py                  |    6 +-
  distutils2/tests/test_manifest.py                 |   12 +-
  distutils2/tests/test_markers.py                  |    3 +-
  distutils2/tests/test_metadata.py                 |   64 +-
  distutils2/tests/test_mixin2to3.py                |   33 +-
  distutils2/tests/test_msvc9compiler.py            |   10 +-
  distutils2/tests/test_pypi_server.py              |   33 +-
  distutils2/tests/test_pypi_simple.py              |   59 +-
  distutils2/tests/test_pypi_xmlrpc.py              |    5 +-
  distutils2/tests/test_run.py                      |   11 +-
  distutils2/tests/test_uninstall.py                |    2 +-
  distutils2/tests/test_util.py                     |   72 +-
  distutils2/tests/xxmodule.c                       |  507 +++---
  distutils2/util.py                                |  132 +-
  distutils2/version.py                             |    8 +-
  runtests.py                                       |    6 -
  setup.py                                          |  143 +-
  tests.sh                                          |   29 +-
  100 files changed, 1215 insertions(+), 5710 deletions(-)
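Most of the 5710 deleted lines above are the `distutils2/_backport` hash modules (`hashlib.py`, the md5/sha C sources, `_hashopenssl.c`), which existed only to support 2.x interpreters older than 2.5. A minimal sketch (editor's illustration, not code from this changeset) of why the python3 branch can drop them and rely on the stdlib:

```python
# Every Python 3 release ships hashlib, so no backport is needed.
import hashlib

digest = hashlib.sha256(b"distutils2").hexdigest()
print(len(digest))  # a SHA-256 hex digest is always 64 characters
```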


diff --git a/PC/build_ssl.py b/PC/build_ssl.py
deleted file mode 100644
--- a/PC/build_ssl.py
+++ /dev/null
@@ -1,282 +0,0 @@
-# Script for building the _ssl and _hashlib modules for Windows.
-# Uses Perl to set up the OpenSSL environment correctly
-# and build OpenSSL, then invokes a simple nmake session
-# for the actual _ssl.pyd and _hashlib.pyd DLLs.
-
-# THEORETICALLY, you can:
-# * Unpack the latest SSL release one level above your main Python source
-#   directory.  It is likely you will already find the zlib library and
-#   any other external packages there.
-# * Install ActivePerl and ensure it is somewhere on your path.
-# * Run this script from the PCBuild directory.
-#
-# it should configure and build SSL, then build the _ssl and _hashlib
-# Python extensions without intervention.
-
-# Modified by Christian Heimes
-# Now this script supports pre-generated makefiles and assembly files.
-# Developers don't need an installation of Perl anymore to build Python. A svn
-# checkout from our svn repository is enough.
-#
-# In order to create the files in the case of an update you still need Perl.
-# Run build_ssl in this order:
-# python.exe build_ssl.py Release x64
-# python.exe build_ssl.py Release Win32
-
-import os, sys, re, shutil
-
-# Find all "foo.exe" files on the PATH.
-def find_all_on_path(filename, extras = None):
-    entries = os.environ["PATH"].split(os.pathsep)
-    ret = []
-    for p in entries:
-        fname = os.path.abspath(os.path.join(p, filename))
-        if os.path.isfile(fname) and fname not in ret:
-            ret.append(fname)
-    if extras:
-        for p in extras:
-            fname = os.path.abspath(os.path.join(p, filename))
-            if os.path.isfile(fname) and fname not in ret:
-                ret.append(fname)
-    return ret
-
-# Find a suitable Perl installation for OpenSSL.
-# cygwin perl does *not* work.  ActivePerl does.
-# Being a Perl dummy, the simplest way I can check is if the "Win32" package
-# is available.
-def find_working_perl(perls):
-    for perl in perls:
-        fh = os.popen('"%s" -e "use Win32;"' % perl)
-        fh.read()
-        rc = fh.close()
-        if rc:
-            continue
-        return perl
-    print("Cannot find a suitable PERL:")
-    if perls:
-        print(" the following perl interpreters were found:")
-        for p in perls:
-            print(" ", p)
-        print(" None of these versions appear suitable for building OpenSSL")
-    else:
-        print(" NO perl interpreters were found on this machine at all!")
-    print(" Please install ActivePerl and ensure it appears on your path")
-    return None
-
-# Locate the best SSL directory given a few roots to look into.
-def find_best_ssl_dir(sources):
-    candidates = []
-    for s in sources:
-        try:
-            # note: do not abspath s; the build will fail if any
-            # higher up directory name has spaces in it.
-            fnames = os.listdir(s)
-        except os.error:
-            fnames = []
-        for fname in fnames:
-            fqn = os.path.join(s, fname)
-            if os.path.isdir(fqn) and fname.startswith("openssl-"):
-                candidates.append(fqn)
-    # Now we have all the candidates, locate the best.
-    best_parts = []
-    best_name = None
-    for c in candidates:
-        parts = re.split("[.-]", os.path.basename(c))[1:]
-        # eg - openssl-0.9.7-beta1 - ignore all "beta" or any other qualifiers
-        if len(parts) >= 4:
-            continue
-        if parts > best_parts:
-            best_parts = parts
-            best_name = c
-    if best_name is not None:
-        print("Found an SSL directory at '%s'" % (best_name,))
-    else:
-        print("Could not find an SSL directory in '%s'" % (sources,))
-    sys.stdout.flush()
-    return best_name
-
-def create_makefile64(makefile, m32):
-    """Create and fix makefile for 64bit
-
-    Replace 32 with 64bit directories
-    """
-    if not os.path.isfile(m32):
-        return
-    with open(m32) as fin:
-        with open(makefile, 'w') as fout:
-            for line in fin:
-                line = line.replace("=tmp32", "=tmp64")
-                line = line.replace("=out32", "=out64")
-                line = line.replace("=inc32", "=inc64")
-                # force 64 bit machine
-                line = line.replace("MKLIB=lib", "MKLIB=lib /MACHINE:X64")
-                line = line.replace("LFLAGS=", "LFLAGS=/MACHINE:X64 ")
-                # don't link against the lib on 64bit systems
-                line = line.replace("bufferoverflowu.lib", "")
-                fout.write(line)
-    os.unlink(m32)
-
-def fix_makefile(makefile):
-    """Fix some stuff in all makefiles
-    """
-    if not os.path.isfile(makefile):
-        return
-    with open(makefile) as fin:
-        lines = fin.readlines()
-    with open(makefile, 'w') as fout:
-        for line in lines:
-            if line.startswith("PERL="):
-                continue
-            if line.startswith("CP="):
-                line = "CP=copy\n"
-            if line.startswith("MKDIR="):
-                line = "MKDIR=mkdir\n"
-            if line.startswith("CFLAG="):
-                line = line.strip()
-                for algo in ("RC5", "MDC2", "IDEA"):
-                    noalgo = " -DOPENSSL_NO_%s" % algo
-                    if noalgo not in line:
-                        line = line + noalgo
-                line = line + '\n'
-            fout.write(line)
-
-def run_configure(configure, do_script):
-    print("perl Configure "+configure+" no-idea no-mdc2")
-    os.system("perl Configure "+configure+" no-idea no-mdc2")
-    print(do_script)
-    os.system(do_script)
-
-def cmp(f1, f2):
-    bufsize = 1024 * 8
-    with open(f1, 'rb') as fp1, open(f2, 'rb') as fp2:
-        while True:
-            b1 = fp1.read(bufsize)
-            b2 = fp2.read(bufsize)
-            if b1 != b2:
-                return False
-            if not b1:
-                return True
-
-def copy(src, dst):
-    if os.path.isfile(dst) and cmp(src, dst):
-        return
-    shutil.copy(src, dst)
-
-def main():
-    build_all = "-a" in sys.argv
-    # Default to 'Release' configuration on for the 'Win32' platform
-    try:
-        configuration, platform = sys.argv[1:3]
-    except ValueError:
-        configuration, platform = 'Release', 'Win32'
-    if configuration == "Release":
-        debug = False
-    elif configuration == "Debug":
-        debug = True
-    else:
-        raise ValueError(str(sys.argv))
-
-    if platform == "Win32":
-        arch = "x86"
-        configure = "VC-WIN32"
-        do_script = "ms\\do_nasm"
-        makefile="ms\\nt.mak"
-        m32 = makefile
-        dirsuffix = "32"
-    elif platform == "x64":
-        arch="amd64"
-        configure = "VC-WIN64A"
-        do_script = "ms\\do_win64a"
-        makefile = "ms\\nt64.mak"
-        m32 = makefile.replace('64', '')
-        dirsuffix = "64"
-        #os.environ["VSEXTCOMP_USECL"] = "MS_OPTERON"
-    else:
-        raise ValueError(str(sys.argv))
-
-    make_flags = ""
-    if build_all:
-        make_flags = "-a"
-    # perl should be on the path, but we also look in "\perl" and "c:\\perl"
-    # as "well known" locations
-    perls = find_all_on_path("perl.exe", ["\\perl\\bin", "C:\\perl\\bin"])
-    perl = find_working_perl(perls)
-    if perl:
-        print("Found a working perl at '%s'" % (perl,))
-    else:
-        print("No Perl installation was found. Existing Makefiles are used.")
-    sys.stdout.flush()
-    # Look for SSL 2 levels up from pcbuild - ie, same place zlib etc all live.
-    ssl_dir = find_best_ssl_dir(("..\\..",))
-    if ssl_dir is None:
-        sys.exit(1)
-
-    old_cd = os.getcwd()
-    try:
-        os.chdir(ssl_dir)
-        # rebuild makefile when we do the roll over from 32 to 64 bit build
-        if arch == "amd64" and os.path.isfile(m32) and not os.path.isfile(makefile):
-            os.unlink(m32)
-
-        # If the ssl makefiles do not exist, we invoke Perl to generate them.
-        # Due to a bug in this script, the makefile sometimes ended up empty
-        # Force a regeneration if it is.
-        if not os.path.isfile(makefile) or os.path.getsize(makefile)==0:
-            if perl is None:
-                print("Perl is required to build the makefiles!")
-                sys.exit(1)
-
-            print("Creating the makefiles...")
-            sys.stdout.flush()
-            # Put our working Perl at the front of our path
-            os.environ["PATH"] = os.path.dirname(perl) + \
-                                          os.pathsep + \
-                                          os.environ["PATH"]
-            run_configure(configure, do_script)
-            if debug:
-                print("OpenSSL debug builds aren't supported.")
-            #if arch=="x86" and debug:
-            #    # the do_masm script in openssl doesn't generate a debug
-            #    # build makefile so we generate it here:
-            #    os.system("perl util\mk1mf.pl debug "+configure+" >"+makefile)
-
-            if arch == "amd64":
-                create_makefile64(makefile, m32)
-            fix_makefile(makefile)
-            copy(r"crypto\buildinf.h", r"crypto\buildinf_%s.h" % arch)
-            copy(r"crypto\opensslconf.h", r"crypto\opensslconf_%s.h" % arch)
-
-        # If the assembler files don't exist in tmpXX, copy them there
-        if perl is None and os.path.exists("asm"+dirsuffix):
-            if not os.path.exists("tmp"+dirsuffix):
-                os.mkdir("tmp"+dirsuffix)
-            for f in os.listdir("asm"+dirsuffix):
-                if not f.endswith(".asm"): continue
-                if os.path.isfile(r"tmp%s\%s" % (dirsuffix, f)): continue
-                shutil.copy(r"asm%s\%s" % (dirsuffix, f), "tmp"+dirsuffix)
-
-        # Now run make.
-        if arch == "amd64":
-            rc = os.system("ml64 -c -Foms\\uptable.obj ms\\uptable.asm")
-            if rc:
-                print("ml64 assembler has failed.")
-                sys.exit(rc)
-
-        copy(r"crypto\buildinf_%s.h" % arch, r"crypto\buildinf.h")
-        copy(r"crypto\opensslconf_%s.h" % arch, r"crypto\opensslconf.h")
-
-        #makeCommand = "nmake /nologo PERL=\"%s\" -f \"%s\"" %(perl, makefile)
-        makeCommand = "nmake /nologo -f \"%s\"" % makefile
-        print("Executing ssl makefiles: " + makeCommand)
-        sys.stdout.flush()
-        rc = os.system(makeCommand)
-        if rc:
-            print("Executing "+makefile+" failed")
-            print(rc)
-            sys.exit(rc)
-    finally:
-        os.chdir(old_cd)
-    sys.exit(rc)
-
-if __name__=='__main__':
-    main()
diff --git a/distutils2/_backport/_hashopenssl.c b/distutils2/_backport/_hashopenssl.c
deleted file mode 100644
--- a/distutils2/_backport/_hashopenssl.c
+++ /dev/null
@@ -1,524 +0,0 @@
-/* Module that wraps all OpenSSL hash algorithms */
-
-/*
- * Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
- * Licensed to PSF under a Contributor Agreement.
- *
- * Derived from a skeleton of shamodule.c containing work performed by:
- *
- * Andrew Kuchling (amk at amk.ca)
- * Greg Stein (gstein at lyra.org)
- *
- */
-
-#define PY_SSIZE_T_CLEAN
-
-#include "Python.h"
-#include "structmember.h"
-
-#if (PY_VERSION_HEX < 0x02050000)
-#define Py_ssize_t      int
-#endif
-
-/* EVP is the preferred interface to hashing in OpenSSL */
-#include <openssl/evp.h>
-
-#define MUNCH_SIZE INT_MAX
-
-
-#ifndef HASH_OBJ_CONSTRUCTOR
-#define HASH_OBJ_CONSTRUCTOR 0
-#endif
-
-typedef struct {
-    PyObject_HEAD
-    PyObject            *name;  /* name of this hash algorithm */
-    EVP_MD_CTX          ctx;    /* OpenSSL message digest context */
-} EVPobject;
-
-
-static PyTypeObject EVPtype;
-
-
-#define DEFINE_CONSTS_FOR_NEW(Name)  \
-    static PyObject *CONST_ ## Name ## _name_obj; \
-    static EVP_MD_CTX CONST_new_ ## Name ## _ctx; \
-    static EVP_MD_CTX *CONST_new_ ## Name ## _ctx_p = NULL;
-
-DEFINE_CONSTS_FOR_NEW(md5)
-DEFINE_CONSTS_FOR_NEW(sha1)
-DEFINE_CONSTS_FOR_NEW(sha224)
-DEFINE_CONSTS_FOR_NEW(sha256)
-DEFINE_CONSTS_FOR_NEW(sha384)
-DEFINE_CONSTS_FOR_NEW(sha512)
-
-
-static EVPobject *
-newEVPobject(PyObject *name)
-{
-    EVPobject *retval = (EVPobject *)PyObject_New(EVPobject, &EVPtype);
-
-    /* save the name for .name to return */
-    if (retval != NULL) {
-        Py_INCREF(name);
-        retval->name = name;
-    }
-
-    return retval;
-}
-
-/* Internal methods for a hash object */
-
-static void
-EVP_dealloc(PyObject *ptr)
-{
-    EVP_MD_CTX_cleanup(&((EVPobject *)ptr)->ctx);
-    Py_XDECREF(((EVPobject *)ptr)->name);
-    PyObject_Del(ptr);
-}
-
-
-/* External methods for a hash object */
-
-PyDoc_STRVAR(EVP_copy__doc__, "Return a copy of the hash object.");
-
-static PyObject *
-EVP_copy(EVPobject *self, PyObject *unused)
-{
-    EVPobject *newobj;
-
-    if ( (newobj = newEVPobject(self->name))==NULL)
-        return NULL;
-
-    EVP_MD_CTX_copy(&newobj->ctx, &self->ctx);
-    return (PyObject *)newobj;
-}
-
-PyDoc_STRVAR(EVP_digest__doc__,
-"Return the digest value as a string of binary data.");
-
-static PyObject *
-EVP_digest(EVPobject *self, PyObject *unused)
-{
-    unsigned char digest[EVP_MAX_MD_SIZE];
-    EVP_MD_CTX temp_ctx;
-    PyObject *retval;
-    unsigned int digest_size;
-
-    EVP_MD_CTX_copy(&temp_ctx, &self->ctx);
-    digest_size = EVP_MD_CTX_size(&temp_ctx);
-    EVP_DigestFinal(&temp_ctx, digest, NULL);
-
-    retval = PyString_FromStringAndSize((const char *)digest, digest_size);
-    EVP_MD_CTX_cleanup(&temp_ctx);
-    return retval;
-}
-
-PyDoc_STRVAR(EVP_hexdigest__doc__,
-"Return the digest value as a string of hexadecimal digits.");
-
-static PyObject *
-EVP_hexdigest(EVPobject *self, PyObject *unused)
-{
-    unsigned char digest[EVP_MAX_MD_SIZE];
-    EVP_MD_CTX temp_ctx;
-    PyObject *retval;
-    char *hex_digest;
-    unsigned int i, j, digest_size;
-
-    /* Get the raw (binary) digest value */
-    EVP_MD_CTX_copy(&temp_ctx, &self->ctx);
-    digest_size = EVP_MD_CTX_size(&temp_ctx);
-    EVP_DigestFinal(&temp_ctx, digest, NULL);
-
-    EVP_MD_CTX_cleanup(&temp_ctx);
-
-    /* Create a new string */
-    /* NOTE: not thread safe! modifying an already created string object */
-    /* (not a problem because we hold the GIL by default) */
-    retval = PyString_FromStringAndSize(NULL, digest_size * 2);
-    if (!retval)
-	    return NULL;
-    hex_digest = PyString_AsString(retval);
-    if (!hex_digest) {
-	    Py_DECREF(retval);
-	    return NULL;
-    }
-
-    /* Make hex version of the digest */
-    for(i=j=0; i<digest_size; i++) {
-        char c;
-        c = (digest[i] >> 4) & 0xf;
-	c = (c>9) ? c+'a'-10 : c + '0';
-        hex_digest[j++] = c;
-        c = (digest[i] & 0xf);
-	c = (c>9) ? c+'a'-10 : c + '0';
-        hex_digest[j++] = c;
-    }
-    return retval;
-}
-
-PyDoc_STRVAR(EVP_update__doc__,
-"Update this hash object's state with the provided string.");
-
-static PyObject *
-EVP_update(EVPobject *self, PyObject *args)
-{
-    unsigned char *cp;
-    Py_ssize_t len;
-
-    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
-        return NULL;
-
-    if (len > 0 && len <= MUNCH_SIZE) {
-    EVP_DigestUpdate(&self->ctx, cp, Py_SAFE_DOWNCAST(len, Py_ssize_t,
-                                                      unsigned int));
-    } else {
-        Py_ssize_t offset = 0;
-        while (len) {
-            unsigned int process = len > MUNCH_SIZE ? MUNCH_SIZE : len;
-            EVP_DigestUpdate(&self->ctx, cp + offset, process);
-            len -= process;
-            offset += process;
-        }
-    }
-    Py_INCREF(Py_None);
-    return Py_None;
-}
-
-static PyMethodDef EVP_methods[] = {
-    {"update",	  (PyCFunction)EVP_update,    METH_VARARGS, EVP_update__doc__},
-    {"digest",	  (PyCFunction)EVP_digest,    METH_NOARGS,  EVP_digest__doc__},
-    {"hexdigest", (PyCFunction)EVP_hexdigest, METH_NOARGS,  EVP_hexdigest__doc__},
-    {"copy",	  (PyCFunction)EVP_copy,      METH_NOARGS,  EVP_copy__doc__},
-    {NULL,	  NULL}		/* sentinel */
-};
-
-static PyObject *
-EVP_get_block_size(EVPobject *self, void *closure)
-{
-    return PyInt_FromLong(EVP_MD_CTX_block_size(&((EVPobject *)self)->ctx));
-}
-
-static PyObject *
-EVP_get_digest_size(EVPobject *self, void *closure)
-{
-    return PyInt_FromLong(EVP_MD_CTX_size(&((EVPobject *)self)->ctx));
-}
-
-static PyMemberDef EVP_members[] = {
-    {"name", T_OBJECT, offsetof(EVPobject, name), READONLY, PyDoc_STR("algorithm name.")},
-    {NULL}  /* Sentinel */
-};
-
-static PyGetSetDef EVP_getseters[] = {
-    {"digest_size",
-     (getter)EVP_get_digest_size, NULL,
-     NULL,
-     NULL},
-    {"block_size",
-     (getter)EVP_get_block_size, NULL,
-     NULL,
-     NULL},
-    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
-     * the old sha module also supported 'digestsize'.  ugh. */
-    {"digestsize",
-     (getter)EVP_get_digest_size, NULL,
-     NULL,
-     NULL},
-    {NULL}  /* Sentinel */
-};
-
-
-static PyObject *
-EVP_repr(PyObject *self)
-{
-    char buf[100];
-    PyOS_snprintf(buf, sizeof(buf), "<%s HASH object @ %p>",
-            PyString_AsString(((EVPobject *)self)->name), self);
-    return PyString_FromString(buf);
-}
-
-#if HASH_OBJ_CONSTRUCTOR
-static int
-EVP_tp_init(EVPobject *self, PyObject *args, PyObject *kwds)
-{
-    static char *kwlist[] = {"name", "string", NULL};
-    PyObject *name_obj = NULL;
-    char *nameStr;
-    unsigned char *cp = NULL;
-    Py_ssize_t len = 0;
-    const EVP_MD *digest;
-
-    if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|s#:HASH", kwlist,
-                                     &name_obj, &cp, &len)) {
-        return -1;
-    }
-
-    if (!PyArg_Parse(name_obj, "s", &nameStr)) {
-        PyErr_SetString(PyExc_TypeError, "name must be a string");
-        return -1;
-    }
-
-    digest = EVP_get_digestbyname(nameStr);
-    if (!digest) {
-        PyErr_SetString(PyExc_ValueError, "unknown hash function");
-        return -1;
-    }
-    EVP_DigestInit(&self->ctx, digest);
-
-    self->name = name_obj;
-    Py_INCREF(self->name);
-
-    if (cp && len) {
-        if (len > 0 && len <= MUNCH_SIZE) {
-        EVP_DigestUpdate(&self->ctx, cp, Py_SAFE_DOWNCAST(len, Py_ssize_t,
-                                                          unsigned int));
-        } else {
-            Py_ssize_t offset = 0;
-            while (len) {
-                unsigned int process = len > MUNCH_SIZE ? MUNCH_SIZE : len;
-                EVP_DigestUpdate(&self->ctx, cp + offset, process);
-                len -= process;
-                offset += process;
-            }
-        }
-    }
-    
-    return 0;
-}
-#endif
-
-
-PyDoc_STRVAR(hashtype_doc,
-"A hash represents the object used to calculate a checksum of a\n\
-string of information.\n\
-\n\
-Methods:\n\
-\n\
-update() -- updates the current digest with an additional string\n\
-digest() -- return the current digest value\n\
-hexdigest() -- return the current digest as a string of hexadecimal digits\n\
-copy() -- return a copy of the current hash object\n\
-\n\
-Attributes:\n\
-\n\
-name -- the hash algorithm being used by this object\n\
-digest_size -- number of bytes in this hash's output\n");
-
-static PyTypeObject EVPtype = {
-    PyObject_HEAD_INIT(NULL)
-    0,			/*ob_size*/
-    "_hashlib.HASH",    /*tp_name*/
-    sizeof(EVPobject),	/*tp_basicsize*/
-    0,			/*tp_itemsize*/
-    /* methods */
-    EVP_dealloc,	/*tp_dealloc*/
-    0,			/*tp_print*/
-    0,                  /*tp_getattr*/
-    0,                  /*tp_setattr*/
-    0,                  /*tp_compare*/
-    EVP_repr,           /*tp_repr*/
-    0,                  /*tp_as_number*/
-    0,                  /*tp_as_sequence*/
-    0,                  /*tp_as_mapping*/
-    0,                  /*tp_hash*/
-    0,                  /*tp_call*/
-    0,                  /*tp_str*/
-    0,                  /*tp_getattro*/
-    0,                  /*tp_setattro*/
-    0,                  /*tp_as_buffer*/
-    Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, /*tp_flags*/
-    hashtype_doc,       /*tp_doc*/
-    0,                  /*tp_traverse*/
-    0,			/*tp_clear*/
-    0,			/*tp_richcompare*/
-    0,			/*tp_weaklistoffset*/
-    0,			/*tp_iter*/
-    0,			/*tp_iternext*/
-    EVP_methods,	/* tp_methods */
-    EVP_members,	/* tp_members */
-    EVP_getseters,      /* tp_getset */
-#if 1
-    0,                  /* tp_base */
-    0,                  /* tp_dict */
-    0,                  /* tp_descr_get */
-    0,                  /* tp_descr_set */
-    0,                  /* tp_dictoffset */
-#endif
-#if HASH_OBJ_CONSTRUCTOR
-    (initproc)EVP_tp_init, /* tp_init */
-#endif
-};
-
-static PyObject *
-EVPnew(PyObject *name_obj,
-       const EVP_MD *digest, const EVP_MD_CTX *initial_ctx,
-       const unsigned char *cp, Py_ssize_t len)
-{
-    EVPobject *self;
-
-    if (!digest && !initial_ctx) {
-        PyErr_SetString(PyExc_ValueError, "unsupported hash type");
-        return NULL;
-    }
-
-    if ((self = newEVPobject(name_obj)) == NULL)
-        return NULL;
-
-    if (initial_ctx) {
-        EVP_MD_CTX_copy(&self->ctx, initial_ctx);
-    } else {
-        EVP_DigestInit(&self->ctx, digest);
-    }
-
-    if (cp && len) {
-        if (len > 0 && len <= MUNCH_SIZE) {
-            EVP_DigestUpdate(&self->ctx, cp, Py_SAFE_DOWNCAST(len, Py_ssize_t,
-                                                              unsigned int));
-        } else {
-            Py_ssize_t offset = 0;
-            while (len) {
-                unsigned int process = len > MUNCH_SIZE ? MUNCH_SIZE : len;
-                EVP_DigestUpdate(&self->ctx, cp + offset, process);
-                len -= process;
-                offset += process;
-            }
-        }
-    }
-
-    return (PyObject *)self;
-}
-
-
-/* The module-level function: new() */
-
-PyDoc_STRVAR(EVP_new__doc__,
-"Return a new hash object using the named algorithm.\n\
-An optional string argument may be provided and will be\n\
-automatically hashed.\n\
-\n\
-The MD5 and SHA1 algorithms are always supported.\n");
-
-static PyObject *
-EVP_new(PyObject *self, PyObject *args, PyObject *kwdict)
-{
-    static char *kwlist[] = {"name", "string", NULL};
-    PyObject *name_obj = NULL;
-    char *name;
-    const EVP_MD *digest;
-    unsigned char *cp = NULL;
-    Py_ssize_t len = 0;
-
-    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "O|s#:new", kwlist,
-                                     &name_obj, &cp, &len)) {
-        return NULL;
-    }
-
-    if (!PyArg_Parse(name_obj, "s", &name)) {
-        PyErr_SetString(PyExc_TypeError, "name must be a string");
-        return NULL;
-    }
-
-    digest = EVP_get_digestbyname(name);
-
-    return EVPnew(name_obj, digest, NULL, cp, len);
-}
-
-/*
- *  This macro generates constructor function definitions for specific
- *  hash algorithms.  These constructors are much faster than calling
- *  the generic one passing it a python string and are noticeably
- *  faster than calling a python new() wrapper.  That's important for
- *  code that wants to make hashes of a bunch of small strings.
- */
-#define GEN_CONSTRUCTOR(NAME)  \
-    static PyObject * \
-    EVP_new_ ## NAME (PyObject *self, PyObject *args) \
-    { \
-        unsigned char *cp = NULL; \
-        Py_ssize_t len = 0; \
-     \
-        if (!PyArg_ParseTuple(args, "|s#:" #NAME , &cp, &len)) { \
-            return NULL; \
-        } \
-     \
-        return EVPnew( \
-                CONST_ ## NAME ## _name_obj, \
-                NULL, \
-                CONST_new_ ## NAME ## _ctx_p, \
-                cp, len); \
-    }
-
-/* a PyMethodDef structure for the constructor */
-#define CONSTRUCTOR_METH_DEF(NAME)  \
-    {"openssl_" #NAME, (PyCFunction)EVP_new_ ## NAME, METH_VARARGS, \
-        PyDoc_STR("Returns a " #NAME \
-                  " hash object; optionally initialized with a string") \
-    }
-
-/* used in the init function to setup a constructor */
-#define INIT_CONSTRUCTOR_CONSTANTS(NAME)  do { \
-    CONST_ ## NAME ## _name_obj = PyString_FromString(#NAME); \
-    if (EVP_get_digestbyname(#NAME)) { \
-        CONST_new_ ## NAME ## _ctx_p = &CONST_new_ ## NAME ## _ctx; \
-        EVP_DigestInit(CONST_new_ ## NAME ## _ctx_p, EVP_get_digestbyname(#NAME)); \
-    } \
-} while (0);
-
-GEN_CONSTRUCTOR(md5)
-GEN_CONSTRUCTOR(sha1)
-GEN_CONSTRUCTOR(sha224)
-GEN_CONSTRUCTOR(sha256)
-GEN_CONSTRUCTOR(sha384)
-GEN_CONSTRUCTOR(sha512)
-
-/* List of functions exported by this module */
-
-static struct PyMethodDef EVP_functions[] = {
-    {"new", (PyCFunction)EVP_new, METH_VARARGS|METH_KEYWORDS, EVP_new__doc__},
-    CONSTRUCTOR_METH_DEF(md5),
-    CONSTRUCTOR_METH_DEF(sha1),
-    CONSTRUCTOR_METH_DEF(sha224),
-    CONSTRUCTOR_METH_DEF(sha256),
-    CONSTRUCTOR_METH_DEF(sha384),
-    CONSTRUCTOR_METH_DEF(sha512),
-    {NULL,	NULL}		 /* Sentinel */
-};
-
-
-/* Initialize this module. */
-
-PyMODINIT_FUNC
-init_hashlib(void)
-{
-    PyObject *m;
-
-    OpenSSL_add_all_digests();
-
-    /* TODO build EVP_functions openssl_* entries dynamically based
-     * on what hashes are supported rather than listing many
-     * but having some be unsupported.  Only init appropriate
-     * constants. */
-
-    EVPtype.ob_type = &PyType_Type;
-    if (PyType_Ready(&EVPtype) < 0)
-        return;
-
-    m = Py_InitModule("_hashlib", EVP_functions);
-    if (m == NULL)
-        return;
-
-#if HASH_OBJ_CONSTRUCTOR
-    Py_INCREF(&EVPtype);
-    PyModule_AddObject(m, "HASH", (PyObject *)&EVPtype);
-#endif
-
-    /* these constants are used by the convenience constructors */
-    INIT_CONSTRUCTOR_CONSTANTS(md5);
-    INIT_CONSTRUCTOR_CONSTANTS(sha1);
-    INIT_CONSTRUCTOR_CONSTANTS(sha224);
-    INIT_CONSTRUCTOR_CONSTANTS(sha256);
-    INIT_CONSTRUCTOR_CONSTANTS(sha384);
-    INIT_CONSTRUCTOR_CONSTANTS(sha512);
-}
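The TODO note in init_hashlib above — building the openssl_* table from whatever digests the linked OpenSSL actually supports — can be sketched in Python against the stdlib hashlib (an illustration of the idea, not the C fix itself):

```python
import hashlib
from functools import partial

# Build a constructor table only for digests the runtime actually
# provides, instead of a fixed list where some entries may be broken.
constructors = {
    name: partial(hashlib.new, name)
    for name in sorted(hashlib.algorithms_available)
}

# Each entry behaves like the corresponding named constructor.
digest = constructors["md5"](b"abc").hexdigest()
```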
diff --git a/distutils2/_backport/hashlib.py b/distutils2/_backport/hashlib.py
deleted file mode 100644
--- a/distutils2/_backport/hashlib.py
+++ /dev/null
@@ -1,143 +0,0 @@
-# $Id$
-#
-#  Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
-#  Licensed to PSF under a Contributor Agreement.
-#
-
-__doc__ = """hashlib module - A common interface to many hash functions.
-
-new(name, string='') - returns a new hash object implementing the
-                       given hash function; initializing the hash
-                       using the given string data.
-
-Named constructor functions are also available; these are much faster
-than using new():
-
-md5(), sha1(), sha224(), sha256(), sha384(), and sha512()
-
-More algorithms may be available on your platform but the above are
-guaranteed to exist.
-
-NOTE: If you want the adler32 or crc32 hash functions, they are available in
-the zlib module.
-
-Choose your hash function wisely.  Some have known collision weaknesses.
-sha384 and sha512 will be slow on 32-bit platforms.
-
-Hash objects have these methods:
- - update(arg): Update the hash object with the string arg. Repeated calls
-                are equivalent to a single call with the concatenation of all
-                the arguments.
- - digest():    Return the digest of the strings passed to the update() method
-                so far. This may contain non-ASCII characters, including
-                NUL bytes.
- - hexdigest(): Like digest() except the digest is returned as a string of
-                double length, containing only hexadecimal digits.
- - copy():      Return a copy (clone) of the hash object. This can be used to
-                efficiently compute the digests of strings that share a common
-                initial substring.
-
-For example, to obtain the digest of the string 'Nobody inspects the
-spammish repetition':
-
-    >>> import hashlib
-    >>> m = hashlib.md5()
-    >>> m.update("Nobody inspects")
-    >>> m.update(" the spammish repetition")
-    >>> m.digest()
-    '\\xbbd\\x9c\\x83\\xdd\\x1e\\xa5\\xc9\\xd9\\xde\\xc9\\xa1\\x8d\\xf0\\xff\\xe9'
-
-More condensed:
-
-    >>> hashlib.sha224("Nobody inspects the spammish repetition").hexdigest()
-    'a4337bc45a8fc544c03f52dc550cd6e1e87021bc896588bd79e901e2'
-
-"""
-
-# This tuple and __get_builtin_constructor() must be modified if a new
-# always available algorithm is added.
-__always_supported = ('md5', 'sha1', 'sha224', 'sha256', 'sha384', 'sha512')
-
-algorithms = __always_supported
-
-__all__ = __always_supported + ('new', 'algorithms')
-
-
-def __get_builtin_constructor(name):
-    if name in ('SHA1', 'sha1'):
-        from distutils2._backport import _sha
-        return _sha.new
-    elif name in ('MD5', 'md5'):
-        from distutils2._backport import _md5
-        return _md5.new
-    elif name in ('SHA256', 'sha256', 'SHA224', 'sha224'):
-        from distutils2._backport import _sha256
-        bs = name[3:]
-        if bs == '256':
-            return _sha256.sha256
-        elif bs == '224':
-            return _sha256.sha224
-    elif name in ('SHA512', 'sha512', 'SHA384', 'sha384'):
-        from distutils2._backport import _sha512
-        bs = name[3:]
-        if bs == '512':
-            return _sha512.sha512
-        elif bs == '384':
-            return _sha512.sha384
-
-    raise ValueError('unsupported hash type %s' % name)
-
-
-def __get_openssl_constructor(name):
-    try:
-        f = getattr(_hashlib, 'openssl_' + name)
-        # Allow the C module to raise ValueError.  The function will be
-        # defined but the hash not actually available thanks to OpenSSL.
-        f()
-        # Use the C function directly (very fast)
-        return f
-    except (AttributeError, ValueError):
-        return __get_builtin_constructor(name)
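The probe pattern in __get_openssl_constructor — fetch the attribute, then trial-call it, because the function can exist even when OpenSSL lacks that digest — looks like this against the stdlib hashlib (a stand-in for the bundled _hashlib, so the names here are assumptions):

```python
import hashlib

def get_constructor(name):
    # Prefer the fast direct constructor, but probe it once: the
    # attribute may be present while the underlying hash is missing.
    try:
        f = getattr(hashlib, name)
        f()  # raises ValueError if the hash is not actually available
        return f
    except (AttributeError, ValueError):
        # Fall back to the generic constructor.
        return lambda data=b"": hashlib.new(name, data)
```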
-
-
-def __py_new(name, string=''):
-    """new(name, string='') - Return a new hashing object using the named algorithm;
-    optionally initialized with a string.
-    """
-    return __get_builtin_constructor(name)(string)
-
-
-def __hash_new(name, string=''):
-    """new(name, string='') - Return a new hashing object using the named algorithm;
-    optionally initialized with a string.
-    """
-    try:
-        return _hashlib.new(name, string)
-    except ValueError:
-        # If the _hashlib module (OpenSSL) doesn't support the named
-        # hash, try using our builtin implementations.
-        # This allows for SHA224/256 and SHA384/512 support even though
-        # the OpenSSL library prior to 0.9.8 doesn't provide them.
-        return __get_builtin_constructor(name)(string)
-
-
-try:
-    from distutils2._backport import _hashlib
-    new = __hash_new
-    __get_hash = __get_openssl_constructor
-except ImportError:
-    new = __py_new
-    __get_hash = __get_builtin_constructor
-
-for __func_name in __always_supported:
-    # try them all, some may not work due to the OpenSSL
-    # version not supporting that algorithm.
-    try:
-        globals()[__func_name] = __get_hash(__func_name)
-    except ValueError:
-        import logging
-        logging.exception('code for hash %s was not found.', __func_name)
-
-# Cleanup locals()
-del __always_supported, __func_name, __get_hash
-del __py_new, __hash_new, __get_openssl_constructor
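As a sanity check of the module's contract — the named constructors installed by the loop above must agree with new(name) — here is the docstring's own example run through the stdlib hashlib:

```python
import hashlib

# A named constructor and the generic new() must produce the same
# digest; the expected value is the one quoted in the module docstring.
named = hashlib.md5(b"Nobody inspects the spammish repetition")
generic = hashlib.new("md5", b"Nobody inspects the spammish repetition")
```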
diff --git a/distutils2/_backport/md5.c b/distutils2/_backport/md5.c
deleted file mode 100644
--- a/distutils2/_backport/md5.c
+++ /dev/null
@@ -1,381 +0,0 @@
-/*
-  Copyright (C) 1999, 2000, 2002 Aladdin Enterprises.  All rights reserved.
-
-  This software is provided 'as-is', without any express or implied
-  warranty.  In no event will the authors be held liable for any damages
-  arising from the use of this software.
-
-  Permission is granted to anyone to use this software for any purpose,
-  including commercial applications, and to alter it and redistribute it
-  freely, subject to the following restrictions:
-
-  1. The origin of this software must not be misrepresented; you must not
-     claim that you wrote the original software. If you use this software
-     in a product, an acknowledgment in the product documentation would be
-     appreciated but is not required.
-  2. Altered source versions must be plainly marked as such, and must not be
-     misrepresented as being the original software.
-  3. This notice may not be removed or altered from any source distribution.
-
-  L. Peter Deutsch
-  ghost at aladdin.com
-
- */
-/* $Id: md5.c,v 1.6 2002/04/13 19:20:28 lpd Exp $ */
-/*
-  Independent implementation of MD5 (RFC 1321).
-
-  This code implements the MD5 Algorithm defined in RFC 1321, whose
-  text is available at
-	http://www.ietf.org/rfc/rfc1321.txt
-  The code is derived from the text of the RFC, including the test suite
-  (section A.5) but excluding the rest of Appendix A.  It does not include
-  any code or documentation that is identified in the RFC as being
-  copyrighted.
-
-  The original and principal author of md5.c is L. Peter Deutsch
-  <ghost at aladdin.com>.  Other authors are noted in the change history
-  that follows (in reverse chronological order):
-
-  2002-04-13 lpd Clarified derivation from RFC 1321; now handles byte order
-	either statically or dynamically; added missing #include <string.h>
-	in library.
-  2002-03-11 lpd Corrected argument list for main(), and added int return
-	type, in test program and T value program.
-  2002-02-21 lpd Added missing #include <stdio.h> in test program.
-  2000-07-03 lpd Patched to eliminate warnings about "constant is
-	unsigned in ANSI C, signed in traditional"; made test program
-	self-checking.
-  1999-11-04 lpd Edited comments slightly for automatic TOC extraction.
-  1999-10-18 lpd Fixed typo in header comment (ansi2knr rather than md5).
-  1999-05-03 lpd Original version.
- */
-
-#include "md5.h"
-#include <string.h>
-
-#undef BYTE_ORDER	/* 1 = big-endian, -1 = little-endian, 0 = unknown */
-#ifdef ARCH_IS_BIG_ENDIAN
-#  define BYTE_ORDER (ARCH_IS_BIG_ENDIAN ? 1 : -1)
-#else
-#  define BYTE_ORDER 0
-#endif
-
-#define T_MASK ((md5_word_t)~0)
-#define T1 /* 0xd76aa478 */ (T_MASK ^ 0x28955b87)
-#define T2 /* 0xe8c7b756 */ (T_MASK ^ 0x173848a9)
-#define T3    0x242070db
-#define T4 /* 0xc1bdceee */ (T_MASK ^ 0x3e423111)
-#define T5 /* 0xf57c0faf */ (T_MASK ^ 0x0a83f050)
-#define T6    0x4787c62a
-#define T7 /* 0xa8304613 */ (T_MASK ^ 0x57cfb9ec)
-#define T8 /* 0xfd469501 */ (T_MASK ^ 0x02b96afe)
-#define T9    0x698098d8
-#define T10 /* 0x8b44f7af */ (T_MASK ^ 0x74bb0850)
-#define T11 /* 0xffff5bb1 */ (T_MASK ^ 0x0000a44e)
-#define T12 /* 0x895cd7be */ (T_MASK ^ 0x76a32841)
-#define T13    0x6b901122
-#define T14 /* 0xfd987193 */ (T_MASK ^ 0x02678e6c)
-#define T15 /* 0xa679438e */ (T_MASK ^ 0x5986bc71)
-#define T16    0x49b40821
-#define T17 /* 0xf61e2562 */ (T_MASK ^ 0x09e1da9d)
-#define T18 /* 0xc040b340 */ (T_MASK ^ 0x3fbf4cbf)
-#define T19    0x265e5a51
-#define T20 /* 0xe9b6c7aa */ (T_MASK ^ 0x16493855)
-#define T21 /* 0xd62f105d */ (T_MASK ^ 0x29d0efa2)
-#define T22    0x02441453
-#define T23 /* 0xd8a1e681 */ (T_MASK ^ 0x275e197e)
-#define T24 /* 0xe7d3fbc8 */ (T_MASK ^ 0x182c0437)
-#define T25    0x21e1cde6
-#define T26 /* 0xc33707d6 */ (T_MASK ^ 0x3cc8f829)
-#define T27 /* 0xf4d50d87 */ (T_MASK ^ 0x0b2af278)
-#define T28    0x455a14ed
-#define T29 /* 0xa9e3e905 */ (T_MASK ^ 0x561c16fa)
-#define T30 /* 0xfcefa3f8 */ (T_MASK ^ 0x03105c07)
-#define T31    0x676f02d9
-#define T32 /* 0x8d2a4c8a */ (T_MASK ^ 0x72d5b375)
-#define T33 /* 0xfffa3942 */ (T_MASK ^ 0x0005c6bd)
-#define T34 /* 0x8771f681 */ (T_MASK ^ 0x788e097e)
-#define T35    0x6d9d6122
-#define T36 /* 0xfde5380c */ (T_MASK ^ 0x021ac7f3)
-#define T37 /* 0xa4beea44 */ (T_MASK ^ 0x5b4115bb)
-#define T38    0x4bdecfa9
-#define T39 /* 0xf6bb4b60 */ (T_MASK ^ 0x0944b49f)
-#define T40 /* 0xbebfbc70 */ (T_MASK ^ 0x4140438f)
-#define T41    0x289b7ec6
-#define T42 /* 0xeaa127fa */ (T_MASK ^ 0x155ed805)
-#define T43 /* 0xd4ef3085 */ (T_MASK ^ 0x2b10cf7a)
-#define T44    0x04881d05
-#define T45 /* 0xd9d4d039 */ (T_MASK ^ 0x262b2fc6)
-#define T46 /* 0xe6db99e5 */ (T_MASK ^ 0x1924661a)
-#define T47    0x1fa27cf8
-#define T48 /* 0xc4ac5665 */ (T_MASK ^ 0x3b53a99a)
-#define T49 /* 0xf4292244 */ (T_MASK ^ 0x0bd6ddbb)
-#define T50    0x432aff97
-#define T51 /* 0xab9423a7 */ (T_MASK ^ 0x546bdc58)
-#define T52 /* 0xfc93a039 */ (T_MASK ^ 0x036c5fc6)
-#define T53    0x655b59c3
-#define T54 /* 0x8f0ccc92 */ (T_MASK ^ 0x70f3336d)
-#define T55 /* 0xffeff47d */ (T_MASK ^ 0x00100b82)
-#define T56 /* 0x85845dd1 */ (T_MASK ^ 0x7a7ba22e)
-#define T57    0x6fa87e4f
-#define T58 /* 0xfe2ce6e0 */ (T_MASK ^ 0x01d3191f)
-#define T59 /* 0xa3014314 */ (T_MASK ^ 0x5cfebceb)
-#define T60    0x4e0811a1
-#define T61 /* 0xf7537e82 */ (T_MASK ^ 0x08ac817d)
-#define T62 /* 0xbd3af235 */ (T_MASK ^ 0x42c50dca)
-#define T63    0x2ad7d2bb
-#define T64 /* 0xeb86d391 */ (T_MASK ^ 0x14792c6e)
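The commented-out values above are the RFC 1321 T-table, T[i] = floor(abs(sin(i)) * 2**32); the T_MASK ^ x spelling merely avoids the "constant is unsigned in ANSI C, signed in traditional" warning mentioned in the 2000-07-03 changelog entry. Both facts can be checked in a few lines of Python:

```python
import math

T_MASK = 0xFFFFFFFF

def T(i):
    # RFC 1321, section 3.4: T[i] = floor(abs(sin(i)) * 2**32), i in 1..64.
    return int(abs(math.sin(i)) * 2**32) & T_MASK

# The XOR encoding recovers the constants shown in the comments.
t1 = T_MASK ^ 0x28955b87   # commented as 0xd76aa478
t64 = T_MASK ^ 0x14792c6e  # commented as 0xeb86d391
```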
-
-
-static void
-md5_process(md5_state_t *pms, const md5_byte_t *data /*[64]*/)
-{
-    md5_word_t
-	a = pms->abcd[0], b = pms->abcd[1],
-	c = pms->abcd[2], d = pms->abcd[3];
-    md5_word_t t;
-#if BYTE_ORDER > 0
-    /* Define storage only for big-endian CPUs. */
-    md5_word_t X[16];
-#else
-    /* Define storage for little-endian or both types of CPUs. */
-    md5_word_t xbuf[16];
-    const md5_word_t *X;
-#endif
-
-    {
-#if BYTE_ORDER == 0
-	/*
-	 * Determine dynamically whether this is a big-endian or
-	 * little-endian machine, since we can use a more efficient
-	 * algorithm on the latter.
-	 */
-	static const int w = 1;
-
-	if (*((const md5_byte_t *)&w)) /* dynamic little-endian */
-#endif
-#if BYTE_ORDER <= 0		/* little-endian */
-	{
-	    /*
-	     * On little-endian machines, we can process properly aligned
-	     * data without copying it.
-	     */
-	    if (!((data - (const md5_byte_t *)0) & 3)) {
-		/* data are properly aligned */
-		X = (const md5_word_t *)data;
-	    } else {
-		/* not aligned */
-		memcpy(xbuf, data, 64);
-		X = xbuf;
-	    }
-	}
-#endif
-#if BYTE_ORDER == 0
-	else			/* dynamic big-endian */
-#endif
-#if BYTE_ORDER >= 0		/* big-endian */
-	{
-	    /*
-	     * On big-endian machines, we must arrange the bytes in the
-	     * right order.
-	     */
-	    const md5_byte_t *xp = data;
-	    int i;
-
-#  if BYTE_ORDER == 0
-	    X = xbuf;		/* (dynamic only) */
-#  else
-#    define xbuf X		/* (static only) */
-#  endif
-	    for (i = 0; i < 16; ++i, xp += 4)
-		xbuf[i] = xp[0] + (xp[1] << 8) + (xp[2] << 16) + (xp[3] << 24);
-	}
-#endif
-    }
-
-#define ROTATE_LEFT(x, n) (((x) << (n)) | ((x) >> (32 - (n))))
-
-    /* Round 1. */
-    /* Let [abcd k s i] denote the operation
-       a = b + ((a + F(b,c,d) + X[k] + T[i]) <<< s). */
-#define F(x, y, z) (((x) & (y)) | (~(x) & (z)))
-#define SET(a, b, c, d, k, s, Ti)\
-  t = a + F(b,c,d) + X[k] + Ti;\
-  a = ROTATE_LEFT(t, s) + b
-    /* Do the following 16 operations. */
-    SET(a, b, c, d,  0,  7,  T1);
-    SET(d, a, b, c,  1, 12,  T2);
-    SET(c, d, a, b,  2, 17,  T3);
-    SET(b, c, d, a,  3, 22,  T4);
-    SET(a, b, c, d,  4,  7,  T5);
-    SET(d, a, b, c,  5, 12,  T6);
-    SET(c, d, a, b,  6, 17,  T7);
-    SET(b, c, d, a,  7, 22,  T8);
-    SET(a, b, c, d,  8,  7,  T9);
-    SET(d, a, b, c,  9, 12, T10);
-    SET(c, d, a, b, 10, 17, T11);
-    SET(b, c, d, a, 11, 22, T12);
-    SET(a, b, c, d, 12,  7, T13);
-    SET(d, a, b, c, 13, 12, T14);
-    SET(c, d, a, b, 14, 17, T15);
-    SET(b, c, d, a, 15, 22, T16);
-#undef SET
-
-     /* Round 2. */
-     /* Let [abcd k s i] denote the operation
-          a = b + ((a + G(b,c,d) + X[k] + T[i]) <<< s). */
-#define G(x, y, z) (((x) & (z)) | ((y) & ~(z)))
-#define SET(a, b, c, d, k, s, Ti)\
-  t = a + G(b,c,d) + X[k] + Ti;\
-  a = ROTATE_LEFT(t, s) + b
-     /* Do the following 16 operations. */
-    SET(a, b, c, d,  1,  5, T17);
-    SET(d, a, b, c,  6,  9, T18);
-    SET(c, d, a, b, 11, 14, T19);
-    SET(b, c, d, a,  0, 20, T20);
-    SET(a, b, c, d,  5,  5, T21);
-    SET(d, a, b, c, 10,  9, T22);
-    SET(c, d, a, b, 15, 14, T23);
-    SET(b, c, d, a,  4, 20, T24);
-    SET(a, b, c, d,  9,  5, T25);
-    SET(d, a, b, c, 14,  9, T26);
-    SET(c, d, a, b,  3, 14, T27);
-    SET(b, c, d, a,  8, 20, T28);
-    SET(a, b, c, d, 13,  5, T29);
-    SET(d, a, b, c,  2,  9, T30);
-    SET(c, d, a, b,  7, 14, T31);
-    SET(b, c, d, a, 12, 20, T32);
-#undef SET
-
-     /* Round 3. */
-     /* Let [abcd k s t] denote the operation
-          a = b + ((a + H(b,c,d) + X[k] + T[i]) <<< s). */
-#define H(x, y, z) ((x) ^ (y) ^ (z))
-#define SET(a, b, c, d, k, s, Ti)\
-  t = a + H(b,c,d) + X[k] + Ti;\
-  a = ROTATE_LEFT(t, s) + b
-     /* Do the following 16 operations. */
-    SET(a, b, c, d,  5,  4, T33);
-    SET(d, a, b, c,  8, 11, T34);
-    SET(c, d, a, b, 11, 16, T35);
-    SET(b, c, d, a, 14, 23, T36);
-    SET(a, b, c, d,  1,  4, T37);
-    SET(d, a, b, c,  4, 11, T38);
-    SET(c, d, a, b,  7, 16, T39);
-    SET(b, c, d, a, 10, 23, T40);
-    SET(a, b, c, d, 13,  4, T41);
-    SET(d, a, b, c,  0, 11, T42);
-    SET(c, d, a, b,  3, 16, T43);
-    SET(b, c, d, a,  6, 23, T44);
-    SET(a, b, c, d,  9,  4, T45);
-    SET(d, a, b, c, 12, 11, T46);
-    SET(c, d, a, b, 15, 16, T47);
-    SET(b, c, d, a,  2, 23, T48);
-#undef SET
-
-     /* Round 4. */
-     /* Let [abcd k s t] denote the operation
-          a = b + ((a + I(b,c,d) + X[k] + T[i]) <<< s). */
-#define I(x, y, z) ((y) ^ ((x) | ~(z)))
-#define SET(a, b, c, d, k, s, Ti)\
-  t = a + I(b,c,d) + X[k] + Ti;\
-  a = ROTATE_LEFT(t, s) + b
-     /* Do the following 16 operations. */
-    SET(a, b, c, d,  0,  6, T49);
-    SET(d, a, b, c,  7, 10, T50);
-    SET(c, d, a, b, 14, 15, T51);
-    SET(b, c, d, a,  5, 21, T52);
-    SET(a, b, c, d, 12,  6, T53);
-    SET(d, a, b, c,  3, 10, T54);
-    SET(c, d, a, b, 10, 15, T55);
-    SET(b, c, d, a,  1, 21, T56);
-    SET(a, b, c, d,  8,  6, T57);
-    SET(d, a, b, c, 15, 10, T58);
-    SET(c, d, a, b,  6, 15, T59);
-    SET(b, c, d, a, 13, 21, T60);
-    SET(a, b, c, d,  4,  6, T61);
-    SET(d, a, b, c, 11, 10, T62);
-    SET(c, d, a, b,  2, 15, T63);
-    SET(b, c, d, a,  9, 21, T64);
-#undef SET
-
-     /* Then perform the following additions. (That is increment each
-        of the four registers by the value it had before this block
-        was started.) */
-    pms->abcd[0] += a;
-    pms->abcd[1] += b;
-    pms->abcd[2] += c;
-    pms->abcd[3] += d;
-}
-
-void
-md5_init(md5_state_t *pms)
-{
-    pms->count[0] = pms->count[1] = 0;
-    pms->abcd[0] = 0x67452301;
-    pms->abcd[1] = /*0xefcdab89*/ T_MASK ^ 0x10325476;
-    pms->abcd[2] = /*0x98badcfe*/ T_MASK ^ 0x67452301;
-    pms->abcd[3] = 0x10325476;
-}
-
-void
-md5_append(md5_state_t *pms, const md5_byte_t *data, int nbytes)
-{
-    const md5_byte_t *p = data;
-    int left = nbytes;
-    int offset = (pms->count[0] >> 3) & 63;
-    md5_word_t nbits = (md5_word_t)(nbytes << 3);
-
-    if (nbytes <= 0)
-	return;
-
-    /* Update the message length. */
-    pms->count[1] += nbytes >> 29;
-    pms->count[0] += nbits;
-    if (pms->count[0] < nbits)
-	pms->count[1]++;
-
-    /* Process an initial partial block. */
-    if (offset) {
-	int copy = (offset + nbytes > 64 ? 64 - offset : nbytes);
-
-	memcpy(pms->buf + offset, p, copy);
-	if (offset + copy < 64)
-	    return;
-	p += copy;
-	left -= copy;
-	md5_process(pms, pms->buf);
-    }
-
-    /* Process full blocks. */
-    for (; left >= 64; p += 64, left -= 64)
-	md5_process(pms, p);
-
-    /* Process a final partial block. */
-    if (left)
-	memcpy(pms->buf, p, left);
-}
-
-void
-md5_finish(md5_state_t *pms, md5_byte_t digest[16])
-{
-    static const md5_byte_t pad[64] = {
-	0x80, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
-	0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
-	0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
-	0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
-    };
-    md5_byte_t data[8];
-    int i;
-
-    /* Save the length before padding. */
-    for (i = 0; i < 8; ++i)
-	data[i] = (md5_byte_t)(pms->count[i >> 2] >> ((i & 3) << 3));
-    /* Pad to 56 bytes mod 64. */
-    md5_append(pms, pad, ((55 - (pms->count[0] >> 3)) & 63) + 1);
-    /* Append the length. */
-    md5_append(pms, data, 8);
-    for (i = 0; i < 16; ++i)
-	digest[i] = (md5_byte_t)(pms->abcd[i >> 2] >> ((i & 3) << 3));
-}
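The tail of md5_finish — pad with a 0x80 byte and zeros until the length is 56 mod 64, then append the original bit length as 8 little-endian bytes — can be written standalone (an illustration, assuming the bit length fits in 64 bits):

```python
def md5_padding(nbytes):
    # Pad bytes md5_finish appends for a message of nbytes bytes:
    # 0x80, zeros to 56 mod 64, then the bit length, little-endian.
    pad_len = ((55 - nbytes) % 64) + 1
    pad = b"\x80" + b"\x00" * (pad_len - 1)
    return pad + (nbytes * 8).to_bytes(8, "little")
```

The `((55 - n) % 64) + 1` form guarantees between 1 and 64 pad bytes, so the total message always ends on a 64-byte block boundary.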
diff --git a/distutils2/_backport/md5.h b/distutils2/_backport/md5.h
deleted file mode 100644
--- a/distutils2/_backport/md5.h
+++ /dev/null
@@ -1,91 +0,0 @@
-/*
-  Copyright (C) 1999, 2002 Aladdin Enterprises.  All rights reserved.
-
-  This software is provided 'as-is', without any express or implied
-  warranty.  In no event will the authors be held liable for any damages
-  arising from the use of this software.
-
-  Permission is granted to anyone to use this software for any purpose,
-  including commercial applications, and to alter it and redistribute it
-  freely, subject to the following restrictions:
-
-  1. The origin of this software must not be misrepresented; you must not
-     claim that you wrote the original software. If you use this software
-     in a product, an acknowledgment in the product documentation would be
-     appreciated but is not required.
-  2. Altered source versions must be plainly marked as such, and must not be
-     misrepresented as being the original software.
-  3. This notice may not be removed or altered from any source distribution.
-
-  L. Peter Deutsch
-  ghost at aladdin.com
-
- */
-/* $Id: md5.h 43594 2006-04-03 16:27:50Z matthias.klose $ */
-/*
-  Independent implementation of MD5 (RFC 1321).
-
-  This code implements the MD5 Algorithm defined in RFC 1321, whose
-  text is available at
-	http://www.ietf.org/rfc/rfc1321.txt
-  The code is derived from the text of the RFC, including the test suite
-  (section A.5) but excluding the rest of Appendix A.  It does not include
-  any code or documentation that is identified in the RFC as being
-  copyrighted.
-
-  The original and principal author of md5.h is L. Peter Deutsch
-  <ghost at aladdin.com>.  Other authors are noted in the change history
-  that follows (in reverse chronological order):
-
-  2002-04-13 lpd Removed support for non-ANSI compilers; removed
-	references to Ghostscript; clarified derivation from RFC 1321;
-	now handles byte order either statically or dynamically.
-  1999-11-04 lpd Edited comments slightly for automatic TOC extraction.
-  1999-10-18 lpd Fixed typo in header comment (ansi2knr rather than md5);
-	added conditionalization for C++ compilation from Martin
-	Purschke <purschke at bnl.gov>.
-  1999-05-03 lpd Original version.
- */
-
-#ifndef md5_INCLUDED
-#  define md5_INCLUDED
-
-/*
- * This package supports both compile-time and run-time determination of CPU
- * byte order.  If ARCH_IS_BIG_ENDIAN is defined as 0, the code will be
- * compiled to run only on little-endian CPUs; if ARCH_IS_BIG_ENDIAN is
- * defined as non-zero, the code will be compiled to run only on big-endian
- * CPUs; if ARCH_IS_BIG_ENDIAN is not defined, the code will be compiled to
- * run on either big- or little-endian CPUs, but will run slightly less
- * efficiently on either one than if ARCH_IS_BIG_ENDIAN is defined.
- */
-
-typedef unsigned char md5_byte_t; /* 8-bit byte */
-typedef unsigned int md5_word_t; /* 32-bit word */
-
-/* Define the state of the MD5 Algorithm. */
-typedef struct md5_state_s {
-    md5_word_t count[2];	/* message length in bits, lsw first */
-    md5_word_t abcd[4];		/* digest buffer */
-    md5_byte_t buf[64];		/* accumulate block */
-} md5_state_t;
-
-#ifdef __cplusplus
-extern "C" 
-{
-#endif
-
-/* Initialize the algorithm. */
-void md5_init(md5_state_t *pms);
-
-/* Append a string to the message. */
-void md5_append(md5_state_t *pms, const md5_byte_t *data, int nbytes);
-
-/* Finish the message and return the digest. */
-void md5_finish(md5_state_t *pms, md5_byte_t digest[16]);
-
-#ifdef __cplusplus
-}  /* end extern "C" */
-#endif
-
-#endif /* md5_INCLUDED */
diff --git a/distutils2/_backport/md5module.c b/distutils2/_backport/md5module.c
deleted file mode 100644
--- a/distutils2/_backport/md5module.c
+++ /dev/null
@@ -1,312 +0,0 @@
-
-/* MD5 module */
-
-/* This module provides an interface to the RSA Data Security,
-   Inc. MD5 Message-Digest Algorithm, described in RFC 1321.
-   It requires the files md5.c and md5.h (which are slightly changed
-   from the versions in the RFC to avoid the "global.h" file). */
-
-
-/* MD5 objects */
-
-#include "Python.h"
-#include "structmember.h"
-#include "md5.h"
-
-typedef struct {
-	PyObject_HEAD
-        md5_state_t	md5;		/* the context holder */
-} md5object;
-
-static PyTypeObject MD5type;
-
-#define is_md5object(v)		((v)->ob_type == &MD5type)
-
-static md5object *
-newmd5object(void)
-{
-	md5object *md5p;
-
-	md5p = PyObject_New(md5object, &MD5type);
-	if (md5p == NULL)
-		return NULL;
-
-	md5_init(&md5p->md5);	/* actual initialisation */
-	return md5p;
-}
-
-
-/* MD5 methods */
-
-static void
-md5_dealloc(md5object *md5p)
-{
-	PyObject_Del(md5p);
-}
-
-
-/* MD5 methods-as-attributes */
-
-static PyObject *
-md5_update(md5object *self, PyObject *args)
-{
-	unsigned char *cp;
-	int len;
-
-	if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
-		return NULL;
-
-	md5_append(&self->md5, cp, len);
-
-	Py_INCREF(Py_None);
-	return Py_None;
-}
-
-PyDoc_STRVAR(update_doc,
-"update (arg)\n\
-\n\
-Update the md5 object with the string arg. Repeated calls are\n\
-equivalent to a single call with the concatenation of all the\n\
-arguments.");
-
-
-static PyObject *
-md5_digest(md5object *self)
-{
- 	md5_state_t mdContext;
-	unsigned char aDigest[16];
-
-	/* make a temporary copy, and perform the final */
-	mdContext = self->md5;
-	md5_finish(&mdContext, aDigest);
-
-	return PyString_FromStringAndSize((char *)aDigest, 16);
-}
-
-PyDoc_STRVAR(digest_doc,
-"digest() -> string\n\
-\n\
-Return the digest of the strings passed to the update() method so\n\
-far. This is a 16-byte string which may contain non-ASCII characters,\n\
-including null bytes.");
-
-
-static PyObject *
-md5_hexdigest(md5object *self)
-{
- 	md5_state_t mdContext;
-	unsigned char digest[16];
-	unsigned char hexdigest[32];
-	int i, j;
-
-	/* make a temporary copy, and perform the final */
-	mdContext = self->md5;
-	md5_finish(&mdContext, digest);
-
-	/* Make hex version of the digest */
-	for(i=j=0; i<16; i++) {
-		char c;
-		c = (digest[i] >> 4) & 0xf;
-		c = (c>9) ? c+'a'-10 : c + '0';
-		hexdigest[j++] = c;
-		c = (digest[i] & 0xf);
-		c = (c>9) ? c+'a'-10 : c + '0';
-		hexdigest[j++] = c;
-	}
-	return PyString_FromStringAndSize((char*)hexdigest, 32);
-}
-
-
-PyDoc_STRVAR(hexdigest_doc,
-"hexdigest() -> string\n\
-\n\
-Like digest(), but returns the digest as a string of hexadecimal digits.");
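The nibble loop in md5_hexdigest above is just a lowercase hex encoding, high nibble first. A Python rendering of the same loop, checkable against binascii:

```python
import binascii

def to_hex(digest):
    # Emit the high nibble then the low nibble of each byte, in the
    # same order as the C loop in md5_hexdigest.
    out = []
    for byte in digest:
        for c in ((byte >> 4) & 0xF, byte & 0xF):
            out.append(chr(c + ord("a") - 10) if c > 9 else chr(c + ord("0")))
    return "".join(out)
```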
-
-
-static PyObject *
-md5_copy(md5object *self)
-{
-	md5object *md5p;
-
-	if ((md5p = newmd5object()) == NULL)
-		return NULL;
-
-	md5p->md5 = self->md5;
-
-	return (PyObject *)md5p;
-}
-
-PyDoc_STRVAR(copy_doc,
-"copy() -> md5 object\n\
-\n\
-Return a copy (``clone'') of the md5 object.");
-
-
-static PyMethodDef md5_methods[] = {
-	{"update",    (PyCFunction)md5_update,    METH_VARARGS, update_doc},
-	{"digest",    (PyCFunction)md5_digest,    METH_NOARGS,  digest_doc},
-	{"hexdigest", (PyCFunction)md5_hexdigest, METH_NOARGS,  hexdigest_doc},
-	{"copy",      (PyCFunction)md5_copy,      METH_NOARGS,  copy_doc},
-	{NULL, NULL}			     /* sentinel */
-};
-
-static PyObject *
-md5_get_block_size(PyObject *self, void *closure)
-{
-    return PyInt_FromLong(64);
-}
-
-static PyObject *
-md5_get_digest_size(PyObject *self, void *closure)
-{
-    return PyInt_FromLong(16);
-}
-
-static PyObject *
-md5_get_name(PyObject *self, void *closure)
-{
-    return PyString_FromStringAndSize("MD5", 3);
-}
-
-static PyGetSetDef md5_getseters[] = {
-    {"digest_size",
-     (getter)md5_get_digest_size, NULL,
-     NULL,
-     NULL},
-    {"block_size",
-     (getter)md5_get_block_size, NULL,
-     NULL,
-     NULL},
-    {"name",
-     (getter)md5_get_name, NULL,
-     NULL,
-     NULL},
-    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
-     * the old sha module also supported 'digestsize'.  ugh. */
-    {"digestsize",
-     (getter)md5_get_digest_size, NULL,
-     NULL,
-     NULL},
-    {NULL}  /* Sentinel */
-};
-
-
-PyDoc_STRVAR(module_doc,
-"This module implements the interface to RSA's MD5 message digest\n\
-algorithm (see also Internet RFC 1321). Its use is quite\n\
-straightforward: use new() to create an md5 object. You can now\n\
-feed this object with arbitrary strings using the update() method, and\n\
-at any point you can ask it for the digest (a strong kind of 128-bit\n\
-checksum, a.k.a. ``fingerprint'') of the concatenation of the strings\n\
-fed to it so far using the digest() method.\n\
-\n\
-Functions:\n\
-\n\
-new([arg]) -- return a new md5 object, initialized with arg if provided\n\
-md5([arg]) -- DEPRECATED, same as new, but for compatibility\n\
-\n\
-Special Objects:\n\
-\n\
-MD5Type -- type object for md5 objects");
-
-PyDoc_STRVAR(md5type_doc,
-"An md5 represents the object used to calculate the MD5 checksum of a\n\
-string of information.\n\
-\n\
-Methods:\n\
-\n\
-update() -- updates the current digest with an additional string\n\
-digest() -- return the current digest value\n\
-hexdigest() -- return the current digest as a string of hexadecimal digits\n\
-copy() -- return a copy of the current md5 object");
-
-static PyTypeObject MD5type = {
-	PyObject_HEAD_INIT(NULL)
-	0,			  /*ob_size*/
-	"_md5.md5",		  /*tp_name*/
-	sizeof(md5object),	  /*tp_size*/
-	0,			  /*tp_itemsize*/
-	/* methods */
-	(destructor)md5_dealloc,  /*tp_dealloc*/
-	0,			  /*tp_print*/
-	0,                        /*tp_getattr*/
-	0,			  /*tp_setattr*/
-	0,			  /*tp_compare*/
-	0,			  /*tp_repr*/
-        0,			  /*tp_as_number*/
-	0,                        /*tp_as_sequence*/
-	0,			  /*tp_as_mapping*/
-	0, 			  /*tp_hash*/
-	0,			  /*tp_call*/
-	0,			  /*tp_str*/
-	0,			  /*tp_getattro*/
-	0,			  /*tp_setattro*/
-	0,	                  /*tp_as_buffer*/
-	Py_TPFLAGS_DEFAULT,	  /*tp_flags*/
-	md5type_doc,		  /*tp_doc*/
-        0,                        /*tp_traverse*/
-        0,			  /*tp_clear*/
-        0,			  /*tp_richcompare*/
-        0,			  /*tp_weaklistoffset*/
-        0,			  /*tp_iter*/
-        0,			  /*tp_iternext*/
-        md5_methods,	          /*tp_methods*/
-        0,      	          /*tp_members*/
-        md5_getseters,            /*tp_getset*/
-};
-
-
-/* MD5 functions */
-
-static PyObject *
-MD5_new(PyObject *self, PyObject *args)
-{
-	md5object *md5p;
-	unsigned char *cp = NULL;
-	int len = 0;
-
-	if (!PyArg_ParseTuple(args, "|s#:new", &cp, &len))
-		return NULL;
-
-	if ((md5p = newmd5object()) == NULL)
-		return NULL;
-
-	if (cp)
-		md5_append(&md5p->md5, cp, len);
-
-	return (PyObject *)md5p;
-}
-
-PyDoc_STRVAR(new_doc,
-"new([arg]) -> md5 object\n\
-\n\
-Return a new md5 object. If arg is present, the method call update(arg)\n\
-is made.");
-
-
-/* List of functions exported by this module */
-
-static PyMethodDef md5_functions[] = {
-	{"new",		(PyCFunction)MD5_new, METH_VARARGS, new_doc},
-	{NULL,		NULL}	/* Sentinel */
-};
-
-
-/* Initialize this module. */
-
-PyMODINIT_FUNC
-init_md5(void)
-{
-	PyObject *m, *d;
-
-        MD5type.ob_type = &PyType_Type;
-        if (PyType_Ready(&MD5type) < 0)
-            return;
-	m = Py_InitModule3("_md5", md5_functions, module_doc);
-	if (m == NULL)
-	    return;
-	d = PyModule_GetDict(m);
-	PyDict_SetItemString(d, "MD5Type", (PyObject *)&MD5type);
-	PyModule_AddIntConstant(m, "digest_size", 16);
-	/* No need to check the error here, the caller will do that */
-}
diff --git a/distutils2/_backport/sha256module.c b/distutils2/_backport/sha256module.c
deleted file mode 100644
--- a/distutils2/_backport/sha256module.c
+++ /dev/null
@@ -1,701 +0,0 @@
-/* SHA256 module */
-
-/* This module provides an interface to NIST's SHA-256 and SHA-224 Algorithms */
-
-/* See below for information about the original code this module was
-   based upon. Additional work performed by:
-
-   Andrew Kuchling (amk at amk.ca)
-   Greg Stein (gstein at lyra.org)
-   Trevor Perrin (trevp at trevp.net)
-
-   Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
-   Licensed to PSF under a Contributor Agreement.
-
-*/
-
-/* SHA objects */
-
-#include "Python.h"
-#include "structmember.h"
-
-
-/* Endianness testing and definitions */
-#define TestEndianness(variable) {int i=1; variable=PCT_BIG_ENDIAN;\
-	if (*((char*)&i)==1) variable=PCT_LITTLE_ENDIAN;}
-
-#define PCT_LITTLE_ENDIAN 1
-#define PCT_BIG_ENDIAN 0
-
-/* Some useful types */
-
-typedef unsigned char SHA_BYTE;
-
-#if SIZEOF_INT == 4
-typedef unsigned int SHA_INT32;	/* 32-bit integer */
-#else
-/* not defined. compilation will die. */
-#endif
-
-/* The SHA block size and message digest sizes, in bytes */
-
-#define SHA_BLOCKSIZE    64
-#define SHA_DIGESTSIZE  32
-
-/* The structure for storing SHA info */
-
-typedef struct {
-    PyObject_HEAD
-    SHA_INT32 digest[8];		/* Message digest */
-    SHA_INT32 count_lo, count_hi;	/* 64-bit bit count */
-    SHA_BYTE data[SHA_BLOCKSIZE];	/* SHA data buffer */
-    int Endianness;
-    int local;				/* unprocessed amount in data */
-    int digestsize;
-} SHAobject;
-
-/* When run on a little-endian CPU we need to perform byte reversal on an
-   array of longwords. */
-
-static void longReverse(SHA_INT32 *buffer, int byteCount, int Endianness)
-{
-    SHA_INT32 value;
-
-    if ( Endianness == PCT_BIG_ENDIAN )
-	return;
-
-    byteCount /= sizeof(*buffer);
-    while (byteCount--) {
-        value = *buffer;
-        value = ( ( value & 0xFF00FF00L ) >> 8  ) | \
-                ( ( value & 0x00FF00FFL ) << 8 );
-        *buffer++ = ( value << 16 ) | ( value >> 16 );
-    }
-}
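`longReverse` above byte-reverses each 32-bit word in two steps: it swaps adjacent bytes with masks, then exchanges the 16-bit halves with a rotate. A hedged Python sketch of the same per-word swap (the function name is illustrative, not from the module):

```python
def swap32(value: int) -> int:
    """Byte-reverse one 32-bit word using the same two-step trick as
    longReverse: swap adjacent bytes, then swap the 16-bit halves."""
    value = ((value & 0xFF00FF00) >> 8) | ((value & 0x00FF00FF) << 8)
    return ((value << 16) | (value >> 16)) & 0xFFFFFFFF

print(hex(swap32(0x12345678)))  # -> 0x78563412
```

Applying the swap twice is the identity, which is why the transform can run the same code on input and output buffers regardless of host endianness.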
-
-static void SHAcopy(SHAobject *src, SHAobject *dest)
-{
-    dest->Endianness = src->Endianness;
-    dest->local = src->local;
-    dest->digestsize = src->digestsize;
-    dest->count_lo = src->count_lo;
-    dest->count_hi = src->count_hi;
-    memcpy(dest->digest, src->digest, sizeof(src->digest));
-    memcpy(dest->data, src->data, sizeof(src->data));
-}
-
-
-/* ------------------------------------------------------------------------
- *
- * This code for the SHA-256 algorithm was noted as public domain. The
- * original headers are pasted below.
- *
- * Several changes have been made to make it more compatible with the
- * Python environment and desired interface.
- *
- */
-
-/* LibTomCrypt, modular cryptographic library -- Tom St Denis
- *
- * LibTomCrypt is a library that provides various cryptographic
- * algorithms in a highly modular and flexible manner.
- *
- * The library is free for all purposes without any express
- * guarantee it works.
- *
- * Tom St Denis, tomstdenis at iahu.ca, http://libtomcrypt.org
- */
-
-
-/* SHA256 by Tom St Denis */
-
-/* Various logical functions */
-#define ROR(x, y)\
-( ((((unsigned long)(x)&0xFFFFFFFFUL)>>(unsigned long)((y)&31)) | \
-((unsigned long)(x)<<(unsigned long)(32-((y)&31)))) & 0xFFFFFFFFUL)
-#define Ch(x,y,z)       (z ^ (x & (y ^ z)))
-#define Maj(x,y,z)      (((x | y) & z) | (x & y)) 
-#define S(x, n)         ROR((x),(n))
-#define R(x, n)         (((x)&0xFFFFFFFFUL)>>(n))
-#define Sigma0(x)       (S(x, 2) ^ S(x, 13) ^ S(x, 22))
-#define Sigma1(x)       (S(x, 6) ^ S(x, 11) ^ S(x, 25))
-#define Gamma0(x)       (S(x, 7) ^ S(x, 18) ^ R(x, 3))
-#define Gamma1(x)       (S(x, 17) ^ S(x, 19) ^ R(x, 10))
-
-
-static void
-sha_transform(SHAobject *sha_info)
-{
-    int i;
-	SHA_INT32 S[8], W[64], t0, t1;
-
-    memcpy(W, sha_info->data, sizeof(sha_info->data));
-    longReverse(W, (int)sizeof(sha_info->data), sha_info->Endianness);
-
-    for (i = 16; i < 64; ++i) {
-		W[i] = Gamma1(W[i - 2]) + W[i - 7] + Gamma0(W[i - 15]) + W[i - 16];
-    }
-    for (i = 0; i < 8; ++i) {
-        S[i] = sha_info->digest[i];
-    }
-
-    /* Compress */
-#define RND(a,b,c,d,e,f,g,h,i,ki)                    \
-     t0 = h + Sigma1(e) + Ch(e, f, g) + ki + W[i];   \
-     t1 = Sigma0(a) + Maj(a, b, c);                  \
-     d += t0;                                        \
-     h  = t0 + t1;
-
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],0,0x428a2f98);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],1,0x71374491);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],2,0xb5c0fbcf);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],3,0xe9b5dba5);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],4,0x3956c25b);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],5,0x59f111f1);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],6,0x923f82a4);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],7,0xab1c5ed5);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],8,0xd807aa98);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],9,0x12835b01);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],10,0x243185be);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],11,0x550c7dc3);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],12,0x72be5d74);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],13,0x80deb1fe);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],14,0x9bdc06a7);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],15,0xc19bf174);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],16,0xe49b69c1);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],17,0xefbe4786);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],18,0x0fc19dc6);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],19,0x240ca1cc);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],20,0x2de92c6f);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],21,0x4a7484aa);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],22,0x5cb0a9dc);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],23,0x76f988da);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],24,0x983e5152);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],25,0xa831c66d);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],26,0xb00327c8);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],27,0xbf597fc7);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],28,0xc6e00bf3);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],29,0xd5a79147);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],30,0x06ca6351);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],31,0x14292967);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],32,0x27b70a85);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],33,0x2e1b2138);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],34,0x4d2c6dfc);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],35,0x53380d13);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],36,0x650a7354);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],37,0x766a0abb);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],38,0x81c2c92e);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],39,0x92722c85);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],40,0xa2bfe8a1);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],41,0xa81a664b);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],42,0xc24b8b70);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],43,0xc76c51a3);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],44,0xd192e819);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],45,0xd6990624);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],46,0xf40e3585);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],47,0x106aa070);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],48,0x19a4c116);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],49,0x1e376c08);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],50,0x2748774c);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],51,0x34b0bcb5);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],52,0x391c0cb3);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],53,0x4ed8aa4a);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],54,0x5b9cca4f);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],55,0x682e6ff3);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],56,0x748f82ee);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],57,0x78a5636f);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],58,0x84c87814);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],59,0x8cc70208);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],60,0x90befffa);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],61,0xa4506ceb);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],62,0xbef9a3f7);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],63,0xc67178f2);
-
-#undef RND     
-    
-    /* feedback */
-    for (i = 0; i < 8; i++) {
-        sha_info->digest[i] = sha_info->digest[i] + S[i];
-    }
-
-}
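`sha_transform` first expands the 16 input words into a 64-word message schedule using the `Gamma0`/`Gamma1` mixing functions, then runs 64 `RND` rounds. The expansion step can be sketched in Python as follows (a sketch of the recurrence only, with illustrative names; all arithmetic is masked to 32 bits as the C code's unsigned types guarantee):

```python
MASK = 0xFFFFFFFF

def ror(x, n):
    # 32-bit rotate right, the ROR macro above
    return ((x >> n) | (x << (32 - n))) & MASK

def gamma0(x):
    return ror(x, 7) ^ ror(x, 18) ^ (x >> 3)

def gamma1(x):
    return ror(x, 17) ^ ror(x, 19) ^ (x >> 10)

def expand(w16):
    """Expand 16 message words into the 64-word SHA-256 schedule:
    W[i] = Gamma1(W[i-2]) + W[i-7] + Gamma0(W[i-15]) + W[i-16]."""
    w = list(w16)
    for i in range(16, 64):
        w.append((gamma1(w[i - 2]) + w[i - 7] + gamma0(w[i - 15]) + w[i - 16]) & MASK)
    return w
```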
-
-
-
-/* initialize the SHA digest */
-
-static void
-sha_init(SHAobject *sha_info)
-{
-    TestEndianness(sha_info->Endianness)
-    sha_info->digest[0] = 0x6A09E667L;
-    sha_info->digest[1] = 0xBB67AE85L;
-    sha_info->digest[2] = 0x3C6EF372L;
-    sha_info->digest[3] = 0xA54FF53AL;
-    sha_info->digest[4] = 0x510E527FL;
-    sha_info->digest[5] = 0x9B05688CL;
-    sha_info->digest[6] = 0x1F83D9ABL;
-    sha_info->digest[7] = 0x5BE0CD19L;
-    sha_info->count_lo = 0L;
-    sha_info->count_hi = 0L;
-    sha_info->local = 0;
-    sha_info->digestsize = 32;
-}
-
-static void
-sha224_init(SHAobject *sha_info)
-{
-    TestEndianness(sha_info->Endianness)
-    sha_info->digest[0] = 0xc1059ed8L;
-    sha_info->digest[1] = 0x367cd507L;
-    sha_info->digest[2] = 0x3070dd17L;
-    sha_info->digest[3] = 0xf70e5939L;
-    sha_info->digest[4] = 0xffc00b31L;
-    sha_info->digest[5] = 0x68581511L;
-    sha_info->digest[6] = 0x64f98fa7L;
-    sha_info->digest[7] = 0xbefa4fa4L;
-    sha_info->count_lo = 0L;
-    sha_info->count_hi = 0L;
-    sha_info->local = 0;
-    sha_info->digestsize = 28;
-}
-
-
-/* update the SHA digest */
-
-static void
-sha_update(SHAobject *sha_info, SHA_BYTE *buffer, int count)
-{
-    int i;
-    SHA_INT32 clo;
-
-    clo = sha_info->count_lo + ((SHA_INT32) count << 3);
-    if (clo < sha_info->count_lo) {
-        ++sha_info->count_hi;
-    }
-    sha_info->count_lo = clo;
-    sha_info->count_hi += (SHA_INT32) count >> 29;
-    if (sha_info->local) {
-        i = SHA_BLOCKSIZE - sha_info->local;
-        if (i > count) {
-            i = count;
-        }
-        memcpy(((SHA_BYTE *) sha_info->data) + sha_info->local, buffer, i);
-        count -= i;
-        buffer += i;
-        sha_info->local += i;
-        if (sha_info->local == SHA_BLOCKSIZE) {
-            sha_transform(sha_info);
-        }
-        else {
-            return;
-        }
-    }
-    while (count >= SHA_BLOCKSIZE) {
-        memcpy(sha_info->data, buffer, SHA_BLOCKSIZE);
-        buffer += SHA_BLOCKSIZE;
-        count -= SHA_BLOCKSIZE;
-        sha_transform(sha_info);
-    }
-    memcpy(sha_info->data, buffer, count);
-    sha_info->local = count;
-}
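The buffering strategy in `sha_update` is the classic one: top up any partial block left from the previous call, process full 64-byte blocks directly from the input, and stash the remainder in `data`/`local` for next time. A minimal Python sketch of that strategy (class and parameter names are illustrative):

```python
BLOCKSIZE = 64  # SHA_BLOCKSIZE for SHA-256

class Buffered:
    """Sketch of the sha_update buffering: accumulate bytes and hand
    each complete 64-byte block to a transform callback."""

    def __init__(self, transform):
        self.buf = b""
        self.transform = transform

    def update(self, data: bytes):
        self.buf += data
        while len(self.buf) >= BLOCKSIZE:
            self.transform(self.buf[:BLOCKSIZE])
            self.buf = self.buf[BLOCKSIZE:]
```

The C version avoids the copies this sketch makes by transforming full blocks straight out of the caller's buffer; only the tail is memcpy'd.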
-
-/* finish computing the SHA digest */
-
-static void
-sha_final(unsigned char digest[SHA_DIGESTSIZE], SHAobject *sha_info)
-{
-    int count;
-    SHA_INT32 lo_bit_count, hi_bit_count;
-
-    lo_bit_count = sha_info->count_lo;
-    hi_bit_count = sha_info->count_hi;
-    count = (int) ((lo_bit_count >> 3) & 0x3f);
-    ((SHA_BYTE *) sha_info->data)[count++] = 0x80;
-    if (count > SHA_BLOCKSIZE - 8) {
-	memset(((SHA_BYTE *) sha_info->data) + count, 0,
-	       SHA_BLOCKSIZE - count);
-	sha_transform(sha_info);
-	memset((SHA_BYTE *) sha_info->data, 0, SHA_BLOCKSIZE - 8);
-    }
-    else {
-	memset(((SHA_BYTE *) sha_info->data) + count, 0,
-	       SHA_BLOCKSIZE - 8 - count);
-    }
-
-    /* GJS: note that we add the hi/lo in big-endian. sha_transform will
-       swap these values into host-order. */
-    sha_info->data[56] = (hi_bit_count >> 24) & 0xff;
-    sha_info->data[57] = (hi_bit_count >> 16) & 0xff;
-    sha_info->data[58] = (hi_bit_count >>  8) & 0xff;
-    sha_info->data[59] = (hi_bit_count >>  0) & 0xff;
-    sha_info->data[60] = (lo_bit_count >> 24) & 0xff;
-    sha_info->data[61] = (lo_bit_count >> 16) & 0xff;
-    sha_info->data[62] = (lo_bit_count >>  8) & 0xff;
-    sha_info->data[63] = (lo_bit_count >>  0) & 0xff;
-    sha_transform(sha_info);
-    digest[ 0] = (unsigned char) ((sha_info->digest[0] >> 24) & 0xff);
-    digest[ 1] = (unsigned char) ((sha_info->digest[0] >> 16) & 0xff);
-    digest[ 2] = (unsigned char) ((sha_info->digest[0] >>  8) & 0xff);
-    digest[ 3] = (unsigned char) ((sha_info->digest[0]      ) & 0xff);
-    digest[ 4] = (unsigned char) ((sha_info->digest[1] >> 24) & 0xff);
-    digest[ 5] = (unsigned char) ((sha_info->digest[1] >> 16) & 0xff);
-    digest[ 6] = (unsigned char) ((sha_info->digest[1] >>  8) & 0xff);
-    digest[ 7] = (unsigned char) ((sha_info->digest[1]      ) & 0xff);
-    digest[ 8] = (unsigned char) ((sha_info->digest[2] >> 24) & 0xff);
-    digest[ 9] = (unsigned char) ((sha_info->digest[2] >> 16) & 0xff);
-    digest[10] = (unsigned char) ((sha_info->digest[2] >>  8) & 0xff);
-    digest[11] = (unsigned char) ((sha_info->digest[2]      ) & 0xff);
-    digest[12] = (unsigned char) ((sha_info->digest[3] >> 24) & 0xff);
-    digest[13] = (unsigned char) ((sha_info->digest[3] >> 16) & 0xff);
-    digest[14] = (unsigned char) ((sha_info->digest[3] >>  8) & 0xff);
-    digest[15] = (unsigned char) ((sha_info->digest[3]      ) & 0xff);
-    digest[16] = (unsigned char) ((sha_info->digest[4] >> 24) & 0xff);
-    digest[17] = (unsigned char) ((sha_info->digest[4] >> 16) & 0xff);
-    digest[18] = (unsigned char) ((sha_info->digest[4] >>  8) & 0xff);
-    digest[19] = (unsigned char) ((sha_info->digest[4]      ) & 0xff);
-    digest[20] = (unsigned char) ((sha_info->digest[5] >> 24) & 0xff);
-    digest[21] = (unsigned char) ((sha_info->digest[5] >> 16) & 0xff);
-    digest[22] = (unsigned char) ((sha_info->digest[5] >>  8) & 0xff);
-    digest[23] = (unsigned char) ((sha_info->digest[5]      ) & 0xff);
-    digest[24] = (unsigned char) ((sha_info->digest[6] >> 24) & 0xff);
-    digest[25] = (unsigned char) ((sha_info->digest[6] >> 16) & 0xff);
-    digest[26] = (unsigned char) ((sha_info->digest[6] >>  8) & 0xff);
-    digest[27] = (unsigned char) ((sha_info->digest[6]      ) & 0xff);
-    digest[28] = (unsigned char) ((sha_info->digest[7] >> 24) & 0xff);
-    digest[29] = (unsigned char) ((sha_info->digest[7] >> 16) & 0xff);
-    digest[30] = (unsigned char) ((sha_info->digest[7] >>  8) & 0xff);
-    digest[31] = (unsigned char) ((sha_info->digest[7]      ) & 0xff);
-}
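`sha_final` performs standard Merkle-Damgard padding: append a `0x80` byte, zero-fill so exactly 8 bytes remain in the last block, write the 64-bit bit count big-endian at `data[56..63]` (padding a whole extra block when the `0x80` byte leaves too little room), and run one last transform. A sketch that computes just the padding bytes (the helper name is illustrative):

```python
def md_padding(message_len: int, blocksize: int = 64) -> bytes:
    """Sketch of the padding sha_final constructs: 0x80, enough zeros
    to leave room for the length field, then the 64-bit bit count in
    big-endian order."""
    zeros = (blocksize - 9 - message_len % blocksize) % blocksize
    return b"\x80" + b"\x00" * zeros + (message_len * 8).to_bytes(8, "big")
```

For a 56-byte message the `0x80` byte collides with the length field, so the padding spans a second block, matching the first branch of the `if` in `sha_final`.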
-
-/*
- * End of copied SHA code.
- *
- * ------------------------------------------------------------------------
- */
-
-static PyTypeObject SHA224type;
-static PyTypeObject SHA256type;
-
-
-static SHAobject *
-newSHA224object(void)
-{
-    return (SHAobject *)PyObject_New(SHAobject, &SHA224type);
-}
-
-static SHAobject *
-newSHA256object(void)
-{
-    return (SHAobject *)PyObject_New(SHAobject, &SHA256type);
-}
-
-/* Internal methods for a hash object */
-
-static void
-SHA_dealloc(PyObject *ptr)
-{
-    PyObject_Del(ptr);
-}
-
-
-/* External methods for a hash object */
-
-PyDoc_STRVAR(SHA256_copy__doc__, "Return a copy of the hash object.");
-
-static PyObject *
-SHA256_copy(SHAobject *self, PyObject *unused)
-{
-    SHAobject *newobj;
-
-    if (((PyObject*)self)->ob_type == &SHA256type) {
-        if ( (newobj = newSHA256object())==NULL)
-            return NULL;
-    } else {
-        if ( (newobj = newSHA224object())==NULL)
-            return NULL;
-    }
-
-    SHAcopy(self, newobj);
-    return (PyObject *)newobj;
-}
-
-PyDoc_STRVAR(SHA256_digest__doc__,
-"Return the digest value as a string of binary data.");
-
-static PyObject *
-SHA256_digest(SHAobject *self, PyObject *unused)
-{
-    unsigned char digest[SHA_DIGESTSIZE];
-    SHAobject temp;
-
-    SHAcopy(self, &temp);
-    sha_final(digest, &temp);
-    return PyString_FromStringAndSize((const char *)digest, self->digestsize);
-}
-
-PyDoc_STRVAR(SHA256_hexdigest__doc__,
-"Return the digest value as a string of hexadecimal digits.");
-
-static PyObject *
-SHA256_hexdigest(SHAobject *self, PyObject *unused)
-{
-    unsigned char digest[SHA_DIGESTSIZE];
-    SHAobject temp;
-    PyObject *retval;
-    char *hex_digest;
-    int i, j;
-
-    /* Get the raw (binary) digest value */
-    SHAcopy(self, &temp);
-    sha_final(digest, &temp);
-
-    /* Create a new string */
-    retval = PyString_FromStringAndSize(NULL, self->digestsize * 2);
-    if (!retval)
-	    return NULL;
-    hex_digest = PyString_AsString(retval);
-    if (!hex_digest) {
-	    Py_DECREF(retval);
-	    return NULL;
-    }
-
-    /* Make hex version of the digest */
-    for(i=j=0; i<self->digestsize; i++) {
-        char c;
-        c = (digest[i] >> 4) & 0xf;
-	c = (c>9) ? c+'a'-10 : c + '0';
-        hex_digest[j++] = c;
-        c = (digest[i] & 0xf);
-	c = (c>9) ? c+'a'-10 : c + '0';
-        hex_digest[j++] = c;
-    }
-    return retval;
-}
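The loop in `SHA256_hexdigest` converts each digest byte by emitting its high nibble, then its low nibble, as a lowercase hex character. The same nibble-at-a-time conversion can be sketched in Python (the function name is illustrative; on Python 3, `bytes.hex()` does this in one call):

```python
def to_hex(digest: bytes) -> str:
    """Sketch of the SHA256_hexdigest loop: high nibble then low
    nibble of each byte, mapped to '0'-'9' / 'a'-'f'."""
    out = []
    for byte in digest:
        for nibble in ((byte >> 4) & 0xF, byte & 0xF):
            out.append(chr(nibble + ord("a") - 10) if nibble > 9
                       else chr(nibble + ord("0")))
    return "".join(out)

print(to_hex(b"\x00\xab\xff"))  # -> 00abff
```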
-
-PyDoc_STRVAR(SHA256_update__doc__,
-"Update this hash object's state with the provided string.");
-
-static PyObject *
-SHA256_update(SHAobject *self, PyObject *args)
-{
-    unsigned char *cp;
-    int len;
-
-    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
-        return NULL;
-
-    sha_update(self, cp, len);
-
-    Py_INCREF(Py_None);
-    return Py_None;
-}
-
-static PyMethodDef SHA_methods[] = {
-    {"copy",	  (PyCFunction)SHA256_copy,      METH_NOARGS,  SHA256_copy__doc__},
-    {"digest",	  (PyCFunction)SHA256_digest,    METH_NOARGS,  SHA256_digest__doc__},
-    {"hexdigest", (PyCFunction)SHA256_hexdigest, METH_NOARGS,  SHA256_hexdigest__doc__},
-    {"update",	  (PyCFunction)SHA256_update,    METH_VARARGS, SHA256_update__doc__},
-    {NULL,	  NULL}		/* sentinel */
-};
-
-static PyObject *
-SHA256_get_block_size(PyObject *self, void *closure)
-{
-    return PyInt_FromLong(SHA_BLOCKSIZE);
-}
-
-static PyObject *
-SHA256_get_name(PyObject *self, void *closure)
-{
-    if (((SHAobject *)self)->digestsize == 32)
-        return PyString_FromStringAndSize("SHA256", 6);
-    else
-        return PyString_FromStringAndSize("SHA224", 6);
-}
-
-static PyGetSetDef SHA_getseters[] = {
-    {"block_size",
-     (getter)SHA256_get_block_size, NULL,
-     NULL,
-     NULL},
-    {"name",
-     (getter)SHA256_get_name, NULL,
-     NULL,
-     NULL},
-    {NULL}  /* Sentinel */
-};
-
-static PyMemberDef SHA_members[] = {
-    {"digest_size", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
-    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
-     * the old sha module also supported 'digestsize'.  ugh. */
-    {"digestsize", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
-    {NULL}  /* Sentinel */
-};
-
-static PyTypeObject SHA224type = {
-    PyObject_HEAD_INIT(NULL)
-    0,			/*ob_size*/
-    "_sha256.sha224",	/*tp_name*/
-    sizeof(SHAobject),	/*tp_size*/
-    0,			/*tp_itemsize*/
-    /* methods */
-    SHA_dealloc,	/*tp_dealloc*/
-    0,			/*tp_print*/
-    0,          	/*tp_getattr*/
-    0,                  /*tp_setattr*/
-    0,                  /*tp_compare*/
-    0,                  /*tp_repr*/
-    0,                  /*tp_as_number*/
-    0,                  /*tp_as_sequence*/
-    0,                  /*tp_as_mapping*/
-    0,                  /*tp_hash*/
-    0,                  /*tp_call*/
-    0,                  /*tp_str*/
-    0,                  /*tp_getattro*/
-    0,                  /*tp_setattro*/
-    0,                  /*tp_as_buffer*/
-    Py_TPFLAGS_DEFAULT, /*tp_flags*/
-    0,                  /*tp_doc*/
-    0,                  /*tp_traverse*/
-    0,			/*tp_clear*/
-    0,			/*tp_richcompare*/
-    0,			/*tp_weaklistoffset*/
-    0,			/*tp_iter*/
-    0,			/*tp_iternext*/
-    SHA_methods,	/* tp_methods */
-    SHA_members,	/* tp_members */
-    SHA_getseters,      /* tp_getset */
-};
-
-static PyTypeObject SHA256type = {
-    PyObject_HEAD_INIT(NULL)
-    0,			/*ob_size*/
-    "_sha256.sha256",	/*tp_name*/
-    sizeof(SHAobject),	/*tp_size*/
-    0,			/*tp_itemsize*/
-    /* methods */
-    SHA_dealloc,	/*tp_dealloc*/
-    0,			/*tp_print*/
-    0,          	/*tp_getattr*/
-    0,                  /*tp_setattr*/
-    0,                  /*tp_compare*/
-    0,                  /*tp_repr*/
-    0,                  /*tp_as_number*/
-    0,                  /*tp_as_sequence*/
-    0,                  /*tp_as_mapping*/
-    0,                  /*tp_hash*/
-    0,                  /*tp_call*/
-    0,                  /*tp_str*/
-    0,                  /*tp_getattro*/
-    0,                  /*tp_setattro*/
-    0,                  /*tp_as_buffer*/
-    Py_TPFLAGS_DEFAULT, /*tp_flags*/
-    0,                  /*tp_doc*/
-    0,                  /*tp_traverse*/
-    0,			/*tp_clear*/
-    0,			/*tp_richcompare*/
-    0,			/*tp_weaklistoffset*/
-    0,			/*tp_iter*/
-    0,			/*tp_iternext*/
-    SHA_methods,	/* tp_methods */
-    SHA_members,	/* tp_members */
-    SHA_getseters,      /* tp_getset */
-};
-
-
-/* The single module-level function: new() */
-
-PyDoc_STRVAR(SHA256_new__doc__,
-"Return a new SHA-256 hash object; optionally initialized with a string.");
-
-static PyObject *
-SHA256_new(PyObject *self, PyObject *args, PyObject *kwdict)
-{
-    static char *kwlist[] = {"string", NULL};
-    SHAobject *new;
-    unsigned char *cp = NULL;
-    int len;
-
-    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
-                                     &cp, &len)) {
-        return NULL;
-    }
-
-    if ((new = newSHA256object()) == NULL)
-        return NULL;
-
-    sha_init(new);
-
-    if (PyErr_Occurred()) {
-        Py_DECREF(new);
-        return NULL;
-    }
-    if (cp)
-        sha_update(new, cp, len);
-
-    return (PyObject *)new;
-}
-
-PyDoc_STRVAR(SHA224_new__doc__,
-"Return a new SHA-224 hash object; optionally initialized with a string.");
-
-static PyObject *
-SHA224_new(PyObject *self, PyObject *args, PyObject *kwdict)
-{
-    static char *kwlist[] = {"string", NULL};
-    SHAobject *new;
-    unsigned char *cp = NULL;
-    int len;
-
-    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
-                                     &cp, &len)) {
-        return NULL;
-    }
-
-    if ((new = newSHA224object()) == NULL)
-        return NULL;
-
-    sha224_init(new);
-
-    if (PyErr_Occurred()) {
-        Py_DECREF(new);
-        return NULL;
-    }
-    if (cp)
-        sha_update(new, cp, len);
-
-    return (PyObject *)new;
-}
-
-
-/* List of functions exported by this module */
-
-static struct PyMethodDef SHA_functions[] = {
-    {"sha256", (PyCFunction)SHA256_new, METH_VARARGS|METH_KEYWORDS, SHA256_new__doc__},
-    {"sha224", (PyCFunction)SHA224_new, METH_VARARGS|METH_KEYWORDS, SHA224_new__doc__},
-    {NULL,	NULL}		 /* Sentinel */
-};
-
-
-/* Initialize this module. */
-
-#define insint(n,v) { PyModule_AddIntConstant(m,n,v); }
-
-PyMODINIT_FUNC
-init_sha256(void)
-{
-    PyObject *m;
-
-    SHA224type.ob_type = &PyType_Type;
-    if (PyType_Ready(&SHA224type) < 0)
-        return;
-    SHA256type.ob_type = &PyType_Type;
-    if (PyType_Ready(&SHA256type) < 0)
-        return;
-    m = Py_InitModule("_sha256", SHA_functions);
-    if (m == NULL)
-	return;
-}
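The removed `_sha256` backport exported `sha256()` and `sha224()` constructors with the block size and digest sizes defined above. On Python 3 the same objects come from stdlib `hashlib`; a quick check against the standard FIPS 180-2 `"abc"` test vectors:

```python
import hashlib

# hashlib replaces the deleted _sha256 backport on Python 3
h256 = hashlib.sha256(b"abc")
h224 = hashlib.sha224(b"abc")
assert h256.digest_size == 32 and h224.digest_size == 28
assert h256.block_size == 64  # SHA_BLOCKSIZE in the code above
print(h256.hexdigest())
# -> ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
```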
diff --git a/distutils2/_backport/sha512module.c b/distutils2/_backport/sha512module.c
deleted file mode 100644
--- a/distutils2/_backport/sha512module.c
+++ /dev/null
@@ -1,769 +0,0 @@
-/* SHA512 module */
-
-/* This module provides an interface to NIST's SHA-512 and SHA-384 Algorithms */
-
-/* See below for information about the original code this module was
-   based upon. Additional work performed by:
-
-   Andrew Kuchling (amk at amk.ca)
-   Greg Stein (gstein at lyra.org)
-   Trevor Perrin (trevp at trevp.net)
-
-   Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
-   Licensed to PSF under a Contributor Agreement.
-
-*/
-
-/* SHA objects */
-
-#include "Python.h"
-#include "structmember.h"
-
-#ifdef PY_LONG_LONG /* If no PY_LONG_LONG, don't compile anything! */
-
-/* Endianness testing and definitions */
-#define TestEndianness(variable) {int i=1; variable=PCT_BIG_ENDIAN;\
-	if (*((char*)&i)==1) variable=PCT_LITTLE_ENDIAN;}
-
-#define PCT_LITTLE_ENDIAN 1
-#define PCT_BIG_ENDIAN 0
-
-/* Some useful types */
-
-typedef unsigned char SHA_BYTE;
-
-#if SIZEOF_INT == 4
-typedef unsigned int SHA_INT32;	/* 32-bit integer */
-typedef unsigned PY_LONG_LONG SHA_INT64;	/* 64-bit integer */
-#else
-/* not defined. compilation will die. */
-#endif
-
-/* The SHA block size and message digest sizes, in bytes */
-
-#define SHA_BLOCKSIZE   128
-#define SHA_DIGESTSIZE  64
-
-/* The structure for storing SHA info */
-
-typedef struct {
-    PyObject_HEAD
-    SHA_INT64 digest[8];		/* Message digest */
-    SHA_INT32 count_lo, count_hi;	/* 64-bit bit count */
-    SHA_BYTE data[SHA_BLOCKSIZE];	/* SHA data buffer */
-    int Endianness;
-    int local;				/* unprocessed amount in data */
-    int digestsize;
-} SHAobject;
-
-/* When run on a little-endian CPU we need to perform byte reversal on an
-   array of longwords. */
-
-static void longReverse(SHA_INT64 *buffer, int byteCount, int Endianness)
-{
-    SHA_INT64 value;
-
-    if ( Endianness == PCT_BIG_ENDIAN )
-	return;
-
-    byteCount /= sizeof(*buffer);
-    while (byteCount--) {
-        value = *buffer;
-
-		((unsigned char*)buffer)[0] = (unsigned char)(value >> 56) & 0xff;
-		((unsigned char*)buffer)[1] = (unsigned char)(value >> 48) & 0xff;
-		((unsigned char*)buffer)[2] = (unsigned char)(value >> 40) & 0xff;
-		((unsigned char*)buffer)[3] = (unsigned char)(value >> 32) & 0xff;
-		((unsigned char*)buffer)[4] = (unsigned char)(value >> 24) & 0xff;
-		((unsigned char*)buffer)[5] = (unsigned char)(value >> 16) & 0xff;
-		((unsigned char*)buffer)[6] = (unsigned char)(value >>  8) & 0xff;
-		((unsigned char*)buffer)[7] = (unsigned char)(value      ) & 0xff;
-        
-		buffer++;
-    }
-}
-
-static void SHAcopy(SHAobject *src, SHAobject *dest)
-{
-    dest->Endianness = src->Endianness;
-    dest->local = src->local;
-    dest->digestsize = src->digestsize;
-    dest->count_lo = src->count_lo;
-    dest->count_hi = src->count_hi;
-    memcpy(dest->digest, src->digest, sizeof(src->digest));
-    memcpy(dest->data, src->data, sizeof(src->data));
-}
-
-
-/* ------------------------------------------------------------------------
- *
- * This code for the SHA-512 algorithm was noted as public domain. The
- * original headers are pasted below.
- *
- * Several changes have been made to make it more compatible with the
- * Python environment and desired interface.
- *
- */
-
-/* LibTomCrypt, modular cryptographic library -- Tom St Denis
- *
- * LibTomCrypt is a library that provides various cryptographic
- * algorithms in a highly modular and flexible manner.
- *
- * The library is free for all purposes without any express
- * guarantee it works.
- *
- * Tom St Denis, tomstdenis at iahu.ca, http://libtomcrypt.org
- */
-
-
-/* SHA512 by Tom St Denis */
-
-/* Various logical functions */
-#define ROR64(x, y) \
-    ( ((((x) & 0xFFFFFFFFFFFFFFFFULL)>>((unsigned PY_LONG_LONG)(y) & 63)) | \
-      ((x)<<((unsigned PY_LONG_LONG)(64-((y) & 63))))) & 0xFFFFFFFFFFFFFFFFULL)
-#define Ch(x,y,z)       (z ^ (x & (y ^ z)))
-#define Maj(x,y,z)      (((x | y) & z) | (x & y)) 
-#define S(x, n)         ROR64((x),(n))
-#define R(x, n)         (((x) & 0xFFFFFFFFFFFFFFFFULL) >> ((unsigned PY_LONG_LONG)n))
-#define Sigma0(x)       (S(x, 28) ^ S(x, 34) ^ S(x, 39))
-#define Sigma1(x)       (S(x, 14) ^ S(x, 18) ^ S(x, 41))
-#define Gamma0(x)       (S(x, 1) ^ S(x, 8) ^ R(x, 7))
-#define Gamma1(x)       (S(x, 19) ^ S(x, 61) ^ R(x, 6))
-
-
-static void
-sha512_transform(SHAobject *sha_info)
-{
-    int i;
-    SHA_INT64 S[8], W[80], t0, t1;
-
-    memcpy(W, sha_info->data, sizeof(sha_info->data));
-    longReverse(W, (int)sizeof(sha_info->data), sha_info->Endianness);
-
-    for (i = 16; i < 80; ++i) {
-		W[i] = Gamma1(W[i - 2]) + W[i - 7] + Gamma0(W[i - 15]) + W[i - 16];
-    }
-    for (i = 0; i < 8; ++i) {
-        S[i] = sha_info->digest[i];
-    }
-
-    /* Compress */
-#define RND(a,b,c,d,e,f,g,h,i,ki)                    \
-     t0 = h + Sigma1(e) + Ch(e, f, g) + ki + W[i];   \
-     t1 = Sigma0(a) + Maj(a, b, c);                  \
-     d += t0;                                        \
-     h  = t0 + t1;
-
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],0,0x428a2f98d728ae22ULL);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],1,0x7137449123ef65cdULL);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],2,0xb5c0fbcfec4d3b2fULL);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],3,0xe9b5dba58189dbbcULL);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],4,0x3956c25bf348b538ULL);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],5,0x59f111f1b605d019ULL);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],6,0x923f82a4af194f9bULL);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],7,0xab1c5ed5da6d8118ULL);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],8,0xd807aa98a3030242ULL);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],9,0x12835b0145706fbeULL);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],10,0x243185be4ee4b28cULL);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],11,0x550c7dc3d5ffb4e2ULL);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],12,0x72be5d74f27b896fULL);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],13,0x80deb1fe3b1696b1ULL);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],14,0x9bdc06a725c71235ULL);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],15,0xc19bf174cf692694ULL);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],16,0xe49b69c19ef14ad2ULL);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],17,0xefbe4786384f25e3ULL);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],18,0x0fc19dc68b8cd5b5ULL);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],19,0x240ca1cc77ac9c65ULL);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],20,0x2de92c6f592b0275ULL);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],21,0x4a7484aa6ea6e483ULL);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],22,0x5cb0a9dcbd41fbd4ULL);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],23,0x76f988da831153b5ULL);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],24,0x983e5152ee66dfabULL);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],25,0xa831c66d2db43210ULL);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],26,0xb00327c898fb213fULL);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],27,0xbf597fc7beef0ee4ULL);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],28,0xc6e00bf33da88fc2ULL);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],29,0xd5a79147930aa725ULL);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],30,0x06ca6351e003826fULL);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],31,0x142929670a0e6e70ULL);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],32,0x27b70a8546d22ffcULL);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],33,0x2e1b21385c26c926ULL);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],34,0x4d2c6dfc5ac42aedULL);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],35,0x53380d139d95b3dfULL);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],36,0x650a73548baf63deULL);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],37,0x766a0abb3c77b2a8ULL);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],38,0x81c2c92e47edaee6ULL);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],39,0x92722c851482353bULL);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],40,0xa2bfe8a14cf10364ULL);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],41,0xa81a664bbc423001ULL);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],42,0xc24b8b70d0f89791ULL);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],43,0xc76c51a30654be30ULL);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],44,0xd192e819d6ef5218ULL);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],45,0xd69906245565a910ULL);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],46,0xf40e35855771202aULL);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],47,0x106aa07032bbd1b8ULL);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],48,0x19a4c116b8d2d0c8ULL);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],49,0x1e376c085141ab53ULL);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],50,0x2748774cdf8eeb99ULL);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],51,0x34b0bcb5e19b48a8ULL);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],52,0x391c0cb3c5c95a63ULL);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],53,0x4ed8aa4ae3418acbULL);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],54,0x5b9cca4f7763e373ULL);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],55,0x682e6ff3d6b2b8a3ULL);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],56,0x748f82ee5defb2fcULL);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],57,0x78a5636f43172f60ULL);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],58,0x84c87814a1f0ab72ULL);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],59,0x8cc702081a6439ecULL);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],60,0x90befffa23631e28ULL);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],61,0xa4506cebde82bde9ULL);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],62,0xbef9a3f7b2c67915ULL);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],63,0xc67178f2e372532bULL);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],64,0xca273eceea26619cULL);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],65,0xd186b8c721c0c207ULL);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],66,0xeada7dd6cde0eb1eULL);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],67,0xf57d4f7fee6ed178ULL);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],68,0x06f067aa72176fbaULL);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],69,0x0a637dc5a2c898a6ULL);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],70,0x113f9804bef90daeULL);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],71,0x1b710b35131c471bULL);
-    RND(S[0],S[1],S[2],S[3],S[4],S[5],S[6],S[7],72,0x28db77f523047d84ULL);
-    RND(S[7],S[0],S[1],S[2],S[3],S[4],S[5],S[6],73,0x32caab7b40c72493ULL);
-    RND(S[6],S[7],S[0],S[1],S[2],S[3],S[4],S[5],74,0x3c9ebe0a15c9bebcULL);
-    RND(S[5],S[6],S[7],S[0],S[1],S[2],S[3],S[4],75,0x431d67c49c100d4cULL);
-    RND(S[4],S[5],S[6],S[7],S[0],S[1],S[2],S[3],76,0x4cc5d4becb3e42b6ULL);
-    RND(S[3],S[4],S[5],S[6],S[7],S[0],S[1],S[2],77,0x597f299cfc657e2aULL);
-    RND(S[2],S[3],S[4],S[5],S[6],S[7],S[0],S[1],78,0x5fcb6fab3ad6faecULL);
-    RND(S[1],S[2],S[3],S[4],S[5],S[6],S[7],S[0],79,0x6c44198c4a475817ULL);
-
-#undef RND     
-    
-    /* feedback */
-    for (i = 0; i < 8; i++) {
-        sha_info->digest[i] = sha_info->digest[i] + S[i];
-    }
-
-}
-
-
-
-/* initialize the SHA digest */
-
-static void
-sha512_init(SHAobject *sha_info)
-{
-    TestEndianness(sha_info->Endianness)
-    sha_info->digest[0] = 0x6a09e667f3bcc908ULL;
-    sha_info->digest[1] = 0xbb67ae8584caa73bULL;
-    sha_info->digest[2] = 0x3c6ef372fe94f82bULL;
-    sha_info->digest[3] = 0xa54ff53a5f1d36f1ULL;
-    sha_info->digest[4] = 0x510e527fade682d1ULL;
-    sha_info->digest[5] = 0x9b05688c2b3e6c1fULL;
-    sha_info->digest[6] = 0x1f83d9abfb41bd6bULL;
-    sha_info->digest[7] = 0x5be0cd19137e2179ULL;
-    sha_info->count_lo = 0L;
-    sha_info->count_hi = 0L;
-    sha_info->local = 0;
-    sha_info->digestsize = 64;
-}
-
-static void
-sha384_init(SHAobject *sha_info)
-{
-    TestEndianness(sha_info->Endianness)
-    sha_info->digest[0] = 0xcbbb9d5dc1059ed8ULL;
-    sha_info->digest[1] = 0x629a292a367cd507ULL;
-    sha_info->digest[2] = 0x9159015a3070dd17ULL;
-    sha_info->digest[3] = 0x152fecd8f70e5939ULL;
-    sha_info->digest[4] = 0x67332667ffc00b31ULL;
-    sha_info->digest[5] = 0x8eb44a8768581511ULL;
-    sha_info->digest[6] = 0xdb0c2e0d64f98fa7ULL;
-    sha_info->digest[7] = 0x47b5481dbefa4fa4ULL;
-    sha_info->count_lo = 0L;
-    sha_info->count_hi = 0L;
-    sha_info->local = 0;
-    sha_info->digestsize = 48;
-}
-
-
-/* update the SHA digest */
-
-static void
-sha512_update(SHAobject *sha_info, SHA_BYTE *buffer, int count)
-{
-    int i;
-    SHA_INT32 clo;
-
-    clo = sha_info->count_lo + ((SHA_INT32) count << 3);
-    if (clo < sha_info->count_lo) {
-        ++sha_info->count_hi;
-    }
-    sha_info->count_lo = clo;
-    sha_info->count_hi += (SHA_INT32) count >> 29;
-    if (sha_info->local) {
-        i = SHA_BLOCKSIZE - sha_info->local;
-        if (i > count) {
-            i = count;
-        }
-        memcpy(((SHA_BYTE *) sha_info->data) + sha_info->local, buffer, i);
-        count -= i;
-        buffer += i;
-        sha_info->local += i;
-        if (sha_info->local == SHA_BLOCKSIZE) {
-            sha512_transform(sha_info);
-        }
-        else {
-            return;
-        }
-    }
-    while (count >= SHA_BLOCKSIZE) {
-        memcpy(sha_info->data, buffer, SHA_BLOCKSIZE);
-        buffer += SHA_BLOCKSIZE;
-        count -= SHA_BLOCKSIZE;
-        sha512_transform(sha_info);
-    }
-    memcpy(sha_info->data, buffer, count);
-    sha_info->local = count;
-}
-
-/* finish computing the SHA digest */
-
-static void
-sha512_final(unsigned char digest[SHA_DIGESTSIZE], SHAobject *sha_info)
-{
-    int count;
-    SHA_INT32 lo_bit_count, hi_bit_count;
-
-    lo_bit_count = sha_info->count_lo;
-    hi_bit_count = sha_info->count_hi;
-    count = (int) ((lo_bit_count >> 3) & 0x7f);
-    ((SHA_BYTE *) sha_info->data)[count++] = 0x80;
-    if (count > SHA_BLOCKSIZE - 16) {
-	memset(((SHA_BYTE *) sha_info->data) + count, 0,
-	       SHA_BLOCKSIZE - count);
-	sha512_transform(sha_info);
-	memset((SHA_BYTE *) sha_info->data, 0, SHA_BLOCKSIZE - 16);
-    }
-    else {
-	memset(((SHA_BYTE *) sha_info->data) + count, 0,
-	       SHA_BLOCKSIZE - 16 - count);
-    }
-
-    /* GJS: note that we add the hi/lo in big-endian. sha512_transform will
-       swap these values into host-order. */
-    sha_info->data[112] = 0;
-    sha_info->data[113] = 0;
-    sha_info->data[114] = 0;
-    sha_info->data[115] = 0;
-    sha_info->data[116] = 0;
-    sha_info->data[117] = 0;
-    sha_info->data[118] = 0;
-    sha_info->data[119] = 0;
-    sha_info->data[120] = (hi_bit_count >> 24) & 0xff;
-    sha_info->data[121] = (hi_bit_count >> 16) & 0xff;
-    sha_info->data[122] = (hi_bit_count >>  8) & 0xff;
-    sha_info->data[123] = (hi_bit_count >>  0) & 0xff;
-    sha_info->data[124] = (lo_bit_count >> 24) & 0xff;
-    sha_info->data[125] = (lo_bit_count >> 16) & 0xff;
-    sha_info->data[126] = (lo_bit_count >>  8) & 0xff;
-    sha_info->data[127] = (lo_bit_count >>  0) & 0xff;
-    sha512_transform(sha_info);
-    digest[ 0] = (unsigned char) ((sha_info->digest[0] >> 56) & 0xff);
-    digest[ 1] = (unsigned char) ((sha_info->digest[0] >> 48) & 0xff);
-    digest[ 2] = (unsigned char) ((sha_info->digest[0] >> 40) & 0xff);
-    digest[ 3] = (unsigned char) ((sha_info->digest[0] >> 32) & 0xff);
-    digest[ 4] = (unsigned char) ((sha_info->digest[0] >> 24) & 0xff);
-    digest[ 5] = (unsigned char) ((sha_info->digest[0] >> 16) & 0xff);
-    digest[ 6] = (unsigned char) ((sha_info->digest[0] >>  8) & 0xff);
-    digest[ 7] = (unsigned char) ((sha_info->digest[0]      ) & 0xff);
-    digest[ 8] = (unsigned char) ((sha_info->digest[1] >> 56) & 0xff);
-    digest[ 9] = (unsigned char) ((sha_info->digest[1] >> 48) & 0xff);
-    digest[10] = (unsigned char) ((sha_info->digest[1] >> 40) & 0xff);
-    digest[11] = (unsigned char) ((sha_info->digest[1] >> 32) & 0xff);
-    digest[12] = (unsigned char) ((sha_info->digest[1] >> 24) & 0xff);
-    digest[13] = (unsigned char) ((sha_info->digest[1] >> 16) & 0xff);
-    digest[14] = (unsigned char) ((sha_info->digest[1] >>  8) & 0xff);
-    digest[15] = (unsigned char) ((sha_info->digest[1]      ) & 0xff);
-    digest[16] = (unsigned char) ((sha_info->digest[2] >> 56) & 0xff);
-    digest[17] = (unsigned char) ((sha_info->digest[2] >> 48) & 0xff);
-    digest[18] = (unsigned char) ((sha_info->digest[2] >> 40) & 0xff);
-    digest[19] = (unsigned char) ((sha_info->digest[2] >> 32) & 0xff);
-    digest[20] = (unsigned char) ((sha_info->digest[2] >> 24) & 0xff);
-    digest[21] = (unsigned char) ((sha_info->digest[2] >> 16) & 0xff);
-    digest[22] = (unsigned char) ((sha_info->digest[2] >>  8) & 0xff);
-    digest[23] = (unsigned char) ((sha_info->digest[2]      ) & 0xff);
-    digest[24] = (unsigned char) ((sha_info->digest[3] >> 56) & 0xff);
-    digest[25] = (unsigned char) ((sha_info->digest[3] >> 48) & 0xff);
-    digest[26] = (unsigned char) ((sha_info->digest[3] >> 40) & 0xff);
-    digest[27] = (unsigned char) ((sha_info->digest[3] >> 32) & 0xff);
-    digest[28] = (unsigned char) ((sha_info->digest[3] >> 24) & 0xff);
-    digest[29] = (unsigned char) ((sha_info->digest[3] >> 16) & 0xff);
-    digest[30] = (unsigned char) ((sha_info->digest[3] >>  8) & 0xff);
-    digest[31] = (unsigned char) ((sha_info->digest[3]      ) & 0xff);
-    digest[32] = (unsigned char) ((sha_info->digest[4] >> 56) & 0xff);
-    digest[33] = (unsigned char) ((sha_info->digest[4] >> 48) & 0xff);
-    digest[34] = (unsigned char) ((sha_info->digest[4] >> 40) & 0xff);
-    digest[35] = (unsigned char) ((sha_info->digest[4] >> 32) & 0xff);
-    digest[36] = (unsigned char) ((sha_info->digest[4] >> 24) & 0xff);
-    digest[37] = (unsigned char) ((sha_info->digest[4] >> 16) & 0xff);
-    digest[38] = (unsigned char) ((sha_info->digest[4] >>  8) & 0xff);
-    digest[39] = (unsigned char) ((sha_info->digest[4]      ) & 0xff);
-    digest[40] = (unsigned char) ((sha_info->digest[5] >> 56) & 0xff);
-    digest[41] = (unsigned char) ((sha_info->digest[5] >> 48) & 0xff);
-    digest[42] = (unsigned char) ((sha_info->digest[5] >> 40) & 0xff);
-    digest[43] = (unsigned char) ((sha_info->digest[5] >> 32) & 0xff);
-    digest[44] = (unsigned char) ((sha_info->digest[5] >> 24) & 0xff);
-    digest[45] = (unsigned char) ((sha_info->digest[5] >> 16) & 0xff);
-    digest[46] = (unsigned char) ((sha_info->digest[5] >>  8) & 0xff);
-    digest[47] = (unsigned char) ((sha_info->digest[5]      ) & 0xff);
-    digest[48] = (unsigned char) ((sha_info->digest[6] >> 56) & 0xff);
-    digest[49] = (unsigned char) ((sha_info->digest[6] >> 48) & 0xff);
-    digest[50] = (unsigned char) ((sha_info->digest[6] >> 40) & 0xff);
-    digest[51] = (unsigned char) ((sha_info->digest[6] >> 32) & 0xff);
-    digest[52] = (unsigned char) ((sha_info->digest[6] >> 24) & 0xff);
-    digest[53] = (unsigned char) ((sha_info->digest[6] >> 16) & 0xff);
-    digest[54] = (unsigned char) ((sha_info->digest[6] >>  8) & 0xff);
-    digest[55] = (unsigned char) ((sha_info->digest[6]      ) & 0xff);
-    digest[56] = (unsigned char) ((sha_info->digest[7] >> 56) & 0xff);
-    digest[57] = (unsigned char) ((sha_info->digest[7] >> 48) & 0xff);
-    digest[58] = (unsigned char) ((sha_info->digest[7] >> 40) & 0xff);
-    digest[59] = (unsigned char) ((sha_info->digest[7] >> 32) & 0xff);
-    digest[60] = (unsigned char) ((sha_info->digest[7] >> 24) & 0xff);
-    digest[61] = (unsigned char) ((sha_info->digest[7] >> 16) & 0xff);
-    digest[62] = (unsigned char) ((sha_info->digest[7] >>  8) & 0xff);
-    digest[63] = (unsigned char) ((sha_info->digest[7]      ) & 0xff);
-}
-
-/*
- * End of copied SHA code.
- *
- * ------------------------------------------------------------------------
- */
-
-static PyTypeObject SHA384type;
-static PyTypeObject SHA512type;
-
-
-static SHAobject *
-newSHA384object(void)
-{
-    return (SHAobject *)PyObject_New(SHAobject, &SHA384type);
-}
-
-static SHAobject *
-newSHA512object(void)
-{
-    return (SHAobject *)PyObject_New(SHAobject, &SHA512type);
-}
-
-/* Internal methods for a hash object */
-
-static void
-SHA512_dealloc(PyObject *ptr)
-{
-    PyObject_Del(ptr);
-}
-
-
-/* External methods for a hash object */
-
-PyDoc_STRVAR(SHA512_copy__doc__, "Return a copy of the hash object.");
-
-static PyObject *
-SHA512_copy(SHAobject *self, PyObject *unused)
-{
-    SHAobject *newobj;
-
-    if (((PyObject*)self)->ob_type == &SHA512type) {
-        if ( (newobj = newSHA512object())==NULL)
-            return NULL;
-    } else {
-        if ( (newobj = newSHA384object())==NULL)
-            return NULL;
-    }
-
-    SHAcopy(self, newobj);
-    return (PyObject *)newobj;
-}
-
-PyDoc_STRVAR(SHA512_digest__doc__,
-"Return the digest value as a string of binary data.");
-
-static PyObject *
-SHA512_digest(SHAobject *self, PyObject *unused)
-{
-    unsigned char digest[SHA_DIGESTSIZE];
-    SHAobject temp;
-
-    SHAcopy(self, &temp);
-    sha512_final(digest, &temp);
-    return PyString_FromStringAndSize((const char *)digest, self->digestsize);
-}
-
-PyDoc_STRVAR(SHA512_hexdigest__doc__,
-"Return the digest value as a string of hexadecimal digits.");
-
-static PyObject *
-SHA512_hexdigest(SHAobject *self, PyObject *unused)
-{
-    unsigned char digest[SHA_DIGESTSIZE];
-    SHAobject temp;
-    PyObject *retval;
-    char *hex_digest;
-    int i, j;
-
-    /* Get the raw (binary) digest value */
-    SHAcopy(self, &temp);
-    sha512_final(digest, &temp);
-
-    /* Create a new string */
-    retval = PyString_FromStringAndSize(NULL, self->digestsize * 2);
-    if (!retval)
-	    return NULL;
-    hex_digest = PyString_AsString(retval);
-    if (!hex_digest) {
-	    Py_DECREF(retval);
-	    return NULL;
-    }
-
-    /* Make hex version of the digest */
-    for (i=j=0; i<self->digestsize; i++) {
-        char c;
-        c = (digest[i] >> 4) & 0xf;
-	c = (c>9) ? c+'a'-10 : c + '0';
-        hex_digest[j++] = c;
-        c = (digest[i] & 0xf);
-	c = (c>9) ? c+'a'-10 : c + '0';
-        hex_digest[j++] = c;
-    }
-    return retval;
-}
-
-PyDoc_STRVAR(SHA512_update__doc__,
-"Update this hash object's state with the provided string.");
-
-static PyObject *
-SHA512_update(SHAobject *self, PyObject *args)
-{
-    unsigned char *cp;
-    int len;
-
-    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
-        return NULL;
-
-    sha512_update(self, cp, len);
-
-    Py_INCREF(Py_None);
-    return Py_None;
-}
-
-static PyMethodDef SHA_methods[] = {
-    {"copy",	  (PyCFunction)SHA512_copy,      METH_NOARGS, SHA512_copy__doc__},
-    {"digest",	  (PyCFunction)SHA512_digest,    METH_NOARGS, SHA512_digest__doc__},
-    {"hexdigest", (PyCFunction)SHA512_hexdigest, METH_NOARGS, SHA512_hexdigest__doc__},
-    {"update",	  (PyCFunction)SHA512_update,    METH_VARARGS, SHA512_update__doc__},
-    {NULL,	  NULL}		/* sentinel */
-};
-
-static PyObject *
-SHA512_get_block_size(PyObject *self, void *closure)
-{
-    return PyInt_FromLong(SHA_BLOCKSIZE);
-}
-
-static PyObject *
-SHA512_get_name(PyObject *self, void *closure)
-{
-    if (((SHAobject *)self)->digestsize == 64)
-        return PyString_FromStringAndSize("SHA512", 6);
-    else
-        return PyString_FromStringAndSize("SHA384", 6);
-}
-
-static PyGetSetDef SHA_getseters[] = {
-    {"block_size",
-     (getter)SHA512_get_block_size, NULL,
-     NULL,
-     NULL},
-    {"name",
-     (getter)SHA512_get_name, NULL,
-     NULL,
-     NULL},
-    {NULL}  /* Sentinel */
-};
-
-static PyMemberDef SHA_members[] = {
-    {"digest_size", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
-    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
-     * the old sha module also supported 'digestsize'.  ugh. */
-    {"digestsize", T_INT, offsetof(SHAobject, digestsize), READONLY, NULL},
-    {NULL}  /* Sentinel */
-};
-
-static PyTypeObject SHA384type = {
-    PyObject_HEAD_INIT(NULL)
-    0,			/*ob_size*/
-    "_sha512.sha384",	/*tp_name*/
-    sizeof(SHAobject),	/*tp_size*/
-    0,			/*tp_itemsize*/
-    /* methods */
-    SHA512_dealloc,	/*tp_dealloc*/
-    0,			/*tp_print*/
-    0,          	/*tp_getattr*/
-    0,                  /*tp_setattr*/
-    0,                  /*tp_compare*/
-    0,                  /*tp_repr*/
-    0,                  /*tp_as_number*/
-    0,                  /*tp_as_sequence*/
-    0,                  /*tp_as_mapping*/
-    0,                  /*tp_hash*/
-    0,                  /*tp_call*/
-    0,                  /*tp_str*/
-    0,                  /*tp_getattro*/
-    0,                  /*tp_setattro*/
-    0,                  /*tp_as_buffer*/
-    Py_TPFLAGS_DEFAULT, /*tp_flags*/
-    0,                  /*tp_doc*/
-    0,                  /*tp_traverse*/
-    0,			/*tp_clear*/
-    0,			/*tp_richcompare*/
-    0,			/*tp_weaklistoffset*/
-    0,			/*tp_iter*/
-    0,			/*tp_iternext*/
-    SHA_methods,	/* tp_methods */
-    SHA_members,	/* tp_members */
-    SHA_getseters,      /* tp_getset */
-};
-
-static PyTypeObject SHA512type = {
-    PyObject_HEAD_INIT(NULL)
-    0,			/*ob_size*/
-    "_sha512.sha512",	/*tp_name*/
-    sizeof(SHAobject),	/*tp_size*/
-    0,			/*tp_itemsize*/
-    /* methods */
-    SHA512_dealloc,	/*tp_dealloc*/
-    0,			/*tp_print*/
-    0,          	/*tp_getattr*/
-    0,                  /*tp_setattr*/
-    0,                  /*tp_compare*/
-    0,                  /*tp_repr*/
-    0,                  /*tp_as_number*/
-    0,                  /*tp_as_sequence*/
-    0,                  /*tp_as_mapping*/
-    0,                  /*tp_hash*/
-    0,                  /*tp_call*/
-    0,                  /*tp_str*/
-    0,                  /*tp_getattro*/
-    0,                  /*tp_setattro*/
-    0,                  /*tp_as_buffer*/
-    Py_TPFLAGS_DEFAULT, /*tp_flags*/
-    0,                  /*tp_doc*/
-    0,                  /*tp_traverse*/
-    0,			/*tp_clear*/
-    0,			/*tp_richcompare*/
-    0,			/*tp_weaklistoffset*/
-    0,			/*tp_iter*/
-    0,			/*tp_iternext*/
-    SHA_methods,	/* tp_methods */
-    SHA_members,	/* tp_members */
-    SHA_getseters,      /* tp_getset */
-};
-
-
-/* The single module-level function: new() */
-
-PyDoc_STRVAR(SHA512_new__doc__,
-"Return a new SHA-512 hash object; optionally initialized with a string.");
-
-static PyObject *
-SHA512_new(PyObject *self, PyObject *args, PyObject *kwdict)
-{
-    static char *kwlist[] = {"string", NULL};
-    SHAobject *new;
-    unsigned char *cp = NULL;
-    int len;
-
-    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
-                                     &cp, &len)) {
-        return NULL;
-    }
-
-    if ((new = newSHA512object()) == NULL)
-        return NULL;
-
-    sha512_init(new);
-
-    if (PyErr_Occurred()) {
-        Py_DECREF(new);
-        return NULL;
-    }
-    if (cp)
-        sha512_update(new, cp, len);
-
-    return (PyObject *)new;
-}
-
-PyDoc_STRVAR(SHA384_new__doc__,
-"Return a new SHA-384 hash object; optionally initialized with a string.");
-
-static PyObject *
-SHA384_new(PyObject *self, PyObject *args, PyObject *kwdict)
-{
-    static char *kwlist[] = {"string", NULL};
-    SHAobject *new;
-    unsigned char *cp = NULL;
-    int len;
-
-    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
-                                     &cp, &len)) {
-        return NULL;
-    }
-
-    if ((new = newSHA384object()) == NULL)
-        return NULL;
-
-    sha384_init(new);
-
-    if (PyErr_Occurred()) {
-        Py_DECREF(new);
-        return NULL;
-    }
-    if (cp)
-        sha512_update(new, cp, len);
-
-    return (PyObject *)new;
-}
-
-
-/* List of functions exported by this module */
-
-static struct PyMethodDef SHA_functions[] = {
-    {"sha512", (PyCFunction)SHA512_new, METH_VARARGS|METH_KEYWORDS, SHA512_new__doc__},
-    {"sha384", (PyCFunction)SHA384_new, METH_VARARGS|METH_KEYWORDS, SHA384_new__doc__},
-    {NULL,	NULL}		 /* Sentinel */
-};
-
-
-/* Initialize this module. */
-
-#define insint(n,v) { PyModule_AddIntConstant(m,n,v); }
-
-PyMODINIT_FUNC
-init_sha512(void)
-{
-    PyObject *m;
-
-    SHA384type.ob_type = &PyType_Type;
-    if (PyType_Ready(&SHA384type) < 0)
-        return;
-    SHA512type.ob_type = &PyType_Type;
-    if (PyType_Ready(&SHA512type) < 0)
-        return;
-    m = Py_InitModule("_sha512", SHA_functions);
-    if (m == NULL)
-	return;
-}
-
-#endif
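
[Note on the deletion above: these hand-written C hash modules were backports for old Python 2 releases; on Python 3 the stdlib hashlib module, available since Python 2.5, provides the same sha384/sha512 constructors, so the python3 branch can drop them outright. A minimal sanity-check sketch of the stdlib equivalents, using only documented hashlib names:]

```python
import hashlib

# hashlib supplies the sha512/sha384 constructors that the deleted
# _backport C modules implemented by hand.
h = hashlib.sha512(b"abc")

# 64-byte digest and 128-byte block, matching sha_info->digestsize = 64
# and the SHA-512 block size in the deleted sha512module.c.
print(h.digest_size)   # 64
print(h.block_size)    # 128
print(h.hexdigest())

# sha384 shares the block size but truncates the digest to 48 bytes,
# matching sha384_init's digestsize = 48 above.
print(hashlib.sha384(b"abc").digest_size)  # 48
```

The copy/update/digest/hexdigest methods of these stdlib objects mirror the SHA_methods table being deleted, which is what makes the removal transparent to callers.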
diff --git a/distutils2/_backport/shamodule.c b/distutils2/_backport/shamodule.c
deleted file mode 100644
--- a/distutils2/_backport/shamodule.c
+++ /dev/null
@@ -1,593 +0,0 @@
-/* SHA module */
-
-/* This module provides an interface to NIST's Secure Hash Algorithm */
-
-/* See below for information about the original code this module was
-   based upon. Additional work performed by:
-
-   Andrew Kuchling (amk at amk.ca)
-   Greg Stein (gstein at lyra.org)
-
-   Copyright (C) 2005   Gregory P. Smith (greg at krypto.org)
-   Licensed to PSF under a Contributor Agreement.
-
-*/
-
-/* SHA objects */
-
-#include "Python.h"
-#include "structmember.h"
-
-
-/* Endianness testing and definitions */
-#define TestEndianness(variable) {int i=1; variable=PCT_BIG_ENDIAN;\
-	if (*((char*)&i)==1) variable=PCT_LITTLE_ENDIAN;}
-
-#define PCT_LITTLE_ENDIAN 1
-#define PCT_BIG_ENDIAN 0
-
-/* Some useful types */
-
-typedef unsigned char SHA_BYTE;
-
-#if SIZEOF_INT == 4
-typedef unsigned int SHA_INT32;	/* 32-bit integer */
-#else
-/* not defined. compilation will die. */
-#endif
-
-/* The SHA block size and message digest sizes, in bytes */
-
-#define SHA_BLOCKSIZE    64
-#define SHA_DIGESTSIZE  20
-
-/* The structure for storing SHS info */
-
-typedef struct {
-    PyObject_HEAD
-    SHA_INT32 digest[5];		/* Message digest */
-    SHA_INT32 count_lo, count_hi;	/* 64-bit bit count */
-    SHA_BYTE data[SHA_BLOCKSIZE];	/* SHA data buffer */
-    int Endianness;
-    int local;				/* unprocessed amount in data */
-} SHAobject;
-
-/* When run on a little-endian CPU we need to perform byte reversal on an
-   array of longwords. */
-
-static void longReverse(SHA_INT32 *buffer, int byteCount, int Endianness)
-{
-    SHA_INT32 value;
-
-    if ( Endianness == PCT_BIG_ENDIAN )
-	return;
-
-    byteCount /= sizeof(*buffer);
-    while (byteCount--) {
-        value = *buffer;
-        value = ( ( value & 0xFF00FF00L ) >> 8  ) | \
-                ( ( value & 0x00FF00FFL ) << 8 );
-        *buffer++ = ( value << 16 ) | ( value >> 16 );
-    }
-}
-
-static void SHAcopy(SHAobject *src, SHAobject *dest)
-{
-    dest->Endianness = src->Endianness;
-    dest->local = src->local;
-    dest->count_lo = src->count_lo;
-    dest->count_hi = src->count_hi;
-    memcpy(dest->digest, src->digest, sizeof(src->digest));
-    memcpy(dest->data, src->data, sizeof(src->data));
-}
-
-
-/* ------------------------------------------------------------------------
- *
- * This code for the SHA algorithm was noted as public domain. The original
- * headers are pasted below.
- *
- * Several changes have been made to make it more compatible with the
- * Python environment and desired interface.
- *
- */
-
-/* NIST Secure Hash Algorithm */
-/* heavily modified by Uwe Hollerbach <uh at alumni.caltech edu> */
-/* from Peter C. Gutmann's implementation as found in */
-/* Applied Cryptography by Bruce Schneier */
-/* Further modifications to include the "UNRAVEL" stuff, below */
-
-/* This code is in the public domain */
-
-/* UNRAVEL should be fastest & biggest */
-/* UNROLL_LOOPS should be just as big, but slightly slower */
-/* both undefined should be smallest and slowest */
-
-#define UNRAVEL
-/* #define UNROLL_LOOPS */
-
-/* The SHA f()-functions.  The f1 and f3 functions can be optimized to
-   save one boolean operation each - thanks to Rich Schroeppel,
-   rcs at cs.arizona.edu for discovering this */
-
-/*#define f1(x,y,z)	((x & y) | (~x & z))		// Rounds  0-19 */
-#define f1(x,y,z)	(z ^ (x & (y ^ z)))		/* Rounds  0-19 */
-#define f2(x,y,z)	(x ^ y ^ z)			/* Rounds 20-39 */
-/*#define f3(x,y,z)	((x & y) | (x & z) | (y & z))	// Rounds 40-59 */
-#define f3(x,y,z)	((x & y) | (z & (x | y)))	/* Rounds 40-59 */
-#define f4(x,y,z)	(x ^ y ^ z)			/* Rounds 60-79 */
-
-/* SHA constants */
-
-#define CONST1		0x5a827999L			/* Rounds  0-19 */
-#define CONST2		0x6ed9eba1L			/* Rounds 20-39 */
-#define CONST3		0x8f1bbcdcL			/* Rounds 40-59 */
-#define CONST4		0xca62c1d6L			/* Rounds 60-79 */
-
-/* 32-bit rotate */
-
-#define R32(x,n)	((x << n) | (x >> (32 - n)))
-
-/* the generic case, for when the overall rotation is not unraveled */
-
-#define FG(n)	\
-    T = R32(A,5) + f##n(B,C,D) + E + *WP++ + CONST##n;	\
-    E = D; D = C; C = R32(B,30); B = A; A = T
-
-/* specific cases, for when the overall rotation is unraveled */
-
-#define FA(n)	\
-    T = R32(A,5) + f##n(B,C,D) + E + *WP++ + CONST##n; B = R32(B,30)
-
-#define FB(n)	\
-    E = R32(T,5) + f##n(A,B,C) + D + *WP++ + CONST##n; A = R32(A,30)
-
-#define FC(n)	\
-    D = R32(E,5) + f##n(T,A,B) + C + *WP++ + CONST##n; T = R32(T,30)
-
-#define FD(n)	\
-    C = R32(D,5) + f##n(E,T,A) + B + *WP++ + CONST##n; E = R32(E,30)
-
-#define FE(n)	\
-    B = R32(C,5) + f##n(D,E,T) + A + *WP++ + CONST##n; D = R32(D,30)
-
-#define FT(n)	\
-    A = R32(B,5) + f##n(C,D,E) + T + *WP++ + CONST##n; C = R32(C,30)
-
-/* do SHA transformation */
-
-static void
-sha_transform(SHAobject *sha_info)
-{
-    int i;
-    SHA_INT32 T, A, B, C, D, E, W[80], *WP;
-
-    memcpy(W, sha_info->data, sizeof(sha_info->data));
-    longReverse(W, (int)sizeof(sha_info->data), sha_info->Endianness);
-
-    for (i = 16; i < 80; ++i) {
-	W[i] = W[i-3] ^ W[i-8] ^ W[i-14] ^ W[i-16];
-
-	/* extra rotation fix */
-	W[i] = R32(W[i], 1);
-    }
-    A = sha_info->digest[0];
-    B = sha_info->digest[1];
-    C = sha_info->digest[2];
-    D = sha_info->digest[3];
-    E = sha_info->digest[4];
-    WP = W;
-#ifdef UNRAVEL
-    FA(1); FB(1); FC(1); FD(1); FE(1); FT(1); FA(1); FB(1); FC(1); FD(1);
-    FE(1); FT(1); FA(1); FB(1); FC(1); FD(1); FE(1); FT(1); FA(1); FB(1);
-    FC(2); FD(2); FE(2); FT(2); FA(2); FB(2); FC(2); FD(2); FE(2); FT(2);
-    FA(2); FB(2); FC(2); FD(2); FE(2); FT(2); FA(2); FB(2); FC(2); FD(2);
-    FE(3); FT(3); FA(3); FB(3); FC(3); FD(3); FE(3); FT(3); FA(3); FB(3);
-    FC(3); FD(3); FE(3); FT(3); FA(3); FB(3); FC(3); FD(3); FE(3); FT(3);
-    FA(4); FB(4); FC(4); FD(4); FE(4); FT(4); FA(4); FB(4); FC(4); FD(4);
-    FE(4); FT(4); FA(4); FB(4); FC(4); FD(4); FE(4); FT(4); FA(4); FB(4);
-    sha_info->digest[0] += E;
-    sha_info->digest[1] += T;
-    sha_info->digest[2] += A;
-    sha_info->digest[3] += B;
-    sha_info->digest[4] += C;
-#else /* !UNRAVEL */
-#ifdef UNROLL_LOOPS
-    FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1);
-    FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1); FG(1);
-    FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2);
-    FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2); FG(2);
-    FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3);
-    FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3); FG(3);
-    FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4);
-    FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4); FG(4);
-#else /* !UNROLL_LOOPS */
-    for (i =  0; i < 20; ++i) { FG(1); }
-    for (i = 20; i < 40; ++i) { FG(2); }
-    for (i = 40; i < 60; ++i) { FG(3); }
-    for (i = 60; i < 80; ++i) { FG(4); }
-#endif /* !UNROLL_LOOPS */
-    sha_info->digest[0] += A;
-    sha_info->digest[1] += B;
-    sha_info->digest[2] += C;
-    sha_info->digest[3] += D;
-    sha_info->digest[4] += E;
-#endif /* !UNRAVEL */
-}
-
-/* initialize the SHA digest */
-
-static void
-sha_init(SHAobject *sha_info)
-{
-    TestEndianness(sha_info->Endianness)
-
-    sha_info->digest[0] = 0x67452301L;
-    sha_info->digest[1] = 0xefcdab89L;
-    sha_info->digest[2] = 0x98badcfeL;
-    sha_info->digest[3] = 0x10325476L;
-    sha_info->digest[4] = 0xc3d2e1f0L;
-    sha_info->count_lo = 0L;
-    sha_info->count_hi = 0L;
-    sha_info->local = 0;
-}
-
-/* update the SHA digest */
-
-static void
-sha_update(SHAobject *sha_info, SHA_BYTE *buffer, int count)
-{
-    int i;
-    SHA_INT32 clo;
-
-    clo = sha_info->count_lo + ((SHA_INT32) count << 3);
-    if (clo < sha_info->count_lo) {
-        ++sha_info->count_hi;
-    }
-    sha_info->count_lo = clo;
-    sha_info->count_hi += (SHA_INT32) count >> 29;
-    if (sha_info->local) {
-        i = SHA_BLOCKSIZE - sha_info->local;
-        if (i > count) {
-            i = count;
-        }
-        memcpy(((SHA_BYTE *) sha_info->data) + sha_info->local, buffer, i);
-        count -= i;
-        buffer += i;
-        sha_info->local += i;
-        if (sha_info->local == SHA_BLOCKSIZE) {
-            sha_transform(sha_info);
-        }
-        else {
-            return;
-        }
-    }
-    while (count >= SHA_BLOCKSIZE) {
-        memcpy(sha_info->data, buffer, SHA_BLOCKSIZE);
-        buffer += SHA_BLOCKSIZE;
-        count -= SHA_BLOCKSIZE;
-        sha_transform(sha_info);
-    }
-    memcpy(sha_info->data, buffer, count);
-    sha_info->local = count;
-}
-
-/* finish computing the SHA digest */
-
-static void
-sha_final(unsigned char digest[20], SHAobject *sha_info)
-{
-    int count;
-    SHA_INT32 lo_bit_count, hi_bit_count;
-
-    lo_bit_count = sha_info->count_lo;
-    hi_bit_count = sha_info->count_hi;
-    count = (int) ((lo_bit_count >> 3) & 0x3f);
-    ((SHA_BYTE *) sha_info->data)[count++] = 0x80;
-    if (count > SHA_BLOCKSIZE - 8) {
-	memset(((SHA_BYTE *) sha_info->data) + count, 0,
-	       SHA_BLOCKSIZE - count);
-	sha_transform(sha_info);
-	memset((SHA_BYTE *) sha_info->data, 0, SHA_BLOCKSIZE - 8);
-    }
-    else {
-	memset(((SHA_BYTE *) sha_info->data) + count, 0,
-	       SHA_BLOCKSIZE - 8 - count);
-    }
-
-    /* GJS: note that we add the hi/lo in big-endian. sha_transform will
-       swap these values into host-order. */
-    sha_info->data[56] = (hi_bit_count >> 24) & 0xff;
-    sha_info->data[57] = (hi_bit_count >> 16) & 0xff;
-    sha_info->data[58] = (hi_bit_count >>  8) & 0xff;
-    sha_info->data[59] = (hi_bit_count >>  0) & 0xff;
-    sha_info->data[60] = (lo_bit_count >> 24) & 0xff;
-    sha_info->data[61] = (lo_bit_count >> 16) & 0xff;
-    sha_info->data[62] = (lo_bit_count >>  8) & 0xff;
-    sha_info->data[63] = (lo_bit_count >>  0) & 0xff;
-    sha_transform(sha_info);
-    digest[ 0] = (unsigned char) ((sha_info->digest[0] >> 24) & 0xff);
-    digest[ 1] = (unsigned char) ((sha_info->digest[0] >> 16) & 0xff);
-    digest[ 2] = (unsigned char) ((sha_info->digest[0] >>  8) & 0xff);
-    digest[ 3] = (unsigned char) ((sha_info->digest[0]      ) & 0xff);
-    digest[ 4] = (unsigned char) ((sha_info->digest[1] >> 24) & 0xff);
-    digest[ 5] = (unsigned char) ((sha_info->digest[1] >> 16) & 0xff);
-    digest[ 6] = (unsigned char) ((sha_info->digest[1] >>  8) & 0xff);
-    digest[ 7] = (unsigned char) ((sha_info->digest[1]      ) & 0xff);
-    digest[ 8] = (unsigned char) ((sha_info->digest[2] >> 24) & 0xff);
-    digest[ 9] = (unsigned char) ((sha_info->digest[2] >> 16) & 0xff);
-    digest[10] = (unsigned char) ((sha_info->digest[2] >>  8) & 0xff);
-    digest[11] = (unsigned char) ((sha_info->digest[2]      ) & 0xff);
-    digest[12] = (unsigned char) ((sha_info->digest[3] >> 24) & 0xff);
-    digest[13] = (unsigned char) ((sha_info->digest[3] >> 16) & 0xff);
-    digest[14] = (unsigned char) ((sha_info->digest[3] >>  8) & 0xff);
-    digest[15] = (unsigned char) ((sha_info->digest[3]      ) & 0xff);
-    digest[16] = (unsigned char) ((sha_info->digest[4] >> 24) & 0xff);
-    digest[17] = (unsigned char) ((sha_info->digest[4] >> 16) & 0xff);
-    digest[18] = (unsigned char) ((sha_info->digest[4] >>  8) & 0xff);
-    digest[19] = (unsigned char) ((sha_info->digest[4]      ) & 0xff);
-}
-
-/*
- * End of copied SHA code.
- *
- * ------------------------------------------------------------------------
- */
-
-static PyTypeObject SHAtype;
-
-
-static SHAobject *
-newSHAobject(void)
-{
-    return (SHAobject *)PyObject_New(SHAobject, &SHAtype);
-}
-
-/* Internal methods for a hashing object */
-
-static void
-SHA_dealloc(PyObject *ptr)
-{
-    PyObject_Del(ptr);
-}
-
-
-/* External methods for a hashing object */
-
-PyDoc_STRVAR(SHA_copy__doc__, "Return a copy of the hashing object.");
-
-static PyObject *
-SHA_copy(SHAobject *self, PyObject *unused)
-{
-    SHAobject *newobj;
-
-    if ( (newobj = newSHAobject())==NULL)
-        return NULL;
-
-    SHAcopy(self, newobj);
-    return (PyObject *)newobj;
-}
-
-PyDoc_STRVAR(SHA_digest__doc__,
-"Return the digest value as a string of binary data.");
-
-static PyObject *
-SHA_digest(SHAobject *self, PyObject *unused)
-{
-    unsigned char digest[SHA_DIGESTSIZE];
-    SHAobject temp;
-
-    SHAcopy(self, &temp);
-    sha_final(digest, &temp);
-    return PyString_FromStringAndSize((const char *)digest, sizeof(digest));
-}
-
-PyDoc_STRVAR(SHA_hexdigest__doc__,
-"Return the digest value as a string of hexadecimal digits.");
-
-static PyObject *
-SHA_hexdigest(SHAobject *self, PyObject *unused)
-{
-    unsigned char digest[SHA_DIGESTSIZE];
-    SHAobject temp;
-    PyObject *retval;
-    char *hex_digest;
-    int i, j;
-
-    /* Get the raw (binary) digest value */
-    SHAcopy(self, &temp);
-    sha_final(digest, &temp);
-
-    /* Create a new string */
-    retval = PyString_FromStringAndSize(NULL, sizeof(digest) * 2);
-    if (!retval)
-	    return NULL;
-    hex_digest = PyString_AsString(retval);
-    if (!hex_digest) {
-	    Py_DECREF(retval);
-	    return NULL;
-    }
-
-    /* Make hex version of the digest */
-    for(i=j=0; i<sizeof(digest); i++) {
-        char c;
-        c = (digest[i] >> 4) & 0xf;
-	c = (c>9) ? c+'a'-10 : c + '0';
-        hex_digest[j++] = c;
-        c = (digest[i] & 0xf);
-	c = (c>9) ? c+'a'-10 : c + '0';
-        hex_digest[j++] = c;
-    }
-    return retval;
-}
-
-PyDoc_STRVAR(SHA_update__doc__,
-"Update this hashing object's state with the provided string.");
-
-static PyObject *
-SHA_update(SHAobject *self, PyObject *args)
-{
-    unsigned char *cp;
-    int len;
-
-    if (!PyArg_ParseTuple(args, "s#:update", &cp, &len))
-        return NULL;
-
-    sha_update(self, cp, len);
-
-    Py_INCREF(Py_None);
-    return Py_None;
-}
-
-static PyMethodDef SHA_methods[] = {
-    {"copy",	  (PyCFunction)SHA_copy,      METH_NOARGS,  SHA_copy__doc__},
-    {"digest",	  (PyCFunction)SHA_digest,    METH_NOARGS,  SHA_digest__doc__},
-    {"hexdigest", (PyCFunction)SHA_hexdigest, METH_NOARGS,  SHA_hexdigest__doc__},
-    {"update",	  (PyCFunction)SHA_update,    METH_VARARGS, SHA_update__doc__},
-    {NULL,	  NULL}		/* sentinel */
-};
-
-static PyObject *
-SHA_get_block_size(PyObject *self, void *closure)
-{
-    return PyInt_FromLong(SHA_BLOCKSIZE);
-}
-
-static PyObject *
-SHA_get_digest_size(PyObject *self, void *closure)
-{
-    return PyInt_FromLong(SHA_DIGESTSIZE);
-}
-
-static PyObject *
-SHA_get_name(PyObject *self, void *closure)
-{
-    return PyString_FromStringAndSize("SHA1", 4);
-}
-
-static PyGetSetDef SHA_getseters[] = {
-    {"digest_size",
-     (getter)SHA_get_digest_size, NULL,
-     NULL,
-     NULL},
-    {"block_size",
-     (getter)SHA_get_block_size, NULL,
-     NULL,
-     NULL},
-    {"name",
-     (getter)SHA_get_name, NULL,
-     NULL,
-     NULL},
-    /* the old md5 and sha modules support 'digest_size' as in PEP 247.
-     * the old sha module also supported 'digestsize'.  ugh. */
-    {"digestsize",
-     (getter)SHA_get_digest_size, NULL,
-     NULL,
-     NULL},
-    {NULL}  /* Sentinel */
-};
-
-static PyTypeObject SHAtype = {
-    PyObject_HEAD_INIT(NULL)
-    0,			/*ob_size*/
-    "_sha.sha",		/*tp_name*/
-    sizeof(SHAobject),	/*tp_size*/
-    0,			/*tp_itemsize*/
-    /* methods */
-    SHA_dealloc,	/*tp_dealloc*/
-    0,			/*tp_print*/
-    0,                  /*tp_getattr*/
-    0,                  /*tp_setattr*/
-    0,                  /*tp_compare*/
-    0,                  /*tp_repr*/
-    0,                  /*tp_as_number*/
-    0,                  /*tp_as_sequence*/
-    0,                  /*tp_as_mapping*/
-    0,                  /*tp_hash*/
-    0,                  /*tp_call*/
-    0,                  /*tp_str*/
-    0,                  /*tp_getattro*/
-    0,                  /*tp_setattro*/
-    0,                  /*tp_as_buffer*/
-    Py_TPFLAGS_DEFAULT, /*tp_flags*/
-    0,                  /*tp_doc*/
-    0,                  /*tp_traverse*/
-    0,			/*tp_clear*/
-    0,			/*tp_richcompare*/
-    0,			/*tp_weaklistoffset*/
-    0,			/*tp_iter*/
-    0,			/*tp_iternext*/
-    SHA_methods,	/* tp_methods */
-    0,                  /* tp_members */
-    SHA_getseters,      /* tp_getset */
-};
-
-
-/* The single module-level function: new() */
-
-PyDoc_STRVAR(SHA_new__doc__,
-"Return a new SHA hashing object.  An optional string argument\n\
-may be provided; if present, this string will be automatically\n\
-hashed.");
-
-static PyObject *
-SHA_new(PyObject *self, PyObject *args, PyObject *kwdict)
-{
-    static char *kwlist[] = {"string", NULL};
-    SHAobject *new;
-    unsigned char *cp = NULL;
-    int len;
-
-    if (!PyArg_ParseTupleAndKeywords(args, kwdict, "|s#:new", kwlist,
-                                     &cp, &len)) {
-        return NULL;
-    }
-
-    if ((new = newSHAobject()) == NULL)
-        return NULL;
-
-    sha_init(new);
-
-    if (PyErr_Occurred()) {
-        Py_DECREF(new);
-        return NULL;
-    }
-    if (cp)
-        sha_update(new, cp, len);
-
-    return (PyObject *)new;
-}
-
-
-/* List of functions exported by this module */
-
-static struct PyMethodDef SHA_functions[] = {
-    {"new", (PyCFunction)SHA_new, METH_VARARGS|METH_KEYWORDS, SHA_new__doc__},
-    {NULL,	NULL}		 /* Sentinel */
-};
-
-
-/* Initialize this module. */
-
-#define insint(n,v) { PyModule_AddIntConstant(m,n,v); }
-
-PyMODINIT_FUNC
-init_sha(void)
-{
-    PyObject *m;
-
-    SHAtype.ob_type = &PyType_Type;
-    if (PyType_Ready(&SHAtype) < 0)
-        return;
-    m = Py_InitModule("_sha", SHA_functions);
-    if (m == NULL)
-	return;
-
-    /* Add some symbolic constants to the module */
-    insint("blocksize", 1);  /* For future use, in case some hash
-                                functions require an integral number of
-                                blocks */ 
-    insint("digestsize", 20);
-    insint("digest_size", 20);
-}
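The removals above drop the `_backport` hash sources (md5.c, shamodule.c, sha256module.c, …) plus the Windows build glue (PC/build_ssl.py) that distutils2 carried for Python 2.4, which predates hashlib. On Python 3 the stdlib always provides hashlib, so the bundled C modules are unnecessary there. A quick sketch of the replacement (not part of the patch):

```python
import hashlib

# hashlib is always available on Python 3, so the bundled C hash
# modules removed in this changeset are not needed on that branch.
digest = hashlib.sha1(b"data").hexdigest()
# SHA-1 hex digests are 40 characters long
```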
diff --git a/distutils2/_backport/shutil.py b/distutils2/_backport/shutil.py
--- a/distutils2/_backport/shutil.py
+++ b/distutils2/_backport/shutil.py
@@ -9,6 +9,7 @@
 import stat
 from os.path import abspath
 import fnmatch
+import collections
 import errno
 
 try:
@@ -93,15 +94,9 @@
             if stat.S_ISFIFO(st.st_mode):
                 raise SpecialFileError("`%s` is a named pipe" % fn)
 
-    fsrc = open(src, 'rb')
-    try:
-        fdst = open(dst, 'wb')
-        try:
+    with open(src, 'rb') as fsrc:
+        with open(dst, 'wb') as fdst:
             copyfileobj(fsrc, fdst)
-        finally:
-            fdst.close()
-    finally:
-        fsrc.close()
 
 def copymode(src, dst):
     """Copy mode bits from src to dst"""
@@ -121,7 +116,7 @@
     if hasattr(os, 'chflags') and hasattr(st, 'st_flags'):
         try:
             os.chflags(dst, st.st_flags)
-        except OSError, why:
+        except OSError as why:
             if (not hasattr(errno, 'EOPNOTSUPP') or
                 why.errno != errno.EOPNOTSUPP):
                 raise
@@ -204,7 +199,7 @@
 
     try:
         os.makedirs(dst)
-    except OSError, e:
+    except OSError as e:
         if e.errno != errno.EEXIST:
             raise
 
@@ -232,13 +227,13 @@
                 copy_function(srcname, dstname)
         # catch the Error from the recursive copytree so that we can
         # continue with other files
-        except Error, err:
+        except Error as err:
             errors.extend(err.args[0])
-        except EnvironmentError, why:
+        except EnvironmentError as why:
             errors.append((srcname, dstname, str(why)))
     try:
         copystat(src, dst)
-    except OSError, why:
+    except OSError as why:
         if WindowsError is not None and isinstance(why, WindowsError):
             # Copying file access times may fail on Windows
             pass
@@ -400,7 +395,7 @@
     # flags for compression program, each element of list will be an argument
     if compress is not None and compress not in compress_ext:
         raise ValueError("bad value for 'compress', or compression format not "
-                         "supported: %s" % compress)
+                         "supported : {0}".format(compress))
 
     archive_name = base_name + '.tar' + compress_ext.get(compress, '')
     archive_dir = os.path.dirname(archive_name)
@@ -521,7 +516,7 @@
     Each element of the returned sequence is a tuple (name, description)
     """
     formats = [(name, registry[2]) for name, registry in
-               _ARCHIVE_FORMATS.iteritems()]
+               _ARCHIVE_FORMATS.items()]
     formats.sort()
     return formats
 
@@ -536,7 +531,7 @@
     """
     if extra_args is None:
         extra_args = []
-    if not callable(function):
+    if not isinstance(function, collections.Callable):
         raise TypeError('The %s object is not callable' % function)
     if not isinstance(extra_args, (tuple, list)):
         raise TypeError('extra_args needs to be a sequence')
@@ -611,7 +606,7 @@
     (name, extensions, description)
     """
     formats = [(name, info[0], info[3]) for name, info in
-               _UNPACK_FORMATS.iteritems()]
+               _UNPACK_FORMATS.items()]
     formats.sort()
     return formats
 
@@ -619,7 +614,7 @@
     """Checks what gets registered as an unpacker."""
     # first make sure no other unpacker is registered for this extension
     existing_extensions = {}
-    for name, info in _UNPACK_FORMATS.iteritems():
+    for name, info in _UNPACK_FORMATS.items():
         for ext in info[0]:
             existing_extensions[ext] = name
 
@@ -629,7 +624,7 @@
             raise RegistryError(msg % (extension,
                                        existing_extensions[extension]))
 
-    if not callable(function):
+    if not isinstance(function, collections.Callable):
         raise TypeError('The registered function must be a callable')
 
 
@@ -728,7 +723,7 @@
                                 "bzip2'ed tar-file")
 
 def _find_unpack_format(filename):
-    for name, info in _UNPACK_FORMATS.iteritems():
+    for name, info in _UNPACK_FORMATS.items():
         for extension in info[0]:
             if filename.endswith(extension):
                 return name
@@ -756,7 +751,7 @@
         try:
             format_info = _UNPACK_FORMATS[format]
         except KeyError:
-            raise ValueError("Unknown unpack format '%s'" % format)
+            raise ValueError("Unknown unpack format '{0}'".format(format))
 
         func = format_info[1]
         func(filename, extract_dir, **dict(format_info[2]))
@@ -764,7 +759,7 @@
         # we need to look at the registered unpackers supported extensions
         format = _find_unpack_format(filename)
         if format is None:
-            raise ReadError("Unknown archive format '%s'" % filename)
+            raise ReadError("Unknown archive format '{0}'".format(filename))
 
         func = _UNPACK_FORMATS[format][1]
         kwargs = dict(_UNPACK_FORMATS[format][2])
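The shutil hunks above are mostly mechanical 2to3 idioms: `except E, e` becomes `except E as e`, nested try/finally pairs fold into `with`, `iteritems()` becomes `items()`, and the `callable()` builtin, removed in Python 3.0 and only reinstated in 3.2, is swapped for an isinstance check. A minimal sketch of that last idiom, using `collections.abc.Callable` (its home since 3.3; the patch itself targets 3.1 and spells it `collections.Callable`) with a hypothetical helper name:

```python
from collections.abc import Callable

def check_unpack_function(function):
    # Hypothetical helper mirroring the check in register_unpack_format:
    # reject anything that is not callable without using callable(),
    # which did not exist on Python 3.0/3.1.
    if not isinstance(function, Callable):
        raise TypeError('The registered function must be a callable')
    return function
```

On 3.2+ a plain `callable(function)` works again, so this check could be simplified once support for 3.1 is dropped.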
diff --git a/distutils2/_backport/sysconfig.cfg b/distutils2/_backport/sysconfig.cfg
--- a/distutils2/_backport/sysconfig.cfg
+++ b/distutils2/_backport/sysconfig.cfg
@@ -40,8 +40,8 @@
 platstdlib = {platbase}/lib/python{py_version_short}
 purelib = {base}/lib/python{py_version_short}/site-packages
 platlib = {platbase}/lib/python{py_version_short}/site-packages
-include = {base}/include/python{py_version_short}
-platinclude = {platbase}/include/python{py_version_short}
+include = {base}/include/python{py_version_short}{abiflags}
+platinclude = {platbase}/include/python{py_version_short}{abiflags}
 data = {base}
 
 [posix_home]
diff --git a/distutils2/_backport/sysconfig.py b/distutils2/_backport/sysconfig.py
--- a/distutils2/_backport/sysconfig.py
+++ b/distutils2/_backport/sysconfig.py
@@ -4,7 +4,7 @@
 import re
 import sys
 from os.path import pardir, realpath
-from ConfigParser import RawConfigParser
+from configparser import RawConfigParser
 
 __all__ = [
     'get_config_h_filename',
@@ -209,11 +209,8 @@
     done = {}
     notdone = {}
 
-    f = open(filename)
-    try:
+    with open(filename, errors="surrogateescape") as f:
         lines = f.readlines()
-    finally:
-        f.close()
 
     for line in lines:
         if line.startswith('#') or line.strip() == '':
@@ -237,7 +234,7 @@
                     done[n] = v
 
     # do variable interpolation here
-    variables = notdone.keys()
+    variables = list(notdone.keys())
 
     # Variables with a 'PY_' prefix in the makefile. These need to
     # be made available without that prefix through sysconfig.
@@ -316,7 +313,10 @@
     """Return the path of the Makefile."""
     if _PYTHON_BUILD:
         return os.path.join(_PROJECT_BASE, "Makefile")
-    config_dir_name = 'config'
+    if hasattr(sys, 'abiflags'):
+        config_dir_name = 'config-%s%s' % (_PY_VERSION_SHORT, sys.abiflags)
+    else:
+        config_dir_name = 'config'
     return os.path.join(get_path('stdlib'), config_dir_name, 'Makefile')
 
 
@@ -326,7 +326,7 @@
     makefile = get_makefile_filename()
     try:
         _parse_makefile(makefile, vars)
-    except IOError, e:
+    except IOError as e:
         msg = "invalid Python installation: unable to open %s" % makefile
         if hasattr(e, "strerror"):
             msg = msg + " (%s)" % e.strerror
@@ -334,12 +334,9 @@
     # load the installed pyconfig.h:
     config_h = get_config_h_filename()
     try:
-        f = open(config_h)
-        try:
+        with open(config_h) as f:
             parse_config_h(f, vars)
-        finally:
-            f.close()
-    except IOError, e:
+    except IOError as e:
         msg = "invalid Python installation: unable to open %s" % config_h
         if hasattr(e, "strerror"):
             msg = msg + " (%s)" % e.strerror
@@ -465,6 +462,11 @@
         _CONFIG_VARS['base'] = _PREFIX
         _CONFIG_VARS['platbase'] = _EXEC_PREFIX
         _CONFIG_VARS['projectbase'] = _PROJECT_BASE
+        try:
+            _CONFIG_VARS['abiflags'] = sys.abiflags
+        except AttributeError:
+            # sys.abiflags may not be defined on all platforms.
+            _CONFIG_VARS['abiflags'] = ''
 
         if os.name in ('nt', 'os2'):
             _init_non_posix(_CONFIG_VARS)
@@ -724,13 +726,13 @@
                 # On OSX the machine type returned by uname is always the
                 # 32-bit variant, even if the executable architecture is
                 # the 64-bit variant
-                if sys.maxint >= 2**32:
+                if sys.maxsize >= 2**32:
                     machine = 'x86_64'
 
             elif machine in ('PowerPC', 'Power_Macintosh'):
                 # Pick a sane name for the PPC architecture.
                 # See 'i386' case
-                if sys.maxint >= 2**32:
+                if sys.maxsize >= 2**32:
                     machine = 'ppc64'
                 else:
                     machine = 'ppc'
@@ -745,18 +747,18 @@
 def _print_dict(title, data):
     for index, (key, value) in enumerate(sorted(data.items())):
         if index == 0:
-            print '%s: ' % (title)
-        print '\t%s = "%s"' % (key, value)
+            print('%s: ' % (title))
+        print('\t%s = "%s"' % (key, value))
 
 
 def _main():
     """Display all information sysconfig detains."""
-    print 'Platform: "%s"' % get_platform()
-    print 'Python version: "%s"' % get_python_version()
-    print 'Current installation scheme: "%s"' % _get_default_scheme()
-    print
+    print('Platform: "%s"' % get_platform())
+    print('Python version: "%s"' % get_python_version())
+    print('Current installation scheme: "%s"' % _get_default_scheme())
+    print()
     _print_dict('Paths', get_paths())
-    print
+    print()
     _print_dict('Variables', get_config_vars())
 
 
diff --git a/distutils2/_backport/tarfile.py b/distutils2/_backport/tarfile.py
--- a/distutils2/_backport/tarfile.py
+++ b/distutils2/_backport/tarfile.py
@@ -1,9 +1,8 @@
-#!/usr/bin/env python
-# encoding: utf-8
+#!/usr/bin/env python3
 #-------------------------------------------------------------------
 # tarfile.py
 #-------------------------------------------------------------------
-# Copyright (C) 2002 Lars Gustäbel <lars at gustaebel.de>
+# Copyright (C) 2002 Lars Gustaebel <lars at gustaebel.de>
 # All rights reserved.
 #
 # Permission  is  hereby granted,  free  of charge,  to  any person
@@ -33,10 +32,10 @@
 __version__ = "$Revision$"
 
 version     = "0.9.0"
-__author__  = u"Lars Gust\u00e4bel (lars at gustaebel.de)"
+__author__  = "Lars Gust\u00e4bel (lars at gustaebel.de)"
 __date__    = "$Date: 2011-02-25 17:42:01 +0200 (Fri, 25 Feb 2011) $"
 __cvsid__   = "$Id: tarfile.py 88586 2011-02-25 15:42:01Z marc-andre.lemburg $"
-__credits__ = u"Gustavo Niemeyer, Niels Gust\u00e4bel, Richard Townsend."
+__credits__ = "Gustavo Niemeyer, Niels Gust\u00e4bel, Richard Townsend."
 
 #---------
 # Imports
@@ -68,38 +67,38 @@
 # from tarfile import *
 __all__ = ["TarFile", "TarInfo", "is_tarfile", "TarError"]
 
-from __builtin__ import open as _open # Since 'open' is TarFile.open
+from builtins import open as _open # Since 'open' is TarFile.open
 
 #---------------------------------------------------------
 # tar constants
 #---------------------------------------------------------
-NUL = "\0"                      # the null character
+NUL = b"\0"                     # the null character
 BLOCKSIZE = 512                 # length of processing blocks
 RECORDSIZE = BLOCKSIZE * 20     # length of records
-GNU_MAGIC = "ustar  \0"         # magic gnu tar string
-POSIX_MAGIC = "ustar\x0000"     # magic posix tar string
+GNU_MAGIC = b"ustar  \0"        # magic gnu tar string
+POSIX_MAGIC = b"ustar\x0000"    # magic posix tar string
 
 LENGTH_NAME = 100               # maximum length of a filename
 LENGTH_LINK = 100               # maximum length of a linkname
 LENGTH_PREFIX = 155             # maximum length of the prefix field
 
-REGTYPE = "0"                   # regular file
-AREGTYPE = "\0"                 # regular file
-LNKTYPE = "1"                   # link (inside tarfile)
-SYMTYPE = "2"                   # symbolic link
-CHRTYPE = "3"                   # character special device
-BLKTYPE = "4"                   # block special device
-DIRTYPE = "5"                   # directory
-FIFOTYPE = "6"                  # fifo special device
-CONTTYPE = "7"                  # contiguous file
+REGTYPE = b"0"                  # regular file
+AREGTYPE = b"\0"                # regular file
+LNKTYPE = b"1"                  # link (inside tarfile)
+SYMTYPE = b"2"                  # symbolic link
+CHRTYPE = b"3"                  # character special device
+BLKTYPE = b"4"                  # block special device
+DIRTYPE = b"5"                  # directory
+FIFOTYPE = b"6"                 # fifo special device
+CONTTYPE = b"7"                 # contiguous file
 
-GNUTYPE_LONGNAME = "L"          # GNU tar longname
-GNUTYPE_LONGLINK = "K"          # GNU tar longlink
-GNUTYPE_SPARSE = "S"            # GNU tar sparse file
+GNUTYPE_LONGNAME = b"L"         # GNU tar longname
+GNUTYPE_LONGLINK = b"K"         # GNU tar longlink
+GNUTYPE_SPARSE = b"S"           # GNU tar sparse file
 
-XHDTYPE = "x"                   # POSIX.1-2001 extended header
-XGLTYPE = "g"                   # POSIX.1-2001 global header
-SOLARIS_XHDTYPE = "X"           # Solaris extended header
+XHDTYPE = b"x"                  # POSIX.1-2001 extended header
+XGLTYPE = b"g"                  # POSIX.1-2001 global header
+SOLARIS_XHDTYPE = b"X"          # Solaris extended header
 
 USTAR_FORMAT = 0                # POSIX.1-1988 (ustar) format
 GNU_FORMAT = 1                  # GNU tar format
@@ -129,7 +128,7 @@
               "uid", "gid", "uname", "gname")
 
 # Fields from a pax header that are affected by hdrcharset.
-PAX_NAME_FIELDS = set(("path", "linkpath", "uname", "gname"))
+PAX_NAME_FIELDS = {"path", "linkpath", "uname", "gname"}
 
 # Fields in a pax header that are numbers, all other fields
 # are treated as strings.
@@ -145,26 +144,26 @@
 #---------------------------------------------------------
 # Bits used in the mode field, values in octal.
 #---------------------------------------------------------
-S_IFLNK = 0120000        # symbolic link
-S_IFREG = 0100000        # regular file
-S_IFBLK = 0060000        # block device
-S_IFDIR = 0040000        # directory
-S_IFCHR = 0020000        # character device
-S_IFIFO = 0010000        # fifo
+S_IFLNK = 0o120000        # symbolic link
+S_IFREG = 0o100000        # regular file
+S_IFBLK = 0o060000        # block device
+S_IFDIR = 0o040000        # directory
+S_IFCHR = 0o020000        # character device
+S_IFIFO = 0o010000        # fifo
 
-TSUID   = 04000          # set UID on execution
-TSGID   = 02000          # set GID on execution
-TSVTX   = 01000          # reserved
+TSUID   = 0o4000          # set UID on execution
+TSGID   = 0o2000          # set GID on execution
+TSVTX   = 0o1000          # reserved
 
-TUREAD  = 0400           # read by owner
-TUWRITE = 0200           # write by owner
-TUEXEC  = 0100           # execute/search by owner
-TGREAD  = 0040           # read by group
-TGWRITE = 0020           # write by group
-TGEXEC  = 0010           # execute/search by group
-TOREAD  = 0004           # read by other
-TOWRITE = 0002           # write by other
-TOEXEC  = 0001           # execute/search by other
+TUREAD  = 0o400           # read by owner
+TUWRITE = 0o200           # write by owner
+TUEXEC  = 0o100           # execute/search by owner
+TGREAD  = 0o040           # read by group
+TGWRITE = 0o020           # write by group
+TGEXEC  = 0o010           # execute/search by group
+TOREAD  = 0o004           # read by other
+TOWRITE = 0o002           # write by other
+TOEXEC  = 0o001           # execute/search by other
 
 #---------------------------------------------------------
 # initialization
@@ -187,7 +186,7 @@
 def nts(s, encoding, errors):
     """Convert a null-terminated bytes object to a string.
     """
-    p = s.find("\0")
+    p = s.find(b"\0")
     if p != -1:
         s = s[:p]
     return s.decode(encoding, errors)
@@ -197,14 +196,14 @@
     """
     # There are two possible encodings for a number field, see
     # itn() below.
-    if s[0] != chr(0200):
+    if s[0] != chr(0o200):
         try:
             n = int(nts(s, "ascii", "strict") or "0", 8)
         except ValueError:
             raise InvalidHeaderError("invalid header")
     else:
-        n = 0L
-        for i in xrange(len(s) - 1):
+        n = 0
+        for i in range(len(s) - 1):
             n <<= 8
             n += ord(s[i + 1])
     return n
@@ -215,11 +214,11 @@
     # POSIX 1003.1-1988 requires numbers to be encoded as a string of
     # octal digits followed by a null-byte, this allows values up to
     # (8**(digits-1))-1. GNU tar allows storing numbers greater than
-    # that if necessary. A leading 0200 byte indicates this particular
+    # that if necessary. A leading 0o200 byte indicates this particular
     # encoding, the following digits-1 bytes are a big-endian
     # representation. This allows values up to (256**(digits-1))-1.
     if 0 <= n < 8 ** (digits - 1):
-        s = "%0*o" % (digits - 1, n) + NUL
+        s = bytes("%0*o" % (digits - 1, n), "ascii") + NUL
     else:
         if format != GNU_FORMAT or n >= 256 ** (digits - 1):
             raise ValueError("overflow in number field")
@@ -229,11 +228,11 @@
             # this could raise OverflowError.
             n = struct.unpack("L", struct.pack("l", n))[0]
 
-        s = ""
-        for i in xrange(digits - 1):
-            s = chr(n & 0377) + s
+        s = bytearray()
+        for i in range(digits - 1):
+            s.insert(0, n & 0o377)
             n >>= 8
-        s = chr(0200) + s
+        s.insert(0, 0o200)
     return s
 
 def calc_chksums(buf):
@@ -261,7 +260,7 @@
 
     BUFSIZE = 16 * 1024
     blocks, remainder = divmod(length, BUFSIZE)
-    for b in xrange(blocks):
+    for b in range(blocks):
         buf = src.read(BUFSIZE)
         if len(buf) < BUFSIZE:
             raise IOError("end of file reached")
@@ -353,7 +352,7 @@
 #---------------------------
 # internal stream interface
 #---------------------------
-class _LowLevelFile(object):
+class _LowLevelFile:
     """Low-level file object. Supports reading and writing.
        It is used instead of a regular file object for streaming
        access.
@@ -366,7 +365,7 @@
         }[mode]
         if hasattr(os, "O_BINARY"):
             mode |= os.O_BINARY
-        self.fd = os.open(name, mode, 0666)
+        self.fd = os.open(name, mode, 0o666)
 
     def close(self):
         os.close(self.fd)
@@ -377,7 +376,7 @@
     def write(self, s):
         os.write(self.fd, s)
 
-class _Stream(object):
+class _Stream:
     """Class that serves as an adapter between TarFile and
        a stream-like object.  The stream-like object only
        needs to have a read() or write() method and is accessed
@@ -407,8 +406,8 @@
         self.comptype = comptype
         self.fileobj  = fileobj
         self.bufsize  = bufsize
-        self.buf      = ""
-        self.pos      = 0L
+        self.buf      = b""
+        self.pos      = 0
         self.closed   = False
 
         try:
@@ -418,7 +417,7 @@
                 except ImportError:
                     raise CompressionError("zlib module is not available")
                 self.zlib = zlib
-                self.crc = zlib.crc32("")
+                self.crc = zlib.crc32(b"")
                 if mode == "r":
                     self._init_read_gz()
                 else:
@@ -430,7 +429,7 @@
                 except ImportError:
                     raise CompressionError("bz2 module is not available")
                 if mode == "r":
-                    self.dbuf = ""
+                    self.dbuf = b""
                     self.cmp = bz2.BZ2Decompressor()
                 else:
                     self.cmp = bz2.BZ2Compressor()
@@ -451,8 +450,8 @@
                                             -self.zlib.MAX_WBITS,
                                             self.zlib.DEF_MEM_LEVEL,
                                             0)
-        timestamp = struct.pack("<L", long(time.time()))
-        self.__write("\037\213\010\010%s\002\377" % timestamp)
+        timestamp = struct.pack("<L", int(time.time()))
+        self.__write(b"\037\213\010\010" + timestamp + b"\002\377")
         if self.name.endswith(".gz"):
             self.name = self.name[:-3]
         # RFC1952 says we must use ISO-8859-1 for the FNAME field.
@@ -462,7 +461,7 @@
         """Write string s to the stream.
         """
         if self.comptype == "gz":
-            self.crc = self.zlib.crc32(s, self.crc) & 0xffffffffL
+            self.crc = self.zlib.crc32(s, self.crc)
         self.pos += len(s)
         if self.comptype != "tar":
             s = self.cmp.compress(s)
@@ -489,7 +488,7 @@
 
         if self.mode == "w" and self.buf:
             self.fileobj.write(self.buf)
-            self.buf = ""
+            self.buf = b""
             if self.comptype == "gz":
                 # The native zlib crc is an unsigned 32-bit integer, but
                 # the Python wrapper implicitly casts that to a signed C
@@ -497,8 +496,8 @@
                 # while the same crc on a 64-bit box may "look positive".
                 # To avoid irksome warnings from the `struct` module, force
                 # it to look positive on all boxes.
-                self.fileobj.write(struct.pack("<L", self.crc & 0xffffffffL))
-                self.fileobj.write(struct.pack("<L", self.pos & 0xffffFFFFL))
+                self.fileobj.write(struct.pack("<L", self.crc & 0xffffffff))
+                self.fileobj.write(struct.pack("<L", self.pos & 0xffffFFFF))
 
         if not self._extfileobj:
             self.fileobj.close()
@@ -509,12 +508,12 @@
         """Initialize for reading a gzip compressed fileobj.
         """
         self.cmp = self.zlib.decompressobj(-self.zlib.MAX_WBITS)
-        self.dbuf = ""
+        self.dbuf = b""
 
         # taken from gzip.GzipFile with some alterations
-        if self.__read(2) != "\037\213":
+        if self.__read(2) != b"\037\213":
             raise ReadError("not a gzip file")
-        if self.__read(1) != "\010":
+        if self.__read(1) != b"\010":
             raise CompressionError("unsupported compression method")
 
         flag = ord(self.__read(1))
@@ -547,7 +546,7 @@
         """
         if pos - self.pos >= 0:
             blocks, remainder = divmod(pos - self.pos, self.bufsize)
-            for i in xrange(blocks):
+            for i in range(blocks):
                 self.read(self.bufsize)
             self.read(remainder)
         else:
@@ -609,7 +608,7 @@
         return buf
 # class _Stream
 
-class _StreamProxy(object):
+class _StreamProxy:
     """Small proxy class that enables transparent compression
        detection for the Stream interface (mode 'r|*').
     """
@@ -623,9 +622,9 @@
         return self.buf
 
     def getcomptype(self):
-        if self.buf.startswith("\037\213\010"):
+        if self.buf.startswith(b"\037\213\010"):
             return "gz"
-        if self.buf.startswith("BZh91"):
+        if self.buf.startswith(b"BZh91"):
             return "bz2"
         return "tar"
 
@@ -633,7 +632,7 @@
         self.fileobj.close()
 # class StreamProxy
 
-class _BZ2Proxy(object):
+class _BZ2Proxy:
     """Small proxy class that enables external file object
        support for "r:bz2" and "w:bz2" modes. This is actually
        a workaround for a limitation in bz2 module's BZ2File
@@ -655,7 +654,7 @@
         if self.mode == "r":
             self.bz2obj = bz2.BZ2Decompressor()
             self.fileobj.seek(0)
-            self.buf = ""
+            self.buf = b""
         else:
             self.bz2obj = bz2.BZ2Compressor()
 
@@ -696,7 +695,7 @@
 #------------------------
 # Extraction file object
 #------------------------
-class _FileInFile(object):
+class _FileInFile:
     """A thin wrapper around an existing file object that
        provides a part of its data as an individual file
        object.
@@ -749,7 +748,7 @@
         else:
             size = min(size, self.size - self.position)
 
-        buf = ""
+        buf = b""
         while size > 0:
             while True:
                 data, start, stop, offset = self.map[self.map_index]
@@ -771,7 +770,7 @@
 #class _FileInFile
 
 
-class ExFileObject(object):
+class ExFileObject:
     """File-like object for reading an archive member.
        Is returned by TarFile.extractfile().
     """
@@ -788,7 +787,7 @@
         self.size = tarinfo.size
 
         self.position = 0
-        self.buffer = ""
+        self.buffer = b""
 
     def readable(self):
         return True
@@ -806,11 +805,11 @@
         if self.closed:
             raise ValueError("I/O operation on closed file")
 
-        buf = ""
+        buf = b""
         if self.buffer:
             if size is None:
                 buf = self.buffer
-                self.buffer = ""
+                self.buffer = b""
             else:
                 buf = self.buffer[:size]
                 self.buffer = self.buffer[size:]
@@ -834,14 +833,14 @@
         if self.closed:
             raise ValueError("I/O operation on closed file")
 
-        pos = self.buffer.find("\n") + 1
+        pos = self.buffer.find(b"\n") + 1
         if pos == 0:
             # no newline found.
             while True:
                 buf = self.fileobj.read(self.blocksize)
                 self.buffer += buf
-                if not buf or "\n" in buf:
-                    pos = self.buffer.find("\n") + 1
+                if not buf or b"\n" in buf:
+                    pos = self.buffer.find(b"\n") + 1
                     if pos == 0:
                         # no newline found.
                         pos = len(self.buffer)
@@ -873,25 +872,25 @@
 
         return self.position
 
-    def seek(self, pos, whence=0):
+    def seek(self, pos, whence=os.SEEK_SET):
         """Seek to a position in the file.
         """
         if self.closed:
             raise ValueError("I/O operation on closed file")
 
-        if whence == 0:  # os.SEEK_SET
+        if whence == os.SEEK_SET:
             self.position = min(max(pos, 0), self.size)
-        elif whence == 1:  # os.SEEK_CUR
+        elif whence == os.SEEK_CUR:
             if pos < 0:
                 self.position = max(self.position + pos, 0)
             else:
                 self.position = min(self.position + pos, self.size)
-        elif whence == 2:  # os.SEEK_END
+        elif whence == os.SEEK_END:
             self.position = max(min(self.size + pos, self.size), 0)
         else:
             raise ValueError("Invalid argument")
 
-        self.buffer = ""
+        self.buffer = b""
         self.fileobj.seek(self.position)
 
     def close(self):
@@ -912,7 +911,7 @@
 #------------------
 # Exported Classes
 #------------------
-class TarInfo(object):
+class TarInfo:
     """Informational class which holds the details about an
        archive member given by a tar header block.
        TarInfo objects are returned by TarFile.getmember(),
@@ -931,7 +930,7 @@
            of the member.
         """
         self.name = name        # member name
-        self.mode = 0644        # file permissions
+        self.mode = 0o644       # file permissions
         self.uid = 0            # user id
         self.gid = 0            # group id
         self.size = 0           # file size
@@ -972,7 +971,7 @@
         """
         info = {
             "name":     self.name,
-            "mode":     self.mode & 07777,
+            "mode":     self.mode & 0o7777,
             "uid":      self.uid,
             "gid":      self.gid,
             "size":     self.size,
@@ -991,7 +990,7 @@
 
         return info
 
-    def tobuf(self, format=DEFAULT_FORMAT, encoding=ENCODING, errors="strict"):
+    def tobuf(self, format=DEFAULT_FORMAT, encoding=ENCODING, errors="surrogateescape"):
         """Return a tar header as a string of 512 byte blocks.
         """
         info = self.get_info()
@@ -1023,7 +1022,7 @@
         """
         info["magic"] = GNU_MAGIC
 
-        buf = ""
+        buf = b""
         if len(info["linkname"]) > LENGTH_LINK:
             buf += self._create_gnu_long_header(info["linkname"], GNUTYPE_LONGLINK, encoding, errors)
 
@@ -1070,14 +1069,14 @@
 
             val = info[name]
             if not 0 <= val < 8 ** (digits - 1) or isinstance(val, float):
-                pax_headers[name] = unicode(val)
+                pax_headers[name] = str(val)
                 info[name] = 0
 
         # Create a pax extended header if necessary.
         if pax_headers:
             buf = self._create_pax_generic_header(pax_headers, XHDTYPE, encoding)
         else:
-            buf = ""
+            buf = b""
 
         return buf + self._create_header(info, USTAR_FORMAT, "ascii", "replace")
 
@@ -1109,12 +1108,12 @@
         """
         parts = [
             stn(info.get("name", ""), 100, encoding, errors),
-            itn(info.get("mode", 0) & 07777, 8, format),
+            itn(info.get("mode", 0) & 0o7777, 8, format),
             itn(info.get("uid", 0), 8, format),
             itn(info.get("gid", 0), 8, format),
             itn(info.get("size", 0), 12, format),
             itn(info.get("mtime", 0), 12, format),
-            "        ", # checksum field
+            b"        ", # checksum field
             info.get("type", REGTYPE),
             stn(info.get("linkname", ""), 100, encoding, errors),
             info.get("magic", POSIX_MAGIC),
@@ -1125,9 +1124,9 @@
             stn(info.get("prefix", ""), 155, encoding, errors)
         ]
 
-        buf = struct.pack("%ds" % BLOCKSIZE, "".join(parts))
+        buf = struct.pack("%ds" % BLOCKSIZE, b"".join(parts))
         chksum = calc_chksums(buf[-BLOCKSIZE:])[0]
-        buf = buf[:-364] + "%06o\0" % chksum + buf[-357:]
+        buf = buf[:-364] + bytes("%06o\0" % chksum, "ascii") + buf[-357:]
         return buf
 
     @staticmethod
@@ -1161,7 +1160,7 @@
     def _create_pax_generic_header(cls, pax_headers, type, encoding):
         """Return a POSIX.1-2008 extended or global header sequence
            that contains a list of keyword, value pairs. The values
-           must be unicode objects.
+           must be strings.
         """
         # Check if one of the fields contains surrogate characters and thereby
         # forces hdrcharset=BINARY, see _proc_pax() for more information.
@@ -1173,10 +1172,10 @@
                 binary = True
                 break
 
-        records = ""
+        records = b""
         if binary:
             # Put the hdrcharset field at the beginning of the header.
-            records += "21 hdrcharset=BINARY\n"
+            records += b"21 hdrcharset=BINARY\n"
 
         for keyword, value in pax_headers.items():
             keyword = keyword.encode("utf8")
@@ -1194,7 +1193,7 @@
                 if n == p:
                     break
                 p = n
-            records += bytes(str(p), "ascii") + " " + keyword + "=" + value + "\n"
+            records += bytes(str(p), "ascii") + b" " + keyword + b"=" + value + b"\n"
 
         # We use a hardcoded "././@PaxHeader" name like star does
         # instead of the one that POSIX recommends.
@@ -1355,7 +1354,7 @@
         while isextended:
             buf = tarfile.fileobj.read(BLOCKSIZE)
             pos = 0
-            for i in xrange(21):
+            for i in range(21):
                 try:
                     offset = nti(buf[pos:pos + 12])
                     numbytes = nti(buf[pos + 12:pos + 24])
@@ -1392,7 +1391,7 @@
         # these fields are UTF-8 encoded but since POSIX.1-2008 tar
         # implementations are allowed to store them as raw binary strings if
         # the translation to UTF-8 fails.
-        match = re.search(r"\d+ hdrcharset=([^\n]+)\n", buf)
+        match = re.search(br"\d+ hdrcharset=([^\n]+)\n", buf)
         if match is not None:
             pax_headers["hdrcharset"] = match.group(1).decode("utf8")
 
@@ -1409,7 +1408,7 @@
         # "%d %s=%s\n" % (length, keyword, value). length is the size
         # of the complete record including the length field itself and
         # the newline. keyword and value are both UTF-8 encoded strings.
-        regex = re.compile(r"(\d+) ([^=]+)=")
+        regex = re.compile(br"(\d+) ([^=]+)=")
         pos = 0
         while True:
             match = regex.match(buf, pos)
@@ -1478,10 +1477,10 @@
         """Process a GNU tar extended sparse header, version 0.0.
         """
         offsets = []
-        for match in re.finditer(r"\d+ GNU.sparse.offset=(\d+)\n", buf):
+        for match in re.finditer(br"\d+ GNU.sparse.offset=(\d+)\n", buf):
             offsets.append(int(match.group(1)))
         numbytes = []
-        for match in re.finditer(r"\d+ GNU.sparse.numbytes=(\d+)\n", buf):
+        for match in re.finditer(br"\d+ GNU.sparse.numbytes=(\d+)\n", buf):
             numbytes.append(int(match.group(1)))
         next.sparse = list(zip(offsets, numbytes))
 
@@ -1497,12 +1496,12 @@
         fields = None
         sparse = []
         buf = tarfile.fileobj.read(BLOCKSIZE)
-        fields, buf = buf.split("\n", 1)
+        fields, buf = buf.split(b"\n", 1)
         fields = int(fields)
         while len(sparse) < fields * 2:
-            if "\n" not in buf:
+            if b"\n" not in buf:
                 buf += tarfile.fileobj.read(BLOCKSIZE)
-            number, buf = buf.split("\n", 1)
+            number, buf = buf.split(b"\n", 1)
             sparse.append(int(number))
         next.offset_data = tarfile.fileobj.tell()
         next.sparse = list(zip(sparse[::2], sparse[1::2]))
@@ -1569,7 +1568,7 @@
         return self.type in (CHRTYPE, BLKTYPE, FIFOTYPE)
 # class TarInfo
 
-class TarFile(object):
+class TarFile:
     """The TarFile Class provides an interface to tar archives.
     """
 
@@ -1597,7 +1596,7 @@
 
     def __init__(self, name=None, mode="r", fileobj=None, format=None,
             tarinfo=None, dereference=None, ignore_zeros=None, encoding=None,
-            errors="strict", pax_headers=None, debug=None, errorlevel=None):
+            errors="surrogateescape", pax_headers=None, debug=None, errorlevel=None):
         """Open an (uncompressed) tar archive `name'. `mode' is either 'r' to
            read from an existing archive, 'a' to append data to an existing
            file or 'w' to create a new file overwriting an existing one. `mode'
@@ -1624,10 +1623,7 @@
             if hasattr(fileobj, "mode"):
                 self._mode = fileobj.mode
             self._extfileobj = True
-        if name:
-            self.name = os.path.abspath(name)
-        else:
-            self.name = None
+        self.name = os.path.abspath(name) if name else None
         self.fileobj = fileobj
 
         # Init attributes.
@@ -1678,7 +1674,7 @@
                     except EOFHeaderError:
                         self.fileobj.seek(self.offset)
                         break
-                    except HeaderError, e:
+                    except HeaderError as e:
                         raise ReadError(str(e))
 
             if self.mode in "aw":
@@ -1740,7 +1736,7 @@
                     saved_pos = fileobj.tell()
                 try:
                     return func(name, "r", fileobj, **kwargs)
-                except (ReadError, CompressionError), e:
+                except (ReadError, CompressionError) as e:
                     if fileobj is not None:
                         fileobj.seek(saved_pos)
                     continue
@@ -2010,31 +2006,27 @@
 
         for tarinfo in self:
             if verbose:
-                print filemode(tarinfo.mode),
-                print "%s/%s" % (tarinfo.uname or tarinfo.uid,
-                                 tarinfo.gname or tarinfo.gid),
+                print(filemode(tarinfo.mode), end=' ')
+                print("%s/%s" % (tarinfo.uname or tarinfo.uid,
+                                 tarinfo.gname or tarinfo.gid), end=' ')
                 if tarinfo.ischr() or tarinfo.isblk():
-                    print "%10s" % ("%d,%d" \
-                                    % (tarinfo.devmajor, tarinfo.devminor)),
+                    print("%10s" % ("%d,%d" \
+                                    % (tarinfo.devmajor, tarinfo.devminor)), end=' ')
                 else:
-                    print "%10d" % tarinfo.size,
-                print "%d-%02d-%02d %02d:%02d:%02d" \
-                      % time.localtime(tarinfo.mtime)[:6],
+                    print("%10d" % tarinfo.size, end=' ')
+                print("%d-%02d-%02d %02d:%02d:%02d" \
+                      % time.localtime(tarinfo.mtime)[:6], end=' ')
 
-            if tarinfo.isdir():
-                sep = "/"
-            else:
-                sep = ""
-            print tarinfo.name + sep,
+            print(tarinfo.name + ("/" if tarinfo.isdir() else ""), end=' ')
 
             if verbose:
                 if tarinfo.issym():
-                    print "->", tarinfo.linkname,
+                    print("->", tarinfo.linkname, end=' ')
                 if tarinfo.islnk():
-                    print "link to", tarinfo.linkname,
-            print
+                    print("link to", tarinfo.linkname, end=' ')
+            print()
 
-    def add(self, name, arcname=None, recursive=True, exclude=None, filter=None):
+    def add(self, name, arcname=None, recursive=True, exclude=None, *, filter=None):
         """Add the file `name' to the archive. `name' may be any type of file
            (directory, fifo, symbolic link, etc.). If given, `arcname'
            specifies an alternative name for the file in the archive.
@@ -2139,7 +2131,7 @@
                 # Extract directories with a safe mode.
                 directories.append(tarinfo)
                 tarinfo = copy.copy(tarinfo)
-                tarinfo.mode = 0700
+                tarinfo.mode = 0o700
             # Do not set_attrs directories, as we will do that further down
             self.extract(tarinfo, path, set_attrs=not tarinfo.isdir())
 
@@ -2154,7 +2146,7 @@
                 self.chown(tarinfo, dirpath)
                 self.utime(tarinfo, dirpath)
                 self.chmod(tarinfo, dirpath)
-            except ExtractError, e:
+            except ExtractError as e:
                 if self.errorlevel > 1:
                     raise
                 else:
@@ -2169,7 +2161,7 @@
         """
         self._check("r")
 
-        if isinstance(member, basestring):
+        if isinstance(member, str):
             tarinfo = self.getmember(member)
         else:
             tarinfo = member
@@ -2181,7 +2173,7 @@
         try:
             self._extract_member(tarinfo, os.path.join(path, tarinfo.name),
                                  set_attrs=set_attrs)
-        except EnvironmentError, e:
+        except EnvironmentError as e:
             if self.errorlevel > 0:
                 raise
             else:
@@ -2189,7 +2181,7 @@
                     self._dbg(1, "tarfile: %s" % e.strerror)
                 else:
                     self._dbg(1, "tarfile: %s %r" % (e.strerror, e.filename))
-        except ExtractError, e:
+        except ExtractError as e:
             if self.errorlevel > 1:
                 raise
             else:
@@ -2206,7 +2198,7 @@
         """
         self._check("r")
 
-        if isinstance(member, basestring):
+        if isinstance(member, str):
             tarinfo = self.getmember(member)
         else:
             tarinfo = member
@@ -2287,8 +2279,8 @@
         try:
             # Use a safe mode for the directory, the real mode is set
             # later in _extract_member().
-            os.mkdir(targetpath, 0700)
-        except EnvironmentError, e:
+            os.mkdir(targetpath, 0o700)
+        except EnvironmentError as e:
             if e.errno != errno.EEXIST:
                 raise
 
@@ -2387,7 +2379,7 @@
                 else:
                     if sys.platform != "os2emx":
                         os.chown(targetpath, u, g)
-            except EnvironmentError, e:
+            except EnvironmentError as e:
                 raise ExtractError("could not change owner")
 
     def chmod(self, tarinfo, targetpath):
@@ -2396,7 +2388,7 @@
         if hasattr(os, 'chmod'):
             try:
                 os.chmod(targetpath, tarinfo.mode)
-            except EnvironmentError, e:
+            except EnvironmentError as e:
                 raise ExtractError("could not change mode")
 
     def utime(self, tarinfo, targetpath):
@@ -2406,7 +2398,7 @@
             return
         try:
             os.utime(targetpath, (tarinfo.mtime, tarinfo.mtime))
-        except EnvironmentError, e:
+        except EnvironmentError as e:
             raise ExtractError("could not change modification time")
 
     #--------------------------------------------------------------------------
@@ -2427,12 +2419,12 @@
         while True:
             try:
                 tarinfo = self.tarinfo.fromtarfile(self)
-            except EOFHeaderError, e:
+            except EOFHeaderError as e:
                 if self.ignore_zeros:
                     self._dbg(2, "0x%X: %s" % (self.offset, e))
                     self.offset += BLOCKSIZE
                     continue
-            except InvalidHeaderError, e:
+            except InvalidHeaderError as e:
                 if self.ignore_zeros:
                     self._dbg(2, "0x%X: %s" % (self.offset, e))
                     self.offset += BLOCKSIZE
@@ -2442,10 +2434,10 @@
             except EmptyHeaderError:
                 if self.offset == 0:
                     raise ReadError("empty file")
-            except TruncatedHeaderError, e:
+            except TruncatedHeaderError as e:
                 if self.offset == 0:
                     raise ReadError(str(e))
-            except SubsequentHeaderError, e:
+            except SubsequentHeaderError as e:
                 raise ReadError(str(e))
             break
 
@@ -2532,7 +2524,7 @@
         """Write debugging output to sys.stderr.
         """
         if level <= self.debug:
-            print >> sys.stderr, msg
+            print(msg, file=sys.stderr)
 
     def __enter__(self):
         self._check()
@@ -2549,7 +2541,7 @@
             self.closed = True
 # class TarFile
 
-class TarIter(object):
+class TarIter:
     """Iterator Class.
 
        for tarinfo in TarFile(...):
@@ -2565,7 +2557,7 @@
         """Return iterator object.
         """
         return self
-    def next(self):
+    def __next__(self):
         """Return the next item using TarFile's next() method.
            When all members have been read, set TarFile as _loaded.
         """
diff --git a/distutils2/_backport/tests/test_shutil.py b/distutils2/_backport/tests/test_shutil.py
--- a/distutils2/_backport/tests/test_shutil.py
+++ b/distutils2/_backport/tests/test_shutil.py
@@ -2,11 +2,11 @@
 import sys
 import stat
 import tempfile
+from io import StringIO
 from os.path import splitdrive
-from StringIO import StringIO
+from functools import wraps
+from distutils.spawn import find_executable, spawn
 
-from distutils.spawn import find_executable, spawn
-from distutils2.compat import wraps
 from distutils2._backport import shutil, tarfile
 from distutils2._backport.shutil import (
     _make_tarball, _make_zipfile, make_archive, unpack_archive,
@@ -15,7 +15,7 @@
     Error, RegistryError)
 
 from distutils2.tests import unittest, support
-from test.test_support import TESTFN
+from test.support import TESTFN
 
 
 try:
@@ -370,7 +370,7 @@
                 os.mkfifo(pipe)
                 try:
                     shutil.copytree(TESTFN, TESTFN2)
-                except shutil.Error, e:
+                except shutil.Error as e:
                     errors = e.args[0]
                     self.assertEqual(len(errors), 1)
                     src, dst, error_msg = errors[0]
@@ -756,7 +756,7 @@
         self.dst_file = os.path.join(self.dst_dir, filename)
         f = open(self.src_file, "wb")
         try:
-            f.write("spam")
+            f.write(b"spam")
         finally:
             f.close()
 
@@ -877,7 +877,7 @@
 
     _delete = False
 
-    class Faux(object):
+    class Faux:
         _entered = False
         _exited_with = None
         _raised = False
@@ -917,7 +917,6 @@
 
         self.assertRaises(IOError, shutil.copyfile, 'srcfile', 'destfile')
 
-    @unittest.skip("can't use the with statement and support 2.4")
     def test_w_dest_open_fails(self):
 
         srcfile = self.Faux()
@@ -937,7 +936,6 @@
         self.assertEqual(srcfile._exited_with[1].args,
                          ('Cannot open "destfile"',))
 
-    @unittest.skip("can't use the with statement and support 2.4")
     def test_w_dest_close_fails(self):
 
         srcfile = self.Faux()
@@ -960,7 +958,6 @@
         self.assertEqual(srcfile._exited_with[1].args,
                          ('Cannot close',))
 
-    @unittest.skip("can't use the with statement and support 2.4")
     def test_w_source_close_fails(self):
 
         srcfile = self.Faux(True)
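[The `f.write(b"spam")` change in test_shutil.py above reflects the same str/bytes split: files opened in binary mode accept only bytes in Python 3. A small sketch, using a throwaway temp file rather than the test's fixtures:]

```python
import os
import tempfile

# A file opened with "wb" must be given bytes, not str, in Python 3.
fd, path = tempfile.mkstemp()
os.close(fd)
try:
    with open(path, "wb") as f:
        f.write(b"spam")        # f.write("spam") would raise TypeError
    with open(path, "rb") as f:
        assert f.read() == b"spam"
finally:
    os.remove(path)
```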
diff --git a/distutils2/_backport/tests/test_sysconfig.py b/distutils2/_backport/tests/test_sysconfig.py
--- a/distutils2/_backport/tests/test_sysconfig.py
+++ b/distutils2/_backport/tests/test_sysconfig.py
@@ -3,8 +3,8 @@
 import subprocess
 import shutil
 from copy import copy
-from ConfigParser import RawConfigParser
-from StringIO import StringIO
+from configparser import RawConfigParser
+from io import StringIO
 
 from distutils2._backport import sysconfig
 from distutils2._backport.sysconfig import (
@@ -15,7 +15,7 @@
 from distutils2.tests import unittest
 from distutils2.tests.support import skip_unless_symlink
 
-from test.test_support import TESTFN, unlink
+from test.support import TESTFN, unlink
 
 
 class TestSysConfig(unittest.TestCase):
@@ -141,14 +141,14 @@
         get_config_vars()['CFLAGS'] = ('-fno-strict-aliasing -DNDEBUG -g '
                                        '-fwrapv -O3 -Wall -Wstrict-prototypes')
 
-        maxint = sys.maxint
+        maxint = sys.maxsize
         try:
-            sys.maxint = 2147483647
+            sys.maxsize = 2147483647
             self.assertEqual(get_platform(), 'macosx-10.3-ppc')
-            sys.maxint = 9223372036854775807
+            sys.maxsize = 9223372036854775807
             self.assertEqual(get_platform(), 'macosx-10.3-ppc64')
         finally:
-            sys.maxint = maxint
+            sys.maxsize = maxint
 
         self._set_uname(('Darwin', 'macziade', '8.11.1',
                    ('Darwin Kernel Version 8.11.1: '
@@ -159,14 +159,14 @@
 
         get_config_vars()['CFLAGS'] = ('-fno-strict-aliasing -DNDEBUG -g '
                                        '-fwrapv -O3 -Wall -Wstrict-prototypes')
-        maxint = sys.maxint
+        maxint = sys.maxsize
         try:
-            sys.maxint = 2147483647
+            sys.maxsize = 2147483647
             self.assertEqual(get_platform(), 'macosx-10.3-i386')
-            sys.maxint = 9223372036854775807
+            sys.maxsize = 9223372036854775807
             self.assertEqual(get_platform(), 'macosx-10.3-x86_64')
         finally:
-            sys.maxint = maxint
+            sys.maxsize = maxint
 
         # macbook with fat binaries (fat, universal or fat64)
         get_config_vars()['MACOSX_DEPLOYMENT_TARGET'] = '10.4'
@@ -244,7 +244,7 @@
         def get(python):
             cmd = [python, '-c',
                    'from distutils2._backport import sysconfig; '
-                   'print sysconfig.get_platform()']
+                   'print(sysconfig.get_platform())']
             p = subprocess.Popen(cmd, stdout=subprocess.PIPE, env=os.environ)
             return p.communicate()
         real = os.path.realpath(sys.executable)
@@ -255,7 +255,6 @@
         finally:
             unlink(link)
 
-    @unittest.skipIf(sys.version < '2.6', 'requires Python 2.6 or higher')
     def test_user_similar(self):
         # Issue #8759: make sure the posix scheme for the users
         # is similar to the global posix_prefix one
@@ -290,18 +289,15 @@
         if 'MACOSX_DEPLOYMENT_TARGET' in env:
             del env['MACOSX_DEPLOYMENT_TARGET']
 
-        devnull_fp = open('/dev/null', 'w')
-        try:
+        with open('/dev/null', 'w') as devnull_fp:
             p = subprocess.Popen([
                     sys.executable, '-c',
                     'from distutils2._backport import sysconfig; '
-                    'print sysconfig.get_platform()',
+                    'print(sysconfig.get_platform())',
                 ],
                 stdout=subprocess.PIPE,
                 stderr=devnull_fp,
                 env=env)
-        finally:
-            devnull_fp.close()
         test_platform = p.communicate()[0].strip()
         test_platform = test_platform.decode('utf-8')
         status = p.wait()
@@ -314,12 +310,11 @@
         env = os.environ.copy()
         env['MACOSX_DEPLOYMENT_TARGET'] = '10.1'
 
-        dev_null = open('/dev/null')
-        try:
+        with open('/dev/null') as dev_null:
             p = subprocess.Popen([
                     sys.executable, '-c',
                     'from distutils2._backport import sysconfig; '
-                    'print sysconfig.get_platform()',
+                    'print(sysconfig.get_platform())',
                 ],
                 stdout=subprocess.PIPE,
                 stderr=dev_null,
@@ -330,8 +325,6 @@
 
             self.assertEqual(status, 0)
             self.assertEqual(my_platform, test_platform)
-        finally:
-            dev_null.close()
 
 
 class MakefileTests(unittest.TestCase):
@@ -344,15 +337,12 @@
 
     def test_parse_makefile(self):
         self.addCleanup(unlink, TESTFN)
-        makefile = open(TESTFN, "w")
-        try:
-            print >> makefile, "var1=a$(VAR2)"
-            print >> makefile, "VAR2=b$(var3)"
-            print >> makefile, "var3=42"
-            print >> makefile, "var4=$/invalid"
-            print >> makefile, "var5=dollar$$5"
-        finally:
-            makefile.close()
+        with open(TESTFN, "w") as makefile:
+            print("var1=a$(VAR2)", file=makefile)
+            print("VAR2=b$(var3)", file=makefile)
+            print("var3=42", file=makefile)
+            print("var4=$/invalid", file=makefile)
+            print("var5=dollar$$5", file=makefile)
         vars = sysconfig._parse_makefile(TESTFN)
         self.assertEqual(vars, {
             'var1': 'ab42',
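[The test_parse_makefile hunk above rewrites `print >> makefile, "..."` as `print("...", file=makefile)`. A minimal sketch of the equivalence, using an in-memory stream instead of TESTFN:]

```python
from io import StringIO

# Python 2's "print >> stream, value" becomes print(value, file=stream).
out = StringIO()
print("var1=a$(VAR2)", file=out)
print("var3=42", file=out)

# print() appends a newline after each value, just like the old statement.
assert out.getvalue() == "var1=a$(VAR2)\nvar3=42\n"
```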
diff --git a/distutils2/command/bdist_msi.py b/distutils2/command/bdist_msi.py
--- a/distutils2/command/bdist_msi.py
+++ b/distutils2/command/bdist_msi.py
@@ -394,8 +394,7 @@
         #     entries for each version as the above code does
         if self.pre_install_script:
             scriptfn = os.path.join(self.bdist_dir, "preinstall.bat")
-            f = open(scriptfn, "w")
-            try:
+            with open(scriptfn, "w") as f:
                 # The batch file will be executed with [PYTHON], so that %1
                 # is the path to the Python interpreter; %0 will be the path
                 # of the batch file.
@@ -405,13 +404,8 @@
                 # """
                 # <actual script>
                 f.write('rem ="""\n%1 %0\nexit\n"""\n')
-                fp = open(self.pre_install_script)
-                try:
+                with open(self.pre_install_script) as fp:
                     f.write(fp.read())
-                finally:
-                    fp.close()
-            finally:
-                f.close()
             add_data(self.db, "Binary",
                      [("PreInstall", msilib.Binary(scriptfn)),
                      ])
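[The bdist_msi.py hunk above collapses two nested try/finally pairs into nested `with` statements, which close both files even if reading or writing raises. A sketch of the pattern with hypothetical temp paths standing in for the batch file and pre-install script:]

```python
import os
import tempfile

# Nested with statements replace try/finally: each file is closed
# when its block exits, in reverse order of opening.
fd, src = tempfile.mkstemp()
os.close(fd)
fd, dst = tempfile.mkstemp()
os.close(fd)
try:
    with open(src, "w") as f:
        f.write("echo hello\n")
    with open(dst, "w") as out:
        with open(src) as inp:        # inner file closed first
            out.write(inp.read())
    with open(dst) as f:
        assert f.read() == "echo hello\n"
finally:
    os.remove(src)
    os.remove(dst)
```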
diff --git a/distutils2/command/bdist_wininst.py b/distutils2/command/bdist_wininst.py
--- a/distutils2/command/bdist_wininst.py
+++ b/distutils2/command/bdist_wininst.py
@@ -248,40 +248,33 @@
         logger.info("creating %s", installer_name)
 
         if bitmap:
-            fp = open(bitmap, "rb")
-            try:
+            with open(bitmap, "rb") as fp:
                 bitmapdata = fp.read()
-            finally:
-                fp.close()
             bitmaplen = len(bitmapdata)
         else:
             bitmaplen = 0
 
-        file = open(installer_name, "wb")
-        try:
+        with open(installer_name, "wb") as file:
             file.write(self.get_exe_bytes())
             if bitmap:
                 file.write(bitmapdata)
 
             # Convert cfgdata from unicode to ascii, mbcs encoded
-            if isinstance(cfgdata, unicode):
+            if isinstance(cfgdata, str):
                 cfgdata = cfgdata.encode("mbcs")
 
             # Append the pre-install script
-            cfgdata = cfgdata + "\0"
+            cfgdata = cfgdata + b"\0"
             if self.pre_install_script:
                 # We need to normalize newlines, so we open in text mode and
                 # convert back to bytes. "latin-1" simply avoids any possible
                 # failures.
-                fp = codecs.open(self.pre_install_script, encoding="latin-1")
-                try:
+                with open(self.pre_install_script, encoding="latin-1") as fp:
                     script_data = fp.read().encode("latin-1")
-                finally:
-                    fp.close()
-                cfgdata = cfgdata + script_data + "\n\0"
+                cfgdata = cfgdata + script_data + b"\n\0"
             else:
                 # empty pre-install script
-                cfgdata = cfgdata + "\0"
+                cfgdata = cfgdata + b"\0"
             file.write(cfgdata)
 
             # The 'magic number' 0x1234567B is used to make sure that the
@@ -295,13 +288,8 @@
                                  bitmaplen,        # number of bytes in bitmap
                                  )
             file.write(header)
-            fp = open(arcname, "rb")
-            try:
+            with open(arcname, "rb") as fp:
                 file.write(fp.read())
-            finally:
-                fp.close()
-        finally:
-            file.close()
 
     def get_installer_filename(self, fullname):
         # Factored out to allow overriding in subclasses
@@ -356,9 +344,5 @@
             sfix = ''
 
         filename = os.path.join(directory, "wininst-%.1f%s.exe" % (bv, sfix))
-        fp = open(filename, "rb")
-        try:
-            content = fp.read()
-        finally:
-            fp.close()
-        return content
+        with open(filename, "rb") as fp:
+            return fp.read()
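[The bdist_wininst.py hunks above replace `codecs.open(path, encoding="latin-1")` with the built-in `open(path, encoding="latin-1")`, and encode the text back to bytes before concatenating it with the binary config data. A sketch of that round trip, with an illustrative temp file in place of the pre-install script:]

```python
import os
import tempfile

# In Python 3 the built-in open() takes an encoding argument directly,
# so codecs.open() is no longer needed. Text mode normalizes newlines;
# the result is str and must be encoded before joining with bytes.
fd, path = tempfile.mkstemp()
os.close(fd)
try:
    with open(path, "w", encoding="latin-1") as f:
        f.write("pre-install\n")
    with open(path, encoding="latin-1") as fp:
        script_data = fp.read().encode("latin-1")
    cfgdata = b"config\0" + script_data + b"\n\0"
    assert cfgdata == b"config\x00pre-install\n\n\x00"
finally:
    os.remove(path)
```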
diff --git a/distutils2/command/build_clib.py b/distutils2/command/build_clib.py
--- a/distutils2/command/build_clib.py
+++ b/distutils2/command/build_clib.py
@@ -82,7 +82,7 @@
 
         if self.include_dirs is None:
             self.include_dirs = self.distribution.include_dirs or []
-        if isinstance(self.include_dirs, basestring):
+        if isinstance(self.include_dirs, str):
             self.include_dirs = self.include_dirs.split(os.pathsep)
 
         # XXX same as for build_ext -- what about 'self.define' and
@@ -130,7 +130,7 @@
 
             name, build_info = lib
 
-            if not isinstance(name, basestring):
+            if not isinstance(name, str):
                 raise PackagingSetupError("first element of each tuple in 'libraries' " + \
                       "must be a string (the library name)")
             if '/' in name or (os.sep != '/' and os.sep in name):
diff --git a/distutils2/command/build_ext.py b/distutils2/command/build_ext.py
--- a/distutils2/command/build_ext.py
+++ b/distutils2/command/build_ext.py
@@ -3,6 +3,7 @@
 import os
 import re
 import sys
+import site
 import logging
 
 from distutils2._backport import sysconfig
@@ -15,12 +16,6 @@
 from distutils2.compiler.extension import Extension
 from distutils2 import logger
 
-import site
-if sys.version_info[:2] >= (2, 6):
-    HAS_USER_SITE = True
-else:
-    HAS_USER_SITE = False
-
 if os.name == 'nt':
     from distutils2.compiler.msvccompiler import get_build_version
     MSVC_VERSION = int(get_build_version())
@@ -65,6 +60,8 @@
         ('inplace', 'i',
          "ignore build-lib and put compiled extensions into the source " +
          "directory alongside your pure Python modules"),
+        ('user', None,
+         "add user include, library and rpath"),
         ('include-dirs=', 'I',
          "list of directories to search for header files" + sep_by),
         ('define=', 'D',
@@ -91,12 +88,8 @@
          "path to the SWIG executable"),
         ]
 
-    boolean_options = ['inplace', 'debug', 'force']
+    boolean_options = ['inplace', 'debug', 'force', 'user']
 
-    if HAS_USER_SITE:
-        user_options.append(('user', None,
-                             "add user include, library and rpath"))
-        boolean_options.append('user')
 
     help_options = [
         ('help-compiler', None,
@@ -123,8 +116,7 @@
         self.compiler = None
         self.swig = None
         self.swig_opts = None
-        if HAS_USER_SITE:
-            self.user = None
+        self.user = None
 
     def finalize_options(self):
         self.set_undefined_options('build',
@@ -159,7 +151,7 @@
         plat_py_include = sysconfig.get_path('platinclude')
         if self.include_dirs is None:
             self.include_dirs = self.distribution.include_dirs or []
-        if isinstance(self.include_dirs, basestring):
+        if isinstance(self.include_dirs, str):
             self.include_dirs = self.include_dirs.split(os.pathsep)
 
         # Put the Python "system" include dir at the end, so that
@@ -168,7 +160,7 @@
         if plat_py_include != py_include:
             self.include_dirs.append(plat_py_include)
 
-        if isinstance(self.libraries, basestring):
+        if isinstance(self.libraries, str):
             self.libraries = [self.libraries]
 
         # Life is easier if we're not forever checking for None, so
@@ -177,12 +169,12 @@
             self.libraries = []
         if self.library_dirs is None:
             self.library_dirs = []
-        elif isinstance(self.library_dirs, basestring):
+        elif isinstance(self.library_dirs, str):
             self.library_dirs = self.library_dirs.split(os.pathsep)
 
         if self.rpath is None:
             self.rpath = []
-        elif isinstance(self.rpath, basestring):
+        elif isinstance(self.rpath, str):
             self.rpath = self.rpath.split(os.pathsep)
 
         # for extensions under windows use different directories
@@ -243,8 +235,7 @@
         # for extensions under Linux or Solaris with a shared Python library,
         # Python's library directory must be appended to library_dirs
         sysconfig.get_config_var('Py_ENABLE_SHARED')
-        if ((sys.platform.startswith('linux') or sys.platform.startswith('gnu')
-             or sys.platform.startswith('sunos'))
+        if (sys.platform.startswith(('linux', 'gnu', 'sunos'))
             and sysconfig.get_config_var('Py_ENABLE_SHARED')):
             if sys.executable.startswith(os.path.join(sys.exec_prefix, "bin")):
                 # building third party extensions
@@ -274,7 +265,7 @@
             self.swig_opts = self.swig_opts.split(' ')
 
         # Finally add the user include and library directories if requested
-        if HAS_USER_SITE and self.user:
+        if self.user:
             user_include = os.path.join(site.USER_BASE, "include")
             user_lib = os.path.join(site.USER_BASE, "lib")
             if os.path.isdir(user_include):
@@ -362,7 +353,7 @@
         for ext in self.extensions:
             try:
                 self.build_extension(ext)
-            except (CCompilerError, PackagingError, CompileError), e:
+            except (CCompilerError, PackagingError, CompileError) as e:
                 if not ext.optional:
                     raise
                 logger.warning('%s: building extension %r failed: %s',
@@ -650,7 +641,7 @@
 
         else:
             if sysconfig.get_config_var('Py_ENABLE_SHARED'):
-                template = 'python%d.%d'
+                template = 'python%d.%d' + getattr(sys, 'abiflags', '')
                 pythonlib = template % sys.version_info[:2]
                 return ext.libraries + [pythonlib]
             else:
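The build_ext hunk above collapses three chained `sys.platform.startswith(...)` tests into one call: `str.startswith` has accepted a tuple of prefixes since Python 2.5, so the `or` chain is redundant. A minimal sketch of the idiom (the helper name is illustrative, not from the codebase):

```python
def is_shared_libpython_platform(platform):
    # One call replaces the three 'or'-chained tests in the diff;
    # startswith tries each prefix in the tuple in turn.
    return platform.startswith(('linux', 'gnu', 'sunos'))
```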
diff --git a/distutils2/command/build_py.py b/distutils2/command/build_py.py
--- a/distutils2/command/build_py.py
+++ b/distutils2/command/build_py.py
@@ -342,7 +342,7 @@
         return outputs
 
     def build_module(self, module, module_file, package):
-        if isinstance(package, basestring):
+        if isinstance(package, str):
             package = package.split('.')
         elif not isinstance(package, (list, tuple)):
             raise TypeError(
@@ -388,7 +388,7 @@
                 self.build_module(module, module_file, package)
 
     def byte_compile(self, files):
-        if getattr(sys, 'dont_write_bytecode', False):
+        if sys.dont_write_bytecode:
             logger.warning('%s: byte-compiling is disabled, skipping.',
                            self.get_command_name())
             return
diff --git a/distutils2/command/build_scripts.py b/distutils2/command/build_scripts.py
--- a/distutils2/command/build_scripts.py
+++ b/distutils2/command/build_scripts.py
@@ -2,17 +2,18 @@
 
 import os
 import re
+from tokenize import detect_encoding
 
 from distutils2.command.cmd import Command
 from distutils2.util import convert_path, newer
 from distutils2 import logger
 from distutils2.compat import Mixin2to3
-from distutils2.compat import detect_encoding, fsencode
+from distutils2.compat import fsencode
 from distutils2._backport import sysconfig
 
 
 # check if Python is called on the first line with this expression
-first_line_re = re.compile('^#!.*python[0-9.]*([ \t].*)?$')
+first_line_re = re.compile(b'^#!.*python[0-9.]*([ \t].*)?$')
 
 class build_scripts(Command, Mixin2to3):
 
@@ -94,7 +95,7 @@
                 match = first_line_re.match(first_line)
                 if match:
                     adjust = True
-                    post_interp = match.group(1) or ''
+                    post_interp = match.group(1) or b''
 
             if adjust:
                 logger.info("copying and adjusting %s -> %s", script,
@@ -108,7 +109,7 @@
                            "python%s%s" % (sysconfig.get_config_var("VERSION"),
                                            sysconfig.get_config_var("EXE")))
                     executable = fsencode(executable)
-                    shebang = "#!" + executable + post_interp + "\n"
+                    shebang = b"#!" + executable + post_interp + b"\n"
                     # Python parser starts to read a script using UTF-8 until
                     # it gets a #coding:xxx cookie. The shebang has to be the
                     # first line of a file, the #coding:xxx cookie cannot be
@@ -130,12 +131,9 @@
                             "The shebang (%r) is not decodable "
                             "from the script encoding (%s)" % (
                                 shebang, encoding))
-                    outf = open(outfile, "wb")
-                    try:
+                    with open(outfile, "wb") as outf:
                         outf.write(shebang)
                         outf.writelines(f.readlines())
-                    finally:
-                        outf.close()
                 if f:
                     f.close()
             else:
@@ -148,8 +146,8 @@
                 if self.dry_run:
                     logger.info("changing mode of %s", file)
                 else:
-                    oldmode = os.stat(file).st_mode & 07777
-                    newmode = (oldmode | 0555) & 07777
+                    oldmode = os.stat(file).st_mode & 0o7777
+                    newmode = (oldmode | 0o555) & 0o7777
                     if newmode != oldmode:
                         logger.info("changing mode of %s from %o to %o",
                                  file, oldmode, newmode)
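The build_scripts changes switch shebang handling to bytes throughout: the regex pattern, the empty default for the interpreter options, and the rebuilt `#!` line all become bytes, because the script file is read in binary mode under Python 3 and mixing `bytes + str` raises `TypeError`. A standalone sketch of the adjusted pieces (the helper function is illustrative; the regex is the one from the diff):

```python
import re

# pattern from the diff, as bytes: matches "#!...python<version> <options>"
first_line_re = re.compile(b'^#!.*python[0-9.]*([ \t].*)?$')

def adjust_shebang(first_line, executable):
    """Return a rewritten shebang line for the target interpreter,
    or None if the line is not a python shebang."""
    match = first_line_re.match(first_line)
    if match is None:
        return None
    post_interp = match.group(1) or b''  # options after the interpreter
    # every fragment is bytes, so the concatenation is well-typed
    return b'#!' + executable + post_interp + b'\n'
```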
diff --git a/distutils2/command/cmd.py b/distutils2/command/cmd.py
--- a/distutils2/command/cmd.py
+++ b/distutils2/command/cmd.py
@@ -8,7 +8,7 @@
 from distutils2._backport.shutil import copyfile, move, make_archive
 
 
-class Command(object):
+class Command:
     """Abstract base class for defining command classes, the "worker bees"
     of the Packaging.  A useful analogy for command classes is to think of
     them as subroutines with local variables called "options".  The options
@@ -216,7 +216,7 @@
         if val is None:
             setattr(self, option, default)
             return default
-        elif not isinstance(val, basestring):
+        elif not isinstance(val, str):
             raise PackagingOptionError("'%s' must be a %s (got `%s`)" %
                                        (option, what, val))
         return val
@@ -236,14 +236,14 @@
         val = getattr(self, option)
         if val is None:
             return
-        elif isinstance(val, basestring):
+        elif isinstance(val, str):
             setattr(self, option, re.split(r',\s*|\s+', val))
         else:
             if isinstance(val, list):
                 # checks if all elements are str
                 ok = True
                 for element in val:
-                    if not isinstance(element, basestring):
+                    if not isinstance(element, str):
                         ok = False
                         break
             else:
@@ -351,7 +351,7 @@
     def execute(self, func, args, msg=None, level=1):
         util.execute(func, args, msg, dry_run=self.dry_run)
 
-    def mkpath(self, name, mode=00777, dry_run=None, verbose=0):
+    def mkpath(self, name, mode=0o777, dry_run=None, verbose=0):
         if dry_run is None:
             dry_run = self.dry_run
         name = os.path.normpath(name)
@@ -421,7 +421,7 @@
             skip_msg = "skipping %s (inputs unchanged)" % outfile
 
         # Allow 'infiles' to be a single string
-        if isinstance(infiles, basestring):
+        if isinstance(infiles, str):
             infiles = (infiles,)
         elif not isinstance(infiles, (list, tuple)):
             raise TypeError(
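The `mkpath` signature change is part of a wider sweep: Python 3 rejects the bare leading-zero octal form (`0777` and `00777` are syntax errors), while the `0o` prefix is accepted by 2.6+ and 3.x alike. The mode arithmetic itself is unchanged:

```python
DEFAULT_DIR_MODE = 0o777      # was 00777; same value, 511 decimal

def make_executable(mode):
    # same arithmetic as the build_scripts/install_scripts hunks:
    # grant read+execute to everyone, keep only the permission bits
    return (mode | 0o555) & 0o7777
```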
diff --git a/distutils2/command/config.py b/distutils2/command/config.py
--- a/distutils2/command/config.py
+++ b/distutils2/command/config.py
@@ -67,17 +67,17 @@
     def finalize_options(self):
         if self.include_dirs is None:
             self.include_dirs = self.distribution.include_dirs or []
-        elif isinstance(self.include_dirs, basestring):
+        elif isinstance(self.include_dirs, str):
             self.include_dirs = self.include_dirs.split(os.pathsep)
 
         if self.libraries is None:
             self.libraries = []
-        elif isinstance(self.libraries, basestring):
+        elif isinstance(self.libraries, str):
             self.libraries = [self.libraries]
 
         if self.library_dirs is None:
             self.library_dirs = []
-        elif isinstance(self.library_dirs, basestring):
+        elif isinstance(self.library_dirs, str):
             self.library_dirs = self.library_dirs.split(os.pathsep)
 
     def run(self):
@@ -110,8 +110,7 @@
 
     def _gen_temp_sourcefile(self, body, headers, lang):
         filename = "_configtest" + LANG_EXT[lang]
-        file = open(filename, "w")
-        try:
+        with open(filename, "w") as file:
             if headers:
                 for header in headers:
                     file.write("#include <%s>\n" % header)
@@ -119,8 +118,6 @@
             file.write(body)
             if body[-1] != "\n":
                 file.write("\n")
-        finally:
-            file.close()
         return filename
 
     def _preprocess(self, body, headers, include_dirs, lang):
@@ -206,11 +203,10 @@
         self._check_compiler()
         src, out = self._preprocess(body, headers, include_dirs, lang)
 
-        if isinstance(pattern, basestring):
+        if isinstance(pattern, str):
             pattern = re.compile(pattern)
 
-        file = open(out)
-        try:
+        with open(out) as file:
             match = False
             while True:
                 line = file.readline()
@@ -219,8 +215,6 @@
                 if pattern.search(line):
                     match = True
                     break
-        finally:
-            file.close()
 
         self._clean()
         return match
@@ -351,8 +345,5 @@
         logger.info(filename)
     else:
         logger.info(head)
-    file = open(filename)
-    try:
+    with open(filename) as file:
         logger.info(file.read())
-    finally:
-        file.close()
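Several hunks in config.py replace the open/try/finally idiom with a `with` statement, which closes the file on both normal exit and exceptions in a third of the lines. `_gen_temp_sourcefile` after the rewrite looks roughly like this (simplified, with an illustrative name and an explicit filename parameter):

```python
def write_config_source(filename, body, headers=()):
    # 'with' guarantees the file is closed even if a write raises,
    # replacing the explicit try/finally pairs removed in the diff
    with open(filename, "w") as f:
        for header in headers:
            f.write("#include <%s>\n" % header)
        f.write(body)
        if not body.endswith("\n"):
            f.write("\n")
    return filename
```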
diff --git a/distutils2/command/install_data.py b/distutils2/command/install_data.py
--- a/distutils2/command/install_data.py
+++ b/distutils2/command/install_data.py
@@ -48,7 +48,7 @@
             self.mkpath(dir_dest)
             try:
                 out = self.copy_file(_file[0], dir_dest)[0]
-            except Error, e:
+            except Error as e:
                 logger.warning('%s: %s', self.get_command_name(), e)
                 out = destination
 
diff --git a/distutils2/command/install_dist.py b/distutils2/command/install_dist.py
--- a/distutils2/command/install_dist.py
+++ b/distutils2/command/install_dist.py
@@ -13,12 +13,6 @@
 from distutils2.util import convert_path, change_root, get_platform
 from distutils2.errors import PackagingOptionError
 
-import site
-if sys.version_info[:2] >= (2, 6):
-    HAS_USER_SITE = True
-else:
-    HAS_USER_SITE = False
-
 
 class install_dist(Command):
 
@@ -30,6 +24,9 @@
          "installation prefix"),
         ('exec-prefix=', None,
          "(Unix only) prefix for platform-specific files"),
+        ('user', None,
+         "install in user site-packages directory [%s]" %
+         get_path('purelib', '%s_user' % os.name)),
         ('home=', None,
          "(Unix only) home directory to install under"),
 
@@ -100,15 +97,7 @@
         ]
 
     boolean_options = ['compile', 'force', 'skip-build', 'no-distinfo',
-                       'requested', 'no-record']
-
-    if HAS_USER_SITE:
-        user_options.append(
-            ('user', None,
-             "install in user site-packages directory [%s]" %
-             get_path('purelib', '%s_user' % os.name)))
-
-        boolean_options.append('user')
+                       'requested', 'no-record', 'user']
 
     negative_opt = {'no-compile': 'compile', 'no-requested': 'requested'}
 
@@ -118,8 +107,7 @@
         self.prefix = None
         self.exec_prefix = None
         self.home = None
-        if HAS_USER_SITE:
-            self.user = False
+        self.user = False
 
         # These select only the installation base; it's up to the user to
         # specify the installation scheme (currently, that means supplying
@@ -138,9 +126,8 @@
         self.install_lib = None         # set to either purelib or platlib
         self.install_scripts = None
         self.install_data = None
-        if HAS_USER_SITE:
-            self.install_userbase = get_config_var('userbase')
-            self.install_usersite = get_path('purelib', '%s_user' % os.name)
+        self.install_userbase = get_config_var('userbase')
+        self.install_usersite = get_path('purelib', '%s_user' % os.name)
 
         self.compile = None
         self.optimize = None
@@ -221,9 +208,8 @@
             raise PackagingOptionError(
                 "must supply either home or prefix/exec-prefix -- not both")
 
-        if HAS_USER_SITE and self.user and (
-                self.prefix or self.exec_prefix or self.home or
-                self.install_base or self.install_platbase):
+        if self.user and (self.prefix or self.exec_prefix or self.home or
+                          self.install_base or self.install_platbase):
             raise PackagingOptionError(
                 "can't combine user with prefix/exec_prefix/home or "
                 "install_base/install_platbase")
@@ -276,11 +262,9 @@
             'exec_prefix': exec_prefix,
             'srcdir': srcdir,
             'projectbase': projectbase,
-            }
-
-        if HAS_USER_SITE:
-            self.config_vars['userbase'] = self.install_userbase
-            self.config_vars['usersite'] = self.install_usersite
+            'userbase': self.install_userbase,
+            'usersite': self.install_usersite,
+        }
 
         self.expand_basedirs()
 
@@ -298,7 +282,7 @@
         self.dump_dirs("post-expand_dirs()")
 
         # Create directories under USERBASE
-        if HAS_USER_SITE and self.user:
+        if self.user:
             self.create_user_dirs()
 
         # Pick the actual directory to install all modules to: either
@@ -313,10 +297,8 @@
 
         # Convert directories from Unix /-separated syntax to the local
         # convention.
-        self.convert_paths('lib', 'purelib', 'platlib',
-                           'scripts', 'data', 'headers')
-        if HAS_USER_SITE:
-            self.convert_paths('userbase', 'usersite')
+        self.convert_paths('lib', 'purelib', 'platlib', 'scripts',
+                           'data', 'headers', 'userbase', 'usersite')
 
         # Well, we're not actually fully completely finalized yet: we still
         # have to deal with 'extra_path', which is the hack for allowing
@@ -357,7 +339,7 @@
                     "installation scheme is incomplete")
             return
 
-        if HAS_USER_SITE and self.user:
+        if self.user:
             if self.install_userbase is None:
                 raise PackagingPlatformError(
                     "user base directory is not specified")
@@ -385,7 +367,7 @@
 
     def finalize_other(self):
         """Finalize options for non-posix platforms"""
-        if HAS_USER_SITE and self.user:
+        if self.user:
             if self.install_userbase is None:
                 raise PackagingPlatformError(
                     "user base directory is not specified")
@@ -466,7 +448,7 @@
             self.extra_path = self.distribution.extra_path
 
         if self.extra_path is not None:
-            if isinstance(self.extra_path, basestring):
+            if isinstance(self.extra_path, str):
                 self.extra_path = self.extra_path.split(',')
 
             if len(self.extra_path) == 1:
@@ -501,7 +483,7 @@
         home = convert_path(os.path.expanduser("~"))
         for name, path in self.config_vars.items():
             if path.startswith(home) and not os.path.isdir(path):
-                os.makedirs(path, 0700)
+                os.makedirs(path, 0o700)
 
     # -- Command execution methods -------------------------------------
 
diff --git a/distutils2/command/install_distinfo.py b/distutils2/command/install_distinfo.py
--- a/distutils2/command/install_distinfo.py
+++ b/distutils2/command/install_distinfo.py
@@ -4,12 +4,8 @@
 
 import os
 import csv
-import codecs
+import hashlib
 from shutil import rmtree
-try:
-    import hashlib
-except ImportError:
-    from distutils2._backport import hashlib
 
 from distutils2 import logger
 from distutils2.command.cmd import Command
@@ -90,11 +86,8 @@
         installer_path = os.path.join(self.distinfo_dir, 'INSTALLER')
         logger.info('creating %s', installer_path)
         if not self.dry_run:
-            f = open(installer_path, 'w')
-            try:
+            with open(installer_path, 'w') as f:
                 f.write(self.installer)
-            finally:
-                f.close()
         self.outfiles.append(installer_path)
 
         if self.requested:
@@ -111,15 +104,12 @@
                                               'RESOURCES')
                 logger.info('creating %s', resources_path)
                 if not self.dry_run:
-                    f = open(resources_path, 'wb')
-                    try:
+                    with open(resources_path, 'wb') as f:
                         writer = csv.writer(f, delimiter=',',
                                             lineterminator='\n',
                                             quotechar='"')
                         for row in install_data.get_resources_out():
                             writer.writerow(row)
-                    finally:
-                        f.close()
 
                 self.outfiles.append(resources_path)
 
@@ -127,8 +117,7 @@
             record_path = os.path.join(self.distinfo_dir, 'RECORD')
             logger.info('creating %s', record_path)
             if not self.dry_run:
-                f = codecs.open(record_path, 'w', encoding='utf-8')
-                try:
+                with open(record_path, 'w', encoding='utf-8') as f:
                     writer = csv.writer(f, delimiter=',',
                                         lineterminator='\n',
                                         quotechar='"')
@@ -141,19 +130,14 @@
                             writer.writerow((fpath, '', ''))
                         else:
                             size = os.path.getsize(fpath)
-                            fp = open(fpath, 'rb')
-                            try:
+                            with open(fpath, 'rb') as fp:
                                 hash = hashlib.md5()
                                 hash.update(fp.read())
-                            finally:
-                                fp.close()
                             md5sum = hash.hexdigest()
                             writer.writerow((fpath, md5sum, size))
 
                     # add the RECORD file itself
                     writer.writerow((record_path, '', ''))
-                finally:
-                    f.close()
             self.outfiles.append(record_path)
 
     def get_outputs(self):
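install_distinfo.py can import `hashlib` directly (it is always present on 3.x, so the backport fallback goes away) and open RECORD as a text file, since Python 3's `csv.writer` expects a text stream; `open(..., 'w', encoding='utf-8')` replaces `codecs.open`. The two building blocks, sketched with illustrative helper names:

```python
import csv
import hashlib
import io
import os

def record_row(fpath):
    # one RECORD entry: path, md5 checksum, size in bytes
    with open(fpath, 'rb') as fp:
        digest = hashlib.md5(fp.read()).hexdigest()
    return fpath, digest, os.path.getsize(fpath)

def format_record(rows):
    # csv.writer emits str, so it needs a text buffer/file in Python 3
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=',', lineterminator='\n',
                        quotechar='"')
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()
```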
diff --git a/distutils2/command/install_lib.py b/distutils2/command/install_lib.py
--- a/distutils2/command/install_lib.py
+++ b/distutils2/command/install_lib.py
@@ -114,7 +114,7 @@
         return outfiles
 
     def byte_compile(self, files):
-        if getattr(sys, 'dont_write_bytecode', False):
+        if sys.dont_write_bytecode:
             # XXX do we want this?  because a Python runs without bytecode
             # doesn't mean that the *dists should not contain bytecode
             #--or does it?
diff --git a/distutils2/command/install_scripts.py b/distutils2/command/install_scripts.py
--- a/distutils2/command/install_scripts.py
+++ b/distutils2/command/install_scripts.py
@@ -48,7 +48,7 @@
                 if self.dry_run:
                     logger.info("changing mode of %s", file)
                 else:
-                    mode = (os.stat(file).st_mode | 0555) & 07777
+                    mode = (os.stat(file).st_mode | 0o555) & 0o7777
                     logger.info("changing mode of %s to %o", file, mode)
                     os.chmod(file, mode)
 
diff --git a/distutils2/command/register.py b/distutils2/command/register.py
--- a/distutils2/command/register.py
+++ b/distutils2/command/register.py
@@ -3,8 +3,9 @@
 # Contributed by Richard Jones
 
 import getpass
-import urllib2
-import urlparse
+import urllib.error
+import urllib.parse
+import urllib.request
 
 from distutils2 import logger
 from distutils2.util import (read_pypirc, generate_pypirc, DEFAULT_REPOSITORY,
@@ -80,7 +81,7 @@
     def classifiers(self):
         ''' Fetch the list of classifiers from the server.
         '''
-        response = urllib2.urlopen(self.repository+'?:action=list_classifiers')
+        response = urllib.request.urlopen(self.repository+'?:action=list_classifiers')
         logger.info(response.read())
 
     def verify_metadata(self):
@@ -143,22 +144,22 @@
  4. quit
 Your selection [default 1]: ''')
 
-            choice = raw_input()
+            choice = input()
             if not choice:
                 choice = '1'
             elif choice not in choices:
-                print 'Please choose one of the four options!'
+                print('Please choose one of the four options!')
 
         if choice == '1':
             # get the username and password
             while not username:
-                username = raw_input('Username: ')
+                username = input('Username: ')
             while not password:
                 password = getpass.getpass('Password: ')
 
             # set up the authentication
-            auth = urllib2.HTTPPasswordMgr()
-            host = urlparse.urlparse(self.repository)[1]
+            auth = urllib.request.HTTPPasswordMgr()
+            host = urllib.parse.urlparse(self.repository)[1]
             auth.add_password(self.realm, host, username, password)
             # send the info to the server and report the result
             code, result = self.post_to_server(self.build_post_data('submit'),
@@ -178,7 +179,7 @@
                         get_pypirc_path())
                     choice = 'X'
                     while choice.lower() not in 'yn':
-                        choice = raw_input('Save your login (y/N)?')
+                        choice = input('Save your login (y/N)?')
                         if not choice:
                             choice = 'n'
                     if choice.lower() == 'y':
@@ -189,7 +190,7 @@
             data['name'] = data['password'] = data['email'] = ''
             data['confirm'] = None
             while not data['name']:
-                data['name'] = raw_input('Username: ')
+                data['name'] = input('Username: ')
             while data['password'] != data['confirm']:
                 while not data['password']:
                     data['password'] = getpass.getpass('Password: ')
@@ -198,9 +199,9 @@
                 if data['password'] != data['confirm']:
                     data['password'] = ''
                     data['confirm'] = None
-                    print "Password and confirm don't match!"
+                    print("Password and confirm don't match!")
             while not data['email']:
-                data['email'] = raw_input('   EMail: ')
+                data['email'] = input('   EMail: ')
             code, result = self.post_to_server(data)
             if code != 200:
                 logger.info('server response (%s): %s', code, result)
@@ -211,7 +212,7 @@
             data = {':action': 'password_reset'}
             data['email'] = ''
             while not data['email']:
-                data['email'] = raw_input('Your email address: ')
+                data['email'] = input('Your email address: ')
             code, result = self.post_to_server(data)
             logger.info('server response (%s): %s', code, result)
 
@@ -236,20 +237,20 @@
             'Content-type': content_type,
             'Content-length': str(len(body))
         }
-        req = urllib2.Request(self.repository, body, headers)
+        req = urllib.request.Request(self.repository, body, headers)
 
         # handle HTTP and include the Basic Auth handler
-        opener = urllib2.build_opener(
-            urllib2.HTTPBasicAuthHandler(password_mgr=auth)
+        opener = urllib.request.build_opener(
+            urllib.request.HTTPBasicAuthHandler(password_mgr=auth)
         )
         data = ''
         try:
             result = opener.open(req)
-        except urllib2.HTTPError, e:
+        except urllib.error.HTTPError as e:
             if self.show_response:
                 data = e.fp.read()
             result = e.code, e.msg
-        except urllib2.URLError, e:
+        except urllib.error.URLError as e:
             result = 500, str(e)
         else:
             if self.show_response:
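register.py tracks the Python 3 reorganization of the URL-handling modules: `urllib2` was split across `urllib.request` and `urllib.error`, `urlparse` became `urllib.parse`, and `raw_input`/the `print` statement became `input()`/`print()`. The auth setup from the diff, runnable without touching the network (URL, realm, and credentials below are placeholders):

```python
import urllib.parse
import urllib.request

repository = 'https://pypi.python.org/pypi'      # placeholder URL
host = urllib.parse.urlparse(repository)[1]      # the netloc component

auth = urllib.request.HTTPPasswordMgr()
auth.add_password('pypi', host, 'user', 'secret')  # placeholder realm/creds
opener = urllib.request.build_opener(
    urllib.request.HTTPBasicAuthHandler(password_mgr=auth))
```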
diff --git a/distutils2/command/sdist.py b/distutils2/command/sdist.py
--- a/distutils2/command/sdist.py
+++ b/distutils2/command/sdist.py
@@ -3,7 +3,7 @@
 import os
 import re
 import sys
-from StringIO import StringIO
+from io import StringIO
 
 from distutils2 import logger
 from distutils2.util import resolve_name
@@ -134,7 +134,7 @@
         if self.manifest_builders is None:
             self.manifest_builders = []
         else:
-            if isinstance(self.manifest_builders, basestring):
+            if isinstance(self.manifest_builders, str):
                 self.manifest_builders = self.manifest_builders.split(',')
             builders = []
             for builder in self.manifest_builders:
@@ -143,7 +143,7 @@
                     continue
                 try:
                     builder = resolve_name(builder)
-                except ImportError, e:
+                except ImportError as e:
                     raise PackagingModuleError(e)
 
                 builders.append(builder)
@@ -337,7 +337,7 @@
         """
         return self.archive_files
 
-    def create_tree(self, base_dir, files, mode=0777, verbose=1,
+    def create_tree(self, base_dir, files, mode=0o777, verbose=1,
                     dry_run=False):
         need_dir = set()
         for file in files:
diff --git a/distutils2/command/upload.py b/distutils2/command/upload.py
--- a/distutils2/command/upload.py
+++ b/distutils2/command/upload.py
@@ -4,14 +4,11 @@
 import socket
 import logging
 import platform
-import urlparse
+import urllib.parse
 from base64 import standard_b64encode
-try:
-    from hashlib import md5
-except ImportError:
-    from distutils2._backport.hashlib import md5
-from urllib2 import HTTPError
-from urllib2 import urlopen, Request
+from hashlib import md5
+from urllib.error import HTTPError
+from urllib.request import urlopen, Request
 
 from distutils2 import logger
 from distutils2.errors import PackagingOptionError
@@ -87,7 +84,7 @@
     def upload_file(self, command, pyversion, filename):
         # Makes sure the repository URL is compliant
         scheme, netloc, url, params, query, fragments = \
-            urlparse.urlparse(self.repository)
+            urllib.parse.urlparse(self.repository)
         if params or query or fragments:
             raise AssertionError("Incompatible url %s" % self.repository)
 
@@ -104,11 +101,8 @@
 
         # Fill in the data - send all the metadata in case we need to
         # register a new release
-        f = open(filename, 'rb')
-        try:
+        with open(filename, 'rb') as f:
             content = f.read()
-        finally:
-            f.close()
 
         data = self.distribution.metadata.todict()
 
@@ -124,11 +118,8 @@
             data['comment'] = 'built for %s' % platform.platform(terse=True)
 
         if self.sign:
-            fp = open(filename + '.asc')
-            try:
+            with open(filename + '.asc') as fp:
                 sig = fp.read()
-            finally:
-                fp.close()
             data['gpg_signature'] = [
                 (os.path.basename(filename) + ".asc", sig)]
 
@@ -136,7 +127,7 @@
         # The exact encoding of the authentication string is debated.
         # Anyway PyPI only accepts ascii for both username or password.
         user_pass = (self.username + ":" + self.password).encode('ascii')
-        auth = "Basic " + standard_b64encode(user_pass)
+        auth = b"Basic " + standard_b64encode(user_pass)
 
         # Build up the MIME payload for the POST data
         files = []
@@ -160,10 +151,10 @@
             result = urlopen(request)
             status = result.code
             reason = result.msg
-        except socket.error, e:
+        except socket.error as e:
             logger.error(e)
             return
-        except HTTPError, e:
+        except HTTPError as e:
             status = e.code
             reason = e.msg
 
diff --git a/distutils2/command/upload_docs.py b/distutils2/command/upload_docs.py
--- a/distutils2/command/upload_docs.py
+++ b/distutils2/command/upload_docs.py
@@ -5,9 +5,9 @@
 import socket
 import zipfile
 import logging
-import httplib
-import urlparse
-from StringIO import StringIO
+import http.client
+import urllib.parse
+from io import BytesIO
 
 from distutils2 import logger
 from distutils2.util import (read_pypirc, DEFAULT_REPOSITORY, DEFAULT_REALM,
@@ -18,7 +18,7 @@
 
 def zip_dir(directory):
     """Compresses recursively contents of directory into a BytesIO object"""
-    destination = StringIO()
+    destination = BytesIO()
     zip_file = zipfile.ZipFile(destination, "w")
     try:
         for root, dirs, files in os.walk(directory):
@@ -91,16 +91,16 @@
 
         credentials = self.username + ':' + self.password
         # FIXME should use explicit encoding
-        auth = "Basic " + base64.encodestring(credentials.encode()).strip()
+        auth = b"Basic " + base64.encodebytes(credentials.encode()).strip()
 
         logger.info("Submitting documentation to %s", self.repository)
 
-        scheme, netloc, url, params, query, fragments = urlparse.urlparse(
+        scheme, netloc, url, params, query, fragments = urllib.parse.urlparse(
             self.repository)
         if scheme == "http":
-            conn = httplib.HTTPConnection(netloc)
+            conn = http.client.HTTPConnection(netloc)
         elif scheme == "https":
-            conn = httplib.HTTPSConnection(netloc)
+            conn = http.client.HTTPSConnection(netloc)
         else:
             raise AssertionError("unsupported scheme %r" % scheme)
 
@@ -113,7 +113,7 @@
             conn.endheaders()
             conn.send(body)
 
-        except socket.error, e:
+        except socket.error as e:
             logger.error(e)
             return
 
diff --git a/distutils2/compat.py b/distutils2/compat.py
--- a/distutils2/compat.py
+++ b/distutils2/compat.py
@@ -4,10 +4,7 @@
 Python 3.2, for internal use only.  Whole modules are in _backport.
 """
 
-import os
-import re
 import sys
-import codecs
 from distutils2 import logger
 
 
@@ -62,131 +59,12 @@
 # The rest of this file does not exist in packaging
 # functions are sorted alphabetically and are not included in __all__
 
-try:
-    any
-except NameError:
-    def any(seq):
-        for elem in seq:
-            if elem:
-                return True
-        return False
-
-
-_cookie_re = re.compile("coding[:=]\s*([-\w.]+)")
-
-
-def _get_normal_name(orig_enc):
-    """Imitates get_normal_name in tokenizer.c."""
-    # Only care about the first 12 characters.
-    enc = orig_enc[:12].lower().replace("_", "-")
-    if enc == "utf-8" or enc.startswith("utf-8-"):
-        return "utf-8"
-    if enc in ("latin-1", "iso-8859-1", "iso-latin-1") or \
-       enc.startswith(("latin-1-", "iso-8859-1-", "iso-latin-1-")):
-        return "iso-8859-1"
-    return orig_enc
-
-
-def detect_encoding(readline):
-    """
-    The detect_encoding() function is used to detect the encoding that should
-    be used to decode a Python source file.  It requires one argment, readline,
-    in the same way as the tokenize() generator.
-
-    It will call readline a maximum of twice, and return the encoding used
-    (as a string) and a list of any lines (left as bytes) it has read in.
-
-    It detects the encoding from the presence of a utf-8 bom or an encoding
-    cookie as specified in pep-0263.  If both a bom and a cookie are present,
-    but disagree, a SyntaxError will be raised.  If the encoding cookie is an
-    invalid charset, raise a SyntaxError.  Note that if a utf-8 bom is found,
-    'utf-8-sig' is returned.
-
-    If no encoding is specified, then the default of 'utf-8' will be returned.
-    """
-    bom_found = False
-    encoding = None
-    default = 'utf-8'
-
-    def read_or_stop():
-        try:
-            return readline()
-        except StopIteration:
-            return ''
-
-    def find_cookie(line):
-        try:
-            line_string = line.decode('ascii')
-        except UnicodeDecodeError:
-            return None
-
-        matches = _cookie_re.findall(line_string)
-        if not matches:
-            return None
-        encoding = _get_normal_name(matches[0])
-        try:
-            codec = codecs.lookup(encoding)
-        except LookupError:
-            # This behaviour mimics the Python interpreter
-            raise SyntaxError("unknown encoding: " + encoding)
-
-        if bom_found:
-            if codec.name != 'utf-8':
-                # This behaviour mimics the Python interpreter
-                raise SyntaxError('encoding problem: utf-8')
-            encoding += '-sig'
-        return encoding
-
-    first = read_or_stop()
-    if first.startswith(codecs.BOM_UTF8):
-        bom_found = True
-        first = first[3:]
-        default = 'utf-8-sig'
-    if not first:
-        return default, []
-
-    encoding = find_cookie(first)
-    if encoding:
-        return encoding, [first]
-
-    second = read_or_stop()
-    if not second:
-        return default, [first]
-
-    encoding = find_cookie(second)
-    if encoding:
-        return encoding, [first, second]
-
-    return default, [first, second]
-
 
 def fsencode(filename):
-    if isinstance(filename, str):
+    if isinstance(filename, bytes):
         return filename
-    elif isinstance(filename, unicode):
+    elif isinstance(filename, str):
         return filename.encode(sys.getfilesystemencoding())
     else:
         raise TypeError("expect bytes or str, not %s" %
                         type(filename).__name__)
-
-
-try:
-    from functools import wraps
-except ImportError:
-    def wraps(func=None):
-        """No-op replacement for functools.wraps"""
-        def wrapped(func):
-            return func
-        return wrapped
-
-try:
-    from platform import python_implementation
-except ImportError:
-    def python_implementation():
-        if 'PyPy' in sys.version:
-            return 'PyPy'
-        if os.name == 'java':
-            return 'Jython'
-        if sys.version.startswith('IronPython'):
-            return 'IronPython'
-        return 'CPython'
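The `fsencode` hunk above swaps the `isinstance` checks for Python 3, where `str` is text and `bytes` is the raw type: bytes now pass through unchanged and text is encoded with the filesystem encoding. A standalone sketch of the fixed behavior (the function body is copied from the patch; the example calls are illustrative):

```python
import sys


def fsencode(filename):
    # Mirrors the patched distutils2.compat.fsencode for Python 3:
    # bytes pass through unchanged, str is encoded with the
    # filesystem encoding, anything else is rejected.
    if isinstance(filename, bytes):
        return filename
    elif isinstance(filename, str):
        return filename.encode(sys.getfilesystemencoding())
    else:
        raise TypeError("expect bytes or str, not %s" %
                        type(filename).__name__)


print(fsencode("setup.cfg"))   # b'setup.cfg' (ASCII is stable across encodings)
print(fsencode(b"setup.cfg"))  # bytes are returned as-is
```

Note that under Python 2, where `str` was the byte type, the original check order was the correct one; the swap is exactly what the port requires.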
diff --git a/distutils2/compiler/__init__.py b/distutils2/compiler/__init__.py
--- a/distutils2/compiler/__init__.py
+++ b/distutils2/compiler/__init__.py
@@ -142,7 +142,7 @@
     compilers = []
 
     for name, cls in _COMPILERS.items():
-        if isinstance(cls, basestring):
+        if isinstance(cls, str):
             cls = resolve_name(cls)
             _COMPILERS[name] = cls
 
@@ -179,7 +179,7 @@
             msg = msg + " with '%s' compiler" % compiler
         raise PackagingPlatformError(msg)
 
-    if isinstance(cls, basestring):
+    if isinstance(cls, str):
         cls = resolve_name(cls)
         _COMPILERS[compiler] = cls
 
diff --git a/distutils2/compiler/bcppcompiler.py b/distutils2/compiler/bcppcompiler.py
--- a/distutils2/compiler/bcppcompiler.py
+++ b/distutils2/compiler/bcppcompiler.py
@@ -104,7 +104,7 @@
                 # This needs to be compiled to a .res file -- do it now.
                 try:
                     self.spawn(["brcc32", "-fo", obj, src])
-                except PackagingExecError, msg:
+                except PackagingExecError as msg:
                     raise CompileError(msg)
                 continue # the 'for' loop
 
@@ -128,7 +128,7 @@
                 self.spawn([self.cc] + compile_opts + pp_opts +
                            [input_opt, output_opt] +
                            extra_postargs + [src])
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise CompileError(msg)
 
         return objects
@@ -146,7 +146,7 @@
                 pass                    # XXX what goes here?
             try:
                 self.spawn([self.lib] + lib_args)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise LibError(msg)
         else:
             logger.debug("skipping %s (up-to-date)", output_filename)
@@ -268,7 +268,7 @@
             self.mkpath(os.path.dirname(output_filename))
             try:
                 self.spawn([self.linker] + ld_args)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise LinkError(msg)
 
         else:
@@ -351,5 +351,5 @@
                 self.mkpath(os.path.dirname(output_file))
             try:
                 self.spawn(pp_args)
-            except PackagingExecError, exc:
-                raise CompileError(exc)
+            except PackagingExecError as msg:
+                raise CompileError(msg)
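The repeated `except PackagingExecError, msg:` → `except PackagingExecError as msg:` changes in this file (and throughout the branch) are the mechanical 2to3 rewrite of the Python 2-only exception syntax, which is a `SyntaxError` on Python 3. A minimal sketch of the pattern, with hypothetical stand-in exception classes (the real ones live in `distutils2.errors`):

```python
# Stand-ins for distutils2.errors.PackagingExecError / CompileError,
# used here only to illustrate the syntax change.
class PackagingExecError(Exception):
    pass


class CompileError(Exception):
    pass


def spawn_or_raise(fail):
    # The `except E as name` spelling works on both 2.6+ and 3.x;
    # the old `except E, name` form is rejected by Python 3.
    try:
        if fail:
            raise PackagingExecError("command failed")
    except PackagingExecError as msg:
        raise CompileError(msg)
```

This is why the conversion touches so many compiler modules: every `spawn` call site wraps `PackagingExecError` into a more specific error the same way.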
diff --git a/distutils2/compiler/ccompiler.py b/distutils2/compiler/ccompiler.py
--- a/distutils2/compiler/ccompiler.py
+++ b/distutils2/compiler/ccompiler.py
@@ -12,7 +12,7 @@
 from distutils2.compiler import gen_preprocess_options
 
 
-class CCompiler(object):
+class CCompiler:
     """Abstract base class to define the interface that must be implemented
     by real compiler classes.  Also has some utility methods used by
     several compiler classes.
@@ -148,7 +148,7 @@
             self.set_executable(key, value)
 
     def set_executable(self, key, value):
-        if isinstance(value, basestring):
+        if isinstance(value, str):
             setattr(self, key, split_quoted(value))
         else:
             setattr(self, key, value)
@@ -170,8 +170,8 @@
             if not (isinstance(defn, tuple) and
                     (len(defn) == 1 or
                      (len(defn) == 2 and
-                      (isinstance(defn[1], basestring) or defn[1] is None))) and
-                    isinstance(defn[0], basestring)):
+                      (isinstance(defn[1], str) or defn[1] is None))) and
+                    isinstance(defn[0], str)):
                 raise TypeError(("invalid macro definition '%s': " % defn) + \
                       "must be tuple (string,), (string, string), or " + \
                       "(string, None)")
@@ -311,7 +311,7 @@
         """Process arguments and decide which source files to compile."""
         if outdir is None:
             outdir = self.output_dir
-        elif not isinstance(outdir, basestring):
+        elif not isinstance(outdir, str):
             raise TypeError("'output_dir' must be a string or None")
 
         if macros is None:
@@ -371,7 +371,7 @@
         """
         if output_dir is None:
             output_dir = self.output_dir
-        elif not isinstance(output_dir, basestring):
+        elif not isinstance(output_dir, str):
             raise TypeError("'output_dir' must be a string or None")
 
         if macros is None:
@@ -403,7 +403,7 @@
 
         if output_dir is None:
             output_dir = self.output_dir
-        elif not isinstance(output_dir, basestring):
+        elif not isinstance(output_dir, str):
             raise TypeError("'output_dir' must be a string or None")
 
         return objects, output_dir
@@ -727,8 +727,7 @@
         if library_dirs is None:
             library_dirs = []
         fd, fname = tempfile.mkstemp(".c", funcname, text=True)
-        f = os.fdopen(fd, "w")
-        try:
+        with os.fdopen(fd, "w") as f:
             for incl in includes:
                 f.write("""#include "%s"\n""" % incl)
             f.write("""\
@@ -736,8 +735,6 @@
     %s();
 }
 """ % funcname)
-        finally:
-            f.close()
         try:
             objects = self.compile([fname], include_dirs=include_dirs)
         except CompileError:
@@ -854,7 +851,7 @@
             return
         return move(src, dst)
 
-    def mkpath(self, name, mode=0777):
+    def mkpath(self, name, mode=0o777):
         name = os.path.normpath(name)
         if os.path.isdir(name) or name == '':
             return
diff --git a/distutils2/compiler/cygwinccompiler.py b/distutils2/compiler/cygwinccompiler.py
--- a/distutils2/compiler/cygwinccompiler.py
+++ b/distutils2/compiler/cygwinccompiler.py
@@ -156,13 +156,13 @@
             # gcc needs '.res' and '.rc' compiled to object files !!!
             try:
                 self.spawn(["windres", "-i", src, "-o", obj])
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise CompileError(msg)
         else: # for other files use the C-compiler
             try:
                 self.spawn(self.compiler_so + cc_args + [src, '-o', obj] +
                            extra_postargs)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise CompileError(msg)
 
     def link(self, target_desc, objects, output_filename, output_dir=None,
@@ -344,14 +344,11 @@
     # let's see if __GNUC__ is mentioned in python.h
     fn = sysconfig.get_config_h_filename()
     try:
-        config_h = open(fn)
-        try:
+        with open(fn) as config_h:
             if "__GNUC__" in config_h.read():
                 return CONFIG_H_OK, "'%s' mentions '__GNUC__'" % fn
             else:
                 return CONFIG_H_NOTOK, "'%s' does not mention '__GNUC__'" % fn
-        finally:
-            config_h.close()
-    except IOError, exc:
+    except IOError as exc:
         return (CONFIG_H_UNCERTAIN,
                 "couldn't read '%s': %s" % (fn, exc.strerror))
diff --git a/distutils2/compiler/extension.py b/distutils2/compiler/extension.py
--- a/distutils2/compiler/extension.py
+++ b/distutils2/compiler/extension.py
@@ -13,7 +13,7 @@
 # order to do anything.
 
 
-class Extension(object):
+class Extension:
     """Just a collection of attributes that describes an extension
     module and everything needed to build it (hopefully in a portable
     way, but there are hooks that let you be as unportable as you need).
@@ -86,14 +86,14 @@
                  extra_compile_args=None, extra_link_args=None,
                  export_symbols=None, swig_opts=None, depends=None,
                  language=None, optional=None, **kw):
-        if not isinstance(name, basestring):
+        if not isinstance(name, str):
             raise AssertionError("'name' must be a string")
 
         if not isinstance(sources, list):
             raise AssertionError("'sources' must be a list of strings")
 
         for v in sources:
-            if not isinstance(v, basestring):
+            if not isinstance(v, str):
                 raise AssertionError("'sources' must be a list of strings")
 
         self.name = name
diff --git a/distutils2/compiler/msvc9compiler.py b/distutils2/compiler/msvc9compiler.py
--- a/distutils2/compiler/msvc9compiler.py
+++ b/distutils2/compiler/msvc9compiler.py
@@ -46,7 +46,7 @@
 }
 
 
-class Reg(object):
+class Reg:
     """Helper class to read values from the registry
     """
 
@@ -108,7 +108,7 @@
         return s
     convert_mbcs = staticmethod(convert_mbcs)
 
-class MacroExpander(object):
+class MacroExpander:
 
     def __init__(self, version):
         self.macros = {}
@@ -477,7 +477,7 @@
                 try:
                     self.spawn([self.rc] + pp_opts +
                                [output_opt] + [input_opt])
-                except PackagingExecError, msg:
+                except PackagingExecError as msg:
                     raise CompileError(msg)
                 continue
             elif ext in self._mc_extensions:
@@ -504,7 +504,7 @@
                     self.spawn([self.rc] +
                                ["/fo" + obj] + [rc_file])
 
-                except PackagingExecError, msg:
+                except PackagingExecError as msg:
                     raise CompileError(msg)
                 continue
             else:
@@ -517,7 +517,7 @@
                 self.spawn([self.cc] + compile_opts + pp_opts +
                            [input_opt, output_opt] +
                            extra_postargs)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise CompileError(msg)
 
         return objects
@@ -542,7 +542,7 @@
                 pass # XXX what goes here?
             try:
                 self.spawn([self.lib] + lib_args)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise LibError(msg)
         else:
             logger.debug("skipping %s (up-to-date)", output_filename)
@@ -620,7 +620,7 @@
             self.mkpath(os.path.dirname(output_filename))
             try:
                 self.spawn([self.linker] + ld_args)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise LinkError(msg)
 
             # embed the manifest
@@ -637,7 +637,7 @@
             try:
                 self.spawn(['mt.exe', '-nologo', '-manifest',
                             temp_manifest, out_arg])
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise LinkError(msg)
         else:
             logger.debug("skipping %s (up-to-date)", output_filename)
diff --git a/distutils2/compiler/msvccompiler.py b/distutils2/compiler/msvccompiler.py
--- a/distutils2/compiler/msvccompiler.py
+++ b/distutils2/compiler/msvccompiler.py
@@ -105,7 +105,7 @@
     return s
 
 
-class MacroExpander(object):
+class MacroExpander:
 
     def __init__(self, version):
         self.macros = {}
@@ -386,7 +386,7 @@
                 try:
                     self.spawn([self.rc] + pp_opts +
                                [output_opt] + [input_opt])
-                except PackagingExecError, msg:
+                except PackagingExecError as msg:
                     raise CompileError(msg)
                 continue
             elif ext in self._mc_extensions:
@@ -415,7 +415,7 @@
                     self.spawn([self.rc] +
                                 ["/fo" + obj] + [rc_file])
 
-                except PackagingExecError, msg:
+                except PackagingExecError as msg:
                     raise CompileError(msg)
                 continue
             else:
@@ -429,7 +429,7 @@
                 self.spawn([self.cc] + compile_opts + pp_opts +
                            [input_opt, output_opt] +
                            extra_postargs)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise CompileError(msg)
 
         return objects
@@ -448,7 +448,7 @@
                 pass                    # XXX what goes here?
             try:
                 self.spawn([self.lib] + lib_args)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise LibError(msg)
 
         else:
@@ -515,7 +515,7 @@
             self.mkpath(os.path.dirname(output_filename))
             try:
                 self.spawn([self.linker] + ld_args)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise LinkError(msg)
 
         else:
diff --git a/distutils2/compiler/unixccompiler.py b/distutils2/compiler/unixccompiler.py
--- a/distutils2/compiler/unixccompiler.py
+++ b/distutils2/compiler/unixccompiler.py
@@ -165,7 +165,7 @@
                 self.mkpath(os.path.dirname(output_file))
             try:
                 self.spawn(pp_args)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise CompileError(msg)
 
     def _compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts):
@@ -175,7 +175,7 @@
         try:
             self.spawn(compiler_so + cc_args + [src, '-o', obj] +
                        extra_postargs)
-        except PackagingExecError, msg:
+        except PackagingExecError as msg:
             raise CompileError(msg)
 
     def create_static_lib(self, objects, output_libname,
@@ -199,7 +199,7 @@
             if self.ranlib:
                 try:
                     self.spawn(self.ranlib + [output_filename])
-                except PackagingExecError, msg:
+                except PackagingExecError as msg:
                     raise LibError(msg)
         else:
             logger.debug("skipping %s (up-to-date)", output_filename)
@@ -253,7 +253,7 @@
                     linker = _darwin_compiler_fixup(linker, ld_args)
 
                 self.spawn(linker + ld_args)
-            except PackagingExecError, msg:
+            except PackagingExecError as msg:
                 raise LinkError(msg)
         else:
             logger.debug("skipping %s (up-to-date)", output_filename)
diff --git a/distutils2/config.py b/distutils2/config.py
--- a/distutils2/config.py
+++ b/distutils2/config.py
@@ -2,11 +2,10 @@
 
 import os
 import sys
-import codecs
 import logging
 
 from shlex import split
-from ConfigParser import RawConfigParser
+from configparser import RawConfigParser
 from distutils2 import logger
 from distutils2.errors import PackagingOptionError
 from distutils2.compiler.extension import Extension
@@ -74,7 +73,7 @@
     return destinations
 
 
-class Config(object):
+class Config:
     """Class used to work with configuration files"""
     def __init__(self, dist):
         self.dist = dist
@@ -155,7 +154,7 @@
                     for line in setup_hooks:
                         try:
                             hook = resolve_name(line)
-                        except ImportError, e:
+                        except ImportError as e:
                             logger.warning('cannot find setup hook: %s',
                                            e.args[0])
                         else:
@@ -190,11 +189,8 @@
                     value = []
                     for filename in filenames:
                         # will raise if file not found
-                        description_file = open(filename)
-                        try:
+                        with open(filename) as description_file:
                             value.append(description_file.read().strip())
-                        finally:
-                            description_file.close()
                         # add filename as a required file
                         if filename not in metadata.requires_files:
                             metadata.requires_files.append(filename)
@@ -214,7 +210,7 @@
             self.dist.packages = []
 
             packages = files.get('packages', [])
-            if isinstance(packages, basestring):
+            if isinstance(packages, str):
                 packages = [packages]
 
             for package in packages:
@@ -224,10 +220,10 @@
                 self.dist.packages.append(package)
 
             self.dist.py_modules = files.get('modules', [])
-            if isinstance(self.dist.py_modules, basestring):
+            if isinstance(self.dist.py_modules, str):
                 self.dist.py_modules = [self.dist.py_modules]
             self.dist.scripts = files.get('scripts', [])
-            if isinstance(self.dist.scripts, basestring):
+            if isinstance(self.dist.scripts, str):
                 self.dist.scripts = [self.dist.scripts]
 
             self.dist.package_data = {}
@@ -310,11 +306,8 @@
 
         for filename in filenames:
             logger.debug("  reading %s", filename)
-            f = codecs.open(filename, 'r', encoding='utf-8')
-            try:
+            with open(filename, 'r', encoding='utf-8') as f:
                 parser.readfp(f)
-            finally:
-                f.close()
 
             if os.path.split(filename)[-1] == 'setup.cfg':
                 self._read_setup_cfg(parser, filename)
@@ -338,7 +331,7 @@
 
                     if opt == 'sub_commands':
                         val = split_multiline(val)
-                        if isinstance(val, basestring):
+                        if isinstance(val, str):
                             val = [val]
 
                     # Hooks use a suffix system to prevent being overridden
@@ -371,19 +364,19 @@
                         setattr(self.dist, opt, strtobool(val))
                     else:
                         setattr(self.dist, opt, val)
-                except ValueError, msg:
+                except ValueError as msg:
                     raise PackagingOptionError(msg)
 
     def _load_compilers(self, compilers):
         compilers = split_multiline(compilers)
-        if isinstance(compilers, basestring):
+        if isinstance(compilers, str):
             compilers = [compilers]
         for compiler in compilers:
             set_compiler(compiler.strip())
 
     def _load_commands(self, commands):
         commands = split_multiline(commands)
-        if isinstance(commands, basestring):
+        if isinstance(commands, str):
             commands = [commands]
         for command in commands:
             set_command(command.strip())
diff --git a/distutils2/create.py b/distutils2/create.py
--- a/distutils2/create.py
+++ b/distutils2/create.py
@@ -23,25 +23,17 @@
 import imp
 import sys
 import glob
-import codecs
 import shutil
+from hashlib import md5
 from textwrap import dedent
-from ConfigParser import RawConfigParser
+from tokenize import detect_encoding
+from configparser import RawConfigParser
 
 # importing this with an underscore as it should be replaced by the
 # dict form or another structures for all purposes
 from distutils2._trove import all_classifiers as _CLASSIFIERS_LIST
-from distutils2.compat import detect_encoding
 from distutils2.version import is_valid_version
 from distutils2._backport import sysconfig
-try:
-    any
-except NameError:
-    from distutils2.compat import any
-try:
-    from hashlib import md5
-except ImportError:
-    from distutils2._backport.hashlib import md5
 
 
 _FILENAME = 'setup.cfg'
@@ -120,26 +112,20 @@
     This function loads the setup file in all cases (even if it has already
     been loaded before), because we are monkey-patching its setup function
     with a particular one."""
-    f = open("setup.py", "rb")
-    try:
+    with open("setup.py", "rb") as f:
         encoding, lines = detect_encoding(f.readline)
-    finally:
-        f.close()
-    f = open("setup.py")
-    try:
+    with open("setup.py", encoding=encoding) as f:
         imp.load_module("setup", f, "setup.py", (".py", "r", imp.PY_SOURCE))
-    finally:
-        f.close()
 
 
 def ask_yn(question, default=None, helptext=None):
     question += ' (y/n)'
     while True:
         answer = ask(question, default, helptext, required=True)
-        if answer and answer[0].lower() in 'yn':
+        if answer and answer[0].lower() in ('y', 'n'):
             return answer[0].lower()
 
-        print '\nERROR: You must select "Y" or "N".\n'
+        print('\nERROR: You must select "Y" or "N".\n')
 
 
 # XXX use util.ask
@@ -148,11 +134,11 @@
 
 def ask(question, default=None, helptext=None, required=True,
         lengthy=False, multiline=False):
-    prompt = u'%s: ' % (question,)
+    prompt = '%s: ' % (question,)
     if default:
-        prompt = u'%s [%s]: ' % (question, default)
+        prompt = '%s [%s]: ' % (question, default)
         if default and len(question) + len(default) > 70:
-            prompt = u'%s\n    [%s]: ' % (question, default)
+            prompt = '%s\n    [%s]: ' % (question, default)
     if lengthy or multiline:
         prompt += '\n   > '
 
@@ -167,19 +153,19 @@
 
         line = sys.stdin.readline().strip()
         if line == '?':
-            print '=' * 70
-            print helptext
-            print '=' * 70
+            print('=' * 70)
+            print(helptext)
+            print('=' * 70)
             continue
         if default and not line:
             return default
         if not line and required:
-            print '*' * 70
-            print 'This value cannot be empty.'
-            print '==========================='
+            print('*' * 70)
+            print('This value cannot be empty.')
+            print('===========================')
             if helptext:
-                print helptext
-            print '*' * 70
+                print(helptext)
+            print('*' * 70)
             continue
         return line
 
@@ -216,7 +202,7 @@
 LICENCES = _build_licences(_CLASSIFIERS_LIST)
 
 
-class MainProgram(object):
+class MainProgram:
     """Make a project setup configuration file (setup.cfg)."""
 
     def __init__(self):
@@ -286,34 +272,32 @@
     def _write_cfg(self):
         if os.path.exists(_FILENAME):
             if os.path.exists('%s.old' % _FILENAME):
-                print ('ERROR: %(name)s.old backup exists, please check that '
-                       'current %(name)s is correct and remove %(name)s.old' %
-                       {'name': _FILENAME})
+                print("ERROR: %(name)s.old backup exists, please check that "
+                      "current %(name)s is correct and remove %(name)s.old" %
+                      {'name': _FILENAME})
                 return
             shutil.move(_FILENAME, '%s.old' % _FILENAME)
 
-        fp = codecs.open(_FILENAME, 'w', encoding='utf-8')
-        try:
-            fp.write(u'[metadata]\n')
+        with open(_FILENAME, 'w', encoding='utf-8') as fp:
+            fp.write('[metadata]\n')
             # TODO use metadata module instead of hard-coding field-specific
             # behavior here
 
             # simple string entries
             for name in ('name', 'version', 'summary', 'download_url'):
-                fp.write(u'%s = %s\n' % (name, self.data.get(name, 'UNKNOWN')))
+                fp.write('%s = %s\n' % (name, self.data.get(name, 'UNKNOWN')))
 
             # optional string entries
             if 'keywords' in self.data and self.data['keywords']:
-                fp.write(u'keywords = %s\n' % ' '.join(self.data['keywords']))
+                fp.write('keywords = %s\n' % ' '.join(self.data['keywords']))
             for name in ('home_page', 'author', 'author_email',
                          'maintainer', 'maintainer_email', 'description-file'):
                 if name in self.data and self.data[name]:
-                    fp.write(u'%s = %s\n' % (name.decode('utf-8'),
-                                             self.data[name].decode('utf-8')))
+                    fp.write('%s = %s\n' % (name, self.data[name]))
             if 'description' in self.data:
                 fp.write(
-                    u'description = %s\n'
-                    % u'\n       |'.join(self.data['description'].split('\n')))
+                    'description = %s\n'
+                    % '\n       |'.join(self.data['description'].split('\n')))
 
             # multiple use string entries
             for name in ('platform', 'supported-platform', 'classifier',
@@ -321,25 +305,23 @@
                          'requires-external'):
                 if not(name in self.data and self.data[name]):
                     continue
-                fp.write(u'%s = ' % name)
-                fp.write(u''.join('    %s\n' % val
+                fp.write('%s = ' % name)
+                fp.write(''.join('    %s\n' % val
                                  for val in self.data[name]).lstrip())
-            fp.write(u'\n[files]\n')
+            fp.write('\n[files]\n')
             for name in ('packages', 'modules', 'scripts',
                          'package_data', 'extra_files'):
                 if not(name in self.data and self.data[name]):
                     continue
-                fp.write(u'%s = %s\n'
-                         % (name, u'\n    '.join(self.data[name]).strip()))
-            fp.write(u'\nresources =\n')
+                fp.write('%s = %s\n'
+                         % (name, '\n    '.join(self.data[name]).strip()))
+            fp.write('\nresources =\n')
             for src, dest in self.data['resources']:
-                fp.write(u'    %s = %s\n' % (src, dest))
-            fp.write(u'\n')
-        finally:
-            fp.close()
+                fp.write('    %s = %s\n' % (src, dest))
+            fp.write('\n')
 
-        os.chmod(_FILENAME, 0644)
-        print 'Wrote %r.' % _FILENAME
+        os.chmod(_FILENAME, 0o644)
+        print('Wrote "%s".' % _FILENAME)
 
     def convert_py_to_cfg(self):
         """Generate a setup.cfg from an existing setup.py.
@@ -368,9 +350,8 @@
                       ('description', 'summary'),
                       ('long_description', 'description'),
                       ('url', 'home_page'),
-                      ('platforms', 'platform'))
-            if sys.version >= '2.5':
-                labels += (
+                      ('platforms', 'platform'),
+                      # backport only for 2.5+
                       ('provides', 'provides-dist'),
                       ('obsoletes', 'obsoletes-dist'),
                       ('requires', 'requires-dist'))
@@ -386,7 +367,7 @@
             # 2.1 data_files -> resources
             if dist.data_files:
                 if (len(dist.data_files) < 2 or
-                    isinstance(dist.data_files[1], basestring)):
+                    isinstance(dist.data_files[1], str)):
                     dist.data_files = [('', dist.data_files)]
                 # add tokens in the destination paths
                 vars = {'distribution.name': data['name']}
@@ -421,11 +402,8 @@
                                  self.data['description']).lower().encode())
                 ref = ref.digest()
                 for readme in glob.glob('README*'):
-                    fp = codecs.open(readme, encoding='utf-8')
-                    try:
+                    with open(readme, encoding='utf-8') as fp:
                         contents = fp.read()
-                    finally:
-                        fp.close()
                     contents = re.sub('\s', '', contents.lower()).encode()
                     val = md5(contents).digest()
                     if val == ref:
@@ -637,8 +615,8 @@
                         break
 
             if len(found_list) == 0:
-                print ('ERROR: Could not find a matching license for "%s"' %
-                       license)
+                print('ERROR: Could not find a matching license for "%s"' %
+                      license)
                 continue
 
             question = 'Matching licenses:\n\n'
@@ -659,8 +637,8 @@
             try:
                 index = found_list[int(choice) - 1]
             except ValueError:
-                print ('ERROR: Invalid selection, type a number from the list '
-                       'above.')
+                print("ERROR: Invalid selection, type a number from the list "
+                      "above.")
 
             classifiers.add(_CLASSIFIERS_LIST[index])
 
@@ -683,8 +661,8 @@
                     classifiers.add(key)
                     return
                 except (IndexError, ValueError):
-                    print ('ERROR: Invalid selection, type a single digit '
-                           'number.')
+                    print("ERROR: Invalid selection, type a single digit "
+                          "number.")
 
 
 def main():
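Among the `create.py` changes above, the local `detect_encoding` backport is dropped in favor of `from tokenize import detect_encoding`, which exists in the stdlib on all targeted 3.x versions. A small sketch of how the stdlib function behaves on a PEP 263 coding cookie (the sample source bytes are illustrative):

```python
import io
from tokenize import detect_encoding

# detect_encoding takes a readline callable over bytes and returns the
# normalized encoding plus the raw lines it had to consume to find it.
source = b"# -*- coding: latin-1 -*-\nx = 1\n"
encoding, consumed = detect_encoding(io.BytesIO(source).readline)
print(encoding)  # 'iso-8859-1' (latin-1 is normalized, cf. _get_normal_name)
print(consumed)  # [b'# -*- coding: latin-1 -*-\n']
```

This matches how the patched `load_setup` uses it: first open `setup.py` in binary mode to sniff the encoding, then reopen it as text with that encoding for `imp.load_module`.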
diff --git a/distutils2/database.py b/distutils2/database.py
--- a/distutils2/database.py
+++ b/distutils2/database.py
@@ -5,11 +5,8 @@
 import csv
 import sys
 import zipimport
-from StringIO import StringIO
-try:
-    from hashlib import md5
-except ImportError:
-    from distutils2._backport.hashlib import md5
+from io import StringIO
+from hashlib import md5
 
 from distutils2 import logger
 from distutils2.errors import PackagingError
@@ -122,7 +119,7 @@
             _cache_generated_egg = True
 
 
-class Distribution(object):
+class Distribution:
     """Created with the *path* of the ``.dist-info`` directory provided to the
     constructor. It reads the metadata contained in ``METADATA`` when it is
     instantiated."""
@@ -162,8 +159,7 @@
 
     def _get_records(self, local=False):
         results = []
-        record = self.get_distinfo_file('RECORD')
-        try:
+        with self.get_distinfo_file('RECORD') as record:
             record_reader = csv.reader(record, delimiter=',',
                                        lineterminator='\n')
             for row in record_reader:
@@ -173,20 +169,15 @@
                     path = path.replace('/', os.sep)
                     path = os.path.join(sys.prefix, path)
                 results.append((path, checksum, size))
-        finally:
-            record.close()
         return results
 
     def get_resource_path(self, relative_path):
-        resources_file = self.get_distinfo_file('RESOURCES')
-        try:
+        with self.get_distinfo_file('RESOURCES') as resources_file:
             resources_reader = csv.reader(resources_file, delimiter=',',
                                           lineterminator='\n')
             for relative, destination in resources_reader:
                 if relative == relative_path:
                     return destination
-        finally:
-            resources_file.close()
         raise KeyError(
             'no resource file with relative path %r is installed' %
             relative_path)
@@ -284,7 +275,7 @@
     __hash__ = object.__hash__
 
 
-class EggInfoDistribution(object):
+class EggInfoDistribution:
     """Created with the *path* of the ``.egg-info`` directory or file provided
     to the constructor. It reads the metadata contained in the file itself, or
     if the given path happens to be a directory, the metadata is read from the
@@ -318,7 +309,7 @@
         def yield_lines(strs):
             """Yield non-empty/non-comment lines of a ``basestring``
             or sequence"""
-            if isinstance(strs, basestring):
+            if isinstance(strs, str):
                 for s in strs.splitlines():
                     s = s.strip()
                     # skip blank lines/comments
@@ -337,11 +328,8 @@
                 self.metadata = Metadata(path=meta_path)
                 try:
                     req_path = os.path.join(path, 'EGG-INFO', 'requires.txt')
-                    fp = open(req_path, 'r')
-                    try:
+                    with open(req_path, 'r') as fp:
                         requires = fp.read()
-                    finally:
-                        fp.close()
                 except IOError:
                     requires = None
             else:
@@ -361,11 +349,8 @@
             if os.path.isdir(path):
                 path = os.path.join(path, 'PKG-INFO')
                 try:
-                    fp = open(os.path.join(path, 'requires.txt'), 'r')
-                    try:
+                    with open(os.path.join(path, 'requires.txt'), 'r') as fp:
                         requires = fp.read()
-                    finally:
-                        fp.close()
                 except IOError:
                     requires = None
             self.metadata = Metadata(path=path)
@@ -427,11 +412,8 @@
     def list_installed_files(self, local=False):
 
         def _md5(path):
-            f = open(path, 'rb')
-            try:
+            with open(path, 'rb') as f:
                 content = f.read()
-            finally:
-                f.close()
             return md5(content).hexdigest()
 
         def _size(path):
diff --git a/distutils2/depgraph.py b/distutils2/depgraph.py
--- a/distutils2/depgraph.py
+++ b/distutils2/depgraph.py
@@ -7,7 +7,7 @@
 
 import sys
 
-from StringIO import StringIO
+from io import StringIO
 from distutils2.errors import PackagingError
 from distutils2.version import VersionPredicate, IrrationalVersionError
 
@@ -107,26 +107,26 @@
     """
     disconnected = []
 
-    f.write(u"digraph dependencies {\n")
+    f.write("digraph dependencies {\n")
     for dist, adjs in graph.adjacency_list.items():
         if len(adjs) == 0 and not skip_disconnected:
             disconnected.append(dist)
         for other, label in adjs:
             if not label is None:
-                f.write(u'"%s" -> "%s" [label="%s"]\n' %
+                f.write('"%s" -> "%s" [label="%s"]\n' %
                                             (dist.name, other.name, label))
             else:
-                f.write(u'"%s" -> "%s"\n' % (dist.name, other.name))
+                f.write('"%s" -> "%s"\n' % (dist.name, other.name))
     if not skip_disconnected and len(disconnected) > 0:
-        f.write(u'subgraph disconnected {\n')
-        f.write(u'label = "Disconnected"\n')
-        f.write(u'bgcolor = red\n')
+        f.write('subgraph disconnected {\n')
+        f.write('label = "Disconnected"\n')
+        f.write('bgcolor = red\n')
 
         for dist in disconnected:
-            f.write(u'"%s"' % dist.name)
-            f.write(u'\n')
-        f.write(u'}\n')
-    f.write(u'}\n')
+            f.write('"%s"' % dist.name)
+            f.write('\n')
+        f.write('}\n')
+    f.write('}\n')
 
 
 def generate_graph(dists):
@@ -234,22 +234,22 @@
             graph = generate_graph(dists)
         finally:
             sys.stderr = old
-    except Exception, e:
+    except Exception as e:
         tempout.seek(0)
         tempout = tempout.read()
-        print 'Could not generate the graph'
-        print tempout
-        print e
+        print('Could not generate the graph')
+        print(tempout)
+        print(e)
         sys.exit(1)
 
     for dist, reqs in graph.missing.items():
         if len(reqs) > 0:
-            print 'Warning: Missing dependencies for %r:' % dist.name, \
-                  ', '.join(reqs)
+            print("Warning: Missing dependencies for %r:" % dist.name,
+                  ", ".join(reqs))
     # XXX replace with argparse
     if len(sys.argv) == 1:
-        print 'Dependency graph:'
-        print '   ', repr(graph).replace('\n', '\n    ')
+        print('Dependency graph:')
+        print('   ', repr(graph).replace('\n', '\n    '))
         sys.exit(0)
     elif len(sys.argv) > 1 and sys.argv[1] in ('-d', '--dot'):
         if len(sys.argv) > 2:
@@ -257,18 +257,15 @@
         else:
             filename = 'depgraph.dot'
 
-        f = open(filename, 'w')
-        try:
+        with open(filename, 'w') as f:
             graph_to_dot(graph, f, True)
-        finally:
-            f.close()
         tempout.seek(0)
         tempout = tempout.read()
-        print tempout
-        print 'Dot file written at %r' % filename
+        print(tempout)
+        print('Dot file written at %r' % filename)
         sys.exit(0)
     else:
-        print 'Supported option: -d [filename]'
+        print('Supported option: -d [filename]')
         sys.exit(1)
 
 
diff --git a/distutils2/dist.py b/distutils2/dist.py
--- a/distutils2/dist.py
+++ b/distutils2/dist.py
@@ -32,7 +32,7 @@
     return USAGE % {'script': script}
 
 
-class Distribution(object):
+class Distribution:
     """Class used to represent a project and work with it.
 
     Most of the work hiding behind 'pysetup run' is really done within a
@@ -355,7 +355,7 @@
         # it takes.
         try:
             cmd_class = get_command_class(command)
-        except PackagingModuleError, msg:
+        except PackagingModuleError as msg:
             raise PackagingArgError(msg)
 
         # XXX We want to push this in distutils2.command
@@ -460,14 +460,14 @@
                 options = self.global_options
             parser.set_option_table(options)
             parser.print_help(self.common_usage + "\nGlobal options:")
-            print
+            print()
 
         if display_options:
             parser.set_option_table(self.display_options)
             parser.print_help(
                 "Information display options (just display " +
                 "information, ignore any commands)")
-            print
+            print()
 
         for command in self.commands:
             if isinstance(command, type) and issubclass(command, Command):
@@ -480,9 +480,9 @@
             else:
                 parser.set_option_table(cls.user_options)
             parser.print_help("Options for %r command:" % cls.__name__)
-            print
+            print()
 
-        print gen_usage(self.script_name)
+        print(gen_usage(self.script_name))
 
     def handle_display_options(self, option_order):
         """If there were any non-global "display-only" options
@@ -494,8 +494,8 @@
         # we ignore "foo bar").
         if self.help_commands:
             self.print_commands()
-            print
-            print gen_usage(self.script_name)
+            print()
+            print(gen_usage(self.script_name))
             return True
 
         # If user supplied any of the "display metadata" options, then
@@ -511,12 +511,12 @@
                 opt = opt.replace('-', '_')
                 value = self.metadata[opt]
                 if opt in ('keywords', 'platform'):
-                    print ','.join(value)
+                    print(','.join(value))
                 elif opt in ('classifier', 'provides', 'requires',
                              'obsoletes'):
-                    print '\n'.join(value)
+                    print('\n'.join(value))
                 else:
-                    print value
+                    print(value)
                 any_display_options = True
 
         return any_display_options
@@ -525,14 +525,14 @@
         """Print a subset of the list of all commands -- used by
         'print_commands()'.
         """
-        print header + ":"
+        print(header + ":")
 
         for cmd in commands:
             cls = self.cmdclass.get(cmd) or get_command_class(cmd)
             description = getattr(cls, 'description',
                                   '(no description available)')
 
-            print "  %-*s  %s" % (max_length, cmd, description)
+            print("  %-*s  %s" % (max_length, cmd, description))
 
     def _get_command_groups(self):
         """Helper function to retrieve all the command class names divided
@@ -562,7 +562,7 @@
                                 "Standard commands",
                                 max_length)
         if extra_commands:
-            print
+            print()
             self.print_command_list(extra_commands,
                                     "Extra commands",
                                     max_length)
@@ -622,7 +622,7 @@
                 neg_opt = {}
 
             try:
-                is_string = isinstance(value, basestring)
+                is_string = isinstance(value, str)
                 if option in neg_opt and is_string:
                     setattr(command_obj, neg_opt[option], not strtobool(value))
                 elif option in bool_opts and is_string:
@@ -633,7 +633,7 @@
                     raise PackagingOptionError(
                         "error in %s: command %r has no such option %r" %
                         (source, command_name, option))
-            except ValueError, msg:
+            except ValueError as msg:
                 raise PackagingOptionError(msg)
 
     def get_reinitialized_command(self, command, reinit_subcommands=False):
@@ -725,10 +725,10 @@
             return
 
         for hook in hooks.values():
-            if isinstance(hook, basestring):
+            if isinstance(hook, str):
                 try:
                     hook_obj = resolve_name(hook)
-                except ImportError, e:
+                except ImportError as e:
                     raise PackagingModuleError(e)
             else:
                 hook_obj = hook
diff --git a/distutils2/fancy_getopt.py b/distutils2/fancy_getopt.py
--- a/distutils2/fancy_getopt.py
+++ b/distutils2/fancy_getopt.py
@@ -28,7 +28,7 @@
 neg_alias_re = re.compile("^(%s)=!(%s)$" % (longopt_pat, longopt_pat))
 
 
-class FancyGetopt(object):
+class FancyGetopt:
     """Wrapper around the standard 'getopt()' module that provides some
     handy extra functionality:
       * short and long options are tied together
@@ -151,13 +151,13 @@
                 raise ValueError("invalid option tuple: %r" % option)
 
             # Type- and value-check the option names
-            if not isinstance(longopt, basestring) or len(longopt) < 2:
+            if not isinstance(longopt, str) or len(longopt) < 2:
                 raise PackagingGetoptError(
                       ("invalid long option '%s': "
                        "must be a string of length >= 2") % longopt)
 
             if (not ((short is None) or
-                     (isinstance(short, basestring) and len(short) == 1))):
+                     (isinstance(short, str) and len(short) == 1))):
                 raise PackagingGetoptError(
                       ("invalid short option '%s': "
                        "must be a single character or None") % short)
@@ -237,7 +237,7 @@
 
         try:
             opts, args = getopt.getopt(args, short_opts, self.long_opts)
-        except getopt.error, msg:
+        except getopt.error as msg:
             raise PackagingArgError(msg)
 
         for opt, val in opts:
@@ -377,7 +377,7 @@
     return parser.getopt(args, object)
 
 
-class OptionDummy(object):
+class OptionDummy:
     """Dummy class just used as a place to hold command-line option
     values as instance attributes."""
 
diff --git a/distutils2/install.py b/distutils2/install.py
--- a/distutils2/install.py
+++ b/distutils2/install.py
@@ -51,7 +51,7 @@
         # try to make the paths.
         try:
             os.makedirs(os.path.dirname(new))
-        except OSError, e:
+        except OSError as e:
             if e.errno != errno.EEXIST:
                 raise
         os.rename(old, new)
@@ -88,7 +88,7 @@
         dist.run_command('install_dist')
         name = dist.metadata['Name']
         return database.get_distribution(name) is not None
-    except (IOError, os.error, PackagingError, CCompilerError), msg:
+    except (IOError, os.error, PackagingError, CCompilerError) as msg:
         raise ValueError("Failed to install, " + str(msg))
 
 
@@ -160,7 +160,7 @@
         try:
             func(source_dir)
             return True
-        except ValueError, err:
+        except ValueError as err:
             # failed to install
             logger.info(str(err))
             return False
@@ -187,7 +187,7 @@
         try:
             _install_dist(dist, path)
             installed_dists.append(dist)
-        except Exception, e:
+        except Exception as e:
             logger.info('Failed: %s', e)
 
             # reverting
@@ -396,7 +396,7 @@
     def _move_file(source, target):
         try:
             os.rename(source, target)
-        except OSError, err:
+        except OSError as err:
             return err
         return None
 
@@ -495,12 +495,9 @@
 
     # trying to write a file there
     try:
-        testfile = tempfile.NamedTemporaryFile(suffix=project,
-                                               dir=purelib_path)
-        try:
-            testfile.write('test')
-        finally:
-            testfile.close()
+        with tempfile.NamedTemporaryFile(suffix=project,
+                                         dir=purelib_path) as testfile:
+            testfile.write(b'test')
     except OSError:
         # FIXME this should check the errno, or be removed altogether (race
         # condition: the directory permissions could be changed between here
@@ -525,7 +522,7 @@
         install_from_infos(install_path,
                            info['install'], info['remove'], info['conflict'])
 
-    except InstallationConflict, e:
+    except InstallationConflict as e:
         if logger.isEnabledFor(logging.INFO):
             projects = ('%r %s' % (p.name, p.version) for p in e.args[0])
             logger.info('%r conflicts with %s', project, ','.join(projects))
diff --git a/distutils2/manifest.py b/distutils2/manifest.py
--- a/distutils2/manifest.py
+++ b/distutils2/manifest.py
@@ -22,7 +22,7 @@
 _COMMENTED_LINE = re.compile('#.*?(?=\n)|\n(?=$)', re.M | re.S)
 
 
-class Manifest(object):
+class Manifest:
     """A list of files built by on exploring the filesystem and filtered by
     applying various patterns to what we find there.
     """
@@ -67,7 +67,7 @@
 
         Updates the list accordingly.
         """
-        if isinstance(path_or_file, basestring):
+        if isinstance(path_or_file, str):
             f = open(path_or_file)
         else:
             f = path_or_file
@@ -89,7 +89,7 @@
                 continue
             try:
                 self._process_template_line(line)
-            except PackagingTemplateError, msg:
+            except PackagingTemplateError as msg:
                 logger.warning("%s, %s", path_or_file, msg)
 
     def write(self, path):
@@ -98,11 +98,8 @@
         named by 'self.manifest'.
         """
         if os.path.isfile(path):
-            fp = open(path)
-            try:
+            with open(path) as fp:
                 first_line = fp.readline()
-            finally:
-                fp.close()
 
             if first_line != '# file GENERATED by distutils2, do NOT edit\n':
                 logger.info("not writing to manually maintained "
@@ -122,12 +119,9 @@
         distribution.
         """
         logger.info("reading manifest file %r", path)
-        manifest = open(path)
-        try:
+        with open(path) as manifest:
             for line in manifest.readlines():
                 self.append(line)
-        finally:
-            manifest.close()
 
     def exclude_pattern(self, pattern, anchor=True, prefix=None,
                         is_regex=False):
@@ -356,7 +350,7 @@
     or just returned as-is (assumes it's a regex object).
     """
     if is_regex:
-        if isinstance(pattern, basestring):
+        if isinstance(pattern, str):
             return re.compile(pattern)
         else:
             return pattern
diff --git a/distutils2/markers.py b/distutils2/markers.py
--- a/distutils2/markers.py
+++ b/distutils2/markers.py
@@ -3,10 +3,9 @@
 import os
 import sys
 import platform
-from tokenize import generate_tokens, NAME, OP, STRING, ENDMARKER
-from StringIO import StringIO as BytesIO
 
-from distutils2.compat import python_implementation
+from tokenize import tokenize, NAME, OP, STRING, ENDMARKER, ENCODING
+from io import BytesIO
 
 __all__ = ['interpret']
 
@@ -33,11 +32,10 @@
          'os.name': os.name,
          'platform.version': platform.version(),
          'platform.machine': platform.machine(),
-         'platform.python_implementation': python_implementation(),
-        }
+         'platform.python_implementation': platform.python_implementation()}
 
 
-class _Operation(object):
+class _Operation:
 
     def __init__(self, execution_context=None):
         self.left = None
@@ -98,7 +96,7 @@
         return _operate(self.op, left, right)
 
 
-class _OR(object):
+class _OR:
     def __init__(self, left, right=None):
         self.left = left
         self.right = right
@@ -113,7 +111,7 @@
         return self.left() or self.right()
 
 
-class _AND(object):
+class _AND:
     def __init__(self, left, right=None):
         self.left = left
         self.right = right
@@ -133,10 +131,10 @@
     marker = marker.strip().encode()
     ops = []
     op_starting = True
-    for token in generate_tokens(BytesIO(marker).readline):
+    for token in tokenize(BytesIO(marker).readline):
         # Unpack token
         toktype, tokval, rowcol, line, logical_line = token
-        if toktype not in (NAME, OP, STRING, ENDMARKER):
+        if toktype not in (NAME, OP, STRING, ENDMARKER, ENCODING):
             raise SyntaxError('Type not supported "%s"' % tokval)
 
         if op_starting:
diff --git a/distutils2/metadata.py b/distutils2/metadata.py
--- a/distutils2/metadata.py
+++ b/distutils2/metadata.py
@@ -4,10 +4,9 @@
 """
 
 import re
-import codecs
 import logging
 
-from StringIO import StringIO
+from io import StringIO
 from email import message_from_file
 from distutils2 import logger
 from distutils2.markers import interpret
@@ -185,7 +184,7 @@
 
 _FILESAFE = re.compile('[^A-Za-z0-9.]+')
 
-class Metadata(object):
+class Metadata:
     """The metadata of a release.
 
     Supports versions 1.0, 1.1 and 1.2 (auto-detected). You can
@@ -218,7 +217,7 @@
         self._fields['Metadata-Version'] = _best_version(self._fields)
 
     def _write_field(self, file, name, value):
-        file.write(u'%s: %s\n' % (name, value))
+        file.write('%s: %s\n' % (name, value))
 
     def __getitem__(self, name):
         return self.get(name)
@@ -311,11 +310,8 @@
 
     def read(self, filepath):
         """Read the metadata values from a file path."""
-        fp = codecs.open(filepath, 'r', encoding='utf-8')
-        try:
+        with open(filepath, 'r', encoding='utf-8') as fp:
             self.read_file(fp)
-        finally:
-            fp.close()
 
     def read_file(self, fileob):
         """Read the metadata values from a file object."""
@@ -337,11 +333,8 @@
 
     def write(self, filepath):
         """Write the metadata fields to filepath."""
-        fp = codecs.open(filepath, 'w', encoding='utf-8')
-        try:
+        with open(filepath, 'w', encoding='utf-8') as fp:
             self.write_file(fp)
-        finally:
-            fp.close()
 
     def write_file(self, fileobject):
         """Write the PKG-INFO format data to a file object."""
@@ -404,13 +397,13 @@
 
         if ((name in _ELEMENTSFIELD or name == 'Platform') and
             not isinstance(value, (list, tuple))):
-            if isinstance(value, basestring):
+            if isinstance(value, str):
                 value = [v.strip() for v in value.split(',')]
             else:
                 value = []
         elif (name in _LISTFIELDS and
               not isinstance(value, (list, tuple))):
-            if isinstance(value, basestring):
+            if isinstance(value, str):
                 value = [value]
             else:
                 value = []
@@ -472,7 +465,7 @@
             valid, value = self._platform(self._fields[name])
             if not valid:
                 return []
-            if isinstance(value, basestring):
+            if isinstance(value, str):
                 return value.split(',')
         valid, value = self._platform(self._fields[name])
         if not valid:
@@ -559,7 +552,7 @@
         return data
 
     # Mapping API
-    # TODO could add iter* variants
+    # XXX these methods should return views or sets in 3.x
 
     def keys(self):
         return list(_version2fieldlist(self['Metadata-Version']))
diff --git a/distutils2/pypi/base.py b/distutils2/pypi/base.py
--- a/distutils2/pypi/base.py
+++ b/distutils2/pypi/base.py
@@ -3,7 +3,7 @@
 from distutils2.pypi.dist import ReleasesList
 
 
-class BaseClient(object):
+class BaseClient:
     """Base class containing common methods for the index crawlers/clients"""
 
     def __init__(self, prefer_final, prefer_source):
diff --git a/distutils2/pypi/dist.py b/distutils2/pypi/dist.py
--- a/distutils2/pypi/dist.py
+++ b/distutils2/pypi/dist.py
@@ -8,9 +8,12 @@
 """
 
 import re
+import hashlib
 import tempfile
-import urllib
-import urlparse
+import urllib.request
+import urllib.parse
+import urllib.error
+
 from distutils2.errors import IrrationalVersionError
 from distutils2.version import (suggest_normalized_version, NormalizedVersion,
                                 get_version_predicate)
@@ -18,10 +21,6 @@
 from distutils2.pypi.errors import (HashDoesNotMatch, UnsupportedHashName,
                                     CantParseArchiveName)
 from distutils2._backport.shutil import unpack_archive
-try:
-    import hashlib
-except ImportError:
-    from distutils2._backport import hashlib
 
 
 __all__ = ['ReleaseInfo', 'DistInfo', 'ReleasesList', 'get_infos_from_url']
@@ -31,7 +30,7 @@
 DIST_TYPES = ['bdist', 'sdist']
 
 
-class IndexReference(object):
+class IndexReference:
     """Mixin used to store the index reference"""
     def set_index(self, index=None):
         self._index = index
@@ -297,8 +296,8 @@
         # if we do not have downloaded it yet, do it.
         if self.downloaded_location is None:
             url = self.url['url']
-            archive_name = urlparse.urlparse(url)[2].split('/')[-1]
-            filename, headers = urllib.urlretrieve(url,
+            archive_name = urllib.parse.urlparse(url)[2].split('/')[-1]
+            filename, headers = urllib.request.urlretrieve(url,
                                                    path + "/" + archive_name)
             self.downloaded_location = filename
             self._check_md5(filename)
@@ -327,12 +326,9 @@
         hashname = self.url['hashname']
         expected_hashval = self.url['hashval']
         if None not in (expected_hashval, hashname):
-            f = open(filename, 'rb')
-            try:
+            with open(filename, 'rb') as f:
                 hashval = hashlib.new(hashname)
                 hashval.update(f.read())
-            finally:
-                f.close()
 
             if hashval.hexdigest() != expected_hashval:
                 raise HashDoesNotMatch("got %s instead of %s"
@@ -491,7 +487,7 @@
         url = url.replace("#md5=%s" % md5_hash, "")
 
     # parse the archive name to find dist name and version
-    archive_name = urlparse.urlparse(url)[2].split('/')[-1]
+    archive_name = urllib.parse.urlparse(url)[2].split('/')[-1]
     extension_matched = False
     # remove the extension from the name
     for ext in EXTENSIONS:
diff --git a/distutils2/pypi/simple.py b/distutils2/pypi/simple.py
--- a/distutils2/pypi/simple.py
+++ b/distutils2/pypi/simple.py
@@ -6,17 +6,18 @@
 reference implementation available at http://pypi.python.org/simple/).
 """
 
-import httplib
+import http.client
 import re
 import socket
 import sys
-import urllib2
-import urlparse
+import urllib.request
+import urllib.parse
+import urllib.error
 import os
 
 from fnmatch import translate
+from functools import wraps
 from distutils2 import logger
-from distutils2.compat import wraps
 from distutils2.metadata import Metadata
 from distutils2.version import get_version_predicate
 from distutils2 import __version__ as distutils2_version
@@ -123,7 +124,7 @@
         self.follow_externals = follow_externals
 
         # mirroring attributes.
-        parsed = urlparse.urlparse(index_url)
+        parsed = urllib.parse.urlparse(index_url)
         self.scheme = parsed[0]
         if self.scheme == 'file':
             ender = os.path.sep
@@ -156,20 +157,17 @@
 
         Return a list of names.
         """
-        if u'*' in name:
-            name.replace(u'*', u'.*')
+        if '*' in name:
+            name.replace('*', '.*')
         else:
-            name = u"%s%s%s" % (u'*.?', name, u'*.?')
-        name = name.replace(u'*', u'[^<]*')  # avoid matching end tag
-        pattern = (u'<a[^>]*>(%s)</a>' % name).encode('utf-8')
+            name = "%s%s%s" % ('*.?', name, '*.?')
+        name = name.replace('*', '[^<]*')  # avoid matching end tag
+        pattern = ('<a[^>]*>(%s)</a>' % name).encode('utf-8')
         projectname = re.compile(pattern, re.I)
         matching_projects = []
 
-        index = self._open_url(self.index_url)
-        try:
+        with self._open_url(self.index_url) as index:
             index_content = index.read()
-        finally:
-            index.close()
 
         for match in projectname.finditer(index_content):
             project_name = match.group(1).decode('utf-8')
@@ -231,10 +229,8 @@
         """
         self._mirrors_used.add(self.index_url)
         index_url = self._mirrors.pop()
-        # XXX use urlparse for a real check of missing scheme part
-        if not (index_url.startswith("http://") or
-                index_url.startswith("https://") or
-                index_url.startswith("file://")):
+        # XXX use urllib.parse for a real check of missing scheme part
+        if not index_url.startswith(("http://", "https://", "file://")):
             index_url = "http://%s" % index_url
 
         if not index_url.endswith("/simple"):
@@ -251,10 +247,10 @@
         # if _index_url is contained in the given URL, we are browsing the
         # index, and it's always "browsable".
         # local files are always considered browable resources
-        if self.index_url in url or urlparse.urlparse(url)[0] == "file":
+        if self.index_url in url or urllib.parse.urlparse(url)[0] == "file":
             return True
         elif self.follow_externals:
-            if self._allowed_hosts(urlparse.urlparse(url)[1]):  # 1 is netloc
+            if self._allowed_hosts(urllib.parse.urlparse(url)[1]):  # 1 is netloc
                 return True
             else:
                 return False
@@ -311,8 +307,7 @@
                              the links we find (eg. run recursively this
                              method on it)
         """
-        f = self._open_url(url)
-        try:
+        with self._open_url(url) as f:
             base_url = f.url
             if url not in self._processed_urls:
                 self._processed_urls.append(url)
@@ -325,7 +320,7 @@
                             try:
                                 infos = get_infos_from_url(link, project_name,
                                             is_external=self.index_url not in url)
-                            except CantParseArchiveName, e:
+                            except CantParseArchiveName as e:
                                 logger.warning(
                                     "version has not been parsed: %s", e)
                             else:
@@ -334,8 +329,6 @@
                             if self._is_browsable(link) and follow_links:
                                 self._process_url(link, project_name,
                                     follow_links=False)
-        finally:
-            f.close()
 
     def _get_link_matcher(self, url):
         """Returns the right link matcher function of the given url
@@ -346,7 +339,7 @@
             return self._default_link_matcher
 
     def _get_full_url(self, url, base_url):
-        return urlparse.urljoin(base_url, self._htmldecode(url))
+        return urllib.parse.urljoin(base_url, self._htmldecode(url))
 
     def _simple_link_matcher(self, content, base_url):
         """Yield all links with a rel="download" or rel="homepage".
@@ -402,11 +395,11 @@
         files support.
 
         """
-        scheme, netloc, path, params, query, frag = urlparse.urlparse(url)
+        scheme, netloc, path, params, query, frag = urllib.parse.urlparse(url)
 
         # authentication stuff
         if scheme in ('http', 'https'):
-            auth, host = urllib2.splituser(netloc)
+            auth, host = urllib.parse.splituser(netloc)
         else:
             auth = None
 
@@ -418,27 +411,27 @@
         # add authorization headers if auth is provided
         if auth:
             auth = "Basic " + \
-                urlparse.unquote(auth).encode('base64').strip()
-            new_url = urlparse.urlunparse((
+                urllib.parse.unquote(auth).encode('base64').strip()
+            new_url = urllib.parse.urlunparse((
                 scheme, host, path, params, query, frag))
-            request = urllib2.Request(new_url)
+            request = urllib.request.Request(new_url)
             request.add_header("Authorization", auth)
         else:
-            request = urllib2.Request(url)
+            request = urllib.request.Request(url)
         request.add_header('User-Agent', USER_AGENT)
         try:
-            fp = urllib2.urlopen(request)
-        except (ValueError, httplib.InvalidURL), v:
+            fp = urllib.request.urlopen(request)
+        except (ValueError, http.client.InvalidURL) as v:
             msg = ' '.join([str(arg) for arg in v.args])
             raise PackagingPyPIError('%s %s' % (url, msg))
-        except urllib2.HTTPError, v:
+        except urllib.error.HTTPError as v:
             return v
-        except urllib2.URLError, v:
+        except urllib.error.URLError as v:
             raise DownloadError("Download error for %s: %s" % (url, v.reason))
-        except httplib.BadStatusLine, v:
+        except http.client.BadStatusLine as v:
             raise DownloadError('%s returned a bad status line. '
                 'The server might be down, %s' % (url, v.line))
-        except httplib.HTTPException, v:
+        except http.client.HTTPException as v:
             raise DownloadError("Download error for %s: %s" % (url, v))
         except socket.timeout:
             raise DownloadError("The server timeouted")
@@ -447,9 +440,9 @@
             # Put authentication info back into request URL if same host,
             # so that links found on the page will work
             s2, h2, path2, param2, query2, frag2 = \
-                urlparse.urlparse(fp.url)
+                urllib.parse.urlparse(fp.url)
             if s2 == scheme and h2 == host:
-                fp.url = urlparse.urlunparse(
+                fp.url = urllib.parse.urlunparse(
                     (s2, netloc, path2, param2, query2, frag2))
         return fp
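[Editor's note: the hunks above convert `urlparse`/`urllib2`/`httplib` to the `urllib.parse`/`urllib.request`/`http.client` layout, but the converted auth line still calls `.encode('base64')`, a Python 2 str codec that does not exist in Python 3; under 3.x that line would raise at runtime. A minimal sketch of the same basic-auth request using the `base64` module instead — the function name and default User-Agent below are illustrative, not part of the patch:]

```python
import base64
import urllib.parse
import urllib.request

def make_authed_request(url, user_agent="editor-example"):
    # Split credentials out of the netloc, as the patched code does with
    # urllib.parse.splituser() (later deprecated in 3.x).
    scheme, netloc, path, params, query, frag = urllib.parse.urlparse(url)
    auth, sep, host = netloc.rpartition('@')
    if not sep:
        auth, host = None, netloc
    if auth:
        # str -> bytes -> base64 -> str: the Python 3 spelling of the
        # Python 2 idiom ``auth.encode('base64')``
        token = base64.b64encode(urllib.parse.unquote(auth).encode('utf-8'))
        new_url = urllib.parse.urlunparse(
            (scheme, host, path, params, query, frag))
        request = urllib.request.Request(new_url)
        request.add_header("Authorization", "Basic " + token.decode('ascii'))
    else:
        request = urllib.request.Request(url)
    request.add_header("User-Agent", user_agent)
    return request
```

[The request is built but never sent here; `urllib.request.urlopen(request)` would perform the download as in `_open_url` above.]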
 
diff --git a/distutils2/pypi/wrapper.py b/distutils2/pypi/wrapper.py
--- a/distutils2/pypi/wrapper.py
+++ b/distutils2/pypi/wrapper.py
@@ -25,13 +25,13 @@
         exception = None
         methods = [func]
         for f in wrapper._indexes.values():
-            if f != func.im_self and hasattr(f, func.f_name):
-                methods.append(getattr(f, func.f_name))
+            if f != func.__self__ and hasattr(f, func.__name__):
+                methods.append(getattr(f, func.__name__))
         for method in methods:
             try:
                 response = method(*args, **kwargs)
                 retry = False
-            except Exception, e:
+            except Exception as e:
                 exception = e
             if not retry:
                 break
@@ -42,7 +42,7 @@
     return decorator
 
 
-class ClientWrapper(object):
+class ClientWrapper:
     """Wrapper around simple and xmlrpc clients,
 
     Choose the best implementation to use depending the needs, using the given
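[Editor's note: the wrapper hunk relies on the Python 3 bound-method attribute renames — `im_self` becomes `__self__`, and the method name is read from `__name__` (the Python 2 side's `func.f_name` appears to have been a typo for `func_name` all along). A small runnable sketch with illustrative classes, not the real PyPI clients:]

```python
# The point is the attribute renames, not the classes themselves.
class SimpleIndex:
    def search(self, query):
        return 'simple:' + query

class XmlRpcIndex:
    def search(self, query):
        return 'xmlrpc:' + query

simple, rpc = SimpleIndex(), XmlRpcIndex()
func = simple.search                      # a bound method
assert func.__self__ is simple            # Python 2: func.im_self
assert func.__name__ == 'search'          # Python 2: func.im_func.func_name
# Find the same method on another index, as the wrapper decorator does:
fallback = getattr(rpc, func.__name__)
assert fallback('distutils2') == 'xmlrpc:distutils2'
```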
diff --git a/distutils2/pypi/xmlrpc.py b/distutils2/pypi/xmlrpc.py
--- a/distutils2/pypi/xmlrpc.py
+++ b/distutils2/pypi/xmlrpc.py
@@ -6,7 +6,7 @@
 implementation at http://wiki.python.org/moin/PyPiXmlRpc).
 """
 
-import xmlrpclib
+import xmlrpc.client
 
 from distutils2 import logger
 from distutils2.errors import IrrationalVersionError
@@ -171,7 +171,7 @@
                 project.add_release(release=ReleaseInfo(p['name'],
                     p['version'], metadata={'summary': p['summary']},
                     index=self._index))
-            except IrrationalVersionError, e:
+            except IrrationalVersionError as e:
                 logger.warning("Irrational version error found: %s", e)
         return [self._projects[p['name'].lower()] for p in projects]
 
@@ -195,6 +195,6 @@
 
         """
         if not hasattr(self, '_server_proxy'):
-            self._server_proxy = xmlrpclib.ServerProxy(self.server_url)
+            self._server_proxy = xmlrpc.client.ServerProxy(self.server_url)
 
         return self._server_proxy
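[Editor's note: `xmlrpclib` is now `xmlrpc.client`. A runnable sketch of the lazy, cached proxy pattern from the last hunk — the class name and URL are placeholders, and `ServerProxy` opens no connection until a method is actually called:]

```python
import xmlrpc.client

class Client:
    server_url = 'https://pypi.org/pypi'   # placeholder URL

    @property
    def proxy(self):
        # Create the ServerProxy on first access, then reuse it.
        if not hasattr(self, '_server_proxy'):
            self._server_proxy = xmlrpc.client.ServerProxy(self.server_url)
        return self._server_proxy

client = Client()
assert client.proxy is client.proxy   # cached after first access
```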
diff --git a/distutils2/run.py b/distutils2/run.py
--- a/distutils2/run.py
+++ b/distutils2/run.py
@@ -74,7 +74,7 @@
     return optdict
 
 
-class action_help(object):
+class action_help:
     """Prints a help message when the standard help flags: -h and --help
     are used on the commandline.
     """
@@ -86,7 +86,7 @@
         def wrapper(*args, **kwargs):
             f_args = args[1]
             if '--help' in f_args or '-h' in f_args:
-                print self.help_msg
+                print(self.help_msg)
                 return
             return f(*args, **kwargs)
         return wrapper
@@ -132,7 +132,7 @@
     else:
         dists = get_distributions(use_egg_info=True)
         graph = generate_graph(dists)
-        print graph.repr_node(dist)
+        print(graph.repr_node(dist))
 
 
 @action_help("""\
@@ -205,13 +205,13 @@
 
     for key in keys:
         if key in metadata:
-            print metadata._convert_name(key) + ':'
+            print(metadata._convert_name(key) + ':')
             value = metadata[key]
             if isinstance(value, list):
                 for v in value:
-                    print '   ', v
+                    print('   ', v)
             else:
-                print '   ', value.replace('\n', '\n    ')
+                print('   ', value.replace('\n', '\n    '))
 
 
 @action_help("""\
@@ -257,14 +257,14 @@
     commands = STANDARD_COMMANDS  # + extra commands
 
     if args == ['--list-commands']:
-        print 'List of available commands:'
+        print('List of available commands:')
         cmds = sorted(commands)
 
         for cmd in cmds:
             cls = dispatcher.cmdclass.get(cmd) or get_command_class(cmd)
             desc = getattr(cls, 'description',
                             '(no description available)')
-            print '  %s: %s' % (cmd, desc)
+            print('  %s: %s' % (cmd, desc))
         return
 
     while args:
@@ -310,7 +310,7 @@
 
     number = 0
     for dist in results:
-        print '%r %s (from %r)' % (dist.name, dist.version, dist.path)
+        print('%r %s (from %r)' % (dist.name, dist.version, dist.path))
         number += 1
 
     if number == 0:
@@ -371,7 +371,7 @@
 ]
 
 
-class Dispatcher(object):
+class Dispatcher:
     """Reads the command-line options
     """
     def __init__(self, args=None):
@@ -445,7 +445,7 @@
         # it takes.
         try:
             cmd_class = get_command_class(command)
-        except PackagingModuleError, msg:
+        except PackagingModuleError as msg:
             raise PackagingArgError(msg)
 
         # XXX We want to push this in distutils2.command
@@ -545,18 +545,18 @@
         # late import because of mutual dependence between these modules
         from distutils2.command.cmd import Command
 
-        print 'Usage: pysetup [options] action [action_options]'
-        print
+        print('Usage: pysetup [options] action [action_options]')
+        print()
         if global_options_:
             self.print_usage(self.parser)
-            print
+            print()
 
         if display_options_:
             parser.set_option_table(display_options)
             parser.print_help(
                 "Information display options (just display " +
                 "information, ignore any commands)")
-            print
+            print()
 
         for command in commands:
             if isinstance(command, type) and issubclass(command, Command):
@@ -570,15 +570,15 @@
                 parser.set_option_table(cls.user_options)
 
             parser.print_help("Options for %r command:" % cls.__name__)
-            print
+            print()
 
     def _show_command_help(self, command):
-        if isinstance(command, basestring):
+        if isinstance(command, str):
             command = get_command_class(command)
 
         desc = getattr(command, 'description', '(no description available)')
-        print 'Description:', desc
-        print
+        print('Description:', desc)
+        print()
 
         if (hasattr(command, 'help_options') and
             isinstance(command.help_options, list)):
@@ -588,7 +588,7 @@
             self.parser.set_option_table(command.user_options)
 
         self.parser.print_help("Options:")
-        print
+        print()
 
     def _get_command_groups(self):
         """Helper function to retrieve all the command class names divided
@@ -615,7 +615,7 @@
 
         self.print_command_list(std_commands, "Standard commands", max_length)
         if extra_commands:
-            print
+            print()
             self.print_command_list(extra_commands, "Extra commands",
                                     max_length)
 
@@ -623,14 +623,14 @@
         """Print a subset of the list of all commands -- used by
         'print_commands()'.
         """
-        print header + ":"
+        print(header + ":")
 
         for cmd in commands:
             cls = self.cmdclass.get(cmd) or get_command_class(cmd)
             description = getattr(cls, 'description',
                                   '(no description available)')
 
-            print "  %-*s  %s" % (max_length, cmd, description)
+            print("  %-*s  %s" % (max_length, cmd, description))
 
     def __call__(self):
         if self.action is None:
@@ -646,17 +646,16 @@
     old_level = logger.level
     old_handlers = list(logger.handlers)
     try:
-        try:
-            dispatcher = Dispatcher(args)
-            if dispatcher.action is None:
-                return
-            return dispatcher()
-        except KeyboardInterrupt:
-            logger.info('interrupted')
-            return 1
-        except (IOError, os.error, PackagingError, CCompilerError), exc:
-            logger.exception(exc)
-            return 1
+        dispatcher = Dispatcher(args)
+        if dispatcher.action is None:
+            return
+        return dispatcher()
+    except KeyboardInterrupt:
+        logger.info('interrupted')
+        return 1
+    except (IOError, os.error, PackagingError, CCompilerError) as exc:
+        logger.exception(exc)
+        return 1
     finally:
         logger.setLevel(old_level)
         logger.handlers[:] = old_handlers
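[Editor's note: the nested `try:/try:/except:/finally:` removed in the last hunk dates from Python 2.4, which could not combine `except` and `finally` in one statement; Python 2.5+ and every Python 3 can. A minimal sketch of the unified form, with an illustrative logger name and return codes:]

```python
import logging

logger = logging.getLogger('example')

def main(action):
    old_level = logger.level
    try:
        # one statement now carries both the handlers and the cleanup
        return action()
    except KeyboardInterrupt:
        logger.info('interrupted')
        return 1
    finally:
        logger.setLevel(old_level)   # runs on every exit path

assert main(lambda: 0) == 0
```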
diff --git a/distutils2/tests/__init__.py b/distutils2/tests/__init__.py
--- a/distutils2/tests/__init__.py
+++ b/distutils2/tests/__init__.py
@@ -7,14 +7,14 @@
 
 Utility code is included in distutils2.tests.support.
 
-Always import unittest from this module, it will be the right version
+Always import unittest from this module: it will be unittest from the
 standard library for packaging tests and unittest2 for distutils2 tests.
 """
 
 import os
 import sys
 import unittest2 as unittest
-from StringIO import StringIO
+from io import StringIO
 
 # XXX move helpers to support, add tests for them, remove things that
 # duplicate test.support (or keep them for the backport; needs thinking)
@@ -42,7 +42,7 @@
     """Test failed."""
 
 
-class BasicTestRunner(object):
+class BasicTestRunner:
     def run(self, test):
         result = unittest.TestResult()
         test(result)
@@ -72,13 +72,13 @@
 def run_unittest(classes, verbose_=1):
     """Run tests from unittest.TestCase-derived classes.
 
-    Originally extracted from stdlib test.test_support and modified to
+    Originally extracted from stdlib test.support and modified to
     support unittest2.
     """
     valid_types = (unittest.TestSuite, unittest.TestCase)
     suite = unittest.TestSuite()
     for cls in classes:
-        if isinstance(cls, basestring):
+        if isinstance(cls, str):
             if cls in sys.modules:
                 suite.addTest(unittest.findTestCases(sys.modules[cls]))
             else:
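[Editor's note: `StringIO.StringIO` becomes `io.StringIO`, which in Python 3 is strictly for text; binary data goes through `io.BytesIO`. A quick sketch of the distinction:]

```python
from io import BytesIO, StringIO

text = StringIO()
text.write('captured output\n')
assert text.getvalue() == 'captured output\n'

binary = BytesIO()
binary.write(b'\x00\x01')
assert binary.getvalue() == b'\x00\x01'

# Mixing the two is now an immediate error rather than a silent
# str/bytes confusion, as it could be with Python 2's StringIO:
try:
    StringIO().write(b'bytes')
except TypeError:
    pass
```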
diff --git a/distutils2/tests/__main__.py b/distutils2/tests/__main__.py
--- a/distutils2/tests/__main__.py
+++ b/distutils2/tests/__main__.py
@@ -4,7 +4,7 @@
 
 import os
 import sys
-from test.test_support import reap_children, reap_threads, run_unittest
+from test.support import reap_children, reap_threads, run_unittest
 
 from distutils2.tests import unittest
 
diff --git a/distutils2/tests/pypi_server.py b/distutils2/tests/pypi_server.py
--- a/distutils2/tests/pypi_server.py
+++ b/distutils2/tests/pypi_server.py
@@ -30,17 +30,15 @@
 """
 
 import os
-import Queue
+import queue
 import select
 import threading
-import SocketServer
-from BaseHTTPServer import HTTPServer
-from SimpleHTTPServer import SimpleHTTPRequestHandler
-from SimpleXMLRPCServer import SimpleXMLRPCServer
+import socketserver
+from functools import wraps
+from http.server import HTTPServer, SimpleHTTPRequestHandler
+from xmlrpc.server import SimpleXMLRPCServer
 
 from distutils2.tests import unittest
-from distutils2.compat import wraps
-
 
 
 PYPI_DEFAULT_STATIC_PATH = os.path.join(
@@ -116,7 +114,7 @@
             self.server = HTTPServer(('127.0.0.1', 0), PyPIRequestHandler)
             self.server.RequestHandlerClass.pypi_server = self
 
-            self.request_queue = Queue.Queue()
+            self.request_queue = queue.Queue()
             self._requests = []
             self.default_response_status = 404
             self.default_response_headers = [('Content-type', 'text/plain')]
@@ -153,7 +151,7 @@
     def stop(self):
         """self shutdown is not supported for python < 2.6"""
         self._run = False
-        if self.isAlive():
+        if self.is_alive():
             self.join()
         self.server.server_close()
 
@@ -170,7 +168,7 @@
         while True:
             try:
                 self._requests.append(self.request_queue.get_nowait())
-            except Queue.Empty:
+            except queue.Empty:
                 break
         return self._requests
 
@@ -222,18 +220,12 @@
                         relative_path += "index.html"
 
                     if relative_path.endswith('.tar.gz'):
-                        file = open(fs_path + relative_path, 'rb')
-                        try:
+                        with open(fs_path + relative_path, 'rb') as file:
                             data = file.read()
-                        finally:
-                            file.close()
                         headers = [('Content-type', 'application/x-gtar')]
                     else:
-                        file = open(fs_path + relative_path)
-                        try:
+                        with open(fs_path + relative_path) as file:
                             data = file.read().encode()
-                        finally:
-                            file.close()
                         headers = [('Content-type', 'text/html')]
 
                     headers.append(('Content-Length', len(data)))
@@ -269,7 +261,7 @@
             self.send_header(header, value)
         self.end_headers()
 
-        if isinstance(data, unicode):
+        if isinstance(data, str):
             data = data.encode('utf-8')
 
         self.wfile.write(data)
@@ -278,12 +270,12 @@
 class PyPIXMLRPCServer(SimpleXMLRPCServer):
     def server_bind(self):
         """Override server_bind to store the server name."""
-        SocketServer.TCPServer.server_bind(self)
+        socketserver.TCPServer.server_bind(self)
         host, port = self.socket.getsockname()[:2]
         self.server_port = port
 
 
-class MockDist(object):
+class MockDist:
     """Fake distribution, used in the Mock PyPI Server"""
 
     def __init__(self, name, version="1.0", hidden=False, url="http://url/",
@@ -398,7 +390,7 @@
         }
 
 
-class XMLRPCMockIndex(object):
+class XMLRPCMockIndex:
     """Mock XMLRPC server"""
 
     def __init__(self, dists=[]):
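[Editor's note: this file exercises several wholesale stdlib renames — `Queue`→`queue`, `SocketServer`→`socketserver`, `BaseHTTPServer`/`SimpleHTTPServer`→`http.server`, `SimpleXMLRPCServer`→`xmlrpc.server`. The drain-the-queue loop from the hunk above, spelled with the new names (the queued tuples are made-up request data):]

```python
import queue

requests = queue.Queue()
requests.put(('GET', '/simple/'))
requests.put(('GET', '/index.html'))

drained = []
while True:
    try:
        drained.append(requests.get_nowait())
    except queue.Empty:      # Python 2: Queue.Empty
        break

assert drained == [('GET', '/simple/'), ('GET', '/index.html')]
```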
diff --git a/distutils2/tests/pypi_test_server.py b/distutils2/tests/pypi_test_server.py
--- a/distutils2/tests/pypi_test_server.py
+++ b/distutils2/tests/pypi_test_server.py
@@ -40,7 +40,7 @@
         self.addCleanup(self.pypi.stop)
 
 
-class PyPIServer(object):
+class PyPIServer:
     """Shim to access testpypi.python.org, for testing a real server."""
 
     def __init__(self, test_static_path=None,
diff --git a/distutils2/tests/support.py b/distutils2/tests/support.py
--- a/distutils2/tests/support.py
+++ b/distutils2/tests/support.py
@@ -83,7 +83,7 @@
         self.buffer.append(record)
 
 
-class LoggingCatcher(object):
+class LoggingCatcher:
     """TestCase-compatible mixin to receive logging calls.
 
     Upon setUp, instances of this classes get a BufferingHandler that's
@@ -139,7 +139,7 @@
         return messages
 
 
-class TempdirManager(object):
+class TempdirManager:
     """TestCase-compatible mixin to create temporary directories and files.
 
     Directories and files created in a test_* method will be removed after it
@@ -182,11 +182,8 @@
         """
         if isinstance(path, (list, tuple)):
             path = os.path.join(*path)
-        f = codecs.open(path, 'w', encoding=encoding)
-        try:
+        with open(path, 'w', encoding=encoding) as f:
             f.write(content)
-        finally:
-            f.close()
 
     def create_dist(self, **kw):
         """Create a stub distribution object and files.
@@ -224,7 +221,7 @@
         self.assertFalse(os.path.isfile(path), "%r exists" % path)
 
 
-class EnvironRestorer(object):
+class EnvironRestorer:
     """TestCase-compatible mixin to restore or delete environment variables.
 
     The variables to restore (or delete if they were not originally present)
@@ -251,7 +248,7 @@
         super(EnvironRestorer, self).tearDown()
 
 
-class DummyCommand(object):
+class DummyCommand:
     """Class to store options for retrieval via set_undefined_options().
 
     Useful for mocking one dependency command in the tests for another
@@ -339,6 +336,11 @@
         cmd = build_ext(dist)
         support.fixup_build_ext(cmd)
         cmd.ensure_finalized()
+
+    In addition, this function also fixes cmd.distribution.include_dirs if
+    the running Python is an uninstalled Python 3.3.  (This fix is not done in
+    packaging, which does not need it, nor in distutils2 for Python 2, which
+    has no in-development version that can't be expected to be installed.)
     """
     if os.name == 'nt':
         cmd.debug = sys.executable.endswith('_d.exe')
@@ -354,12 +356,17 @@
             name, equals, value = runshared.partition('=')
             cmd.library_dirs = value.split(os.pathsep)
 
+    # Allow tests to run with an uninstalled Python 3.3
+    if sys.version_info[:2] == (3, 3) and sysconfig.is_python_build():
+        pysrcdir = sysconfig.get_config_var('projectbase')
+        cmd.distribution.include_dirs.append(os.path.join(pysrcdir, 'Include'))
+
 
 try:
-    from test.test_support import skip_unless_symlink
+    from test.support import skip_unless_symlink
 except ImportError:
     skip_unless_symlink = unittest.skip(
-        'requires test.test_support.skip_unless_symlink')
+        'requires test.support.skip_unless_symlink')
 
 
 requires_zlib = unittest.skipUnless(zlib, 'requires zlib')
@@ -368,7 +375,7 @@
 def unlink(filename):
     try:
         os.unlink(filename)
-    except OSError, error:
+    except OSError as error:
         # The filename need not exist.
         if error.errno not in (errno.ENOENT, errno.ENOTDIR):
             raise
@@ -381,7 +388,7 @@
     This will typically be run on the result of the communicate() method
     of a subprocess.Popen object.
     """
-    stderr = re.sub(r"\[\d+ refs\]\r?\n?$", "", stderr).strip()
+    stderr = re.sub(br"\[\d+ refs\]\r?\n?$", b"", stderr).strip()
     return stderr
 
 
diff --git a/distutils2/tests/test_command_build_clib.py b/distutils2/tests/test_command_build_clib.py
--- a/distutils2/tests/test_command_build_clib.py
+++ b/distutils2/tests/test_command_build_clib.py
@@ -68,7 +68,7 @@
         pkg_dir, dist = self.create_dist()
         cmd = build_clib(dist)
 
-        class FakeCompiler(object):
+        class FakeCompiler:
             def compile(*args, **kw):
                 pass
             create_static_lib = compile
diff --git a/distutils2/tests/test_command_build_ext.py b/distutils2/tests/test_command_build_ext.py
--- a/distutils2/tests/test_command_build_ext.py
+++ b/distutils2/tests/test_command_build_ext.py
@@ -3,7 +3,7 @@
 import site
 import shutil
 import textwrap
-from StringIO import StringIO
+from io import StringIO
 from distutils2.dist import Distribution
 from distutils2.errors import (UnknownFileError, CompileError,
                                PackagingPlatformError)
@@ -21,18 +21,13 @@
     def setUp(self):
         super(BuildExtTestCase, self).setUp()
         self.tmp_dir = self.mkdtemp()
-        if sys.version > "2.6":
-            self.old_user_base = site.USER_BASE
-            site.USER_BASE = self.mkdtemp()
+        self.old_user_base = site.USER_BASE
+        site.USER_BASE = self.mkdtemp()
 
     def tearDown(self):
-        if sys.version > "2.6":
-            site.USER_BASE = self.old_user_base
-
+        site.USER_BASE = self.old_user_base
         super(BuildExtTestCase, self).tearDown()
 
-    @unittest.skipIf(sys.version_info[:2] < (2, 6),
-                     "can't compile xxmodule successfully")
     def test_build_ext(self):
         support.copy_xxmodule_c(self.tmp_dir)
         xx_c = os.path.join(self.tmp_dir, 'xxmodule.c')
@@ -94,7 +89,6 @@
         # make sure we get some library dirs under solaris
         self.assertGreater(len(cmd.library_dirs), 0)
 
-    @unittest.skipIf(sys.version < '2.6', 'requires Python 2.6 or higher')
     def test_user_site(self):
         dist = Distribution({'name': 'xx'})
         cmd = build_ext(dist)
@@ -362,8 +356,7 @@
 
         deptarget_c = os.path.join(self.tmp_dir, 'deptargetmodule.c')
 
-        fp = open(deptarget_c, 'w')
-        try:
+        with open(deptarget_c, 'w') as fp:
             fp.write(textwrap.dedent('''\
                 #include <AvailabilityMacros.h>
 
@@ -375,8 +368,6 @@
                 #endif
 
             ''' % operator))
-        finally:
-            fp.close()
 
         # get the deployment target that the interpreter was built with
         target = sysconfig.get_config_var('MACOSX_DEPLOYMENT_TARGET')
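[Editor's note: the recurring open/try/finally/close triples throughout these test files all collapse to the `with` statement, which closes the file even when the body raises. A self-contained sketch of the pattern, using a throwaway temporary path:]

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'deptargetmodule.c')

with open(path, 'w') as fp:
    fp.write('#include <AvailabilityMacros.h>\n')
assert fp.closed          # closed by the with block, no finally needed

with open(path) as fp:
    first_line = fp.readline()
assert first_line.startswith('#include')
```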
diff --git a/distutils2/tests/test_command_build_py.py b/distutils2/tests/test_command_build_py.py
--- a/distutils2/tests/test_command_build_py.py
+++ b/distutils2/tests/test_command_build_py.py
@@ -61,7 +61,7 @@
         self.assertIn("__init__.py", files)
         self.assertIn("README.txt", files)
         # XXX even with -O, distutils writes pyc, not pyo; bug?
-        if getattr(sys , 'dont_write_bytecode', False):
+        if sys.dont_write_bytecode:
             self.assertNotIn("__init__.pyc", files)
         else:
             self.assertIn("__init__.pyc", files)
@@ -99,8 +99,6 @@
             os.chdir(cwd)
             sys.stdout = old_stdout
 
-    @unittest.skipUnless(hasattr(sys, 'dont_write_bytecode'),
-                         'sys.dont_write_bytecode not supported')
     def test_dont_write_bytecode(self):
         # makes sure byte_compile is not used
         pkg_dir, dist = self.create_dist()
diff --git a/distutils2/tests/test_command_build_scripts.py b/distutils2/tests/test_command_build_scripts.py
--- a/distutils2/tests/test_command_build_scripts.py
+++ b/distutils2/tests/test_command_build_scripts.py
@@ -71,11 +71,8 @@
         return expected
 
     def write_script(self, dir, name, text):
-        f = open(os.path.join(dir, name), "w")
-        try:
+        with open(os.path.join(dir, name), "w") as f:
             f.write(text)
-        finally:
-            f.close()
 
     def test_version_int(self):
         source = self.mkdtemp()
diff --git a/distutils2/tests/test_command_config.py b/distutils2/tests/test_command_config.py
--- a/distutils2/tests/test_command_config.py
+++ b/distutils2/tests/test_command_config.py
@@ -13,11 +13,8 @@
 
     def test_dump_file(self):
         this_file = __file__.rstrip('co')
-        f = open(this_file)
-        try:
+        with open(this_file) as f:
             numlines = len(f.readlines())
-        finally:
-            f.close()
 
         dump_file(this_file, 'I am the header')
 
diff --git a/distutils2/tests/test_command_install_dist.py b/distutils2/tests/test_command_install_dist.py
--- a/distutils2/tests/test_command_install_dist.py
+++ b/distutils2/tests/test_command_install_dist.py
@@ -72,7 +72,6 @@
         check_path(cmd.install_scripts, os.path.join(destination, "bin"))
         check_path(cmd.install_data, destination)
 
-    @unittest.skipIf(sys.version < '2.6', 'requires Python 2.6 or higher')
     def test_user_site(self):
         # test install with --user
         # preparing the environment for the test
@@ -173,19 +172,18 @@
         cmd.home = 'home'
         self.assertRaises(PackagingOptionError, cmd.finalize_options)
 
-        if sys.version >= '2.6':
-            # can't combine user with with prefix/exec_prefix/home or
-            # install_(plat)base
-            cmd.prefix = None
-            cmd.user = 'user'
-            self.assertRaises(PackagingOptionError, cmd.finalize_options)
+        # can't combine user with with prefix/exec_prefix/home or
+        # install_(plat)base
+        cmd.prefix = None
+        cmd.user = 'user'
+        self.assertRaises(PackagingOptionError, cmd.finalize_options)
 
     def test_old_record(self):
         # test pre-PEP 376 --record option (outside dist-info dir)
         install_dir = self.mkdtemp()
         project_dir, dist = self.create_dist(scripts=['hello'])
         os.chdir(project_dir)
-        self.write_file('hello', "print 'o hai'")
+        self.write_file('hello', "print('o hai')")
 
         cmd = install_dist(dist)
         dist.command_obj['install_dist'] = cmd
@@ -194,11 +192,8 @@
         cmd.ensure_finalized()
         cmd.run()
 
-        f = open(cmd.record)
-        try:
+        with open(cmd.record) as f:
             content = f.read()
-        finally:
-            f.close()
 
         found = [os.path.basename(line) for line in content.splitlines()]
         expected = ['hello', 'METADATA', 'INSTALLER', 'REQUESTED', 'RECORD']
@@ -207,8 +202,6 @@
         # XXX test that fancy_getopt is okay with options named
         # record and no-record but unrelated
 
-    @unittest.skipIf(sys.version_info[:2] < (2, 6),
-                     "can't compile xxmodule successfully")
     def test_old_record_extensions(self):
         # test pre-PEP 376 --record option with ext modules
         install_dir = self.mkdtemp()
@@ -229,11 +222,8 @@
         cmd.ensure_finalized()
         cmd.run()
 
-        f = open(cmd.record)
-        try:
+        with open(cmd.record) as f:
             content = f.read()
-        finally:
-            f.close()
 
         found = [os.path.basename(line) for line in content.splitlines()]
         expected = [_make_ext_name('xx'),
diff --git a/distutils2/tests/test_command_install_distinfo.py b/distutils2/tests/test_command_install_distinfo.py
--- a/distutils2/tests/test_command_install_distinfo.py
+++ b/distutils2/tests/test_command_install_distinfo.py
@@ -2,7 +2,7 @@
 
 import os
 import csv
-import codecs
+import hashlib
 
 from distutils2.command.install_distinfo import install_distinfo
 from distutils2.command.cmd import Command
@@ -10,10 +10,6 @@
 from distutils2.metadata import Metadata
 from distutils2.tests import unittest, support
 from distutils2._backport import sysconfig
-try:
-    import hashlib
-except:
-    from distutils2._backport import hashlib
 
 
 class DummyInstallCmd(Command):
@@ -59,18 +55,10 @@
         dist_info = os.path.join(install_dir, 'foo-1.0.dist-info')
         self.checkLists(os.listdir(dist_info),
                         ['METADATA', 'RECORD', 'REQUESTED', 'INSTALLER'])
-        fp = open(os.path.join(dist_info, 'INSTALLER'))
-        try:
+        with open(os.path.join(dist_info, 'INSTALLER')) as fp:
             self.assertEqual(fp.read(), 'distutils')
-        finally:
-            fp.close()
-
-        fp = open(os.path.join(dist_info, 'REQUESTED'))
-        try:
+        with open(os.path.join(dist_info, 'REQUESTED')) as fp:
             self.assertEqual(fp.read(), '')
-        finally:
-            fp.close()
-
         meta_path = os.path.join(dist_info, 'METADATA')
         self.assertTrue(Metadata(path=meta_path).check())
 
@@ -91,11 +79,8 @@
         cmd.run()
 
         dist_info = os.path.join(install_dir, 'foo-1.0.dist-info')
-        fp = open(os.path.join(dist_info, 'INSTALLER'))
-        try:
+        with open(os.path.join(dist_info, 'INSTALLER')) as fp:
             self.assertEqual(fp.read(), 'bacon-python')
-        finally:
-            fp.close()
 
     def test_requested(self):
         pkg_dir, dist = self.create_dist(name='foo',
@@ -171,21 +156,15 @@
         # platform-dependent (line endings)
         metadata = os.path.join(modules_dest, 'Spamlib-0.1.dist-info',
                                 'METADATA')
-        fp = open(metadata, 'rb')
-        try:
+        with open(metadata, 'rb') as fp:
             content = fp.read()
-        finally:
-            fp.close()
 
         metadata_size = str(len(content))
         metadata_md5 = hashlib.md5(content).hexdigest()
 
         record = os.path.join(modules_dest, 'Spamlib-0.1.dist-info', 'RECORD')
-        fp = codecs.open(record, encoding='utf-8')
-        try:
+        with open(record, encoding='utf-8') as fp:
             content = fp.read()
-        finally:
-            fp.close()
 
         found = []
         for line in content.splitlines():
@@ -240,29 +219,23 @@
 
         expected = []
         for f in install.get_outputs():
-            if (f.endswith('.pyc') or f.endswith('.pyo') or f == os.path.join(
+            if (f.endswith(('.pyc', '.pyo')) or f == os.path.join(
                 install_dir, 'foo-1.0.dist-info', 'RECORD')):
                 expected.append([f, '', ''])
             else:
                 size = os.path.getsize(f)
                 md5 = hashlib.md5()
-                fp = open(f, 'rb')
-                try:
+                with open(f, 'rb') as fp:
                     md5.update(fp.read())
-                finally:
-                    fp.close()
                 hash = md5.hexdigest()
                 expected.append([f, hash, str(size)])
 
         parsed = []
-        f = open(os.path.join(dist_info, 'RECORD'), 'r')
-        try:
+        with open(os.path.join(dist_info, 'RECORD'), 'r') as f:
             reader = csv.reader(f, delimiter=',',
                                    lineterminator=os.linesep,
                                    quotechar='"')
             parsed = list(reader)
-        finally:
-            f.close()
 
         self.maxDiff = None
         self.checkLists(parsed, expected)
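[Editor's note: with the `_backport.hashlib` fallback gone, `hashlib` is imported unconditionally, and the RECORD checks pair it with `csv`. A compact sketch of hashing a file's bytes and round-tripping one RECORD-style row — the file name and contents below are made up for illustration:]

```python
import csv
import hashlib
import io

content = b"Metadata-Version: 1.2\n"
digest = hashlib.md5(content).hexdigest()

# Write one "path, hash, size" row the way install_distinfo does...
buf = io.StringIO()
writer = csv.writer(buf, delimiter=',', lineterminator='\n', quotechar='"')
writer.writerow(['foo-1.0.dist-info/METADATA', digest, str(len(content))])

# ...then parse it back, as the test above does with csv.reader.
rows = list(csv.reader(io.StringIO(buf.getvalue()), delimiter=','))
assert rows == [['foo-1.0.dist-info/METADATA', digest, '22']]
```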
diff --git a/distutils2/tests/test_command_install_lib.py b/distutils2/tests/test_command_install_lib.py
--- a/distutils2/tests/test_command_install_lib.py
+++ b/distutils2/tests/test_command_install_lib.py
@@ -33,8 +33,7 @@
         cmd.finalize_options()
         self.assertEqual(cmd.optimize, 2)
 
-    @unittest.skipIf(getattr(sys, 'dont_write_bytecode', False),
-                     'byte-compile disabled')
+    @unittest.skipIf(sys.dont_write_bytecode, 'byte-compile disabled')
     def test_byte_compile(self):
         pkg_dir, dist = self.create_dist()
         cmd = install_lib(dist)
@@ -83,8 +82,6 @@
         # get_input should return 2 elements
         self.assertEqual(len(cmd.get_inputs()), 2)
 
-    @unittest.skipUnless(hasattr(sys, 'dont_write_bytecode'),
-                         'sys.dont_write_bytecode not supported')
     def test_dont_write_bytecode(self):
         # makes sure byte_compile is not used
         pkg_dir, dist = self.create_dist()
diff --git a/distutils2/tests/test_command_install_scripts.py b/distutils2/tests/test_command_install_scripts.py
--- a/distutils2/tests/test_command_install_scripts.py
+++ b/distutils2/tests/test_command_install_scripts.py
@@ -38,11 +38,8 @@
 
         def write_script(name, text):
             expected.append(name)
-            f = open(os.path.join(source, name), "w")
-            try:
+            with open(os.path.join(source, name), "w") as f:
                 f.write(text)
-            finally:
-                f.close()
 
         write_script("script1.py", ("#! /usr/bin/env python2.3\n"
                                     "# bogus script w/ Python sh-bang\n"
diff --git a/distutils2/tests/test_command_register.py b/distutils2/tests/test_command_register.py
--- a/distutils2/tests/test_command_register.py
+++ b/distutils2/tests/test_command_register.py
@@ -1,8 +1,10 @@
-# encoding: utf-8
 """Tests for distutils2.command.register."""
 import os
 import getpass
-import urllib2
+import urllib.request
+import urllib.error
+import urllib.parse
+
 try:
     import docutils
     DOCUTILS_SUPPORT = True
@@ -36,7 +38,7 @@
 """
 
 
-class Inputs(object):
+class Inputs:
     """Fakes user inputs."""
     def __init__(self, *answers):
         self.answers = answers
@@ -49,7 +51,7 @@
             self.index += 1
 
 
-class FakeOpener(object):
+class FakeOpener:
     """Fakes a PyPI server"""
     def __init__(self):
         self.reqs = []
@@ -85,12 +87,12 @@
             return 'password'
 
         getpass.getpass = _getpass
-        self.old_opener = urllib2.build_opener
-        self.conn = urllib2.build_opener = FakeOpener()
+        self.old_opener = urllib.request.build_opener
+        self.conn = urllib.request.build_opener = FakeOpener()
 
     def tearDown(self):
         getpass.getpass = self._old_getpass
-        urllib2.build_opener = self.old_opener
+        urllib.request.build_opener = self.old_opener
         if hasattr(register_module, 'input'):
             del register_module.input
         super(RegisterTestCase, self).tearDown()
@@ -121,7 +123,7 @@
         # Password : 'password'
         # Save your login (y/N)? : 'y'
         inputs = Inputs('1', 'tarek', 'y')
-        register_module.raw_input = inputs
+        register_module.input = inputs
         cmd.ensure_finalized()
         cmd.run()
 
@@ -129,11 +131,8 @@
         self.assertTrue(os.path.exists(self.rc))
 
         # with the content similar to WANTED_PYPIRC
-        fp = open(self.rc)
-        try:
+        with open(self.rc) as fp:
             content = fp.read()
-        finally:
-            fp.close()
         self.assertEqual(content, WANTED_PYPIRC)
 
         # now let's make sure the .pypirc file generated
@@ -153,7 +152,7 @@
         req1 = dict(self.conn.reqs[0].headers)
         req2 = dict(self.conn.reqs[1].headers)
         self.assertEqual(req2['Content-length'], req1['Content-length'])
-        self.assertIn('xxx', self.conn.reqs[1].data)
+        self.assertIn(b'xxx', self.conn.reqs[1].data)
 
     def test_password_not_in_file(self):
 
@@ -171,7 +170,7 @@
         # this test runs choice 2
         cmd = self._get_cmd()
         inputs = Inputs('2', 'tarek', 'tarek@ziade.org')
-        register_module.raw_input = inputs
+        register_module.input = inputs
         # let's run the command
         # FIXME does this send a real request? use a mock server
         cmd.ensure_finalized()
@@ -182,13 +181,13 @@
         req = self.conn.reqs[0]
         headers = dict(req.headers)
         self.assertEqual(headers['Content-length'], '628')
-        self.assertIn('tarek', req.data)
+        self.assertIn(b'tarek', req.data)
 
     def test_password_reset(self):
         # this test runs choice 3
         cmd = self._get_cmd()
         inputs = Inputs('3', 'tarek@ziade.org')
-        register_module.raw_input = inputs
+        register_module.input = inputs
         cmd.ensure_finalized()
         cmd.run()
 
@@ -197,7 +196,7 @@
         req = self.conn.reqs[0]
         headers = dict(req.headers)
         self.assertEqual(headers['Content-length'], '298')
-        self.assertIn('tarek', req.data)
+        self.assertIn(b'tarek', req.data)
 
     @unittest.skipUnless(DOCUTILS_SUPPORT, 'needs docutils')
     def test_strict(self):
@@ -211,7 +210,7 @@
         cmd.ensure_finalized()
         cmd.strict = True
         inputs = Inputs('1', 'tarek', 'y')
-        register_module.raw_input = inputs
+        register_module.input = inputs
         self.assertRaises(PackagingSetupError, cmd.run)
 
         # metadata is OK but long_description is broken
@@ -232,7 +231,7 @@
         cmd.ensure_finalized()
         cmd.strict = True
         inputs = Inputs('1', 'tarek', 'y')
-        register_module.raw_input = inputs
+        register_module.input = inputs
         cmd.ensure_finalized()
         cmd.run()
 
@@ -240,7 +239,7 @@
         cmd = self._get_cmd()
         cmd.ensure_finalized()
         inputs = Inputs('1', 'tarek', 'y')
-        register_module.raw_input = inputs
+        register_module.input = inputs
         cmd.ensure_finalized()
         cmd.run()
 
diff --git a/distutils2/tests/test_command_sdist.py b/distutils2/tests/test_command_sdist.py
--- a/distutils2/tests/test_command_sdist.py
+++ b/distutils2/tests/test_command_sdist.py
@@ -214,11 +214,8 @@
         self.assertEqual(len(content), 10)
 
         # Checking the MANIFEST
-        fp = open(join(self.tmp_dir, 'MANIFEST'))
-        try:
+        with open(join(self.tmp_dir, 'MANIFEST')) as fp:
             manifest = fp.read()
-        finally:
-            fp.close()
         self.assertEqual(manifest, MANIFEST % {'sep': os.sep})
 
     @requires_zlib
@@ -334,12 +331,9 @@
 
         # Should produce four lines. Those lines are one comment, one default
         # (README) and two package files.
-        f = open(cmd.manifest)
-        try:
+        with open(cmd.manifest) as f:
             manifest = [line.strip() for line in f.read().split('\n')
                         if line.strip() != '']
-        finally:
-            f.close()
         self.assertEqual(len(manifest), 3)
 
         # Adding a file
@@ -352,12 +346,9 @@
 
         cmd.run()
 
-        f = open(cmd.manifest)
-        try:
+        with open(cmd.manifest) as f:
             manifest2 = [line.strip() for line in f.read().split('\n')
                          if line.strip() != '']
-        finally:
-            f.close()
 
         # Do we have the new file in MANIFEST?
         self.assertEqual(len(manifest2), 4)
@@ -370,12 +361,9 @@
         cmd.ensure_finalized()
         cmd.run()
 
-        f = open(cmd.manifest)
-        try:
+        with open(cmd.manifest) as f:
             manifest = [line.strip() for line in f.read().split('\n')
                         if line.strip() != '']
-        finally:
-            f.close()
 
         self.assertEqual(manifest[0],
                          '# file GENERATED by distutils2, do NOT edit')
@@ -388,12 +376,9 @@
         self.write_file((self.tmp_dir, cmd.manifest), 'README.manual')
         cmd.run()
 
-        f = open(cmd.manifest)
-        try:
+        with open(cmd.manifest) as f:
             manifest = [line.strip() for line in f.read().split('\n')
                         if line.strip() != '']
-        finally:
-            f.close()
 
         self.assertEqual(manifest, ['README.manual'])
 
@@ -404,11 +389,8 @@
         cmd.ensure_finalized()
         self.write_file((self.tmp_dir, 'yeah'), 'xxx')
         cmd.run()
-        f = open(cmd.manifest)
-        try:
+        with open(cmd.manifest) as f:
             content = f.read()
-        finally:
-            f.close()
 
         self.assertIn('yeah', content)
 
diff --git a/distutils2/tests/test_command_test.py b/distutils2/tests/test_command_test.py
--- a/distutils2/tests/test_command_test.py
+++ b/distutils2/tests/test_command_test.py
@@ -113,7 +113,7 @@
         record = []
         a_module.recorder = lambda *args: record.append("suite")
 
-        class MockTextTestRunner(object):
+        class MockTextTestRunner:
             def __init__(*_, **__):
                 pass
 
@@ -177,7 +177,7 @@
         self.assertEqual(["runner called"], record)
 
     def prepare_mock_ut2(self):
-        class MockUTClass(object):
+        class MockUTClass:
             def __init__(*_, **__):
                 pass
 
@@ -187,7 +187,7 @@
             def run(self, _):
                 pass
 
-        class MockUTModule(object):
+        class MockUTModule:
             TestLoader = MockUTClass
             TextTestRunner = MockUTClass
 
diff --git a/distutils2/tests/test_command_upload.py b/distutils2/tests/test_command_upload.py
--- a/distutils2/tests/test_command_upload.py
+++ b/distutils2/tests/test_command_upload.py
@@ -1,4 +1,3 @@
-# encoding: utf-8
 """Tests for distutils2.command.upload."""
 import os
 
@@ -44,6 +43,7 @@
 """
 
 
+@unittest.skipIf(threading is None, 'needs threading')
 class UploadTestCase(support.TempdirManager, support.EnvironRestorer,
                      support.LoggingCatcher, PyPIServerTestCase):
 
@@ -112,8 +112,8 @@
         # what did we send?
         handler, request_data = self.pypi.requests[-1]
         headers = handler.headers
-        self.assertIn('dédé', request_data)
-        self.assertIn('xxx', request_data)
+        self.assertIn('dédé'.encode('utf-8'), request_data)
+        self.assertIn(b'xxx', request_data)
 
         self.assertEqual(int(headers['content-length']), len(request_data))
         self.assertLess(int(headers['content-length']), 2500)
@@ -148,12 +148,8 @@
             "----------------GHSKFJDLGDS7543FJKLFHRE75642756743254"
             .encode())[1:4]
 
-        self.assertIn('name=":action"', action)
-        self.assertIn('doc_upload', action)
-
-
-UploadTestCase = unittest.skipIf(threading is None, 'needs threading')(
-        UploadTestCase)
+        self.assertIn(b'name=":action"', action)
+        self.assertIn(b'doc_upload', action)
 
 
 def test_suite():
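The literal changes above (`'xxx'` → `b'xxx'`) exist because HTTP request bodies are bytes in Python 3, and a `str` needle against a `bytes` haystack is a `TypeError` rather than a silent mismatch. A minimal illustration:

```python
body = "dédé".encode("utf-8") + b"&xxx=1"  # request bodies are bytes in 3.x

# a str needle against a bytes haystack raises TypeError in Python 3
mixed_types_rejected = False
try:
    "xxx" in body
except TypeError:
    mixed_types_rejected = True
```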
diff --git a/distutils2/tests/test_command_upload_docs.py b/distutils2/tests/test_command_upload_docs.py
--- a/distutils2/tests/test_command_upload_docs.py
+++ b/distutils2/tests/test_command_upload_docs.py
@@ -32,6 +32,7 @@
 """
 
 
+@unittest.skipIf(threading is None, "Needs threading")
 class UploadDocsTestCase(support.TempdirManager,
                          support.EnvironRestorer,
                          support.LoggingCatcher,
@@ -94,39 +95,39 @@
 
         self.assertEqual(len(self.pypi.requests), 1)
         handler, request_data = self.pypi.requests[-1]
-        self.assertIn("content", request_data)
+        self.assertIn(b"content", request_data)
         self.assertIn("Basic", handler.headers['authorization'])
         self.assertTrue(handler.headers['content-type']
             .startswith('multipart/form-data;'))
 
         action, name, version, content = request_data.split(
-            '----------------GHSKFJDLGDS7543FJKLFHRE75642756743254')[1:5]
+            b'----------------GHSKFJDLGDS7543FJKLFHRE75642756743254')[1:5]
 
         # check that we picked the right chunks
-        self.assertIn('name=":action"', action)
-        self.assertIn('name="name"', name)
-        self.assertIn('name="version"', version)
-        self.assertIn('name="content"', content)
+        self.assertIn(b'name=":action"', action)
+        self.assertIn(b'name="name"', name)
+        self.assertIn(b'name="version"', version)
+        self.assertIn(b'name="content"', content)
 
         # check their contents
-        self.assertIn('doc_upload', action)
-        self.assertIn('distr-name', name)
-        self.assertIn('docs/index.html', content)
-        self.assertIn('Ce mortel ennui', content)
+        self.assertIn(b'doc_upload', action)
+        self.assertIn(b'distr-name', name)
+        self.assertIn(b'docs/index.html', content)
+        self.assertIn(b'Ce mortel ennui', content)
 
     @unittest.skipIf(_ssl is None, 'Needs SSL support')
     def test_https_connection(self):
         self.https_called = False
         self.addCleanup(
-            setattr, upload_docs_mod.httplib, 'HTTPSConnection',
-            upload_docs_mod.httplib.HTTPSConnection)
+            setattr, upload_docs_mod.http.client, 'HTTPSConnection',
+            upload_docs_mod.http.client.HTTPSConnection)
 
         def https_conn_wrapper(*args):
             self.https_called = True
             # the testing server is http
-            return upload_docs_mod.httplib.HTTPConnection(*args)
+            return upload_docs_mod.http.client.HTTPConnection(*args)
 
-        upload_docs_mod.httplib.HTTPSConnection = https_conn_wrapper
+        upload_docs_mod.http.client.HTTPSConnection = https_conn_wrapper
 
         self.prepare_command()
         self.cmd.run()
@@ -174,9 +175,6 @@
         self.assertTrue(record, "should report the response")
         self.assertIn(self.pypi.default_response_data, record)
 
-UploadDocsTestCase = unittest.skipIf(threading is None, "Needs threading")(
-    UploadDocsTestCase)
-
 
 def test_suite():
     return unittest.makeSuite(UploadDocsTestCase)
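Both upload test files replace the post-hoc `Klass = skipIf(...)(Klass)` reassignment with a class decorator; the old spelling predates class-decorator syntax (added in 2.6) and the two are equivalent. A sketch with a stand-in condition:

```python
import unittest

missing = None  # stand-in for an unavailable module such as threading

@unittest.skipIf(missing is None, "needs threading")
class DecoratedCase(unittest.TestCase):
    def test_x(self):
        pass

class WrappedCase(unittest.TestCase):
    def test_x(self):
        pass

# older spelling with identical effect
WrappedCase = unittest.skipIf(missing is None, "needs threading")(WrappedCase)
```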
diff --git a/distutils2/tests/test_compiler.py b/distutils2/tests/test_compiler.py
--- a/distutils2/tests/test_compiler.py
+++ b/distutils2/tests/test_compiler.py
@@ -6,7 +6,7 @@
 from distutils2.tests import unittest, support
 
 
-class FakeCompiler(object):
+class FakeCompiler:
 
     name = 'fake'
     description = 'Fake'
@@ -36,7 +36,7 @@
         os.environ['ARFLAGS'] = '-arflags'
 
         # make sure AR gets caught
-        class compiler(object):
+        class compiler:
             name = 'unix'
 
             def set_executables(self, **kw):
diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py
--- a/distutils2/tests/test_config.py
+++ b/distutils2/tests/test_config.py
@@ -1,9 +1,8 @@
-# encoding: utf-8
 """Tests for distutils2.config."""
 import os
 import sys
 import logging
-from StringIO import StringIO
+from io import StringIO
 
 from distutils2 import command
 from distutils2.dist import Distribution
@@ -15,7 +14,7 @@
 from distutils2.tests.support import requires_zlib
 
 
-SETUP_CFG = u"""
+SETUP_CFG = """
 [metadata]
 name = RestingParrot
 version = 0.6.4
@@ -161,7 +160,7 @@
 """
 
 
-class DCompiler(object):
+class DCompiler:
     name = 'd'
     description = 'D Compiler'
 
@@ -181,7 +180,7 @@
     config['files']['modules'] += '\n third'
 
 
-class FooBarBazTest(object):
+class FooBarBazTest:
 
     def __init__(self, dist):
         self.distribution = dist
@@ -474,11 +473,8 @@
         cmd.finalize_options()
         cmd.get_file_list()
         cmd.make_distribution()
-        fp = open('MANIFEST')
-        try:
+        with open('MANIFEST') as fp:
             self.assertIn('README\nREADME2\n', fp.read())
-        finally:
-            fp.close()
 
     def test_sub_commands(self):
         self.write_setup()
diff --git a/distutils2/tests/test_create.py b/distutils2/tests/test_create.py
--- a/distutils2/tests/test_create.py
+++ b/distutils2/tests/test_create.py
@@ -1,9 +1,7 @@
-# encoding: utf-8
 """Tests for distutils2.create."""
 import os
 import sys
-import codecs
-from StringIO import StringIO
+from io import StringIO
 from textwrap import dedent
 from distutils2.create import MainProgram, ask_yn, ask, main
 from distutils2._backport import sysconfig
@@ -98,7 +96,7 @@
 
     def test_convert_setup_py_to_cfg(self):
         self.write_file((self.wdir, 'setup.py'),
-                        dedent(u"""
+                        dedent("""
         # coding: utf-8
         from distutils.core import setup
 
@@ -133,18 +131,15 @@
               scripts=['my_script', 'bin/run'],
               )
         """), encoding='utf-8')
-        sys.stdin.write(u'y\n')
+        sys.stdin.write('y\n')
         sys.stdin.seek(0)
         main()
 
         path = os.path.join(self.wdir, 'setup.cfg')
-        fp = codecs.open(path, encoding='utf-8')
-        try:
+        with open(path, encoding='utf-8') as fp:
             contents = fp.read()
-        finally:
-            fp.close()
 
-        self.assertEqual(contents, dedent(u"""\
+        self.assertEqual(contents, dedent("""\
             [metadata]
             name = pyxfoil
             version = 0.2
@@ -184,14 +179,11 @@
 
     def test_convert_setup_py_to_cfg_with_description_in_readme(self):
         self.write_file((self.wdir, 'setup.py'),
-                        dedent(u"""
+                        dedent("""
         # coding: utf-8
         from distutils.core import setup
-        fp = open('README.txt')
-        try:
+        with open('README.txt') as fp:
             long_description = fp.read()
-        finally:
-            fp.close()
 
         setup(name='pyxfoil',
               version='0.2',
@@ -221,13 +213,10 @@
         main()
 
         path = os.path.join(self.wdir, 'setup.cfg')
-        fp = codecs.open(path, encoding='utf-8')
-        try:
+        with open(path, encoding='utf-8') as fp:
             contents = fp.read()
-        finally:
-            fp.close()
 
-        self.assertEqual(contents, dedent(u"""\
+        self.assertEqual(contents, dedent("""\
             [metadata]
             name = pyxfoil
             version = 0.2
diff --git a/distutils2/tests/test_database.py b/distutils2/tests/test_database.py
--- a/distutils2/tests/test_database.py
+++ b/distutils2/tests/test_database.py
@@ -1,12 +1,10 @@
 import os
+import io
 import csv
 import sys
 import shutil
 import tempfile
-try:
-    from hashlib import md5
-except ImportError:
-    from distutils2._backport.hashlib import md5
+from hashlib import md5
 from textwrap import dedent
 
 from distutils2.tests.test_util import GlobTestCaseBase
@@ -28,11 +26,8 @@
 
 
 def get_hexdigest(filename):
-    fp = open(filename, 'rb')
-    try:
-        checksum = md5(fp.read())
-    finally:
-        fp.close()
+    with open(filename, 'rb') as file:
+        checksum = md5(file.read())
     return checksum.hexdigest()
 
 
@@ -43,7 +38,7 @@
     return path, digest, size
 
 
-class FakeDistsMixin(object):
+class FakeDistsMixin:
 
     def setUp(self):
         super(FakeDistsMixin, self).setUp()
@@ -65,11 +60,11 @@
         # shutil gives no control over the mode of directories :(
         # see http://bugs.python.org/issue1666318
         for root, dirs, files in os.walk(self.fake_dists_path):
-            os.chmod(root, 0755)
+            os.chmod(root, 0o755)
             for f in files:
-                os.chmod(os.path.join(root, f), 0644)
+                os.chmod(os.path.join(root, f), 0o644)
             for d in dirs:
-                os.chmod(os.path.join(root, d), 0755)
+                os.chmod(os.path.join(root, d), 0o755)
 
 
 class CommonDistributionTests(FakeDistsMixin):
@@ -138,10 +133,9 @@
         for distinfo_dir in self.dirs:
 
             record_file = os.path.join(distinfo_dir, 'RECORD')
-            fp = open(record_file, 'w')
-            try:
+            with open(record_file, 'w') as file:
                 record_writer = csv.writer(
-                    fp, delimiter=',', quoting=csv.QUOTE_NONE,
+                    file, delimiter=',', quoting=csv.QUOTE_NONE,
                     lineterminator='\n')
 
                 dist_location = distinfo_dir.replace('.dist-info', '')
@@ -152,12 +146,9 @@
                 for file in ('INSTALLER', 'METADATA', 'REQUESTED'):
                     record_writer.writerow(record_pieces((distinfo_dir, file)))
                 record_writer.writerow([record_file])
-            finally:
-                fp.close()
 
-            fp = open(record_file)
-            try:
-                record_reader = csv.reader(fp, lineterminator='\n')
+            with open(record_file) as file:
+                record_reader = csv.reader(file, lineterminator='\n')
                 record_data = {}
                 for row in record_reader:
                     if row == []:
@@ -165,8 +156,6 @@
                     path, md5_, size = (row[:] +
                                         [None for i in range(len(row), 3)])
                     record_data[path] = md5_, size
-            finally:
-                fp.close()
             self.records[distinfo_dir] = record_data
 
     def test_instantiation(self):
@@ -210,14 +199,11 @@
         ]
 
         for distfile in distinfo_files:
-            value = dist.get_distinfo_file(distfile)
-            try:
-                self.assertIsInstance(value, file)
+            with dist.get_distinfo_file(distfile) as value:
+                self.assertIsInstance(value, io.TextIOWrapper)
                 # Is it the correct file?
                 self.assertEqual(value.name,
                                  os.path.join(distinfo_dir, distfile))
-            finally:
-                value.close()
 
         # Test an absolute path that is part of another distributions dist-info
         other_distinfo_file = os.path.join(
@@ -640,8 +626,7 @@
         metadata_path = os.path.join(dist_info, 'METADATA')
         resources_path = os.path.join(dist_info, 'RESOURCES')
 
-        fp = open(metadata_path, 'w')
-        try:
+        with open(metadata_path, 'w') as fp:
             fp.write(dedent("""\
                 Metadata-Version: 1.2
                 Name: test
@@ -649,25 +634,18 @@
                 Summary: test
                 Author: me
                 """))
-        finally:
-            fp.close()
+
         test_path = 'test.cfg'
 
         fd, test_resource_path = tempfile.mkstemp()
         os.close(fd)
         self.addCleanup(os.remove, test_resource_path)
 
-        fp = open(test_resource_path, 'w')
-        try:
+        with open(test_resource_path, 'w') as fp:
             fp.write('Config')
-        finally:
-            fp.close()
 
-        fp = open(resources_path, 'w')
-        try:
+        with open(resources_path, 'w') as fp:
             fp.write('%s,%s' % (test_path, test_resource_path))
-        finally:
-            fp.close()
 
         # Add fake site-packages to sys.path to retrieve fake dist
         self.addCleanup(sys.path.remove, temp_site_packages)
@@ -682,11 +660,8 @@
                          test_resource_path)
         self.assertRaises(KeyError, get_file_path, dist_name, 'i-dont-exist')
 
-        fp = get_file(dist_name, test_path)
-        try:
+        with get_file(dist_name, test_path) as fp:
             self.assertEqual(fp.read(), 'Config')
-        finally:
-            fp.close()
         self.assertRaises(KeyError, get_file, dist_name, 'i-dont-exist')
 
 
diff --git a/distutils2/tests/test_depgraph.py b/distutils2/tests/test_depgraph.py
--- a/distutils2/tests/test_depgraph.py
+++ b/distutils2/tests/test_depgraph.py
@@ -2,7 +2,7 @@
 import os
 import re
 import sys
-from StringIO import StringIO
+from io import StringIO
 
 from distutils2 import depgraph
 from distutils2.database import get_distribution, enable_cache, disable_cache
diff --git a/distutils2/tests/test_dist.py b/distutils2/tests/test_dist.py
--- a/distutils2/tests/test_dist.py
+++ b/distutils2/tests/test_dist.py
@@ -1,7 +1,6 @@
 """Tests for distutils2.dist."""
 import os
 import sys
-import codecs
 import logging
 import textwrap
 
@@ -52,12 +51,9 @@
     def test_debug_mode(self):
         tmpdir = self.mkdtemp()
         setupcfg = os.path.join(tmpdir, 'setup.cfg')
-        f = open(setupcfg, "w")
-        try:
+        with open(setupcfg, "w") as f:
             f.write("[global]\n")
             f.write("command_packages = foo.bar, splat")
-        finally:
-            f.close()
 
         files = [setupcfg]
         sys.argv.append("build")
@@ -128,11 +124,8 @@
 
         temp_dir = self.mkdtemp()
         user_filename = os.path.join(temp_dir, user_filename)
-        f = open(user_filename, 'w')
-        try:
+        with open(user_filename, 'w') as f:
             f.write('.')
-        finally:
-            f.close()
 
         dist = Distribution()
 
@@ -148,11 +141,8 @@
         else:
             user_filename = os.path.join(temp_home, "pydistutils.cfg")
 
-        f = open(user_filename, 'w')
-        try:
+        with open(user_filename, 'w') as f:
             f.write('[distutils2]\n')
-        finally:
-            f.close()
 
         def _expander(path):
             return temp_home
diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py
--- a/distutils2/tests/test_install.py
+++ b/distutils2/tests/test_install.py
@@ -18,7 +18,7 @@
     use_xmlrpc_server = fake_dec
 
 
-class InstalledDist(object):
+class InstalledDist:
     """Distribution object, represent distributions currently installed on the
     system"""
     def __init__(self, name, version, deps):
@@ -33,7 +33,7 @@
         return '<InstalledDist %r>' % self.metadata['Name']
 
 
-class ToInstallDist(object):
+class ToInstallDist:
     """Distribution that will be installed"""
 
     def __init__(self, files=False):
@@ -63,7 +63,7 @@
         return self.list_installed_files()
 
 
-class MagicMock(object):
+class MagicMock:
     def __init__(self, return_value=None, raise_exception=False):
         self.called = False
         self._times_called = 0
diff --git a/distutils2/tests/test_manifest.py b/distutils2/tests/test_manifest.py
--- a/distutils2/tests/test_manifest.py
+++ b/distutils2/tests/test_manifest.py
@@ -1,7 +1,7 @@
 """Tests for distutils2.manifest."""
 import os
 import logging
-from StringIO import StringIO
+from io import StringIO
 from distutils2.manifest import Manifest
 
 from distutils2.tests import unittest, support
@@ -37,11 +37,8 @@
     def test_manifest_reader(self):
         tmpdir = self.mkdtemp()
         MANIFEST = os.path.join(tmpdir, 'MANIFEST.in')
-        f = open(MANIFEST, 'w')
-        try:
+        with open(MANIFEST, 'w') as f:
             f.write(_MANIFEST)
-        finally:
-            f.close()
 
         manifest = Manifest()
         manifest.read_template(MANIFEST)
@@ -54,11 +51,8 @@
             self.assertIn('no files found matching', warning)
 
         # manifest also accepts file-like objects
-        f = open(MANIFEST)
-        try:
+        with open(MANIFEST) as f:
             manifest.read_template(f)
-        finally:
-            f.close()
 
         # the manifest should have been read and 3 warnings issued
         # (we didn't provide the files)
diff --git a/distutils2/tests/test_markers.py b/distutils2/tests/test_markers.py
--- a/distutils2/tests/test_markers.py
+++ b/distutils2/tests/test_markers.py
@@ -2,7 +2,6 @@
 import os
 import sys
 import platform
-from distutils2.compat import python_implementation
 from distutils2.markers import interpret
 
 from distutils2.tests import unittest
@@ -18,7 +17,7 @@
         os_name = os.name
         platform_version = platform.version()
         platform_machine = platform.machine()
-        platform_python_implementation = python_implementation()
+        platform_python_implementation = platform.python_implementation()
 
         self.assertTrue(interpret("sys.platform == '%s'" % sys_platform))
         self.assertTrue(interpret(
diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py
--- a/distutils2/tests/test_metadata.py
+++ b/distutils2/tests/test_metadata.py
@@ -1,11 +1,9 @@
-# encoding: utf-8
 """Tests for distutils2.metadata."""
 import os
 import sys
-import codecs
 import logging
 from textwrap import dedent
-from StringIO import StringIO
+from io import StringIO
 
 from distutils2.errors import (MetadataConflictError, MetadataMissingError,
                                MetadataUnrecognizedVersionError)
@@ -37,12 +35,8 @@
 
     def test_instantiation(self):
         PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO')
-        f = codecs.open(PKG_INFO, 'r', encoding='utf-8')
-        try:
+        with open(PKG_INFO, 'r', encoding='utf-8') as f:
             contents = f.read()
-        finally:
-            f.close()
-
         fp = StringIO(contents)
 
         m = Metadata()
@@ -70,11 +64,8 @@
     def test_metadata_markers(self):
         # see if we can be platform-aware
         PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO')
-        f = codecs.open(PKG_INFO, 'r', encoding='utf-8')
-        try:
+        with open(PKG_INFO, 'r', encoding='utf-8') as f:
             content = f.read() % sys.platform
-        finally:
-            f.close()
         metadata = Metadata(platform_dependent=True)
 
         metadata.read_file(StringIO(content))
@@ -92,11 +83,8 @@
 
     def test_mapping_api(self):
         PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO')
-        f = codecs.open(PKG_INFO, 'r', encoding='utf-8')
-        try:
+        with open(PKG_INFO, 'r', encoding='utf-8') as f:
             content = f.read() % sys.platform
-        finally:
-            f.close()
         metadata = Metadata(fileobj=StringIO(content))
         self.assertIn('Version', metadata.keys())
         self.assertIn('0.5', metadata.values())
@@ -111,6 +99,8 @@
         metadata.update({'version': '1--2'})
         self.assertEqual(len(self.get_logs()), 1)
 
+        # XXX caveat: the keys method and friends are not 3.x-style views
+        # should be changed or documented
         self.assertEqual(list(metadata), metadata.keys())
 
     def test_read_metadata(self):
@@ -143,21 +133,18 @@
         tmp_dir = self.mkdtemp()
         my_file = os.path.join(tmp_dir, 'f')
 
-        metadata = Metadata(mapping={'author': u'Mister Café',
-                                     'name': u'my.project',
-                                     'author': u'Café Junior',
-                                     'summary': u'Café torréfié',
-                                     'description': u'Héhéhé',
-                                     'keywords': [u'café', u'coffee']})
+        metadata = Metadata(mapping={'author': 'Mister Café',
+                                     'name': 'my.project',
+                                     'author': 'Café Junior',
+                                     'summary': 'Café torréfié',
+                                     'description': 'Héhéhé',
+                                     'keywords': ['café', 'coffee']})
         metadata.write(my_file)
 
         # the file should use UTF-8
         metadata2 = Metadata()
-        fp = codecs.open(my_file, encoding='utf-8')
-        try:
+        with open(my_file, encoding='utf-8') as fp:
             metadata2.read_file(fp)
-        finally:
-            fp.close()
 
         # XXX when keywords are not defined, metadata will have
         # 'Keywords': [] but metadata2 will have 'Keywords': ['']
@@ -165,19 +152,16 @@
         self.assertEqual(metadata.items(), metadata2.items())
 
         # ASCII also works, it's a subset of UTF-8
-        metadata = Metadata(mapping={'author': u'Mister Cafe',
-                                     'name': u'my.project',
-                                     'author': u'Cafe Junior',
-                                     'summary': u'Cafe torrefie',
-                                     'description': u'Hehehe'})
+        metadata = Metadata(mapping={'author': 'Mister Cafe',
+                                     'name': 'my.project',
+                                     'author': 'Cafe Junior',
+                                     'summary': 'Cafe torrefie',
+                                     'description': 'Hehehe'})
         metadata.write(my_file)
 
         metadata2 = Metadata()
-        fp = codecs.open(my_file, encoding='utf-8')
-        try:
+        with open(my_file, encoding='utf-8') as fp:
             metadata2.read_file(fp)
-        finally:
-            fp.close()
 
     def test_metadata_read_write(self):
         PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO')
@@ -324,21 +308,15 @@
 
     def test_description(self):
         PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO')
-        f = codecs.open(PKG_INFO, 'r', encoding='utf-8')
-        try:
+        with open(PKG_INFO, 'r', encoding='utf-8') as f:
             content = f.read() % sys.platform
-        finally:
-            f.close()
         metadata = Metadata()
         metadata.read_file(StringIO(content))
 
         # see if we can read the description now
         DESC = os.path.join(os.path.dirname(__file__), 'LONG_DESC.txt')
-        f = open(DESC)
-        try:
+        with open(DESC) as f:
             wanted = f.read()
-        finally:
-            f.close()
         self.assertEqual(wanted, metadata['Description'])
 
         # save the file somewhere and make sure we can read it back
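The `codecs` import disappears from this file because Python 3's builtin `open` accepts an `encoding` argument directly, making `codecs.open` unnecessary for reading text. A sketch of the equivalence (the path is hypothetical):

```python
import codecs
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "PKG-INFO")  # hypothetical path
with open(path, "w", encoding="utf-8") as f:
    f.write("Author: Mister Café\n")

with codecs.open(path, encoding="utf-8") as f:  # legacy spelling, still works
    legacy = f.read()
with open(path, encoding="utf-8") as f:         # preferred in Python 3
    modern = f.read()
```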
diff --git a/distutils2/tests/test_mixin2to3.py b/distutils2/tests/test_mixin2to3.py
--- a/distutils2/tests/test_mixin2to3.py
+++ b/distutils2/tests/test_mixin2to3.py
@@ -9,30 +9,22 @@
                         support.LoggingCatcher,
                         unittest.TestCase):
 
-    @unittest.skipIf(sys.version < '2.6', 'requires Python 2.6 or higher')
     def test_convert_code_only(self):
         # used to check if code gets converted properly.
         code = "print 'test'"
 
-        fp = self.mktempfile()
-        try:
+        with self.mktempfile() as fp:
             fp.write(code)
-        finally:
-            fp.close()
 
         mixin2to3 = Mixin2to3()
         mixin2to3._run_2to3([fp.name])
         expected = "print('test')"
 
-        fp = open(fp.name)
-        try:
+        with open(fp.name) as fp:
             converted = fp.read()
-        finally:
-            fp.close()
 
         self.assertEqual(expected, converted)
 
-    @unittest.skipIf(sys.version < '2.6', 'requires Python 2.6 or higher')
     def test_doctests_only(self):
         # used to check if doctests gets converted properly.
         doctest = textwrap.dedent('''\
@@ -44,11 +36,8 @@
             It works.
             """''')
 
-        fp = self.mktempfile()
-        try:
+        with self.mktempfile() as fp:
             fp.write(doctest)
-        finally:
-            fp.close()
 
         mixin2to3 = Mixin2to3()
         mixin2to3._run_2to3([fp.name])
@@ -61,24 +50,17 @@
             It works.
             """\n''')
 
-        fp = open(fp.name)
-        try:
+        with open(fp.name) as fp:
             converted = fp.read()
-        finally:
-            fp.close()
 
         self.assertEqual(expected, converted)
 
-    @unittest.skipIf(sys.version < '2.6', 'requires Python 2.6 or higher')
     def test_additional_fixers(self):
         # used to check if use_2to3_fixers works
         code = 'type(x) is not T'
 
-        fp = self.mktempfile()
-        try:
+        with self.mktempfile() as fp:
             fp.write(code)
-        finally:
-            fp.close()
 
         mixin2to3 = Mixin2to3()
         mixin2to3._run_2to3(files=[fp.name], doctests=[fp.name],
@@ -86,11 +68,8 @@
 
         expected = 'not isinstance(x, T)'
 
-        fp = open(fp.name)
-        try:
+        with open(fp.name) as fp:
             converted = fp.read()
-        finally:
-            fp.close()
 
         self.assertEqual(expected, converted)
 
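`mktempfile` above is a distutils2 test-support helper; assuming it returns an open file object, the `with` blocks in these hunks can be sketched with the stdlib directly. `NamedTemporaryFile(delete=False)` lets the file be reopened by name after the `with` block closes it, which is what the converted tests rely on:

```python
import tempfile

# Stand-in for the mktempfile() test helper (assumption: it yields an
# open file object); delete=False keeps the file around for reopening.
with tempfile.NamedTemporaryFile(mode='w', suffix='.py', delete=False) as fp:
    fp.write("print 'test'")        # Python 2 syntax, as in the test

# The with block has closed fp, so the file can be reopened by name.
with open(fp.name) as f:
    source = f.read()
```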
diff --git a/distutils2/tests/test_msvc9compiler.py b/distutils2/tests/test_msvc9compiler.py
--- a/distutils2/tests/test_msvc9compiler.py
+++ b/distutils2/tests/test_msvc9compiler.py
@@ -118,22 +118,16 @@
         from distutils2.compiler.msvc9compiler import MSVCCompiler
         tempdir = self.mkdtemp()
         manifest = os.path.join(tempdir, 'manifest')
-        f = open(manifest, 'w')
-        try:
+        with open(manifest, 'w') as f:
             f.write(_MANIFEST)
-        finally:
-            f.close()
 
         compiler = MSVCCompiler()
         compiler._remove_visual_c_ref(manifest)
 
         # see what we got
-        f = open(manifest)
-        try:
+        with open(manifest) as f:
             # removing trailing spaces
             content = '\n'.join(line.rstrip() for line in f.readlines())
-        finally:
-            f.close()
 
         # makes sure the manifest was properly cleaned
         self.assertEqual(content, _CLEANED_MANIFEST)
diff --git a/distutils2/tests/test_pypi_server.py b/distutils2/tests/test_pypi_server.py
--- a/distutils2/tests/test_pypi_server.py
+++ b/distutils2/tests/test_pypi_server.py
@@ -1,5 +1,7 @@
 """Tests for distutils2.command.bdist."""
-import urllib2
+import urllib.request
+import urllib.parse
+import urllib.error
 
 try:
     import threading
@@ -13,6 +15,7 @@
 from distutils2.tests import unittest
 
 
+@unittest.skipIf(threading is None, "Needs threading")
 class PyPIServerTest(unittest.TestCase):
 
     def test_records_requests(self):
@@ -24,12 +27,13 @@
             server.start()
             self.assertEqual(len(server.requests), 0)
 
-            data = 'Rock Around The Bunker'
+            data = b'Rock Around The Bunker'
 
             headers = {"X-test-header": "Mister Iceberg"}
 
-            request = urllib2.Request(server.full_address, data, headers)
-            urllib2.urlopen(request)
+            request = urllib.request.Request(
+                server.full_address, data, headers)
+            urllib.request.urlopen(request)
             self.assertEqual(len(server.requests), 1)
             handler, request_data = server.requests[-1]
             self.assertIn(data, request_data)
@@ -48,14 +52,11 @@
             server is the same than the one made by a simple file read.
             """
             url = server.full_address + url_path
-            request = urllib2.Request(url)
-            response = urllib2.urlopen(request)
-            file = open(PYPI_DEFAULT_STATIC_PATH + "/test_pypi_server"
-                      + url_path)
-            try:
+            request = urllib.request.Request(url)
+            response = urllib.request.urlopen(request)
+            with open(PYPI_DEFAULT_STATIC_PATH + "/test_pypi_server"
+                      + url_path) as file:
                 return response.read().decode() == file.read()
-            finally:
-                file.close()
 
         server = PyPIServer(static_uri_paths=["simple", "external"],
             static_filesystem_paths=["test_pypi_server"])
@@ -63,10 +64,10 @@
         try:
             # the file does not exists on the disc, so it might not be served
             url = server.full_address + "/simple/unexisting_page"
-            request = urllib2.Request(url)
+            request = urllib.request.Request(url)
             try:
-                urllib2.urlopen(request)
-            except urllib2.HTTPError, e:
+                urllib.request.urlopen(request)
+            except urllib.error.HTTPError as e:
                 self.assertEqual(e.code, 404)
 
             # now try serving a content that do exists
@@ -80,10 +81,6 @@
             server.stop()
 
 
-PyPIServerTest = unittest.skipIf(threading is None, "Needs threading")(
-        PyPIServerTest)
-
-
 def test_suite():
     return unittest.makeSuite(PyPIServerTest)
 
diff --git a/distutils2/tests/test_pypi_simple.py b/distutils2/tests/test_pypi_simple.py
--- a/distutils2/tests/test_pypi_simple.py
+++ b/distutils2/tests/test_pypi_simple.py
@@ -2,8 +2,10 @@
 import re
 import os
 import sys
-import httplib
-import urllib2
+import http.client
+import urllib.error
+import urllib.parse
+import urllib.request
 
 from distutils2.pypi.simple import Crawler
 
@@ -12,7 +14,7 @@
                                       fake_dec)
 
 try:
-    import thread as _thread
+    import _thread
     from distutils2.tests.pypi_server import (use_pypi_server, PyPIServer,
                                               PYPI_DEFAULT_STATIC_PATH)
 except ImportError:
@@ -43,11 +45,11 @@
         url = 'http://127.0.0.1:0/nonesuch/test_simple'
         try:
             v = crawler._open_url(url)
-        except Exception, v:
+        except Exception as v:
             self.assertIn(url, str(v))
         else:
             v.close()
-            self.assertIsInstance(v, urllib2.HTTPError)
+            self.assertIsInstance(v, urllib.error.HTTPError)
 
         # issue 16
         # easy_install inquant.contentmirror.plone breaks because of a typo
@@ -57,35 +59,34 @@
                'inquant.contentmirror.plone/trunk')
         try:
             v = crawler._open_url(url)
-        except Exception, v:
+        except Exception as v:
             self.assertIn(url, str(v))
         else:
             v.close()
-            self.assertIsInstance(v, urllib2.HTTPError)
+            self.assertIsInstance(v, urllib.error.HTTPError)
 
         def _urlopen(*args):
-            raise httplib.BadStatusLine('line')
+            raise http.client.BadStatusLine('line')
 
-        old_urlopen = urllib2.urlopen
-        urllib2.urlopen = _urlopen
+        old_urlopen = urllib.request.urlopen
+        urllib.request.urlopen = _urlopen
         url = 'http://example.org'
         try:
-            try:
-                v = crawler._open_url(url)
-            except Exception, v:
-                self.assertIn('line', str(v))
-            else:
-                v.close()
-                # TODO use self.assertRaises
-                raise AssertionError('Should have raise here!')
+            v = crawler._open_url(url)
+        except Exception as v:
+            self.assertIn('line', str(v))
+        else:
+            v.close()
+            # TODO use self.assertRaises
+            raise AssertionError('Should have raise here!')
         finally:
-            urllib2.urlopen = old_urlopen
+            urllib.request.urlopen = old_urlopen
 
         # issue 20
         url = 'http://http://svn.pythonpaste.org/Paste/wphp/trunk'
         try:
             crawler._open_url(url)
-        except Exception, v:
+        except Exception as v:
             self.assertIn('nonnumeric port', str(v))
 
         # issue #160
@@ -274,22 +275,22 @@
         # Test that the simple link matcher yield the good links.
         generator = crawler._simple_link_matcher(content, crawler.index_url)
         self.assertEqual(('%stest/foobar-1.tar.gz#md5=abcdef' %
-                          crawler.index_url, True), generator.next())
-        self.assertEqual(('http://dl-link1', True), generator.next())
+                          crawler.index_url, True), next(generator))
+        self.assertEqual(('http://dl-link1', True), next(generator))
         self.assertEqual(('%stest' % crawler.index_url, False),
-                         generator.next())
-        self.assertRaises(StopIteration, generator.next)
+                         next(generator))
+        self.assertRaises(StopIteration, generator.__next__)
 
         # Follow the external links is possible (eg. homepages)
         crawler.follow_externals = True
         generator = crawler._simple_link_matcher(content, crawler.index_url)
         self.assertEqual(('%stest/foobar-1.tar.gz#md5=abcdef' %
-                          crawler.index_url, True), generator.next())
-        self.assertEqual(('http://dl-link1', True), generator.next())
-        self.assertEqual(('http://dl-link2', False), generator.next())
+                          crawler.index_url, True), next(generator))
+        self.assertEqual(('http://dl-link1', True), next(generator))
+        self.assertEqual(('http://dl-link2', False), next(generator))
         self.assertEqual(('%stest' % crawler.index_url, False),
-                         generator.next())
-        self.assertRaises(StopIteration, generator.next)
+                         next(generator))
+        self.assertRaises(StopIteration, generator.__next__)
 
     def test_browse_local_files(self):
         # Test that we can browse local files"""
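Python 3 renamed the generator method `next()` to `__next__()` and routes calls through the `next()` builtin, which is what the crawler assertions above switch to; a minimal illustration:

```python
# Python 2: gen.next(); Python 3: next(gen), with gen.__next__() underneath
links = (url for url in ['http://dl-link1', 'http://dl-link2'])

first = next(links)          # preferred spelling
second = links.__next__()    # same protocol, called directly

try:
    next(links)
    exhausted = False
except StopIteration:
    exhausted = True
```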
diff --git a/distutils2/tests/test_pypi_xmlrpc.py b/distutils2/tests/test_pypi_xmlrpc.py
--- a/distutils2/tests/test_pypi_xmlrpc.py
+++ b/distutils2/tests/test_pypi_xmlrpc.py
@@ -13,6 +13,7 @@
     use_xmlrpc_server = fake_dec
 
 
+@unittest.skipIf(threading is None, "Needs threading")
 class TestXMLRPCClient(unittest.TestCase):
     def _get_client(self, server, *args, **kwargs):
         return Client(server.full_address, *args, **kwargs)
@@ -91,10 +92,6 @@
         self.assertEqual(['FooFoo'], release.metadata['obsoletes_dist'])
 
 
-TestXMLRPCClient = unittest.skipIf(threading is None, "Needs threading")(
-        TestXMLRPCClient)
-
-
 def test_suite():
     suite = unittest.TestSuite()
     suite.addTest(unittest.makeSuite(TestXMLRPCClient))
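The hunks above replace the Python 2.5-era pattern of rebinding a class through `skipIf` after its definition with a class decorator, supported since unittest in 2.6/3.1. A small sketch of the decorator form (the condition here stands in for `threading is None`):

```python
import unittest

condition = True  # stands in for "threading is None"

@unittest.skipIf(condition, "Needs threading")
class ExampleTest(unittest.TestCase):
    def test_something(self):
        self.fail('never runs when the class is skipped')

# Run the suite; the lone test is recorded as skipped, not failed.
result = unittest.TestResult()
unittest.TestLoader().loadTestsFromTestCase(ExampleTest).run(result)
```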
diff --git a/distutils2/tests/test_run.py b/distutils2/tests/test_run.py
--- a/distutils2/tests/test_run.py
+++ b/distutils2/tests/test_run.py
@@ -2,7 +2,7 @@
 
 import os
 import sys
-from StringIO import StringIO
+from io import StringIO
 
 from distutils2 import install
 from distutils2.tests import unittest, support
@@ -71,12 +71,11 @@
         else:
             pythonpath = d2parent
 
-        status, out, err = assert_python_ok(
-            '-c', 'from distutils2.run import main; main()', '--help',
-            PYTHONPATH=pythonpath)
+        status, out, err = assert_python_ok('-m', 'distutils2.run', '--help',
+                                            PYTHONPATH=pythonpath)
         self.assertEqual(status, 0)
-        self.assertGreater(out, '')
-        self.assertEqual(err, '')
+        self.assertGreater(out, b'')
+        self.assertEqual(err, b'')
 
 
 def test_suite():
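`assert_python_ok` (a helper from the stdlib test suite) returns the child's stdout and stderr, which in Python 3 are `bytes` rather than `str`, hence the `b''` comparisons above. The same behavior with plain `subprocess`:

```python
import subprocess
import sys

# Child-process output arrives as bytes in Python 3, not str
proc = subprocess.run([sys.executable, '-c', 'print("hi")'],
                      capture_output=True)

out_is_bytes = isinstance(proc.stdout, bytes)
```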
diff --git a/distutils2/tests/test_uninstall.py b/distutils2/tests/test_uninstall.py
--- a/distutils2/tests/test_uninstall.py
+++ b/distutils2/tests/test_uninstall.py
@@ -1,7 +1,7 @@
 """Tests for the uninstall command."""
 import os
 import sys
-from StringIO import StringIO
+from io import StringIO
 import stat
 import distutils2.util
 
diff --git a/distutils2/tests/test_util.py b/distutils2/tests/test_util.py
--- a/distutils2/tests/test_util.py
+++ b/distutils2/tests/test_util.py
@@ -5,7 +5,7 @@
 import logging
 import tempfile
 import subprocess
-from StringIO import StringIO
+from io import StringIO
 
 from distutils2.tests import support, unittest
 from distutils2.tests.test_config import SETUP_CFG
@@ -55,24 +55,24 @@
 """
 
 EXPECTED_MULTIPART_OUTPUT = [
-    '---x',
-    'Content-Disposition: form-data; name="username"',
-    '',
-    'wok',
-    '---x',
-    'Content-Disposition: form-data; name="password"',
-    '',
-    'secret',
-    '---x',
-    'Content-Disposition: form-data; name="picture"; filename="wok.png"',
-    '',
-    'PNG89',
-    '---x--',
-    '',
+    b'---x',
+    b'Content-Disposition: form-data; name="username"',
+    b'',
+    b'wok',
+    b'---x',
+    b'Content-Disposition: form-data; name="password"',
+    b'',
+    b'secret',
+    b'---x',
+    b'Content-Disposition: form-data; name="picture"; filename="wok.png"',
+    b'',
+    b'PNG89',
+    b'---x--',
+    b'',
 ]
 
 
-class FakePopen(object):
+class FakePopen:
     test_class = None
 
     def __init__(self, args, bufsize=0, executable=None,
@@ -82,7 +82,7 @@
                  startupinfo=None, creationflags=0,
                  restore_signals=True, start_new_session=False,
                  pass_fds=()):
-        if isinstance(args, basestring):
+        if isinstance(args, str):
             args = args.split()
         self.cmd = args[0]
         exes = self.test_class._exes
@@ -319,8 +319,6 @@
         res = get_compiler_versions()
         self.assertEqual(res[2], None)
 
-    @unittest.skipUnless(hasattr(sys, 'dont_write_bytecode'),
-                         'sys.dont_write_bytecode not supported')
     def test_dont_write_bytecode(self):
         # makes sure byte_compile raise a PackagingError
         # if sys.dont_write_bytecode is True
@@ -377,7 +375,7 @@
                                         'pkg1.pkg3.pkg6']))
 
     def test_resolve_name(self):
-        self.assertIs(str, resolve_name('__builtin__.str'))
+        self.assertIs(str, resolve_name('builtins.str'))
         self.assertEqual(
             UtilTestCase.__name__,
             resolve_name("distutils2.tests.test_util.UtilTestCase").__name__)
@@ -407,7 +405,6 @@
         finally:
             sys.path.remove(tmp_dir)
 
-    @unittest.skipIf(sys.version < '2.6', 'requires Python 2.6 or higher')
     def test_run_2to3_on_code(self):
         content = "print 'test'"
         converted_content = "print('test')"
@@ -421,7 +418,6 @@
         file_handle.close()
         self.assertEqual(new_content, converted_content)
 
-    @unittest.skipIf(sys.version < '2.6', 'requires Python 2.6 or higher')
     def test_run_2to3_on_doctests(self):
         # to check if text files containing doctests only get converted.
         content = ">>> print 'test'\ntest\n"
@@ -448,24 +444,24 @@
         if os.name == 'posix':
             exe = os.path.join(tmpdir, 'foo.sh')
             self.write_file(exe, '#!/bin/sh\nexit 1')
-            os.chmod(exe, 0777)
+            os.chmod(exe, 0o777)
         else:
             exe = os.path.join(tmpdir, 'foo.bat')
             self.write_file(exe, 'exit 1')
 
-        os.chmod(exe, 0777)
+        os.chmod(exe, 0o777)
         self.assertRaises(PackagingExecError, spawn, [exe])
 
         # now something that works
         if os.name == 'posix':
             exe = os.path.join(tmpdir, 'foo.sh')
             self.write_file(exe, '#!/bin/sh\nexit 0')
-            os.chmod(exe, 0777)
+            os.chmod(exe, 0o777)
         else:
             exe = os.path.join(tmpdir, 'foo.bat')
             self.write_file(exe, 'exit 0')
 
-        os.chmod(exe, 0777)
+        os.chmod(exe, 0o777)
         spawn([exe])  # should work without any error
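Python 3 dropped the bare `0777` octal notation in favor of the explicit `0o777` prefix (the old spelling is now a SyntaxError); the value is unchanged, as a quick check on a throwaway file shows (POSIX semantics assumed for the `chmod` round-trip):

```python
import os
import stat
import tempfile

# 0o777 == 511 == rwxrwxrwx; the old 0777 spelling no longer parses
mode = 0o777

path = tempfile.mkstemp()[1]
os.chmod(path, mode)
actual = stat.S_IMODE(os.stat(path).st_mode)
os.unlink(path)
```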
 
     def test_server_registration(self):
@@ -497,11 +493,8 @@
         self.assertFalse(os.path.exists(rc))
         generate_pypirc('tarek', 'xxx')
         self.assertTrue(os.path.exists(rc))
-        f = open(rc)
-        try:
+        with open(rc) as f:
             content = f.read()
-        finally:
-            f.close()
         self.assertEqual(content, WANTED)
 
     def test_cfg_to_args(self):
@@ -538,10 +531,10 @@
 
     def test_encode_multipart(self):
         fields = [('username', 'wok'), ('password', 'secret')]
-        files = [('picture', 'wok.png', 'PNG89')]
-        content_type, body = encode_multipart(fields, files, '-x')
-        self.assertEqual('multipart/form-data; boundary=-x', content_type)
-        self.assertEqual(EXPECTED_MULTIPART_OUTPUT, body.split('\r\n'))
+        files = [('picture', 'wok.png', b'PNG89')]
+        content_type, body = encode_multipart(fields, files, b'-x')
+        self.assertEqual(b'multipart/form-data; boundary=-x', content_type)
+        self.assertEqual(EXPECTED_MULTIPART_OUTPUT, body.split(b'\r\n'))
 
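The test now passes `bytes` for the file payload and the boundary and checks a `bytes` body. A minimal sketch of what an `encode_multipart` helper like the one under test might do (assumption: distutils2's real implementation differs in detail, e.g. in header handling):

```python
def encode_multipart(fields, files, boundary=b'-x'):
    """Build a multipart/form-data body; text field values are str,
    file payloads and the boundary are bytes, so the body is bytes."""
    lines = []
    for name, value in fields:
        lines.extend([
            b'--' + boundary,
            ('Content-Disposition: form-data; name="%s"' % name).encode('utf-8'),
            b'',
            value.encode('utf-8'),
        ])
    for name, filename, content in files:
        lines.extend([
            b'--' + boundary,
            ('Content-Disposition: form-data; name="%s"; filename="%s"'
             % (name, filename)).encode('utf-8'),
            b'',
            content,
        ])
    lines.extend([b'--' + boundary + b'--', b''])
    return b'multipart/form-data; boundary=' + boundary, b'\r\n'.join(lines)

content_type, body = encode_multipart(
    [('username', 'wok'), ('password', 'secret')],
    [('picture', 'wok.png', b'PNG89')])
```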
 
 class GlobTestCaseBase(support.TempdirManager,
@@ -787,21 +780,16 @@
             dir_paths.append(path)
         for f in files:
             path = os.path.join(tempdir, f)
-            _f = open(path, 'w')
-            try:
+            with open(path, 'w') as _f:
                 _f.write(f)
-            finally:
-                _f.close()
             file_paths.append(path)
 
-        record_file = open(record_file_path, 'w')
-        try:
+        with open(record_file_path, 'w') as record_file:
             for fpath in file_paths:
                 record_file.write(fpath + '\n')
             for dpath in dir_paths:
                 record_file.write(dpath + '\n')
-        finally:
-            record_file.close()
+
         return (tempdir, record_file_path)
 
 
diff --git a/distutils2/tests/xxmodule.c b/distutils2/tests/xxmodule.c
--- a/distutils2/tests/xxmodule.c
+++ b/distutils2/tests/xxmodule.c
@@ -10,7 +10,7 @@
    your own types of attributes instead.  Maybe you want to name your
    local variables other than 'self'.  If your object type is needed in
    other files, you'll have to create a file "foobarobject.h"; see
-   intobject.h for an example. */
+   floatobject.h for an example. */
 
 /* Xxo objects */
 
@@ -19,23 +19,23 @@
 static PyObject *ErrorObject;
 
 typedef struct {
-	PyObject_HEAD
-	PyObject	*x_attr;	/* Attributes dictionary */
+    PyObject_HEAD
+    PyObject            *x_attr;        /* Attributes dictionary */
 } XxoObject;
 
 static PyTypeObject Xxo_Type;
 
-#define XxoObject_Check(v)	(Py_TYPE(v) == &Xxo_Type)
+#define XxoObject_Check(v)      (Py_TYPE(v) == &Xxo_Type)
 
 static XxoObject *
 newXxoObject(PyObject *arg)
 {
-	XxoObject *self;
-	self = PyObject_New(XxoObject, &Xxo_Type);
-	if (self == NULL)
-		return NULL;
-	self->x_attr = NULL;
-	return self;
+    XxoObject *self;
+    self = PyObject_New(XxoObject, &Xxo_Type);
+    if (self == NULL)
+        return NULL;
+    self->x_attr = NULL;
+    return self;
 }
 
 /* Xxo methods */
@@ -43,101 +43,101 @@
 static void
 Xxo_dealloc(XxoObject *self)
 {
-	Py_XDECREF(self->x_attr);
-	PyObject_Del(self);
+    Py_XDECREF(self->x_attr);
+    PyObject_Del(self);
 }
 
 static PyObject *
 Xxo_demo(XxoObject *self, PyObject *args)
 {
-	if (!PyArg_ParseTuple(args, ":demo"))
-		return NULL;
-	Py_INCREF(Py_None);
-	return Py_None;
+    if (!PyArg_ParseTuple(args, ":demo"))
+        return NULL;
+    Py_INCREF(Py_None);
+    return Py_None;
 }
 
 static PyMethodDef Xxo_methods[] = {
-	{"demo",	(PyCFunction)Xxo_demo,	METH_VARARGS,
-		PyDoc_STR("demo() -> None")},
-	{NULL,		NULL}		/* sentinel */
+    {"demo",            (PyCFunction)Xxo_demo,  METH_VARARGS,
+        PyDoc_STR("demo() -> None")},
+    {NULL,              NULL}           /* sentinel */
 };
 
 static PyObject *
-Xxo_getattr(XxoObject *self, char *name)
+Xxo_getattro(XxoObject *self, PyObject *name)
 {
-	if (self->x_attr != NULL) {
-		PyObject *v = PyDict_GetItemString(self->x_attr, name);
-		if (v != NULL) {
-			Py_INCREF(v);
-			return v;
-		}
-	}
-	return Py_FindMethod(Xxo_methods, (PyObject *)self, name);
+    if (self->x_attr != NULL) {
+        PyObject *v = PyDict_GetItem(self->x_attr, name);
+        if (v != NULL) {
+            Py_INCREF(v);
+            return v;
+        }
+    }
+    return PyObject_GenericGetAttr((PyObject *)self, name);
 }
 
 static int
 Xxo_setattr(XxoObject *self, char *name, PyObject *v)
 {
-	if (self->x_attr == NULL) {
-		self->x_attr = PyDict_New();
-		if (self->x_attr == NULL)
-			return -1;
-	}
-	if (v == NULL) {
-		int rv = PyDict_DelItemString(self->x_attr, name);
-		if (rv < 0)
-			PyErr_SetString(PyExc_AttributeError,
-			        "delete non-existing Xxo attribute");
-		return rv;
-	}
-	else
-		return PyDict_SetItemString(self->x_attr, name, v);
+    if (self->x_attr == NULL) {
+        self->x_attr = PyDict_New();
+        if (self->x_attr == NULL)
+            return -1;
+    }
+    if (v == NULL) {
+        int rv = PyDict_DelItemString(self->x_attr, name);
+        if (rv < 0)
+            PyErr_SetString(PyExc_AttributeError,
+                "delete non-existing Xxo attribute");
+        return rv;
+    }
+    else
+        return PyDict_SetItemString(self->x_attr, name, v);
 }
 
 static PyTypeObject Xxo_Type = {
-	/* The ob_type field must be initialized in the module init function
-	 * to be portable to Windows without using C++. */
-	PyVarObject_HEAD_INIT(NULL, 0)
-	"xxmodule.Xxo",		/*tp_name*/
-	sizeof(XxoObject),	/*tp_basicsize*/
-	0,			/*tp_itemsize*/
-	/* methods */
-	(destructor)Xxo_dealloc, /*tp_dealloc*/
-	0,			/*tp_print*/
-	(getattrfunc)Xxo_getattr, /*tp_getattr*/
-	(setattrfunc)Xxo_setattr, /*tp_setattr*/
-	0,			/*tp_compare*/
-	0,			/*tp_repr*/
-	0,			/*tp_as_number*/
-	0,			/*tp_as_sequence*/
-	0,			/*tp_as_mapping*/
-	0,			/*tp_hash*/
-        0,                      /*tp_call*/
-        0,                      /*tp_str*/
-        0,                      /*tp_getattro*/
-        0,                      /*tp_setattro*/
-        0,                      /*tp_as_buffer*/
-        Py_TPFLAGS_DEFAULT,     /*tp_flags*/
-        0,                      /*tp_doc*/
-        0,                      /*tp_traverse*/
-        0,                      /*tp_clear*/
-        0,                      /*tp_richcompare*/
-        0,                      /*tp_weaklistoffset*/
-        0,                      /*tp_iter*/
-        0,                      /*tp_iternext*/
-        0,                      /*tp_methods*/
-        0,                      /*tp_members*/
-        0,                      /*tp_getset*/
-        0,                      /*tp_base*/
-        0,                      /*tp_dict*/
-        0,                      /*tp_descr_get*/
-        0,                      /*tp_descr_set*/
-        0,                      /*tp_dictoffset*/
-        0,                      /*tp_init*/
-        0,                      /*tp_alloc*/
-        0,                      /*tp_new*/
-        0,                      /*tp_free*/
-        0,                      /*tp_is_gc*/
+    /* The ob_type field must be initialized in the module init function
+     * to be portable to Windows without using C++. */
+    PyVarObject_HEAD_INIT(NULL, 0)
+    "xxmodule.Xxo",             /*tp_name*/
+    sizeof(XxoObject),          /*tp_basicsize*/
+    0,                          /*tp_itemsize*/
+    /* methods */
+    (destructor)Xxo_dealloc, /*tp_dealloc*/
+    0,                          /*tp_print*/
+    (getattrfunc)0,         /*tp_getattr*/
+    (setattrfunc)Xxo_setattr, /*tp_setattr*/
+    0,                          /*tp_reserved*/
+    0,                          /*tp_repr*/
+    0,                          /*tp_as_number*/
+    0,                          /*tp_as_sequence*/
+    0,                          /*tp_as_mapping*/
+    0,                          /*tp_hash*/
+    0,                      /*tp_call*/
+    0,                      /*tp_str*/
+    (getattrofunc)Xxo_getattro, /*tp_getattro*/
+    0,                      /*tp_setattro*/
+    0,                      /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT,     /*tp_flags*/
+    0,                      /*tp_doc*/
+    0,                      /*tp_traverse*/
+    0,                      /*tp_clear*/
+    0,                      /*tp_richcompare*/
+    0,                      /*tp_weaklistoffset*/
+    0,                      /*tp_iter*/
+    0,                      /*tp_iternext*/
+    Xxo_methods,            /*tp_methods*/
+    0,                      /*tp_members*/
+    0,                      /*tp_getset*/
+    0,                      /*tp_base*/
+    0,                      /*tp_dict*/
+    0,                      /*tp_descr_get*/
+    0,                      /*tp_descr_set*/
+    0,                      /*tp_dictoffset*/
+    0,                      /*tp_init*/
+    0,                      /*tp_alloc*/
+    0,                      /*tp_new*/
+    0,                      /*tp_free*/
+    0,                      /*tp_is_gc*/
 };
 /* --------------------------------------------------------------------- */
 
@@ -151,12 +151,12 @@
 static PyObject *
 xx_foo(PyObject *self, PyObject *args)
 {
-	long i, j;
-	long res;
-	if (!PyArg_ParseTuple(args, "ll:foo", &i, &j))
-		return NULL;
-	res = i+j; /* XXX Do something here */
-	return PyInt_FromLong(res);
+    long i, j;
+    long res;
+    if (!PyArg_ParseTuple(args, "ll:foo", &i, &j))
+        return NULL;
+    res = i+j; /* XXX Do something here */
+    return PyLong_FromLong(res);
 }
 
 
@@ -165,14 +165,14 @@
 static PyObject *
 xx_new(PyObject *self, PyObject *args)
 {
-	XxoObject *rv;
+    XxoObject *rv;
 
-	if (!PyArg_ParseTuple(args, ":new"))
-		return NULL;
-	rv = newXxoObject(args);
-	if (rv == NULL)
-		return NULL;
-	return (PyObject *)rv;
+    if (!PyArg_ParseTuple(args, ":new"))
+        return NULL;
+    rv = newXxoObject(args);
+    if (rv == NULL)
+        return NULL;
+    return (PyObject *)rv;
 }
 
 /* Example with subtle bug from extensions manual ("Thin Ice"). */
@@ -180,20 +180,20 @@
 static PyObject *
 xx_bug(PyObject *self, PyObject *args)
 {
-	PyObject *list, *item;
+    PyObject *list, *item;
 
-	if (!PyArg_ParseTuple(args, "O:bug", &list))
-		return NULL;
+    if (!PyArg_ParseTuple(args, "O:bug", &list))
+        return NULL;
 
-	item = PyList_GetItem(list, 0);
-	/* Py_INCREF(item); */
-	PyList_SetItem(list, 1, PyInt_FromLong(0L));
-	PyObject_Print(item, stdout, 0);
-	printf("\n");
-	/* Py_DECREF(item); */
+    item = PyList_GetItem(list, 0);
+    /* Py_INCREF(item); */
+    PyList_SetItem(list, 1, PyLong_FromLong(0L));
+    PyObject_Print(item, stdout, 0);
+    printf("\n");
+    /* Py_DECREF(item); */
 
-	Py_INCREF(Py_None);
-	return Py_None;
+    Py_INCREF(Py_None);
+    return Py_None;
 }
 
 /* Test bad format character */
@@ -201,61 +201,61 @@
 static PyObject *
 xx_roj(PyObject *self, PyObject *args)
 {
-	PyObject *a;
-	long b;
-	if (!PyArg_ParseTuple(args, "O#:roj", &a, &b))
-		return NULL;
-	Py_INCREF(Py_None);
-	return Py_None;
+    PyObject *a;
+    long b;
+    if (!PyArg_ParseTuple(args, "O#:roj", &a, &b))
+        return NULL;
+    Py_INCREF(Py_None);
+    return Py_None;
 }
 
 
 /* ---------- */
 
 static PyTypeObject Str_Type = {
-	/* The ob_type field must be initialized in the module init function
-	 * to be portable to Windows without using C++. */
-	PyVarObject_HEAD_INIT(NULL, 0)
-	"xxmodule.Str",		/*tp_name*/
-	0,			/*tp_basicsize*/
-	0,			/*tp_itemsize*/
-	/* methods */
-	0,			/*tp_dealloc*/
-	0,			/*tp_print*/
-	0,			/*tp_getattr*/
-	0,			/*tp_setattr*/
-	0,			/*tp_compare*/
-	0,			/*tp_repr*/
-	0,			/*tp_as_number*/
-	0,			/*tp_as_sequence*/
-	0,			/*tp_as_mapping*/
-	0,			/*tp_hash*/
-	0,			/*tp_call*/
-	0,			/*tp_str*/
-	0,			/*tp_getattro*/
-	0,			/*tp_setattro*/
-	0,			/*tp_as_buffer*/
-	Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, /*tp_flags*/
-	0,			/*tp_doc*/
-	0,			/*tp_traverse*/
-	0,			/*tp_clear*/
-	0,			/*tp_richcompare*/
-	0,			/*tp_weaklistoffset*/
-	0,			/*tp_iter*/
-	0,			/*tp_iternext*/
-	0,			/*tp_methods*/
-	0,			/*tp_members*/
-	0,			/*tp_getset*/
-	0, /* see initxx */	/*tp_base*/
-	0,			/*tp_dict*/
-	0,			/*tp_descr_get*/
-	0,			/*tp_descr_set*/
-	0,			/*tp_dictoffset*/
-	0,			/*tp_init*/
-	0,			/*tp_alloc*/
-	0,			/*tp_new*/
-	0,			/*tp_free*/
-	0,			/*tp_is_gc*/
+    /* The ob_type field must be initialized in the module init function
+     * to be portable to Windows without using C++. */
+    PyVarObject_HEAD_INIT(NULL, 0)
+    "xxmodule.Str",             /*tp_name*/
+    0,                          /*tp_basicsize*/
+    0,                          /*tp_itemsize*/
+    /* methods */
+    0,                          /*tp_dealloc*/
+    0,                          /*tp_print*/
+    0,                          /*tp_getattr*/
+    0,                          /*tp_setattr*/
+    0,                          /*tp_reserved*/
+    0,                          /*tp_repr*/
+    0,                          /*tp_as_number*/
+    0,                          /*tp_as_sequence*/
+    0,                          /*tp_as_mapping*/
+    0,                          /*tp_hash*/
+    0,                          /*tp_call*/
+    0,                          /*tp_str*/
+    0,                          /*tp_getattro*/
+    0,                          /*tp_setattro*/
+    0,                          /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, /*tp_flags*/
+    0,                          /*tp_doc*/
+    0,                          /*tp_traverse*/
+    0,                          /*tp_clear*/
+    0,                          /*tp_richcompare*/
+    0,                          /*tp_weaklistoffset*/
+    0,                          /*tp_iter*/
+    0,                          /*tp_iternext*/
+    0,                          /*tp_methods*/
+    0,                          /*tp_members*/
+    0,                          /*tp_getset*/
+    0, /* see PyInit_xx */      /*tp_base*/
+    0,                          /*tp_dict*/
+    0,                          /*tp_descr_get*/
+    0,                          /*tp_descr_set*/
+    0,                          /*tp_dictoffset*/
+    0,                          /*tp_init*/
+    0,                          /*tp_alloc*/
+    0,                          /*tp_new*/
+    0,                          /*tp_free*/
+    0,                          /*tp_is_gc*/
 };
 
 /* ---------- */
@@ -263,54 +263,54 @@
 static PyObject *
 null_richcompare(PyObject *self, PyObject *other, int op)
 {
-	Py_INCREF(Py_NotImplemented);
-	return Py_NotImplemented;
+    Py_INCREF(Py_NotImplemented);
+    return Py_NotImplemented;
 }
 
 static PyTypeObject Null_Type = {
-	/* The ob_type field must be initialized in the module init function
-	 * to be portable to Windows without using C++. */
-	PyVarObject_HEAD_INIT(NULL, 0)
-	"xxmodule.Null",	/*tp_name*/
-	0,			/*tp_basicsize*/
-	0,			/*tp_itemsize*/
-	/* methods */
-	0,			/*tp_dealloc*/
-	0,			/*tp_print*/
-	0,			/*tp_getattr*/
-	0,			/*tp_setattr*/
-	0,			/*tp_compare*/
-	0,			/*tp_repr*/
-	0,			/*tp_as_number*/
-	0,			/*tp_as_sequence*/
-	0,			/*tp_as_mapping*/
-	0,			/*tp_hash*/
-	0,			/*tp_call*/
-	0,			/*tp_str*/
-	0,			/*tp_getattro*/
-	0,			/*tp_setattro*/
-	0,			/*tp_as_buffer*/
-	Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, /*tp_flags*/
-	0,			/*tp_doc*/
-	0,			/*tp_traverse*/
-	0,			/*tp_clear*/
-	null_richcompare,	/*tp_richcompare*/
-	0,			/*tp_weaklistoffset*/
-	0,			/*tp_iter*/
-	0,			/*tp_iternext*/
-	0,			/*tp_methods*/
-	0,			/*tp_members*/
-	0,			/*tp_getset*/
-	0, /* see initxx */	/*tp_base*/
-	0,			/*tp_dict*/
-	0,			/*tp_descr_get*/
-	0,			/*tp_descr_set*/
-	0,			/*tp_dictoffset*/
-	0,			/*tp_init*/
-	0,			/*tp_alloc*/
-	0, /* see initxx */	/*tp_new*/
-	0,			/*tp_free*/
-	0,			/*tp_is_gc*/
+    /* The ob_type field must be initialized in the module init function
+     * to be portable to Windows without using C++. */
+    PyVarObject_HEAD_INIT(NULL, 0)
+    "xxmodule.Null",            /*tp_name*/
+    0,                          /*tp_basicsize*/
+    0,                          /*tp_itemsize*/
+    /* methods */
+    0,                          /*tp_dealloc*/
+    0,                          /*tp_print*/
+    0,                          /*tp_getattr*/
+    0,                          /*tp_setattr*/
+    0,                          /*tp_reserved*/
+    0,                          /*tp_repr*/
+    0,                          /*tp_as_number*/
+    0,                          /*tp_as_sequence*/
+    0,                          /*tp_as_mapping*/
+    0,                          /*tp_hash*/
+    0,                          /*tp_call*/
+    0,                          /*tp_str*/
+    0,                          /*tp_getattro*/
+    0,                          /*tp_setattro*/
+    0,                          /*tp_as_buffer*/
+    Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, /*tp_flags*/
+    0,                          /*tp_doc*/
+    0,                          /*tp_traverse*/
+    0,                          /*tp_clear*/
+    null_richcompare,           /*tp_richcompare*/
+    0,                          /*tp_weaklistoffset*/
+    0,                          /*tp_iter*/
+    0,                          /*tp_iternext*/
+    0,                          /*tp_methods*/
+    0,                          /*tp_members*/
+    0,                          /*tp_getset*/
+    0, /* see PyInit_xx */      /*tp_base*/
+    0,                          /*tp_dict*/
+    0,                          /*tp_descr_get*/
+    0,                          /*tp_descr_set*/
+    0,                          /*tp_dictoffset*/
+    0,                          /*tp_init*/
+    0,                          /*tp_alloc*/
+    0, /* see PyInit_xx */      /*tp_new*/
+    0,                          /*tp_free*/
+    0,                          /*tp_is_gc*/
 };
 
 
@@ -320,60 +320,77 @@
 /* List of functions defined in the module */
 
 static PyMethodDef xx_methods[] = {
-	{"roj",		xx_roj,		METH_VARARGS,
-		PyDoc_STR("roj(a,b) -> None")},
-	{"foo",		xx_foo,		METH_VARARGS,
-	 	xx_foo_doc},
-	{"new",		xx_new,		METH_VARARGS,
-		PyDoc_STR("new() -> new Xx object")},
-	{"bug",		xx_bug,		METH_VARARGS,
-		PyDoc_STR("bug(o) -> None")},
-	{NULL,		NULL}		/* sentinel */
+    {"roj",             xx_roj,         METH_VARARGS,
+        PyDoc_STR("roj(a,b) -> None")},
+    {"foo",             xx_foo,         METH_VARARGS,
+        xx_foo_doc},
+    {"new",             xx_new,         METH_VARARGS,
+        PyDoc_STR("new() -> new Xx object")},
+    {"bug",             xx_bug,         METH_VARARGS,
+        PyDoc_STR("bug(o) -> None")},
+    {NULL,              NULL}           /* sentinel */
 };
 
 PyDoc_STRVAR(module_doc,
 "This is a template module just for instruction.");
 
-/* Initialization function for the module (*must* be called initxx) */
+/* Initialization function for the module (*must* be called PyInit_xx) */
+
+
+static struct PyModuleDef xxmodule = {
+    PyModuleDef_HEAD_INIT,
+    "xx",
+    module_doc,
+    -1,
+    xx_methods,
+    NULL,
+    NULL,
+    NULL,
+    NULL
+};
 
 PyMODINIT_FUNC
-initxx(void)
+PyInit_xx(void)
 {
-	PyObject *m;
+    PyObject *m = NULL;
 
-	/* Due to cross platform compiler issues the slots must be filled
-	 * here. It's required for portability to Windows without requiring
-	 * C++. */
-	Null_Type.tp_base = &PyBaseObject_Type;
-	Null_Type.tp_new = PyType_GenericNew;
-	Str_Type.tp_base = &PyUnicode_Type;
+    /* Due to cross platform compiler issues the slots must be filled
+     * here. It's required for portability to Windows without requiring
+     * C++. */
+    Null_Type.tp_base = &PyBaseObject_Type;
+    Null_Type.tp_new = PyType_GenericNew;
+    Str_Type.tp_base = &PyUnicode_Type;
 
-	/* Finalize the type object including setting type of the new type
-	 * object; doing it here is required for portability, too. */
-	if (PyType_Ready(&Xxo_Type) < 0)
-		return;
+    /* Finalize the type object including setting type of the new type
+     * object; doing it here is required for portability, too. */
+    if (PyType_Ready(&Xxo_Type) < 0)
+        goto fail;
 
-	/* Create the module and add the functions */
-	m = Py_InitModule3("xx", xx_methods, module_doc);
-	if (m == NULL)
-		return;
+    /* Create the module and add the functions */
+    m = PyModule_Create(&xxmodule);
+    if (m == NULL)
+        goto fail;
 
-	/* Add some symbolic constants to the module */
-	if (ErrorObject == NULL) {
-		ErrorObject = PyErr_NewException("xx.error", NULL, NULL);
-		if (ErrorObject == NULL)
-			return;
-	}
-	Py_INCREF(ErrorObject);
-	PyModule_AddObject(m, "error", ErrorObject);
+    /* Add some symbolic constants to the module */
+    if (ErrorObject == NULL) {
+        ErrorObject = PyErr_NewException("xx.error", NULL, NULL);
+        if (ErrorObject == NULL)
+            goto fail;
+    }
+    Py_INCREF(ErrorObject);
+    PyModule_AddObject(m, "error", ErrorObject);
 
-	/* Add Str */
-	if (PyType_Ready(&Str_Type) < 0)
-		return;
-	PyModule_AddObject(m, "Str", (PyObject *)&Str_Type);
+    /* Add Str */
+    if (PyType_Ready(&Str_Type) < 0)
+        goto fail;
+    PyModule_AddObject(m, "Str", (PyObject *)&Str_Type);
 
-	/* Add Null */
-	if (PyType_Ready(&Null_Type) < 0)
-		return;
-	PyModule_AddObject(m, "Null", (PyObject *)&Null_Type);
+    /* Add Null */
+    if (PyType_Ready(&Null_Type) < 0)
+        goto fail;
+    PyModule_AddObject(m, "Null", (PyObject *)&Null_Type);
+    return m;
+ fail:
+    Py_XDECREF(m);
+    return NULL;
 }
diff --git a/distutils2/util.py b/distutils2/util.py
--- a/distutils2/util.py
+++ b/distutils2/util.py
@@ -5,22 +5,15 @@
 import csv
 import sys
 import errno
-import codecs
 import shutil
 import string
+import hashlib
 import posixpath
 import subprocess
+from glob import iglob as std_iglob
 from fnmatch import fnmatchcase
 from inspect import getsource
-from ConfigParser import RawConfigParser
-try:
-    from glob import iglob as std_iglob
-except ImportError:
-    from glob import glob as std_iglob
-try:
-    import hashlib
-except ImportError:
-    from distutils2._backport import hashlib
+from configparser import RawConfigParser
 
 from distutils2 import logger
 from distutils2.errors import (PackagingPlatformError, PackagingFileError,
@@ -333,7 +326,7 @@
     """
     # nothing is done if sys.dont_write_bytecode is True
     # FIXME this should not raise an error
-    if getattr(sys, 'dont_write_bytecode', False):
+    if sys.dont_write_bytecode:
         raise PackagingByteCompileError('byte-compiling is disabled.')
 
     # First, if the caller didn't force us into direct or indirect mode,
@@ -362,9 +355,9 @@
             if script_fd is not None:
                 script = os.fdopen(script_fd, "w", encoding='utf-8')
             else:
-                script = codecs.open(script_name, "w", encoding='utf-8')
+                script = open(script_name, "w", encoding='utf-8')
 
-            try:
+            with script:
                 script.write("""\
 from distutils2.util import byte_compile
 files = [
@@ -391,8 +384,6 @@
              verbose=%r, dry_run=False,
              direct=True)
 """ % (optimize, force, prefix, base_dir, verbose))
-            finally:
-                script.close()
 
         cmd = [sys.executable, script_name]
         if optimize == 1:
@@ -549,12 +540,10 @@
     idiom in all cases, only with Command.execute, which runs depending on
     the dry_run argument and also logs its arguments).
     """
-    f = open(filename, "w")
-    try:
+    with open(filename, "w") as f:
         for line in contents:
             f.write(line + "\n")
-    finally:
-        f.close()
+
 
 def _is_package(path):
     return os.path.isdir(path) and os.path.isfile(
@@ -657,7 +646,7 @@
     for part in parts[1:]:
         try:
             ret = getattr(ret, part)
-        except AttributeError, exc:
+        except AttributeError as exc:
             raise ImportError(exc)
 
     return ret
@@ -773,13 +762,10 @@
 def generate_pypirc(username, password):
     """Create a default .pypirc file."""
     rc = get_pypirc_path()
-    f = open(rc, 'w')
+    with open(rc, 'w') as f:
+        f.write(DEFAULT_PYPIRC % (username, password))
     try:
-        f.write(DEFAULT_PYPIRC % (username, password))
-    finally:
-        f.close()
-    try:
-        os.chmod(rc, 0600)
+        os.chmod(rc, 0o600)
     except OSError:
         # should do something better here
         pass
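The hunk above replaces the try/finally around the `.pypirc` write with a `with` block and converts the old `0600` literal to Python 3's `0o600`. A minimal standalone sketch of that idiom (hypothetical helper, not distutils2's actual `generate_pypirc`; the template text is made up):

```python
import os
import tempfile

# Made-up stand-in for distutils2's DEFAULT_PYPIRC template.
DEFAULT_PYPIRC = """\
[server-login]
username:%s
password:%s
"""

def write_pypirc(path, username, password):
    # 'with' replaces the explicit try/finally + close() of the 2.x code.
    with open(path, 'w') as f:
        f.write(DEFAULT_PYPIRC % (username, password))
    try:
        # 0o600 is the Python 3 spelling of the old octal literal 0600:
        # read/write for the owner only.
        os.chmod(path, 0o600)
    except OSError:
        pass  # best effort, as in the original

rc = os.path.join(tempfile.mkdtemp(), '.pypirc')
write_pypirc(rc, 'alice', 'secret')
```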
@@ -863,7 +849,7 @@
     r.refactor(files, write=True, doctests_only=doctests_only)
 
 
-class Mixin2to3(object):
+class Mixin2to3:
     """ Wrapper class for commands that run 2to3.
     To configure 2to3, setup scripts may either change
     the class variables, or inherit from this class
@@ -986,11 +972,8 @@
     if not os.path.exists(path):
         raise PackagingFileError("file '%s' does not exist" %
                                  os.path.abspath(path))
-    f = codecs.open(path, encoding='utf-8')
-    try:
+    with open(path, encoding='utf-8') as f:
         config.readfp(f)
-    finally:
-        f.close()
 
     kwargs = {}
     for arg in D1_D2_SETUP_ARGS:
@@ -1012,11 +995,8 @@
                     filenames = split_multiline(filenames)
                     in_cfg_value = []
                     for filename in filenames:
-                        fp = open(filename)
-                        try:
+                        with open(filename) as fp:
                             in_cfg_value.append(fp.read())
-                        finally:
-                            fp.close()
                     in_cfg_value = '\n\n'.join(in_cfg_value)
             else:
                 continue
@@ -1053,11 +1033,8 @@
     if os.path.exists("setup.py"):
         raise PackagingFileError("a setup.py file already exists")
 
-    fp = codecs.open("setup.py", "w", encoding='utf-8')
-    try:
+    with open("setup.py", "w", encoding='utf-8') as fp:
         fp.write(_SETUP_TMPL % {'func': getsource(cfg_to_args)})
-    finally:
-        fp.close()
 
 
 # Taken from the pip project
@@ -1065,19 +1042,18 @@
 def ask(message, options):
     """Prompt the user with *message*; *options* contains allowed responses."""
     while True:
-        response = raw_input(message)
+        response = input(message)
         response = response.strip().lower()
         if response not in options:
-            print 'invalid response:', repr(response)
-            print 'choose one of', ', '.join(repr(o) for o in options)
+            print('invalid response:', repr(response))
+            print('choose one of', ', '.join(repr(o) for o in options))
         else:
             return response
 
 
 def _parse_record_file(record_file):
     distinfo, extra_metadata, installed = ({}, [], [])
-    rfile = open(record_file, 'r')
-    try:
+    with open(record_file, 'r') as rfile:
         for path in rfile:
             path = path.strip()
             if path.endswith('egg-info') and os.path.isfile(path):
@@ -1098,8 +1074,6 @@
                 continue
             else:
                 installed.append(path)
-    finally:
-        rfile.close()
 
     distinfo['egginfo'] = egginfo
     distinfo['metadata'] = metadata
@@ -1115,8 +1089,7 @@
 
 
 def _write_record_file(record_path, installed_files):
-    f = codecs.open(record_path, 'w', encoding='utf-8')
-    try:
+    with open(record_path, 'w', encoding='utf-8') as f:
         writer = csv.writer(f, delimiter=',', lineterminator=os.linesep,
                             quotechar='"')
 
@@ -1126,19 +1099,14 @@
                 writer.writerow((fpath, '', ''))
             else:
                 hash = hashlib.md5()
-                fp = open(fpath, 'rb')
-                try:
+                with open(fpath, 'rb') as fp:
                     hash.update(fp.read())
-                finally:
-                    fp.close()
                 md5sum = hash.hexdigest()
                 size = os.path.getsize(fpath)
                 writer.writerow((fpath, md5sum, size))
 
         # add the RECORD file itself
         writer.writerow((record_path, '', ''))
-    finally:
-        f.close()
     return record_path
 
 
@@ -1173,11 +1141,8 @@
 
     installer_path = distinfo['installer_path']
     logger.info('creating %s', installer_path)
-    f = open(installer_path, 'w')
-    try:
+    with open(installer_path, 'w') as f:
         f.write(installer)
-    finally:
-        f.close()
 
     if requested:
         requested_path = distinfo['requested_path']
@@ -1218,27 +1183,20 @@
 
 
 def _has_text(setup_py, installer):
-    installer_pattern = re.compile('import %s|from %s' %
-                                   (installer, installer))
-    setup = codecs.open(setup_py, 'r', encoding='utf-8')
-    try:
+    installer_pattern = re.compile('import {0}|from {0}'.format(installer))
+    with open(setup_py, 'r', encoding='utf-8') as setup:
         for line in setup:
             if re.search(installer_pattern, line):
                 logger.debug("Found %s text in setup.py.", installer)
                 return True
-    finally:
-        setup.close()
     logger.debug("No %s text found in setup.py.", installer)
     return False
 
 
 def _has_required_metadata(setup_cfg):
     config = RawConfigParser()
-    f = codecs.open(setup_cfg, 'r', encoding='utf8')
-    try:
+    with open(setup_cfg, 'r', encoding='utf8') as f:
         config.readfp(f)
-    finally:
-        f.close()
     return (config.has_section('metadata') and
             'name' in config.options('metadata') and
             'version' in config.options('metadata'))
@@ -1341,7 +1299,7 @@
               "cannot copy tree '%s': not a directory" % src)
     try:
         names = os.listdir(src)
-    except os.error, e:
+    except os.error as e:
         errstr = e[1]
         if dry_run:
             names = []
@@ -1387,9 +1345,9 @@
 # I don't use os.makedirs because a) it's new to Python 1.5.2, and
 # b) it blows up if the directory already exists (I want to silently
 # succeed in that case).
-def _mkpath(name, mode=0777, verbose=True, dry_run=False):
+def _mkpath(name, mode=0o777, verbose=True, dry_run=False):
     # Detect a common bug -- name is None
-    if not isinstance(name, basestring):
+    if not isinstance(name, str):
         raise PackagingInternalError(
               "mkpath: 'name' must be a string (got %r)" % (name,))
 
@@ -1428,7 +1386,7 @@
         if not dry_run:
             try:
                 os.mkdir(head, mode)
-            except OSError, exc:
+            except OSError as exc:
                 if not (exc.errno == errno.EEXIST and os.path.isdir(head)):
                     raise PackagingFileError(
                           "could not create '%s': %s" % (head, exc.args[-1]))
@@ -1445,15 +1403,15 @@
     form fields, *files* is a sequence of (name: str, filename: str, value:
     bytes) elements for data to be uploaded as files.
 
-    Returns (content_type: bytes, body: bytes) ready for httplib.HTTP.
+    Returns (content_type: bytes, body: bytes) ready for http.client.HTTP.
     """
     # Taken from
     # http://code.activestate.com/recipes/146306-http-client-to-post-using-multipartform-data/
 
     if boundary is None:
-        boundary = '--------------GHSKFJDLGDS7543FJKLFHRE75642756743254'
-    elif not isinstance(boundary, str):
-        raise TypeError('boundary must be str, not %r' % type(boundary))
+        boundary = b'--------------GHSKFJDLGDS7543FJKLFHRE75642756743254'
+    elif not isinstance(boundary, bytes):
+        raise TypeError('boundary must be bytes, not %r' % type(boundary))
 
     l = []
     for key, values in fields:
@@ -1463,21 +1421,23 @@
 
         for value in values:
             l.extend((
-                '--' + boundary,
-                # XXX should encode to match packaging but it causes bugs
-                ('Content-Disposition: form-data; name="%s"' % key), '', value))
+                b'--' + boundary,
+                ('Content-Disposition: form-data; name="%s"' %
+                 key).encode('utf-8'),
+                b'',
+                value.encode('utf-8')))
 
     for key, filename, value in files:
         l.extend((
-            '--' + boundary,
+            b'--' + boundary,
             ('Content-Disposition: form-data; name="%s"; filename="%s"' %
-             (key, filename)),
-            '',
+             (key, filename)).encode('utf-8'),
+            b'',
             value))
 
-    l.append('--' + boundary + '--')
-    l.append('')
+    l.append(b'--' + boundary + b'--')
+    l.append(b'')
 
-    body = '\r\n'.join(l)
-    content_type = 'multipart/form-data; boundary=' + boundary
+    body = b'\r\n'.join(l)
+    content_type = b'multipart/form-data; boundary=' + boundary
     return content_type, body
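The last hunk of util.py switches the multipart encoder from str to bytes throughout: under Python 3 the HTTP body is sent unencoded, so the boundary, the headers, and the CRLF joins must all be bytes. A simplified sketch of that pattern (hypothetical signature; distutils2's real function takes grouped field values and a separate boundary check):

```python
def encode_multipart(fields, files, boundary=b'--------------XXXXXXXXXX'):
    """Build (content_type, body) as bytes for a multipart/form-data POST.

    fields: sequence of (name, text_value); files: (name, filename, bytes).
    """
    lines = []
    for key, value in fields:
        lines.extend((
            b'--' + boundary,
            # Header lines are text; encode them explicitly to bytes.
            ('Content-Disposition: form-data; name="%s"' % key).encode('utf-8'),
            b'',
            value.encode('utf-8')))
    for key, filename, value in files:
        lines.extend((
            b'--' + boundary,
            ('Content-Disposition: form-data; name="%s"; filename="%s"'
             % (key, filename)).encode('utf-8'),
            b'',
            value))  # file payload is already bytes
    lines.append(b'--' + boundary + b'--')
    lines.append(b'')
    body = b'\r\n'.join(lines)          # bytes join: every element must be bytes
    content_type = b'multipart/form-data; boundary=' + boundary
    return content_type, body

ct, body = encode_multipart([('name', 'demo')], [('file', 'a.txt', b'data')])
```

Mixing a single str into the list would make the final `b'\r\n'.join` raise `TypeError`, which is exactly the class of bug the hunk fixes.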
diff --git a/distutils2/version.py b/distutils2/version.py
--- a/distutils2/version.py
+++ b/distutils2/version.py
@@ -38,7 +38,7 @@
     $''', re.VERBOSE)
 
 
-class NormalizedVersion(object):
+class NormalizedVersion:
     """A rational version.
 
     Good:
@@ -342,7 +342,7 @@
     return comp, NormalizedVersion(version)
 
 
-class VersionPredicate(object):
+class VersionPredicate:
     """Defines a predicate: ProjectName (>ver1,ver2, ..)"""
 
     _operators = {"<": lambda x, y: x < y,
@@ -384,7 +384,7 @@
 
     def match(self, version):
         """Check if the provided version matches the predicates."""
-        if isinstance(version, basestring):
+        if isinstance(version, str):
             version = NormalizedVersion(version)
         for operator, predicate in self.predicates:
             if not self._operators[operator](version, predicate):
@@ -444,6 +444,6 @@
     """Return a VersionPredicate object, from a string or an already
     existing object.
     """
-    if isinstance(requirements, basestring):
+    if isinstance(requirements, str):
         requirements = VersionPredicate(requirements)
     return requirements
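The version.py hunks all reduce to the same port: `basestring` no longer exists, so `str` is the only text type to accept before normalizing. A heavily simplified, hypothetical sketch of the check-then-compare flow (tuple comparison stands in for distutils2's NormalizedVersion):

```python
def match(version, predicates):
    """Return True if *version* satisfies every (operator, reference) pair.

    predicates: list of (operator_string, reference_tuple), e.g.
    [('>=', (1, 0)), ('<', (2, 0))].  Illustrative only.
    """
    ops = {'<': lambda x, y: x < y,
           '>': lambda x, y: x > y,
           '==': lambda x, y: x == y,
           '!=': lambda x, y: x != y,
           '<=': lambda x, y: x <= y,
           '>=': lambda x, y: x >= y}
    if isinstance(version, str):  # Python 3: str, where 2.x used basestring
        version = tuple(int(part) for part in version.split('.'))
    return all(ops[op](version, ref) for op, ref in predicates)

ok = match('1.2.0', [('>=', (1, 0)), ('<', (2, 0))])
```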
diff --git a/runtests.py b/runtests.py
--- a/runtests.py
+++ b/runtests.py
@@ -130,10 +130,4 @@
 
 
 if __name__ == "__main__":
-    if sys.version < '2.5':
-        try:
-            from distutils2._backport import hashlib
-        except ImportError:
-            import subprocess
-            subprocess.call([sys.executable, 'setup.py', 'build_ext'])
     sys.exit(test_main())
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -1,13 +1,6 @@
-#!/usr/bin/env python
-# -*- encoding: utf-8 -*-
-import os
-import re
-import sys
-import codecs
-from distutils import sysconfig
-from distutils.core import setup, Extension
-from distutils.ccompiler import new_compiler
-from ConfigParser import RawConfigParser
+#!/usr/bin/env python3
+from distutils.core import setup
+from configparser import RawConfigParser
 
 
 def split_multiline(value):
@@ -48,11 +41,8 @@
         }
     config = RawConfigParser()
     config.optionxform = lambda x: x.lower().replace('_', '-')
-    fp = codecs.open(path, encoding='utf-8')
-    try:
+    with open(path, encoding='utf-8') as fp:
         config.readfp(fp)
-    finally:
-        fp.close()
     kwargs = {}
     for section in opts_to_args:
         for optname, argname, xform in opts_to_args[section]:
@@ -67,11 +57,8 @@
         filenames = config.get('metadata', 'description-file')
         for filename in split_multiline(filenames):
             descriptions = []
-            fp = open(filename)
-            try:
+            with open(filename) as fp:
                 descriptions.append(fp.read())
-            finally:
-                fp.close()
         kwargs['long_description'] = '\n\n'.join(descriptions)
     # Handle `package_data`
     if 'package_data' in kwargs:
@@ -83,126 +70,6 @@
         kwargs['package_data'] = package_data
     return kwargs
 
-# (from Python's setup.py, in PyBuildExt.detect_modules())
-def prepare_hashlib_extensions():
-    """Decide which C extensions to build and create the appropriate
-    Extension objects to build them.  Return a list of Extensions.
-    """
-    ssl_libs = None
-    ssl_inc_dir = None
-    ssl_lib_dirs = []
-    ssl_inc_dirs = []
-    if os.name == 'posix':
-        # (from Python's setup.py, in PyBuildExt.detect_modules())
-        # lib_dirs and inc_dirs are used to search for files;
-        # if a file is found in one of those directories, it can
-        # be assumed that no additional -I,-L directives are needed.
-        lib_dirs = []
-        inc_dirs = []
-        if os.path.normpath(sys.prefix) != '/usr':
-            lib_dirs.append(sysconfig.get_config_var('LIBDIR'))
-            inc_dirs.append(sysconfig.get_config_var('INCLUDEDIR'))
-        # Ensure that /usr/local is always used
-        lib_dirs.append('/usr/local/lib')
-        inc_dirs.append('/usr/local/include')
-        # Add the compiler defaults; this compiler object is only used
-        # to locate the OpenSSL files.
-        compiler = new_compiler()
-        lib_dirs.extend(compiler.library_dirs)
-        inc_dirs.extend(compiler.include_dirs)
-        # Now the platform defaults
-        lib_dirs.extend(['/lib64', '/usr/lib64', '/lib', '/usr/lib'])
-        inc_dirs.extend(['/usr/include'])
-        # Find the SSL library directory
-        ssl_libs = ['ssl', 'crypto']
-        ssl_lib = compiler.find_library_file(lib_dirs, 'ssl')
-        if ssl_lib is None:
-            ssl_lib_dirs = ['/usr/local/ssl/lib', '/usr/contrib/ssl/lib']
-            ssl_lib = compiler.find_library_file(ssl_lib_dirs, 'ssl')
-            if ssl_lib is not None:
-                ssl_lib_dirs.append(os.path.dirname(ssl_lib))
-            else:
-                ssl_libs = None
-        # Locate the SSL headers
-        for ssl_inc_dir in inc_dirs + ['/usr/local/ssl/include',
-                                       '/usr/contrib/ssl/include']:
-            ssl_h = os.path.join(ssl_inc_dir, 'openssl', 'ssl.h')
-            if os.path.exists(ssl_h):
-                if ssl_inc_dir not in inc_dirs:
-                    ssl_inc_dirs.append(ssl_inc_dir)
-                break
-    elif os.name == 'nt':
-        # (from Python's PCbuild/build_ssl.py, in find_best_ssl_dir())
-        # Look for SSL 1 level up from here.  That is, the same place the
-        # other externals for Python core live.
-        # note: do not abspath src_dir; the build will fail if any
-        # higher up directory name has spaces in it.
-        src_dir = '..'
-        try:
-            fnames = os.listdir(src_dir)
-        except OSError:
-            fnames = []
-        ssl_dir = None
-        best_parts = []
-        for fname in fnames:
-            fqn = os.path.join(src_dir, fname)
-            if os.path.isdir(fqn) and fname.startswith("openssl-"):
-                # We have a candidate, determine the best
-                parts = re.split("[.-]", fname)[1:]
-                # Ignore all "beta" or any other qualifiers;
-                # eg - openssl-0.9.7-beta1
-                if len(parts) < 4 and parts > best_parts:
-                    best_parts = parts
-                    ssl_dir = fqn
-        if ssl_dir is not None:
-            ssl_libs = ['gdi32', 'user32', 'advapi32',
-                        os.path.join(ssl_dir, 'out32', 'libeay32')]
-            ssl_inc_dir = os.path.join(ssl_dir, 'inc32')
-            ssl_inc_dirs.append(ssl_inc_dir)
-
-    # Find out which version of OpenSSL we have
-    openssl_ver = 0
-    openssl_ver_re = re.compile(
-        '^\s*#\s*define\s+OPENSSL_VERSION_NUMBER\s+(0x[0-9a-fA-F]+)' )
-    if ssl_inc_dir is not None:
-        opensslv_h = os.path.join(ssl_inc_dir, 'openssl', 'opensslv.h')
-        try:
-            incfile = open(opensslv_h, 'r')
-            for line in incfile:
-                m = openssl_ver_re.match(line)
-                if m:
-                    openssl_ver = int(m.group(1), 16)
-        except IOError:
-            e = str(sys.last_value)
-            print("IOError while reading %s: %s" % (opensslv_h, e))
-
-    # Now we can determine which extension modules need to be built.
-    exts = []
-    if ssl_libs is not None and openssl_ver >= 0x907000:
-        # The _hashlib module wraps optimized implementations
-        # of hash functions from the OpenSSL library.
-        exts.append(Extension('distutils2._backport._hashlib',
-                              ['distutils2/_backport/_hashopenssl.c'],
-                              include_dirs=ssl_inc_dirs,
-                              library_dirs=ssl_lib_dirs,
-                              libraries=ssl_libs))
-    else:
-        # no openssl at all, use our own md5 and sha1
-        exts.append(Extension('distutils2._backport._sha',
-                              ['distutils2/_backport/shamodule.c']))
-        exts.append(Extension('distutils2._backport._md5',
-                              sources=['distutils2/_backport/md5module.c',
-                                       'distutils2/_backport/md5.c'],
-                              depends=['distutils2/_backport/md5.h']) )
-    if openssl_ver < 0x908000:
-        # OpenSSL doesn't do these until 0.9.8 so we'll bring our own
-        exts.append(Extension('distutils2._backport._sha256',
-                              ['distutils2/_backport/sha256module.c']))
-        exts.append(Extension('distutils2._backport._sha512',
-                              ['distutils2/_backport/sha512module.c']))
-    return exts
 
 setup_kwargs = cfg_to_args('setup.cfg')
-if sys.version < '2.5':
-    setup_kwargs['ext_modules'] = prepare_hashlib_extensions()
 setup(**setup_kwargs)
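The new setup.py combines two of the ports seen throughout this changeset: `ConfigParser` is now the `configparser` module, and `codecs.open` + try/finally becomes a plain `with open(..., encoding=...)`. A minimal self-contained sketch of that reading pattern (file name and contents here are made up):

```python
import os
import tempfile
from configparser import RawConfigParser  # was: from ConfigParser import ...

cfg_text = """\
[metadata]
name = demo
version = 0.1
"""

# Write a throwaway setup.cfg to parse.
path = os.path.join(tempfile.mkdtemp(), 'setup.cfg')
with open(path, 'w', encoding='utf-8') as fp:
    fp.write(cfg_text)

config = RawConfigParser()
with open(path, encoding='utf-8') as fp:
    # The diff uses readfp(); read_file() is the same call under its
    # non-deprecated Python 3 name.
    config.read_file(fp)

name = config.get('metadata', 'name')
```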
diff --git a/tests.sh b/tests.sh
--- a/tests.sh
+++ b/tests.sh
@@ -1,40 +1,29 @@
 #!/bin/sh
-echo -n "Running tests with Python 2.4... "
-python2.4 setup.py build_ext -f -q 2> /dev/null > /dev/null
-python2.4 -Wd runtests.py -q
+echo -n "Running tests with Python 3.1... "
+python3.1 -Wd runtests.py -q
 if [ $? -ne 0 ];then
     echo Failed, re-running
-    python2.4 -Wd runtests.py
+    python3.1 -Wd runtests.py
     exit 1
 else
     echo Success
 fi
 
-echo -n "Running tests with Python 2.5... "
-python2.5 -Wd runtests.py -q
+echo -n "Running tests with Python 3.2... "
+python3.2 -Wd runtests.py -q
 if [ $? -ne 0 ];then
     echo Failed, re-running
-    python2.5 -Wd runtests.py
+    python3.2 -Wd runtests.py
     exit 1
 else
     echo Success
 fi
 
-echo -n "Running tests with Python 2.6... "
-python2.6 -Wd runtests.py -q
+echo -n "Running tests with Python 3.3... "
+python3.3 -Wd runtests.py -q
 if [ $? -ne 0 ];then
     echo Failed, re-running
-    python2.6 -Wd runtests.py
-    exit 1
-else
-    echo Success
-fi
-
-echo -n "Running tests with Python 2.7... "
-python2.7 -Wd -bb -3 runtests.py -q
-if [ $? -ne 0 ];then
-    echo Failed, re-running
-    python2.7 -Wd -bb -3 runtests.py
+    python3.3 -Wd runtests.py
     exit 1
 else
     echo Success

-- 
Repository URL: http://hg.python.org/distutils2

