[Python-checkins] distutils2: Uniformize client APIs for requesting informations to the indexes.
tarek.ziade
python-checkins at python.org
Sun Aug 8 11:50:46 CEST 2010
tarek.ziade pushed 386c37a009e1 to distutils2:
http://hg.python.org/distutils2/rev/386c37a009e1
changeset: 457:386c37a009e1
user: Alexis Metaireau <ametaireau at gmail.com>
date: Sat Jul 24 02:15:46 2010 +0200
summary: Uniformize client APIs for requesting informations to the indexes.
files: docs/source/projects-index.dist.rst, docs/source/projects-index.simple.rst, docs/source/projects-index.xmlrpc.rst, src/distutils2/index/base.py, src/distutils2/index/dist.py, src/distutils2/index/errors.py, src/distutils2/index/simple.py, src/distutils2/index/xmlrpc.py, src/distutils2/tests/test_index_simple.py, src/distutils2/tests/test_index_xmlrpc.py
diff --git a/docs/source/projects-index.dist.rst b/docs/source/projects-index.dist.rst
--- a/docs/source/projects-index.dist.rst
+++ b/docs/source/projects-index.dist.rst
@@ -5,14 +5,13 @@
Information coming from indexes is represented by the classes present in the
`dist` module.
-.. note:: Keep in mind that each project (eg. FooBar) can have several
- releases (eg. 1.1, 1.2, 1.3), and each of these releases can be
- provided in multiple distributions (eg. a source distribution,
- a binary one, etc).
-
APIs
====
+Keep in mind that each project (e.g. FooBar) can have several releases
+(e.g. 1.1, 1.2, 1.3), and each of these releases can be provided in multiple
+distributions (e.g. a source distribution, a binary one, etc.).
+
ReleaseInfo
------------
diff --git a/docs/source/projects-index.simple.rst b/docs/source/projects-index.simple.rst
--- a/docs/source/projects-index.simple.rst
+++ b/docs/source/projects-index.simple.rst
@@ -27,39 +27,39 @@
Usage Examples
---------------
-To help you understand how using the `SimpleIndexCrawler` class, here are some basic
+To help you understand how to use the `Crawler` class, here are some basic
usages.
Request the simple index to get a specific distribution
++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Suppose you want to scan an index to get a list of distributions for
-the "foobar" project. You can use the "find" method for that.
-The find method will browse the project page, and return :class:`ReleaseInfo`
+the "foobar" project. You can use the "get_releases" method for that.
+The get_releases method will browse the project page and return :class:`ReleaseInfo`
objects for each download link found. ::
>>> from distutils2.index.simple import Crawler
>>> crawler = Crawler()
- >>> crawler.find("FooBar")
+ >>> crawler.get_releases("FooBar")
[<ReleaseInfo "Foobar 1.1">, <ReleaseInfo "Foobar 1.2">]
Note that you can also ask the client for specific versions, using version
specifiers (described in `PEP 345
<http://www.python.org/dev/peps/pep-0345/#version-specifiers>`_)::
- >>> client.find("FooBar < 1.2")
+ >>> client.get_releases("FooBar < 1.2")
[<ReleaseInfo "FooBar 1.1">, ]
-`find` returns a list of :class:`ReleaseInfo`, but you also can get the best
-distribution that fullfil your requirements, using "get"::
+`get_releases` returns a list of :class:`ReleaseInfo`, but you can also get the best
+distribution that fulfills your requirements, using "get_release"::
- >>> client.get("FooBar < 1.2")
+ >>> client.get_release("FooBar < 1.2")
<ReleaseInfo "FooBar 1.1">
Download distributions
+++++++++++++++++++++++
-As it can get the urls of distributions provided by PyPI, the `SimpleIndexCrawler`
+As it can get the urls of distributions provided by PyPI, the `Crawler`
client can also download the distributions and put them in a temporary
destination::
@@ -74,7 +74,7 @@
While downloading, the md5 hash of the archive is checked; if it does not
match, the download is retried once, and if it fails again,
`MD5HashDoesNotMatchError` is raised.
-Internally, that's not the SimpleIndexCrawler which download the distributions, but the
+Internally, it is not the Crawler that downloads the distributions, but the
`DistributionInfo` class. Please refer to its documentation for more details.
Following PyPI external links
@@ -87,35 +87,45 @@
It's possible to tell the PyPIClient to follow external links by setting the
`follow_externals` attribute, on instantiation or afterwards::
- >>> client = SimpleIndexCrawler(follow_externals=True)
+ >>> client = Crawler(follow_externals=True)
or ::
- >>> client = SimpleIndexCrawler()
+ >>> client = Crawler()
>>> client.follow_externals = True
Working with external indexes, and mirrors
+++++++++++++++++++++++++++++++++++++++++++
-The default `SimpleIndexCrawler` behavior is to rely on the Python Package index stored
+The default `Crawler` behavior is to rely on the Python Package index stored
on PyPI (http://pypi.python.org/simple).
If you need to work with a local index, or private indexes, you can specify
them using the index_url parameter::
- >>> client = SimpleIndexCrawler(index_url="file://filesystem/path/")
+ >>> client = Crawler(index_url="file://filesystem/path/")
or ::
- >>> client = SimpleIndexCrawler(index_url="http://some.specific.url/")
+ >>> client = Crawler(index_url="http://some.specific.url/")
You can also specify mirrors to fall back on in case the first index_url you
provided does not respond, or does not respond correctly. The default behavior for
-`SimpleIndexCrawler` is to use the list provided by Python.org DNS records, as
+`Crawler` is to use the list provided by Python.org DNS records, as
described in :pep:`381` about the mirroring infrastructure.
If you don't want to rely on these, you can specify the list of mirrors to
try via the `mirrors` attribute. It's a simple iterable::
>>> mirrors = ["http://first.mirror", "http://second.mirror"]
- >>> client = SimpleIndexCrawler(mirrors=mirrors)
+ >>> client = Crawler(mirrors=mirrors)
+
+Searching in the simple index
++++++++++++++++++++++++++++++
+
+It's possible to search for projects with specific names in the package index.
+Suppose you want to find all projects whose name contains the "Grail" keyword::
+
+ >>> client.search(name="grail")
+ ["holy grail", "unholy grail", "grail"]
+
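The `search_projects` method added to the crawler in this changeset matches project names with a case-insensitive regular expression over the links of the simple-index page. A minimal, self-contained sketch of that idea; the HTML snippet and helper below are hypothetical stand-ins, not the distutils2 implementation:

```python
import re

# Hypothetical excerpt of a simple-index page; the real Crawler reads
# this from index_url instead.
INDEX_HTML = """
<a href='holy_grail/'>holy grail</a>
<a href='unholy_grail/'>unholy grail</a>
<a href='grail/'>grail</a>
<a href='spam/'>spam</a>
"""

def search_projects(index_html, name):
    """Return link texts containing `name`, matched case-insensitively."""
    pattern = re.compile(r"<a[^>]*>([^<]*%s[^<]*)</a>" % re.escape(name), re.I)
    return [m.group(1) for m in pattern.finditer(index_html)]

print(search_projects(INDEX_HTML, "grail"))
# → ['holy grail', 'unholy grail', 'grail']
```

Note that escaping the user-supplied name (`re.escape`) keeps the pattern safe when the query contains regex metacharacters.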
diff --git a/docs/source/projects-index.xmlrpc.rst b/docs/source/projects-index.xmlrpc.rst
--- a/docs/source/projects-index.xmlrpc.rst
+++ b/docs/source/projects-index.xmlrpc.rst
@@ -49,19 +49,19 @@
most used way for users (e.g. "give me the last version of the FooBar project").
This can be accomplished using the following syntax::
- >>> client = XMLRPCClient()
- >>> client.get("Foobar (<= 1.3))
+ >>> client = xmlrpc.Client()
+ >>> client.get_release("Foobar (<= 1.3)")
<FooBar 1.2.1>
- >>> client.find("FooBar (<= 1.3)")
+ >>> client.get_releases("FooBar (<= 1.3)")
[FooBar 1.1, FooBar 1.1.1, FooBar 1.2, FooBar 1.2.1]
We can also search on specific fields::
- >>> client.find_by(field=value)
+ >>> client.search_projects(field=value)
You can specify the operator to use; the default is "or"::
- >>> client.find_by(field=value, operator="and")
+ >>> client.search_projects(field=value, operator="and")
The specific fields you can search are:
@@ -85,7 +85,7 @@
XML-RPC is a preferred way to retrieve metadata information from indexes.
It's really simple to do so::
- >>> client = XMLRPCClient()
+ >>> client = xmlrpc.Client()
>>> client.get_metadata("FooBar", "1.1")
<ReleaseInfo FooBar 1.1>
@@ -94,7 +94,7 @@
metadata::
>>> foobar11 = ReleaseInfo("FooBar", "1.1")
- >>> client = XMLRPCClient()
+ >>> client = xmlrpc.Client()
>>> returned_release = client.get_metadata(release=foobar11)
>>> returned_release
<ReleaseInfo FooBar 1.1>
@@ -105,7 +105,7 @@
To retrieve all the releases for a project, you can build them using
`get_releases`::
- >>> client = XMLRPCClient()
+ >>> client = xmlrpc.Client()
>>> client.get_releases("FooBar")
[<ReleaseInfo FooBar 0.9>, <ReleaseInfo FooBar 1.0>, <ReleaseInfo FooBar 1.1>]
@@ -119,7 +119,7 @@
It's possible to retrieve information about distributions, e.g. "what are the
existing distributions for this release? How can they be retrieved?"::
- >>> client = XMLRPCClient()
+ >>> client = xmlrpc.Client()
>>> release = client.get_distributions("FooBar", "1.1")
>>> release.dists
{'sdist': <FooBar 1.1 sdist>, 'bdist': <FooBar 1.1 bdist>}
@@ -140,7 +140,7 @@
:class:`distutils2.index.xmlrpc.Client` directly (they will be made, but they're
invisible to you)::
- >>> client = XMLRPCClient()
+ >>> client = xmlrpc.Client()
>>> releases = client.get_releases("FooBar")
>>> releases.get_release("1.1").metadata
<Metadata for FooBar 1.1>
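Requirement strings such as `"FooBar (<=1.3)"` in the examples above are parsed into a project name plus version constraints, then used to filter releases. A toy sketch of that parsing and filtering, under the assumption of simple dotted versions; the real clients use `VersionPredicate` from `distutils2.version`, following PEP 345:

```python
import re

def parse_requirement(requirement):
    """Split 'Name (<=1.3)' into a name and a list of (operator, version) pairs."""
    match = re.match(r"\s*([\w.-]+)\s*(?:\((.*)\))?\s*$", requirement)
    name, constraints = match.group(1), match.group(2)
    pairs = []
    if constraints:
        for part in constraints.split(","):
            op, version = re.match(r"\s*(<=|>=|==|<|>)\s*(.+?)\s*$", part).groups()
            pairs.append((op, version))
    return name, pairs

def version_key(version):
    # Naive key: only handles plain dotted versions like "1.3".
    return tuple(int(n) for n in version.split("."))

OPS = {"<": lambda a, b: a < b, "<=": lambda a, b: a <= b,
       ">": lambda a, b: a > b, ">=": lambda a, b: a >= b,
       "==": lambda a, b: a == b}

def matching_versions(requirement, versions):
    """Keep the versions satisfying every constraint of the requirement."""
    name, pairs = parse_requirement(requirement)
    return [v for v in versions
            if all(OPS[op](version_key(v), version_key(ref)) for op, ref in pairs)]

print(matching_versions("FooBar (<=1.3)", ["1.1", "1.2", "1.3", "2.0"]))
# → ['1.1', '1.2', '1.3']
```

This deliberately ignores pre-release suffixes (alpha, beta), which is exactly what the `prefer_final` flag in the real API exists to handle.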
diff --git a/src/distutils2/index/base.py b/src/distutils2/index/base.py
--- a/src/distutils2/index/base.py
+++ b/src/distutils2/index/base.py
@@ -1,75 +1,13 @@
from distutils2.version import VersionPredicate
-from distutils2.index.errors import DistributionNotFound
+from distutils2.index.dist import ReleasesList
-class IndexClient(object):
- """Base class containing common index client methods"""
+class BaseClient(object):
+ """Base class containing common methods for the index crawlers/clients"""
- def _search_for_releases(self, requirements):
- """To be redefined in child classes"""
- return NotImplemented
-
- def find(self, requirements, prefer_final=None):
- """Browse the PyPI to find distributions that fullfil the given
- requirements.
-
- :param requirements: A project name and it's distribution, using
- version specifiers, as described in PEP345.
- :type requirements: You can pass either a version.VersionPredicate
- or a string.
- :param prefer_final: if the version is not mentioned in requirements,
- and the last version is not a "final" one
- (alpha, beta, etc.), pick up the last final
- version.
- """
- requirements = self._get_version_predicate(requirements)
- prefer_final = self._get_prefer_final(prefer_final)
-
- # internally, rely on the "_search_for_release" method
- dists = self._search_for_releases(requirements)
- if dists:
- dists = dists.filter(requirements)
- dists.sort_releases(prefer_final=prefer_final)
- return dists
-
- def get(self, requirements, prefer_final=None):
- """Return only one release that fulfill the given requirements.
-
- :param requirements: A project name and it's distribution, using
- version specifiers, as described in PEP345.
- :type requirements: You can pass either a version.VersionPredicate
- or a string.
- :param prefer_final: if the version is not mentioned in requirements,
- and the last version is not a "final" one
- (alpha, beta, etc.), pick up the last final
- version.
- """
- predicate = self._get_version_predicate(requirements)
-
- # internally, rely on the "_get_release" method
- dist = self._get_release(predicate, prefer_final=prefer_final)
- if not dist:
- raise DistributionNotFound(requirements)
- return dist
-
- def download(self, requirements, temp_path=None, prefer_final=None,
- prefer_source=True):
- """Download the distribution, using the requirements.
-
- If more than one distribution match the requirements, use the last
- version.
- Download the distribution, and put it in the temp_path. If no temp_path
- is given, creates and return one.
-
- Returns the complete absolute path to the downloaded archive.
-
- :param requirements: The same as the find attribute of `find`.
-
- You can specify prefer_final argument here. If not, the default
- one will be used.
- """
- return self.get(requirements, prefer_final)\
- .download(prefer_source=prefer_source, path=temp_path)
+ def __init__(self, prefer_final, prefer_source):
+ self._prefer_final = prefer_final
+ self._prefer_source = prefer_source
def _get_version_predicate(self, requirements):
"""Return a VersionPredicate object, from a string or an already
@@ -80,9 +18,37 @@
return requirements
def _get_prefer_final(self, prefer_final=None):
- """Return the prefer_final bit parameter or the specified one if
- exists."""
+ """Return the prefer_final internal parameter or the specified one if
+ provided"""
if prefer_final:
return prefer_final
else:
return self._prefer_final
+
+ def _get_prefer_source(self, prefer_source=None):
+ """Return the prefer_source internal parameter or the specified one if
+ provided"""
+ if prefer_source:
+ return prefer_source
+ else:
+ return self._prefer_source
+
+ def _get_project(self, project_name):
+ """Return a project instance, creating it if necessary"""
+ return self._projects.setdefault(project_name,
+ ReleasesList(project_name))
+
+ def download_distribution(self, requirements, temp_path=None,
+ prefer_source=None, prefer_final=None):
+ """Download a distribution from the last release according to the
+ requirements.
+
+ If temp_path is provided, download to this path; otherwise, create a
+ temporary location for the download and return it.
+ """
+ prefer_final = self._get_prefer_final(prefer_final)
+ prefer_source = self._get_prefer_source(prefer_source)
+ release = self.get_release(requirements, prefer_final)
+ if release:
+ dist = release.get_distribution(prefer_source=prefer_source)
+ return dist.download(temp_path)
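The `_get_prefer_final`/`_get_prefer_source` helpers added to `BaseClient` implement a per-call-override pattern: a keyword argument wins when given, otherwise the instance default applies. A small sketch of the pattern, with one caveat worth noting: a plain truth test like the committed `if prefer_final:` cannot distinguish an explicit `False` from "not given", whereas comparing against `None` can:

```python
class BaseClient:
    """Sketch of the per-call-override pattern used by the index clients."""

    def __init__(self, prefer_final=False, prefer_source=True):
        self._prefer_final = prefer_final
        self._prefer_source = prefer_source

    def _get_prefer_final(self, prefer_final=None):
        # Compare against None so an explicit False still overrides the
        # instance default (a bare `if prefer_final:` would silently
        # fall back to the default instead).
        if prefer_final is None:
            return self._prefer_final
        return prefer_final

client = BaseClient(prefer_final=True)
print(client._get_prefer_final())       # → True (instance default)
print(client._get_prefer_final(False))  # → False (explicit argument wins)
```

The same reasoning applies to `_get_prefer_source`.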
diff --git a/src/distutils2/index/dist.py b/src/distutils2/index/dist.py
--- a/src/distutils2/index/dist.py
+++ b/src/distutils2/index/dist.py
@@ -53,17 +53,17 @@
self.metadata = DistributionMetadata(mapping=metadata)
self.dists = {}
self.hidden = hidden
-
+
if 'dist_type' in kwargs:
dist_type = kwargs.pop('dist_type')
self.add_distribution(dist_type, **kwargs)
-
+
def set_version(self, version):
try:
self._version = NormalizedVersion(version)
except IrrationalVersionError:
suggestion = suggest_normalized_version(version)
- if suggestion:
+ if suggestion:
self.version = suggestion
else:
raise IrrationalVersionError(version)
@@ -94,7 +94,7 @@
self.dists[dist_type].add_url(**params)
else:
self.dists[dist_type] = DistInfo(self, dist_type, **params)
-
+
def get_distribution(self, dist_type=None, prefer_source=True):
"""Return a distribution.
@@ -144,7 +144,7 @@
% (self.name, other.name))
def __repr__(self):
- return "<%s %s>" %(self.name, self.version)
+ return "<%s %s>" % (self.name, self.version)
def __eq__(self, other):
self._check_is_comparable(other)
@@ -307,7 +307,7 @@
def add_releases(self, releases):
"""Add releases in the release list.
-
+
:param: releases is a list of ReleaseInfo objects.
"""
for r in releases:
@@ -367,7 +367,7 @@
super(ReleasesList, self).sort(
key=lambda i: [getattr(i, arg) for arg in sort_by],
reverse=reverse, *args, **kwargs)
-
+
def get_release(self, version):
"""Return a release from it's version.
"""
@@ -384,7 +384,7 @@
string = 'Project "%s"' % self.name
if self.get_versions():
string += ' versions: %s' % ', '.join(self.get_versions())
- return '<%s>' % string
+ return '<%s>' % string
def get_infos_from_url(url, probable_dist_name=None, is_external=True):
@@ -454,7 +454,7 @@
version = archive_name.lstrip(name)
else:
name, version = eager_split(archive_name)
-
+
version = suggest_normalized_version(version)
if version is not None and name != "":
return (name.lower(), version)
diff --git a/src/distutils2/index/errors.py b/src/distutils2/index/errors.py
--- a/src/distutils2/index/errors.py
+++ b/src/distutils2/index/errors.py
@@ -5,23 +5,27 @@
from distutils2.errors import DistutilsError
-class IndexError(DistutilsError):
+class IndexesError(DistutilsError):
"""The base class for errors of the index python package."""
-class ProjectNotFound(IndexError):
+class ProjectNotFound(IndexesError):
"""Project has not been found"""
-class DistributionNotFound(IndexError):
- """No distribution match the given requirements."""
+class DistributionNotFound(IndexesError):
+ """The distribution has not been found"""
-class CantParseArchiveName(IndexError):
+class ReleaseNotFound(IndexesError):
+ """The release has not been found"""
+
+
+class CantParseArchiveName(IndexesError):
"""An archive name can't be parsed to find distribution name and version"""
-class DownloadError(IndexError):
+class DownloadError(IndexesError):
"""An error has occurred while downloading"""
@@ -29,13 +33,13 @@
"""Compared hashes do not match"""
-class UnsupportedHashName(IndexError):
+class UnsupportedHashName(IndexesError):
"""An unsupported hashname has been used"""
-class UnableToDownload(IndexError):
+class UnableToDownload(IndexesError):
"""All mirrors have been tried, without success"""
-class InvalidSearchField(IndexError):
+class InvalidSearchField(IndexesError):
"""An invalid search field has been used"""
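Renaming the base exception from `IndexError` to `IndexesError` in this file has a practical benefit beyond naming: the old name shadowed Python's builtin `IndexError`, so a broad `except IndexError:` in client code could accidentally swallow sequence-indexing bugs. A sketch of the new hierarchy (the `lookup` helper is hypothetical, for illustration only):

```python
class IndexesError(Exception):
    """Base class for errors of the index package (sketch of the renamed class)."""

class ProjectNotFound(IndexesError):
    """Project has not been found"""

class ReleaseNotFound(IndexesError):
    """The release has not been found"""

def lookup(project, known=("foobar",)):
    # Hypothetical helper: raise the specific subclass on failure.
    if project not in known:
        raise ProjectNotFound(project)
    return project

# Catching the base class handles any index-related failure without
# touching the builtin IndexError, which the old name used to shadow.
try:
    lookup("grail")
except IndexesError as exc:
    print(type(exc).__name__, exc)  # → ProjectNotFound grail
```

Since `IndexesError` is unrelated to the builtin, `[][0]` raising `IndexError` would now correctly propagate past such a handler.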
diff --git a/src/distutils2/index/simple.py b/src/distutils2/index/simple.py
--- a/src/distutils2/index/simple.py
+++ b/src/distutils2/index/simple.py
@@ -13,11 +13,12 @@
import urlparse
import logging
-from distutils2.index.base import IndexClient
+from distutils2.index.base import BaseClient
from distutils2.index.dist import (ReleasesList, EXTENSIONS,
get_infos_from_url, MD5_HASH)
-from distutils2.index.errors import (IndexError, DownloadError,
- UnableToDownload, CantParseArchiveName)
+from distutils2.index.errors import (IndexesError, DownloadError,
+ UnableToDownload, CantParseArchiveName,
+ ReleaseNotFound)
from distutils2.index.mirrors import get_mirrors
from distutils2 import __version__ as __distutils2_version__
@@ -55,6 +56,7 @@
return _socket_timeout
return _socket_timeout
+
def with_mirror_support():
"""Decorator that makes the mirroring support easier"""
def wrapper(func):
@@ -67,21 +69,27 @@
try:
self._switch_to_next_mirror()
except KeyError:
- raise UnableToDownload("Tried all mirrors")
+ raise UnableToDownload("Tried all mirrors")
else:
self._mirrors_tries += 1
- self._releases.clear()
+ self._projects.clear()
return wrapped(self, *args, **kwargs)
return wrapped
return wrapper
-class Crawler(IndexClient):
+
+class Crawler(BaseClient):
"""Provides useful tools to request the Python Package Index simple API.
You can specify both mirrors and mirrors_url, but mirrors_url will only be
used if mirrors is set to None.
:param index_url: the url of the simple index to search on.
+ :param prefer_final: if the version is not mentioned, and the last
+ version is not a "final" one (alpha, beta, etc.),
+ pick up the last final version.
+ :param prefer_source: if the distribution type is not mentioned, pick up
+ the source one if available.
:param follow_externals: tell if following external links is needed or
not. Default is False.
:param hosts: a list of hosts allowed to be processed while using
@@ -89,9 +97,6 @@
hosts.
:param follow_externals: tell if following external links is needed or
not. Default is False.
- :param prefer_final: if the version is not mentioned, and the last
- version is not a "final" one (alpha, beta, etc.),
- pick up the last final version.
:param mirrors_url: the url to look on for DNS records giving mirror
addresses.
:param mirrors: a list of mirrors (see PEP 381).
@@ -100,12 +105,13 @@
on mirrors before switching.
"""
- def __init__(self, index_url=DEFAULT_INDEX_URL, hosts=DEFAULT_HOSTS,
- follow_externals=False, prefer_final=False,
- mirrors_url=None, mirrors=None,
+ def __init__(self, index_url=DEFAULT_INDEX_URL, prefer_final=False,
+ prefer_source=True, hosts=DEFAULT_HOSTS,
+ follow_externals=False, mirrors_url=None, mirrors=None,
timeout=SOCKET_TIMEOUT, mirrors_max_tries=0):
+ super(Crawler, self).__init__(prefer_final, prefer_source)
self.follow_externals = follow_externals
-
+
# mirroring attributes.
if not index_url.endswith("/"):
index_url += "/"
@@ -114,12 +120,10 @@
mirrors = get_mirrors(mirrors_url)
self._mirrors = set(mirrors)
self._mirrors_used = set()
- self.index_url = index_url
+ self.index_url = index_url
self._mirrors_max_tries = mirrors_max_tries
self._mirrors_tries = 0
-
self._timeout = timeout
- self._prefer_final = prefer_final
# create a regexp to match all given hosts
self._allowed_hosts = re.compile('|'.join(map(translate, hosts))).match
@@ -128,34 +132,44 @@
# scanning them multiple times (e.g. if there are multiple pages pointing
# to one)
self._processed_urls = []
- self._releases = {}
-
+ self._projects = {}
+
@with_mirror_support()
- def search(self, name=None, **kwargs):
- """Search the index for projects containing the given name"""
+ def search_projects(self, name=None, **kwargs):
+ """Search the index for projects containing the given name.
+
+ Return a list of names.
+ """
index = self._open_url(self.index_url)
-
projectname = re.compile("""<a[^>]*>(.?[^<]*%s.?[^<]*)</a>""" % name,
flags=re.I)
matching_projects = []
for match in projectname.finditer(index.read()):
- matching_projects.append(match.group(1))
-
+ matching_projects.append(match.group(1))
return matching_projects
- def _search_for_releases(self, requirements):
- """Search for distributions and return a ReleaseList object containing
- the results
+ def get_releases(self, requirements, prefer_final=None):
+ """Search for releases and return a ReleaseList object containing
+ the results.
"""
- # process the index page for the project name, searching for
- # distributions.
- self._process_index_page(requirements.name)
- return self._releases.setdefault(requirements.name,
- ReleasesList(requirements.name))
-
- def _get_release(self, requirements, prefer_final):
+ predicate = self._get_version_predicate(requirements)
+ prefer_final = self._get_prefer_final(prefer_final)
+ self._process_index_page(predicate.name)
+ releases = self._projects.setdefault(predicate.name,
+ ReleasesList(predicate.name))
+ if releases:
+ releases = releases.filter(predicate)
+ releases.sort_releases(prefer_final=prefer_final)
+ return releases
+
+ def get_release(self, requirements, prefer_final=None):
"""Return only one release that fulfills the given requirements"""
- return self.find(requirements, prefer_final).get_last(requirements)
+ predicate = self._get_version_predicate(requirements)
+ release = self.get_releases(predicate, prefer_final)\
+ .get_last(predicate)
+ if not release:
+ raise ReleaseNotFound("No release matches the given criteria")
+ return release
def _switch_to_next_mirror(self):
"""Switch to the next mirror (eg. point self.index_url to the next
@@ -216,18 +230,18 @@
name = release.name
else:
name = release_info['name']
- if not name in self._releases:
- self._releases[name] = ReleasesList(name)
+ if not name in self._projects:
+ self._projects[name] = ReleasesList(name)
if release:
- self._releases[name].add_release(release=release)
+ self._projects[name].add_release(release=release)
else:
name = release_info.pop('name')
version = release_info.pop('version')
dist_type = release_info.pop('dist_type')
- self._releases[name].add_release(version, dist_type,
+ self._projects[name].add_release(version, dist_type,
**release_info)
- return self._releases[name]
+ return self._projects[name]
def _process_url(self, url, project_name=None, follow_links=True):
"""Process a URL and search for distribution packages.
@@ -256,7 +270,7 @@
infos = get_infos_from_url(link, project_name,
is_external=not self.index_url in url)
except CantParseArchiveName, e:
- logging.warning("version has not been parsed: %s"
+ logging.warning("version has not been parsed: %s"
% e)
else:
self._register_release(release_info=infos)
@@ -324,7 +338,7 @@
"""
scheme, netloc, path, params, query, frag = urlparse.urlparse(url)
-
+
# authentication stuff
if scheme in ('http', 'https'):
auth, host = urllib2.splituser(netloc)
@@ -335,7 +349,7 @@
if scheme == 'file':
if url.endswith('/'):
url += "index.html"
-
+
# add authorization headers if auth is provided
if auth:
auth = "Basic " + \
@@ -351,7 +365,7 @@
fp = urllib2.urlopen(request)
except (ValueError, httplib.InvalidURL), v:
msg = ' '.join([str(arg) for arg in v.args])
- raise IndexError('%s %s' % (url, msg))
+ raise IndexesError('%s %s' % (url, msg))
except urllib2.HTTPError, v:
return v
except urllib2.URLError, v:
@@ -372,7 +386,6 @@
if s2 == scheme and h2 == host:
fp.url = urlparse.urlunparse(
(s2, netloc, path2, param2, query2, frag2))
-
return fp
def _decode_entity(self, match):
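The `with_mirror_support` decorator touched above retries the wrapped call against the next mirror and raises `UnableToDownload` once every mirror has been tried. A simplified sketch of that retry loop; the real decorator is a factory (`with_mirror_support()`) with a tries counter and cache clearing, and the client class here is a hypothetical stand-in:

```python
class UnableToDownload(Exception):
    """All mirrors have been tried, without success"""

def with_mirror_support(func):
    """Retry `func` on the next mirror whenever it raises ConnectionError (sketch)."""
    def wrapped(self, *args, **kwargs):
        try:
            return func(self, *args, **kwargs)
        except ConnectionError:
            if not self.mirrors:
                raise UnableToDownload("Tried all mirrors")
            # Point the client at the next mirror and try again.
            self.index_url = self.mirrors.pop(0)
            return wrapped(self, *args, **kwargs)
    return wrapped

class Client:
    def __init__(self, index_url, mirrors):
        self.index_url = index_url
        self.mirrors = list(mirrors)

    @with_mirror_support
    def fetch(self):
        # Pretend only the second mirror answers.
        if self.index_url != "http://second.mirror/":
            raise ConnectionError(self.index_url)
        return "ok from %s" % self.index_url

client = Client("http://broken.index/",
                ["http://first.mirror/", "http://second.mirror/"])
print(client.fetch())  # → ok from http://second.mirror/
```

Clearing the release cache on mirror switch, as the real code does with `self._projects.clear()`, matters because partial results fetched from a dead mirror could otherwise be mixed with results from the next one.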
diff --git a/src/distutils2/index/xmlrpc.py b/src/distutils2/index/xmlrpc.py
--- a/src/distutils2/index/xmlrpc.py
+++ b/src/distutils2/index/xmlrpc.py
@@ -2,9 +2,9 @@
import xmlrpclib
from distutils2.errors import IrrationalVersionError
-from distutils2.index.base import IndexClient
+from distutils2.index.base import BaseClient
from distutils2.index.errors import ProjectNotFound, InvalidSearchField
-from distutils2.index.dist import ReleaseInfo, ReleasesList
+from distutils2.index.dist import ReleaseInfo
PYPI_XML_RPC_URL = 'http://python.org/pypi'
@@ -14,9 +14,9 @@
'description', 'keywords', 'platform', 'download_url']
-class Client(IndexClient):
+class Client(BaseClient):
"""Client to query indexes using XML-RPC method calls.
-
+
If no server_url is specified, use the default PyPI XML-RPC URL,
defined in the PYPI_XML_RPC_URL constant::
@@ -29,43 +29,26 @@
'http://someurl/'
"""
- def __init__(self, server_url=PYPI_XML_RPC_URL, prefer_final=False):
+ def __init__(self, server_url=PYPI_XML_RPC_URL, prefer_final=False,
+ prefer_source=True):
+ super(Client, self).__init__(prefer_final, prefer_source)
self.server_url = server_url
self._projects = {}
- self._prefer_final = prefer_final
- def _search_for_releases(self, requirements):
- return self.get_releases(requirements.name)
-
- def _get_release(self, requirements, prefer_final=False):
- releases = self.get_releases(requirements.name)
- release = releases.get_last(requirements, prefer_final)
+ def get_release(self, requirements, prefer_final=False):
+ """Return a release with complete metadata and distribution-related
+ information.
+ """
+ prefer_final = self._get_prefer_final(prefer_final)
+ predicate = self._get_version_predicate(requirements)
+ releases = self.get_releases(predicate.name)
+ release = releases.get_last(predicate, prefer_final)
self.get_metadata(release.name, "%s" % release.version)
self.get_distributions(release.name, "%s" % release.version)
return release
- @property
- def proxy(self):
- """Property used to return the XMLRPC server proxy.
-
- If no server proxy is defined yet, creates a new one::
-
- >>> client = XmlRpcClient()
- >>> client.proxy()
- <ServerProxy for python.org/pypi>
-
- """
- if not hasattr(self, '_server_proxy'):
- self._server_proxy = xmlrpclib.ServerProxy(self.server_url)
-
- return self._server_proxy
-
- def _get_project(self, project_name):
- """Return an project instance, create it if necessary"""
- return self._projects.setdefault(project_name,
- ReleasesList(project_name))
-
- def get_releases(self, project_name, show_hidden=True, force_update=False):
+ def get_releases(self, requirements, prefer_final=None, show_hidden=True,
+ force_update=False):
"""Return the list of existing releases for a specific project.
Cache the results from one call to another.
@@ -88,6 +71,9 @@
def get_versions(project_name, show_hidden):
return self.proxy.package_releases(project_name, show_hidden)
+ predicate = self._get_version_predicate(requirements)
+ prefer_final = self._get_prefer_final(prefer_final)
+ project_name = predicate.name
if not force_update and (project_name in self._projects):
project = self._projects[project_name]
if not project.contains_hidden and show_hidden:
@@ -100,15 +86,17 @@
for version in hidden_versions:
project.add_release(release=ReleaseInfo(project_name,
version))
- return project
else:
versions = get_versions(project_name, show_hidden)
if not versions:
raise ProjectNotFound(project_name)
project = self._get_project(project_name)
- project.add_releases([ReleaseInfo(project_name, version)
+ project.add_releases([ReleaseInfo(project_name, version)
for version in versions])
- return project
+ project = project.filter(predicate)
+ project.sort_releases(prefer_final)
+ return project
+
def get_distributions(self, project_name, version):
"""Grab information about distributions from XML-RPC.
@@ -143,9 +131,9 @@
release.set_metadata(metadata)
return release
- def search(self, name=None, operator="or", **kwargs):
+ def search_projects(self, name=None, operator="or", **kwargs):
"""Find using the keys provided in kwargs.
-
+
You can set operator to "and" or "or".
"""
for key in kwargs:
@@ -157,9 +145,25 @@
for p in projects:
project = self._get_project(p['name'])
try:
- project.add_release(release=ReleaseInfo(p['name'],
- p['version'], metadata={'summary':p['summary']}))
+ project.add_release(release=ReleaseInfo(p['name'],
+ p['version'], metadata={'summary': p['summary']}))
except IrrationalVersionError, e:
logging.warn("Irrational version error found: %s" % e)
-
+
return [self._projects[p['name']] for p in projects]
+
+ @property
+ def proxy(self):
+ """Property used to return the XMLRPC server proxy.
+
+ If no server proxy is defined yet, creates a new one::
+
+ >>> client = XmlRpcClient()
+ >>> client.proxy()
+ <ServerProxy for python.org/pypi>
+
+ """
+ if not hasattr(self, '_server_proxy'):
+ self._server_proxy = xmlrpclib.ServerProxy(self.server_url)
+
+ return self._server_proxy
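The `proxy` property moved to the end of `xmlrpc.py` is a lazy-attribute idiom: the `ServerProxy` is built on first access and cached for every later one. A self-contained sketch of the idiom, with a plain dictionary standing in for `xmlrpclib.ServerProxy` so no network object is created:

```python
class Client:
    """Sketch of the lazily-created proxy in distutils2.index.xmlrpc.Client."""

    def __init__(self, server_url):
        self.server_url = server_url

    @property
    def proxy(self):
        # Build the expensive object only once, on first access.
        if not hasattr(self, "_server_proxy"):
            # Stand-in for xmlrpclib.ServerProxy(self.server_url);
            # a placeholder object is enough to show the caching.
            self._server_proxy = {"url": self.server_url}
        return self._server_proxy

client = Client("http://python.org/pypi")
first = client.proxy
second = client.proxy
print(first is second)  # → True: the same object is reused
```

Deferring construction this way means merely instantiating the client costs nothing; the proxy exists only if an XML-RPC call is actually made.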
diff --git a/src/distutils2/tests/test_index_simple.py b/src/distutils2/tests/test_index_simple.py
--- a/src/distutils2/tests/test_index_simple.py
+++ b/src/distutils2/tests/test_index_simple.py
@@ -85,14 +85,14 @@
# Browse the index, asking for a specified release version
# The PyPI index contains links for version 1.0, 1.1, 2.0 and 2.0.1
crawler = self._get_simple_crawler(server)
- last_release = crawler.get("foobar")
+ last_release = crawler.get_release("foobar")
# we have scanned the index page
self.assertIn(server.full_address + "/simple/foobar/",
crawler._processed_urls)
# we have found 4 releases in this page
- self.assertEqual(len(crawler._releases["foobar"]), 4)
+ self.assertEqual(len(crawler._projects["foobar"]), 4)
# and returned the most recent one
self.assertEqual("%s" % last_release.version, '2.0.1')
@@ -140,7 +140,7 @@
# Try to request the package index, which contains links to "externals"
# resources. They have to be scanned too.
crawler = self._get_simple_crawler(server, follow_externals=True)
- crawler.get("foobar")
+ crawler.get_release("foobar")
self.assertIn(server.full_address + "/external/external.html",
crawler._processed_urls)
@@ -150,7 +150,7 @@
# Test that telling the simple PyPI client not to retrieve externals
# works
crawler = self._get_simple_crawler(server, follow_externals=False)
- crawler.get("foobar")
+ crawler.get_release("foobar")
self.assertNotIn(server.full_address + "/external/external.html",
crawler._processed_urls)
@@ -175,7 +175,7 @@
# scan a test index
crawler = Crawler(index_url, follow_externals=True)
- releases = crawler.find("foobar")
+ releases = crawler.get_releases("foobar")
server.stop()
# we have only one link, because links are compared without md5
@@ -196,7 +196,7 @@
# process the pages
crawler = self._get_simple_crawler(server, follow_externals=True)
- crawler.find("foobar")
+ crawler.get_releases("foobar")
# now it should have processed only pages with links rel="download"
# and rel="homepage"
self.assertIn("%s/simple/foobar/" % server.full_address,
@@ -225,7 +225,7 @@
mirrors=[mirror.full_address, ])
# this should not raise a timeout
- self.assertEqual(4, len(crawler.find("foo")))
+ self.assertEqual(4, len(crawler.get_releases("foo")))
finally:
mirror.stop()
@@ -274,7 +274,7 @@
index_path = os.sep.join(["file://" + PYPI_DEFAULT_STATIC_PATH,
"test_found_links", "simple"])
crawler = Crawler(index_path)
- dists = crawler.find("foobar")
+ dists = crawler.get_releases("foobar")
self.assertEqual(4, len(dists))
def test_get_link_matcher(self):
@@ -301,11 +301,11 @@
self.assertIn('http://example.org/some/download', found_links)
@use_pypi_server("project_list")
- def test_search(self, server):
+ def test_search_projects(self, server):
# we can search the index for some projects, on their names
# the case used does not matter here
crawler = self._get_simple_crawler(server)
- projects = crawler.search("Foobar")
+ projects = crawler.search_projects("Foobar")
self.assertListEqual(['FooBar-bar', 'Foobar-baz', 'Baz-FooBar'],
projects)
diff --git a/src/distutils2/tests/test_index_xmlrpc.py b/src/distutils2/tests/test_index_xmlrpc.py
--- a/src/distutils2/tests/test_index_xmlrpc.py
+++ b/src/distutils2/tests/test_index_xmlrpc.py
@@ -11,22 +11,22 @@
return Client(server.full_address, *args, **kwargs)
@use_xmlrpc_server()
- def test_search(self, server):
- # test that the find method return a list of ReleasesList
+ def test_search_projects(self, server):
client = self._get_client(server)
server.xmlrpc.set_search_result(['FooBar', 'Foo', 'FooFoo'])
- results = [r.name for r in client.search(name='Foo')]
+ results = [r.name for r in client.search_projects(name='Foo')]
self.assertEqual(3, len(results))
self.assertIn('FooBar', results)
self.assertIn('Foo', results)
self.assertIn('FooFoo', results)
- def test_search_bad_fields(self):
+ def test_search_projects_bad_fields(self):
client = Client()
- self.assertRaises(InvalidSearchField, client.search, invalid="test")
+ self.assertRaises(InvalidSearchField, client.search_projects,
+ invalid="test")
@use_xmlrpc_server()
- def test_find(self, server):
+ def test_get_releases(self, server):
client = self._get_client(server)
server.xmlrpc.set_distributions([
{'name': 'FooBar', 'version': '1.1'},
@@ -37,7 +37,7 @@
# use a lambda here to avoid a useless mock call
server.xmlrpc.list_releases = lambda *a, **k: ['1.1', '1.2', '1.3']
- releases = client.find('FooBar (<=1.2)')
+ releases = client.get_releases('FooBar (<=1.2)')
# don't call release_data and release_url; just return name and version.
self.assertEqual(2, len(releases))
versions = releases.get_versions()
@@ -45,30 +45,7 @@
self.assertIn('1.2', versions)
self.assertNotIn('1.3', versions)
- self.assertRaises(ProjectNotFound, client.find,'Foo')
-
- @use_xmlrpc_server()
- def test_get_releases(self, server):
- client = self._get_client(server)
- server.xmlrpc.set_distributions([
- {'name':'FooBar', 'version': '0.8', 'hidden': True},
- {'name':'FooBar', 'version': '0.9', 'hidden': True},
- {'name':'FooBar', 'version': '1.1'},
- {'name':'FooBar', 'version': '1.2'},
- ])
- releases = client.get_releases('FooBar', False)
- versions = releases.get_versions()
- self.assertEqual(2, len(versions))
- self.assertIn('1.1', versions)
- self.assertIn('1.2', versions)
-
- releases2 = client.get_releases('FooBar', True)
- versions = releases2.get_versions()
- self.assertEqual(4, len(versions))
- self.assertIn('0.8', versions)
- self.assertIn('0.9', versions)
- self.assertIn('1.1', versions)
- self.assertIn('1.2', versions)
+ self.assertRaises(ProjectNotFound, client.get_releases, 'Foo')
@use_xmlrpc_server()
def test_get_distributions(self, server):
--
Repository URL: http://hg.python.org/distutils2
More information about the Python-checkins
mailing list