[Catalog-sig] RubyGems Threat Model and Requirements

Nick Coghlan ncoghlan at gmail.com
Wed Feb 13 15:21:10 CET 2013


On Wed, Feb 13, 2013 at 7:58 PM, Giovanni Bajo <rasky at develer.com> wrote:
> On 13 Feb 2013, at 04:31, Nick Coghlan <ncoghlan at gmail.com> wrote:
>> TUF's target delegation is thus in direct competition to the "trusted
>> keys" file in your design. TUF specifically aims to take care of the
>> "online key needed" problem, by confining the online key role to the
>> generation of the timestamp file, with offline keys used to sign the
>> regenerated metadata when a target delegation changes.
>
> Does this mean that adding a package to PyPI, adding a maintainer to a package, removing a maintainer from a package, etc. all require an offline operation in this model?

If I'm reading the spec correctly, you can decide what level of
consequence you want if an attacker compromises the master server.
What TUF appears to allow (but does not require) is for the access
levels to be split as follows:

Distribution server: can update the timestamp file, but not upload new
releases or add/remove project keys. Only needs the timestamp role key.

Upload server: can add/remove files within target subtrees that have
been delegated appropriately to user keys. Needs the release role key
in order to publish the resulting metadata updates.

Management server: can add/remove target delegations (e.g. to create
new projects, or add new maintainers). Needs the target role key in
order to update the target delegations and the release role key in
order to publish them.

Offline: the root key is the one that the clients actually trust, and
it's the one used to identify the target, release and timestamp keys
on the distribution server. If *this* gets compromised, you're pretty
much screwed until the clients are updated to blacklist it. The only
time you should need this key is when you need to change the server
keys (e.g. following a server compromise).
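
To make that concrete, here's a rough sketch (just an illustrative
Python dict, not the actual TUF metadata format, with placeholder key
ids) of how the root metadata could pin a key to each role under that
kind of split:

    # Rough sketch only - not the real TUF metadata format, and the key
    # ids are placeholders. The point is just which machine holds the
    # private key for which role.
    root_roles = {
        "timestamp": {"keyids": ["TIMESTAMP-KEYID"], "threshold": 1},  # distribution server (online)
        "release":   {"keyids": ["RELEASE-KEYID"], "threshold": 1},    # upload server
        "targets":   {"keyids": ["TARGETS-KEYID"], "threshold": 1},    # management server
        "root":      {"keyids": ["ROOT-KEYID"], "threshold": 1},       # kept offline
    }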

Since PyPI currently uses a single server for management, upload and
distribution, it probably doesn't make sense to separate out all four
roles (at least initially). An offline root key and a single master
key used to sign the online metadata are probably the best we can do
within the current PyPI architecture (and perhaps ever, given its
nature as an almost completely open server).

By contrast, if you were only publishing releases from a private build
service, then the *only* internet facing machine would be your
distribution server, and thus the only directly exposed key would be
the timestamp key. You could then lock down your upload and management
functions as much as you wished.

(As near as I can tell, the advice in section 6.1 of the TUF spec to
keep all keys other than the timestamp key offline is just plain wrong
for a system with PyPI's current architecture, where the same public
web service that distributes the software also needs to let people
create new projects and update their keys. If I'm wrong, maybe Justin
will be able to explain how at some point...).

> I wouldn't oppose it. In fact, I was just scared that such a model would not be accepted as it would break too much flexibility  (within my design, the equivalent would be having the trust file signed by a master PyPI GPG key, whose fingerprint is hardcoded in pip, and it is kept offline; or it might even be online, but on a separate signing server, that eg. logs everything it does by email to a public mailing list, like "Adding this fingerprint to django").

Yep, I think you've nailed the equivalence there. The reason I still
favour the TUF model at this stage is that it lets us start with a
simple single-online-key model with target delegation directly to
projects (i.e. something functionally equivalent to your proposal,
with TUF's "releases.txt" and the target delegation files together
fulfilling the role of your trust file), while still leaving the door
open to various future improvements like:

1. separating out PyPI's management/upload server from the
distribution server (with the latter only having access to a timestamp
key)
2. permitting delegation to umbrella projects/organisations rather
than directly to individual projects
3. migrating to the "custom" fields in the TUF file formats for
metadata publication
4. using TUF's threshold mechanism to allow projects to opt in to a
higher security configuration where multiple signatures are needed for
a release to be trusted

We may never do any of those things, but they're all good options to
have open to us (in particular, metadata publication through TUF is
something you can scale with a CDN, and inherently comes with markers
to indicate what has changed since you last looked at the index. You
can't easily do that with XML-RPC calls or the current PyPI simple
project listing).
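
To make the per-project delegation idea a bit more concrete, here's a
rough sketch (simplified relative to TUF's actual targets metadata,
with invented key ids and paths) of handing one project's files over
to a key held by that project's maintainer:

    # Rough sketch only - simplified, with made-up key ids and paths.
    # The top-level targets role delegates responsibility for files
    # matching "paths" to the listed maintainer key(s).
    django_delegation = {
        "name": "Django",
        "keyids": ["DJANGO-MAINTAINER-KEYID"],
        "threshold": 1,
        "paths": ["packages/source/D/Django/*"],
    }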

For now, though, we would probably start off with
release/target/timestamp roles sharing a key, all threshold values set
to 1, and just doing simple project-based target delegation to user
keys. Given the existing GPG infrastructure, I'm also inclined to
stick with GPG-based keys and work with the TUF folks to define that
format in their spec. We may also need to leave the protection against
replay attacks off by default, due to the problem with incorrect
clocks noted at the end of the TUF spec.
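
On that last point, the replay protection essentially comes down to
the client refusing metadata whose expiry time has passed according to
its local clock, so making it opt-in could look something like this
(illustrative sketch only, not actual pip or TUF client code):

    import time

    # Illustrative sketch only: replay/freeze protection boils down to
    # rejecting signed metadata once its expiry time has passed, which
    # misfires for users whose clocks are badly wrong - hence leaving
    # the check off by default to start with.
    CHECK_METADATA_EXPIRY = False

    def metadata_is_fresh(expires_at, now=None):
        """Return True if signed metadata should still be trusted."""
        if not CHECK_METADATA_EXPIRY:
            return True
        now = time.time() if now is None else now
        return now < expires_at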

> Does TUF have also a way to let end users not trust PyPI, that is what is obtained by manually hand-editing/tuning the trust file in my design?

Yes, that's controlled by which root keys the client is configured to
trust. Swap the PyPI root key in the client for your own, publish a
custom set of metadata, and away you go.
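
As a sketch of what that might look like on the client side (these
names are purely illustrative, not an existing pip setting):

    # Illustrative sketch only: which root key the installer trusts, and
    # where it fetches metadata from, is purely local configuration, so
    # pointing both at your own infrastructure takes PyPI out of the
    # trust chain entirely.
    TRUSTED_ROOT_KEYS = {
        # "pypi": "PYPI-ROOT-PUBKEY",  # default entry, dropped here
        "internal-mirror": "YOUR-OWN-ROOT-PUBKEY",
    }
    METADATA_BASE_URL = "https://mirror.example.org/tuf/"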

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
