obfuscating python code for distribution

geremy condra debatem1 at gmail.com
Thu May 19 13:23:47 EDT 2011


On Wed, May 18, 2011 at 10:21 PM, Hans Georg Schaathun <hg at schaathun.net> wrote:
> On Wed, 18 May 2011 14:34:46 -0700, geremy condra
>  <debatem1 at gmail.com> wrote:
> :  Systems can be designed that are absolutely secure under reasonable
> :  assumptions. The fact that it has assumptions does not make your
> :  statement true.
> : (...)
> :  I can't tell if you're trying to play word games with the distinction
> :  between "system" and "module" or if you're just saying that you aren't
> :  sure what FIPS actually certifies. Could you please clarify?
>
> The distinction between system and module is rather significant.
> If you only consider modules, you have bounded your problem and
> drastically limited the complexity.

Ah, the 'word games' option. I'm not going to spend a lot of time
arguing this one: HSMs are clearly the domain of systems research, are
referred to in both technical and nontechnical documents as 'keystone
systems', and the FIPS standard under which they are certified
specifically calls them systems more times than I care to count. They
are, to the people who make and use them, systems, and your attempt at
redefinition won't change that.

> :  Are you talking about the Mayfair classical cipher here?
>
> I am talking about the system used in public transport cards like
> Oyster and Octopus.  I am not sure how classical it is, or whether
> mayfair/mayfare referred to the system or just a cipher.  Any way,
> it was broken, and it took years.

Ah, MIFARE. That's a different story, and no, I don't believe they
would have been broken sooner if the specs had been released. The
importance (and difficulty) of securing devices like smartcards wasn't
really recognized until much later, and certainly people with a foot
in both worlds were very rare for a long time. Also remember that DES
(with its 56-bit keys) was recertified just a few months before MIFARE
(with its 48-bit keys) was first released; it was a different world.
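
For a rough sense of scale (back-of-the-envelope arithmetic only, and
assuming nothing cleverer than brute force against either cipher), the
gap between those key lengths is easy to check in Python:

    # illustrative keyspace comparison, not an attack model
    des_keys = 2 ** 56       # DES effective key length
    mifare_keys = 2 ** 48    # MIFARE Classic (Crypto-1) key length
    print(des_keys // mifare_keys)   # -> 256

so even DES's keyspace, already brute-forceable by the late 90s, is 256
times larger than MIFARE's.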

> :  The entire field of formal modeling and verification has grown around
> :  solving this problem. My new favorite in the field is "formal models
> :  and techniques for analyzing security protocols", but there are other
> :  works discussing OS kernel verification (which has gotten a lot of
> :  attention lately) and tons of academic literature. Google (scholar) is
> :  the place to go.
>
> Sure, but now you are considering modules, rather than systems again.
> It is when these reliable components are put together to form systems
> that people fail (empirically).

Let me get this straight: your argument is that operating *systems*
aren't systems?

> :  If you can't say with confidence that something meets minimum security
> :  standards, the answer is not to try to say it meets high security
> :  standards.
>
> So what?  The levels of assurance have nothing to do with standards.
> The levels of assurance refer to the /confidence/ you can have that
> the standards are met.

The increasing levels of assurance don't just signify that you've
checked for problems; they certify that you don't have them, at least
insofar as that level of testing is able to find them. Insisting that this
doesn't, or shouldn't, translate into tighter security doesn't make
much sense.

Geremy Condra


