what are the most popular building and packaging tools for python ??
kosh
kosh at aesaeion.com
Tue Oct 26 03:33:26 EDT 2004
On Monday 25 October 2004 5:44 pm, Bengt Richter wrote:
> It's an interesting problem. Personally, I like open source, but I think
> secret stuff should be possible, and I think it will be one of these
> days...
>
I have to admit I hope it is not ever really possible.
> By analogy, I don't think it's a stretch to imagine a CPU core with a
> "secure kitchen" in the form of an inaccessible instruction cache memory
> where decrypted (by the core hardware) code/recipes may be stored for
> execution, and with private temporary data cooking areas for use before
> results come out the window.
I certainly hope that does not happen, and I will continue to advise customers
not to touch anything that even tries to do stuff like that. I have seen too many
people burned badly when some proprietary app they had, which required a
special key server, just stopped working. The company would go out of
business, or decide that it did not want people using the old version any
more, etc. Either way, the amount of lock-in with a system like that is
staggering. Even if the software did not cost any money, the price would be far too
high for what you lose. However, the software tends to be very expensive, which
makes it an even worse investment. The worst ones are those you can't really
get your data back out of.
> IOW, a python extension could nicely encapsulate access to securely
> encrypted code, which it would get into the "secure kitchen" by using
> hardware core-core communication channels designed to support all this.
>
I would hope something like this never makes it into Python, just so that
people could not use it. I would want the most insane hoops to exist for
people to even try this, and I would want the OS to give huge warnings
whenever any piece of software used that feature.
> Using this kind of system, a customer would give you his CPU's public key
> and serial number, and you would send him the encrypted code as part of
> your app package. No other CPU would be able to run this code, since no
> other CPU would have the matching private key to decrypt it. Yes, someone
> could send a bogus non-CPU public key that he has the private key for, so
> there would have to be an authentication of CPU public keys in some kind of
> registry, presumably generated by CPU manufacturer, mapping chip serial
> number to its *public* key, available for checking at some authenticable
> site.
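The flow Bengt describes (manufacturer registry of serial number to public key, vendor encrypts code to one specific CPU's key, only that CPU's private key can decrypt) could be sketched roughly like this. This is a toy using textbook RSA with tiny primes, byte-by-byte, which is completely insecure and for illustration only; the serial number, function names, and key sizes are all hypothetical stand-ins, not any real CPU vendor's API:

```python
# Toy model of "encrypt code to one CPU's public key" (NOT real crypto:
# textbook RSA, tiny primes, deterministic per-byte encryption).

p, q = 1009, 1013                  # toy primes; real keys use ~1024-bit primes
n, phi = p * q, (p - 1) * (q - 1)
e = 65537                          # public exponent
d = pow(e, -1, phi)                # private exponent (would stay inside the CPU)

# Hypothetical manufacturer registry: serial number -> public key (e, n).
registry = {"CPU-12345": (e, n)}

def encrypt_for_cpu(serial: str, code: bytes) -> list[int]:
    """Vendor side: look up the CPU's public key and encrypt byte-by-byte."""
    pub_e, pub_n = registry[serial]
    return [pow(b, pub_e, pub_n) for b in code]

def decrypt_in_kitchen(ciphertext: list[int]) -> bytes:
    """CPU side: only the holder of the private exponent d can recover the code."""
    return bytes(pow(c, d, n) for c in ciphertext)

secret_code = b"print('secret recipe')"
ct = encrypt_for_cpu("CPU-12345", secret_code)
assert decrypt_in_kitchen(ct) == secret_code
```

The toy also makes the failure mode in the replies below concrete: if the vendor disappears, or the customer moves to a CPU with a different key, `ct` is undecryptable junk.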
So what you want to do is run arbitrary code on someone else's machine that
they have no way to access, and if for some reason you go out of business,
they are screwed. Yeah, a system like that is great, except when you get a new
computer, try to install the software on it, and find it doesn't work because
the key no longer matches and the company that made the software no longer
exists. This is just a horrible idea that should not go into any system.
>
> Maybe I am missing something, but this seems feasible to me.
> One might feel distrustful of supplying a "kitchen" in one's house
> for someone else to cook secretly in (poison? etc?), but having it
> encapsulated in a core only accessed via argument passing and result
> retrieval through actual hardware channels makes the secret code something
> like a pure function. There can't be any side effects outside of the
> kitchen, except insofar as the open side may be misled by data results.
>
There is no way I would trust a system like that, and long term any company
that does will end up paying a very heavy price. Those that use those methods
will end up going out of business anyway, since a competitor, even at twice the
price, will often be able to show itself to be a far more attractive vendor:
it can tell its customers that if it vanishes, the software as it stands will
keep working until they switch to something else, that they can get their
data out, that they can run it on a newer computer, etc. All this lock-in, DRM,
etc. is just going to burn a lot of people, and it is a bad idea. It is
trying to create an equivalent to physical controls in a world which is not
physical, and it just won't work.
> Instead, I would hope it would enhance the possibilities for making
> dependable agreements, which should be good for everyone, so long as access
> to the secure kitchen core hardware functionality is open, and
> optional, and does not have back doors.
>
These agreements belong in the legal system, not in the technology. That is why
you have contracts, and penalties for breaching a contract.