what are the most popular building and packaging tools for python ??

Bengt Richter bokr at oz.net
Mon Oct 25 19:44:50 EDT 2004


On Mon, 25 Oct 2004 23:13:35 +0200, aleaxit at yahoo.com (Alex Martelli) wrote:
>Neil Benn <neil.benn at arcor.de> wrote:
>   ...
>> start looking at reverse compilation.  This is something it is possible
>> to do in most 'bytecode' languages - other bytecode implementations 
>> (java, .NET) use 'obfuscators' that will make your code unreadable if
>> someone tries to decompile it.  To this end, I've not seen a python 
>> obfuscation tool
>
>Security by obscurity isn't.  If you can obfuscate, I can deobfuscate,
>if it's worth my while.  If you stick in license checking, I can patch
>it out.  It's not about one programmer being better than another: the
>attacker/cracker has the inevitable advantage.  If you ship all the code
>(even in object obfuscated form) you're toast.  I know: I've done that
>as part of my job for ten years of my life -- copy protection and the
>like WAS one part of my job as senior software consultant.  Thousands of
>hours wasted off my life.  Quoth the raven, nevermore.
>
>If your code contains really valuable trade secrets, my well-considered,
>experience-driven, professional suggestion, is to carefully remove just
>enough of the secret parts from the code you distribute, and make them
>available only as web-services or the equivalent from a host you
>control.  Whatever implementation language you use, the only code that
>will never be cracked is code that does NOT leave your control.  (well,
>that AND most code that's not really all that valuable, of course;-).
>
It's an interesting problem. Personally, I like open source, but I think
secret stuff should be possible, and I think it will be, one of these days...

To anthropomorphize, code is a recipe for CPU behavior, analogous to the
way a cooking recipe is a recipe for cook behavior. If I send you a pgp-encrypted
(with your public key) recipe for a special dish, and you decrypt it in a secure
kitchen and do the cooking, and only serve results from the kitchen through a special
window for the waiters to take it to the end user, it would seem that the recipe
was secure, except for reverse engineering the product itself, and/or inferring what
may be inferred from externally observable parameters, such as overall timing, etc.
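
To make that step concrete, here is a minimal sketch of "encrypt the recipe
for one kitchen" in Python, using the third-party "cryptography" package (my
choice of tool, nothing pgp-specific; the recipe and all key names are made up):

# Toy sketch: a hybrid scheme where a fresh symmetric key encrypts the
# recipe, and the kitchen's RSA public key wraps that symmetric key.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Stand-in for the kitchen's key pair; in the analogy, only the secure
# core would ever hold the private half.
kitchen_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
kitchen_public = kitchen_private.public_key()

recipe = b"def secret_sauce(x):\n    return x * 42\n"  # code worth protecting

# Sender side: encrypt with a fresh symmetric key, then wrap that key so
# that only the kitchen can recover it.
session_key = Fernet.generate_key()
wrapped_key = kitchen_public.encrypt(session_key, oaep)
ciphertext = Fernet(session_key).encrypt(recipe)

# Kitchen side: unwrapping needs the private key, i.e. it happens only
# inside the "secure kitchen".
unwrapped = kitchen_private.decrypt(wrapped_key, oaep)
assert Fernet(unwrapped).decrypt(ciphertext) == recipe

Anyone without kitchen_private holds only opaque bytes, which is the whole point.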

By analogy, I don't think it's a stretch to imagine a CPU core with a "secure kitchen"
in the form of an inaccessible instruction cache memory where decrypted (by the core
hardware) code/recipes may be stored for execution, and with private temporary data cooking
areas for use before results come out the window.

If a multi-core chip had a core dedicated as a "secure kitchen," I think it could appear
software-wise via a proxy DLL that had the hardware-level protocol for passing in encrypted
code and then using it like a closed pure function, passing arguments and receiving results.

IOW, a Python extension could nicely encapsulate access to securely encrypted code,
which it would get into the "secure kitchen" by using hardware core-core communication
channels designed to support all this.
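
For a feel of the interface such an extension might present from Python, here
is a rough mock; every name in it is invented for illustration, and an
ordinary callable stands in for the core's internal private-key decryption:

# Mock of the proxy idea: encrypted code goes in once; afterwards the
# caller can only pass arguments and read results, like a closed pure
# function.  A real version would be a C extension driving dedicated
# core-to-core hardware channels.
class SecureKitchenProxy:
    def __init__(self, decrypt):
        self._decrypt = decrypt  # stands in for the core's own decryption
        self._entry = None       # stands in for the inaccessible i-cache

    def load_encrypted(self, ciphertext):
        namespace = {}
        exec(self._decrypt(ciphertext), namespace)  # "cooking" out of sight
        self._entry = namespace["secret_sauce"]     # assumed entry point

    def call(self, *args):
        # The serving window: arguments in, results out, nothing else.
        if self._entry is None:
            raise RuntimeError("no recipe loaded")
        return self._entry(*args)

So after proxy.load_encrypted(ciphertext), the vendor's code runs, but all the
customer-side program ever does is proxy.call(2) and read back 84.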

Using this kind of system, a customer would give you his CPU's public key and serial number,
and you would send him the encrypted code as part of your app package. No other CPU would
be able to run this code, since no other CPU would have the matching private key
to decrypt it. Yes, someone could send a bogus non-CPU public key that he has the private
key for, so there would have to be an authentication of CPU public keys in some kind of
registry, presumably generated by the CPU manufacturer, mapping each chip's serial number to its
*public* key, available for checking at some authenticable site.
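
That authentication could be as simple as checking the manufacturer's
signature over the (serial number, public key) record before encrypting
anything for that customer. A sketch under the same assumptions (invented
keys and record layout, the "cryptography" package again):

# Sketch: verify that a claimed CPU public key matches the manufacturer's
# registry, by checking the signature over its (serial, public key) record.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa, padding

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

def registry_record(serial, cpu_public):
    pem = cpu_public.public_bytes(
        serialization.Encoding.PEM,
        serialization.PublicFormat.SubjectPublicKeyInfo)
    return serial.encode() + b"\n" + pem

# At production time, the manufacturer signs each chip's record.
maker_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
cpu_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
record = registry_record("CPU-0001", cpu_private.public_key())
signature = maker_private.sign(record, pss, hashes.SHA256())

# Vendor side: verify before shipping encrypted code for that CPU.
try:
    maker_private.public_key().verify(signature, record, pss, hashes.SHA256())
except InvalidSignature:
    raise SystemExit("bogus CPU key -- refusing to ship code for it")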

Maybe I am missing something, but this seems feasible to me.
One might feel distrustful of supplying a "kitchen" in one's house
for someone else to cook secretly in (poison? etc?), but having it encapsulated
in a core only accessed via argument passing and result retrieval through
actual hardware channels makes the secret code something like a pure function.
There can't be any side effects outside of the kitchen, except insofar as the
open side may be misled by data results.

I just hope this obvious overall concept has not been locked into
some patents for use in restricting freedom to create and share code
according to whatever arrangements people want to agree on to do that.

Instead, I would hope it would enhance the possibilities for making dependable
agreements, which should be good for everyone, so long as access to the
secure kitchen core hardware functionality is open, and optional,
and does not have back doors.

(BTW, cores that specialize in audio or video streams are kind of obvious too).

>A web service can require any form of authentication and validation from
>its 'client' code, so you can implement any business model you like.  I
>heartily recommend (depending on the situation) subscription-based or
>per-use fees, over the old and crufty 'sell the bits' model that never
>really worked right (IMHO).  Be sure to pick _important_ parts of your
>code as those that are only available as webservices, not to make the
>webservices just a kind of license check, or else the webservice access
>WILL be hacked out just like any other license check (assuming your code
>IS valuable, of course).
It seems this is where we are, but give it another few years, and other
possibilities should become available. Or I'm dreaming ;-)

>
>You can distribute, depending on your exact circumstances, an "already
>somewhat useful" pure-client program, as long as the full functionality
>that customers will pay for is only accessible via the webservices.  You
>can even opensource the part you distribute, that may garner you useful
>feedback, more customers, etc.
>
>Of course, there _are_ still, today, applications which can't assume the
>net is available and must still offer full functionality no matter what.
>They're fewer and fewer, thanks be, as connectivity spreads -- games
>accrue multiplayer online-play features that players are eager for,
>financial programs require access to updated exchange rates or stock
>levels, and so on.  If you do need to sell applications which have full
>functionality without net access, you may as well resign yourself: you
>will never be truly safe, alas.
>
Yes, but if it takes sophisticated chip-destroying methods to retrieve a key,
that wouldn't happen unless it was a blockbuster application with a market worth
*a lot*. And the pirate would either have to distribute unencrypted copies or set up
his own competing offer to encrypt his clear copy for specific CPUs, which
would seem pretty vulnerable legally. I.e., anyone trying to make money with
the pirated stuff would be an easy target for the lawyers, IWT. And how many
would destroy chips and have the equipment to extract a hidden key, and then
give away the decrypted result?

Regards,
Bengt Richter


