Sandboxing eval() (was: Calculator)

musbur at posteo.org
Tue Jan 21 00:57:49 EST 2020


On Mon, 20 Jan 2020 06:43:41 +1100
Chris Angelico <rosuav at gmail.com> wrote:

> On Mon, Jan 20, 2020 at 4:43 AM <musbur at posteo.org> wrote:
> > It works, but is it safe?  
> 
> As such? No.

That's what many people have said, and I believe them. But just as a
matter of technical understanding: if I start with empty global and
local dicts and an empty __builtins__, and I screen the input string
so it can't contain the string "import", is it still possible to mount
"targeted" malicious attacks? Any script can of course try to crash
the Python interpreter or the whole machine by gobbling up memory,
wreaking all sorts of havoc, but by "targeted" I mean accessing the
file system or the operating system in a deterministic way.
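(For reference, the classic escape works under exactly those
restrictions, with no "import" anywhere in the input; a minimal
sketch:

payload = "().__class__.__base__.__subclasses__()"
# Evaluates fine with empty globals/locals and no builtins; returns
# every class loaded in the interpreter. From that list an attacker
# can typically reach os.system through some subclass's
# __init__.__globals__.
print(eval(payload, {"__builtins__": {}}, {}))

So screening for keywords alone is not a reliable boundary.)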

My own Intranet application needs to guard against accidents, not
intentionally malicious attacks.


> However, there are some elegant hybrid options, where you
> can make use of the Python parser to do some of your work, and then
> look at the abstract syntax tree.

Sounds interesting. All I need is a few lines of arithmetic
and variable assignments. Blocking ':' from the input should add some
safety, too, since it rules out compound statements and lambdas.

> Research the "ast" module for some ideas on what you can do.

Will do.
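
A rough sketch of the whitelist idea (assuming Python 3.8+ for
ast.Constant; the allowed-node set would need tuning for real use):

import ast

# Only plain arithmetic and simple assignments get through.
ALLOWED = (
    ast.Module, ast.Assign, ast.Expr, ast.Name, ast.Load, ast.Store,
    ast.Constant, ast.BinOp, ast.UnaryOp,
    ast.Add, ast.Sub, ast.Mult, ast.Div, ast.Pow, ast.Mod,
    ast.UAdd, ast.USub,
)

def checked_compile(source):
    tree = ast.parse(source, mode="exec")
    for node in ast.walk(tree):
        if not isinstance(node, ALLOWED):
            raise ValueError("disallowed syntax: %s"
                             % type(node).__name__)
    return compile(tree, "<input>", "exec")

env = {"__builtins__": {}}
exec(checked_compile("a = 2 + 3\nb = a * 4"), env)
print(env["a"], env["b"])        # -> 5 20
checked_compile("().__class__")  # -> ValueError: disallowed syntax: Attribute

That rejects attribute access, calls, and compound statements before
anything reaches exec, so the ':' screening becomes unnecessary.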

