[OT] fortran lib which provide python like data type

Steven D'Aprano steve+comp.lang.python at pearwood.info
Sat Jan 31 06:04:00 EST 2015


Michael Torrie wrote:

> On 01/30/2015 04:50 PM, Steven D'Aprano wrote:
>> Oh great. So if the average application creates a hundred thousand
>> pointers over the course of a session, you'll only have a thousand or so
>> seg faults and leaks.
>> 
>> Well, that certainly explains this:
>> 
>> https://access.redhat.com/articles/1332213
> 
> I fail to see the connection. 

The connection is that you made a comment about eliminating "99%" of
segfaults, as if that were something to be proud of. It's not. Like 99%
uptime (which means you are down for more than three and a half days per
year), it's not very impressive. All it takes is *one* mishandled pointer,
and you can have a seg fault, leak, buffer overflow, security exploit, or
other serious bug. That is how you end up with vulnerabilities like GHOST.

Yesterday, as I wrote that message, my web browser crashed *eight times* in
a matter of half an hour. There are thousands of serious security
vulnerabilities due to mishandled pointers. Anyone who thinks that Linux
is "secure" is deluding themselves. It's only secure in comparison to
nightmares like Windows. The fact is, the security of computer systems is
best described as "with care and attention to detail, we can make it merely
awful".

And the manual use of pointers in low-level languages like C is a huge
factor in that.


> GLibc is a low-level library written in C, 
> not C++.  By its nature it requires a lot of pointer use, and is prone to
> having errors.  But not that many, seeing as *all* Linux software 
> depends on it and uses at least part of it *all* the time.  Pretty
> remarkable if you ask me.  Wonder how they do it.  Perhaps they try to
> follow "basic rules."

Exactly. And as GHOST demonstrates, that is *not good enough*.


>> Manual low-level pointer manipulation is an anti-pattern. What you glibly
>> describe as programmers following "basic rules" has proven to be beyond
>> the ability of the programming community as a whole.
> 
> I don't see how you would write system code without this "anti-pattern"
> as you describe.  Python is a great language for everything else, but I
> certainly wouldn't call it a system language.

I didn't actually mention Python.

> Couldn't write a kernel 
> in it without providing it with some sort of unsafe memory access
> (pointers!).  Or even write a Python interpreter (Yes there's PyPy, but
> with a jit it's still working with pointers).

Systems languages now are, in a sense, in the same position that programming
was back in the days before the invention of Fortran. The programmer,
writing in low-level assembly, was responsible for pushing arguments onto
the stack, jumping to a subroutine, popping the arguments off, then jumping
back to the return address. Before Fortran, there were a number of
proto-languages that aimed to make that safer, but Fortran was the first
language whose compiler could completely handle all the book-keeping needed
to use functions easily and safely.

The problem is not *pointers*, but "manual low-level pointer manipulation",
as I said. We programmers are expected to work out how much memory we need
before allocating it, remember to check whether the pointer is null before
dereferencing it, remember to release the memory when we're done, and oh
dear, we just wrote past the end of the buffer and now the Russian mob owns
the computer...
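
To make that concrete, here is a small C sketch (the copy_name function is
invented purely for illustration, it is not from any real library) of the
book-keeping involved in handing back one measly string. Every numbered
step is something the programmer has to get right by hand, and any one of
them can silently go wrong:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Copy a string into a freshly allocated buffer. */
    char *copy_name(const char *name)
    {
        /* 1. Work out how much memory we need. Forget the +1 for the
              terminating NUL and we get a one-byte buffer overflow. */
        char *buf = malloc(strlen(name) + 1);

        /* 2. Check for NULL before dereferencing. Skip this and a failed
              allocation becomes a segfault. */
        if (buf == NULL)
            return NULL;

        /* 3. Stay inside the bounds we allocated. */
        strcpy(buf, name);

        /* 4. It is now the caller's job to remember to call free(). */
        return buf;
    }

    int main(void)
    {
        char *s = copy_name("Guido");
        if (s != NULL) {
            printf("%s\n", s);
            free(s);   /* forget this and we leak; free it twice and we
                          corrupt the heap */
        }
        return 0;
    }

In a memory-safe language, every one of those steps is the compiler's or the
runtime's job, not the programmer's.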

Where are the systems languages that will do to pointer access what Fortran
did to the stack?

They do exist: Rust claims to be a blazingly fast systems language that is
memory-safe and eliminates dangling pointers and buffer overflows at
compile time. Assuming this is true, it means that the Rust compiler could
generate code that is just as fast and efficient as C's, but without C's
unsafe memory accesses.

http://doc.rust-lang.org/nightly/intro.html#ownership

I've never used Rust. I don't know whether it is ready for kernel
development, or whether it lives up to its claims. Rust itself is not the
point: there are other approaches to memory safety, some of them suitable
for application development and some suitable for systems programming.

C is now four decades old. It took a single decade to go from hand-writing
machine code in binary to Fortran, and here we are sixty years later still
having buffer overflows. The fact that C is still one of the top three
*application development languages* is a shameful indictment of the
conservatism, resistance to change, intellectual laziness and hubris of the
programming community as a whole.

I won't go so far as to say that C must die. But it must become a niche
language, used only for the tiny (and ever-shrinking) segment of code that
modern, memory-safe languages *provably* cannot handle.


> What I call glibly "basic rules" are in fact shown to more or less work
> out, as Glibc proves.  Pointer use does lead to potential
> vulnerabilities.  And they must be corrected as they are found.  Still
> not sure what your point is.

No. They must be prevented from existing in the first place.

We know that the NSA and other hostile government organisations are engaged
in a concerted effort to weaponise software bugs. Criminal organisations
mass deploy malware at random against hundreds of millions of machines at a
time. Other groups write "advanced persistent threats", effectively custom
malware targeted specifically at a single person or organisation. All these
things have something in common: they rely on bugs in the targeted
software.

Eliminate insecure and buggy software, and you eliminate these threats.

(Buffer overflows are not the only source of exploitable bugs. Today,
they're not even the number one such source. But they are number two.)


> Is there a reason to use C or C++ for many of us?  Nope.  I'm not
> arguing that we should find them of use.  It's easy for us to sit on
> Python and look with contempt at C or C++, but they really do have their
> place (C more than C++ IMO).  This is so far off the original topic that
> it probably is construed that I am arguing for C++ vs Python or
> something.  But I am not.  I'm quite content with Python.  There are a
> host of languages I find interesting including D, Google Go, Vala,
> FreeBASIC, Mozilla Rust, etc.  But Python fits my needs so well, I can't
> be bothered to invest much time in these other languages.

I cannot disagree with that.




-- 
Steven



