What is different with Python ?

Andreas Kostyrka andreas at kostyrka.org
Tue Jun 14 04:02:02 EDT 2005


On Tue, Jun 14, 2005 at 12:02:29AM +0000, Andrea Griffini wrote:
> However, I do not think that going this low (which is still,
> IMO, just a bit below assembler and still quite a bit higher than
> HW design) is very common for programmers.
Well, at least one university (Technical University Vienna) does it
this way. Or at least it did when I passed the courses ;)

> 
> >Or how does one explain that a "stupid and slow" algorithm can in
> >effect be faster than a "clever and fast" algorithm, without explaining
> >how a cache works and what kinds of caches there are? (I've seen
> >documented cases where a stupid search was faster because all the hot
> >data fit into the L1 cache of the CPU, while more clever algorithms
> >were slower.)
> 
> Caching is indeed very important, and sometimes the difference
> is huge. I think, anyway, that it's probably something confined
> to a few cases (processing big quantities of data with simple
> algorithms, e.g. pixel processing).
> It's also a field where, if you care about the details, the
> specific architecture plays an important role, and anything
> you learned about, say, the Pentium III could be completely
> pointless on the Pentium 4.

Nope. While it's certainly true that the details differ from architecture
to architecture, you need to learn the different types of caches, etc.,
so that you can quickly grasp the architecture currently in use.

> Except for general locality rules, I would say that everything
> else should be checked only if necessary and on a case-by-case
> basis. I'm way too timid an investor to throw neurons
> at such volatile knowledge.

It's not volatile knowledge. Knowing which CPU uses which cache
organisation is volatile and wasted knowledge. Knowing what kinds
of caches are common is quite useful ;)

Easy question:
You've got two programs that are running in parallel.
Without basic knowledge about caches, the naive answer would be that
each program will simply take about twice as long. The reality is
different: the programs also compete for the caches, so the slowdown
is usually worse than that.
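As a concrete illustration of why locality matters (my own sketch, not
part of the original exchange): the little script below visits exactly
the same elements the same number of times, only in a different order.
In C the strided version loses badly because it misses the cache on
nearly every access; in CPython the interpreter overhead hides most of
the difference, but the access pattern is exactly what a C or assembler
version would be punished for.

import array
import time

N = 4 * 1024 * 1024                 # tens of MB of longs, well past L2/L3
data = array.array('l', range(N))

def touch_all(buf, stride):
    # Visit every element exactly once, just in a different order.
    total = 0
    for start in range(stride):
        for i in range(start, len(buf), stride):
            total += buf[i]
    return total

for stride in (1, 4096):            # sequential vs. cache-hostile order
    t0 = time.time()
    touch_all(data, stride)
    print("stride", stride, "took", round(time.time() - t0, 3), "s")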


> >Or you get perfect abstract designs, that are horrible when
> >implemented.
> 
> Current trend is that you don't even need to do a
> clear design. Just draw some bubbles and arrows on
> a whiteboard with a marker, throw in some buzzwords,
> and presto! You've basically completed the new killer app.

Well, somehow I'm happy that it is this way. That ensures enough work
for me, because with this approach somebody will call in the fire
department when it doesn't work anymore ;)
Or the startup goes belly up. 

> Real design and implementation are minutiae for bozos.
> 
> Even the mighty Python is incredibly less productive
> than PowerPoint ;-)
> 
> >Yes. But, for example, to understand the memory behaviour of Python,
> >understanding C + malloc + the OS APIs involved is helpful.
> 
> This is a key issue. If you have the basics firmly in place,
> most of what follows will be obvious. If someone tells
> you that inserting an element at the beginning of an array
> is O(n) in the number of elements, then you think "uh... ok,
> sounds reasonable"; if they say that it's amortized O(1)
> instead, then you say "wow..." and, after some thinking,
> "ok, I think I understand how it could be done", and in
> both cases you'll remember it. It's a clear *concrete* fact
> that I think just cannot be forgotten.

Believe me, it can ;)
But that's the reason why certain courses forced us to implement at
least the most important data structures ourselves.

Because reading about it in a book is so much less intense than doing
it ;)
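For what it's worth, here is roughly the kind of exercise I mean: a
from-scratch sketch of mine (the name DynArray is made up) of a dynamic
array that doubles its backing store when it fills up. Writing the
doubling by hand is what makes the "amortized O(1) append" from above
stick; a plain Python list is used as a fixed-size backing store purely
for illustration, since real Python lists already do all of this
internally.

class DynArray:
    def __init__(self):
        self._capacity = 4
        self._size = 0
        self._items = [None] * self._capacity

    def append(self, value):
        if self._size == self._capacity:
            # Double the backing store; geometric growth is what keeps
            # append amortized O(1) even though this single copy is O(n).
            self._capacity *= 2
            new_items = [None] * self._capacity
            new_items[:self._size] = self._items
            self._items = new_items
        self._items[self._size] = value
        self._size += 1

    def insert_front(self, value):
        # For contrast: shifting every existing element makes this O(n).
        self.append(None)
        self._items[1:self._size] = self._items[0:self._size - 1]
        self._items[0] = value

    def __len__(self):
        return self._size

    def __getitem__(self, index):
        if not 0 <= index < self._size:
            raise IndexError(index)
        return self._items[index]

# Quick check:
arr = DynArray()
for n in range(100):
    arr.append(n)
print(len(arr), arr[0], arr[99])    # -> 100 0 99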

Andreas
