The REALLY bad thing about Python lists ..

Michael Hudson mwh21 at cam.ac.uk
Sun May 14 13:38:39 EDT 2000


rturpin at my-deja.com writes:

> In article <391ED8C9.BC2BAE8D at san.rr.com>,
>   Courageous <jkraska1 at san.rr.com> wrote:
> > Are you talking about some kind of strategy where
> > you allocate a maxsize and then put the used memory
> > in the middle?
> 
> Yep. You wouldn't have to do this on first allocation,
> but could do so only on first prepend that runs over
> the start of the buffer, on the assumption if it
> happens once, it might happen again. Then you get
> efficiency in both directions. The object pointer
> still points to the zero-th element, so indexing is
> no different. Below the zero-th element, you have
> pointers to both ends of the buffer. Prepending
> would be a BIT more expensive than appending, since
> you have to move these pointers. But you don't have
> to move the entire vector each time you prepend,
> which seems to be what is now done. Creating vectors
> of length n by prepending elements one at a time is
> now an n^2 operation!
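
[The slack-at-both-ends strategy quoted above can be sketched in Python. This is a minimal illustration only; the class name, initial capacity, and doubling/re-centring policy are all made up for the example and do not correspond to any real implementation:]

```python
class DoubleEndedList:
    """Toy vector with slack at both ends, so prepend and append
    are both amortized O(1). Purely illustrative."""

    def __init__(self):
        self._buf = [None] * 8   # backing store with slack on both sides
        self._head = 4           # buffer index of logical element 0
        self._size = 0

    def append(self, x):
        if self._head + self._size == len(self._buf):
            self._grow()
        self._buf[self._head + self._size] = x
        self._size += 1

    def prepend(self, x):
        if self._head == 0:
            self._grow()
        self._head -= 1
        self._buf[self._head] = x
        self._size += 1

    def _grow(self):
        # Re-centre the live elements in a buffer twice the size,
        # restoring slack at both ends.
        new = [None] * (2 * len(self._buf))
        new_head = (len(new) - self._size) // 2
        new[new_head:new_head + self._size] = \
            self._buf[self._head:self._head + self._size]
        self._buf, self._head = new, new_head

    def __getitem__(self, i):
        # Indexing still just offsets from element 0, as the post says.
        if not 0 <= i < self._size:
            raise IndexError(i)
        return self._buf[self._head + i]

    def __len__(self):
        return self._size
```

[Prepending only moves the head pointer (and occasionally re-centres), so building a list front-first is no longer quadratic.]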

So create it by appending n elements and then reversing it... this is
still n^2, I guess, but for any n you can deal with before the
universe dies a heat death, it will be vastly quicker than anything
else you can do in Python.
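
[A minimal sketch of the two approaches, for comparison; n is an arbitrary example size:]

```python
n = 2000

# Quadratic: each insert(0, ...) shifts the whole vector down one slot.
prepended = []
for i in range(n):
    prepended.insert(0, i)

# Append n elements, then reverse once at the end.
appended = []
for i in range(n):
    appended.append(i)
appended.reverse()

# Both yield [n-1, n-2, ..., 1, 0].
assert prepended == appended
```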

(let ((result (list nil)))              ; dummy head cell so RPLACD has a target
  (dotimes (i n)
    ;; walking to the end with LAST on every iteration is what makes this O(n^2)
    (rplacd (last result) (cons i nil)))
  (cdr result))

is O(n^2), yet people still manage to program in Common Lisp...

does-anyone-else-think-people-are-looking-for-problems-where-there-are-none?-ly y'rs
M.

-- 
  About the use of language: it is impossible to sharpen a 
  pencil with a blunt axe. It is equally vain to try to do 
  it with ten blunt axes instead.
     -- E.W.Dijkstra, 18th June 1975. Perl did not exist at the time.


