Looking for info on Python's memory allocation
Steven D'Aprano
steve at REMOVETHIScyber.com.au
Tue Oct 11 10:20:09 EDT 2005
On Tue, 11 Oct 2005 11:22:39 +0200, Lasse Vågsæther Karlsen wrote:
> This begs a different question along the same lines.
Er, no it doesn't. "Begs the question" does _not_ mean "asks the question"
or "suggests the question". It means "assumes the truth of that which
needs to be proven".
http://en.wikipedia.org/wiki/Begging_the_question
http://www.worldwidewords.org/qa/qa-beg1.htm
(Both of these sources are far more forgiving of the modern mis-usage than
I am. Obviously.)
> If I have a generator or other iterable producing a vast number of
> items, and use it like this:
>
> s = [k for k in iterable]
>
> if I know beforehand how many items iterable would possibly yield, would
> a construct like this be faster and "use" less memory?
>
> s = [0] * len(iterable)
> for i in xrange(len(iterable)):
>     s[i] = iterable.next()
Faster? Maybe. Only testing can tell -- but I doubt it. But as for less
memory, look at the two situations.
In the first, you create a list of N objects.
In the second, you end up with the same list of N objects, plus an xrange
object. The xrange object itself is small and constant-sized -- it stores
only start, stop and step, not all N integers -- but it is still extra.
So it won't use *less* memory -- at best, it will use just slightly more.
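As an untested sketch of how one might actually time the two approaches
(written for modern Python 3, where xrange is spelled range and
iterable.next() is next(iterable) -- the names N and the builder functions
are illustrative, not from the original post):

```python
import timeit

N = 100_000

def build_comprehension():
    # Equivalent of: s = [k for k in iterable]
    return [k for k in iter(range(N))]

def build_preallocated():
    # Equivalent of preallocating and filling slot by slot
    it = iter(range(N))
    s = [0] * N
    for i in range(N):
        s[i] = next(it)
    return s

# Both constructions should produce the same list.
assert build_comprehension() == build_preallocated()

t1 = timeit.timeit(build_comprehension, number=20)
t2 = timeit.timeit(build_preallocated, number=20)
print(f"comprehension: {t1:.3f}s  preallocated: {t2:.3f}s")
```

On CPython the comprehension is usually at least as fast, since the loop
and indexing in the second version are executed as bytecode rather than
inside the list-building machinery.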
Is there a way from within Python to find out how much memory a single
object uses, and how much memory Python is using in total?
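(Later versions of Python grew an answer to the per-object half of this:
sys.getsizeof, added in 2.6, after this thread. A rough sketch -- note it
reports only the object's own footprint, not the objects it references:)

```python
import sys

# A list of 1000 ints: getsizeof counts the list header plus its
# pointer array, but NOT the int objects the pointers refer to.
nums = list(range(1000))
print(sys.getsizeof(nums))

# A range object (xrange in Python 2) stores only start, stop and
# step, so its size is small and independent of the length.
print(sys.getsizeof(range(1000)))
```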
--
Steven.