O(n^2) is bad - can it be fixed?
Tim Peters
tim.one at home.com
Sat May 26 18:17:11 EDT 2001
[Tim]
> Bruce explained it in detail a few msgs back:
>
> http://groups.google.com/groups?hl=en&lr=&safe=off&ic=1&th=8fb1dd7dabf75024,41&seekm=3B0DED72.41317BCE%40accessone.com#p
[Isaac]
> I do read the messages of others. What I don't understand is exactly
> what Bruce didn't: why in the hell that "old heap" is not freed. For
> this to happen, there must be some "small chunks of memory" allocated
> for Python every other 4MB of space ...
Na, here's a straight C program that does nothing but reallocs:
#include <stdio.h>
#include <stdlib.h>
#include <malloc.h>

int
main(void)
{
    size_t CHUNK = 100;
    size_t size = CHUNK;
    char* p = (char*)malloc(CHUNK);
    for (;;) {
        size += CHUNK;
        p = (char*)realloc(p, size);
        if (!p) {
            fprintf(stderr, "out of memory, at %lu\n",
                    (unsigned long)size);
            exit(1);
        }
    }
    return 0;
}
Compiled in release mode under MSVC 6, and run on a Win98SE box with 256MB of
RAM, it died with
out of memory, at 5906500
Doubling the size each time works much better -- the first time. But when I
put that in a loop, the OS froze solid at about the sixth iteration, and a
reboot was needed. That may be coincidence, but I don't care enough to try
it again.
If you can't let this go, I suggest asking Microsoft very nicely to let you
look at the Windows source code <wink>.
In the meantime, the change I checked in lets Python get away with growing a
list to about 140MB on Win98SE before realloc() craps out.
More information about the Python-list mailing list