Tuple size and memory allocation for embedded Python
Steve Holden
steve at holdenweb.com
Fri Jan 21 17:20:54 EST 2005
Jinming Xu wrote:
> Hi Folks,
>
> Python seems unstable when allocating a lot of memory. For example, the
> following C++ code creates a tuple of tuples:
>
> PyObject* arCoord = PyTuple_New(n);
> double d = 1.5;
> for(int i = 0; i < n; i++)
> {
>     PyObject* coord = PyTuple_New(2);
>     PyTuple_SetItem(coord, 0, PyFloat_FromDouble(d)); // x
>     PyTuple_SetItem(coord, 1, PyFloat_FromDouble(d)); // y
>     PyTuple_SetItem(arCoord, i, coord);
> }
>
> When n is small, say 100, the code works fine. When n is large, say
> 10,000, Python has trouble allocating memory, saying:
>
> "Exception exceptions.IndexError: 'tuple index out of range' in 'garbage
> collection' ignored
> Fatal Python error: unexpected exception during garbage collection
> Aborted"
>
> Could anyone please give me some insight or a fix for this?
>
> Thanks in advance for your answer.
>
I'm going to guess that the problem is related to incorrect reference
counts. I don't see any Py_INCREF calls in there. It seems probable that
the program works until n is high enough to trigger a garbage collection
sweep; at that point, memory your program still regards as allocated gets
collected by Python and reused. Ugly :-P
Python itself is pretty stable, so it's usually best to suspect our own
code unless you're deeply familiar with the C API (which I'm not, so feel
free to ignore me).
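For what it's worth, the same structure built in pure Python (where
reference counting is handled for you) copes with n = 10,000 without any
trouble, which points the finger at the extension code rather than the
interpreter. A minimal sketch (the function name is mine, just for
illustration):

```python
def make_coords(n, d=1.5):
    """Build an n-tuple of (x, y) pairs, mirroring the C++ loop above."""
    return tuple((d, d) for _ in range(n))

coords = make_coords(10_000)
print(len(coords))      # 10000
print(coords[0])        # (1.5, 1.5)
```

If this runs cleanly while the C++ version aborts, that's a fairly strong
hint the bug lives on the C side.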
regards
Steve
--
Steve Holden http://www.holdenweb.com/
Python Web Programming http://pydish.holdenweb.com/
Holden Web LLC +1 703 861 4237 +1 800 494 3119