64 bit memory usage

Rob Randall rob.randall2 at gmail.com
Fri Dec 10 14:03:19 EST 2010


I managed to get my Python app past 3GB on a smaller 64 bit machine.
In a memory-usage test, disabling the gc added only an extra 6MB.

The figures were 1693MB with the gc disabled versus 1687MB with it enabled.
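
For anyone who wants to repeat the comparison, here is a rough sketch of
the kind of test (not the exact code; psutil is an assumption here, and
any way of reading the process's memory would do):

import gc
import psutil  # third-party, assumed available; used only to read process memory

def build_dicts(n_dicts=10, n_keys=1000000):
    # Create a series of large dictionaries with string keys and float
    # values; the defaults allocate on the order of a couple of GB.
    dicts = []
    for i in range(n_dicts):
        d = {}
        for j in range(n_keys):
            d["key-%d-%d" % (i, j)] = float(j)
        dicts.append(d)
    return dicts

def rss_mb():
    # Resident set size of this process, in MB.
    return psutil.Process().memory_info().rss / (1024.0 * 1024.0)

if __name__ == "__main__":
    gc.disable()  # comment this line out for the gc-enabled run
    data = build_dicts()
    print("RSS after building dicts: %.0f MB" % rss_mb())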

This is great.

Thanks again for the help.


On 10 December 2010 13:54, Rob Randall <rob.randall2 at gmail.com> wrote:

> You guys are right. If I disable the gc it will use all the virtual RAM in
> my test.
>
> The application I have been running these tests for is a port of a program
> written in a LISP-based tool running on Unix.
> It does a mass of stress calculations.
>
> The port has been written using a python-based toolkit I am responsible
> for. This toolkit offers much of the same functionality as the LISP tool.
> It is based around the use of demand-driven/declarative programming.
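>
> As a toy illustration of that demand-driven style (a made-up sketch, not
> the toolkit's actual API): a value is declared as a rule and is only
> computed, then cached, the first time something demands it.
>
> class Demand(object):
>     # Non-data descriptor: runs the rule on first access and caches the
>     # result in the instance dict, so later accesses skip the rule.
>     def __init__(self, rule):
>         self.rule = rule
>     def __get__(self, obj, objtype=None):
>         if obj is None:
>             return self
>         value = self.rule(obj)
>         obj.__dict__[self.rule.__name__] = value
>         return value
>
> class Beam(object):
>     def __init__(self, load, length):
>         self.load, self.length = load, length
>
>     @Demand
>     def max_moment(self):
>         # wL^2/8 for a uniformly loaded, simply supported beam;
>         # evaluated only when first demanded.
>         return self.load * self.length ** 2 / 8.0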
>
> When the porting project started, no one realised just how much memory the
> heaviest of the test cases used.
> It uses 40+ GB on an HP Unix machine.
>
> It is easy to see now that the port should have been written differently,
> but it is essentially complete.
>
> This has led me to see whether a hardware solution can be found using 64 bit
> Windows machines.
>
> I will try running one of the tests next to see what impact disabling the gc
> will have.
>
> Thanks,
> Rob.
>
>
>
> On 9 December 2010 22:44, John Nagle <nagle at animats.com> wrote:
>
>> On 12/8/2010 10:42 PM, Dennis Lee Bieber wrote:
>>
>>> On Wed, 8 Dec 2010 14:44:30 +0000, Rob Randall <rob.randall2 at gmail.com>
>>> declaimed the following in gmane.comp.python.general:
>>>
>>>> I am trying to understand how much memory is available to a 64 bit
>>>> python process running under Windows XP 64 bit.
>>>>
>>>> When I run tests just creating a series of large dictionaries
>>>> containing string keys and float values I do not seem to be able to
>>>> grow the process beyond the amount of RAM present.
>>>>
>>>
>>   If you get to the point where you need multi-gigabyte Python
>> dictionaries, you may be using the wrong tool for the job.
>> If it's simply that you need to manage a large amount of data,
>> that's what databases are for.
>>
>>   If this is some super high performance application that needs to keep a
>> big database in memory for performance reasons, CPython
>> is probably too slow.  For that, something like Google's BigTable
>> may be more appropriate, and will scale to terabytes if necessary.
>>
>>                                John Nagle
>>
>>
>
>