Program uses twice as much memory in Python 3.6 than in Python 3.5

Jan Gosmann jan at hyper-world.de
Thu Mar 30 14:14:37 EDT 2017


That's great news. I'm busy with other things right now, but will look 
into your findings in more detail later.


On 03/30/2017 02:09 PM, INADA Naoki wrote:
> Filed an issue: https://bugs.python.org/issue29949
>
> Thanks for your report, Jan.
>
> On Fri, Mar 31, 2017 at 3:04 AM, INADA Naoki <songofacandy at gmail.com> wrote:
>> Maybe this commit caused the regression.
>>
>> https://github.com/python/cpython/commit/4897300276d870f99459c82b937f0ac22450f0b6
>>
>> Old:
>> minused = (so->used + other->used)*2  (L619)
>>
>> New:
>> minused = so->used + other->used  (L620)
>> minused = (minused > 50000) ? minused * 2 : minused * 4;  (L293)
>>
>> So the size of a small set is doubled.
>>
>> $ /usr/bin/python3
>> Python 3.5.2+ (default, Sep 22 2016, 12:18:14)
>> [GCC 6.2.0 20160927] on linux
>> Type "help", "copyright", "credits" or "license" for more information.
>>>>> import sys
>>>>> s = set(range(10))
>>>>> sys.getsizeof(frozenset(s))
>> 736
>> $ python3
>> Python 3.6.0 (default, Dec 30 2016, 20:49:54)
>> [GCC 6.2.0 20161005] on linux
>> Type "help", "copyright", "credits" or "license" for more information.
>>>>> import  sys
>>>>> s = set(range(10))
>>>>> sys.getsizeof(frozenset(s))
>> 1248
>>
>>
>> On Fri, Mar 31, 2017 at 2:34 AM, INADA Naoki <songofacandy at gmail.com> wrote:
>>> I reproduced the issue.
>>> This is a very common memory-usage issue.  The thrashing is just a
>>> result of the large memory usage.
>>>
>>> After the 1st optimization pass, RAM usage is 20+ GB on Python 3.5 and
>>> 30 GB on Python 3.6.
>>> Python 3.6 then starts thrashing in the 2nd optimization pass.
>>>
>>> I enabled tracemalloc during the 1st pass.  The results are at the end
>>> of this mail.
>>> It seems frozenset() causes the regression, but I'm not sure yet.
>>> I don't know what the contents of the frozensets are yet.  (I know almost
>>> nothing about this application.)
>>>
>>> Jan, do you know what this is?
>>> Could you make a script which just runs `transitive_closure(edges)` with
>>> edges similar to those produced by
>>> `log_reduction.py spaun`?
>>>
>>> I'll dig into it later, maybe next week.
>>>
>>> ---
>>> Python 3.6.1
>>> 1191896 memory blocks: 22086104.2 KiB
>>>    File "/home/inada-n/work/gosmann-frontiers2017/gosmann_frontiers2017/optimized/optimizer.py",
>>> line 85
>>>      reachables[vertex] = frozenset(reachables[vertex])
>>>    File "/home/inada-n/work/gosmann-frontiers2017/gosmann_frontiers2017/optimized/optimizer.py",
>>> line 410
>>>      self.dependents = transitive_closure(self.dg.forward)
>>> 602986 memory blocks: 51819.1 KiB
>>>    File "<string>", line 14
>>>    File "/home/inada-n/work/gosmann-frontiers2017/gosmann_frontiers2017/optimized/optimizer.py",
>>> line 634
>>>      first_view=None, v_offset=0, v_size=0, v_base=None)
>>>
>>> Python 3.5.3
>>> 1166804 memory blocks: 11116407.0 KiB
>>>    File "/home/inada-n/work/gosmann-frontiers2017/gosmann_frontiers2017/optimized/optimizer.py",
>>> line 85
>>>      reachables[vertex] = frozenset(reachables[vertex])
>>>    File "/home/inada-n/work/gosmann-frontiers2017/gosmann_frontiers2017/optimized/optimizer.py",
>>> line 410
>>>      self.dependents = transitive_closure(self.dg.forward)
>>> 602989 memory blocks: 51819.3 KiB
>>>    File "<string>", line 14
>>>    File "/home/inada-n/work/gosmann-frontiers2017/gosmann_frontiers2017/optimized/optimizer.py",
>>> line 634
>>>      first_view=None, v_offset=0, v_size=0, v_base=None)
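
A minimal sketch of the kind of tracemalloc setup that produces per-line
statistics like the ones quoted above; first_pass() here is only a stand-in
that allocates many small frozensets, not the application's actual code:

import tracemalloc

def first_pass():
    # Stand-in for the application's 1st optimization pass: build many
    # small frozensets, the allocation pattern the statistics above point at.
    return [frozenset(range(i, i + 10)) for i in range(100000)]

tracemalloc.start(25)                  # record up to 25 frames per allocation
data = first_pass()
snapshot = tracemalloc.take_snapshot()

# Group allocations by traceback and print the two biggest consumers,
# in the same "N memory blocks: X KiB" form shown above.
for stat in snapshot.statistics('traceback')[:2]:
    print("%d memory blocks: %.1f KiB" % (stat.count, stat.size / 1024))
    for line in stat.traceback.format():
        print(line)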

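And a rough sketch, in plain Python rather than CPython's C, of the
preallocation arithmetic quoted above, applied to the 10-element frozenset
from the transcript (PYSET_MINSIZE and the 16-byte entry size are the usual
64-bit values; exact getsizeof totals vary by build):

PYSET_MINSIZE = 8
ENTRY_SIZE = 16   # one setentry (hash + key pointer) on a 64-bit build

def table_size(minused):
    # Both versions round up to the smallest power of two > minused.
    newsize = PYSET_MINSIZE
    while newsize <= minused:
        newsize <<= 1
    return newsize

used = 10   # so->used + other->used for frozenset(set(range(10)))

old_minused = used * 2                                # 3.5: (so->used + other->used)*2
new_minused = used * 2 if used > 50000 else used * 4  # 3.6: small sets get *4

print(table_size(old_minused))   # 32 slots -> 512 bytes of entries (getsizeof above: 736)
print(table_size(new_minused))   # 64 slots -> 1024 bytes of entries (getsizeof above: 1248)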


