Python speed-up

Phil Frost indigo at bitglue.com
Wed Sep 22 10:38:19 EDT 2004


String concatenation in Python might be slower than you think,
because each concatenation builds a new string, which involves a
memory allocation and a copy of everything accumulated so far.
Suggested reading:
<http://www.skymind.com/~ocrow/python_string/>
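
A common way around this is to collect the pieces in a list and join
them once at the end. A rough sketch, reusing the table and
original_text names from your code below:

  pieces = []
  for c in original_text:
      pieces.append(table[c])    # just stores a reference, no copying
  encoded_text = ''.join(pieces) # one allocation and copy at the end

Each += in your loop copies everything accumulated so far, so the
total work grows quadratically with the length of the text; the join
version copies each piece only once, into the final string.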

For the second part, try replacing

  encoded_text = encoded_text[8:]

with

  del encoded_text[:8]

(Note that del only works on a mutable sequence such as a list; a
plain string is immutable, so encoded_text would have to be a list of
characters for that to work.) Or, use an index variable and don't
mutate the sequence at all.
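
Something along these lines (an untested sketch, keeping your
encoded_text and chr_list names) walks the string with an index and
never rebuilds it:

  chr_list = []
  i = 0
  while i < len(encoded_text):
      chr_list.append(encoded_text[i:i+8]) # grab at most 8 bits
      i += 8

Each slice copies only the few characters it extracts, so the loop
stays linear in the length of encoded_text instead of copying the
whole remainder of the string on every pass.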

On Wed, Sep 22, 2004 at 04:06:04PM +0200, Guyon Morée wrote:
> Hi all,
> 
> I am working on a Huffman encoding exercise, but it is kinda slow. This is
> not a big problem; I do this to educate myself :)
> 
> So I started profiling the code, and the slowdown was actually taking place
> in places where I didn't expect it.
> 
> After creating a lookup-table dictionary with encodings like
> {'d': '0110', 'e': '01', ...}, I encode the original text like this:
> 
> for c in original_text:
>     encoded_text += table[c]
> 
> I can appreciate that the text is long, but that isn't a problem for the
> character frequency counting, for example. Why is this slow?
> 
> 
> The second place the slowdown occurs is when I try to chop the encoded
> string of 0's and 1's into pieces of eight, like this:
> 
> chr_list = [] # resulting list
> while 1:
>     chr_list.append(encoded_text[:8]) # take 8 bits from string and put them in the list
>     encoded_text = encoded_text[8:] # truncate the string
>     if len(encoded_text) < 8: # end of string reached
>         chr_list.append(encoded_text)
>         break
> 
> 
> I hope someone can tell me why these are slow.
> 
> 
> regards,
> 
> Guyon
> 
> 
> -- 
> http://mail.python.org/mailman/listinfo/python-list


