Pack optimization

BDL bdlabitt at gmail.com
Thu Oct 15 14:57:14 EDT 2009


I have a large amount of binary data that needs to be pushed across
the network.  Profiling shows that packing the data dominates the
run time (about 50%).  Here is a complete minimal example (CME) that
shows the problem.

from numpy import random
from struct import pack
import time

lenstim = 10084200
sigma = 0.1
stim = random.normal(0., sigma, lenstim)  # 10084200 Gaussian random doubles

fmt = '!h'+str(stim.size)+'d'  # makes fmt = '!h10084200d'
cmd = 4

startTime = time.time()
packdat = pack( fmt, cmd, *stim )
elapsed = time.time() - startTime
print "Time to pack the command and data %.6f seconds " % elapsed

Is there a faster method to do this?  Is it possible to use the
array module?  Any suggestions would be appreciated.
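A sketch of two alternatives worth benchmarking, assuming the receiver just expects the same '!h' header followed by big-endian doubles.  The slow part of pack(fmt, cmd, *stim) is expanding 10 million array elements into Python function arguments; both options below avoid that by serializing the buffer in bulk:

```python
import struct
import sys
from array import array

import numpy as np

lenstim = 10084200
sigma = 0.1
stim = np.random.normal(0.0, sigma, lenstim)
cmd = 4

# Option 1: pack only the 2-byte command with struct, then let NumPy
# serialize the doubles straight to network (big-endian) byte order.
# The '>f8' dtype matches the layout of struct's '!...d'.
packdat = struct.pack('!h', cmd) + stim.astype('>f8').tobytes()

# Option 2: the array module also works; its tobytes() emits native
# byte order, so byteswap first on little-endian machines.
a = array('d', stim)
if sys.byteorder == 'little':
    a.byteswap()
packdat2 = struct.pack('!h', cmd) + a.tobytes()

assert packdat == packdat2
```

Note that on Python 2 (current when this was posted), tobytes() is spelled tostring() for both NumPy arrays and array.array objects; the byte layout is the same.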
