[SciPy-User] using multiple processors for particle filtering

Robin robince at gmail.com
Tue May 25 17:50:54 EDT 2010


On Tue, May 25, 2010 at 10:19 PM, Robin <robince at gmail.com> wrote:
> Sorry - just thought it probably doesn't make sense to use map in this
> case since your processing function isn't returning anything... you
> could check Pool.apply_async (which returns control and lets work
> continue in the background) and Pool.apply (the blocking version,
> which is probably what you want).

I'm being a bit silly, I think - this won't work properly, of course,
because the particle instances will be modified in the subprocesses
and the changes won't be propagated back to the parent... but
hopefully the pointer to using Pool is useful. If you can split the
task into an independent function that returns its result, then Pool
is really handy...

Maybe something like

    self.updated_particles = p.map(update_particle, self.particles)

where update_particle takes a single particle instance, does the
random number generation, and returns the updated particle.

Cheers

Robin
