[IPython-dev] Performance sanity check: 7.21s to scatter 5000X1000 float array to 7 engines

Fernando Perez fperez.net at gmail.com
Sat Jan 12 01:36:30 EST 2008


On Jan 11, 2008 12:49 PM, Anand Patil <anand.prabhakar.patil at gmail.com> wrote:
> Hi Fernando,
>
>
> > The basic idea is to start your engine group as an mpi world, then
> > have them connect to the controller, and from the controller, tell
> > engine 0 to do the array creation and scatter.  So the engines are an
> > MPI world, but the controller and client don't need to be.  Is that
> > clear enough?  If not, I'll provide step by step (with code)
> > instructions later...
> >
>
> It makes sense, but I don't seem to be getting the details. I start the
> engines with
>
> mpirun -n 7 ipengine --mpi=mpi4py
>
> then do the following from IPython:
>
> In [2]: rc.executeAll('from mpi4py import MPI')
> Out[2]:
> <Results List>
> [0] In [1]: from mpi4py import MPI
> [1] In [1]: from mpi4py import MPI
> [2] In [1]: from mpi4py import MPI
> [3] In [1]: from mpi4py import MPI
> [4] In [1]: from mpi4py import MPI
> [5] In [1]: from mpi4py import MPI
> [6] In [1]: from mpi4py import MPI
>
>
> In [3]: rc.execute(1, 'print MPI.COMM_WORLD.size')
> Out[3]:
> <Results List>
> [1] In [2]: print MPI.COMM_WORLD.size
> [1] Out[2]: 1
>
> so it looks like the IPEngines don't know they're in the same world...
>
> Anyway, I look forward to seeing the mini-tutorial later on, but please
> don't rush on my account.

What platform are you on?  I've just wasted a few hours chasing the
same behavior you're seeing, only to find out that OpenMPI on Ubuntu
Gutsy is completely #$@^* broken!!!

https://bugs.launchpad.net/ubuntu/+source/openmpi/+bug/152273

<rant> As much of an Ubuntu fan as I've been over the years, lately
I've had a few encounters like this that are annoying the hell out of
me.  We all have bugs, but a distro shipping packages that flat-out
don't work at all, and bugs that make the kernel *completely unusable
in text mode* (with configurations that have worked for only, oh, 10
years), lingering unfixed for months despite dozens of confirmed
reports... AGGGGHHHH.
</rant>

There, I feel much better.

So, the solution: if you're having MPI problems, first try a simple test:

tlon[~]> cat mpitest.py
import mpi4py.MPI as mpi
print mpi.COMM_WORLD.size

Run this as

tlon[~]> mpirun -n 4 python mpitest.py
[tlon:18863] mca: base: component_find: unable to open osc pt2pt: file
not found (ignored)
[tlon:18864] mca: base: component_find: unable to open osc pt2pt: file
not found (ignored)
[tlon:18866] mca: base: component_find: unable to open osc pt2pt: file
not found (ignored)
[tlon:18869] mca: base: component_find: unable to open osc pt2pt: file
not found (ignored)
4
4
4
4

(ignore the 'pt2pt' warnings at the top; they're just OpenMPI line noise).

I was also getting '1' before with mpi4py installed via easy_install.
I went back and:

0. Nuked all my MPI-related installs from Ubuntu (MPICH, LAM,
OpenMPI).  Triple-checked that mpirun, mpicc and friends were NOT
available.

1. Built OpenMPI from scratch from

http://www.open-mpi.org/software/ompi/v1.2/

into /usr/local.

2. Re-installed mpi4py, to make sure it got built against the clean
OpenMPI from step 1.
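
In case it helps, the sequence is roughly the usual configure/make
dance (the exact tarball version and paths below are just placeholders;
use whatever you pulled from the page above):

tlon[~]> cd openmpi-1.2.x                 # the unpacked 1.2 tarball
tlon[~]> ./configure --prefix=/usr/local
tlon[~]> make && sudo make install
tlon[~]> cd ../mpi4py-x.y                 # the mpi4py source tree
tlon[~]> python setup.py install          # assumes /usr/local/bin/mpicc is now first on $PATH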

Now it's working fine:

In [23]: rc.getIDs()
Out[23]: [0, 1, 2, 3]

In [24]: autopx
Auto Parallel Enabled
Type %autopx to disable

In [25]: import mpi4py.MPI as mpi
<Results List>
[0] In [2]: import mpi4py.MPI as mpi
[1] In [2]: import mpi4py.MPI as mpi
[2] In [2]: import mpi4py.MPI as mpi
[3] In [2]: import mpi4py.MPI as mpi


In [26]: print mpi.COMM_WORLD.size
<Results List>
[0] In [3]: print mpi.COMM_WORLD.size
[0] Out[3]: 4

[1] In [3]: print mpi.COMM_WORLD.size
[1] Out[3]: 4

[2] In [3]: print mpi.COMM_WORLD.size
[2] Out[3]: 4

[3] In [3]: print mpi.COMM_WORLD.size
[3] Out[3]: 4



In [27]: print mpi.COMM_WORLD.rank
<Results List>
[0] In [4]: print mpi.COMM_WORLD.rank
[0] Out[4]: 0

[1] In [4]: print mpi.COMM_WORLD.rank
[1] Out[4]: 2

[2] In [4]: print mpi.COMM_WORLD.rank
[2] Out[4]: 3

[3] In [4]: print mpi.COMM_WORLD.rank
[3] Out[4]: 1
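
And since I still owe you the actual scatter example: once the engines
share a COMM_WORLD like above, the idea is simply to build the array on
the rank-0 process and Scatter it from there.  A rough, untested sketch
(it assumes numpy, an mpi4py new enough to have the buffer-based
Scatter, and that the number of engines divides the rows evenly; run it
from the client after toggling %autopx off):

rc.executeAll("""
import mpi4py.MPI as mpi
import numpy as np

comm = mpi.COMM_WORLD
nrows, ncols = 5000, 1000

if comm.rank == 0:
    # the full array only ever exists on the MPI-rank-0 engine
    full = np.random.rand(nrows, ncols)
else:
    full = None

# every engine allocates its own block of rows...
local = np.empty((nrows // comm.size, ncols), dtype='d')
# ...and the scatter fills it, straight over MPI, never touching the controller
comm.Scatter(full, local, root=0)
""")

The proper step-by-step writeup will have to wait for the promised
mini-tutorial, but that's the core of it.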



Let me know if that helps...

Cheers,

f


