[Moin-user] Lupy indexer errors with "too many open files"

Rettig, Lane Lane.Rettig at deshaw.com
Thu Feb 1 09:17:27 EST 2007


> > I tried reducing the mergeFactor in support/lupy/index/indexwriter.py
> > from 20 to 10, as recommended elsewhere, but this didn't help.  Nor
> > did upping the limit on the max no. of file descriptors for the
> > process from 256 to 1024.
> 
> Maybe try some other values. But please understand that we won't
> invest much time into debugging this, as the lupy code doesn't exist
> any more in 1.6 (it was replaced by much better xapian code).

FWIW, this turned out to be a Solaris limitation: 32-bit Solaris stdio
cannot fopen() a file descriptor above 255 (and Python's buffered file
objects go through fopen()), no matter how high the per-process
descriptor limit is raised.  The Lupy indexer hit that ceiling and
failed somewhere between 2000 and 3000 pages into the indexing run.
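For the archives, here is a minimal sketch that reproduces the failure
outside of MoinMoin.  It assumes CPython 2 on 32-bit Solaris, where the
built-in open() is backed by stdio's fopen(); the setrlimit() call and
the numbers are just for illustration, not anything MoinMoin does:

    import os, resource

    # Raise the per-process descriptor limit well past 256 (assumes the
    # hard limit is at least 1024); the stdio problem below is
    # independent of this rlimit.
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    resource.setrlimit(resource.RLIMIT_NOFILE, (1024, hard))

    # os.open() is a raw open(2) call, so it happily hands out
    # descriptors above 255; use it to consume the low-numbered ones.
    held = [os.open('/dev/null', os.O_RDONLY) for i in range(300)]

    # CPython 2's built-in open() goes through fopen().  32-bit Solaris
    # stdio stores the descriptor in an unsigned char, so once the next
    # free descriptor is above 255 this raises
    # "IOError: [Errno 24] Too many open files", even though the rlimit
    # allows 1024 open files.
    f = open('/dev/null')
    print f.fileno()

On platforms whose stdio is not limited to descriptors 0-255, the same
script simply prints a descriptor number above 255.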

> Concerning production use, you could maybe just start with the latest
> 1.5.x release (and the non-lupy slow search, which usually works quite
> well for small wikis), build up your content (which will take some
> time, I guess), and later switch to 1.6 when it is released and the
> slow search becomes a real problem for you.

Thanks for the information, and for the advice.

Lane Rettig



