automatic "lotjes trekken"

Tim Peters tim_one at email.msn.com
Tue Nov 30 01:59:48 EST 1999


[Carel Fellinger]
> ...
> glad my kids aren't lurking on this newsgroup yet; otherwise they
> would have argued for a redraw.

I think they should anyway:  isn't half the fun of these silly rituals
getting together and physically drawing lots?  Most things that get
computerized shouldn't <wink>.

> Anyhow, now I'm forced to redo the programming. Thanks for sharing
> all the insights!

My pleasure -- regurgitating was exactly what I was in the mood for, with a
belly full of far too much turkey.

Here's one last approach to counting the number of solutions:  view the
chessboard as an NxN matrix, with a 1 everywhere a square is empty, and a 0
everywhere there's cheese.  The number of solutions is then the so-called
"permanent" of this 0-1 matrix.  This is exactly like the determinant,
except all the signs are positive (it's the sum of the N! terms of the form
A[0, p[0]] * A[1, p[1]] * ... * A[N-1, p[N-1]] where p[:] ranges over all
permutations of range(N); the determinant has the same definition, except
half the terms in the sum are negated -- and that makes a world of
difference!).
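For concreteness, here's a minimal brute-force sketch (modern Python; the name `permanent` and the 3x3 example are mine) that evaluates the permanent straight from that N!-term definition:

```python
from itertools import permutations

def permanent(A):
    """Sum over all N! permutations p of A[0][p[0]] * ... * A[N-1][p[N-1]]."""
    n = len(A)
    total = 0
    for p in permutations(range(n)):
        term = 1
        for i in range(n):
            term *= A[i][p[i]]
            if term == 0:          # a forbidden square zeroes the whole term
                break
        total += term
    return total

# 3x3 board with cheese down the diagonal: counts the derangements of 3 items
board = [[0, 1, 1],
         [1, 0, 1],
         [1, 1, 0]]
print(permanent(board))  # 2
```

Each nonzero term corresponds to one way of placing the N mice on empty squares, one per row and column, so the sum counts the solutions directly.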

Knuth Vol 2 is a good reference for "efficient" methods of evaluating the
permanent (although he doesn't make any connection to permutations with
forbidden positions, it's obvious to the most casual observer <wink>).  At
least as of the 3rd edition of vol. 2, the fastest known methods for finding
the permanent of an arbitrary 0-1 matrix take time exponential in N.  The
only winnable battle here is in reducing the constant factor.
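One such "merely exponential" method is Ryser's inclusion-exclusion formula, which replaces the N! terms of the definition with 2**N subset terms -- a big win for moderate N, though still exponential. A sketch (function name mine):

```python
def permanent_ryser(A):
    """Permanent via Ryser's inclusion-exclusion formula: O(2**n * n**2)
    work instead of the O(n! * n) of the direct definition."""
    n = len(A)
    total = 0
    for mask in range(1, 1 << n):      # every nonempty subset S of columns
        prod = 1
        for i in range(n):
            prod *= sum(A[i][j] for j in range(n) if mask >> j & 1)
            if prod == 0:              # this subset contributes nothing
                break
        # inclusion-exclusion sign depends on n - |S|
        sign = -1 if (n - bin(mask).count("1")) % 2 else 1
        total += sign * prod
    return total
```

For the mouse/cheese board, N=20 means about a million subset terms instead of 20! ~= 2.4e18 permutation terms -- which is exactly the "reducing the constant factor" battle, fought and won.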

I'm afraid that's just the nature of the beast:  combinatorics is full of
problems that are easy to state but inherently difficult and/or expensive to
solve.

> ...
> the solution as i posted bailed out after 1000 tries, not exactly
> forever:)

Yes, except that wasn't actually a solution <0.5 wink>.

>> The expected number of permutations you have to generate at
>> random before you see each of the N! possibilities at least once
>> is about N! * ln(N!) (ln == log to the base e). [and if you cut
>> off the search earlier than that, you're more likely than not
>> to have missed at least one permutation completely]

> But I have to rethink this one. I'm not trying to generate all
> solutions, just one. So in what way would this make the algorithm
> unfair? Each time I call this program is an event on its own,
> unrelated to the previous. So the 'missing' solution might well
> be a different one in each run.

The paragraph to which you respond here was explaining why cutting the
search off "early" is likely cutting it off even earlier than you might
think.  That doesn't make it unfair, it makes it unlikely to terminate (your
original complaint was that your program took longer than you hoped for --
this was trying to explain why).

For a given instance of the problem, it doesn't matter whether you run the
algorithm 10 times bailing out at a thousand misses each time, or run it
once bailing out at 10,000 misses.  The expected number of individual
permutations looked at is the same either way.
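To put a rough number on that, here's a small simulation sketch for N=4 (so M = 24 permutations; the names are mine).  The leading-term estimate M*ln(M) is about 76 draws, and the exact coupon-collector expectation M*H_M (H_M the Mth harmonic number) is about 91:

```python
import random
from math import log
from itertools import permutations

def draws_until_all_seen(items, rng):
    """Draw uniformly with replacement until every item has appeared once."""
    seen, draws = set(), 0
    while len(seen) < len(items):
        seen.add(rng.choice(items))
        draws += 1
    return draws

perms = list(permutations(range(4)))   # the 24 permutations of 4 elements
M = len(perms)
rng = random.Random(1999)              # fixed seed so the run is repeatable
trials = 2000
avg = sum(draws_until_all_seen(perms, rng) for _ in range(trials)) / trials
print(f"average draws to see all {M}: {avg:.1f}  (M*ln(M) = {M * log(M):.1f})")
```

Bailing out after M draws -- or even after 2*M -- almost certainly misses some permutations, which is the point of the paragraph above.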

Whether it's fair or not depends on what you *do* with the permutations
(i.e., on details of the code after a permutation -- or part of one -- has
been generated).

> ...
> A variation of this would be to number the solutions, draw a number and
> generate that solution as I proposed in a previous post.

Yes, I didn't understand that post <wink>.  Provided the total # of
solutions is small enough, and you have an effective way to put them in
1-to-1 correspondence with a contiguous range of integers, of course that
should work.

"small enough" == within the limitations of the random-number generator; I
wouldn't trust random.random for more than 16 bits resolution, and if you
don't have more than 2**16 solutions anyway you may as well generate them in
full (since that code is already written!).
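A sketch of that enumerate-then-draw approach (the names and the 4x4 example are mine): list every permutation that avoids the forbidden squares, then let random.choice pick one, so each solution is equally likely:

```python
import random
from itertools import permutations

def all_solutions(allowed):
    """Every permutation p with allowed[i][p[i]] true for all rows i."""
    n = len(allowed)
    return [p for p in permutations(range(n))
            if all(allowed[i][p[i]] for i in range(n))]

# hypothetical 4x4 board with cheese down the diagonal
allowed = [[i != j for j in range(4)] for i in range(4)]
sols = all_solutions(allowed)    # the 9 derangements of 4 items
pick = random.choice(sols)       # every solution equally likely
print(len(sols), pick)
```

Since the whole list is in hand, one call to the generator draws the index, and the 1-to-1 correspondence comes for free from the list's own ordering.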

Note that it's as futile to ask random.random to generate a string of 61
random *bits* (or to make 61 random yes/no choices) as it is to ask it to
generate a random permutation of 20 elements:  both require generating 61
random bits of information (log base 2 of 20! ~= 61), and that's beyond the
generator's abilities.  We have to be careful about what that means, though:
generate a pile of 61-bit bitstrings, and they will "look random" to most
statistical tests.  However, no matter how long you continue generating
them, most ("almost all", in fact) 61-bit strings will never appear
(random.random's internal state contains 45 bits, so must repeat after no
more than 2**45 calls (but actually repeats earlier than that)).
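A quick check of the arithmetic:

```python
from math import factorial, log2

bits = log2(factorial(20))
print(f"log2(20!) = {bits:.2f} bits")   # just over 61 bits of information

# 45 bits of generator state can't even index all 20! outcomes:
assert 2 ** 45 < factorial(20)
```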

> I will use your info and some rainy days to figure out whether it is
> feasible at all.  Would circumvent the next deep flaw [too many solutions]
> too.

Alas, for exponentially expensive problems, you end up pouring a ton of
energy into extending a simple solution that works up to (say) N=10 to a
breathtakingly complex solution that works up to (say) N=20 <0.1 wink>.
There's really little point to it unless you simply enjoy the challenge --
in which case, that's point enough!

the-wallet-rarely-knows-what's-good-for-the-soul-ly y'rs  - tim






More information about the Python-list mailing list