random

Alex Martelli aleaxit at yahoo.com
Mon Jun 4 04:28:17 EDT 2001


"Darren New" <dnew at san.rr.com> wrote in message
news:3B1AE07F.CEDA4A3A at san.rr.com...
    ...[snip: assertions on the deep properties of physical
        reality that I am not qualified to argue]...
> > That's like saying that
> > "perfect information" about an algorithm tells me only it's a good
> > one, and gives me no information about the state it starts in.
>
> No, it's not. I'm not sure why you assert this. An algorithm is a
> mathematical concept, and hence fully defined.

Even in a finitistic definition of algorithm, there may still be
parameters needed to describe it completely.


> > > > I still haven't heard your definition of
> > > > 'perfect' randomness, for example.
> > >
> > > Generally, this is something along the lines of:
> > > Given all available information about the mechanism used to generate the
> >
> > "available" seems to play a rather crucial role here, then.
>
> No. "available" means "possible to be obtained in any way." What you

So that the randomness of an existing system can suddenly
change if somebody dies, for example?  Because, e.g., part
of the information about the system was only "available" by
asking him (he had memorized it) -- with that information,
in this thought-experiment, things were predictable -- now
the designer dies, the information is not available any more,
and the system, no longer predictable, thus becomes 'random'?

I would rather not use a definition of "perfect" that requires
accounting for such factors -- considering human beings as
parts of the system, knowing whether they ever wrote that
information down somewhere (it might still be 'obtained in some
way' if I could root through their papers and maybe if I was
able to decipher some private code they used -- or maybe
not) or communicated it to someone else (maybe to their
priest during confession?)... all of which seems a rather shaky
basis for a mathematical definition.

Why don't we follow De Finetti's lead?  P(x) does not 'exist'
per se, it's just a shorthand for P(x|y) where y is a 'mindset',
a sum total of information about the world, that is being
implicitly assumed.  All probabilities are 'subjective' in the
specific sense of depending on that y -- there is no 'perfect
reference y' (God's all-knowing mind) that needs to be taken
as an assumption -- as long as we fix y (ideally, explicitly)
we can do all of our probability computations equally well.
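
Just to make the P(x|y) shorthand concrete, here's a tiny sketch (my
own toy example, of course, nothing of De Finetti's): the very same
event x gets different probabilities under two different
information-states y.

    from fractions import Fraction

    # Event x: "the next card drawn from a standard deck is an ace".

    # y1: ten cards were already dealt, but we know nothing about them;
    # by symmetry any of the 52 cards is equally likely to come up next.
    p_x_given_y1 = Fraction(4, 52)

    # y2: we saw those ten dealt cards, and three of them were aces.
    p_x_given_y2 = Fraction(4 - 3, 52 - 10)

    print(p_x_given_y1, p_x_given_y2)   # 1/13 versus 1/42: same x, different y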

Now whether you believe there is an ideal y such that all P(x|y)
are 0 or 1 for that y, or believe that no such y exists --
whether God plays dice or not:-) -- becomes irrelevant
enough that we can do maths without first having to solve
theological questions.  Isn't randomness
(or "perfect" randomness) akin to probability in this?  Or
IS it just a theological issue, so that any hope of tackling
it apart from theology is doomed?


> > That would only leave the arbitrarily long sequence of generated
> > bits and no other info?  Fine, then:
>
> > > I don't think Chaitin's Omega meets this definition, since it is by
> > > definition a single number that anyone can calculate equally well.

It's one number PER Universal Turing Machine, I believe.  I
do not know if from the first N bits of Omega you get enough
information to reconstruct all parameters of the UTM in question
so as to be able to compute the N+1-th bit, but I believe that
part of the point is that you *don't*.
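
To see, in a very loose way, how the number depends on the machine,
here's a toy sketch (the two little machines below are ones I'm making
up on the spot, nowhere near universal, and the number computed is just
a bounded-length, bounded-steps halting fraction, nothing like a true
Omega -- but it does come out different for each machine):

    from itertools import product

    def halts_A(bits, max_steps=200):
        # Machine A: one counter; a 1-bit increments it, a 0-bit halts
        # if the counter is zero, else decrements it; the program wraps.
        counter, pc = 0, 0
        for _ in range(max_steps):
            if bits[pc] == 1:
                counter += 1
            elif counter == 0:
                return True
            else:
                counter -= 1
            pc = (pc + 1) % len(bits)
        return False

    def halts_B(bits, max_steps=200):
        # Machine B: same, except a 0-bit halts when the counter is
        # even, and otherwise resets it to zero.
        counter, pc = 0, 0
        for _ in range(max_steps):
            if bits[pc] == 1:
                counter += 1
            elif counter % 2 == 0:
                return True
            else:
                counter = 0
            pc = (pc + 1) % len(bits)
        return False

    def halting_fraction(halts, length=12):
        # Fraction of all length-12 bit programs that halt on the given
        # machine within the step budget.
        programs = list(product((0, 1), repeat=length))
        return sum(halts(p) for p in programs) / len(programs)

    # Same programs, two machines, two different "halting numbers".
    print(halting_fraction(halts_A), halting_fraction(halts_B))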

> > But from Omega's first N bits, you get no hint at all that lets
> > you predict its N+1-th bit "with > 50% accuracy".
>
> How did you get the first N bits? If you can calculate them, so can I.

Say I have an idealized black box that was made for me by a
demented but genial Transylvanian dwarf, who, unfortunately,
has since passed away.  I do know with certainty (he was a VERY
trustworthy demented but genial Transylvanian dwarf) that
what the box does each time I press the button is compute
the next bit of Omega for some UTM -- alas, I don't know WHAT
Universal Turing Machine encoding it uses, and the box with
the "algorithm" (or effective procedure, whatever) is rigged so
that any attempt to examine its innards will destroy it (I think
my deceased friend didn't believe humanity was ready to learn
how he made the infinite-precision registers that are such a
crucial part of the machine's operation, and, you know, he may
have been right after all).

So, "all available information" is: this computes Omega for SOME
UTM -- sadly, we don't know WHAT UTM that is.  There, you have
all _available_ information.  Now (assuming you can't come and
steal my treasured little black box -- it's well-guarded, I assure
you... you don't want to brave the killer rabbit, for example, and
that's just the start of it!), how are you going to calculate?  I'll
send you the first N bits, for any N, and you just need to give
me the N+1-th one.

Note that 'all available info' does include the algorithm, net of
some numeric parameters.  If I (oops, I mean my dwarvish
friend) had used a "weaker" algorithm, say a linear-congruential
generator, you would indeed be well placed in the totally similar
situation where you know 'the algorithm' but not the numeric
parameters (e.g. the 'seed', as you put it).  Analyzing the N
bits for a high-enough N might well suffice -- see the sketch below.
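
A concrete little sketch of that "weaker" case (my own toy setup: the
modulus is assumed known, the multiplier, increment and seed are not;
the specific constants are just the classic textbook ones and don't
really matter):

    # Toy LCG "black box": we know it computes x -> (a*x + c) % M for a
    # known modulus M = 2**31, but a, c and the seed are kept secret.
    M = 2 ** 31

    def lcg(seed, a=1103515245, c=12345):
        x = seed
        while True:
            x = (a * x + c) % M
            yield x

    gen = lcg(seed=20010604)              # the "unavailable" information
    x0, x1, x2, x3 = (next(gen) for _ in range(4))

    # From three consecutive outputs: x2 - x1 = a*(x1 - x0)  (mod M),
    # so recover a and c (Python 3.8+ for the modular inverse via pow):
    a_rec = ((x2 - x1) * pow((x1 - x0) % M, -1, M)) % M
    c_rec = (x1 - a_rec * x0) % M

    assert (a_rec * x2 + c_rec) % M == x3   # next "random" number, predicted

Three outputs and the whole stream is an open book -- which is exactly
why nobody would call such a generator's output 'perfectly random',
whatever the seed.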


> > So, if the
> > definition does not include the first condition, from the bits
> > only you're not going to get any farther than from N tosses
> > of an idealized coin.
>
> Sure. *I* can write an algorithm to calculate the bits of Omega. Then I

For what UTM encoding, pray?

> run my algorithm, see that the first N bits of its output matches the
> first N bits of your sequence, and predict with very high probability
> what the next bit is, right?

I don't believe so -- I don't think those N bits let you reconstruct
the UTM encoding being used.  I hope I'll be more positive about
it once I _have_ received and studied 'Exploring Randomness'.

Until then, I can only hypothesize -- IF some amount N of bits
was sufficient to let you reconstruct the UTM parameters, and
therefore calculate the N+1-th bit, THEN I would agree that
Omega's randomness was bounded by N (<= N).  IF no amount
N of bits has this property, would you agree in turn as to the
'perfection' of the randomness (at least once the dwarf has
died, or has irretrievably forgotten how the heck he DID build
the darned algorithmic black box in the first place)?


> If you calculate the first 100,000 bits of
> Omega and give it to me, and you don't tell me that it's the first
> 100,000 bits of Omega, and I think to try calculating the first 100,000
> bits of Omega and they match, then I guess the 100,001'st bit of Omega,
> then I have a >50% chance of getting the next bit right.

Again the UTM parameters loom large.  At least until some ISO
subcommittee comes up with a standardized Universal Turing
Machine (or maybe the IEEE could do it, as for floating point?-)...

> If, on the other hand, you run a random number generator based on
> quantum radioactive decay, or you flip a coin 100,000 times, and it just
> *happens* to match the first 100,000 bits of Omega, and I notice that,
> and I guess the 100,001'st bit of Omega, then I have still only a 50%
> probability of being right.

I think the situation, looking at the bits only, would be the same
in both cases.  Now I'm starting to wonder if a UTM encoding (or
an infinite set of possible such encodings, actually) could be found
to "account" for ANY such given first-100,000-bits.  Maybe not --
perhaps I cannot encode an UTM so that random self-delimiting
programs for it have _any_ prespecified probability of halting
(or at least to N bits' approximation).


> This is why some people say that any given finite string cannot be
> considered random or non-random by itself, without knowing how it's
> generated.

The 'infinity' aspects with which 'perfect randomness' seems to be
imbued may well account for that unwillingness to bestow the
now-hallowed moniker 'random' onto any mere finite structure, I
guess.  I still think it's very useful to study the randomness of such
finite strings, and apparently so do Chaitin and all other scholars in
that field.  However it was generated, that bitstring now has its
own intrinsic properties.  A man in the street might well call it
'pretty well-ordered' or 'rather random' -- so why should the word
'random' be pre-empted, left with no meaning at all in this context,
when a useful technical sense, one that happens to match pretty well
the way the man in the street would use the word, can be applied
instead?
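
And for a cheap, very rough way to put a number on how 'rather random'
versus 'pretty well-ordered' a given finite string is, one can use a
general-purpose compressor as a poor man's stand-in for program-size
complexity -- zlib is nothing like an optimal compressor, of course,
but it already separates the two man-in-the-street verdicts nicely:

    import random
    import zlib

    ordered = b"01" * 50000                  # 100,000 very regular characters
    noisy = bytes(random.getrandbits(8) for _ in range(100000))

    for label, data in (("ordered", ordered), ("noisy", noisy)):
        ratio = len(zlib.compress(data, 9)) / len(data)
        print(label, round(ratio, 3))        # tiny ratio vs. close to 1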


Alex






