Restricting the alphabet of a string
Paul McGuire
ptmcg at austin.rr.com
Mon Apr 30 10:40:35 EDT 2007
On Apr 30, 9:00 am, 7stud <bbxx789_0... at yahoo.com> wrote:
> On Apr 30, 5:53 am, "Nathan Harmston" <ratchetg... at googlemail.com>
> wrote:
>
> > Hi,
>
> > I've been thinking about playing around with bit strings for use in
> > some encoding problems I'm considering, and was trying to decide how to
> > implement a bit string class. My first question: is there already a
> > library out there for doing basic things with bit strings?
>
> > I know that I can subclass str for a bit string, but is there any way I
> > can force the alphabet to be restricted to 1's and 0's (or even 1, 0,
> > and -1, as an extension to ternary strings)?
>
> > class Binary_String(String):
> >     pass
>
> > Many Thanks
>
> > Nathan
>
> You could do something like this:
>
> class Binary_String(str):
>     def __init__(self, val):
>         i = 0
>         for char in val:
>             if char not in "-101":
>                 raise ValueError("illegal character at index " +
>                                  str(i) + "\nOnly -1,0,1 allowed.")
>             i += 1
>         self.val = val
>
> b = Binary_String("1a100")
>
> --output:--
> Traceback (most recent call last):
> File "test1.py", line 13, in ?
> b = Binary_String("1a100")
> File "test1.py", line 6, in __init__
> raise ValueError("illegal character at index " + str(i)
> ValueError: illegal character at index 1
> Only -1,0,1 allowed.
Your character test only needs to check whether the character is in "-10",
not "-101". In fact, the test is a little misleading: it appears to
check for "-1", but it is really only checking for "-"s and "1"s (and
"0"s too, of course).
This code also accepts "1-0101" -- but what is -0?
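One way to avoid both problems is to validate whole tokens rather than
single characters. A minimal sketch (names are mine, not from the thread):
a regex that matches the string as a sequence of "-1", "0", and "1" tokens,
so a stray "-" or a "-0" is rejected. Since str is immutable, the check
goes in __new__ rather than __init__.

```python
import re

# A valid ternary string is zero or more of the tokens -1, 0, 1,
# with nothing left over at the end.
_TERNARY = re.compile(r"(?:-1|0|1)*\Z")

class TernaryString(str):
    def __new__(cls, val):
        # str is immutable, so validation belongs in __new__.
        if not _TERNARY.match(val):
            raise ValueError("not a sequence of -1/0/1 tokens: %r" % val)
        return super().__new__(cls, val)
```

With this, TernaryString("-10-11") is accepted, while TernaryString("1-0101")
raises ValueError, because the "-" is only legal as part of a "-1" token.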
-- Paul