Restricting the alphabet of a string
7stud
bbxx789_05ss at yahoo.com
Mon Apr 30 10:00:29 EDT 2007
On Apr 30, 5:53 am, "Nathan Harmston" <ratchetg... at googlemail.com>
wrote:
> Hi,
>
> I've been thinking about playing around with bit strings for use in
> some encoding problems I'm considering, and was trying to decide how to
> implement a bit string class. Is there a library out there for doing
> basic things with bit strings already is my first question?
>
> I know that I can extend string to bit string, but is there any way I
> can force the alphabet to be restricted to 1's and 0's (or even 1, 0
> and -1, as an extension to form ternary strings)?
>
> class Binary_String(String):
>     pass
>
> Many Thanks
>
> Nathan
You could do something like this:
class Binary_String(str):
    def __init__(self, val):
        i = 0
        for char in val:
            # Note: this checks single characters, so a lone '-' anywhere
            # passes; it does not verify that '-' is followed by a '1'.
            if char not in "-101":
                raise ValueError("illegal character at index " +
                                 str(i) + "\nOnly -1,0,1 allowed.")
            i += 1
        self.val = val  # redundant: a str subclass already holds its value

b = Binary_String("1a0101")
--output:--
Traceback (most recent call last):
  File "test1.py", line 13, in ?
    b = Binary_String("1a0101")
  File "test1.py", line 6, in __init__
    raise ValueError("illegal character at index " + str(i)
ValueError: illegal character at index 1
Only -1,0,1 allowed.
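A variant not from the thread, offered as a sketch: since str is immutable, the validation is more idiomatic in __new__, which can reject the value before the string object is handed back. The class name BinaryString and the set-based check are my own choices here, restricting the alphabet to '0' and '1' only:

```python
class BinaryString(str):
    """A str subclass whose alphabet is restricted to '0' and '1'.

    Validation lives in __new__ because str is immutable; raising
    there prevents an invalid instance from ever being constructed.
    """
    _alphabet = frozenset("01")

    def __new__(cls, value):
        # Collect every character outside the allowed alphabet.
        bad = set(value) - cls._alphabet
        if bad:
            raise ValueError("illegal characters %r: only 0 and 1 allowed"
                             % sorted(bad))
        return super(BinaryString, cls).__new__(cls, value)


b = BinaryString("10101")
print(b)            # behaves like an ordinary string
try:
    BinaryString("1a0101")
except ValueError as e:
    print(e)
```

Because the subclass does not override any other methods, slicing or concatenation returns plain str objects, not BinaryString; re-wrap the result if the restriction must be preserved.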