[Tutor] Floating Point Craziness

Ryan Strunk ryan.strunk at gmail.com
Sun Jun 12 15:43:10 CEST 2011


Hi everyone,
I'm designing a timeline. When the user presses the right arrow, 0.1 is
added to the current position. The user can add events to the timeline, and
can later scroll back across those events to see what they are. But
something I absolutely don't understand is happening:
I used the program to place events at 1.1, 2.1, and 3.1. Here is the end of
the debug output for arrowing to 3.1 and placing the event:
position = 2.7
position = 2.8
position = 2.9
position = 3.0
position = 3.1
event placed.
Everything appears straightforward. But then when I check the dictionary's
contents:
dictionary = {3.1000000000000014: value, 2.1000000000000005: value,
1.0999999999999999: value}
Why is this happening? The output is telling me 3.1, but the event isn't
being stored under that key. I've tried rounding the 0.1 increment to one
decimal place, but the dictionary keys still come out wrong.
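In case it helps, here is a stripped-down sketch that reproduces what I'm
seeing (the names position and events are just placeholders, not my real
variable names):

position = 0.0
events = {}
for press in range(31):   # 31 right-arrow presses of 0.1 each, to reach "3.1"
    position += 0.1
print("position = %s" % position)  # Python 2's str() displays this as 3.1
print(repr(position))              # the value actually stored: 3.1000000000000014
events[position] = "value"
print(events)                      # {3.1000000000000014: 'value'}
# Rounding the increment itself doesn't seem to change anything, since
# round(0.1, 1) is the exact same float as 0.1.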
Thanks for any help you can provide.
Best,
Ryan


