What is precision of a number representation?

Antoon Pardon antoon.pardon at rece.vub.ac.be
Tue Jul 12 05:50:27 EDT 2016


On 12-07-16 at 06:19, Steven D'Aprano wrote:
> On Tue, 12 Jul 2016 07:51 am, Chris Angelico wrote:
>
>> say, 2,147
>> millimeters, with a precision of four significant digits
>
> How do you represent 1 mm to a precision of four significant digits, in such
> a way that it is distinguished from 1 mm to one significant digit, and 1 mm
> to a precision of four decimal places?
>
> 0001
> 1
> 1.0000

Your question doesn't have an answer, because 1 mm doesn't have a precision of
four significant digits.

A precision is an indication of a fault tolerance. You don't indicate a
tighter fault tolerance by writing it as 0001.

Please explain how 0001 represents a different precision from just 1.

Writing 1.0000 instead of 1 can be understood as meaning the actual number
lies between 0.99995 and 1.00005, instead of between 0.95 and 1.05.

Now between which two numbers is 0001 supposed to be?
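A minimal sketch of the interval reading above, using Python's decimal
module (which preserves trailing zeros) and assuming a
half-unit-in-the-last-place convention; the helper name implied_interval
is made up for illustration:

```python
from decimal import Decimal

def implied_interval(s):
    # The exponent of a Decimal records how many decimal places were
    # written, so "1.0000" and "1" parse differently even though they
    # compare equal.
    d = Decimal(s)
    half_ulp = Decimal(1).scaleb(d.as_tuple().exponent) / 2
    return (d - half_ulp, d + half_ulp)

print(implied_interval("1.0000"))  # (Decimal('0.99995'), Decimal('1.00005'))

# Leading zeros carry no precision information: Decimal normalises
# "0001" to the same representation as a plain "1".
print(Decimal("0001") == Decimal("1"), str(Decimal("0001")))  # True 1
```

Note that trailing zeros widen nothing and narrow the implied interval,
while leading zeros are simply discarded, which is the point being made
above.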

-- 
Antoon Pardon

More information about the Python-list mailing list