are int, float, long, double, side-effects of computer engineering?

Westley Martínez anikom15 at gmail.com
Tue Mar 6 19:53:43 EST 2012


On Tue, Mar 06, 2012 at 04:29:10PM -0500, Calvin Kim wrote:
> On 03/06/2012 01:34 AM, Xah Lee wrote:
> >while what you said is true, but the problem is that 99.99% of
> >programers do NOT know this. They do not know Mathematica. They've
> >never seen a language with such feature. The concept is alien. This is
> >what i'd like to point out and spread awareness.
> >
> I can see your point, but that's simply not true. In my case and
> many others', this issue was addressed during the first week of
> introductory programming classes. I naively thought "computer =
> precision" and was stunned to find out how inaccurate computer
> calculations can be.
> 
> But as you experienced, I also stumble upon some people (especially
> Java-only programmers) who are not aware of it.
> 
> >Also, the argument about raw speed and fine control vs. automatic
> >management rots with time. It happened with automatic memory
> >management, managed code, compilers, automatic type conversion,
> >automatic array extension, automatic type systems, dynamic/scripting
> >languages, etc.
> Maybe it's because I'm not in the scientific community that I learned
> to live with such side effects. 99.99% of computer users and
> programmers can afford to, and are willing to, lose such a small
> inaccuracy a billion times over in exchange for some performance
> increase and convenience. Although NASA may not accept my application
> for their Mars mission projects after this posting.
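
The inaccuracy Calvin describes is easy to reproduce at the interpreter
prompt.  A minimal sketch (nothing beyond the standard library is
assumed): binary floating point cannot represent 0.1 or 0.2 exactly, so
the sum picks up a rounding error, while the fractions module keeps
exact rational values:

    from fractions import Fraction

    # 0.1 and 0.2 have no exact binary representation, so the
    # double-precision sum carries a tiny rounding error.
    print(0.1 + 0.2)          # 0.30000000000000004
    print(0.1 + 0.2 == 0.3)   # False

    # Exact rational arithmetic with the standard-library fractions module.
    print(Fraction(1, 10) + Fraction(2, 10))                     # 3/10
    print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True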

Also remember that double precision is not the maximum.  IEEE 754-2008
also standardizes a quadruple-precision (binary128) format, and there
are other extended formats as well, such as the 80-bit x87 type.
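
On the Python side, a float is a C double (IEEE 754 binary64, 53 bits
of mantissa).  If you want more precision than that in pure Python, the
standard-library decimal module does arbitrary-precision arithmetic in
software; this is not the same thing as a hardware quad format, but it
shows the idea.  A small sketch:

    import sys
    from decimal import Decimal, getcontext

    # CPython's float is an IEEE 754 double: 53 bits of mantissa,
    # roughly 15-17 significant decimal digits.
    print(sys.float_info.mant_dig)   # 53

    # decimal lets you pick the working precision yourself,
    # e.g. 50 significant digits here.
    getcontext().prec = 50
    print(Decimal(1) / Decimal(7))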


