OT: This Swift thing

Steven D'Aprano steve+comp.lang.python at pearwood.info
Wed Jun 11 22:08:45 EDT 2014


On Wed, 11 Jun 2014 08:28:43 -0700, Rustom Mody wrote:

> Steven D'Aprano wrote:
>
>> Not the point. There's a minimum amount of energy required to flip a
>> bit. Everything beyond that is, in a sense, just wasted. You mentioned
>> this yourself in your previous post. It's a *really* tiny amount of
>> energy: about 17 meV at room temperature. That's 17 milli
>> electron-volt, or 2.7×10^-21 joules. In comparison, Intel CMOS
>> transistors have a gate charging energy of about 62500 eV (1×10^-14 J),
>> around 3.7 million times greater.
>>  
>> Broadly speaking, if the fundamental thermodynamic minimum amount of
>> energy needed to flip a bit takes the equivalent of a single grain of
>> white rice, then our current computing technology uses the equivalent
>> of 175 Big Macs.
> 
> Well, that's in the same realm as saying that by E=mc² a one gram stone
> can yield 21 billion calories of energy.
[...]
> i.e., from a pragmatic/engineering pov we know as much about how to use
> Einstein's energy-mass equivalence to generate energy as we know about
> how to use Landauer's principle to optimally flip bits.

You know, I think that the people of Hiroshima and Nagasaki and Chernobyl 
and Fukushima (to mention only a few places) might disagree.

We know *much more* about generating energy from E = mc^2 than we know 
about optimally flipping bits. Our nuclear reactions convert something of 
the order of 0.1% of their fuel to energy; that is, to get a certain 
yield, we "merely" have to supply about a thousand times more fuel than 
we theoretically need. That's still thousands of times better than the 
efficiency of current bit-flipping technology.

We build great big clanking mechanical devices out of lumps of steel that 
reach 25% - 50% of the theoretical maximum efficiency:

https://en.wikipedia.org/wiki/Thermal_efficiency

while our computational technology is something of the order of 0.00001% 
efficient. I'm just pointing out that our computational technology uses 
over a million times more energy than the theoretical minimum, and 
therefore there is a lot of room for efficiency gains without sacrificing 
computing power. I never imagined that such a viewpoint would turn out to 
be so controversial.
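For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch in Python. It computes the Landauer limit (k_B * T * ln 2) at an assumed room temperature of 300 K and compares it against the ~1e-14 J gate charging energy quoted above; the exact figures in the post presumably used slightly different rounding.

```python
import math

# Physical constants (CODATA 2018 exact values).
k_B = 1.380649e-23      # Boltzmann constant, J/K
eV = 1.602176634e-19    # joules per electron-volt

T = 300.0               # assumed room temperature, K

# Landauer limit: minimum energy required to erase/flip one bit.
landauer_J = k_B * T * math.log(2)
landauer_meV = landauer_J / eV * 1000

# Approximate CMOS gate charging energy, as quoted in the post.
gate_J = 1e-14

print(f"Landauer limit at {T:.0f} K: {landauer_J:.2e} J ({landauer_meV:.0f} meV)")
print(f"Gate energy / Landauer limit: {gate_J / landauer_J:.1e}")
```

This gives roughly 2.9e-21 J (about 18 meV) and a ratio of about 3.5 million, in the same ballpark as the 17 meV and "3.7 million times" figures in the quoted text.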

-- 
Steven D'Aprano
http://import-that.dreamwidth.org/
