H. Bremermann proved in 1962 that “No data processing system, whether artificial or living, can process more than 2 x 10^{47} bits per second per gram of its mass,” which means that a hypothetical supercomputer the size of earth (= about 6 x 10^{27} grams) grinding away for as long as the earth has existed (= about 10^{10} years) can have processed at most 2.56 x 10^{92} bits, which number is known as Bremermann’s Limit.

Calculations involving numbers larger than 2.56 x 10^{92} are called transcomputational problems, meaning they’re not even theoretically doable; and there are plenty of such problems in statistical physics, complexity theory, fractals, etc.

— *Everything and More* by David Foster Wallace
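The arithmetic in the quote is easy to sanity-check: multiply the per-gram rate by the earth’s mass and by the earth’s age in seconds. A quick back-of-the-envelope sketch (the seconds-per-year conversion is my assumption; the quote doesn’t specify one, which is why the result lands near but not exactly on 2.56 x 10^{92}):

```python
# Order-of-magnitude check of Bremermann's Limit using the figures quoted above.
rate_per_gram = 2e47        # bits per second per gram (Bremermann, 1962)
earth_mass = 6e27           # grams
earth_age_years = 1e10      # years
seconds_per_year = 3.156e7  # assumed conversion, ~365.25 days

total_bits = rate_per_gram * earth_mass * earth_age_years * seconds_per_year
print(f"{total_bits:.2e}")  # lands on the order of 10^92
```

The product comes out around 3.8 x 10^{92}, the same order of magnitude as the 2.56 x 10^{92} figure in the quote; the exact coefficient depends on which year-length and earth-age roundings one plugs in.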


This entry was posted on October 2, 2007 at 7:57 am and is filed under 1962, Bremermann's Limit, complexity theory, data processing system, fractals, H. Bremermann, statistical physics, transcomputational problems, ultimate limits of computing.
