Saturday, October 07, 2006

Thermal noise driven computing

"Thermal noise driven computing," Laszlo B. Kish. APL 89, 144104 (2006)

In this paper, Kish describes driving logic gates with supply voltages on the order of the Johnson (thermal) noise voltage, i.e. extremely low DC voltages, to perform computing operations. Such a computer has the low state U_sl = 0 and the high state U_sh <= sigma, the RMS (effective) Johnson noise voltage. With noise-level signals the information content is low, but so is the energy requirement. Kish proceeds to calculate the energy requirement for bit operations, in J/bit. Starting from Shannon's channel coding theorem (which I need to look into), Kish shows that the minimum mean energy cost per bit operation is P/C = (pi/2) kT ln2, i.e. about 1.1 kT per bit. This is tiny compared to the >25,000 kT per bit that today's microprocessors dissipate. The inverse of this quantity is the energy efficiency of the data processing, which is large, ~10^20 bit/J.
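The numbers above are easy to check. A minimal sketch (my own arithmetic, assuming room temperature T = 300 K, which the paper's order-of-magnitude figures imply):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature in K (my assumption for the numerical values)

# Kish's lower bound on the mean energy cost per bit operation:
# P/C = (pi/2) * k_B * T * ln(2)
e_min = (math.pi / 2) * k_B * T * math.log(2)

print(f"min energy per bit: {e_min:.2e} J (= {e_min / (k_B * T):.2f} kT)")
print(f"energy efficiency:  {1 / e_min:.1e} bit/J")
print(f"factor vs. ~25000 kT/bit dissipated today: {25000 * k_B * T / e_min:.0f}x")
```

This reproduces the ~1.1 kT/bit and ~10^20 bit/J figures quoted in the paper.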
The advantages of a thermal noise computer are its low energy requirement and improved energy efficiency. But realizing a true computer requires a means of error correction, often achieved through bit redundancy. So building such a thermal noise computer would possibly require a large number of computing elements.
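To get a feel for why redundancy blows up the element count, here is a minimal sketch of the simplest scheme, majority voting over n independent copies of each bit. This is my own illustration, not a calculation from the paper, and it assumes independent errors with a per-element error probability p:

```python
import math

def majority_error(p, n):
    """Probability that a majority vote over n independent noisy copies
    is wrong, when each copy flips with probability p (n odd)."""
    # The vote fails when more than half the copies have flipped:
    # sum the binomial tail from k = n//2 + 1 up to n.
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.1  # illustrative error rate; noise-driven gates err frequently by design
for n in (1, 3, 9, 27):
    print(f"n = {n:2d}: error probability = {majority_error(p, n):.2e}")
```

The error probability drops rapidly with n, but every logical bit now costs n physical elements (plus the voting hardware), which is exactly the trade-off behind Kish's closing questions about the energy cost of error correction.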
Kish concludes by asking several questions, some of which I find more interesting and thus reproduce here:
  • Do we need error correction at all (except for input/output operations) when we want to simulate the way the brain works?
  • Is there any way to realize a non-Turing machine with stochastic elements without excessive hardware/software-based redundancy?
  • How should redundancy and error correction be efficiently used to run the thermal noise driven computer as a Turing machine?
  • How much energy would such error correction cost?