Computers have been digital for half a century. Why would anyone want to resurrect the clunkers of yesteryear? Charles Platt at Wired looks into this growing trend after hearing a bold claim:
Bringing back analog computers in much more advanced forms than their historic ancestors will change the world of computing drastically and forever.
Engineers began using the word analog in the 1940s (shortened from analogue; they like compression) to refer to computers that simulated real-world conditions. But mechanical devices had been doing much the same thing for centuries.
In the 1940s, electronic components such as vacuum tubes and resistors were used because a fluctuating current flowing through them could be analogous to the behavior of fluids, gases, and other phenomena in the physical world. A varying voltage could represent, for instance, the orientation of a Gemini space capsule in a 1963 flight simulator.
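To make the idea concrete, here is a minimal sketch (my own illustration, not from Platt's article) of what "analogous" means in practice: a physical quantity is encoded as a proportional voltage, so the circuit's state is the measurement. The voltage range and the capsule-angle mapping are illustrative assumptions.

```python
# A minimal sketch of the "analog" idea: a physical quantity is encoded
# as a proportional voltage rather than as a binary number.
# The scale factors below are illustrative assumptions, not from the article.

FULL_SCALE_VOLTS = 10.0    # a common working range for electronic analog computers
FULL_SCALE_ANGLE = 180.0   # capsule pitch angle in degrees (hypothetical mapping)

def angle_to_voltage(angle_deg: float) -> float:
    """Map a capsule orientation angle to its analogous voltage."""
    return angle_deg * (FULL_SCALE_VOLTS / FULL_SCALE_ANGLE)

def voltage_to_angle(volts: float) -> float:
    """Read the voltage back out as an angle."""
    return volts * (FULL_SCALE_ANGLE / FULL_SCALE_VOLTS)

print(angle_to_voltage(45.0))   # 2.5 V stands in for a 45-degree pitch
print(voltage_to_angle(2.5))    # 45.0
```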
But by then, analog was becoming a dying art. Instead of using a voltage to represent the velocity of a missile and electrical resistance to represent the air resistance slowing it down, a digital computer could convert those variables to binary code: streams of 1s and 0s that were well suited to processing.
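As a sketch of that contrast (again my own illustration, with made-up constants), here is the missile-velocity problem done digitally: the differential equation an analog machine would solve continuously with an integrator circuit is instead stepped through in discrete binary arithmetic.

```python
# Digital version of a problem an analog computer solved continuously:
# velocity under gravity and air drag, dv/dt = g - (k/m) * v.
# On an analog machine this is one integrator whose output voltage IS the
# velocity; here we approximate it with small Euler steps in binary floats.
# All constants are illustrative, not taken from the article.

g = 9.81          # gravity, m/s^2
k_over_m = 0.4    # drag coefficient over mass, 1/s (hypothetical)
dt = 0.01         # time step, s
v = 0.0           # initial velocity, m/s

for _ in range(1000):             # simulate 10 seconds of fall
    v += (g - k_over_m * v) * dt  # discrete step standing in for a voltage
                                  # that would change smoothly in hardware

# v approaches the terminal velocity g / k_over_m, about 24.5 m/s
print(f"velocity after 10 s: {v:.2f} m/s")
```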
“A lot of Silicon Valley companies have secret projects doing analog chips,” according to Lyle Bickley, a founding member of the Computer History Museum in Mountain View, California, “because they take so little power.”
Companies are researching analog chips to build custom AI hardware, drawn above all by energy efficiency: neural-network inference is dominated by multiply-accumulate operations, which analog circuits can perform with a small fraction of the power digital logic burns.
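To see why analog appeals for neural networks, here is a toy sketch (my own assumptions, not code from any company mentioned): the multiply-accumulate at the heart of inference can be done by summing currents on a wire, trading a little precision, modeled here as noise, for a large power saving.

```python
# Toy model of an analog multiply-accumulate: in an analog AI chip, each
# weight * input product becomes a small current, and the currents sum on
# a shared wire almost for free. The price is analog imprecision, modeled
# here as Gaussian noise on an otherwise exact dot product.
# The noise level is an illustrative assumption.
import random

def analog_dot(weights, inputs, noise_sigma=0.01):
    """Dot product with simulated analog imprecision."""
    exact = sum(w * x for w, x in zip(weights, inputs))
    return exact + random.gauss(0.0, noise_sigma)

w = [0.5, -1.2, 0.8]     # one neuron's weights (hypothetical)
x = [1.0, 0.3, 0.7]      # its inputs
print(analog_dot(w, x))  # about 0.70, jittered slightly by the noise term
```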
Read Platt’s fascinating deep dive into the analog revival here.