It was 1983, and Acorn Computers was on top of the world. The company had the wildly successful BBC Microcomputer in the UK. But the world of personal computers was changing. The market for cheap 8-bit micros that parents would buy to help kids with their homework was becoming saturated, and new machines from across the pond, like the IBM PC and the upcoming Apple Macintosh, promised significantly more power and ease of use. Acorn needed a way to compete, but it didn’t have much money for research and development.
Sophie Wilson, one of the designers of the BBC Micro, had anticipated this problem. She had added a slot called the “Tube” that could connect to a more powerful central processing unit. A second processor plugged into this slot could take over the computer, leaving the original 6502 chip free for other tasks.
But what processor should she choose? Wilson and co-designer Steve Furber considered various 16-bit options, such as Intel’s 80286, National Semiconductor’s 32016, and Motorola’s 68000. But none were completely satisfactory.
Wilson explained, “We could see what all these processors did and what they didn’t do. So the first thing they didn’t do was they didn’t make good use of the memory system. The second thing they didn’t do was that they weren’t fast; they weren’t easy to use.”
Then they visited the Western Design Center in Mesa, Arizona. This company was making the beloved 6502 and designing a 16-bit successor, the 65C816. Wilson and Furber found little more than a “bungalow in a suburb” with a few engineers and some students making diagrams using old Apple II computers and bits of sticky tape.
Despite the challenges of designing a CPU in-house, upper management at Acorn supported the effort. Acorn co-founder Hermann Hauser, who had a Ph.D. in physics, gave the team copies of IBM research papers describing a new and more powerful type of CPU. It was called RISC, which stood for “reduced instruction set computing.”
To further future-proof the new Acorn CPU, the team decided to skip 16-bit designs entirely and go straight to 32 bits. This actually made the chip simpler internally, because large numbers didn’t have to be broken up as often, and all memory addresses could be accessed directly. (In fact, the first chip exposed only 26 of its 32 address lines as pins, since 2^26 bytes, or 64MB, was a ridiculously large amount of memory for the time.)
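A quick back-of-the-envelope check makes that figure concrete. The short C snippet below (an illustration, not anything from Acorn’s actual design) computes how many bytes 26 address lines can reach:

```c
#include <stdio.h>

int main(void) {
    /* 26 address lines can select 2^26 distinct byte addresses. */
    unsigned long addresses = 1UL << 26;

    printf("2^26 = %lu bytes = %lu MB\n",
           addresses, addresses / (1024UL * 1024UL));
    /* Output: 2^26 = 67108864 bytes = 64 MB */
    return 0;
}
```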
You can read the remarkable history of these first chips, and the computing revolution they launched, on Ars Technica.