How an obscure British PC maker invented ARM and changed the world
Jason Torchinsky

Let's be honest: 2020 sucks. So much of this year has been a relentless slog of bad news and miserable events that it's been hard to keep up. Yet most of us have kept up, and the way most of us do so is with the small handheld computers we carry with us at all times. At least in America, we still call these by the hilariously reductive name "phones."

We can all use a feel-good underdog story right now, and luckily our doomscrolling 2020 selves don't have to look very far. That's because those same phones, and so much of our digital existence, run on the same thing: the ARM family of CPUs. And with Apple's release of a whole new line of Macs based on their new M1 CPU—an ARM-based processor—and with those machines getting fantastic reviews, it's a good time to remind everyone of the strange and unlikely source these world-controlling chips came from.

If you were writing reality as a screenplay, and, for some baffling reason, you had to specify what the most common central processing unit used in most phones, game consoles, ATMs, and innumerable other devices was, you'd likely pick one from a major manufacturer, like Intel. That state of affairs would make sense and fit in with the world as people understand it; the market dominance of some industry stalwart would raise no eyebrows or any other bits of hair on anyone.

But what if, instead, you decided to make those CPUs all hail from a barely-known company from a country usually not the first to come to mind as a global leader in high-tech innovations (well, not since, say, the 1800s)? And what if that CPU owed its existence, at least indirectly, to an educational TV show? Chances are the producers would tell you to dial this script back a bit; come on, take this seriously, already.

And yet, somehow, that's how reality actually is.


In the beginning, there was TV

The ARM processor, the bit of silicon that controls over 130 billion devices all over the world and without which modernity would effectively come to a crashing halt, has a really strange origin story. Its journey is peppered with bits of seemingly bad luck that ended up providing crucial opportunities, unexpected technical benefits that would prove absolutely pivotal, and a start in some devices that would be considered abject failures.

But everything truly did sort of get set in motion by a TV show—a 1982 BBC program called The Computer Programme. This was an attempt by the BBC to educate Britons about just what the hell all these new fancy machines that looked like crappy typewriters connected to your telly were all about.

The show was part of a larger Computer Literacy Project started by the British government and the BBC as a response to fears that the UK was deeply and alarmingly unprepared for the new revolution in personal computing that was happening in America. Unusually for a TV show, the BBC wanted to feature a computer on the program that would be used to explain fundamental computing concepts and teach a bit of BASIC programming. The concepts included graphics and sound, the ability to connect to teletext networks, speech synthesis, and even some rudimentary AI. As a result, the computer needed for the show would have to be pretty good—in fact, the producers' demands were initially so high that nothing on the market really satisfied the BBC's aspirations.

So, the BBC put out a call to the UK's young computer industry, which was then dominated by Sinclair, a company that made its fortune in calculators and tiny televisions. Ultimately, it was a much smaller upstart company that ended up getting the lucrative contract: Acorn Computers.

An Acorn blooms

Acorn was a Cambridge-based firm that started in 1979 after developing computer systems originally designed to run fruit machines—we call them slot machines—then turning them into small hobbyist computer systems based on 6502 processors. That was the same CPU family used in the Apple II, Atari 2600, and Commodore 64 computers, among many others. This CPU's design will become important later, so, you know, don't forget about it.

Acorn had developed a home computer called the Atom, and when the BBC opportunity arose, they adapted their plans for the Atom's successor into what would become the BBC Micro.

The BBC's demanding list of features ensured the resulting machine would be quite powerful for the era, though not quite as powerful as Acorn's original Atom-successor design. That Atom successor would have featured two CPUs, a tried-and-true 6502 and an as-yet undecided 16-bit CPU.

Acorn later dropped that CPU but kept an interface system, called the Tube, that would allow for additional CPUs to be connected to the machine. (This too will become more important later.)

The engineering of the BBC Micro really pushed Acorn's limits, as it was a pretty state-of-the-art machine for the era. This resulted in some fascinatingly half-assed but workable engineering decisions, like having to replicate the placement of an engineer's finger on the motherboard with a resistor pack in order to get the machine to work.

Nobody ever really figured out why the machine only worked when a finger was placed on a certain point on the motherboard, but once they were able to emulate the finger touch with resistors, they were just satisfied it worked, and moved on.

Here, listen to one of the key engineers tell you himself:

The relevant section starts at 9:40.

The BBC Micro proved to be a big success for Acorn, becoming the dominant educational computer in the UK in the 1980s.

As everyone with any urge to read this far likely knows, the 1980s were a very important time in the history of computing. IBM's PC was released in 1981, setting the standard for personal computing for decades to come. The Apple Lisa in 1983 presaged the Mac and the whole revolution of the windows-icons-mouse graphical user interface that would dominate computing to come.

Acorn saw these developments happening and realized they would need something more powerful than the aging but reliable 6502 to power their future machines if they wanted to compete. Acorn had been experimenting with a lot of 16-bit CPUs: the 65816, the 16-bit variant of the 6502, the Motorola 68000 that powered the Apple Macintosh, and the comparatively rare National Semiconductor 32016.

None of these were really doing the job, though, and Acorn reached out to Intel to see about using the Intel 80286 CPU in their new architecture.

Intel ignored them completely.

RISC-y business

Spoiler alert: this will prove a very bad decision for Intel.

Acorn next made the fateful decision to design their own CPU. Inspired by the lean operation of Western Design Center (the company that was developing new 6502 versions) and various research about a new sort of processor design concept called Reduced Instruction Set Computing (RISC), Acorn decided to move ahead. Engineers Steve Furber and Sophie Wilson proved to be key players on the project.

Now, RISC processors are called what they are in comparison to Complex Instruction Set Computing (CISC) processors. It's probably worth trying to give a very, very (perhaps embarrassingly) simplified explanation of what this actually means.

CPUs have a group of operations they can undertake—their instruction sets. CISC CPUs have large, complex instruction sets, allowing them to perform complicated tasks over multiple "clock cycles" of the CPU. The complexity is built into the hardware of the chip itself, which means the software code can be less complex. So, code for CISC machines uses fewer instructions, but the number of cycles the CPU needs to execute those instructions goes up.

RISC, as you probably already guessed, is the opposite: fewer instructions, less hardware on the chip itself, and every instruction can be executed in a single clock cycle. As a result, code has to be longer and seemingly less efficient, which means more memory, but the chip itself is simpler and can execute the simple instructions faster.
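
To make that tradeoff a little more concrete, here's a minimal sketch. The C function below is trivial on purpose, and the instruction sequences in its comments are illustrative pseudocode only—not real 80286 or ARM assembly—just meant to show how one "complex" instruction compares to a handful of simple ones.

```c
#include <stdio.h>

/* A trivial operation: add a value held in memory to a running total.
 * The comments sketch how the two philosophies might handle it; the
 * "assembly" is illustrative pseudocode, not actual 286 or ARM output. */
long add_from_memory(long total, const long *value)
{
    /* CISC-style: one complex instruction does the whole job.
     *     ADD total, [value]   ; read memory, add, keep the result
     * Fewer instructions in the program, but that single instruction
     * may take several clock cycles and needs more decoding hardware.
     *
     * RISC-style: the same work spelled out as simple steps.
     *     LDR r1, [value]      ; load from memory into a register
     *     ADD r0, r0, r1       ; add register to register
     * More instructions (and more program memory), but each one is
     * simple enough to finish quickly on much simpler hardware. */
    return total + *value;
}

int main(void)
{
    long price = 42;
    printf("%ld\n", add_from_memory(100, &price));
    return 0;
}
```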

Acorn was well-suited to design a RISC CPU since the chip they were most familiar with, the 6502, is often said to be a sort of proto-RISC design. There are all kinds of opinions about this on the Internet (I know, right?) and I'm not looking to get into a painful, tedious, unsexy argument with anyone about it, but for the sake of what I'm explaining right now, trust me that the 6502 at least has some very RISC-like traits.

The new Acorn chip carried plenty of that 6502 DNA, in fact: when Sophie Wilson was designing the instruction set for Acorn's new processor, she seems to have drawn directly on a number of 6502 design concepts.

Thanks to the Internet, the brochure for the Archimedes High Performance Computer System (http://chrisacorns.computinghistory.org.uk/Computers/Archimedes.html) is available online in its entirety.

Using the BBC Micro's Tube interface as a testbed, Acorn developed its new RISC-based CPU, called the Acorn RISC Machine, or ARM. Acorn's chip manufacturing supplier VLSI began to produce ARM CPUs, first for Acorn's internal research and development. Not long after, a production version, the ARM2, was ready.

In 1987, the first production RISC-based personal computer was introduced: the Acorn Archimedes, powered by the ARM2 CPU. The ARM proved to have better performance than Intel's 286, despite having 245,000 fewer transistors than Intel's big chip.

The Archimedes, with its Arthur OS in ROM, proved to be a flexible, fast, and powerful machine. It had good graphics for the era, a graphical user interface, and some cool and fast low-polygon demos and games that really showed off the machine's speed, thanks to its lean and hungry CPU.

At the time, this first ARM-based machine claimed to be the fastest personal computer of the era, with a performance rating of several times that of Intel's 80286.

Less is more

The ARM's low transistor count was a tell about the relative simplicity of the chip itself, and as a result it used much less power and ran far cooler than nearly anything else around for its computing power.

The low power/low heat traits of the ARM weren't part of the initial design brief, since Acorn was designing a CPU for a desktop machine, but they would likely prove to be the most fortunate and beneficial unplanned byproduct in the history of computing.

This low power consumption and low heat production made the ARM a natural for mobile devices, which is why Apple came sniffing around in the late 1980s looking for a CPU powerful enough to (often comically) translate handwriting into text and run a GUI, all while being powered by AA batteries and not turning the handheld device it was to power into a hand-burning block of pain. That handheld device was the infamous Newton, and only the fast and lean ARM core could power it.

Apple and Acorn's chip partner VLSI partnered with Acorn to spin off the ARM division into its own new company, called Advanced RISC Machines, allowing the ARM name to stick. Under this alliance, with Apple's considerable resources added, ARM would develop the ARM6 core; the ARM610 CPU was the first production chip based on that core, and a 20 MHz version would go on to power the Apple Newton in 1993.

While, sure, the Newton was kind of a spectacular failure, in hindsight it was really something much, much more: a handheld, battery-powered touch-screen device powered by an ARM CPU. Today this same description could be used to describe the literally billions of smartphones in use constantly all over the world, and it was first field-tested with the device that most people remember from that Simpsons episode where it transformed the handwritten phrase "Beat up Martin" into "Eat up Martha."

The ARM610 would go on to power a new generation of Acorn Archimedes computers and a strange Newton-based laptop called the eMate. In 2001, an ARM7-core CPU would power Apple's iPod and Nintendo's Game Boy Advance. In 2004, a pair of ARMs would drive the twin screens of the Nintendo DS.

Then, in 2007, Apple would release the first iPhone with its ARM11-core CPU. From that moment on, everything went ARM-bonkers.

ARM CPUs became the default choice for smartphones, whether they were from Apple or anyone else. ARM CPUs seemed to power every thinking machine that wasn't a strict Intel-based desktop, laptop, or server. Now, with ARM Chromebooks and Apple's new ARM-based macOS desktops and laptops, the ARM looks like it will finally be returning to where it all began—in a desktop computer.

So many years later, the ARM origin story remains worth telling because it's so wonderfully improbable; it's such a strange, unplanned sequence of events from unlikely sources. Even though it is absolutely dominant in the world now, ARM's humble beginnings make it feel like less of an unfeeling behemoth of industry than, say, the Intel/AMD near-duopoly does.

It's nice to just take a moment and reflect: because the British felt they were being left behind by the computer revolution, they decided to make TV shows about computers. To do that, they needed a computer, so an underdog British company came up with a good one. And when that little company needed to build a faster CPU, because Intel couldn't be bothered to answer their calls, they made their own. This in-house CPU just so happened to not use much power or make much heat, which got the attention of Apple, who used it to power what most people consider to be its biggest failure. From there, of course, the company went on to take over the fucking world.
If I made that up, you'd say I was trying too hard to be quirky or that I'd seen too many Wes Anderson movies. But that's reality.

However... if reality is, in fact, a simulation, I bet it's powered by ARMs, too.
