CW@50: How UK home brew gave computing to the masses
From batch processing to the graphical user interface, from desktop PCs to smartphones, from luggables to wearables, computing has come a long way since Computer Weekly was first published in 1966. We look at some of the personal computers that paved the way
The computer age has shifted from expensive, centralised, user-unfriendly systems to desktops, laptops and handheld devices that people now find desirable, thanks to technology breakthroughs and falling prices.
According to The National Museum of Computing (TNMOC), the very first use of the phrase “personal computing” was in January 1976 in an article titled Misgivings arise over VSPC – referring to IBM’s time-sharing software, Virtual Storage Personal Computing.
The US magazine Personal Computing ran an advert in its May 1977 issue for a computer workshop in London, which claimed to be Europe’s first computer shop.
TNMOC also notes that a report on the National Computer Conference at the London Hilton in June 1977 mentioned seminar papers on future trends in personal computing.
Our friends are not electric
Computing remained largely user-unfriendly during the 1960s and 1970s. The 1960s was the era of batch processing. The business would submit work to the data processing (DP) department, which would then load a program into the mainframe to run as a batch process. At some point in the future, the output of the batch processing would be spewed out as a report on a line printer.
The 1970s saw the emergence of minicomputers – the likes of HP, Digital, Wang and Data General. Users interacted with these machines directly, so there was no need to get DP involved. Rather than relying on the DP department, the user communicated with the minicomputer application via a computer terminal, such as the DEC VT100.
DEC VT100 (Image courtesy of Jason Scott on Flickr, reproduced under Creative Commons on Wikimedia)
“Minicomputers started the move to a real-time environment, as they weren’t batch,” says Martin Banks, who was technical editor at Computer Weekly in 1977.
The minicomputer, like the mainframe, was a centralised computer system, shared across the business. Computing was too expensive to be considered personal. This was the established way IT was run – but things were about to change. The year was 1977; it would prove a year of anti-establishment sentiment, and the Sex Pistols were being Pretty Vacant.
Star Wars and the microprocessor revolution, which began with Intel’s launch of the 4004 microprocessor, ignited the imagination of children, teenagers and a group of young Brits.
Handhelds in my pocket
While Compaq’s first computer was a “luggable”, technology eventually evolved to the point where companies such as Toshiba could sell fully portable laptop PCs.
Meanwhile, another revolution was taking place in the hands of computer enthusiasts. Jason Fitzpatrick, chairman of the Centre for Computing History, says: “The Psion 3c Organiser was a very important device in the days of the personal digital assistant. It was user-friendly, ran loads of software and had an open architecture.”
Interestingly, the Forth programming language was used for programming the Psion device, and there was a cross-platform developer kit to write software on PCs that could then be run on the Organiser.
Like the shift from home computers to the PC, Psion paved the way for rival devices such as the Apple Newton, and Windows CE devices such as the Dell Axim and Compaq iPAQ. Unlike the Psion, most of these devices failed to capture the public’s interest, until Apple reinvented itself as a consumer electronics brand with the launch of the iPod in 2001.
“Intel’s first UK press briefing was for the 1103 memory chip,” recalls Banks, who launched the UK’s first microcomputer column, which was published in Computer Weekly. The 1103 was a 1Kbit (1,024-bit) memory chip.
According to Banks, while minicomputers of the time, such as the PDP-11, offered 4KB, they used core memory, which was large, complex and more expensive than Intel’s 1103 DRAM.
Soon after the 4004, Intel released the 8008 and then, in 1974, the 8080. Intel was soon followed by other manufacturers, most notably Zilog with the Z80 microprocessor, Motorola with the 6800, and MOS Technology with the 6502.
The Centre for Computing History’s Fitzpatrick says magazines such as Wireless World and Electronics Today published articles about these microprocessors. The microprocessor, along with relatively cheap memory, became the basic ingredient of the personal computing era. The relatively low cost of these components enabled electronics enthusiasts to build their own computers.
The US public was introduced to home computers with machines including the Apple II, Commodore PET and Tandy TRS-80, but they were all ferociously expensive.
By the late 1970s, the Brits were about to get their first taste of home computing, with generation-defining machines made possible by affordable microprocessors such as the 8080 and the Z80.
The Nascom 1 kit, a Z80 microprocessor-based machine, was one of the first to start the home computer revolution in the UK.
TNMOC trustee Kevin Murrell’s interest in home computers was fired up by the availability of the Sinclair MK-14, a £60 kit, which was relatively affordable. But, he recalls: “I gave up waiting for one, and opted for the Nascom 1, which could be soldered together at home.”
Sinclair Research launched its first proper home computer, the ZX80, in 1980. Murrell says the ZX80 kit was a small revolution, selling for just shy of £100. It included Basic, making it easier to write programs than having to type in machine code manually, which was how earlier machines were programmed. “I earned a small living as a student by soldering these kits for people,” he says.
The ZX80 was quickly followed up in 1981 by the Sinclair ZX81, which was sold as a complete computer.
Sinclair ZX80 (Image courtesy of Daniel Ryde, Skövde on Wikipedia, reproduced under Creative Commons)
The ZX81 had just 1KB of memory and a membrane keyboard. Many people would argue that it was not particularly useful, but it did lead to the launch of several newsstand magazines that printed program listings for readers to type in.
“Some mags had a cassette on the front to load in the game,” says Murrell, for whom this machine’s importance was that it piqued people’s interest in computing and got them into programming.
“If you wanted to do clever stuff or quick stuff, you could code in Assembler. You either came from the Z80 root or the 6502 chipset, and that allegiance stayed with many of us for years.”
Lots of teenagers had a black-and-white TV in their bedroom to plug in their ZX81 computer. But when Sinclair launched the ZX Spectrum in 1982, it introduced a colour computer to the living room. In those days, colour TVs were quite expensive, so people tended to have just one, which was the main living room TV.
The Spectrum sparked an explosion in affordable colour home computers. Everyone recognises the Spectrum, but there were also machines from Atari and the Commodore Vic 20, which came over from the US. And the Welsh got involved with the Dragon 32.
One of the defining features of the British home computing industry, according to Murrell, was that there were many more designs in the UK than in the US.
Among the most advanced was the Sinclair QL of 1984. “Sinclair was certainly manufacturing machines with clever ideas,” says Murrell. “The QL had a recognisable GUI [graphical user interface] desktop, micro-drives the size of a small matchbox with tapes for storage, and it had some very good office software, but sadly proved to be less than reliable.”
Let’s code, put on your red shoes and code the blues
In the early 1980s there was a big buzz around home computing, but many people didn’t really know what they were buying, says Fitzpatrick.
So there was a drive in education to get kids programming. The BBC Micro was developed by Acorn Computers for the BBC, which was embarking on a UK education programme as part of the BBC Computer Literacy Project. The BBC’s goal was to have at least one of these machines in every school in the UK.
Murrell says: “The BBC Micro did incredibly well in schools and, as we show at The National Museum of Computing, it is still a perfect machine for introducing youngsters to coding. You can’t browse the web or use Facebook, but it has a sophisticated version of the Basic programming language, and you can write structured code.”
PC kills the home computer stars
The advent of the PC, and later the Macintosh, spelt the end of these early home computer stars. “Everyone wanted to share programs and data, and having a standard, even if it wasn’t as advanced as some British systems, won in the end,” says Murrell.
But in 1984, the IBM PC and the Apple Mac were simply too expensive for the public and most small businesses. “A PC used to cost £2,500. Nobody could afford one,” says Fitzpatrick.
IBM PC monochrome phosphor monitor and IBM PC keyboard (Image courtesy of Ruben de Rijcke on Wikimedia, reproduced under Creative Commons)
Arguably, had it not been for Compaq, PCs could have remained expensive IBM devices. Speaking at the HP Innovators seminar in 2014, Compaq founder Rod Canion said: “IBM did what it had never done before – it rushed into the PC market. It did not believe it really was a market. It kicked off a one-year programme and rushed to market, thinking it would sell a few tens of thousands.”
The IBM PC became the market leader within 18 months, with sales running into the billions of dollars. A startup called Compaq wanted to make its own computer, but its founders realised it would be very difficult to convince software companies to write applications for it, given that IBM’s machine was the de facto standard.
Canion told the HP seminar about the company’s light bulb moment: “If we can’t get the software companies to adapt their software to our computer, why don’t we adapt our computer to run the software that already exists?”
If Compaq could make its computer run the same software as the IBM PC, it would have enough applications to make it a success. It took the company a year to reverse-engineer the basic input/output system, or Bios, in effect the DNA of the PC, in a so-called “clean room” environment.
The result was the Compaq Portable, a machine that was 100% compatible with the IBM PC and able to run MS-DOS, the leading operating system at the time, and all DOS-based software.
Meanwhile, in the UK, Amstrad was starting to make waves with its line of computers.
Amstrad’s first machine was the CPC-464. “It was a nice machine, and shows how the home computer was becoming a consumer product,” says Fitzpatrick.
Amstrad CPC-464 (Image courtesy of Bill Bertram on Wikimedia, reproduced under Creative Commons)
Amstrad’s chief, Alan Sugar, had made a tidy profit selling hi-fis, and applied what had worked there to the nascent home computer market. Amstrad could make its gear look better and therefore more appealing.
Its first commercially successful business machine was the Amstrad PCW. “For the cost of an electric typewriter, Amstrad had a word processor and printer,” says Fitzpatrick. But this machine was not DOS compatible. It ran the CP/M operating system, which meant it was not a PC.
In 1986, Amstrad launched the PC1512, an IBM-compatible machine with 512KB of memory and two floppy drives. The machine sold well in small businesses and the home, paving the way for manufacturers such as Dell to sell low-cost PCs.
“Amstrad had a really good track record for producing consumer products and making a good profit,” says Fitzpatrick. “Sugar could see the PC was a sought-after device, but too expensive.”
Legend has it that Sugar refused to pay Bill Gates’s price to license MS-DOS, so Gates ended up giving it to him for free, according to Fitzpatrick. “Sugar made the PC affordable,” he says.
Leading software packages included the WordStar and WordPerfect word processors, the dBASE database and the Lotus 1-2-3 spreadsheet.
Start me up
As with home computers, in the early days of the PC revolution, users tended to do quite a lot of programming themselves.
“A big part of why people bought the PC was for stock control – building a simple data entry application in dBASE made stock a whole lot easier to manage,” says Fitzpatrick.
Programming tools such as Borland’s Turbo C crossed over from home computing to the business market, enabling professional programmers to create applications for MS-DOS and, later, Windows.
Arguably, the PC really came of age in the 1990s, with client-server computing and a powerful 32-bit processor in the form of the Intel 80386 chip.
Microsoft made Gates the richest man in the world and, in 1995, the company even licensed the Rolling Stones’ Start Me Up for the launch of its new operating system, Windows 95, which introduced the “Start” menu.
Compaq became the leader in the PC server market, with 80386-powered servers that could run proper multi-tasking operating systems such as Unix and, later, Linux.
Eventually, Microsoft released Windows NT, along with Windows-based server software such as the SQL Server relational database and Exchange Server, to run on these PC servers. With its focus on Windows GUI administration tools, Microsoft lowered the skills barrier to running PCs as servers, paving the way for the modern x86 datacentre.