In the early 1980s computers invaded British homes for the first time, a wave of cheap and futuristic devices that allowed millions of people to discover for themselves what a computer was. These fantastic machines, like the Sinclair ZX Spectrum, Acorn Electron and Commodore 64, promised to make computing user-friendly for the first time. They were expected to reveal the wonders of information technology to the masses, and bring about a revolution in homes, schools, and workplaces. But to what extent did the electronic dreams these machines were sold on actually come true? What impact did home computers have on our lives in the 1980s?
The biggest change that 1980s home computing brought about was probably in how people thought about computers. Historically, they were data-processing machinery for scientists and big organisations, full of expensive electronics. They were rare – most people would never even have seen a computer in real life, and even a large organisation might only have a single computer. And they were gigantic, because their circuitry was built of thousands of bulky electronic components like valves and transistors. The very notion of having one of these giant ‘electronic brains’ in your home was the stuff of science fiction – until, that is, developments in miniaturised microelectronics in the 1970s.
By steadily improving technology, scientists squeezed ever more computer power onto ever smaller electronic components, reaching the point where all the circuitry needed to make a computer could be held on just a few tiny microchips.
In the mid-1970s electronics hobbyists started cobbling together microchips to create the first personal computers. These hand-made contraptions were good for little more than allowing enthusiasts to tinker around solving logic puzzles, but they were a start. By the end of the decade a range of computer kits was available for those with the soldering skills to assemble them. For the less technically skilled, a few American companies were even manufacturing ready-made computers, like the £700 Apple II and the £500 Commodore PET. These machines, known as ‘appliance computers’ because they could simply be plugged in and switched on, were more user-friendly than the kit computers priced at under £100, but far more expensive too. For the moment, personal computing remained a geeky hobby for technically minded enthusiasts, too complicated or too expensive for everyone else to enjoy.
That all began to change in 1980, when adverts began to appear for the Sinclair ZX80, the first computer available ready-made for under £100. The brainchild of the inventive Clive Sinclair, head of a small electronics company in Cambridge, the ZX80 looked a little like an overgrown calculator. It was a tiny machine, about 20cm square, with a miniature keyboard, and it simply plugged into a television. It offered modest computing power: no sound, a black-and-white display, and its one kilobyte of memory was just a millionth of that of a present-day iPhone.
Sinclair ZX80 microcomputer, 1980. (Photo by SSPL/Getty Images)
Yet it was surprisingly versatile, affordable and easy for beginners to use. Advertised with the friendly slogan “inside a day you’ll be talking to it like an old friend,” the ZX80 proved to be a sensation and sold in huge numbers. It was the first of a series of home ‘microcomputers’ to offer user-friendly computing at a budget price, and a bewildering array of competing designs soon appeared on the market. Home computing boomed; by 1983 Britain boasted the highest level of computer ownership in the world.
Letting the computer into your life
So why in the 1980s did so many people suddenly decide to buy a computer? The answer was initially educational: the spread of microchip-powered technologies in the 1970s brought prophecies of an information technology revolution that would overturn traditional life and work. Robots might replace industrial workers; word processors might take over from secretaries; even professional jobs in teaching or medicine might be replaced by expert computer software. There would be new opportunities in this ‘brave new world’, but those who did not learn about computers faced being left behind.
Home computers were presented as a friendly introduction to a technology that was going to change the world. “We live in the age of computers,” cautioned a 1982 advert for the Commodore VIC-20 computer. “Coming to terms with them is part of coming to terms with the 20th century.” Parents were encouraged to buy computers for their children to help give them a good start in life. The BBC launched a huge computer literacy project to educate the nation, including television programmes, books, and even its own branded BBC Microcomputer, built by Acorn in Cambridge.
Commodore VIC-20 home computer, September 1983. (Photo by SSPL/Getty Images)
It seems surprising today, but learning to program for yourself was seen as vital for developing an understanding of how computers worked, and for empowering people to control them. Across Britain people stayed up late into the night typing in programs and hunting for the inevitable bugs in their code, as they gradually explored for themselves what computers could do.
By modern standards, 1980s home computers were laughably primitive: machines with rubber keyboards, blocky graphics, beepy sound, and less processing power than the cheapest mobile phone of today. Yet in the analogue 1980s home they were positively space age. So what did people do with these futuristic marvels other than learn to program?
Computer designers were somewhat vague about the end use of their creations, but emphasised their incredible versatility around the home: Sinclair’s ZX80 was advertised with the sweeping claim that it could do “literally anything from playing chess to running a power station”. Computer enthusiasts used their machines to catalogue record collections, balance household budgets, store recipes, and carry out other domestic chores, but few persevered for long. In reality, it was probably not until 1985, when Alan Sugar introduced the first Amstrad word processor – a green-screened computer complete with a printer – that many people found something genuinely useful to do with a computer, and they swapped typewriters for the convenience of word processing. Nevertheless, the home computer introduced computing into our domestic lives and paved the way for other electronic gadgets that have since come to fill our homes.
Alan Sugar of Amstrad Plc, 1989, Brentwood, Essex. (© Alexander Caminada/Alamy Stock Photo)
As with almost any new technology, politicians were anxious to associate themselves with home computing. To the new Tory government the home computer boom was a great British success story that showed the government was leading the nation forward after the discontent, strikes and economic stagnation of the 1970s. The opening of new computer factories, such as the one in the South Wales steel town of Port Talbot that produced the proudly Welsh Dragon 32, pointed to a brighter future for industry. Computer company bosses were hailed as entrepreneurial heroes; Clive Sinclair, owner of one of the most important home computer companies, was given a knighthood for services to industry, and Margaret Thatcher even presented the prime minister of high-tech Japan with a Sinclair ZX Spectrum computer as a symbol of British inventiveness.
To the Tories, the home computer boom was symbolic of the new Britain they were trying to create. However, there was more to this than just good publicity. The government invested heavily in promoting new technology and in sponsoring computer education schemes. Many were aimed at children, such as “Micros in Schools” that aspired to put a single computer into every British secondary school. The hope was to train the children of today for the jobs of tomorrow, preparing them for work in a society where computers would be everywhere. At the same time, by encouraging businesses to use computers the Tories hoped to encourage efficiency and entrepreneurship. As traditional British manufacturing declined over the 1980s the seeds were sown for a new sort of economy based on enterprising high-tech firms, financial industries, media, consultancy and other services, more than big factories with unionised work forces.
Of course, the computer boom was not all serious – video gaming became incredibly popular. Not only did the millions of machines sold to help children with their homework create a huge potential market for games, but they also fostered the creative skills of the programmers who wrote them.
The games industry began with “bedroom programmers”, many of them teenagers, writing games on their home computers and selling them on cassette tape through mail order adverts placed in magazines. The games were simple at first, often clones of arcade games like Pacman and Space Invaders. However, the creative freedom offered by home computers allowed games to grow into new forms. From dodging killer penguins and mutant telephones in the surreal caverns of Manic Miner (1982), to 3D space combat and commodity trading across the immense galaxies of Elite (1984), games writers stretched simple home computers to do things that amazed the machines’ designers. A new creative industry was born as bedroom programmers started software companies. Many of that generation are still working in games companies today, and even some of today’s major franchises can trace their roots back to the 1980s home computing era.
Children at Christ’s College, Liverpool, using a computer, 1982. (Photo by SSPL/Getty Images)
The unexpected growth of video gaming had a profound impact on home computing. It undermined the educational pretensions that had driven the initial boom and made the machines appear as mere toys. Parents became frustrated that their children were more interested in gaming than learning, and adults began to outgrow simple home computers and yearn for something more sophisticated. The growing disillusionment caused the market for cheap home computers to crash in 1985. In the aftermath, adults abandoned home computers and turned to more serious, ‘business-like’ computers designed for office applications, like the IBM-standard PC, whose descendants still dominate our desktops today.
The few home computers to survive the crash lived on for a few more years as cheap gaming machines for children, but their glory days were long past as they were replaced by ever-more capable personal computers and other devices.
Yet we cannot easily write off home computers as a failure, or a technological dead end. They helped to establish what information technology could do in our homes, certainly as an entertainment platform and a word processor. They educated a generation of computer programmers, who went on to create much of the software we use today. More than anything else, they introduced the computer-illiterate masses to information technology, preparing people for all the PCs, smartphones, tablets and other gizmos that have since followed them into our homes and workplaces. All this began with the home computer. Surely, then, the greatest legacy of the home computer boom is the high-tech world in which we live today.
Tom Lean is the author of Electronic Dreams: How 1980s Britain Learned to Love the Computer (Bloomsbury, 2016).