The official website for BBC History Magazine and BBC History Revealed

From Sutton Hoo to Rosa Parks: 50 giant leaps in history

Some were hailed as world changing in an instant. Some only years later. But each of these moments – whether for better or worse – has helped shape the world we know today, writes Nige Tassell

Published: December 27, 2019 at 10:05 am


In 1821, a young, self-taught scientist from London called Michael Faraday made a breakthrough that modern civilisation would become dependent upon. Building on the discovery of electromagnetism by Danish physicist Hans Christian Ørsted, Faraday built two instruments that produced what he dubbed ‘electromagnetic rotation’. He had, in effect, invented the electric motor, an achievement that had eluded his mentor, the celebrated inventor and chemist Sir Humphry Davy. Faraday’s achievement cannot be overstated, as confirmed by this tribute by the physicist Ernest Rutherford: “When we consider the magnitude and extent of his discoveries and their influence on the progress of science and industry, there is no honour too great to pay to the memory of Faraday, one of the greatest scientific discoverers of all time.”



When, in October 1492, Italian explorer Christopher Columbus, sailing under the flag of the Spanish crown, landed on an island in the Bahamas, the future of the Americas and the Caribbean would never be the same. The find was accidental: Columbus wasn’t intending to discover the New World; he was trying to find a western trade route to the East Indies. Indeed, believing he’d reached his target destination, he named the indigenous population ‘Indians’. Columbus moved on to Cuba and Hispaniola, establishing a settlement on the latter (present-day Haiti), setting in motion what would become the mass colonisation of the New World.

But while his arrival was a great leap in European exploration, it was to have a devastating impact on the indigenous populations he – and later settlers – encountered. Violence, slavery and disease are among the many sources of controversy associated with the 15th-century explorer.

Astronaut Buzz Aldrin on the Moon, with Armstrong and the flag reflecting in his visor. Their footprints, visible in the foreground, will remain for millions of years. But what other giant leaps have shaped our history? (Image by NASA)


Constructed around 4,700 years ago, the Step Pyramid of Djoser in Saqqara is not only the first of the Egyptian pyramids to be built, but the world’s oldest intact large-scale stone monument. It was designed as a tomb for the Third Dynasty Pharaoh Djoser, and was completed in his lifetime. Previous large structures in Ancient Egypt consisted of mud bricks; the time and care taken to stack and sculpt the stone suggests that Djoser had substantial finance and resources – as well as a huge workforce – to underpin the project. It became the prototype for the 80 or so pyramids subsequently built across the kingdom.



“Nothing will stop us. The road to the stars is steep and dangerous. But we’re not afraid.” Yuri Alekseyevich Gagarin wasn’t afraid on that April morning in 1961 when his Vostok spacecraft was launched into the (largely) unknown. More than three years after Laika the dog had been sent out of the Earth’s atmosphere, Gagarin completed a single orbit of our planet before, after 108 minutes, returning to Earth and touching down in Russia via parachute.

During re-entry, Gagarin whistled the tune The Motherland Hears, The Motherland Knows, a song that contains the lines “the motherland hears, the motherland knows, where her son flies up in the sky.” For the time being, the Soviet Union was ahead in the Space Race.



In 1980, smallpox became the first major disease to be eradicated. And it was all down to one 18th-century doctor from rural Gloucestershire: Edward Jenner. In 1796, using a local boy as his guinea pig, he tested an old piece of folklore: that if you'd caught cowpox, you couldn't then be infected with smallpox. Rubbing cowpox pocks on the boy's arm, Jenner witnessed that, while his patient did come down with the lesser disease, he became immune to the much more dangerous smallpox. Taking its name from vacca, the Latin word for 'cow', vaccination transformed global death rates. It's believed that the work of no other single person has saved as many lives as that of Jenner.



From the first days of its construction in 1961 until its demolition at the turn of the 1990s, the Berlin Wall was a dominating feature of a city divided. It kept the West German-administered West Berlin separate from East Berlin, governed by the German Democratic Republic (aka East Germany). Governments on either side saw the wall differently: in the East, its official title was the Anti-Fascist Protection Rampart. Willy Brandt, the West Berlin mayor, labelled it the “wall of shame”.

Although not impregnable (an estimated 5,000 East Berliners successfully managed to cross into the West), around 20 times that number tried and failed. Thanks to the East Germans’ shoot-to-kill policy, around 200 escapees lost their lives in search of freedom. The Cold War began to thaw after Mikhail Gorbachev took power in the Soviet Union in 1985, and a new world order started to emerge. Global politics was coming out of the deep freeze. With it came a heightened discontent that the wall continued to divide Berlin.

In 1987, US President Ronald Reagan made a speech at the city’s Brandenburg Gate that contained a direct appeal to the Soviet Premier, a man whose policy of glasnost aimed to open up his country to outside influence. “General Secretary Gorbachev,” said Reagan, “if you seek peace, if you seek prosperity for the Soviet Union and Eastern Europe, if you seek liberalisation, come here to this gate. Mr Gorbachev, open this gate. Mr Gorbachev, tear down this wall!”

Six days earlier, David Bowie had played a concert near the wall in West Berlin. The music had been heard on the other side of the divide, prompting anti-wall rioting. The following summer, in an apparent attempt to placate its younger citizens, the East German government allowed Bruce Springsteen to play a show in East Berlin. Speaking in German, Springsteen announced that he held “the hope that one day all the barriers will be torn down”.

The pressure wasn’t just cultural. Across the Eastern Bloc, the Iron Curtain was fraying. Along with Gorbachev’s reforms in the USSR, the 1989 Polish elections had ousted the country’s communist regime, while the Hungarian government started pulling down fences along its border with Austria. This prompted many East Germans to leave for the West via Hungary and, later, via Czechoslovakia.

To stem the tide, on 9 November 1989, East Germany made an abrupt announcement: that the gates of the wall’s border crossings – hitherto only accessible to foreigners – would be flung open for all to pass through that very evening.

That night was one of the most joyous in German history, with West Berliners climbing on top of the wall to mingle with those from the eastern side of the city. The guards had put down their guns and the most visible division between Eastern and Western Europe was now rendered meaningless. While formal demolition didn’t commence until the following year, citizens on both sides of the divide hacked away at the structure, both for souvenirs and for deeply symbolic reasons. The process reached its denouement in 1990 when Germany was formally reunified after 45 years.



It was one of the iconic speeches of the 20th century, one that saw how – as the writer Jon Meacham has noted – “with a single phrase, Martin Luther King Jr joined Jefferson and Lincoln in the ranks of men who have shaped modern America”. Delivered before an estimated crowd of 250,000 at the March on Washington for Jobs and Freedom in August 1963, the speech defined an era in US history. It was a poetically worded, brilliantly delivered demand for long-overdue freedom and equality. King never got to see his dream come true: he was assassinated in Memphis, Tennessee, on 4 April 1968. A week later, the Civil Rights Act of 1968, which had been making slow progress in Congress, was rushed through the legislature and immediately signed into law by President Lyndon Johnson.



During the 1990s, a new face replaced that of Queen Elizabeth II on the New Zealand ten-dollar note. It was that of another Englishwoman, albeit one who spent almost all her adult life in the southern hemisphere. This woman was Kate Sheppard, a figure largely unknown in the country of her birth, but whose actions and influence were felt right across the world.

Sheppard was the leading suffragist in New Zealand, a woman whose reasoned public speaking and writings – in publications such as Ten Reasons Why the Women of New Zealand Should Vote – successfully swung opinion towards universal suffrage. After a series of mass petitions had been collected by Sheppard and her fellow campaigners, on 19 September 1893, New Zealand governor Lord Glasgow signed the new Electoral Act into law. With neither the UK nor the US extending the vote to women until the other side of World War I, New Zealand blazed the trail, becoming the first self-governing nation to allow women to vote in parliamentary elections.



After two centuries of isolation, Japan warmed to the idea of opening its borders in 1853. With the US and China already enjoying extensive trade, US Commodore Matthew Perry sailed to Japan with a four-strong fleet and, endowed with “full and discretionary powers” by his Secretary of State, employed intimidatory tactics to secure Japanese agreement. Such gunboat diplomacy worked. Perry returned the following year, whereupon Japan signed the Treaty of Kanagawa, which opened its ports to US ships.



By the 17th century, there was a general acceptance that Earth wasn’t flat, but a sphere. But the Aristotelian idea that our planet was the centre of the universe, around which all other planets revolved, still held sway. Then the Italian Galileo Galilei came along, brandishing his homemade telescope. Through it, he observed that Jupiter is orbited by four moons, just as Earth is orbited by our solitary Moon. The conclusion he drew, which encountered great scepticism, was that the planets revolve around the Sun.



Marie Curie’s contribution to science is huge. In 1903, she became the first woman to be awarded a Nobel Prize; eight years later, she won her second. Marie shared the first with her husband Pierre, with whom she undertook pioneering work in radioactivity, and Antoine Henri Becquerel. In 1898, the Curies discovered two new elements – polonium and radium – both of which are more radioactive than uranium. Marie’s correct assumption was that radioactive rays could treat, reduce and even eradicate tumours, and her name remains synonymous with cancer treatment today.



“Our country has arrived at a decision. Among all the parties that contested the elections, the overwhelming majority of South Africans have mandated the African National Congress to lead our country into the future. The South Africa we have struggled for, in which all our people – be they African, Coloured, Indian or White – regard themselves as citizens of one nation is at hand.”

On 10 May 1994 – four years, two months and 29 days after taking slow, deliberate steps to freedom on his release from jail, where he had been held for nearly three decades – Nelson Rolihlahla Mandela was sworn in as the first black president in South Africa’s history, after a life dedicated to fighting the Apartheid system.

Initially arrested, charged, tried and jailed in 1962 for inciting workers’ strikes and leaving the country without permission, Mandela was charged the following year with sabotage and conspiracy to violently overthrow the government. At his trial in Rivonia, Mandela delivered an extraordinary, three-hour speech from the dock. It closed with a chilling confirmation of his commitment to the cause of black majority rule. “I have cherished the ideal of a democratic and free society in which all people will live together in harmony,” he told the packed courtroom. “It is an ideal for which I am prepared to die.”

The starkness of his words resonated around the world, with even the United Nations calling for Mandela and his fellow accused to be released. Instead, they were found guilty and incarcerated; Mandela would spend 18 of his 27 prison years on the infamous Robben Island, in a damp cell with a straw mat for a bed. All the while, the global clamour for his release continued.

The Special AKA released the anthem ‘Free Nelson Mandela’, while the occasion of his 70th birthday in 1988 was marked by a concert at Wembley that drew an estimated global audience of 600 million.

Mandela had been offered his release in 1985 in return for denouncing violence as a political tool; he refused to leave jail while the African National Congress (ANC) political party remained banned. When FW de Klerk became president four years later, an unconditional release became a very real prospect.

On his release in 1990, Mandela began negotiations for a multiracial general election. The electorate would eventually return him as president, with the ANC taking 62 per cent of the vote.

This was a transformative era in what had been one of the world’s most conflicted countries – a land where black citizens had been denied a voice at the polling booth for generations. The last paragraph of Mandela’s “I am prepared to die” speech is now written on the wall of South Africa’s Constitutional Court building in Johannesburg.



If the accounts of Wu Zetian’s life are to be believed, hers is a story of raw, naked political ambition that respected few moral boundaries. To what extent these chronicles are historically accurate, though, has to be measured through the prism of male observers considering the behaviour and impact of the only female emperor in Chinese history. To reach those lofty heights surely required a well-defined ruthless streak on her part, but she was often portrayed as the devil incarnate.

What isn’t disputed is the upward passage that Wu’s life took, how she scaled social strata to become the most powerful individual across the empire. Well-born and educated, she joined the Imperial household as a low-ranking concubine with domestic chores, far removed from real power. One day she managed to catch Emperor Taizong’s eye – apparently while changing his bedsheets. Upon his death in AD 649, she was sent to a Buddhist nunnery.

Wu was having none of that. She escaped and attempted to regain her position in the Imperial court. One particularly unsavoury story suggests that she murdered her own baby and blamed the death on the new Empress, the wife of Taizong’s successor (his ninth son, Gaozong). When the Empress was exiled, Wu took her place as Gaozong’s wife. Another story has her ordering the limbs of her rivals to be cut off, before they were left to drown in vats of wine.

When Gaozong suffered a severe stroke in AD 660, Wu became the court’s administrator – a highly powerful position. Gaozong died in AD 683, whereupon the couple’s son Zhongzong became emperor, although he reigned for just two months before Wu demoted him and sent him into exile. Zhongzong was succeeded by his brother Ruizong – he lasted six years before suffering the same fate as his sibling.

Wu then assumed power for herself, ruling as Empress Regnant for the next 15 years until AD 705. For someone so apparently ruthless, Wu’s impact on Chinese society was significant. China greatly expanded into Central Asia during her rule, while she also declared Buddhism to be the state religion, replacing Daoism.

A strong supporter of meritocratic success over hereditary privilege, Wu built up the Chinese education system to help facilitate this (the fact that she could read and write was one of the attributes that initially attracted Taizong to her), and oversaw a substantial growth in China’s agricultural output. Despite these achievements, it’s her supposedly cold-blooded ways that history remembers.



In 1628, in the pages of the snappily titled Exercitatio Anatomica de Motu Cordis et Sanguinis in Animalibus, English physician William Harvey explained the purpose of the one-way valves found in the cardiovascular system: that they are evidence that the human heart propels blood around the body in a circulatory fashion. A handful of other medical scientists had made the observation before Harvey, but it was the depth and detail of his description of the process – gained from extensive dissections of animals – that ensured the kudos came his way, albeit with a delay of around 20 years.



“Would men but generously snap our chains, and be content with rational fellowship instead of slavish obedience, they would find us more observant daughters, more affectionate sisters, more faithful wives, more reasonable mothers – in a word, better citizens.”

The most famous work of proto-feminist and author Mary Wollstonecraft, A Vindication of the Rights of Woman, was a radical manifesto for its time, published in 1792, five years before her tragically early death at the age of 38. The importance of both Wollstonecraft and her writings was damaged by posthumous accounts of her premarital affairs and her illegitimate first daughter, but she was later hailed as a guiding spirit for the suffragist movement at the turn of the 20th century.



We might regard it as a mechanical clock, but its inventor chose to name it a ‘Water-Driven Spherical Birds-Eye-View Map Of The Heavens’.

An eighth-century Chinese Buddhist monk called I-Hsing was that inventor, a keen mathematician and astronomer who aimed to combine the two disciplines with his creation. As its name suggests, this water-powered clock was designed to trace celestial activity, and it proved a relatively accurate timepiece, keeping time to within 15 minutes a day. However, over time the water began to corrode its metal components and, on subzero days, would freeze.

More than 200 years later, another Chinese astronomer, Zhang Sixun, rebuilt the device using mercury instead.



There’s some irony that the country that was once most active in the slave trade was also the one that led the campaign to outlaw slavery and servitude. In the last decade of the 18th century, 80 per cent of Britain’s foreign income came from the triangular route that the slave trade had established – British goods going to Africa to buy slaves; slaves being transported to the West Indies; cotton, sugar and tobacco coming back to Blighty. But this was also a time when abolitionist sentiment was starting to percolate.

The impetus came from the anti-slavery committees of the Quakers, who presented a petition to Parliament in the early 1780s. A few years later, MP William Wilberforce was asked to make representations for the cause from his seat in the House of Commons. Researching further into the subject, he declared that he “thought himself unequal to the task allotted to him, but yet would not positively decline it”. In fact, his name would be forever synonymous with abolitionism.

In 1807, the Slave Trade Act was passed by Parliament, prohibiting the buying and selling of slaves, but not slavery itself. That came 26 years later, in 1833, when the Slavery Abolition Act became law. Certain caveats ensured the legislation wasn’t as absolute as it might have been. Not only were certain parts of the British Empire exempt, but only slaves aged six and under were officially freed. The remainder were classified as ‘apprentices’, with their emancipation staggered and delayed (although this clause was removed five years later). Slave owners were also paid generous amounts for the loss of their 'property'. Twenty million pounds was set aside to recompense them, a figure equating to 40 per cent of Britain’s annual income at the time.

However flawed, the passing of the Act effectively freed around 800,000 slaves across the empire. It also marked an acceleration of worldwide anti-slavery feeling, though US President Abraham Lincoln wouldn’t make his Emancipation Proclamation for another 30 years.



“When I woke up just after dawn on 28 September 1928,” Alexander Fleming later admitted, “I certainly didn’t plan to revolutionise all medicine by discovering the world’s first antibiotic, or bacteria killer. But I suppose that was exactly what I did.” The Scottish microbiologist was only thinking about his impending holiday when he absentmindedly left an amount of Staphylococcus bacteria on a tray in his lab. On his return, he noticed that a patch of mould had stopped the bacteria’s spread. He realised that a substance in the mould, which Fleming called penicillin, had antibiotic qualities that could stem the spread of chronic infections. The number of lives subsequently saved is countless.



When the Swiss biochemist Friedrich Miescher embarked on a pursuit to isolate the protein found in white blood cells, he instead encountered a substance with properties very unlike those of the protein he was researching.

He had effected the first purification of what he named ‘nuclein’ – what we now know as deoxyribonucleic acid, or DNA. Miescher believed his discovery to be an important one, although he remained unsure as to the exact function of nuclein. Initially, the scientific community didn’t take too much notice of it either, and it wasn’t until the last decade of the 19th century that nuclein’s hereditary properties began to be understood.

The German biochemist Albrecht Kossel successfully isolated and named the five compounds that provide molecular structure to DNA and in 1910 was awarded the Nobel Prize for Physiology or Medicine for his pioneering work in cell biology. Other scientists picked up the DNA baton. By the 1940s, the Canadian-American physician Oswald Avery and his colleagues identified that DNA was “the transforming principle” in genetics.

Their work inspired others to research further and deeper. In 1950, Erwin Chargaff made the discovery that DNA was species-specific, and two years later Rosalind Franklin advanced our understanding further still. Her high-resolution photographs of DNA were extraordinary, and pushed Franklin towards a belief that DNA took a helical structure. However, she was beaten to confirming the double helix structure of DNA by James Watson and Francis Crick in 1953. Along with their colleague Maurice Wilkins, they won the Nobel Prize for Physiology or Medicine. Despite her photographs being key to their breakthrough, Franklin wasn’t saluted with a share of the honour. Nor was the name of Friedrich Miescher remembered.



In 1989, after noticing that the large number of global scientists he was working with at CERN (the large particle physics laboratory near Geneva, Switzerland) were having difficulty sharing information easily, Oxford graduate and software engineer Tim Berners-Lee developed an idea that would revolutionise the way we communicate.

His proposal was for a “hypertext project” called “WorldWideWeb”, which would enable “browsers” to view a “web” of “hypertext documents”, rather than logging onto a different computer every time they wanted to access new information.

Essentially, what Berners-Lee was suggesting was an application that uses the Internet (conceived in the late 1960s) to share information such as videos and text. By the end of 1990, the first web page had been served on the open Internet, and in 1991, this new web community was made available to people outside of CERN. The World Wide Web had been spun.



If Athens is the cradle of democracy, then Cleisthenes was its midwife. Despite being born into a less-than-democratic lineage (his maternal grandfather was the tyrant Cleisthenes of Sicyon), Cleisthenes the lawgiver was the architect of a new system of government, one that valued equality over patronage. Having displaced a pro-Spartan oligarchy, Cleisthenes undertook a radical reshaping of the Athenian constitution.

He sought to break up entrenched alliances and to reduce the power of aristocratic families, attempting to replace the status quo with a pan-Athenian worldview that united all strata of society. Under this mindset, the three regions of Attica (the peninsula that projects into the Aegean Sea) worked together to run the city, cutting across previous notions of clan. But while all citizens enjoyed equal rights, there was a glass ceiling. Only men were deemed to be citizens.



In 1687, a prolific English mathematician and astronomer called Isaac Newton published a book that would shape thinking about the cosmos for the next 200-plus years. His Philosophiæ Naturalis Principia Mathematica (aka Mathematical Principles of Natural Philosophy) set out his thoughts about gravity, tides and the movement of planets, all but confirming the heliocentric school of thought – that the Solar System’s planets revolve around the Sun. Newton’s findings dominated until Albert Einstein and his theory of relativity came along, although the Englishman remained modest about how he had recalibrated people’s thinking. “If I have seen further than others,” he once confessed, “it is by standing upon the shoulders of giants.”



“Whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.” Charles Darwin had developed his theory of evolution – that species adapt and evolve over time as a process of natural selection – during his travels in the 1830s, specifically to the Galápagos Islands.

But it would be 20 years before he published it in On The Origin Of Species. There was a reason why he had delayed. His ideas were in contravention of the creationist explanations of the natural world that were dominant at the time – a domination not unconnected to religious benefactors underwriting the work of scientists. Had Darwin published his theory as soon as he had shaped and sanded it, he would have been at the mercy and probable ridicule of the scientific community.

In the end, Darwin’s hand was forced. Another naturalist, Alfred Russel Wallace, had sent him a short overview of his research: Wallace’s theories mirrored Darwin’s own. Darwin hurriedly edited his manuscript and published. The book’s reception from religious quarters was predictably scathing, but some of his scientific brethren stood in his corner, themselves liberated by Darwin’s bravery.



When Prime Minister Neville Chamberlain announced in a solemn radio address on 3 September 1939 that Britain was at war with Germany, there was none of the flag-waving patriotism of August 1914.

Instead, the British people – many of whom had lived and fought through the horrors of World War I – were mostly resigned to the fact that Adolf Hitler, and his aggressive form of German territorial expansion, needed to be stopped.

The road to Allied victory was far from inevitable and the German army proved to be an efficient and effective fighting force. But a combination of events – from the US’s entry into the war in 1941, to D-Day (the largest seaborne invasion in history) in June 1944 – saw the conflict enter its endgame in April 1945. On 7 May 1945, Germany’s unconditional surrender was signed in Rheims and the following day – known as Victory in Europe (VE) Day – was celebrated as the war’s official end in Europe.

“This is not victory of a party or of any class. It’s a victory of the great British nation as a whole”, Chamberlain’s successor Winston Churchill announced in his VE Day address from the balcony of the Ministry of Health in London. But while the streets of Britain erupted in celebration in the wake of the German surrender, war continued to rage in the Far East as Imperial Japanese forces fought the Allies for control of eastern Asia and the western Pacific.

On 6 August, following continued Japanese refusals to surrender to the Allies, an atomic bomb was dropped on the city of Hiroshima, killing an estimated 140,000 people, 70,000 immediately and the remainder from the effects by the end of 1945. A few hours later, US President Harry S Truman again requested Japan’s surrender, stressing that the alternative was “a rain of ruin from the air, the like of which has never been seen on this Earth”.

Three days later, a second atomic bomb was dropped, this time on the city of Nagasaki, wreaking mass destruction on its civilian population. Japanese Emperor Hirohito ordered the Supreme Council for the Direction of the War to accept the Allies’ terms. On 15 August 1945, Truman declared the day as Victory over Japan (VJ) Day, signalling the end of the war.

“Our hearts are full to overflowing, as are your own. Yet there is not one of us who has experienced this terrible war who does not realise that we shall feel its inevitable consequences long after we have all forgotten our rejoicings today”, said King George VI during his address to the nation and empire on VJ Day. Out of the blood and destruction of the six-year conflict, peace had finally been achieved, but at a terrible cost to human life. Figures vary, but up to 80 million lives were lost over the course of the conflict.



On a steamy July day in 1969, those gathered in the control room of what is now the Johnson Space Center in Houston, Texas, held their collective breath.

Hearts were pounding. Brows, perspiring. More than 380,000 kilometres away, close to the surface of the Moon, the object of their concern and anticipation – a strange-looking spacecraft named Eagle – was possibly in difficulty. Alarms were sounding from its in-flight computer as the crew attempted to land it amongst the strewn boulders of the Moon’s Mare Tranquillitatis. Fewer than 30 seconds’ worth of fuel remained.

The tension among NASA’s ground staff in the control room was absolute and unbearable, but eight words from mission commander Neil Armstrong punctured that anxiety. “Houston, Tranquility Base here. The Eagle has landed.”

The immediate response from one of the controllers back at base said it all. “You got a bunch of guys about to turn blue. We’re breathing again. Thanks a lot.”

As heroic as it sounds, “The Eagle has landed” wouldn’t be the most-quoted statement Armstrong would make that day. This he reserved for the moment at which he planted the first human foot on the loose lunar surface. “It’s one small step for man,” he was heard saying down an understandably crackly line, “one giant leap for mankind.” The 650 million TV viewers who were tuned in at home could forgive him for slightly fluffing his lines; he should have said “one small step for a man”. Armstrong later maintained he had said it. Armstrong and his colleague Buzz Aldrin then spent a couple of hours exploring the lunar surface, which the latter described as “magnificent desolation”. Before embarking on the return leg of their journey, the pair planted a US flag into the rocky ground, as well as affixing a plaque to one of the legs of the soon-to-be-abandoned Eagle: “Here men from the planet Earth first set foot upon the Moon. July 1969 AD. We came in peace for all mankind.”

“All mankind” might be debatable. There was a definite political edge to the US’s determination to put a man on the Moon, with the accelerating Space Race being a key (and conspicuous) front of the Cold War. Just a month after the Soviets successfully propelled Yuri Gagarin into space to take the advantage, US President John F Kennedy delivered his ‘moonshot’ speech to Congress, outlining his vision of landing men on the Moon and returning them to Earth “before this decade is out”. For him, the US needed to be the leading party in conquering this final frontier, these uncharted waters. “Only if the United States occupies a position of preeminence,” he observed during another speech, this one in September 1962 at Rice University in Houston, “can we help decide whether this new ocean will be a sea of peace or a new, terrifying theater of war.”

Kennedy was also driven by the idea of creating history, of titanic accomplishment. “We choose to go to the Moon in this decade and do the other things, not because they are easy, but because they are hard; because that goal will serve to organise and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one we intend to win.”

The subsequent Apollo programme, which ran until 1972, consisted of both manned and preparatory unmanned missions. It wasn’t an unqualified success. In January 1967, the Apollo 1 mission ended in tragedy when a fire in the command module during a launch rehearsal killed the three-strong crew. Three years after the tragedy, and nine months after the successful Apollo 11 mission, the explosion of an oxygen tank on its outward journey denied the crew of Apollo 13 the opportunity to land on the Moon. Their safe passage back to Earth was a dramatic, touch-and-go affair.

For those first men on the Moon, their short walk was a profound one. Buzz Aldrin later recalled the experience of gazing back at his home planet. “From the distance of the Moon, Earth was four times the size of a full Moon seen from Earth. It was a brilliant jewel in the black velvet sky. Yet it was still at a great distance, considering the challenges of the voyage home.” The third member of the Apollo 11 mission, Michael Collins, never got to feel moondust under his feet. His view came through the window of the command module Columbia, in which he orbited solo around the Moon while Armstrong and Aldrin got to stretch their legs. He would report that he was neither lonely nor disappointed by this, detailing his emotions as being “awareness, anticipation, satisfaction, confidence, almost exultation”.

But was this extraordinary achievement by these three men actually an achievement? Did the 1969 Moon landing really happen? Conspiracy theorists, seeking a new cause célèbre six years after John F Kennedy’s assassination, poured scorn on the idea that science was able to accomplish a feat as far-fetched as landing a spacecraft on this distant natural satellite.

These doubters believed NASA falsified the landings, filming fake footage to trick people into believing that the Space Race had been won. While up to a fifth of US citizens continue to subscribe to this notion half a century later, substantial third-party evidence has been produced to debunk the theory, including subsequent photographs showing the tracks made by various Apollo crews, as well as the flags that each mission left behind. The Apollo missions were far more than flag-planting, strength-showing exercises.

After Armstrong and Aldrin set foot on the lunar surface, ten more astronauts did likewise over the following three-and-a-half years as five further missions successfully reached their destination. They returned to Earth with the data gathered from extensive experiments – both geological and meteorological – along with an accumulated 382 kilograms of rock samples. But did their findings justify the stratospheric expense, the $25.4 billion outlay that was reported to Congress in 1973?

When Kennedy had announced the Apollo programme, his predecessor in the White House, Dwight Eisenhower, had dismissed it as “just nuts”. But the country wasn’t with old Ike. They were dreaming.

As Andrew Smith, author of Moondust: In Search of the Men Who Fell to Earth, points out: “For one decade, and one decade only, Americans appeared happy, even eager, to place their trust and tax dollar on the collection plate of big government and its scientist priests”. And they got what Kennedy had promised them. Footprints on the Moon. And one giant leap.

Nige Tassell writes about sport and popular culture as both a journalist and author.


This article first appeared in the August 2019 issue of BBC History Revealed
