In Hiroshima on 6 August 1945 – the day of the Christian Feast of the Transfiguration – a young mother, Futaba Kitayama, looked up to see “an airplane as pretty as a silver treasure flying from east to west in the cloudless pure blue sky”. Someone standing by her remarked: “A parachute is falling.” Then the parachute exploded into “an indescribable light”. The American B-29 bomber Enola Gay had just dropped ‘Little Boy’, a four-tonne bomb which detonated with the power of 15,000 tonnes of TNT. Pilot Paul Tibbets, who had named his plane after his own mother, struggled to hold it steady as the first shock waves hit. Bathed in bright light, he looked back and saw “a giant purple mushroom boiling upward”. Over the intercom he announced to his shaken crew: “Fellows, you have just dropped the first atomic bomb in history.”
The destructive flash that seared Hiroshima into history was the culmination of 50 years of scientific creativity and innovation, and over 50 years of political and military turmoil. Generations of scientists had contributed to that moment. Yet, as they began uncovering the minute building blocks forming the world around them, few could have predicted how their compulsive curiosity would combine with political events to produce a devastating new weapon.
The journey began at the end of the 19th century, often called the chemists’ century (as opposed to the 20th, which physicists would dominate). The first step to the bomb arguably arrived in the mid-1890s, when Henri Becquerel found that uranium emitted energy rays. Subsequently, Marie Curie experimented with uranium as well as other elements. She discovered that thorium also emitted these enigmatic rays, which she later called ‘radioactivity’.
New Zealand physicist Ernest Rutherford found that uranium actually emitted two types of radiation – alpha and beta – and suspected a third. He was right: the third type would later be named gamma radiation.
In 1919, Rutherford made history again. He became the first person to chip a piece off an atomic nucleus, in the process identifying a sub-atomic particle – the positively charged proton (his mentor JJ Thomson had discovered electrons some years before). Rutherford already suspected the presence of another particle within the nucleus: the neutron. One of his former pupils, James Chadwick, discovered the sub-atomic particle at Cambridge University’s Cavendish Laboratory in 1932. The neutron contained no charge – hence its name – and, if used to bombard elements, it could penetrate atomic nuclei without being deflected. Thus, it was an ideal tool for the investigation of atoms.
The year 1932 was a scientific annus mirabilis. Also at the Cavendish, John Cockcroft and Ernest Walton built the first machine – using plasticine to seal its joints; the Cavendish did not encourage extravagance – to disintegrate an atomic nucleus with accelerated particles. Their device, dubbed a linear accelerator, provided the first experimental confirmation of Albert Einstein’s equation E = mc² (energy equals mass multiplied by the speed of light squared), implying that enormous amounts of energy could be squeezed from a tiny mass.
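To give a sense of the scale the equation implies, here is a back-of-envelope sketch in Python. The 0.7-gram figure is an illustrative assumption, not a number from this article; it simply shows how a sliver of mass corresponds to the kiloton-range energies described later.

```python
# Back-of-envelope illustration of E = m * c**2.
# The figures below are illustrative assumptions, not from the article.

C = 2.998e8            # speed of light, in metres per second
TONNE_TNT_J = 4.184e9  # energy released by one tonne of TNT, in joules

def mass_to_tnt_tonnes(mass_kg: float) -> float:
    """Tonnes-of-TNT equivalent of converting `mass_kg` of matter to energy."""
    return (mass_kg * C**2) / TONNE_TNT_J

# Converting roughly 0.7 grams of mass yields on the order of
# 15,000 tonnes of TNT.
print(f"{mass_to_tnt_tonnes(0.0007):,.0f} tonnes of TNT")
```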
A community torn apart
But in 1933 – the year when Rutherford dismissed the idea of harnessing energy from atoms as “moonshine”, and when Adolf Hitler came to power – the scientists’ world changed. Previously a small, close-knit community, they had met frequently at international conferences – disparagingly termed “witches’ sabbaths” by Einstein – and published their results openly. Now, friendships and professional partnerships were sundered, as many were compelled to flee Nazi Germany and other countries gripped by totalitarian regimes because of their race or political views. Einstein crossed the Channel to England, protected by the British Naval Commander Oliver Locker-Lampson. However, finding England too formal (he preferred “No butlers. No evening dress”), he accepted a post at Princeton University.
Against this increasingly tense backdrop, the final pieces of the atomic jigsaw were dropping into place. In France, Marie Curie’s daughter, Irène, and her husband, Frédéric Joliot-Curie, discovered how to force nuclei to disintegrate to form new, unstable elements that released radioactive energy as they decayed – so-called ‘artificial radioactivity’. In Rome, physicist Enrico Fermi found that firing neutrons at a target substance through a filter of water – his assistants carried it in buckets from the goldfish pond in the garden behind his laboratory – dramatically slowed the neutrons, enhancing their chances of hitting and penetrating the target nuclei.
However, in Mussolini’s Italy, Fermi’s Jewish wife was in danger. Awarded the 1938 Nobel Prize in Physics, he used the ceremony in Stockholm as a pretext to leave Italy with his family and travel to the US. Like other scientists, Fermi had friends in Germany, two of whom – the chemist Otto Hahn and Vienna-born Jewish physicist, Lise Meitner – were renowned for their experimental work.
In 1938, with Hahn’s help, Meitner fled to Sweden, where Hahn sent her results from experiments on uranium that he had conducted but could not interpret. During a cold, snowy walk with her nephew, physicist Otto Frisch, Meitner pondered the results and realised that Hahn had succeeded in splitting the uranium nucleus, releasing energy. Back in Copenhagen, where he was working with Danish scientist Niels Bohr, Frisch asked American biologist William A Arnold what he called the process by which single cells divided. Arnold replied: “fission.”
Soon after, while in the United States, Bohr announced the discovery of fission. However, his own studies showed that of the two isotopes present in uranium – U-238 and U-235 – only the much rarer U-235, constituting a mere 0.7 per cent of natural uranium, actually fissioned. This reassured him that an atomic bomb was not viable, since it would take a gigantic effort to separate enough U-235 fuel from natural uranium; “you would need to turn the entire country into a factory,” Bohr believed.
Exposing nature’s secrets
Hungarian scientist Leo Szilard, now in the US, was less certain that an atomic bomb was impossible, and he lobbied fellow scientists to keep ‘nature’s secrets’ secret. However, the Joliot-Curies were not swayed by Szilard’s pleas, and in March 1939 they published a paper suggesting that a chain reaction in uranium might be possible – a self-sustaining process in which a neutron induces a nucleus to fission, releasing energy and further neutrons that trigger more fissions in turn. In August 1939, an alarmed Szilard persuaded Einstein to write to US president Franklin Roosevelt, alerting him to the dangers of nuclear weapons. One of Roosevelt’s advisors convinced the initially sceptical president to act, and the Uranium Committee was established.
The outbreak of the Second World War meant many British physicists turned to war work, in particular radar. Excluded from classified projects, refugee scientists in Britain continued to ponder an atomic weapon. They included Otto Frisch, now in England, and German-born Rudolf Peierls. Hunched over a small gas fire at Birmingham University in the bitterly cold winter of 1940, they calculated the amount, or ‘critical mass’, of fissionable U-235 needed to release sufficient neutrons to spark a self-sustaining chain reaction to create an atomic bomb. The Joliot-Curies had estimated 40 tonnes. Peierls and Frisch calculated about one pound.
Scribbling on the back of an envelope, they calculated the energy released as equivalent to thousands of tonnes of ordinary explosive. Peierls recalled: “We were quite staggered… an atomic bomb was possible, after all, at least in principle!” They composed the Frisch-Peierls memorandum, a compelling three-page document which they presented to the British government in March 1940. It dealt with scientific, strategic and ethical issues, suggesting that the very high likely number of civilian casualties “may make it unsuitable as a weapon for use by this country”. But, as Germany might have been working on a bomb, they suggested Britain could develop one as a deterrent, “even if it is not intended to use the bomb as a means of attack”.
The high-powered MAUD Committee reviewed the memorandum, and in summer 1941 recommended that Britain should institute an atomic bomb project. Churchill agreed: “Although personally I am quite content with the existing explosives… we must not stand in the path of improvement.” British scientists sent copies of the report to colleagues in the still-neutral US, who used it to convince Roosevelt to expand fission research. After Pearl Harbor and America’s entry into the war, that relatively modest work grew into the vast scientific and engineering effort codenamed the ‘Manhattan Project’, which cost $2bn, ballooned to the size of the US car industry and employed 130,000 people.
Under the overall supervision of General Leslie Groves, the project’s scientific core, led by physicist Robert Oppenheimer, was based at Los Alamos in New Mexico, where a small British team under James Chadwick joined it. On 2 December 1942, in a nuclear pile (reactor) built under the west stands of the University of Chicago’s disused Stagg Field stadium, Enrico Fermi achieved the world’s first self-sustaining chain reaction. This convinced Roosevelt to fund industrial-scale plants to produce U-235 and also plutonium – an alternative fissionable element created at Berkeley.
Nazi Germany had surrendered before the Manhattan Project team felt confident they were ready. The world’s first nuclear explosion – the Trinity test – took place on the morning of 16 July 1945 in the New Mexico desert, producing the equivalent of 20,000 tonnes of TNT. The brilliant blinding flash “told us… we had done our job”, Peierls recalled. As for Oppenheimer, he remembered that a line from the Hindu scripture Bhagavad Gita raced through his brain: “I am become Death, the destroyer of worlds.”
Locking onto the target
The target now was Japan, where the military estimated the atomic bomb might save a million young American lives that would otherwise be lost in invading the Japanese home islands. A target committee, with British as well as American members, identified the key criterion for deciding where the bomb should fall: “Obtaining the greatest psychological effect against Japan”. They whittled suitable targets down to three: Hiroshima, Kyoto and Niigata. When US secretary of war Henry Stimson removed Kyoto because of its cultural significance, Nagasaki replaced it.
Waiting with his crew on the Pacific island of Tinian, Paul Tibbets learned Hiroshima was his primary target, unless cloud cover was too thick. The final decision would depend on reports from weather planes flying ahead of him. At 2.45am, 6 August 1945, Tibbets opened the throttles, and the heavily laden Enola Gay accelerated slowly down the chopped coral runway. Little more than 100 feet from its end, he eased the B-29 safely and steadily into the northern sky.
Just over five hours later, a weather plane assigned to Hiroshima radioed that cloud cover was relatively light. Tibbets told his crew: “It’s Hiroshima.” At 8.15am Hiroshima time, they were over the city, the Enola Gay’s bomb doors opened and, Tibbets later wrote, “out tumbled ‘Little Boy’”. It exploded with a temperature of 1 million degrees centigrade at its heart, generating a white-hot fireball. Seeing the boiling clouds, his co-pilot Robert Lewis scribbled on his notepad: “My God, what have we done?” Futaba Kitayama, who had watched it fall, ran for her life. She recalled that in the river, “corpses were floating like dead dogs, their shreds of clothing dangling like rags”. She also recounted: “I saw a woman floating face up, her chest gouged out, and gushing blood. Could such terrifying sights be of this world?” On the banks, school children writhed, “crazed, crying, ‘mother, mother’”.
As the fires grew in intensity, heavy rain began to fall. At first it came down in large, sticky black drops – “black rain” – as water mingled with the soot, dust, debris and radioactive material flung into the air by the explosion. Traumatised people instinctively opened their mouths to cool their parched throats. At his clinic two and a half miles from the epicentre, army doctor Hiroshi Sawachika struggled to cope as badly burned people arrived. “They held their hands aloft. They looked like they were ghosts,” he said. A heavily pregnant woman begged, “Please help my baby live,” but he could do nothing.
A fractured nation
Japan surrendered on 14 August 1945, five days after a second bomb – the plutonium-fuelled ‘Fat Man’ – devastated Nagasaki. Exact casualty figures are hard to calculate. In Hiroshima, the authorities estimated the number of deaths by December 1945 to be around 140,000; there were approximately 350,000 people in the city when the bomb fell. Since then, deaths from radiation-related diseases have added to the toll.
Although Hiroshima is again a vibrant city with a population more than treble what it was in August 1945, it still remembers. The former Hiroshima Prefectural Industrial Promotion Hall, close to the epicentre of the blast, has – apart from work to stabilise it – been left untouched, a shattered icon. Every day at 8.15am, a bell by the dome rings out, and for a moment passers-by pause and reflect. However, not all Hiroshima survivors are comfortable with how they are perceived. One commented: “I don’t like this special view of us. I’d like to stand up as an individual.”
The dropping of the atomic bomb cast a shadow over the lives of some of the Enola Gay’s crew. Tibbets, the plane’s pilot, repeatedly stated that he felt no personal guilt for doing what he saw as his military duty. However, he wrote in his memoirs: “I feel a sense of shame for the whole human race, which through all history has accepted the shedding of human blood as a means of settling disputes between nations.”
His co-pilot, Lewis, raised money for medical treatment for the so-called ‘Hiroshima maidens’ – young girls disfigured by the blast. In 1971, he auctioned his log of the Enola Gay’s flight and used part of the proceeds to buy marble from which he sculpted statues with religious themes. They included ‘God’s Wind’, a mushroom-shaped statue that symbolically leaked blood.
For their part, many atomic scientists agonised at the time and later about the morality of a nuclear weapon. Oppenheimer returned to academic life and refused to support US plans to build an even more powerful nuclear weapon: the world’s first hydrogen bomb. A variety of opponents, including fellow Manhattan scientist Edward Teller – devoted to what he called the “sweet technology” of the hydrogen bomb – used this, as well as Oppenheimer’s leftwing sympathies, to attack him during the anti-communist McCarthy era.
Meanwhile, Joseph Rotblat – the only scientist to walk out of the Manhattan Project on conscience grounds – returned to England, his adopted homeland, and campaigned against nuclear proliferation. Working with other leading scholars, he founded the Pugwash Conferences: a means of bringing academics and public figures together to find peaceful solutions to global security threats. In 1995, he received the Nobel Peace Prize.
Of course, many more scientists who had been involved in the race to build the atomic bomb grappled with the moral implications of their ruinous creation. Perhaps the last word should fall to Einstein, who famously regretted the role he played in the development of nuclear weapons. He said: “Had I known that the Germans would not succeed in producing an atomic bomb… I would have never lifted a finger.”
Diana Preston is the author of Before the Fallout: From Marie Curie to Hiroshima (Corgi, 2006). Her most recent book, out in paperback, is Eight Days at Yalta (Picador, 2020)