The 19th century was a period of rapid technological change and huge shifts in scientific understanding. As medical knowledge increased, surgery was no longer a likely death sentence, and great leaps were made in medical procedures, equipment and understanding of how the body worked.
By the 1860s, what would become known as cell theory was widely accepted: that all living organisms are composed of one or more cells; that the cell is the basic unit of structure and organisation in organisms; and that cells arise from pre-existing cells. New methods of cell staining and improvements in technology such as the microscope also played crucial roles in furthering biological breakthroughs and aiding new research.
One of the most important medical breakthroughs of the 19th century was French chemist Louis Pasteur’s discovery that germs cause disease, following on from Italian bacteriologist Agostino Bassi’s work on silkworm infections. Pasteur’s work with pasteurisation and his subsequent breakthroughs with inoculations against anthrax and rabies in turn inspired men like Robert Koch, who, in the early 1880s, identified the microorganisms that cause tuberculosis and cholera and created standards for researching the links between germs and disease.
“Having the ability to separate different bacteria and identify them for the first time was a huge medical breakthrough,” says Professor Mary Fissell. “Being able to associate a specific germ with a specific disease and form a clear relationship between the two ushered in a completely new way of thinking about the body and ill health. There wasn’t an immediate therapeutic payoff to the discovery, but it wasn’t long before scientists started to figure out that antitoxins could be created to cure specific diseases.
“The first antitoxin, for diphtheria, was developed in the early 1890s and its impact was incredibly dramatic. I’ve always loved the fact that, in Britain, the diphtheria antitoxin was held in police stations because it could be accessed all night.”
Florence Nightingale and Elizabeth Garrett Anderson both made their mark in a man’s world
The foundations of modern nursing and the nursing profession itself were laid in the 19th century, and both owe a great deal to individuals such as Florence Nightingale (above), whose experiences as a nurse during the Crimean War revolutionised nursing practices. Arriving in Scutari, the British base hospital near Constantinople in November 1854, Nightingale and her team of nurses were greeted by horrific conditions: rodents and bugs; patients lying in their own excrement; contaminated water and lack of basic medical supplies.
Prioritising cleanliness and fresh air, Nightingale’s tireless work saw the death rate at Scutari fall from 43 per cent to 2 per cent, and led to the establishment of a Royal Commission for the Health of the Army in 1857. News of her work spread, and by 1900 there were 64,000 trained British nurses, with Nightingale herself having founded a training school for nurses in 1860. In 1876, women were formally permitted to enter the medical profession, although Elizabeth Garrett Anderson had already become the first Englishwoman to qualify as a doctor, in 1865.
Prior to the 19th century, surgery had largely been performed outside the body – the only real operative surgery undertaken was the removal of bladder stones, a procedure performed without anaesthetic and usually completed in under two minutes to avoid the patient dying of shock and pain. In 1658, prolific diarist Samuel Pepys had a bladder stone allegedly the size of a tennis ball removed without pain relief – historians have suggested Pepys may have survived the procedure because he had been first on the surgeon’s list that morning and had therefore been operated on with clean tools and hands!
“The advent of anaesthesia and antiseptic surgery was a real game changer, and surgery as we know it today was really invented in the late 19th century”, comments Fissell. “As germ theory developed, people began to conceptualise what that might mean in terms of surgery. Initially, antiseptic surgery saw mists of carbolic acid continually sprayed in operating rooms during medical procedures in order to kill bacteria in the air, and wounds were packed and covered with lint and gauze soaked in carbolic acid. But by the 1880s, aseptic practices were being developed, eventually including steam sterilisation of instruments, rubber gloves, and the wearing of surgical gowns.”
The greatest showman?
Scottish surgeon Robert Liston was always keen to demonstrate his knife skills
In the early 19th century, surgeons became minor celebrities as they vied with each other to operate on conscious patients – before anaesthesia – in the quickest time.
One of the era’s most famous surgeon-showmen was Scotsman Robert Liston, whose speed at amputation (just 30 seconds in some cases) was widely celebrated. On one occasion, Liston’s enthusiasm saw him amputate a patient’s testicles along with their leg, while on another, his knife accidentally took off his assistant’s fingers in addition to his patient’s limb – both the assistant and the patient later died of infection.
Nevertheless, Liston, who, unusually for pre-germ theory times, always removed his frock coat before operating, performed 66 amputations between 1835 and 1840, of which ‘only’ 10 patients died (around 1 in 6). Elsewhere, at nearby St Bartholomew’s Hospital, one in four patients ended up in the mortuary.
9 medical breakthroughs in the 19th century
From X-rays to new wonder drugs, the 19th century saw seismic change…
Stethoscope
In 1816, French physician René Laënnec observed two children signalling to each other via a piece of wood and a pin – one child had their ear to the wood and was receiving the amplified sound of a pin being scratched at the opposite end. Laënnec was inspired by this acoustic phenomenon to invent the stethoscope, with which the sounds made by the heart and lungs could be heard more clearly.
Quinine
In 1820, French scientists Pierre Pelletier and Joseph-Bienaimé Caventou discovered the process to extract and isolate quinine from the bark of the Cinchona tree, which had been used in powdered form to treat malaria since the 17th century.
Aspirin
In 1897, German chemist Felix Hoffmann successfully acetylated salicylic acid with acetic anhydride, creating acetylsalicylic acid – better known as aspirin – a substance that could relieve fever, pain and inflammation without upsetting a patient’s stomach.
Chloroform
In 1847, James Young Simpson, professor of midwifery at the University of Edinburgh (below), became the first physician to demonstrate the anaesthetic properties of chloroform on humans. He went on to pioneer its application in surgery and obstetrics.
Antiseptic surgery
Building on the work of men like Louis Pasteur and Ignaz Semmelweis (the latter had shown that doctors themselves were transmitting childbed fever in hospitals), Joseph Lister developed a carbolic acid spray for use during surgery. The death rate from infection after surgery fell significantly as a result, although it was later realised that carbolic acid damaged the body’s tissues.
X-rays
In 1895, German physicist Wilhelm Röntgen accidentally discovered X-rays while testing whether cathode rays could pass through glass. Soon, doctors in Europe and the US were using X-rays to locate bullets, bone fractures, swallowed objects and kidney stones.
Electric hearing aid
The first portable electric hearing aid was patented in 1895 by electrical engineer Miller Reese Hutchison (below). Known as the Akouphone, it used a carbon transmitter and electrical current to amplify weak audio signals.
Heart surgery
In 1893, African-American general surgeon Daniel Hale Williams performed the first successful documented heart surgery on a young black man, James Cornish, who had been stabbed in the chest. Cornish survived and was discharged 51 days later.
Ophthalmoscope
The German physiologist Hermann von Helmholtz is generally considered to have invented the ophthalmoscope, in 1851, although English mathematician and inventor Charles Babbage is also a contender. The device revolutionised ophthalmology, allowing physicians to examine the inside of the human eye.
A poor state of health
Britain’s filthy and overcrowded cities caused cholera chaos…
Between 1801 and 1841, Britain witnessed a population explosion: the number of people living in London doubled, and in Leeds, the number nearly tripled. Many cities’ housing and sanitation systems struggled to cope, and people were forced to live in appalling conditions: disease was rife.
Cholera – a bacterial infection spread through contaminated food or water – first arrived in Britain in 1831 and thrived in crowded, industrial towns. A cholera epidemic in 1831–32 claimed more than 50,000 lives across Britain, followed by another in 1848–49, during which a Board of Health was set up with the power to clean streets and build sewers. But the prevailing theory in the first half of the 19th century held that disease was spread via bad air, and more than 20,000 people died in a further cholera outbreak in 1854.
It was during this third epidemic that Dr John Snow made the connection between contaminated water and cholera – he plotted cases in the Soho area of London and identified a specific water pump as a source of contagion. After the pump handle was removed, cholera cases dropped dramatically.
Mass vaccination programmes also launched in the 19th century – in 1853, smallpox vaccination became mandatory in the first three months of an infant’s life. And in 1875, a Public Health Act placed responsibility for public health on local councils – streets, sewers and water supplies were to be kept clean and inspectors hired to enforce the laws.
This article first appeared in the February 2021 issue of BBC History Revealed
Charlotte Hodgman is Strategic Projects Editor for HistoryExtra. She currently looks after the HistoryExtra Academy and was previously editor of BBC History Revealed and deputy editor of BBC History Magazine – although not at the same time. She also makes the occasional appearance on the HistoryExtra podcast.