Essays on Science and Society: BEYOND THE IVORY TOWER

The Victorian Revolution in Surgery

Science, 02 Apr 2004: Vol. 304, Issue 5667, pp. 54-55
DOI: 10.1126/science.1092361

The 1944 Hollywood movie The Great Moment tells of the discovery of ether anesthesia in Boston in the 1840s. This discovery was one of a trio of clinical innovations between the 1840s and the 1890s that collectively made up the Victorian revolution in surgery: anesthesia, antisepsis, and x-rays. But did these “moments” really represent a revolution in surgery alone, or did they set in motion an even larger revolution in medicine? Viewed historically, these “discoveries” help us understand how medical innovations relate to science and technology. They also reveal how a new medical marketplace came to be and how market forces shaped modern medicine.

The first of the three clinical innovations was the introduction of ether in America in 1846 and chloroform in Britain in 1847. Inhalation of the vapors of these compounds not only put people to “sleep,” making them insensible to pain, but, as one Victorian surgeon declared, meant that patients were “rendered unconscious of torture” (1). This was a boon not only for those who chose to go under the knife but also for those who wielded it, because surgeons no longer had to contend with patients who squirmed on the operating table during an amputation—or who tried to escape altogether.

Twenty years later, the Glasgow-based surgeon Joseph Lister put forward his system of antiseptic surgery. Lister was correct in his view that surgical wound infection was caused by bacteria, but his methods for combating their action were cumbersome, constantly changing, and confusing. His techniques included varying dilutions of carbolic acid (phenol) and an array of putty, tin, and rubber protective devices. He also used vaporizing sprays that emitted an unpleasant and irritating acidic mist in the vicinity of patient and surgeon, though he later denounced the use of this equipment.

By the 1880s, antiseptic surgery (or “Listerism”) had transformed into aseptic surgery as knowledge about pathogenic bacteria accumulated. Surgeons now concentrated their efforts on excluding disease-causing bacteria from incisions and amputation sites by ensuring that their own hands had been thoroughly cleaned and their street clothes were covered by clean white gowns; later, they began to wear caps, masks, and rubber gloves (see the figure, below).

Aseptic precautions became universal by 1900. Surgeons and nurses wore white caps and gowns to reduce postoperative surgical infections. [CREDIT: OTIS HISTORICAL ARCHIVES, NATIONAL MUSEUM OF HEALTH AND MEDICINE]

By the late 1890s, white-robed surgeons had replaced their black and bloody frock-coated confreres of earlier decades. Sterilization became the order of the day as hospitals installed autoclaves and water treatment equipment; dry iodoform dressings supplanted the earlier wet, carbolic acid-impregnated ones; and instruments made from a single piece of steel, which could be readily sterilized, replaced the bone- and wooden-handled surgical tools that could not. Operating rooms and their furniture, too, were remodeled to incorporate smooth, impervious surfaces that did not harbor germs and could be readily cleaned (2, 3).

Coeval with these developments, in 1895 the German physicist Wilhelm Roentgen discovered x-rays while experimenting with cathode-ray tubes. With the assistance of his wife, who volunteered to hold her hand in the uncontrolled stream of rays emanating from the tube, he photographed the skeletal structure of her hand. Roentgen's ability to “photograph the invisible” (see the figure below) immediately became front-page news around the world. For the first time, doctors could view the internal bone structure of a living body without slicing it open (4).

X-rays for viewing internal bones and bullets. One of Roentgen's first x-ray photographs, taken in 1896. [CREDIT: BETTMANN/CORBIS]

The implications of x-rays for surgery were obvious. The most vivid illustration was their immediate impact on military medicine. During the Spanish-American war in 1898, for example, American hospital ships sent to Cuban and other war zones were fully equipped with bacteriological laboratories, aseptic operating suites, and radiological apparatus. Radiographs of bullets embedded in bone, soft tissues, and shattered joints guided army surgeons in their work (5).

The surgical world of the mid-1890s was thus radically different from that of the 1840s; indeed, it remains closer to the surgery of today than to that of the 1840s. How did scientific knowledge, medical technology, and society contribute to this fundamental change? The historian George Basalla has argued that technology is not the servant of science and that necessity is not the mother of technological invention (6). The components of the surgical revolution are grounded in techniques and mechanical devices—innovations that, at heart, are technological. Did they depend on the scientific theory and social needs of their day?

Surprisingly, the answer is—not really. Knowledge of chemical or physiological principles had little to do with the advent of anesthesia or with explaining its action; even today, we do not know the scientific grounds for profound anesthetic states. Listerism, too, was based less on solid scientific research than on an interesting hunch: In the mid-1860s, there was little scientific basis for it beyond a still-unsubstantiated germ theory of disease. Physics was better developed than many of the biological sciences, but a theoretical explanation of x-rays did not come until a couple of years after Roentgen's pioneering work in radiology. In these examples of innovation, at least, science was not technology's master.

The inventions were also not always responses to necessity. For many, anesthesia was not a solution to a pressing need: Victorian doctors were not engaged in a relentless search for ways to reduce the pain of surgery. The means to induce general anesthesia, in the form of nitrous oxide and ether, had been available and successfully tried for many years before the “great moment” in Boston (resulting in a long-running priority dispute). Even after formal demonstrations of the effects and benefits of ether and chloroform, many patients refused to consent to surgery while under the influence of such noxious gases. Reports of unconscious women being raped by their doctors or dentists further fueled popular mistrust of the innovation. Accounts in newspapers, medical journals, and coroners' inquests of “healthy” people dying after the administration of ether or chloroform added to public skepticism (7). Several decades after the introduction of anesthesia, some Victorian surgeons still considered it an unnecessary luxury.

Similarly, Victorian surgeons were well aware of the dangers of wound suppuration, especially after amputations, but they attributed the problem to impure air from crowded and improperly designed hospital wards. Joseph Lister's original idea therefore appeared at first to have little to do with solving this issue. Prolonged resistance to Listerism subsided only when laboratory scientists showed that hospital infection was microbiological, not environmental, in nature.

Reports going back to the 1870s of cathode rays fogging photographic plates—the same phenomenon that tipped off Roentgen—indicate that x-rays were around long before his discovery of them. But the timing of Roentgen's discovery was not driven by economic or clinical imperatives. That x-rays became front-page news of course had a lot to do with a few doctors taking quick advantage of their medical uses. But for most Victorians, their linkage to the “modern” era of mechanical and electrical devices such as the telephone, phonograph, incandescent light, and electric appliances seems to have been just as important. With x-ray technology, medicine could be seen as part of this wave of modernism (8).

The historical insights above derive from the viewpoint that science and technology may lead separate social and intellectual lives. But are not science, technology, and medicine intimately connected to each other—not only intellectually but also socially and even economically? The genesis, development, and impact of anesthesia, the control of wound infection, and radiology all relate to the practices of science, technology, and medicine. The historian and former scientist John Pickstone has called for a reexamination of technology's historical relation to science and medicine and their interactions with social and economic trends. He asserts that we must consider science, technology, and medicine (STM) together to understand how they developed either singly or collectively.

Pickstone has also advanced the concept of “technoscience.” When business enterprises and/or the state get involved in the innovation process associated with experimental laboratories, not only as supporters of research but also as economic or political beneficiaries of it, then technoscience is probably being practiced. Scientific knowledge and its products then become a commodity that can be bought and sold in the marketplace. Examples include innovations with military or pharmaceutical applications. They “involve dense intertwinings of universities, industry and government … [t]rying to separate the science from the technology seems less profitable than recognising their characteristic, dynamic combinations” (9).

Applying STM thinking along with the term technoscience introduces a new spin on the Victorian surgical revolution. The development of large-scale medical technology helped to shift the surgeon-doctor from the realm of independent skilled artisan to the world of corporate, mechanized medicine. Until the end of the 19th century, doctors owned their own instruments, which were often passed down from father to son. This tradition would continue in the iconic doctor's “black bag” and its contents. But as the range of equipment used became more complex and the challenges of sterilization, maintenance, and operation became overwhelming, increasingly it would be the hospital that owned and supplied surgical instruments and apparatus.

Likewise, as the technology of anesthesia and radiology became bulkier and more complicated, it was more likely that this apparatus would be located in hospitals. Hospitals underwent major reconstruction to accommodate the requirements of this new technology. Hospital architecture and design became a new specialty at the turn of the 20th century (10).

Equipment suppliers also profited from the surgical revolution during these early steps toward “technoscience.” During the late 19th century, Robert Wood Johnson and his brothers marketed prepackaged sterilized gauze across the United States. They also sold sterilized catgut sutures to doctors and hospitals. This product line of ready-to-use surgical dressings was the foundation for what would become Johnson & Johnson, a multibillion-dollar global enterprise. By 1896, Siemens and General Electric were already making and selling the first x-ray apparatus; today, these corporations are among the world's largest industrial researchers, commercial suppliers, and financial beneficiaries of this and other medical technologies. Turn-of-the-century instrument and hospital supply catalogs—typically a thousand pages long—advertising aseptically designed amputating knives, furniture, and surgical garb, as well as x-ray and other electrical apparatus, illustrate this market boom. One American supplier noted in 1899 that manufacturers now measured their annual trade in hundreds of thousands of dollars, not the hundreds of only a few years before.

All of this meant that the tools and technology of medicine quickly acquired an economic significance that would burgeon in the centuries to come. Technoscience was arriving. From the late 19th century, then, STM innovations became commodities, due in great part to the surgical transformation described here. Perhaps this was the real revolution.

References and Notes

  1.
  2.
  3.
  4.
  5.
  6.
  7.
  8.
  9.
  10.
