What It Was Like To Be Sick In 1884
But the effectiveness of these drugs did not undermine the traditional home and family orientation of medical practice: in contrast to 1984, every weapon in the physician’s armory was easily portable. Contemporaries sometimes complained of difficulties in finding competent nurses and continuous medical help in critical ailments, but neither problem was serious enough to convince middle-class patients that they might best be treated away from their families.

Even surgery was most frequently undertaken in the patient’s home—despite the fact that revolutionary changes already had begun to reshape surgical practice. One source of this change was the rapid dissemination of anesthesia, which by 1884 was employed routinely. The question was not whether to administer an anesthetic in a serious operation but which one it should be; ether, chloroform, and nitrous oxide all had their advocates.

Despite the availability of anesthesia, however, major operations remained comparatively uncommon. Many of the technical problems of blood loss, shock, and infection had not been solved. But the style of surgery certainly had changed, as had the surgical patient’s experience. “Formerly,” as one surgeon explained the change, the “great aim of the surgeon was to accomplish his awful but necessary duty to his agonized patient as rapidly as possible.” Surgeons even timed their procedures to the second and vied with each other in the speed with which they completed particular operations. Now, the same surgeon explained, “we operate like the sculptor, upon an insensible mass.” The best surgeon was no longer necessarily the fastest.

But doing away with surgical pain had not removed the more intractable dilemma of infection; by increasing the amount of surgery and length of time occupied by particular procedures, it may actually have worsened the problem of surgical infection. In the mid-1860s the Glasgow surgeon Joseph Lister had suggested that such infection might be caused by microorganisms—ordinarily airborne—and proposed a set of antiseptic procedures to keep these organisms from growing in exposed tissue. Immediate reactions were mixed. At first Lister’s ideas were seen as extreme and wedded arbitrarily to a particular antiseptic, carbolic acid—“Lister’s hobbyhorse” as skeptics termed it. But Lister gradually modified his technique, and by 1884 his point of view had come to be accepted by most American surgeons. This was also the year when surgeons learned that Queen Victoria had awarded Lister a knighthood; he already had become a historical figure.

But the problem of surgical infection was still far from solved in practice. Most surgeons and hospitals paid due homage to Lister but had no consistent set of procedures for keeping microorganisms away from wounds and incisions. Medical memoirs of this period are filled with stories of surgeons operating in their street clothes, of their using dressings again and again without intervening sterilization. Natural sponges were washed and reused. The day of aseptic surgery, in which every aspect of the operating room was calculated to keep contaminating objects as well as the atmosphere away from wounds, was still a decade away.

PART OF THE DIFFICULTY surgeons experienced in understanding Lister’s theories paralleled the more general problem of relating microorganisms to infectious disease: physicians had difficulty envisaging how such tiny living things could bring about catastrophic change in individuals so much bigger. And why did one person exposed to a disease fall victim while another continued in good health?

Tuberculosis was a particularly good example. The single most important disease of the century, in terms of mortality, tuberculosis always had been seen as caused by a combination of constitutional and environmental factors such as diet, work, and cleanliness. The simple announcement that a particular bacterium was associated with the disease could not change these age-old views. One needed both seed and soil to grow a crop, as a frequently used analogy ran: in their enthusiasm for the germ theory, physicians should not lose sight of the fundamental role played by the soil—that is, the individual’s life history and constitutional endowment—in preparing the way for infection. These views would not be changed easily, for they incorporated centuries of acute clinical observation as well as the authority of tradition.