How America’s Health Care Fell Ill

It is clear that there is something terribly wrong with how health care is financed in this country. Doctors, who are drowning in paperwork, don’t like it. Insurance companies, which write most of the checks for medical care, don’t like it. Corporations, whose employee medical benefits now just about equal their after-tax profits, don’t like it. And the roughly thirty-seven million Americans who live in terror of serious illness because they lack health insurance don’t like it.

Is there a better way? How do we fund the research that has alleviated so much human suffering and holds the promise of alleviating so much more? How should society care for those who cannot provide for themselves? How do we give them and the rest of us the best medicine money can buy for the least amount of money that will buy it?

Finding the answers to these questions will be no small task and will undoubtedly be one of the great political battles of the 1990s, for vast vested interests are at stake. But the history of medical care in this country, looked at in the light of some simple, but ineluctable, economic laws, can, I hope, help point the way. For it turns out that the engines of medical inflation were deeply, and innocently, inserted into the system fifty and sixty years ago, just as the medical revolution began.

The first medical milestone on record, dating to 2700 B.C., is the Chinese emperor Shen Nung’s development of the principles of herbal medicine and acupuncture. (Emperors, it would seem, were more gainfully employed back then than they are today.) But the practice of medicine is far more ancient than that. And while rather a different line of work is often called “the oldest profession,” medicine has at least an equal claim.

Even the most technologically primitive tribes today have elaborate medical lore, often intertwined with religious practices, and there is no reason to doubt that earlier peoples had the same.

It was the Greeks—the inventors of the systematic use of reason that two thousand years later would evolve into science—who first believed that disease was caused by natural, not supernatural, forces, a crucial advance. They reduced medicine to a set of principles, usually ascribed to Hippocrates but actually a collective work. In the second century after Christ, the Greek physician Galen, a follower of the Hippocratic school, wrote extensively on anatomy and medical treatment. Many of these texts survived and became almost canonical in their influence.

After classical times, therefore, the art of medicine largely stagnated. Except for a few drugs, such as quinine and digitalis, and a considerably improved knowledge of gross anatomy, the physicians practicing in the United States at the turn of the nineteenth century had hardly more at their disposal than the Greeks had had. In some ways they were worse off, for while the followers of Galen believed in one theory of disease—they thought it resulted from imbalances among four bodily “humors”—early-nineteenth-century doctors were confronted with dozens. One list drawn up as late as 1840 numbered no fewer than thirty-eight schools of thought, including German Christian Theosophists, Magnetizers, Exorcisers, and Gastricists.

Needless to say, whenever there are that many theories, none of them is likely to be much good. Indeed, partly because of this theoretical confusion, there were no standards by which to judge the qualifications of those who entered the profession. Anyone could become a “doctor,” and many did. In 1850 the United States had 40,755 people calling themselves physicians, more per capita than the country would have as recently as 1970. Few of this legion had formal medical education, and many were unabashed charlatans.

This is not to say that medical progress stood still in those years. Indeed, there was more progress than there had been in the previous two thousand years. The stethoscope was invented in 1816. The world’s first dental school opened in Baltimore in 1839. But the most important advance was the discovery of anesthesia in the 1840s. Until anesthesia, surgery was extremely limited, and the surgeon’s single most desirable attribute was speed. The most skilled prided themselves on being able to amputate a leg in less than one minute. But while anesthesia made extended operations possible, overwhelming postoperative infections killed many patients, so most surgery remained a desperate, last-ditch effort.

Another major advance of the early nineteenth century was the spread of clean water supplies in urban areas. This great “public health” measure actually at first had little to do with public health and much to do with the rising standard of living brought about by the Industrial Revolution. But with clean water piped into the cities from distant reservoirs and waste water disposed of by sewer systems, the epidemics of waterborne diseases, such as typhoid and cholera, which had ravaged cities for centuries, rapidly, if largely fortuitously, abated.

Still, for all the recent improvements in living standards, death was no stranger even in the better parts of town. In the 1850s children under five years of age accounted for more than half of all deaths in New York City, and it was common even for people in their prime to sicken and die suddenly of causes no one understood. Every day the newspapers’ obituary columns were filled with commonplace tragedies: