How America’s Health Care Fell Ill
As modern medicine has grown ever more powerful, our ways of providing it and paying for it have gotten ever more wasteful, unaffordable, and unfair. An explanation and a possible first step toward a solution.
May/June 1992 | Volume 43, Issue 3
Did the hospitals shrink in size as a result? Not nearly enough. Because of the cost-plus way hospitals are paid, they don’t compete for patients by means of price, which would have forced them to retrench and specialize. Instead they compete for doctor referrals, and doctors want lots of empty beds to ensure immediate admission and lots of fancy equipment, even if the hospital just down the block has exactly the same equipment. The inevitable result, of course, is that hospital costs on a per-patient-per-day basis have soared.
Doctors, meanwhile, came to be reimbursed for their services according to what were regarded as “reasonable and customary” charges. In other words, doctors could bill whatever they felt like as long as others were charging roughly the same. The incentive to tack a few dollars onto the fee became strong. The incentive to take a few dollars off, in order to increase what in crasser circumstances is called market share, ceased to exist. As more and more Americans came to be covered by health insurance, doctors could no longer even compete with one another for patients on the basis of price, let alone had any incentive to.
The third dislocation that lay hidden in the early hospital plans was that they paid off for illness but not to maintain health. Imagine an automobile insurance company writing a policy on a very expensive car, guaranteeing to pay for any needed repairs by any mechanic of the owner’s choice, regardless of cost, but not requiring the owner to have the car regularly maintained, inspected, or used in a prudent manner. If the owner never changed the oil or checked the radiator hoses, if he added dubious substances to the fuel—or to himself—to improve his immediate driving pleasure, the company would have to pay for the eventual results. Needless to say, such a company would be out of business in short order. But that is precisely the way many health insurance policies are written even today. The result in today’s high-tech, high-capacity, high-expense medical world is economic lunacy.
The cost of bringing a single seriously premature baby out of danger, for instance, would pay for the prenatal care of thousands, sometimes tens of thousands of babies and would prevent many such premature births. But many poor parents cannot get prenatal care, so society often has to spend as much as a quarter of a million dollars to rescue a child from a tragedy that in many cases could have been prevented for one-tenth of one percent of that sum.
Most company health plans in this country charge no more to insure smokers and heavy drinkers than to insure their more sensible co-workers. Couch potatoes are covered at the same rates as their regularly exercising friends. And because the system is weighted heavily in favor of acute care, acute care has become where the money is and thus where doctors tend to concentrate. Surgeons and surgical subspecialists earn six or seven times as much as primary-care physicians (the family doctor, in other words) in this country, even allowing for the difference in skills and training. It is not hard to see why the United States has more surgeons, per capita, than any other country in the world and, no surprise, more surgery.
During World War II there arose another feature of the American health-care system with large financial implications for the future: employer-paid health insurance. With twelve million working-age men in the armed forces during the war years, the American labor market was tight in the extreme. But wartime wage and price controls prevented companies from competing for the available talent by means of increased salaries. They had to compete with fringe benefits instead, and free health insurance was tailor-made for this purpose.
The IRS ruled that the cost of employee health-care insurance was a tax-deductible business expense, and in 1948 the National Labor Relations Board ruled that health benefits were subject to collective bargaining. Companies had no choice but to negotiate with unions about them, and unions fought hard to get them.
Businesses could now pass on a considerable portion of the cost of health insurance to the government via the tax deduction. (The deduction currently costs the federal government about $48 billion a year.) But a private individual buying his or her own policy could not. Thus a powerful incentive for employer-purchased health care was built into the system. By 1950 as many as 54.5 million Americans were covered by employer-provided health plans out of a population of 150 million.
But the company plan increased the distance between the consumer of medical care and the purchaser of medical care by one more layer. When individuals have to pay for their own health insurance, they at least have an incentive to buy the most cost-effective coverage available, given their particular circumstances. But beginning in the 1940s a rapidly increasing number of Americans had no rational choice but to take whatever health-care plan their employers chose to provide.