
The American Environment

June 2024


The Cuyahoga River died for our sins. In 1796 the Cuyahoga, which promised easy transportation from Lake Erie into the wilderness of the Ohio country, called the city of Cleveland into existence. Over the next 170 years a primitive frontier town grew into a mighty industrial city, one that stretched for miles along the banks of its seminal river.

By the mid-twentieth century, however, the river no longer served as a major artery of transportation, having been superseded by railroads and highways. Now, instead of carrying the products of civilization into the vast interior, it carried the effluent of a far more technically advanced civilization out into the lake. The once crystalline waters of the river had become turbid and rank with its new cargo of chemicals and sewage. Its once abundant wildlife had long since fled, leaving only a few carp and suckers to eke out a living in the foul sullage on its bottom, testifying thereby to the very tenacity of life itself.


Finally, late in the morning of June 22, 1969, the Cuyahoga could no longer bear the burden humankind had placed upon it. In a sort of fluvial cri de coeur, the river burst into flames.

The fire was no will-o’-the-wisp flickering over a transient oil slick. Rather, it roared five stories into the sky, reduced wooden railroad trestles to ruins, and demonstrated to the people of Cleveland and the nation as no scientific study or news report ever could that the burden being placed on the environment was reaching limits that could be crossed only at the peril of the future.

Less than a year later, on April 22, 1970, Earth Day was held, one of the most remarkable happenings in the history of democracy. Fully 10 percent of the population of the country, twenty million people, demonstrated their support for redeeming the American environment. They attended events in every state and nearly every city and county. American politics and public policy would never be the same again.

Today, nearly a quarter-century after the fire, sunlight once more sparkles off the surface of the Cuyahoga. Boaters cruise its waters for pleasure, and diners eat at riverside restaurants. Mayflies—so characteristic of a Great Lakes spring—once more dance in the air above it in their millions while their larvae provide food for at least twenty-seven species of fish that have returned to its waters.

The Cuyahoga is not pristine, and barring an alteration in human priorities and circumstances beyond anything now imagined, it will not become so. But it has changed greatly for the better and continues to improve. It is once more a living river.

The Cuyahoga and its history are a microcosm of the American environment. For the history of that environment is the story of the interaction between a constantly changing, ever-more-powerful technology and an only slowly shifting paradigm of humankind’s proper relationship with the natural world.


Human beings evolved in the Old World, a fact that more than once would have sudden and drastic consequences for the New.

The beginning of the Upper Paleolithic period was marked by a dramatic technological development as humans acquired tools and weapons that were far more sophisticated than any known before and became the most formidable hunters the world has ever known. In the Old World both our prey and our competitors, evolving alongside, quickly learned to treat the emerging biological superpower with the greatest respect, and most were able to adapt successfully. But the New World lay in innocence while human hunters perfected their newfound skills in the Old.

When the land bridge that was a temporary consequence of the last ice age allowed humans to migrate into it, the results were swift and devastating: much of the North American Pleistocene fauna went extinct. Horses, camels, mastodons, mammoths, true elephants, several species of deer, bison, and antelope, ground sloths, glyptodonts, and giant beavers vanished, as did their associated predators, such as saber-toothed cats, giant lions, and cheetahs.

It cannot be known for sure to what extent the arrival of human hunters affected this great extinction, but there is little doubt that it was an important, perhaps fundamental, factor. But the evolutionary equilibrium that had been shattered by the arrival of the superhunters eventually returned, for the human population of the New World, limited by numerous other factors besides food supply, remained low. And the surviving species quickly adapted to the new conditions.

Thus the next human culture that appeared in the New World, the Europeans, found it to possess a biological abundance and diversity of, to them, astounding proportions. But these newcomers failed almost entirely to appreciate this aspect of the New World, for hunting in their culture had been reduced to, at most, a secondary source of food.

They were heirs to the agricultural revolution that began in the Old World at the end of the last ice age. It, too, was marked by a profound leap in technology. In turn the more settled conditions of agricultural communities allowed the development of still more elaborate technologies as well as social and political organizations of unprecedented complexity. The result was what we call civilization.

But the early civilizations were acutely aware that they were small islands surrounded by vast seas of wilderness from which savage beasts, and savage men, might come at any time and wipe them out. Thus their inhabitants came to look on the wilderness as an alien place, separate and apart. Not surprisingly under these circumstances, the religions that developed in the Near East in the wake of the agricultural revolution reflected this worldview, sanctioned it, and codified it. Because it became, quite literally, Holy Writ, it persisted unquestioned for centuries.


The Book of Genesis, in fact, could hardly be more direct on the subject. “God said unto [man], Be fruitful, and multiply, and replenish [i.e., fill up] the earth, and subdue it: and have dominion over the fish of the sea, and over the fowl of the air, and over every living thing that moveth upon the earth.”

Over the next two thousand years and more, humans operating with this worldview in mind transformed the continent of Europe, and by the time they began to expand overseas, wilderness had disappeared from all but the margins of that continent.

Thus the world they encountered in North America was unlike anything they had ever seen. The greatest temperate forest in the world, teeming with life, stretched almost unbroken from the Atlantic seaboard to well west of the Mississippi. The grasslands that filled the Great Plains in the rain shadow of the Rocky Mountains also abounded with animal life as millions of bison, pronghorn antelope, elk, and white-tailed and mule deer roamed them, as did their associated predators, the wolf, the mountain lion, the bear, and the jaguar.

Farther west still, the forests of the Northwest and the deserts of the Southwest reached to the Pacific.


When the new settlers arrived, they did not see the beauty or abundance of the wilderness that greeted them. Far from it; they regarded it as barren and threatening because the ancient paradigm that dated to the dawn of civilization still molded their thinking. Thus they regarded their first task in the New World to be a re-creation of what they had known in the Old, an environment shaped by the hand of man, for man’s benefit.

But while they sought, as nearly as possible, to re-create the Europe they had left behind, converting the “remote, rocky, barren, bushy, wild-woody wilderness” into a “second England for fertilness,” there was one way in which the New World was utterly unlike the Old: it possessed an abundance of land so great that it seemed to the early settlers, and to their descendants for many generations, to verge upon the infinite. “The great happiness of my country,” wrote the Swiss-born Albert Gallatin, Jefferson’s Secretary of the Treasury, “arises from the great plenty of land.”

Because the supply seemed without end, the value placed on each unit was small. It is only common sense to husband the scarce and let the plentiful take care of itself. Caring for the land, an inescapable necessity in Europe, was simply not cost-effective here. After all, the settlers could always move on to new, rich land farther west. For three hundred years they did exactly that, with ever-increasing speed.

Americans also developed other habits in the early days that stemmed directly from the wealth of land and the scarcity of population. Today, when American archeologists investigate a site, they know that the place to look for the garbage dump is on the far side of the fence or stone wall that was nearest to the dwelling. In Europe that land was likely to belong to a neighbor; in America it was often wilderness and thus beyond the human universe. This out-of-sight-out-of-mind attitude would have no small consequences when technology increased the waste stream by orders of magnitude.


The early settlers, while they greatly altered the landscape of the Eastern seaboard, clearing whole stretches of the primeval forest and converting the land to fields, pastures, and meadows, did not greatly diminish the biological diversity. They opened up the best land for farming but left untouched the steep or rocky areas as well as, to a great extent, the wetlands and mountains. Indeed in some ways the early settlers increased the diversity by expanding habitat for such grassland species as bluebirds, groundhogs, and meadowlarks. The ecosystem as a whole remained intact.


Only in the South, where plantation agriculture became the rule in areas to which it was suited, did monocultural husbandry greatly diminish the fertility and texture of the soil. Virginia, the largest and, thanks to its tobacco exports, most powerful of the colonies, found its yields declining sharply toward the end of the eighteenth century as the best land was exploited and exhausted. Erosion became an increasing problem. As early as the 1780s Patrick Henry thought that “the greatest patriot is he who fills the most gullies.”


Meanwhile, as a new civilization was being built out of the wilderness of North America, new attitudes toward wilderness itself were emerging in Europe. The ancient paradigm that had gripped Western thinking since Genesis was beginning, partially, to shift at last.

In the seventeenth century, wilderness had been universally regarded as at best a waste, if not an evil. In the eighteenth, however, it began to be seen for the first time as a thing of beauty. Mountains came to be viewed as majestic, not just as an impediment to travel or a barrier against invasion.

In Britain the aristocracy began to lay out gardens, such as those by Capability Brown, that were highly stylized versions of nature itself, rather than the direct refutation of it that seventeenth-century gardens, like those at Versailles, had been.

Biology became a systematic science (although the word itself would enter the language only in the early nineteenth century). Linnaeus studied the relationships of plants and animals. Georges Cuvier, William Smith, and others began to examine fossils and to sense, for the first time, a history of the earth that was at variance with the account given in Genesis.

The new attitude toward wilderness soon came to this country and contributed to the growing American sense of uniqueness. James Fenimore Cooper’s novels and Thoreau’s essays displayed a love of wilderness that would have been inconceivable a century earlier.

Of course, in Europe wilderness was largely an abstraction. In America it was just down the road. At the end of the Revolution, it nowhere lay more than a few days on horseback from the Atlantic shore, and Thomas Jefferson, no mean observer, thought it would be “a thousand years” before settlement reached the Pacific.

Jefferson was wrong. He did not realize—no one could have—that a third technological revolution was just getting under way, one that would give humankind the power to transform the world far beyond anything provided by the first two. It had taken millennia to reshape the face of Europe to human ends. North America would be transformed in less than a century. But there would be a vast price to pay for this miracle.

The steam engine and its technological successors allowed energy in almost unlimited quantity to be brought to bear on any task. So forests could be cut, fields cleared, dams built, mines worked with unprecedented speed. As a result, in less than a single human lifetime an area of eastern North America larger than all Europe was deforested. Virtually uninhabited by Europeans as late as 1820, the state of Michigan by 1897 had shipped 160 billion board feet of white pine lumber, leaving less than 6 billion still standing.


But the new engines needed fuel. At first waste wood supplied much of it, and later coal and then oil. The by-products of this combustion were dumped into the atmosphere as they had always been, but now their quantity was increasing geometrically. In 1850 Americans were utilizing more than eight million horsepower, animal and mechanical. By 1900 nearly sixty-four million, almost all mechanical, were being used by what economists call prime movers.

The factory system and mechanization brought many commodities within the financial reach of millions, while new transportation systems created national markets and made economies of scale both possible and necessary. This, in turn, caused the demand for raw materials to soar. The great mineral wealth that was being discovered under the American landscape was exploited with ever-increasing speed. Again the waste products were dumped at the lowest possible cost, which meant, in effect, on the far side of the nearest stone wall.

Increasing wealth and the new technologies allowed cities to bring in fresh, clean water for their rapidly increasing populations. This water was used to flush away the dirt and sewage of human existence, but only into the nearest body of water. The quality of life in the human environment was immeasurably improved by this, as the squalor that had characterized the urban landscape since Roman times disappeared. But the quality of the nation’s waterways sharply deteriorated.

The new technology allowed us to turn more and more of the landscape to human use. The old-fashioned moldboard plow, in use since medieval times, could not deal easily with the rich, heavy soils and deep sod of the American Midwest. The steel plow invented by John Deere in 1837 quickly opened up what would become the breadbasket of the world. Wetlands could now be drained economically and made productive. Millions of acres vanished, and their vast and wholly unappreciated biological productivity vanished too.

So rapid an alteration of the landscape could only have a severe impact on the ecosystem as a whole. The loss of so much forest caused runoff to increase sharply, eroding the land and burdening the waters with silt, destroying more wetlands. Many animals’ habitats disappeared. And because the ancient biblical notion that humans had dominion over the earth still held, others vanished entirely.

The beautiful Carolina parakeet, North America’s only native parrot, proved a major agricultural pest. Because it lived in large, cohesive flocks, it made an easy target for farmers with the shotguns that the Industrial Revolution made cheap. It was extinct in the wild by the turn of the century; the last known specimen died in the Cincinnati Zoo in 1914.

Another avian casualty was the passenger pigeon, one of the great natural wonders of America, as amazing as Niagara Falls or the Grand Canyon. The passenger pigeon almost certainly existed in larger numbers than any other bird in the world. Moreover, it was concentrated in flocks of unbelievable immensity. Audubon reported one flock that took a total of three days to pass overhead and estimated that, at times, the birds flew by at the rate of three hundred million an hour.

The passenger pigeon nested in heavily forested areas in colonies that were often several miles wide and up to forty miles long, containing billions of birds. Trees within the colony each had hundreds of nests, and limbs often broke under the weight. The squabs, too heavy to fly when abandoned by their parents at the end of the nesting season, were easy prey. With railroads able to ship the fresh-killed birds to the great Eastern cities quickly, hunters slaughtered them in the millions to meet the demand.

Unfortunately it turned out that passenger pigeons needed the company of huge numbers of their fellows to stimulate breeding behavior. Once the size of the flocks fell below a certain very large minimum, the birds stopped reproducing, and the population crashed. Just as with the Carolina parakeet, the last passenger pigeon died in the Cincinnati Zoo in 1914.

The herds of the Great Plains also fell to hunters. It is estimated that upward of thirty million bison roamed the grasslands of North America in the middle of the nineteenth century. By the dawn of the twentieth, less than a thousand remained alive.



As early as the 1850s it was clear to the more thoughtful that something precious and irreplaceable was rapidly disappearing. The wilderness that had helped define the country seemed ever more remote. It was now recognized that the natural world could provide a refreshment whose need was becoming more and more keenly felt.

Urban parks, such as New York City’s incomparable Central and Prospect parks, were intended to provide the population with a taste of nature that many could now obtain no other way. But these parks were, like the aristocratic gardens created in eighteenth-century Britain, wholly man-made and no more truly natural than a sculpture is a rock outcropping.

Movements began to take hold to preserve portions of the fast-vanishing wilderness itself. As early as the 1830s the painter George Catlin put forward the idea of a wild prairie reservation, a suggestion that, alas, was not implemented before nearly all of the country’s prairie ecosystem was destroyed. But the movement took root, and in 1864 the first act of preservation was undertaken when ownership of the Yosemite Valley and a stand of sequoias was transferred from the public lands of the United States to the state of California.

In 1872 the first national park in the world was created when reports of the splendors of Yellowstone were delivered to Congress. James Bryce, British ambassador to the United States, called the national parks the best idea America ever had. Certainly they have been widely copied around the world. Today American national parks protect 47,783,680 acres, an area considerably larger than the state of Missouri.

States, too, began to set aside land to protect what was left of the wilderness. New York turned five million acres—15 percent of the state’s land area—into the Adirondack Park and Forest Preserve, to remain “forever wild.”

In the 1870s Carl Schurz, Secretary of the Interior, began moving for the preservation of federally owned forests. Born in Europe, where forests had long since become scarce and thus precious, and where forest-management techniques were far more advanced than those in this country, Schurz, along with many others, helped create a new concern for America’s fast-dwindling woodlands. By the end of Theodore Roosevelt’s Presidency, almost sixty million acres were in the forest reserve system.

Today hundreds of millions of acres in this country enjoy various levels of protection from development, and more are added every year. But while the parks and reserves created by this movement are national treasures that have greatly enriched the quality of life, their creation was predicated on the part of the ancient paradigm that still survived. That part held that the natural world and the human one were two separate and distinct places. And it was still thought that each had little effect on the other.



It was George Perkins Marsh, lawyer, businessman, newspaper editor, member of Congress, diplomat, Vermont fish commissioner, and lover and keen observer of nature, who first recognized the folly of this unexamined assumption. Growing up in Vermont, he had seen how the clear-cutting of the forests and poor farming practices had degraded the state’s environment.


In 1864 he published Man and Nature, which he expanded ten years later and published as The Earth as Modified by Human Action. Individual instances of human effect on the natural world had been noted earlier, but Marsh, like Darwin with evolution, gathered innumerable examples together and argued the general case. He decisively demonstrated that the impress of humankind on the whole world was deep, abiding, and, because it was largely unnoticed, overwhelmingly adverse. “Man is everywhere a disturbing agent,” he wrote. “Wherever he plants his foot, the harmonies of nature are turned to discords.”

Recognizing that technology, energy use, population, food production, resource exploitation, and human waste all were increasing on curves that were hyperbolic when plotted against time, he feared for the future. “It is certain,” he wrote, “that a desolation, like that which overwhelmed many once beautiful and fertile regions of Europe, awaits an important part of the territory of the United States … unless prompt measures are taken.”

Darwin’s book On the Origin of Species provoked a firestorm of controversy in the intellectual world of his time when it was published in 1859. It changed humankind’s perception of the world profoundly and immediately. But Man and Nature changed nothing. Published only five years later, it met with profound indifference, and its author sank into the undeserved oblivion of those who are out of sync with their times. As late as 1966, when the science of ecology he was instrumental in founding was already well developed, so commodious a reference work as the Encyclopaedia Britannica made no mention of him whatever.

Perhaps the difference was that Darwin’s ideas had only philosophical, religious, and scientific implications. Marsh’s ideas, on the other hand, had profound economic consequences. An America rapidly becoming the world’s foremost industrial power did not want to hear them, even though as early as 1881 the mayor of Cleveland could describe the Cuyahoga River as “an open sewer through the center of the city.”



In fact, the seeds of the country’s first great man-made ecological disaster were being planted even as Marsh wrote.

In the 1860s railroads pushed across the Great Plains and opened them up to settlement by connecting them to Eastern markets. On the high plains toward the Rockies, as hunters slaughtered bison and pronghorns by the millions, ranchers replaced them with cattle, which overgrazed the land. Then farmers began moving in.

World War I greatly increased the demand for wheat, while the tractor made plowing the tough, deep sod of the high plains a more practical proposition. The number of farms in the area east of the Rocky Mountains burgeoned in the 1920s, taking over more and more of the ranchland.

The mean annual rainfall in this area varied between ten and twenty inches, not enough for crop farming except in the best of years. But the early decades of the century happened to see many such years. Then, in the late twenties, the rains slacked off, and drought swept the plains.

This had happened hundreds of times in the past, and the plants and animals that had evolved there were adapted to it. Wheat and cattle were not. Worse, over the last few years, the sod, the deep net of grass roots that had bound the soil together, had been broken over millions of acres by the farmers with their plows. The topsoil, without which no plant can grow nor animal live, now lay exposed to the ceaseless, drying winds.

In 1933 no rain fell for months in western Kansas, and little elsewhere. The crops withered, the livestock died of thirst or starvation, and the dust, bound by neither sod nor moisture, began to blow. On November 11 a howling, rainless storm sprang up. “By mid-morning,” a reporter wrote of a farm in South Dakota, “a gale was blowing cold and black. By noon it was blacker than night, because one can see through the night and this was an opaque black. It was a wall of dirt one’s eyes could not penetrate, but it could penetrate the eyes and ears and nose. It could penetrate to the lungs until one coughed up black. …

“When the wind died and the sun shone forth again, it was on a different world. There were no fields, only sand drifting into mounds and eddies that swirled in what was now but an autumn breeze. There was no longer a section-line road fifty feet from the front door. It was obliterated. In the farmyard, fences, machinery, and trees were gone, buried. The roofs of sheds stuck out through drifts deeper than a man is tall.”


The dust of this storm, uncountable millions of tons of topsoil, darkened the skies of Chicago the following day and those of Albany, New York, the day after that. Terrible as it was, the storm proved but the first of many that ravaged the high plains in the next several years, as the drought tightened its grip and the unforgiving winds blew and blew. In the middle years of the 1930s, they laid waste thousands of square miles of what had been, just a few years earlier, a vibrant ecosystem. It was now the Dust Bowl. Upward of two hundred thousand people were forced to abandon their farms and trek westward in desperate search of the necessities of life itself.

The rains finally came again, and in the 1940s the discovery of the Ogallala aquifer, a vast reservoir of water that underlies much of the Midwest, rescued the farmers who remained. Tapped by ever-deeper wells, the aquifer is now seriously depleted. And economics is slowly rescuing the land as the price of water increases every year.

It was always marginal for farming, and so it remains. Even with many, mostly ill-conceived, federal programs, the farmers on the high plains are finding it ever harder to compete in world markets. Every year more and more farms are abandoned, and the land reverts to what in a perfect world it would never have ceased to be—shortgrass prairie.


The technological leap that had begun in Jefferson’s day only accelerated in the twentieth century. The burdens that had been placed on the environment in the nineteenth century by such things as fuel use and sewage disposal increased sharply as the population expanded and new technologies spread across the land.

The limits of the ability of the environment to cope with the load were being reached more and more often. In October 1948 a thermal inversion settled over Donora, Pennsylvania. The town is set in a natural basin and was home to much heavy industry. The layer of cold air trapped the effluent of that industry and of the cars and furnaces of the population. By the time the inversion ended, four days later, twenty people were dead and six thousand ill enough to require treatment.

To an astonishing extent—at least as viewed from today’s perspective—the people of the time accepted such happenings as the price of the Industrial Revolution that had brought them so much wealth and material comfort. A New Yorker cartoon of the day showed a woman sitting at a table set for lunch in the garden of a New York brownstone. “Hurry, darling,” she calls to her unseen husband, “your soup is getting dirty.”

New burdens were also added. The chemical industry grew quickly in this century, fueled by an explosion in knowledge. The disposal of chemicals was, as always, over the nearest stone wall: into a landfill or convenient body of water.


Agriculture became more businesslike as farms grew in size, became much more mechanized, and increasingly specialized in one or two crops. Of course, even Patrick Henry had known, two centuries earlier, that monocultural farming depletes the soil and is vulnerable to insects and other pests. But now the chemical industry could overcome this, thanks to synthetic fertilizers and pesticides.

Such chemicals as DDT were greeted as miracles of modern science when they first became available, and their use spread rapidly. In 1947 the United States produced 124,259,000 pounds of chemical pesticides. Only thirteen years later, in 1960, production was up to 637,666,000 pounds of often far more potent pesticides.

Diseases such as malaria and agricultural pests such as the boll weevil were declared on the verge of eradication. And the “control of nature,” the final realization of the dominion enjoined by Genesis, was said to be at hand. DDT and other pesticides sprayed from airplanes blanketed vast areas, to kill gypsy moths, budworms, and mosquitoes.

But there were troubling signs for the few who looked. The pesticides were nondiscriminatory; they killed all the insects they touched. Honeybees, essential for the pollination of many crops and innumerable natural plants, were often wiped out by spraying programs aimed at other insects. Beekeepers began to fight back with lawsuits. “It is a very distressful thing,” one beekeeper wrote, “to walk into a yard in May and not hear a bee buzz.”

More than two hundred new pesticides were introduced in the years following World War II. The reason was that the older ones became increasingly ineffective. Many species of insects go through numerous generations a year and can evolve very rapidly, especially when a severe pressure such as a new pesticide is applied. In a monument to the vigor with which life clings to existence, they did exactly that.

And birdwatchers noticed a troubling decline in the numbers of some species, especially the large raptors that lived at the top of the food chains. Charles Broley, a retired banker, banded bald eagles in Florida beginning in 1939 as a hobby. He usually banded about a hundred and fifty young birds a year on the stretch of coast he patrolled. Beginning in 1947, more and more nests were empty or held eggs that had failed to hatch. In 1957 he found only eight eaglets, the following year only one.

But these troubling events were scattered, knowledge of them dispersed over a huge country and many scientific disciplines. They were no match for the chemical companies. But these, it turned out, were no match for a frail middle-aged woman named Rachel Carson.

Rachel Carson was trained as a marine biologist, but she was a born writer. In 1951 her book The Sea Around Us was published with a very modest first printing. To everyone’s astonishment—most of all hers—it became a titanic bestseller that made its author famous across America. Eleven years later she published Silent Spring. It changed the world.


Again a huge bestseller, Silent Spring detailed in lucid, often poetic, and always accessible prose how pesticides were playing havoc with the air, land, and water of the country and how their uncontrolled use was doing far more harm than good. Further, it introduced millions of Americans to the concept that the natural world was an intimately interconnected web. This web, Carson made clear, included humans quite as much as every other living thing that shared planet Earth. What killed insects would, if not handled carefully, one day kill us too. George Perkins Marsh had said much the same thing a hundred years earlier. This time the people read and believed.

The ancient paradigm from the dawn of civilization, when man was frail and nature omnipotent, was dead at last. Dead with it was what had been in theory a dream and in fact a nightmare—the control of nature. It had been, Rachel Carson wrote on the last page of Silent Spring, “a phrase conceived in arrogance.”


Within a few years the public demand for action on behalf of the environment became irresistible, and it caught a complacent government by surprise. John C. Whitaker, Nixon’s cabinet secretary, later recalled that “we were totally unprepared for the tidal wave of public opinion in favor of cleaning up the environment.”

Earth Day cleared up any lingering doubts about the public’s opinion on the matter. Federal government agencies such as the Environmental Protection Agency were created, and goals and timetables for air and water quality were established. We Americans set out on a crusade to rescue the land from ourselves. In many ways we shared the fervor with which the medieval world had set out to rescue the Holy Land from the infidel.

Today, nearly a quarter-century after the crusade to the new Jerusalem of a clean environment began, there is vast progress to report. In 1959, 24.9 million tons of particulate matter—soot—were emitted into the air in the United States. By 1985 the figure had fallen to 7.2 million tons, and it drops further every year. In 1970, 28.4 million tons of sulfur oxides, a prime contributor to smog, were released by power plants and automobiles. By 1990 that had fallen to 21.2 million tons, a drop of nearly 25 percent. Carbon monoxide emissions have fallen by 40 percent since 1970, and lead has been eliminated as an additive to gasoline.

Cars being manufactured in the 1990s emit only a fifth as much pollution as those made before 1975. Thus 80 percent of all automobile pollution today is generated by just 10 percent of the cars on the road. In the next few years, as these clunkers end up on the scrap heap, automobile pollution will decrease sharply.

Already the number of days per year when the air quality is below standards in most of the country’s cities has fallen significantly, by 38 percent in the 1980s alone. Even Los Angeles, the smog capital of the country thanks to its geography and automobile-oriented infrastructure, has enjoyed a 25 percent decline in smog-alert days.


In 1960 only about 50 million Americans were served by municipal sewage plants that provided secondary or tertiary treatment. Today more than half the population is. As a result, many urban waterways are now cleaner than they have been since the early 1800s. New York used to dump the sewage of eight million people into the Hudson, Harlem, and East rivers. Today, in a development that would have stunned turn-of-the-century New Yorkers, there is an annual swimming race around Manhattan Island.

Rural rivers too have greatly benefited. Most of the Connecticut River’s four-hundred-mile length was declared “suitable only for transportation of sewage and industrial wastes” in the 1960s. Today 125 new or upgraded water treatment plants, costing $900 million, have transformed it. Fishing and swimming are now allowed almost everywhere, and wildlife such as ospreys, bald eagles, blue crabs, and salmon has returned in numbers.

The sludge that is the end product of sewage treatment was until very recently dumped in the ocean or into landfills. Now it is increasingly being sought by farmers as a cheap fertilizer and soil conditioner. New York City produces 385 tons a day, all of it once dumped beyond the continental shelf. One hundred tons of that is being used by farmers in Colorado and Arizona. Initially skeptical, fifty of those farmers recently sent New York’s mayor a letter asking for more. He’s likely to oblige. Boston sludge now fertilizes Florida citrus groves. And because sewage sludge not only fertilizes but improves soil quality, it is displacing chemical fertilizers.

As old factories reach the end of their productive lives and are replaced by new ones built under stringent controls, the non-sewage pollution of the waterways is also steadily declining. The violation rate (the percentage of tests in which pollutant levels exceed standards) for lead and cadmium has fallen to less than 1 percent. Dissolved oxygen is an important measure of a water body’s biological viability; the percentage of tests in which it was found to be below standard fell 60 percent in the 1980s.

Many bodies of water, such as Lake Erie, declared dead in the 1970s, have bounded back with the improved situation and with the help of life’s ferocious determination to go on living. The amounts of pesticides being used every year fell by more than a quarter in the 1980s, and those in use today are far less persistent and far less toxic than most of those in widespread use in the 1960s. The level of DDT present in human fatty tissue, a fair measure of its presence in the environment, was 7.95 parts per million in 1970. By 1983 it had fallen to 1.67 parts per million. Today, ten years later, no one even bothers to gather the statistic.

The land, too, has improved. In the eastern part of the United States, the area of forest land has been increasing for more than a century, as clear-cut areas have been allowed to regenerate. It will be another hundred years, at least, before they reach the climax stage, but they are on their way. And today 28 percent of all farmland is no longer plowed at all, and the percentage is growing quickly. Conservation tillage is used instead; the method sharply reduces erosion and improves soil quality while slashing costs, producing crops for as much as 30 percent less.

Programs to reduce the use of chemical fertilizers are being tried in more and more areas as farmers learn new techniques. In Iowa in 1989 and 1990 a joint EPA-state program helped farmers cut their use of nitrogen fertilizer by four hundred million pounds without sacrificing crop yields. Because agricultural fertilizers and pesticides now account for more than 65 percent of all water pollution (factories account for only 7 percent), this trend has no small implication for the future.

Wildlife is on the mend in many ways. To be sure, the number of species on the endangered list has grown sharply in the last two decades, but that is much more an artifact of increased knowledge than of a still-deteriorating situation.

Many species have rebounded sharply, thanks in some cases to protection and in others to the explosion of biological and ecological knowledge that has so marked the last twenty-five years. To give just two examples, alligators, once hunted mercilessly for their skins, are no longer on the list at all. And peregrine falcons, almost extirpated in the Eastern United States by DDT, have been with infinite care and effort put on the road to recovery. Today there is a pair nesting on the Verrazano Bridge at the entrance to New York’s Upper Bay, and there is even a pair nesting on the top of the Met Life (formerly Pan Am) building in midtown, exploiting the distinctly unendangered local pigeon population.

Nor has public interest in rescuing the environment slackened. The New York Times Index for 1960 needed less than 19 inches to list all the references to air pollution that year, and only 15 for water pollution. In 1991 the two subjects required 87 and 107 inches respectively. Local organizations monitoring local situations have multiplied across the country. Many hire professionals, such as the Hudson River Fishermen’s Association, whose “riverkeeper” patrols the Eastern seaboard’s most beautiful waterway.

And public opinion has become a powerful force. In the fall of 1992 the governor of Alaska proposed culling the number of wolves in the state in order to increase the number of moose and caribou for human hunters. It was not long before he wished he hadn’t. The state, heavily dependent on tourist dollars, was soon backpedaling furiously before the onslaught of intensely negative public reaction.

So is the American environment once more pristine? Of course not. Many pollutants have proved unexpectedly stubborn and persistent. Many businesses have resisted changing their ways. In most cities the storm and waste sewers are still one and the same, and sewage overflows in bad weather. It will take many years and billions of dollars to correct that. An unknowable number of species are still threatened by human activity.

But the nation’s water, air, land, and wildlife all are better, in many respects, than they have been in a century, and they continue to improve. To put it another way, if the task of cleaning up the American environment were a journey from Boston to Los Angeles, we would be well past the Appalachians and might even have the Mississippi in sight.

Then why is the impression so widespread that we are, at best, entering Worcester, if not actually marching backward somewhere in Maine? There are many reasons, and as so often happens, human nature lies at the root of all of them.

A first reason is that environmental bureaucrats, like all bureaucrats, want to maximize the personnel and budgets of their departments. So from their point of view, it simply makes good sense to highlight new problems and to minimize news about the old ones that have been successfully addressed. Similarly, environmental organizations live and die by fundraising. The-sky-is-falling stories are simply far more effective in getting someone to reach for a checkbook than are things-are-looking-up stories. And environmental bureaucrats and lobbyists alike know that they must struggle hard to maintain their constituencies and budgets to fight the serious problems that do persist. They fear, not without reason, that if they don’t play up the troubles that endure, they may lose the ability to address them at all—and we might lose much of what we’ve won.

A second reason is that the media have often failed to evaluate environmental stories with scientific competence and sometimes even honesty. As in fundraising, bad news sells better than good news.

As a result, tentative data have often been presented as irrefutable fact, and short-term or local trends have been extrapolated into global catastrophes. In the 1970s there were many stories about the coming ice age. Ten years later global warming was destined to extinguish civilization.

A third reason that things often seem to be getting worse here at home is extremists. Extremists are always present in great reform movements, and the goal of environmental extremists is not a clean environment but a perfect one. They are few in number, compared with the legions now dedicated to cleaning the American environment, but like many extremists, they are often gifted propagandists and they are willing to use ignoble means to further noble ends.

Consider the support given by some environmental organizations to the Delaney Clause. This law, passed in 1958, forbids processed foods to contain even the slightest residue of any pesticide shown to cause cancer in laboratory animals. The Delaney Clause made some sense in the 1950s, when our ability to detect chemicals was limited to about one part in a million and our knowledge of carcinogenesis was rudimentary at best. Today it is nothing short of ludicrous, for we can now detect chemicals in amounts of one part in a quintillion. To get some idea of what that means, here is the recipe for making a martini in the ratio of 1:1,000,000,000,000,000,000: Fill up the Great Lakes—all five of them—with gin. Add one tablespoon of vermouth, stir well, and serve.

As a result, to give just one example, propargite, a nonpersistent pesticide that controls mites on raisins, can’t be used because it has been shown to cause cancer when fed to rats in massive doses. But a human being would have to eat eleven tons of raisins a day to ingest the amount of propargite needed to induce cancer in laboratory rats. Had it been available in the 1950s, propargite’s use would have been perfectly legal because the infinitesimal residue would have been completely undetectable.

Every first-year medical student knows it is the dosage that makes the poison. Yet many environmental organizations are adamantly against any revision of the Delaney Clause for reasons that amount to nothing less than scientific know-nothingism. They are wasting time, money, and, most important, credibility on the chimera of perfection.

But time, money, and most of all credibility are precious commodities. For even if we are at the Mississippi on the journey to clean up the American environment, we still have two-thirds of the journey to go. And it will be the most difficult part.

For as we proceed, the problems will become more and more intractable, and thus more and more expensive to deal with. For instance, it was easy to get a lot of lead out of the atmosphere. We simply stopped adding it to gasoline as an antiknock agent, virtually the sole source of atmospheric lead. But getting the fertilizers and pesticides out of agricultural runoff—now far and away the greatest source of water pollution in the country—will be another matter altogether, especially if we are to keep the price of food from rising sharply.

Part of the problem is the iron law of diminishing returns. Getting, say, 90 percent of a pollutant out of the environment may be easy and relatively cheap. But the next 9 percent might cost as much as the first 90, and so might the next 0.9 percent, and so on. At some point we have to say, “That’s clean enough.” Where that point lies, in case after case, is going to have to be decided politically, and democratic politics requires give and take on all sides to work.


Another part of the problem is that, increasingly, environmental regulations have been impinging on private-property rights. In the early days, the environmental movement was largely about cleaning up the commons—the air and water that belong to us all. The rule of thumb was easy: He who pollutes—whether the factory owner or the commuter in his automobile—should bear the cost of cleaning up now and of preventing that pollution in the future. Today, however, new regulations are more likely to affect the ways in which someone can use his or her own property and thus gravely affect its economic value.

There is a genuine clash of basic rights here. One is the individual right to hold, enjoy, and profit from private property. The other is the general right to pass on to our children a healthy and self-sustaining environment.

To give just one specific example of how these rights can clash, a man in South Carolina bought beachfront property in the 1980s for $600,000. The property was worth that much because it consisted of two buildable lots. He intended to build two houses, one for himself and one to sell. But the state then changed the regulations, to protect the delicate shoreline ecosystem, and his property became unbuildable. Its value plummeted from $600,000 to perhaps $30,000.

Not surprisingly, the owner sued for the economic loss he had suffered. But the state argued that it was merely regulating in the public interest and that no compensation was due, since there had been no “taking”: the property still belonged to the owner. The property owner countered that the regulations, however valuable a public purpose they served, had indeed effected a taking, because the state had sucked the economic value out of his property, leaving him the dried husk of legal title.

This case is still in the courts, and cases like it are multiplying. A general acknowledgment of the validity of both sides’ rights and motives is necessary if difficult matters such as these are to be resolved successfully.

Still a third problem is that, increasingly, environmental issues are global issues, beyond the reach of individual sovereign states. Worse, scientists have been studying the earth as a single, interlocking ecosystem for only the last few decades. Global weather and ocean temperature data nowhere stretch back more than a hundred and fifty years and usually much less. The amount of data we possess, therefore, is often insufficient to allow for the drawing of significant conclusions. Has the recent increase in atmospheric carbon dioxide caused an increase in average temperatures, or has a normal cyclical increase in temperature caused an increase in carbon dioxide? We just don’t know the answer to that question. But billions, perhaps trillions of dollars in spending may depend on the answer.

Another issue is growth versus the environment. Many feel that economic growth and increased pollution are two sides of the same coin, that it is impossible to have the one without the other. Others feel that economic growth is the very key to cleaning up the environment because it alone can provide the wealth to do so.

Obviously, in some absolute sense, the more production of goods and services, the more waste products that must be dealt with. But if the wealth produced greatly exceeds the pollution produced, the pollution can be dealt with while standards of living continue to rise. Certainly among the world’s densely populated countries, the correlation between wealth and environmental quality is striking. People cannot worry about the problem of tomorrow’s environment if the problem of tonight’s supper looms large. It is landless peasants, more than timber barons, who are cutting down the Amazon rain forest.

So far there has been no flagging of the pace or weakening of the spirit on the crusade to a clean American environment. The commitment of the American people is firm. Doubtless it will remain firm, too, if, in the midst of the ferocious political debates sure to come, we all keep in mind the fact that honorable people can disagree about means without disagreeing about ends; that there is more than one road to the New Jerusalem; and, especially, that cleaning up the American environment is far too important to be left to bureaucrats, activists, journalists, and fanatics. This is our crusade.

