London Bridge Is Falling Down


Cities and states began passing laws requiring inspections, but no one was exactly sure what would be safe, and the regulations couldn’t be enforced across state lines anyway, so the deaths continued. In 1830 the Franklin Institute, recently formed in Philadelphia to help engineers better understand their technologies, began a concerted effort to determine the causes of the explosions, and after a particularly gruesome one near Memphis that year, Congress began backing its work. Seven years later, the institute produced a report that revealed major discoveries about how water, steam, and the various metals used in boilers interacted, along with strong recommendations for setting standards for manufacturing, maintaining, and inspecting boilers.

Congress ignored the whole thing. Daniel Webster captured the general sense of the House when he quoted one congressman as saying, “Let the Government attend to its own business, and let the people attend to theirs.” Then a crisis—an especially deadly one—finally forced action. On April 25, 1838, the steamboat Moselle blasted apart just above Cincinnati and killed perhaps 200 passengers, the worst carnage yet. Shortly thereafter, the Steamboat Act of 1838 was passed into law.

The act did more than just make steamboats safe, though that was an enormous accomplishment in itself. That law paved the way for the later establishment of the Food and Drug Administration, the Federal Aviation Administration, and all the other government regulatory and investigative agencies that work to protect us today.

As our technologies have grown, and our expectations for keeping them safe have grown with them, we have become increasingly responsive to the hazards that lurk within them once they clearly manifest themselves, and increasingly adept at regulating them. While it still usually takes a crisis to initiate real action, those crises have become more and more effective in prompting reform, and far less costly in lives than the long toll that preceded the Steamboat Act.

DDT was widely accepted as little less than a miracle in the 1940s and 1950s, but once Rachel Carson showed the world the environmental havoc it was wreaking, nothing like it could happen again. There will be no more miracle pesticides. Three Mile Island should never have happened, but once it did, it ensured that anything like it would be unlikely again, and it did so without costing a single life. The great blackouts of 1965 and 1977 revealed an international power grid collapsing under its own weight, and they led to reforms and redundancies in those systems that have prevented anything so crippling from happening since.

The price of living in our advanced, dauntingly complex technological world is not only eternal vigilance but also periodic crisis. This is how it must be. Nobody can foresee the consequences of our technologies any more than anyone can foresee the consequences of any of our complex human activities. All the works of man, from political systems and economic structures to power systems and computer networks, can seem to become monsters beyond our control. They are bound at times to appear to run amok, or to defeat themselves, or to completely break down. And then our alarm at the catastrophe is bound to make us fix them.

It is good that they scare us so, for people will never stop courting the hazards of unexplored technical realms. People have always been and will always be moved by the same spirit that guided (or misguided) Lord Foster, the architect of the Millennium Bridge. At the height of that crisis, he asked, undoubtedly with a note of defensiveness in his voice, “Can you ever be overambitious?” His answer: “I would rather be accused of being overambitious than of being lily-livered and retreating into a nostalgic past that never existed.”