
Dixie’s Victory

March 2024

The old Confederacy got only as far north as Pennsylvania, but its great-grandchildren have captured America’s culture. Joshua Zeitz looks at sports, entertainment, and religion to show how.

About 60 years ago, in July 1942, a 35-year-old coal miner from East Kentucky named Jim Hammittee packed up his belongings and traveled with his wife to Detroit, where he found work in a roller-bearing plant. “When I first came there, we only planned to stay till the war was over and then we’s moving back South,” he later recalled. “But by the time the war’s over in 1945, we had pretty well adjusted and accepted that way of life as the way we wanted to live. So we settled down....” The Hammittees raised three children in the Detroit suburbs. Except for trips to visit friends and kin, they never returned to their native South.

Jim Hammittee and his family were part of a demographic revolution that changed America. Between 1910 and 1970 more than 11 million Southerners pulled up stakes and settled in the North, mostly in the industrial Northeast and Midwest, and in Western boom states like California and Washington. Meanwhile, another internal migration caused the almost overnight transformation of the South’s traditional rural landscape. In just two decades between 1940 and 1960, roughly 9 million Southern farmers, well over half the region’s total agricultural force, left the cotton patches of the Mississippi Delta and the wheat fields of the Southern plains for cities like Houston, Dallas, Richmond, and Atlanta.

These two migrations, from South to North and from farm to city, amounted to a major turning point in American cultural history. They’re why jazz and the blues graduated from being regional “race music” to being the stuff of PBS documentaries and college courses. They’re why white Manhattanites travel to Harlem to eat “soul food” at the landmark restaurant Sylvia’s and why the whole nation is involved in a never-ending debate about the varieties of barbecue. They’re why Southern folk music has become “American roots” music. They help explain why Wal-Mart, once just a small Southern chain, represents American consumer culture in Argentina and Brazil, China and South Korea.

In the mid-twentieth century the arrival of Southern rural traditions in the urban marketplace created a new breed of Southland culture that exploded onto the national scene. At the same time, the millions of white Southerners planting new roots in the North introduced the rest of the country to their conservative religious and political culture and to once-regional pastimes like stock car racing and country music. The consequences have been revolutionary.

Surprisingly, though, while the effects of the black Southern migration have garnered considerable academic attention, only recently have historians considered the importance of that other stream of Southern country migrants. The significance of their travels is apparent everywhere. NASCAR ranks as one of the most popular and lucrative spectator sports nationwide, claiming some 40 million fans, staging races from Chicago to Phoenix, and recently closing a $2.8 billion television deal with NBC and Fox. And recording industry executives have long ceased to laugh at what Northern sophisticates once ridiculed as “hillbilly” music. With the sole exception of rock music, annual country-music sales over the past decade have rivaled or outstripped all other genres, including pop, rap, hip hop, R&B, and jazz. At the same time, mainline Christian churches in every region find themselves steadily eclipsed by their fundamentalist and charismatic competitors, which formerly bore a distinctly Southern profile and were widely regarded as dying. More and more, it appears, Southern culture has become American culture.

Early racers were professional “trippers,” bootleggers using expensively souped-up cars to outrun revenue agents.

It is one of history’s ironies that white Southern culture never stood a chance of becoming nationally ascendant until the Old South itself became a relic. And that didn’t occur until quite recently. Although the former Confederate states were hardly immune to economic and social changes after the Civil War, the region’s fundamental social and economic landscape remained much the same in the 1920s as it had been in the 1860s. Southern boosters like the brash young journalist Henry Grady were heralding the emergence of a “New South” as early as 1886, but the historian Jack Temple Kirby reminds us that this vision was largely predicated on “hyperbole and fraud.” Well into the twentieth century “the region remained rural and poor.”

All of this changed with the coming of the New Deal and World War II. Federal subsidy programs initiated in the 1930s helped spur crop reductions and land consolidations that forced millions of tenant farmers off the land, while technological innovations in the 1940s and 1950s—chemical pesticides, fertilizers, the mechanical cotton picker—swiftly transformed Southern agriculture from a labor-intensive endeavor to a capital-intensive one. These structural developments, in addition to a massive infusion of federal defense and research dollars during World War II and the early Cold War, helped thrust the South into the modern age.

In 1920 the population of nearly every Southern state was at least two-thirds rural, a figure all the more remarkable when we remember that the U.S. Census classified towns of just 2,500 people as urban. By 1960 everything had changed. Industrial work had outstripped farming for more than a decade, and in eight Southern states a majority of citizens now lived in urban areas, where they were swiftly closing regional gaps in education and income. In short, the South finally started to resemble the rest of the country. Paradoxically, it was at just this historical moment that Southland culture matured and circulated nationally.

Why then? Pete Daniel, a leading student of that culture, has written that “having worked outdoors in harmony with the seasons most of their lives,” many white and black migrants “resented confining hourly jobs that demanded discipline and regularity. Each day they faced the whip, chair, and pistol of corporate management that punished them for displaying any residue of wildness. They chafed at punching a time clock and at other constraints that challenged their will, and they longed for escape, if not retribution.” One means of escape—and retribution—was stock car racing, a sport that had evolved from outlaw origins during the Depression into a wildly popular industry by the early 1950s.

Early racers were, strictly speaking, not stock car racers at all. They were professional “trippers,” drivers in the liquor racket who used expensively souped-up automobiles to outrun revenue agents. Trippers drove every variety of car, but they generally preferred 1939 and 1940 Fords, with easily available spare parts that could be modified for greater speed and better suspension, and with generous trunk space to accommodate their cargo. Trippers tended to be mountain folk and thus had had to master the blind curves and nonexistent shoulders of old hill-country roads. Dexterity and speed were imperative; losing the race meant prison.

Many trippers took to racing for sport and occasionally for prize money in their spare time. Some early NASCAR greats like Junior Johnson and the Flock brothers—Bob and Fonty—got their start hauling homebrewed whiskey for their parents and only gradually found themselves drawn to the racetrack. Even as his competitive racing star ascended in the 1950s, Johnson faithfully plied his talents for the family business. He was arrested in 1956, served 11 months for liquor trafficking, and returned to the track in 1959 in time to participate in a fierce NASCAR run at Charlotte, North Carolina, where he drove another contestant into a wall.

Informal car racing flourished in the 1940s, when war production gave Southerners unprecedented capital to put into cars and wagers. Tim Flock, an early NASCAR Grand National champion and younger brother to Bob and Fonty, insisted that modern stock car racing was inaugurated immediately after the war in a “cow pasture right outside Atlanta, Georgia,” where drivers competed for cash prizes that could total as much as $20,000. For the most part these early races were rowdy affairs, their fans notorious for boozing, womanizing, and supremely proficient cursing.

It took a visionary to realize the commercial potential of stock car racing in a modernizing South. William Getty France, a service station owner in Daytona Beach, Florida, and an occasional participant in so-called outlaw races, understood the appeal this raffish new sport held for country migrants newly settled into factory work. On December 14, 1947, he convened a summit of three dozen leading mechanics and racers to form a sanctioning body for stock car competitions. Thus was born the National Association for Stock Car Auto Racing (NASCAR). “Big Bill” France became the organization’s first president.

From its inception NASCAR embodied the split personality of postwar Southland culture: at once urban and market-oriented, yet still close to its rural, precommercial roots. Locked at first in tough competition with four other sanctioning organizations, France’s NASCAR sponsored standard races featuring modified cars, but it also launched a 150-mile contest at Charlotte whose participants were allowed to drive only automobiles that were “strictly stock”—that is, wholly unenhanced. The ploy was a great success with many fans, who enjoyed seeing professionals race the same cars an ordinary consumer could own. It was an even bigger hit with the car companies; they quickly recognized the marketing potential in NASCAR and invested accordingly. Keeping his eye on commercial strategy, France also enforced stern discipline. Even such driving legends as Tim Flock, Lee Petty, and Red Byron found they could pay a high price, in cumulative points lost, if they participated in unsanctioned races or otherwise bucked Big Bill’s authority.

Yet if NASCAR was big business, its success was intimately related to its self-styled reputation as a place of last refuge for Southerners who missed the wild pulse of old country ways. So NASCAR promoted its drivers as cowboys on wheels, drinkers, skirt chasers, and partygoers whose much-publicized (and often exaggerated) debauches invited spectators to behave just as raucously—or at least to dream of it. Drivers like Curtis Turner gave NASCAR fans plenty of legend to revel in. Decked out in his trademark silk suits, Turner partied with movie stars and burned through enormous sums of money—$6,000 a month, according to some sources. His fellow driver Smokey Yunick remembered spending “a lot of time with Curtis, drinking, chasing women, racing, raising hell, teaching people how to turn around in the middle of the road at 60 miles an hour, putting cars in swimming pools.”

Over the years, as its fans climbed the socioeconomic ladder, NASCAR angled for a more respectable image. In the 1950s it banned women from the pit, thus drawing to a premature close the racing careers of drivers like Sara Christian, Louise Smith, and Ethel Flock Mobley. As the organization steadily moved North, where millions of white transplants demonstrated as much enthusiasm for the sport as their Southern cousins, it began to look less like itself. By 1960 the Southern circuit had eight paved courses; in the 1970s officials introduced organized prayer services at the start of races. In time the tracks came to resemble big-city sports arenas. At Darlington, South Carolina, spectators who can afford $500 tickets are now ushered off to the Azalea Terrace; others vie for even more expensive chairs in the President’s Suite or the Fourth Turn Club. In perhaps the most conspicuous example of NASCAR’s self-reformation, drivers are now fined $10,000 for fighting.

By the 1940s “Grand Ole Opry” had become perhaps the most admired radio show in the United States.

If skeptics had any lingering doubt about NASCAR’s stature as a national pastime, the coverage of this year’s Daytona 500 put the issue to rest. On February 17, NBC bumped its coverage of the Winter Olympics in Utah in favor of the race at Daytona Beach. The decision reflected NASCAR’s immense profitability to its sponsor networks. In 2001 NBC’s stock car ratings jumped 34 percent over the previous year; among men earning $75,000 or more, they jumped a phenomenal 74 percent. NASCAR is no longer a Southern phenomenon.

Writing in the mid-1990s, Peter Applebome, a former Atlanta bureau chief for The New York Times, affirmed that country music had become “white America’s music of choice,” and Nashville, the capital of country, “the Tin Pan Alley of the nineties.” What was once a marginal form of entertainment is today the staple of more radio stations—2,600, to be precise—than any other kind of music. Seventy million Americans tune in to country and help drive what has become a two-billion-dollar industry. Country superstar Garth Brooks has sold more than 60 million records, making him second only to the Beatles in total U.S. sales.

Unlike stock car racing, country music was a highly lucrative industry as early as the 1920s, when advances in recording and radio helped capture and institutionalize the “hillbilly” sounds rural Southerners had invented, and reinvented, over the better part of two centuries. Bill C. Malone, the leading historian of country music, has written that the South prior to World War II was sufficiently conservative to encourage “the preservation of older values and institutions,” particularly a rich “folk culture.” Yet that folk culture, at least in its musical form, was always a “vigorous hybrid.”

The mix contained evangelical hymns and camp songs (first dubbed “gospel” music in 1875), African-American song styles—most notably the blues—and the outpourings of New York City’s Tin Pan Alley, whose professionals produced a variety of music wide enough to satisfy both Northern urbanites and Southern country folk. Most regional troubadours probably didn’t realize that such Southern favorites as “Old Dan Tucker,” “Listen to the Mockingbird,” and “Cotton-Eyed Joe” had been written by Northern minstrel troupes, or that sturdy American tunes like “Flop-Eared Mule,” “Fire on the Mountain,” and “Leather Breeches” had been born in Britain. Anthropologists in the 1920s were astounded to find “maverick” remnants of sixteenth-century English verse alive and fully integrated into regional music in the Southern Appalachians. In some cases whole songs, like the haunting ballad “Barbara Allen,” remained intact. Yet even these enthusiastic and knowledgeable students never appreciated the dynamic evolution of Southern country music. It was neither entirely authentic nor invented.

Thanks to technology—the phonograph and the radio—country music matured after World War I. At first record companies weren’t interested in rural sounds. But early radio producers were, and in the twenties radio fast outpaced the phonograph as a source of popular home entertainment. Between 1920—the first year of commercial broadcasting—and 1930, annual sales of radios jumped from $60 million to $842 million. At the close of the 1920s more than 12 million American households owned radio sets.

Because most stations then could reach only local audiences, their selection of music tended to be more democratic than the recording industry’s. Small stations carried local country talent from the beginning, and in 1922 the Atlanta Journal’s radio station, WSB, became the first high-power outlet to feature what Americans soon called “hillbilly music.” For the first time, millions of listeners heard authentic country talent like “Fiddlin’ John” Carson.

Following quickly on the heels of WSB’s coup, WBAP in Fort Worth invented the first-ever broadcast “barn dance,” a live country-music and talk program that proved immensely popular with Southern listeners. By the late 1920s WLS (Chicago) and WSM (Nashville) had perfected the form with “National Barn Dance” and “Grand Ole Opry,” two mainstays of American radio culture: “Barn Dance” ran for a quarter-century, and the “Opry” is with us still. By the 1940s “Grand Ole Opry” was perhaps the most admired radio show in the United States, with a cast of songsters and comedians that included Roy Acuff, Bill Monroe, Minnie Pearl, and “Uncle Dave” Macon. Like its Chicago counterpart, the “Opry” purveyed rural folk culture with an urban, commercial twist. After the 1920s that combination, more than anything else, would define the otherwise eclectic and diffuse art form that was country music.

The early success of hillbilly radio spurred the recording industry into action. While it’s not clear who the first country recording artist was, good money is on the duo of Alexander Campbell (“Eck”) Robertson, a fiddle player from Amarillo, Texas, and Henry Gilliland, of Altus, Oklahoma, who on impulse traveled together to New York in June 1922 to cut a few tracks for Victor. The seminal moment in country recording came several years later, in August 1927, when a professional talent scout and producer, Ralph Peer, discovered and recorded modern country’s first two sensations, the Carter Family and Jimmie Rodgers.

Originally comprising Alvin Pleasant (“A.P.”) Carter, his wife, Sara, and his sister-in-law Maybelle, the Carters remained popular long after their joint singing career ended in the 1940s. Together they recorded over 300 songs for various labels. Their repertoire of new and traditional material, their trademark three-part harmony, and Maybelle’s unique thumb-brush method on lead guitar gained a wide following throughout the South. Long after hillbilly music had gone mainstream and invaded the North, so-called Carter Family songs, like “Keep on the Sunny Side” and “The Wabash Cannonball,” remained standard titles in the catalogues.

While the Carters gave the fledgling industry a down-home, family aura, Jimmie Rodgers, a native of Meridian, Mississippi, cultivated a somewhat different image as country music’s “singing brakeman.” He was the genre’s first self-styled rambling balladeer. In truth Rodgers’s railroad days were short-lived: He developed tuberculosis, which, coupled with his hard-living ways, drove him to an early grave in 1933. But in his few years of productive fame, his appealing blend of old and new music, his distinctive yodeling style, and his Western affectations brought him a level of renown unprecedented among country artists.

In the 1930s and early 1940s the evolving style that the Carters and Rodgers helped create became a national sensation, urged along by two unrelated phenomena: electrification and World War II.

Musicians and guitar makers had experimented with electrifying string instruments as early as the 1920s. The electric guitar made its country-music debut in Texas in 1934, and new models manufactured by Gibson, Rickenbacker, Bigsby, and Fender soon made it widely accessible to small-time musicians. In turn, electrification helped spur the decade’s “honky-tonk” sound, as small roadside bars throughout the Southwest featured a new form of country music that was electrified and more rhythmic than the usual hillbilly fare, so that customers could dance to it. These modern features helped make country more accessible to non-Southerners; World War II accelerated the process exponentially.

Since U.S. Army training camps were disproportionately located in Southern states, millions of Northern GIs heard their first licks of hillbilly music while sojourning in Dixie. Ferlin Husky, a popular country performer during the 1950s, recalled serving in the merchant marine with “lots of boys … who had never really heard country music before, and it was interesting to see how fast they acquired a taste for it.” So fast, it seems, that by 1945 GIs stationed in Munich, Germany, were debating the relative talents of Frank Sinatra and Roy Acuff. Well into the postwar period, the armed forces would continue to anchor the nation’s country-music obsession. In 1960 some 65 percent of all country record sales took place at base PX’s.

At the same time, war production catalyzed an exodus of Southern whites, and they took their music with them. In the 1950s the ABC Music Corporation, Chicago’s largest jukebox supplier, reported that of its 12 city routes, 2 were dominated by country music. The city’s largest record store, Hudson-Ross, found that in neighborhoods where Southern transplants were heavily concentrated, 30 percent of its sales volume was country and western. The trend prompted Billboard to run headlines like “HILLBILLY TUNES GAIN POPULARITY IN BALTIMORE” and “HILLBILLY TUNES SCORE BIG HIT IN MOST DETROIT JUKES.”

The North saw the Scopes trial as a defeat for fundamentalism—but the faithful were regrouping.

Country was also making considerable headway in California, thanks to the influx of Dust Bowl migrants during the 1930s and defense-industry workers in the 1940s. By 1945 a music writer in the state’s East Bay region could report, “It hasn’t been so many years since Hillbilly and Western programs were a real scarcity out here.... Boy, OH BOY, it’s a different story now! Turn the dial just any hour of the day and you’ll get a good old time program of OUR KIND of music.” Even in so unlikely a place as New York City, the country sound was coming into its own. In 1947 “Grand Ole Opry” staged a two-night performance at Carnegie Hall and grossed $9,000.

Already enjoying a national profile, country music continued to evolve in the 1950s and 1960s in much the same way it had originally ambled onto the airwaves and 78s in the 1920s: by melding tradition and commercialism. As home to the “Opry,” Nashville attracted considerable recording talent. In the 1950s the city gave birth to what was termed the “Nashville sound” or the Chet Atkins Compromise, a highly electrified pop-country blend, made wildly popular by rising talents like Atkins, Elvis Presley, Johnny Cash, Jim Reeves, and Patsy Cline, the cowgirl sensation who always felt most comfortable with country music and never quite reconciled herself to performing pop hits like “Walking After Midnight,” “I Fall to Pieces,” and “Crazy.”

The 1960s and 1970s saw this formula tempered but essentially left intact as country music—now electrified and rhythmic and spread to the North and West—set out to conquer television. Two of the first and most successful experiments in country television were “The Wilburn Brothers Show,” which helped propel Loretta Lynn to stardom, and “The Porter Wagoner Show,” which in 1967 provided a national stage for a young, blonde country-pop hopeful named Dolly Parton. As always, the music was in flux, but the country style remained so distinct and recognizable that many listeners confused new creations like “Tennessee Stud,” “The Long Black Veil,” and “The Battle of New Orleans” with traditional folk music.

Since the 1970s country stars from Willie Nelson, Kris Kristofferson, and Kenny Rogers to Garth Brooks and Lyle Lovett have continued to merge old and new aesthetics in a way that appeals to an immense national audience. Like NASCAR, country music triumphed at the moment when the South itself began to modernize demographically and economically, so its ascendance is a mark of both Dixie’s triumph and its metamorphosis.

The South’s cultural conquest has also had powerful religious implications. In his classic 1931 work Only Yesterday: An Informal History of the Nineteen-Twenties, Frederick Lewis Allen recalled the famous 1925 Scopes monkey trial as a turning point for American religion. “Theoretically, Fundamentalism had won,” Allen observed, noting that a local court in Dayton, Tennessee, had essentially upheld a state law enjoining public schools from teaching Charles Darwin’s theory of evolution. “Yet really Fundamentalism had lost... and the slow drift away from Fundamentalism certainly continued.” Allen’s book sold more than a million copies and became “the font at which most subsequent writers about the decade initially drank,” according to the historian Roderick Nash. Echoing Allen’s insight on the topic, scholars and journalists like Mark Sullivan, author of the 1935 bestseller Our Times: The United States, 1900-1925, concluded that the “Scopes trial marked the end of the age of Amen and the beginning of the age of Oh Yeah!” Allen and Sullivan would have been astonished by what the next 70 years would reveal. The evangelical Christians hadn’t been defeated; they were simply regrouping. From their base in the South they have brought conservative Christianity to the rest of the country.

Over the past several decades, mainline Christian groups—most notably those affiliated with the United Methodist Church, the Presbyterian Church (U.S.A.), the Episcopal Church, the Evangelical Lutheran Church in America, and the United Church of Christ—have seen their memberships fall off precipitously, while evangelical, charismatic, and fundamentalist sects have emerged as an almost equal force within America’s splintered Christian community. Today roughly 50 million Americans are affiliated with mainline churches, while 45 million others identify themselves as evangelicals. The battle that began in a small Tennessee courtroom is far from over.

All evangelicals share a commitment to the doctrine of salvation, to personal conversion experiences, to the authority of Scripture, and to missionary work to spread the gospel. Fundamentalism first emerged at the turn of the last century as a particular strain of evangelicalism. It was a reaction to liberalism, particularly in the leading Christian denominations, which were trying to reconcile the Bible with modern science and social activism.

Fundamentalists championed biblical inerrancy and the literal reading of Scripture. They also believed in dispensational premillennialism, a doctrine that prophesied Christ’s imminent return to earth. They were deeply scornful of their liberal counterparts, particularly adherents of the activist Social Gospel movement, who tended to believe in postmillennialism, the idea that Christ’s Second Coming would only follow a 1,000-year era of peace. They also argued incessantly with Pentecostals, another group of conservative evangelicals who shared many of the fundamentalists’ convictions but who also believed that the Holy Spirit could bestow special gifts upon the saved—for instance, glossolalia, the ability to speak in tongues.

Though most nonevangelicals see little distinction among the various conservative factions, a wide gulf persists between Pentecostals and other “charismatic” sects, on one hand, and strict fundamentalists, on the other. This is the source of the lasting animosity between two modern-day leaders of the Christian right—Pat Robertson, who professes to speak in tongues, and Jerry Falwell, who once joked that glossolalia was just a symptom of indigestion.

In the early 1900s conservative theologians in the South pretty much ruled the Baptist, Pentecostal, and Presbyterian denominations in their region. But in the North there was a real fight, with the leading churches split almost evenly between liberals and fundamentalists. Greatly concerned after World War I by the apparent harbingers of moral and cultural decay—labor strife, loosening sexual mores, modernist art and literature—the conservatives went on the offensive. In 1919 they formed the World Christian Fundamentals Association and, under the direction of leaders like William Bell Riley of Minneapolis and J. Gresham Machen of Princeton, undertook to purge liberals from the Northern churches.

The battle came to a head in 1925, when a group of local boosters in Dayton, Tennessee, persuaded a young high school science teacher, John Scopes, to violate the state’s anti-evolution law. They merely wanted to draw attention to their economically depressed crossroads town. Instead, what followed was a sensational trial that pitted the famous lawyer Clarence Darrow, a committed civil libertarian and an almost fanatical atheist, against William Jennings Bryan, the famously eloquent former congressman from Nebraska who had thrice failed to attain the Presidency but who remained a hero to the rural, fundamentalist South. The trial’s climax came when Darrow called his adversary to the stand as a biblical expert and Bryan admitted that some Scriptural language might be more allegorical than literal.

Much of the postwar conservative movement’s strength grew out of this new infusion of fundamentalism.

Although the trial was technically a win for the prosecution, Northern liberals declared it a great victory for their cause. Bryan, they said, had unintentionally exposed fundamentalism for the inchoate drivel it was, while Darrow had established the modern North’s supremacy over the backward, hyper-religious South. Fundamentalism receded in the Northern denominations and became essentially a regional phenomenon. But the conservatives were far from licked. In the decades following the trial they chartered missions, publishing houses, and radio stations; they founded 70 Bible institutes nationwide, not only Bryan College in Dayton but also Moody Bible Institute in Chicago and Riley’s Northwestern Bible Training School in Minneapolis. In the 1940s they began to reappear in public life.

In the West, areas like Orange County, California, which had long been home to pockets of Pentecostal and fundamentalist believers, witnessed an explosion of conservative Christian activity during the massive influx of Southern migrants that began with World War II. In Northern California’s East Bay area, mainline churches formed an umbrella group, the United Ministry, to organize religious worship for tens of thousands of new defense workers, almost a quarter of whom came from Dixie. The United Ministry found its efforts largely ignored; only about 2 percent of migrant families attended the services. Instead, new defense workers set up their own evangelical and fundamentalist churches, many of them in storefronts or private homes. By 1945 area residents could listen to radio sermons by C. L. Hunter of Texas, the “Cowboy Evangelist,” or Bebe Patten of Tennessee, the “Girl Evangelist.”

In the Midwest many Southern transplants also found their new local Baptist churches at once unsatisfyingly staid and yet liberal in the interpretation of the Bible, and so they began establishing their own fundamentalist churches. Although in 1894 the (liberal) Northern Baptist and (fundamentalist) Southern Baptist Conventions had signed the Fortress Monroe agreement, a mutual pledge not to step across the Mason-Dixon line, by the late 1940s the Northern Baptists found themselves on the defensive. So many Dixieland transplants were organizing fundamentalist churches that in 1949 the SBC voted to abrogate Fortress Monroe. In response, the NBC became ABC: The Northern Baptist Convention changed its name to the American Baptist Convention and pledged retaliation. But the momentum was with the South. In the 1950s and 1960s the Southern Baptists were the swiftest-growing denomination in Ohio. By 1971 they had registered 169 churches in Michigan, 230 in Indiana, 380 in Ohio, and 893 in Illinois. In Ohio and Michigan the American Baptists and Southern Baptists enjoyed almost equal numbers.

The rise of evangelical Christianity in the North and West has reinvigorated politics. On the whole, Pentecostals and fundamentalists assume more conservative positions on social issues, like abortion and the separation of church and state, than do mainline Protestants, Catholics, and Jews. Scholars have recently suggested that much of the postwar conservative movement’s strength, particularly in California, grew out of this relatively new infusion of fundamentalism. It is precisely this religious migration that accounts for the rise of Rep. Gary Condit, a socially conservative Democrat, born in Oklahoma, the son of a Baptist minister, a transplant to the West Coast, and once a local favorite in his heavily evangelical district.

The importance of the great white migration has not gone unnoticed, though often enough liberal writers like John Egerton, the Nashville author of The Americanization of Dixie: The Southernization of America, have been quick to castigate Southern migrants for “exporting” the “vices” of their region “without importing values.” Or, as Peter Applebome argued more recently, “at a time when a Democratic president like Bill Clinton is coming out for school prayer, going along with sweeping Republican legislation shredding welfare and taking his cues from a consultant, Dick Morris, who formerly worked for Southern archconservatives like Jesse Helms and Trent Lott, when race is a fractious national obsession … when the Supreme Court is acting as if Jefferson Davis were chief justice … in times such as these, to understand America, you have to understand the South.”

Despite his aggrieved tone, Applebome’s point is sound: Southern culture today enjoys far more national influence than it has at any time since a Virginian was given command of the Continental Army.

In 1971, in the first stages of a short-lived country-music phase, the folk singer Joan Baez helped popularize Robbie Robertson’s wistful composition “The Night They Drove Old Dixie Down.” The song, first recorded by Robertson’s own group, the Band, has since been recorded by many others, including Bob Dylan. Its title is revealing enough, but the final verse, uttered by the former Confederate Virgil Caine, is more poignant still: “Like my father before me I will work the land/Like my brother above me who took a rebel stand/He was just 18, proud and brave, but a Yankee laid him in his grave/I swear by the mud below my feet, you can’t raise a Caine back up when he’s in defeat.” In truth the Virgil Caines of the world stopped working the land several decades ago. They moved to Nashville, Atlanta, Chicago and Los Angeles. They went to work in factories and offices. They took their culture, their music, and their religion with them, and they have changed America.
