America's Changing Economic Landscape

Is the decline in the industrial belt a step into perilous new territory, or is it merely a continuation of the ceaseless transformation that built our prosperity?

BY JAMES FALLOWS

ON THE LAST DAY OF 1980, when Jimmy Carter had been defeated and the question of further political damage to him was moot, a commission that Carter had appointed, called the President’s Commission for a National Agenda for the Eighties, issued a report, Urban America in the Eighties: Perspectives and Prospects. It was written in diplomatic, understated prose, which may initially have obscured the import of its message. Its meaning did not long remain concealed, however, for within days the report was taken in the northern half of the country as a declaration of war.

The clues to why this should be so lay in the report’s introduction, which described economic life as ceaseless, churning change. Industries rose and fell—and so did empires, and so did cities. Nothing endured forever, and in the real world of economics it was vain to hope that anything would. “An increasingly productive economy should be recognized as necessitating simultaneous painful growth and shrinkage, disinvestment and reinvestment,” the report said. “Disinvestment” was another way of saying that industries might die and cities fade. The shape of America’s cities reflected the painful growths and shrinkages that had already occurred, as immigrants arrived from the hinterland and from overseas, emigrants departed for the frontier, new classes rose and built temples to themselves, and other classes fell.

Yet somehow, the report mused, “the city is perceived as something that should be largely permanent and unchanging, reflecting continuity and stability, promising to be a lasting monument to society’s achievements and failures, and serving as a testimony to the success with which people have hammered out relations among themselves.” This vision of a city might seem appealing, the report concluded, but it was naive and unrealistic, and in the United States it had never been accurate.

And it could not be accurate now, according to the report, because the American economy was undergoing yet another transformation, one that was sure to make the nation’s older cities more suitable for some functions and less for others. In particular, the big, broad-shouldered places from which America’s twentieth-century industrial wealth had been wrung—Chicago, Cleveland, Pittsburgh, Detroit—could not expect to survive in their current form. There had been much talk about “revitalizing” these cities, through means as genteel as the movement of young professionals into marginal neighborhoods or as direct and unrefined as large redevelopment projects. But if revitalization “is defined as the attempt to restore our older industrial cities and regions to the influential positions that they have held throughout the industrial era, urban revitalization shall surely fail.”

In the face of such a certainty, the report argued, the government should stop trying to ward off the inevitable by subsidizing specific cities or regions. After all, its obligations were to people, not to the places where they happened to live, and people had often advanced by moving someplace else. The postwar growth of California had done more for the people of Appalachia and the Midwest than had any targeted-assistance program. The government could best discharge its obligations “by removing barriers to mobility that prevent people from migrating to locations of economic opportunity, and by providing migration assistance to those who wish and need it.”

In other words, as the report was soon translated in the daily press, the President’s commission was recommending that Americans vote with their feet. FEDS TO NORTHEAST: DROP DEAD, said the headline in the New York Daily News.

LIKE OTHER BITS OF CARTER-ERA DETRITUS, URBAN America in the Eighties soon left the news, but the report amounted to the first shot in the economic battle of this decade. The battle is about the way in which the United States should view the tumultuous economic change that seems to lie ahead. Should it rest secure in the faith that despite the disruptions and human friction we may endure, the shift away from heavy industry and the heavy-industrial belt is just the latest twist on the same process that created America’s wealth? Or might this transition be something quite different? Could it be a step into unknown and perilous territory, where we will find less equality, fewer jobs, more suffering, and less of the spiritual and material glue that has helped the nation cohere?

The political battle lines are being drawn between those who think the economic future is promising and those who see lowering signs. Within the Republican Party the pizzazz now seems to belong to a group of congressmen who are urging the United States toward their vision of a “Conservative Opportunity Society.” These COS politicians, the most visible of whom are Representatives Jack Kemp, of New York, and Newt Gingrich, of Georgia, express an unmixed optimism that economic benefits lie ahead. The only difficulty on the road to tomorrow is keeping the federal government and its tax code out of the way. The COS spokesmen claim that they can win over even such bedrock Democratic groups as black voters, once they make clear that they are talking about opportunity for everyone.

Within the Democratic Party there is a cleavage over the changing economy that is leading to an ugly fight. One group of politicians, mainly from places other than the Midwest and the Northeast, has been recommending that the United States get moving toward a higher-tech economy. They see a big role for the government in that transition; it should be encouraging research and development, retraining workers, targeting its incentives, and generally seeing that things proceed on course. As incarnated in Gary Hart, this movement lost the 1984 nomination to Walter Mondale and the tradition that the political analyst Kevin Phillips has cruelly but precisely called “reactionary liberalism.”

The label is less derogatory than it may sound, for it captures the challenge that many liberals feel they now face: the defense of noble values that have come under unprecedented assault. Since the New Deal they and their forebears have helped build a healthy labor movement, raise the working-class wage, create a social “safety net,” and establish the Democratic Party as the guarantor of such benefits. From the party’s base to the unions’ survival, every one of those achievements is now threatened by the industrial and regional shift. These liberals see the rise of foreign competition as another way of cutting the union wage; they know that the growing prominence of the Sun Belt means electoral trouble for the Democrats. Can it be a surprise that they speak in “reactionary” terms, seeking to preserve what they have built? Whereas Urban America urged the government to help people hit the road, a recent book called Rebuilding America, by Gar Alperovitz and Jeff Faux, exemplified the liberal emphasis on community stability: “The sensible planning answer clearly is to shift the money and physical capital involved in making cars to making something else—in Detroit.”

Even though Walter Mondale was trounced by Ronald Reagan, his wing of the Democratic Party heavily influences national discussion of economic change. The conservatives may have George Gilder, but the liberals have Barry Bluestone and Bennett Harrison, whose book The Deindustrialization of America out-facts Gilder by about a hundred to one in its grim recitation of jobs lost, factories closed, and hopes destroyed.

The outcome of this argument over “deindustrialization” matters, because societies find ways of deflecting the things that frighten them. When they see change as a threat, they manage to retard it. The government can be pressured into taking over businesses that the market would let die, as happened with the English steel and coal and auto industries before Margaret Thatcher came to power. It can subsidize people to stay where they “should,” as France has done with its numerous farmers and Italy has attempted to do with its southerners. When, on the other hand, societies are indifferent to the costs of economic change, they glorify their Gradgrinds and Bounderbys, who may have risen largely through luck, and sneer at those who fall by the way.

Should the United States fear the change that is under way? The strongest evidence that it should might be found in South Chicago.

ALONG THE SOUTH SHORE OF LAKE MICHIGAN, IN A thirty-five-mile crescent that starts on the east in Burns Harbor, Indiana, runs through Gary and Hammond, and stretches to the southern edges of Chicago, lies the densest concentration of steel mills in the world. The neighborhoods at the western end of the crescent, barely visible to drivers who zoom southward from Chicago on the Chicago Skyway, have names like Irondale, Millgate, Slag Valley. They consist of bungalows and two- or three-deckers, built right up against the factory walls. Apart from the mills the biggest structures in South Chicago are St. Michael’s Cathedral, which was briefly in contention to be the seat of the Archdiocese of Chicago, and the United Steelworkers union hall.

The first settlers in South Chicago were a mixture of English, Irish, Germans, and Swedes, but around the turn of the century new groups came. The Poles arrived during the “new immigration” of eighty years ago, and they still predominate around many of the steel mills. Mexican immigrants and blacks from the South came shortly afterward, drawn by the labor shortages of the First World War and efforts to break the nationwide steel strike of 1919.

They found themselves on a battlefield of class war and naked strife. The Communist Party was strong in Chicago early in the century; the city was the organizing center of the 1919 strike, when 400,000 workers left the mills for four months but finally were beaten by the companies. Even more than 1919, 1937 seems to count as Year Zero in the steel communities, for on Memorial Day of that year demonstrators marching in front of Republic Steel were gunned down by the police. Ten were killed and eighty more were injured. Nearly fifty years later, at a time when young Americans are said to know almost nothing about Watergate or Vietnam, let alone events pre-dating the Second World War, steelworkers in their thirties made sure I saw the exact points where the shots were fired and the Republic Steel martyrs fell.

The biggest mill now operating along the lake is Inland Steel’s plant in East Chicago, Indiana, with some 22,000 employees. Bethlehem, Interlake, U.S. Steel, and LTV are all still making steel, though with smaller work forces than they employed ten years ago. Some of the factories, especially those that have diversified into high-value specialty steel, are getting along. But two important plants have closed, and (although no one wants to hear this in South Chicago) they are probably gone forever. They are the Wisconsin Steel Works, so named even though it is in Illinois, and the famous South Works of U.S. Steel. Their histories help explain why “displaced workers” feel as if the rules of life have suddenly and unfairly been changed.

Until 1977 Wisconsin Steel was owned by the International Harvester Company, and much of Wisconsin’s steel went straight into IH tractors and combines. For most of its existence it had been a profitable, docile plant putting out a high-quality product, but by the mid-1970s it was costing Harvester money. In 1976 it lost almost $20 million. Why was the plant failing? The reasons were the familiar combination that has affected so many of America’s heavy industries. The steel market had become truly worldwide, and American companies had to compete against low-wage operations overseas, many of which enjoyed government subsidies; the Japanese steadily modernized their mills, and most American companies failed to; new labor contracts sent American wages soaring, at just the moment when the industry was becoming intrinsically less competitive. And, in Harvester’s case, a management team that eventually drove the entire parent company to the brink of bankruptcy made strategic errors.

The solution to the Wisconsin Steel problem, from Harvester’s point of view, was to “sell” the mill to Envirodyne, a tiny California-based engineering firm whose total sales were only a small fraction of Wisconsin Steel’s. In a transaction that makes sense only when seen through the prism of the U.S. tax laws, Harvester lent Envirodyne $50 million of Wisconsin Steel’s $65 million purchase price; the rest came from the Chase Manhattan Bank. All that Harvester asked in exchange was a lien on the Wisconsin Steel Works and related iron and coal mines.

Envirodyne was led by peppy young executives who were frequently profiled in business magazines. Everyone may have hoped that with such new management Wisconsin Steel could be turned around. But in 1979 it lost $44 million. Meanwhile, a federal loan, intended to finance the renovation of the steelmaking equipment, was stalled for a year inside the government—and just after it came through, the fatal blow fell. In November of 1979 the United Auto Workers struck Harvester. For the next five months Wisconsin Steel was without its most important customer. Seeing the drift of things, Harvester foreclosed on its loan to Envirodyne and sent Wisconsin Steel into bankruptcy. On March 28, 1980, a Friday and a payday, the 3,400 steelworkers learned that the mill had closed.

Only the night before, some union members had heard from company officials that Wisconsin Steel was secure. Chase Manhattan apparently knew otherwise, for it froze the company’s assets so quickly that the workers’ final paychecks bounced. Their medical insurance also ended as of that day; they received no severance pay and none of the “supplemental unemployment benefits” that have become so important a part of labor contracts in heavy industry.

The Wisconsin Steel transaction had been a big net plus for International Harvester: R. C. Longworth, of the Chicago Tribune, calculated that the sale had saved Harvester $130 million, through the operating losses, pension obligations, and anti-pollution investments it had been spared. Envirodyne escaped mortal harm, since it had set up a shell corporation to protect it from liability for some $62 million in unfunded pension obligations. The reputations of some of its officials were damaged, but its original assets remained intact. For the Wisconsin Steel employees, however, it was the end of the world. Not only were they out of work but the accompanying pensions and other benefits, which become the motivating focus for those who spend more than a few years in factory life, seemed to have disappeared. (Eventually the federal government joined the list of losers, since the Pension Benefit Guaranty Corporation picked up some of the pension obligations. The Economic Development Administration, which had guaranteed some of Envirodyne’s loans, wound up holding title to the mill.)

The steelworkers responded with a performance as plucky as it was pathetic. Wisconsin Steel had long been unionized, but not by the United Steelworkers. Its workers were represented by an independent union known as the Progressive Steelworkers. A few weeks after the shutdown the Progressive Steelworkers held a meeting to assure employees that everything would be all right. For years the union had paid a $35,000 annual retainer to Edward Vrdolyak, the Chicago alderman who is king of the South Chicago wards. According to a newspaper report, Vrdolyak, in his role as the union’s counsel, breezed into the meeting “sporting a fresh Florida tan” and told the workers that he would be taking care of them. Mayor Jane Byrne said that she would be doing likewise—and since she was at that time stumping for Jimmy Carter in the Illinois primary, everyone got the point that the federal government would be there to help if Chicago voted right. Just before the 1980 election Byrne appeared at another meeting. Straight from a talk with Jimmy Carter, she told the workers that everything was set and the plant would be open soon. “There’ll be turkey on the table for Thanksgiving,” she said to rousing cheers. Whether or not Carter would have done anything if re-elected, he was beaten, and the steelworkers began to look less to Byrne and Vrdolyak and more to Frank Lumpkin.

Lumpkin had come to Wisconsin Steel in 1950, thirty years before the mill closed. He was born in Georgia in 1916, one of ten children in a family of farm workers that migrated to Florida in 1922. He headed north in search of work during the Depression, started in steel in Buffalo in 1938, spent time as a prizefighter, and came to Chicago after service in the Merchant Marine during the war. At Wisconsin he started as a “chipper,” removing defects from steel with an air hammer, and moved through a variety of jobs; he was a skilled millwright in the Wisconsin Steel tool shop when the mill closed down.

Lumpkin is by nature an organizer, and he had tried to establish a more aggressive alternative to the company union throughout his time at Wisconsin Steel. Within weeks of the shutdown he had used his previous base to put together the Save Our Jobs Committee. For the five years since then he and his supporters have tried to recover some of what they lost when the mill went down.

One blustery day last March, for example, Lumpkin and some forty members of Save Our Jobs rode in a yellow ex-school bus from the blighted streets of South Chicago to International Harvester’s headquarters, in the Loop. The Harvester building is an imposing bronze-colored skyscraper, which sits just off the lake. The group had come to picket in front of it. As the marchers circled, a freezing wind blasted from the north, tearing the placards—“Jobs or Income Now!” “Wisconsin Steelworkers Need Pensions”—from their hands or turning them upside down. About ten of the marchers were white and the rest Mexican-American or, like Lumpkin, black. They wore nylon windbreakers and working jackets—no overcoats—and knit caps. They walked for an hour at lunchtime, as professional men and women in topcoats or down-filled jackets scurried in and out of the building. They attracted a moment’s attention from a television crew; Lumpkin and the group’s young lawyer, Thomas Geoghegan, said a few words into the camera about their suit to recover lost pension rights and their hope that the mill would reopen. Then the marchers climbed into the bus and rode back home.

ON THE BUS, OR IN THE decrepit Save Our Jobs office, located over a South Side restaurant, or in the parlors of their houses, the Wisconsin Steel workers tell the same story time and again. They think someone should be able to help them get the plant reopened, but no one wants to try. Thomas Geoghegan has filed an enormously complex lawsuit against Harvester, alleging that it cannot escape its obligation for $15 million to $20 million in pension rights and other benefits that accrued while it owned Wisconsin Steel. (Both the company and the former employees filed motions for summary judgment in the summer of 1983; by the end of 1984 the judge still had not ruled on whether the case would come to trial.) The steelworkers keep asking why Harvester and Envirodyne and now the federal government should just walk away from the plant, when its modernized Number Six Mill is perfectly capable of making steel, and when steel is needed to repair so many bridges and buildings. They see bridges corroding and work that needs to be done all around them on the South Side. Why can’t they go back to the mill and make the steel their community needs? The government already owns the plant: why doesn’t it just start it up again? As time has gone on and no one has answered these questions, the Save Our Jobs Committee has become more modest in its goals. Its ambition now is to keep a demolition company from dismantling Number Six Mill.

The Wisconsin Steel work force was old—theirs was a stable mill, without a lot of new blood. A few of the workers have found other jobs. One, a black man in his forties, does odd jobs like installing bathtubs and fixing gutters. (“I used to make $650 a week, maybe $750 with overtime. When you go from that to zero, it takes some adjusting.”) Another, a grave-faced Mexican-American in his fifties, says he spends eight hours a day looking for aluminum cans and flattening them out. Most say that they are too old and the prospects are too bad.

They say they make ends meet by living with their children, or relying on their wives’ earnings, or eating the free government cheese that Frank Lumpkin has been distributing, or eking out their company benefits. Most exhausted their unemployment insurance long ago. The few with thirty years’ service, including Lumpkin, get nearly their full pensions from the government’s Pension Benefit Guaranty Corporation. Also, under the elaborate pension formula known as the Rule of 65, most of those with twenty years’ service get a meager benefit of about $100 a month, designed to supplement the Social Security they will eventually collect. They are owed an average of $5,000 in severance pay alone and more in other lost benefits. The members of the Save Our Jobs Committee are starting to die.

LAST APRIL I SAT WITH LUMPKIN AND HIS WIFE, BEA, in their home, five miles from the old Wisconsin Steel Works. Practically speaking, the two of them are out of harm’s way. Lumpkin has his pension; he has become a kind of hero; their children, unlike so many from the area, have effected a miraculous escape. One is in the computer business, another is an M.D., a third is a mathematician, and a fourth is a biologist who was returning that day from Mozambique.

Lumpkin, now sixty-eight, has a big, square face. His head is usually topped indoors and out by a brown felt hat with the brim rolled up all the way around. He is missing several fingers from years in the mill. He speaks in a southern accent that sounds almost like Brooklynese (“woik” for work). Bea is a slight Jewish woman; their house has matzos in the kitchen and African hangings on the walls. They talk as if the world they have cared about—the labor movement, the steel community—is on the way down.

“The day the plant closed I had my thirty years,” Lumpkin said. “The only thing on my mind was, I’ve been here since 1950, trying to organize this group and that group, and I could look back on thirty years of fruitless work. I was going to leave that plant without accomplishing a single thing for organizing the workers or improving their situation in any way. That is the only reason I have continued in this struggle—the way human beings are mistreated by the system. They can just be tossed off and forgotten. If we hadn’t organized Save Our Jobs, the whole thing would have been forgotten, because the deal was all set. We wouldn’t have gotten a dime.”

They still have not gotten much more than a dime, but Thomas Geoghegan says that Save Our Jobs has not been as futile as it might seem. “Some things you do as a matter of pride,” he said. “You have to try to keep people from sinking even further in morale. Frank has people out for a whole series of causes. That’s a lot better than guys going into the bar at noon or sitting at home with the oven on to keep warm.”

A BLOCK AND A HALF DOWN SOUTH CHICAGO AVENUE from the Save Our Jobs headquarters sits Hilding Anderson Hall (named for one of the Republic Steel martyrs), home of Local 65 of the United Steelworkers of America. At its height Local 65 was one of the largest locals in the country. It was briefly famous for Edward Sadlowski, a former president, who in 1977 was a thirty-nine-year-old insurgent candidate for the international union presidency. Sadlowski lost in an exceptionally ugly campaign, and, coincidentally, Local 65 began declining at about the same time.

This union local, and much of the South Chicago economy, was long centered on the South Works of U.S. Steel. Eighty-odd years ago, before the U.S. Steel conglomerate was formed, the South Works was one of two great railmaking installations in Chicago, hub of the nation’s expanding railroad network. (The other, known as the North Works, was at the mouth of the Chicago River, in what is now the heart of downtown.) It reached its peak employment during the Second World War, when 16,000 workers turned out metal for ships and tanks. The rail mill rolled its last rail in the 1950s; through the mid-seventies, as the mill concentrated on heavy structural-steel products, its work force dwindled from 10,000 to 7,500.

On March 30 of last year most of the South Works was closed for good. Whether the fundamental cause of the shutdown was the knot of circumstances that make American steel mills uncompetitive, as the company claimed, or simply U.S. Steel’s desire to get out of the steelmaking business and into oil and other pursuits, as most workers believed, there was a large enough component of manmade disaster to leave bitter feelings.

Since the late 1970s steel companies have been either warning or threatening, depending on your perspective, that they would be closing plants unless they got certain concessions. In the case of the South Works, U.S. Steel was asking the state government for relief from pollution laws and sales taxes and asking the union for more-flexible work rules. If those conditions were met, the company said, it would bring new hope to the South Works by investing $225 million in a new rail mill.

The union and the State of Illinois did their part. Then the company announced that it was having second thoughts. If the city government really wanted to keep the South Works in Chicago, it should help out the company with additional tax breaks. The city, too, complied. In that case, said the company, there would have to be yet another change in union work rules, which would allow the company to hire outside, non-union contractors for much of the work inside the plant. At that point the union leadership said “Enough.” U.S. Steel, apparently relieved, announced that plans for the rail mill were being canceled and the venerable South Works would be closed. R. C. Longworth, a careful and normally calm reporter, wrote of this decision:

U.S. Steel lied to the workers, to the State of Illinois, and to the South Side neighborhood where South Works stands. It said it planned to expand operations there with the rail mill, but it is clear now that the company never had any such plans. It toyed with the workers, many of whom gave decades of their lives to U.S. Steel. And when it finally opened the trap door under them, it had the gall to claim that it was the workers’ fault.

On the afternoon of March 30, 1984, the mill’s last day of operation, a funeral cortege formed in South Chicago. Shortly after three o’clock some sixty automobiles gathered at the Local 65 union hall. The drivers turned on their headlights, displayed orange FUNERAL stickers on their windshields, and drove in stately procession to the South Works gates. The guard inspected each badge and let them pass. Once inside, they pulled up beside the towering furnaces, parked, and stared for a while. Many of the cars were draped with purple crepe and carried signs: MACHINE SHOP REST IN PEACE, WELDING SHOP REST IN PEACE. The drivers were mainly twenty-five- and thirty-year men, part of the generation that had left South Chicago for Korea or even Normandy and Okinawa and then come home to work in the mills. Starting in the late 1970s their sons and nephews had been laid off. Now their jobs were going too.

The men talked quietly among themselves, but every few minutes one of them would walk over to a locked shop door and self-consciously proclaim to the other members of the procession, as if mimicking TV news reports of other plant closings, “I put thirty-one years of my life into this building! Do you know what that means?” A flatbed truck rumbled past, carrying what appeared to be the disassembled components of a large crane. “That’s our work,” said a former boilermaker, one of the few men in his thirties. “There it goes! The thing of it is, we can beat the shops they’re sending it to.”

Later that evening, after the procession had left the South Works and driven with horns blaring through the streets of South Chicago, a wake was held at the union hall. Perhaps a thousand steelworkers and their families were there, along with a handful of supervisors who had started out as union men, instantly recognizable in their cardigan sweaters and dress slacks (instead of nylon jackets and work pants). Even off duty there is a dress code in industrial communities. One after another the steelworkers went up to the microphone to choke out their sentiments about what had been lost. But in addition to the laments that might be heard at any wake, there was another note. Hold on, many of the speakers said, in the face of all evidence. Things are bound to turn around.

That vain hope was precisely what many of the workers clung to—even though the company was urging them to take severance pay, acceptance of which meant forfeiting their right to some hypothetical future recall, and even though the dwindling membership of Local 65 had only the night before met to discuss the distasteful necessity of selling the union hall. The loss of income was bad, but that was only part of the story. It was almost impossible to face the fact that the mighty South Works, which had given such predictability to life on the South Side for so many decades, was gone.

ED SADLOWSKI, THE ONETIME REFORM CANDIDATE who is now a United Steelworkers official based at the Local 65 hall, gave me a tour of the landscape that was being transformed. We drove to the southern edge of Lake Michigan, where ice floes from the 300 miles of lake to the north had been piled up by the wind. To the left, across the lake and in the distance, the skyscrapers of the Loop reared up. To the right were the factories of Hammond and Gary. All around us were industrial canals and boarded-up houses and smokestacks with nothing coming out of them. The wind picked up a fine gray grit from the slag heaps that surrounded the steel mills. In the deserted yards at Wisconsin Steel sat hundreds of ingots and metal pieces, now rusted red. The only men at work were operating a bulldozer in a slag heap, reshaping the great dunes of grit. This, Sadlowski said, was home.

Sadlowski is a paunchy man with a dark mop of hair. He is in his mid-forties but, like many industrial workers, looks ten years older. He has the outgoing air of a natural charmer and storyteller. Born in South Chicago, within blocks of the South Works, he quit school at seventeen, joined the Army, and came back to the mill. He threw himself into union politics, becoming president of Local 65 when he was twenty-five and rising through the hierarchy until he ran for the international presidency. His principal error, as is so often the case in politics, was to utter an inconvenient truth: he said in a Penthouse magazine interview that he didn’t think people were meant to work in smelters and hoped the day would come when they didn’t have to. The Walter Mondales and Gerald Fords of the world can, when defeated, retire in comfort to the law firm or the lecture circuit. The Ed Sadlowskis cannot. Although he is still a power in Chicago, Sadlowski clearly misses his glory days as a national figure. He seems to be a man caught in one world but yearning for another. He has persevered in night school, completing his college degree and recently his master’s.

“Nothing much has changed here during my lifetime,” Sadlowski said, as we drove through the lines of bungalows abutting the steel mills. “Except maybe the blight is a little worse. Even when it was new it was never exactly in the travel guide. It was strictly a working-stiff mill community. You’ll find houses like this wherever you find steel mills or coal mines.”

We passed through the downtown of South Chicago, where Sadlowski grew up. “The house I was born in was right there. I can’t show it to you, because it fell down. The sucker just fell down, boom!” He pointed to a growing Hispanic section, where a Mexican restaurant and a kielbasa stand, both with the look of decay, sit near each other. “This is the most highly industrialized neighborhood in the world, the richest in terms of industry. But look at it. We’ve got all kinds of big factories. We should have some big libraries and hospitals, too—but where are they? That tells me that they’ve taken and taken from this community and never put back.”

To the steelworkers of South Chicago everything about the “new” economy is bad. They see the movement of jobs to the American Southwest or to Taiwan as just another way to undercut the workingman’s wage. Few of them see a way to adapt to the new order. Why should they even think of moving? Everyone has a story about the friend or nephew who drove to Houston or Denver, found himself stacking bottles at the 7-Eleven or competing with illegal Mexican immigrants for construction jobs, and drove back home. Even if they wanted to, how could they move when their assets are tied up in houses no one wants to buy? The only hope is to ride things out, endure as they endured at the mill, and wait for the government to come to its senses and step in. Someday the politicians will realize that we have bridges to repair and steelworkers dying for a chance to make bridge girders. Someday they’ll find the guts to crack down on the Japanese. Someday they’ll change the tax laws—or recognize that the welfare of a community is too important to be determined by marginal-profit calculations by the boys in green eyeshades. Until then there’s nothing to do but organize and wait.

It may not matter to the steelworkers whether their miseries are omens of wider suffering or isolated misfortunes, typical of nothing. Either way, the steelworkers still feel stuck. But that distinction is precisely what matters to everyone else.

TO START UNTANGLING THE COMPLEXITIES OF THE current economic transformation, it is helpful to appreciate the curious role played by “productivity.” In most economic discussions the word is shorthand for “labor productivity”—the amount of corn or cars or coronary-bypass operations produced per man-hour. A high and steadily increasing rate of productivity is, of course, one of the elements that makes the U.S. economy different from Ghana’s; it is the source of higher wages, higher profits, lower prices, and other blessings. But when productivity rises, the change inherently eliminates some work that people used to do, whether as milkmaids or welders. Thus a more productive economy can be wonderful in the abstract and threatening in the concrete. This accounts for a strange schizophrenia in public warnings about productivity. Economists worriedly contrast America’s recent modest growth in productivity with the robust gains of the Japanese and ask, How can we keep up? Meanwhile, labor spokesmen, describing the same economy in the same anxious tones, say that productivity is increasing too fast and that robots are putting too many people out of work. Because nearly every productive innovation has disrupted someone’s business, nearly every innovation has initially been opposed.

Theoretically, these perspectives might be reconciled with each other. It is conceivable that productivity could be stagnant in general—thereby slowly depleting the economic base—but rising quickly in certain industries, thereby discomfiting the work force. But in practice, the two perspectives on productivity hint at disagreements about almost every other economic issue. Those who think productivity is too low view the American economy, for better or worse, as inextricably bound up in international competition, its options constrained by what the Japanese and the Koreans do. Those who think productivity is too high believe that the fact that technology makes something possible or the market makes it attractive is not a good enough reason to rend our social fabric.

The most unrefined version of the anti-productivity argument is that technology will lower wages and put people out of work. This is sometimes referred to as the “lump of labor” concept—the idea that there is a certain amount of work to be done, and if machines do it, people can’t. It is possible to find in American history some apparent confirmation of this view—for example, in the saga of the American farm.

A hundred and fifty years ago nearly three quarters of the working people in this country worked on farms. By the turn of the century, even after the great manufacturing centers had risen and immigrants had filled the eastern cities, farmers made up more than one third of the work force, and on the eve of the Second World War about one fourth of the American population still lived on the farm. By the beginning of the 1980s only one fortieth did. Clearly, automation “destroyed” many farm jobs. But if technology were really eliminating work, we would expect all those displaced farmers to be unemployed. Some of them were, especially the blacks who moved from southern farms to northern and western cities between 1940 and 1960, in the largest migration of a single group in American history. Their experience has been different from that of other migrant groups, mainly because of discrimination at both ends of the journey. But the reason most ex-farmers, black and white, are not unemployed is that jobs were being created in the cities as they were being eliminated on the farms. (Indeed, most historians contend that people were pulled from the farm by the lure of city jobs, rather than pushed off by combines and electric churns.) Two thirds of all the jobs that existed in the industrialized nations a hundred years ago have been “eliminated,” but three times as many people are at work. The United States is more automated than before, but a larger proportion of its people are employed than at any other point in its history.

The lump-of-labor argument reached its zenith twenty years ago, during the great automation controversy. The worry at that time was that productivity had become a self-accelerating, unstoppable force, which could ultimately put most of the population out of work. Through the late 1950s unemployment had been stuck at what were then considered dangerously high rates—6.8 percent during the recession year of 1958, a peak of 7.1 percent in May, 1961, and never less than 5.5 percent as late as 1963. Could it be an accident that so many millions were shut out of work when productivity rates were relentlessly ratcheting upward? From 1850 to 1889 output per man-hour rose by 1.3 percent a year. From 1889 to 1919 it rose by 2 percent. For the next twenty years it rose by 2.5 percent, and from 1947 to 1960 it rose by 3.3 percent. In the first few years of the 1960s it rose by 3.6 percent. Where would it end?
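
To see why those numbers alarmed people, it helps to compound them. The sketch below (in Python, using the growth rates just cited; the thirty-year horizon and the rule-of-72 shortcut are illustrative choices, not figures from the article) shows how much faster output per man-hour doubles at each successive rate.

```python
# Compound the productivity growth rates cited above. The thirty-year horizon
# and the rule-of-72 doubling estimate are illustrative simplifications.

periods = {
    "1850-1889": 0.013,
    "1889-1919": 0.020,
    "1919-1939": 0.025,
    "1947-1960": 0.033,
    "early 1960s": 0.036,
}

for label, rate in periods.items():
    doubling_years = 72 / (rate * 100)   # rough years for output per man-hour to double
    gain_30yr = (1 + rate) ** 30 - 1     # cumulative gain over thirty years
    print(f"{label}: {rate:.1%} a year -> doubles roughly every "
          f"{doubling_years:.0f} years ({gain_30yr:.0%} gain over thirty years)")
```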

A man named John Snyder, the president of a robot-making company, told a congressional committee investigating the problem in 1963 that automation was eliminating 40,000 jobs each week—or two million a year, at a time when the total work force was less than 75 million. Several economists warned that it was vain to think that increases in consumption could ever catch up with increases in production. Charles Killingsworth, of Michigan State University, a renowned automation expert, soberly told the same congressional committee that 99.5 percent of homes with electricity already had refrigerators, 93 percent had televisions, and 83 percent had washing machines. What more could they be expected to buy? “The only sharply rising sales curve in the consumer-durables field today is that of the electric can-opener industry,” he reported.

Congress passed the Manpower Development and Training Act in 1962 to cope with the menace of automation. Even after it was passed, in the mid-1960s there were predictions that the numbers of steelworkers, auto workers, telephone operators, and even IRS agents were destined for permanent decline. Most menacing of all, it appeared that manufacturing employment in general was headed the same way. Robert Theobald, part of a briefly influential anti-automation group called the Ad Hoc Committee on the Triple Revolution, wrote in 1961: “Despite the continuing increase in manufacturing, the number of people employed in this field appears to have reached a peak, at least in the United States. Total employment in the manufacturing industries was 17,500,000 at the height of the 1953-54 boom, 17,100,000 at the height of the 1956-57 boom, and only 16,500,000 at the height of the 1959-60 boom.”

If a million manufacturing jobs had disappeared in six years, by extrapolation it seemed reasonable that automation might drive the manufacturing work force below 15 million in the 1960s and below 14 million in the 1970s, just when the Baby Boom generation would be looking for jobs.
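
That chain of reasoning can be reconstructed with a simple trend line. The sketch below (a Python illustration; the least-squares fit is my own device for restating the extrapolation, not a method anyone published) runs a straight line through Theobald's three cyclical peaks and projects it forward.

```python
# Fit a straight line through Theobald's three cyclical peaks (figures quoted
# above) and project it forward. The linear fit is an illustrative
# reconstruction of the extrapolation, not Theobald's own calculation.

peaks = [(1954, 17.5), (1957, 17.1), (1960, 16.5)]   # (year, manufacturing jobs in millions)

n = len(peaks)
sum_x = sum(year for year, _ in peaks)
sum_y = sum(jobs for _, jobs in peaks)
sum_xx = sum(year * year for year, _ in peaks)
sum_xy = sum(year * jobs for year, jobs in peaks)
slope = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x * sum_x)
intercept = (sum_y - slope * sum_x) / n

for year in (1965, 1970, 1975, 1980):
    print(f"{year}: about {slope * year + intercept:.1f} million manufacturing jobs projected")
# The actual 1980 figure, as noted below, was about 20.6 million.
```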

As it turned out, the frightening predictions were wrong—and not by some small margin but completely and monumentally. In 1980 total manufacturing employment was higher than ever before in American history. The number had begun to rise in the early sixties. The first increase, after years of stagnation in the 1950s, seemed significant enough to be pronounced by Fortune “one of the most sensational pieces of news” in years and to become the basis for a long Fortune series debunking the automation threat. Manufacturing employment reached 19.3 million in 1967, 19.6 million in 1977, and 20.6 million in 1980. (By 1982 it had fallen by about a million and a half, but that was in the middle of the most severe recession in forty years.)

The same was true in many specific industries affected by automation. In 1961 a writer named Walter Buckingham warned, “There are 160,000 unemployed in Detroit who will probably never go back to making automobiles, partly because the industry is past its peak of growth and partly because automation has taken their jobs.” But by 1978 the American auto industry employed more people than it ever had before. The number of telephone operators nearly doubled between 1940 and 1970, despite the elimination of the “Hello, Central!” dialing system, before falling by a quarter during the 1970s, with the advent of computerized switching. The number of machinists— skilled and well-paid craftsmen who operate metal-working equipment—dipped during the 1960s but rose during the 1970s, even though more and more shops were adopting “numerically controlled” (computer-operated) machine tools. All the while, the machinists’ average wage was also increasing. So were the nation’s per capita income (except when depressed by OPEC), the skill and education levels of its work force, its employment totals, and most other indicators of its economic well-being.

WHAT HAPPENED TO THE AUTOMATION PERIL? WHY did the predictions look so silly so soon after they were issued? The answer does not seem to lie in one commonsense notion about automation—that people who lost their jobs to computers or robots might have found new jobs making computers or robots. (Employment in those industries rose very little.) Part of the pessimists’ error may have been their touching but mistaken faith that mankind’s material wants had now been satisfied and any additional production would lie unclaimed upon the market. (Many of you will read these words in houses lacking electric can-openers but containing computers, telephone-answering machines, video-cassette recorders, microwave ovens, and other items that Professor Killingsworth never dreamed people might buy.) In the early sixties, as at other times, there was a strong temptation to declare the technological universe complete: all those old discoveries, the airplanes and automobiles and telephones, may have been great, but there’s nothing else in the pipeline.

Even more, the automation scare revealed that the rate of productivity affected employment less than something else did. That something was the overall rise and fall of the economy.

In retrospect, it seems that manufacturing employment had slumbered during the late 1950s not because of new automated devices but because of austerity measures promoted by the Eisenhower Administration, which was determined not to relive the horror of 1958, when the federal deficit had soared to the now quaint sum of $10 billion. During the boom of the mid-1960s the American economy found work for millions of new workers, each of them more productive than previous workers, because the level of demand was so high. The government did its part, by pumping up the economy (and starting a war), and the labor unions helped, by bidding up wages and sustaining a mass market. But automation itself also contributed to growth, by driving down prices and creating new markets. The force that was most feared helped propel the United States through a decade of unprecedented prosperity.
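
The arithmetic behind that conclusion is worth making explicit: employment is roughly total output divided by output per worker, so jobs grow whenever demand grows faster than productivity. The sketch below uses hypothetical growth rates, not figures from the article.

```python
# Illustrative arithmetic with hypothetical rates: employment tracks the gap
# between demand growth and productivity growth.

def jobs_after(output_growth, productivity_growth, years, base_jobs=100.0):
    """Jobs index after `years`, starting from a hypothetical base of 100."""
    output = (1 + output_growth) ** years
    output_per_worker = (1 + productivity_growth) ** years
    return base_jobs * output / output_per_worker

# Flat demand against 3.5 percent annual productivity growth: employment shrinks.
print(f"flat demand:   {jobs_after(0.00, 0.035, 10):.1f}")
# Demand growing 5 percent a year against the same productivity gain: employment rises.
print(f"rising demand: {jobs_after(0.05, 0.035, 10):.1f}")
```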

THE LUMP-OF-LABOR IDEA DOES NOT HAVE A LARGE following these days, but its underlying premise— that the next wave of economic change will be destructive, whatever the previous ones may have been—is central to the warnings about a deindustrialized society. The best exponents of the deindustrialization theory, including Barry Bluestone, of Boston College, claim that while international competition and a higher-tech economy may not cost America jobs in the aggregate, they will take away America’s “good” jobs in the places where Americans already happen to live.

The background to this argument is the change in the world’s economic landscape in the past fifteen years. Immediately after the Second World War the United States enjoyed a crushing economic advantage because its productive machinery was more modern than anyone else’s (and had not been bombed). But by the early 1970s the forces that would eventually destroy South Chicago were being set in motion around the world. As investment capital became more mobile, companies were freer to shop for locations with lower wages and better “business climates,” whether in Tennessee or Taiwan. The oil-price increases engineered by OPEC in 1973, and the resulting inflation, reduced the standard of living for most Americans—but not for workers in the heavy industries, whose unions had negotiated the cost-of-living adjustments known as COLAs. This was a temporary advantage for them and a long-term disaster for their industries. During the late 1970s, when chronic inflation eroded the dollar’s value in international trade, American goods became artificially attractive to foreign buyers—and American manufacturers were lulled into an artificial sense of security about their ability to compete. They were not prepared to adapt when circumstances changed in the early 1980s and an overvalued dollar drove their foreign customers away. The American steel industry had made matters worse with its “list price” system, which was essentially a legalized price-fixing scheme. By disdaining to undercut each other’s prices, American steel companies practically invited an attack by foreign producers who would compete on price. Soon steel was in trouble, and to many people it appeared that America was doomed to lose its manufacturing base.

The threat to manufacturing is what most worries the deindustrialization spokesmen, because they see a vast distinction between the manufacturing and service sectors of the economy. In a sense the difference is merely symbolic: nearly 70 percent of employed Americans work in service industries, and services have employed more people than manufacturing for at least a hundred years. Nonetheless, manufacturing employment retains tremendous economic, political, and emotional significance.

The jobs that America is losing, according to Barry Bluestone, the AFL-CIO, and many other sources, are the unionized, secure, well-paying manufacturing jobs that offered a chance to ambitious workers (and built support for a liberal political movement). The jobs that America is gaining, they say, are bad jobs—the non-unionized, high-turnover, ill-paying, no-future jobs in the service industries and in high tech. The service sector, including psychiatrists and laundry clerks, funeral directors and teamsters, is at least as diverse as manufacturing, but for political purposes it has been summed up by the 7-Eleven and McDonald’s. If there is one widely accepted symbol of today’s changing economy, the 1980s version of the allegorical Joad family hitting the road during the Depression, it is the proud steelworker who gets laid off in Youngstown and is reduced to flipping burgers for one-fifth his former wage. His is said to be more than an individual tragedy, for in his downfall lie the beginnings of a “two-tier” society and the destruction of the middle class.

The effect of this change, it is argued, is not simply to drag wages down but to distort the income distribution. There will be more rich doctors and more poor janitors, and fewer $12-to-$15-an-hour auto workers who can afford a house, their union dues, and a boat. This is a vision of a United States with all the egalitarianism and economic justice of Manhattan.

Until recently most discussions of the impending two-tier economy dwelt on seemingly irrefutable evidence from the Census Bureau, which showed that the United States took a sharp turn toward inequity in the late 1970s. Through most of the decade, according to these reports, about half of all families—50.4 percent—fell into the broad “middle class” range, with incomes between $15,000 and $35,000 (in 1982 dollars). About a quarter of all families were in the bottom bracket (24.5 percent with incomes below $15,000) and the other quarter in the top (25 percent with incomes over $35,000). But as the 1980s began, according to these figures, the middle class was dramatically depleted. It dropped all at once to 43.9 percent. Of the 6.5 percent of American families who were pushed out, the majority went up, as the top bracket swelled to contain 30.9 percent of the population. The rest moved down, into a bottom tier that contained 25.1 percent of all families. “By almost all measures, the degree of income inequality between rich and poor American families has been increasing,” Fortune magazine said in 1983, in an article that focused sharply on the Census findings and helped publicize the “shrinking middle” concept.
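
The shift is easier to follow when tallied directly. The sketch below simply restates the percentages quoted above and checks how the middle bracket's 6.5-point loss splits between the top and the bottom.

```python
# Restate the Census shares quoted above (percent of families, 1982 dollars)
# and tally where the middle bracket's loss went.

before = {"under $15,000": 24.5, "$15,000-$35,000": 50.4, "over $35,000": 25.0}
after = {"under $15,000": 25.1, "$15,000-$35,000": 43.9, "over $35,000": 30.9}

middle_loss = before["$15,000-$35,000"] - after["$15,000-$35,000"]
moved_up = after["over $35,000"] - before["over $35,000"]
moved_down = after["under $15,000"] - before["under $15,000"]

print(f"middle bracket shrank by {middle_loss:.1f} points")
print(f"{moved_up:.1f} points moved up, {moved_down:.1f} points moved down")
```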

If all these developments have, in fact, occurred, then the United States should be doing something about them. It could start by recognizing that there can be no such thing as “free trade” in a world where nations are competing for scarce work. Through tariffs or quotas or “domestic-content” laws the United States should do what it takes to protect its manufacturing jobs. The government should also recognize that “capital mobility”—factory closings in some cities, new factories elsewhere—can destroy regions, families, and the very fabric of society. Therefore, the movement of capital should be slowed down, with plant-closing restrictions, public ownership, and other efforts to keep Youngstown’s jobs in Youngstown. Clear in the knowledge that the natural course of events is toward disaster, the government should attempt to steer investment toward industries that will produce “good” jobs and preserve community roots.

These steps might seem sensible if the country were deindustrializing and a two-tier society were emerging. But is this happening? I think the evidence is that it is not.

There are good reasons to believe that today’s economic change is very much like past changes that the United States has undergone—and therefore to conclude that it will create many more opportunities than it destroys.

TO BEGIN WITH, CONSIDER THE PUTATIVE ECLIPSE OF American manufacturing. Robert Z. Lawrence, an economist from the Brookings Institution and the author of Can America Compete?, is as prominent an opponent of the deindustrialization theory as Bluestone is a proponent, and he asserts that the United States is in no imminent danger of vanishing as a manufacturing power. “Judged by the volume of production, America is no more a service economy today than it was in 1960,” he has said. “Goods output was 45.6 percent of GNP in 1960, 45.8 percent in 1979, and 45.3 percent in the final quarter of 1983.” There is a grave threat to American manufacturing, Lawrence and others have argued, but it arises principally from the overvalued American dollar, which makes foreign goods so cheap and American exports so dear.

A smaller share of the population works in manufacturing now than in 1960, of course; but precisely because the service sector is already so large and manufacturing so small, the prospects for a dramatic swing in employment are remote. Many of the jobs that are going to disappear from manufacturing have already disappeared. Moreover, steel and auto workers, symbolically evocative as their cases may be, are extremely misleading guides to the overall difference between manufacturing and service jobs.

Robert Samuelson, of Newsweek, has pointed out that the steel and auto industries accounted for only 2.2 percent of American employment as long ago as 1950, and only 1.5 percent in 1979, when the recent decline began. Textile and apparel workers, among the worst-paid manufacturing employees, are more numerous than steel and auto workers combined. Leaving aside their limited numbers, steel and auto workers are unusual in their relative affluence. Samuelson reports that in 1970 steelworkers, as part of the traditional aristocracy of labor, had average hourly earnings 29 percent higher than those for all nonsupervisory workers in the economy. By 1980 steelworkers were earning 71 percent more than other workers. During the same period auto workers’ earnings rose from 31 percent higher than the average to 48 percent higher.

If steel or auto workers were typical, we might expect the state with the highest proportion of manufacturing workers to be Ohio or Illinois or Pennsylvania—the bulwarks of the old industrial belt. In fact, as Robert Samuelson has shown, not one of them is among the top ten states when ranked by proportion of the work force in manufacturing. Number one on that list is North Carolina, home to low-wage furniture factories and textile mills, with 32.8 percent of its non-farm work force employed in manufacturing. North Carolina stands thirty-ninth among the fifty states in per capita income. Number two on the manufacturing list is South Carolina, with 30.6 percent of its work force in manufacturing. It has the forty-eighth highest per capita income in the country. Indeed, the only states in the top ten in manufacturing that are in the top twenty in per capita income are Connecticut (fifth in manufacturing, second in income) and New Hampshire (sixth in manufacturing, nineteenth in income). All the other states in the top twenty in manufacturing are in the bottom thirty in per capita income.

If manufacturing were the main source of opportunity, we might expect the growth of a service-heavy Sun Belt boomtown like Houston to drive wages and income down. But from 1977 to 1982 Houston went from fifth in per capita income among large metropolitan areas to third—and this at a time when migrants were pouring in, which usually depresses per capita income. Meanwhile, Chicago was falling from third to ninth, Detroit from fourth to fourteenth, Cleveland from eleventh to sixteenth, and St. Louis from sixteenth to seventeenth. For many people, entering the service sector represents an important step up, not down. According to Michael Urquhart, of the Bureau of Labor Statistics (BLS), in a study published in Monthly Labor Review, most of those entering service jobs were not displaced manufacturing workers but people (mainly women) who had previously not been employed at all.

A year ago the BLS released its forecasts for employment growth by 1995. They seemed to confirm the worst predictions about the nature of a high-tech and service economy. The single job category with the most new openings expected was building custodians, the number of which is expected to increase by 779,000 by 1995. Second was cashiers (744,000 new jobs), and third was secretaries (719,000). More glamorous and financially rewarding callings, such as those of registered nurses (sixth; 642,000 new jobs) and computer-systems analysts (twenty-second; 217,000), came farther down the list.

The BLS projections are featured in almost every report about the coming two-tier society. What is less often emphasized is that the United States already has a lot of janitors, cashiers, and secretaries—so many, in fact, that 779,000 new janitors will be just enough to keep the proportion of janitors from falling. In 1981 there were 993,000 janitors, and they accounted for 2.8 percent of the work force. In 1995, if the ominous BLS projections are borne out, janitors will constitute 2.8 percent of the work force. The other fast-growth callings are also expected to remain proportionally stable or increase very slightly. It is one thing to say that a society with so many low-level jobs is unfair. Why not say that outright, instead of pretending that an increase in the number of janitors is the cause of worry?

One of the most dramatic changes during the 1970s was the decline in “private household workers”—the maids, cleaning ladies, servants, and nannies who are nearly all female and disproportionately black. The number of “cleaners and servants” fell by nearly one third between 1972 and 1980, the number of in-house child-care workers by one fifth. Maybe it’s hard to find good help these days, and maybe some of the lost jobs have merely been transferred to illegal immigrants; but can there be any more hopeful sign of progress out of “bad” jobs than the shrinkage of this category?

The most exhaustive forecast of the impact of automation on the future labor market comes from Wassily Leontief and Faye Duchin, of New York University, who put the eighty-nine industrial sectors and fifty-three occupations that make up the U.S. economy through the computerized input-output models that won Leontief the Nobel Prize. The study is often cited in deindustrialization arguments; Leontief has warned that the United States will need shorter work weeks and new forms of job-sharing to cope with automation. But the data in the study point in a different direction. All in all, the study concludes that by the year 2000 computerized automation will allow the United States to produce goods and services with 10 percent fewer workers than it would need to produce the same amount of goods without computers. This is not exactly a surprising finding, when you consider that the study rested on the assumption that the level of “final demand” in the year 2000 was fixed—that is, it explicitly ignored the possibility that technical progress would create new markets for new products. (The study did allow for possible increases in business investment, which could create new jobs.)

This omission aside, what did the study say about the jobs America would be losing? The charts for the fifty-three occupations show that computers will increase demand for most of them and reduce it significantly only for secretaries, office-machine operators, bank tellers, other clerical workers, phone operators, drafters, and managers and proprietors. With the exception of the last, these are not America’s “best” jobs. “The impacts . . . will involve a significant increase in professionals as a proportion of the labor force and a steep decline in the relative number of clerical workers,” the report’s summary said. “Production workers can be expected to maintain their share of the labor force; direct displacement . . . will, at least in the initial stages, be offset by the increased investment demand for all sorts of capital goods, especially computers.” These are developments to be feared?

IF MANUFACTURING IS NOT REALLY DECLINING, AND IF most manufacturing workers are not paid that well anyway, and if fast-growing cities offer rising incomes, and if the occupational structure is stable or improving, then why did the middle class suddenly shrink in the late 1970s? The answer is, it didn’t. An honest but important statistical error lay behind that Census report. Robert Samuelson carefully dissected it more than a year ago in the National Journal, but his analysis never seemed to catch up with the fast-moving “fact.”

According to most published interpretations of the Census report, fully 10 percent of American households dropped out of the middle-income group in 1979. Shifts that sudden rarely occur in normal times; what really happened was a change in statistical practices. Before 1979 the Census Bureau had reported no subdivisions within the $25,000-to-$50,000 income bracket. In 1979 it began using a more finely graduated system, with brackets for $37,500 to $40,000, and so forth. But then the Census had to determine how to reclassify the people who in surveys made before 1979 had been simply lumped together in the $25,000-to-$50,000 category. One way to do so would have been to run all the original Census tapes back through the computers, but that would have been time-consuming and expensive, and there was no reason to think it was necessary. The bureau had a well-established routine for estimating income distribution by fitting the data to statistical curves, and the results were usually correct. And so the Census analysts came up with informed guesses about where families had probably fallen within the $25,000-to-$50,000 range before 1979. It was when those approximated historical figures were compared with the post-1979 results that the difference between the two seemed to indicate a shrinking middle class. Robert Samuelson noticed the sharp change in 1979, suspected that something other than natural economic forces must lie behind it, and found out about the estimation procedure from the Census. Then Frank Levy, of the University of Maryland, and Richard Michel, of the Urban Institute, obtained the original tapes and ran them through their computer. On re-examination the tapes showed that the distribution of earnings had been remarkably constant through the years.

Barry Bluestone now concedes that the two-tier society is still a hypothetical danger. “There may not be much evidence of a ‘bimodal distribution’ of income yet,” he told me last summer, shortly after he and Robert Lawrence had faced each other down before a congressional committee. “But remember that a bimodal pattern”—one with most people at the extremes and few in the middle—“would be a revolutionary change. It is almost a fact of nature to have unimodal distributions. Any movement in this direction suggests a revolution in economics.”

Robert Lawrence, who generally debunks the idea of a two-tier society, does not dispute that recently the fastest-growing job categories have been in the lower-paying sectors. But he says there is a more obvious explanation than a historic shift toward a two-tier society.

The explanation is the Baby Boom. During the 1970s millions of new workers entered the labor market. The American economy managed to find work for most of them, to the astonishment of the many European nations where unemployment among the young is a major worry. They did not land many of the “good” jobs in heavy industry; the seniority rules that are sacred to unionized operations kept young workers out of slow-growing industries like autos and steel. Instead they found work through the growth of lower-paying industries. Fifteen years ago, as members of the Baby Boom were beginning to get out of college or come home from the Army, men under age twenty-five earned on average three-quarters as much as men over twenty-five. By 1983 the younger men were earning about half as much as the older ones.

If there really is a “shrinking middle,” then, it is a phenomenon limited to young men. For working women over twenty-five the past fifteen years have meant not a shrinking middle but a “shrinking bottom,” as their incomes have come closer to men’s and they have been pushed out of the low earnings bracket and into the middle range. Meanwhile, older men have been moving from the middle to the top. “The evidence suggests that shifts in the labor-force distribution across age groups rather than sectors are a more important reason for the declining middle,” Robert Lawrence says. The reason this explanation matters is that, according to economists, such demographically induced distortions tend to cure themselves as an unusually large generation ages.

There is another obvious explanation for the growth in the number of low-wage jobs: as in so many other recent economic misfortunes, OPEC played its part. For two centuries before 1973 the American economy had raised labor productivity (and therefore wages) by using energy as if it were free. When energy suddenly became expensive, businesses had to concentrate on raising “energy productivity”—getting the most out of a barrel of oil or a kilowatt of power, even if it meant doing more work by hand. The jobs this entailed (installing insulation, stripping paint off jetliners to increase their mileage) were relatively “unproductive” and low-paid; but with stable energy prices it should be possible to raise labor productivity again. Taking into account OPEC and the Baby Boom, says Donald Hicks, of the University of Texas at Dallas, “a case can be made that, if anything, our national economy has shown a remarkable agility in responding to all manner of disturbances since 1965.”

TRYING AS THE RECENT DISTURBANCES HAVE BEEN, Hicks says, the United States has surmounted far worse. “You can’t tell me that the movement of people off the farm did not have a greater impact than anything that may be going on now,” he says, “—or Henry Ford, or the decline of the textile factories.” Each of those episodes was disruptive, yet each was part of the process that produced the communities that people are now trying to save. The life that Frank Lumpkin and Ed Sadlowski mourn was, after all, built by people who came from Poland or Florida looking for new opportunities.

If that is so, what accounts for the note of falling-off-the-brink to be found in so many descriptions of the evolving economy? I am not talking about the wrath that is justly directed toward the International Harvesters of the world for callously tossing off the Frank Lumpkins, or about the tax-code insanities that tempt firms to close factories that are still paying their way. I mean instead the insistence that as certain industries decline and others emerge, nothing good can possibly come of the transition, and that the pain is all that matters. I think this outlook has less to do with the nitty-gritty of facts than with heartfelt assumptions about stability and change. In particular it reflects a romantic amnesia about how American society has evolved.

If only Bruce Springsteen had written his song “My Hometown” a few years earlier, it could have been paired with the Carter Urban America report as the yin and yang of views about community stability. The song says that the factories are closing down; it is about the pathos of seeing places you have cared about weaken and die. But through most of American history the factories have constantly been closing down—and opening up—and people have been moving from declining areas to those on the rise. Capitalism is one of the world’s more disruptive forces. It can call every social arrangement into question, make cities and skills and ranks merely temporary. To buy into it is to make a commitment to permanent revolution that few political creeds can match. Accepting that commitment has made the United States different from other countries in several ways: it is more open to immigrants and to the socially unfavored, it is more prosperous, and it is more atomized and rootless. Other nations have struck a different bargain with capitalism. England, for example, is more stable, less prosperous, more bound by class. The American bargain is responsible not simply for what is painful in our society but also for what is promising.

Both the Northeast and the upper Midwest, now taken as symbols of stable communities unchanging through the eons, in fact illustrate how tumultuous American life has always been. Consider the case of nineteenth-century Poughkeepsie, New York, which has been studied by Clyde Griffen, of Vassar. In the years before the Civil War the city enjoyed a boom in beer-making. As a result there was also a boom in cooperage (barrel-making), and from 1850 to 1860 the number of coopers in Poughkeepsie quadrupled. New firms sprang up; master coopers went into business for themselves; wage rates soared. But in the mid-1860s demand started to fall. By 1870 the total number of coopers was only slightly higher than it had been in 1860, and by 1880 it was actually lower. Wages also fell, and by 1875 coopers, who had mastered the skill of making watertight barrels by precisely fitting the slats, were earning about as much as common day laborers.

What did the coopers do? The younger ones left and found other work elsewhere, and the older ones stayed. A similar fate came to Poughkeepsie’s cabinetmakers, not because of slack demand but because of the dawn of mass production. The younger cabinetmakers moved away, and the older ones did well for a time, but the firms that kept paying them their accustomed wage finally went out of business. There was nothing atypical about any of this. “For two centuries different groups of skilled artisans have faced technological changes that have made their skills less valuable than they were,” Stephan Thernstrom, of Harvard, said last summer. “Their choices were to remain and make half the wages that they used to or to move to some other place, away from the mainstream, where their skills might be more valuable.”

In his book Poverty and Progress, a classic study of nineteenth-century Newburyport, Massachusetts, Thernstrom emphasizes the same constant churning of people from place to place. Early in the nineteenth century the city suffered economic misfortunes at least as distressing as anything that has befallen South Chicago. For twenty years, from 1810 to 1830, people moved out of Newburyport, and the town fell apart. Most of those who left during the nineteenth century, Thernstrom found, were laborers and artisans. Those who stayed were the ones who could afford to, the merchants and bankers. The same was true in Poughkeepsie. According to Griffen’s study, from one decennial U.S. Census to the next, most of the upper class of the city could be found in place, but the working class was always on the road. “Skilled craftsmen moved in and out of the city very frequently. Of the fourteen trades . . . in only one—cabinetmaking between 1850 and 1860 [the true boom years]—did as many as half the workers remain between censuses. . . . Two thirds or more of the younger workers in nine of the fourteen crafts in 1850 departed during the next decade.” Many of them ended up on the frontier, where they could gain property of their own or sell their skills at a premium. After the Second World War the connection between class and propensity to migrate was reversed. The recent movement toward the Sun Belt has been led not by displaced laborers but by better-educated professionals. This is one of the biggest differences between today’s migration and that of a hundred years ago.

The industrial Midwest was born of the same migration, tumult, and economic change that are now said to threaten it. Nearly fifty years ago an economist named Glenn E. McLaughlin wrote a book called Growth of American Manufacturing Areas. It came out during the Depression, but it focused on the astonishing growth of American manufacturing power between 1899 and 1929. America’s population was growing fast then, largely through immigration, and the number of manufacturing workers was growing almost 50 percent faster. Where was the growth concentrated? In the region that the Census Bureau calls East North Central, and that others now refer to as the Rust Belt.

The book depicted the nation’s thirty-three industrial areas, the fast-growing, prosperous places where industry was moving in. With three exceptions, on the West Coast (Los Angeles, San Francisco, and Seattle), they were all in the Northeast and the upper Midwest. Toledo, Scranton, Dayton, St. Louis, Youngstown, Detroit, Chicago—these were the upstart areas of America, the places that offered immigrants from Poland, Tennessee, Italy, and Georgia wages and opportunities they could not have found at home. Not even Houston in the 1970s grew so rapidly as Cleveland and Detroit did after the turn of the century. From 1900 to 1930 Detroit’s population quintupled. In their rise, these new powers elbowed more-settled cities, and their products and jobs, out of the way. “Among areas, Boston and Albany have suffered much from these changes,” McLaughlin said. “Up to 1919 Chicago”—including its steelmaking South Side—“probably profited more from shifts in industrial activity than any other area.” All this happened, of course, after Frederick Jackson Turner declared the frontier “closed” and America’s social safety valve eliminated. The United States, it turns out, has continued to have frontiers, which are to be found not in vacant tracts of land but in the social and economic fluidity created by industries on the rise.

Yes, the migration that built the industrial cities was also responsible for some of the most wrenching episodes in American life. When Detroit’s population boomed, many of its new citizens were blacks and whites from the South who had left their familiar plots of land in pursuit of Henry Ford’s $5 a day. Even when times were great in the auto industry, the migrants crowded into their row houses and never got over their longing for the comfortable, slow-paced hamlets where they could no longer afford to live. The most notorious migrations of our recent history—the flight of Okies from the Dust Bowl, immortalized by Dorothea Lange’s photos and John Steinbeck’s (as well as John Ford’s) Grapes of Wrath, and the exodus of southern blacks to the cities—brought their share of suffering. This was especially so among the blacks, who reached the manufacturing belt just as its growth crested and who are now being left behind in declining cities.

But while American artists have captured the pain of migration, they have not dwelt on its benefits—such chores are left to literal-minded economists. Their studies nearly unanimously conclude that people who pull up stakes fare better than those who stay behind. Indeed, John Kenneth Galbraith has said that migration is the only force that ever breaks the “equilibrium” of regional poverty. Certain people decide they will not be poor anymore, and they leave. Galbraith, himself a migrant from the hardscrabble of rural Canada, was talking mainly about the Third World, but his point seems also to apply to the United States.

For as long as accurate birth records have been kept (since the mid-nineteenth century), the proportion of Americans living outside the region of their birth has been fairly constant (“region” in this sense means one of four large Census Bureau divisions—Northeast, North Central, South, and West). Through the years about one in four Americans has moved to a region other than the one where he was born. Whatever the hardships they have endured, the ones who have moved have found better opportunities than the ones who have stayed put.

“Geographic movement is associated with superior occupational achievement, regardless of place of birth or destination,” Peter M. Blau and Otis Dudley Duncan said in their encyclopedic study The American Occupational Structure. “The data unequivocally show that migrants have more successful careers than men still living in the region of their birth. . . . In brief, men who remain in the region in which they were born start their careers on lower levels and advance less subsequently than those who live outside their region of birth. It appears that something either about migration or about migrants promotes occupational success” (emphasis added).

Blau and Duncan went on to try to explain what this “something” might be. “A hypothesis that can explain these findings is that living some distance away from his childhood home frees a man from the restraints and influences his childhood environment imposes on his career.” Having broken one social rule by leaving, the migrant is freer to break others. He does not have to marry the girl his family likes or follow his dad into the plant. He is more able to rise—and in theory more likely to fall—without family to call on for spiritual and material help. But in practice, Blau and Duncan found, migrants did not fall—neither the ones who went to Detroit in 1910 nor, for that matter, the ones who left there in 1980.

WHEN I THINK OF THE CASUALTIES OF ECONOMIC change, I think of Frank Lumpkin and the Save Our Jobs Committee. When I think of its beneficiaries, I think of Steve Contreras and Jean Bonner.

Contreras, now in his mid-forties, grew up on the western fringe of San Antonio, within the borders of the Edgewood School District, which is synonymous in Texas with impoverishment. His father was an auto painter; his mother raised the children. San Antonio’s Hispanic neighborhoods are as firmly rooted by family ties as any Polish community in Chicago, yet when Contreras finished high school (at a time when more than half of his classmates were dropping out), he left town for Austin and the University of Texas. The state scholarship he had won as the Edgewood valedictorian ran out after a year, and he retreated to San Antonio. In 1963, on the day of his marriage, he moved back to Austin in hopes of completing his degree. Instead he took a job with Tracor, one of the first of the high-tech firms to spring up in Austin, and soon found that his work was more engrossing than his schooling.

It was a “bad" job, by the standards of the two-tier society—a position as a trainee assembler, learning to put electronic parts together, for $1.75 an hour. It was with a company that, like many in the industry and the area, was vigorously anti-union. But it was the best opportunity Contreras had, and he did not think he would be stuck. After completing the mandatory ninety days as a trainee, he became a regular assembler, then a lead assembler, and then he moved on through a long series of advancements. He became a technician, a senior technician, and the leader of a group of technicians; he went into engineering, became a program engineer, and ended up overseeing subcontractors.

“They were testing me at each stage, but I told them I was putting them to the test,” Contreras told me last year. “Would they reward someone with potential?” Having supervised several product lines and worked in sales, he is now Tracor’s “facilities services manager,” in charge of the architectural, maintenance, and housekeeping crews that keep the plant running day by day.

At that same factory I met Jean Bonner, a large, pleasant-faced black woman who is now fifty years old. She was a teenage mother and had worked as a nursery school teacher at the Greater Mt. Zion Baptist Church in Austin, caring for two- and three-year-olds. She came to the plant in 1964 and found a “bad” job, as a trainee at near minimum wage. Eighteen months later she was the soldering group leader, in charge of twenty-three people. She moved into quality control and eventually became a manufacturing supervisor, presiding over a room full of women fitting electronic chips onto circuit boards. Late last year she was promoted again to assembly manager, overseeing the work of a hundred employees and six product lines. Her earnings approximate those of a steelworker (Contreras’s exceed them). Her daughter, now thirty-one, has worked at the same plant for ten years and is a group leader. “When I came to Tracor I came to learn what was here to be done and master it,” Bonner said last summer. “My hope for my daughter is that she will be as happy here as I have been.”

Stories like these can be hard to take, because of their high saccharin content. The point is that they are true and that they depend on the kind of expansion and adaptability that is very hard to find in settled, declining areas. Yes, Tracor is mainly a defense contractor, so its “opportunities” do not really reflect the workings of the market. But I heard very similar stories in Austin from employees of Motorola, Texas Instruments, and IBM. Perhaps the gains of Contreras and Bonner have come at the expense of midwestern steelworkers, although it is hard to see a direct connection. But aren’t the opportunities that got them out of the barrio and away from the usual plight of teenage mothers worth preserving too?

I have seen the same pattern elsewhere in Texas—even in Lubbock, which has in common with South Chicago an unrelenting wind and a major factory layoff. Lubbock is a city of some 220,000, on the treeless high plains of west Texas, where the Panhandle joins the Texas mainland. The skies are often dark with dust storms, and the downtown has that curious East German look of having been built all at the same time, because a tornado erased the old downtown fifteen years ago.

Lubbock was originally a farming and trading center, but in the past decade it has attracted more and more hightech assembly work. One important installation was the Texas Instruments plant, where TI made its “home” computer, the 99/4A. For a while the city and the company shared high hopes, but in the summer of 1983 both of them faced reality: the computer was a failure, and the Lubbock line was closed. TI tried to transfer many of its career employees elsewhere in the corporation, but at least a thousand people in Lubbock were laid off.

Lubbock’s unemployment rate, historically below five percent, shot up to nearly eight. But by the end of 1983 it was already declining, and last spring, nine months after the shutdown, it was 6.2 percent and still headed down.

Why was Lubbock so quick to rebound? TI had created a boomtown situation, and when the boom ended people moved on to something else. Unlike most boomtowns, Lubbock had a diversified economy—lots of agriculture, some oil work, wholesale trade—but it also had people who expected to keep on the move. I spent several days in Lubbock less than a year after the TI layoff, and I found no pools of unemployed TI workers, waiting for things to pick up at the plant. I did find several for whom TI had been a first step—toward a business of their own, toward a skilled trade, or, at a minimum, out of the sorghum fields. Wayne Finnell, a local banker, said, “These new industries have been picking up the minorities, the Spanish-speakers, and giving them better jobs than they’ve had before. Their fathers may have been picking watermelons, and now they’ve got a chance.”

“MIGRATION, WE have seen, is the oldest action against poverty,” John Kenneth Galbraith wrote in The Nature of Mass Poverty. “It selects those who most want help. It is good for the country [or region] to which they go. . . . What is the perversity in the human soul that causes people to resist so obvious a good?”

This is a more elaborate way of asking Jimmy Carter’s Urban America question. Why shouldn’t we help people rather than places? Why should we decide that the whole national history of migration, adjustment, and advancement must now come to an end?

If we had a “people” policy, it would have to come to grips with certain uncomfortable human truths. One is that some—perhaps most—displaced workers will never again be as well off as they used to be. The members of the Save Our Jobs Committee will not be filling the new professional slots that computerization will open up. Some economists might prissily observe that steelworkers have enjoyed above-average earnings for years, so what’s their gripe? The observation is correct, but it is meaningless in South Chicago. It is harder to give something up than not to have had it in the first place. The government cannot give the steelworkers back their jobs, but it should not pretend that they are all going to move to San Jose and do well. It owes them its enforcement powers to keep their former employers from welshing on their pension obligations, and it should alter its depreciation schedule so that factories are not worth more dead than alive. In certain instances there might even be a case for public-works programs, which could employ people to dismantle the unused industrial eyesores that give their communities such a devastated air and to put up something that for the first time in these communities’ history actually looks nice. Among the many traumas of unemployment is that people who took pride in their work can no longer do so. Offering a dignified outlet for their talents would surely boost their spirits. But in most cases such a boost would prove to be cruel, because of the unavoidable conflict between easing the pain of displaced workers and sending the proper signal to those workers’ children. Frank Lumpkin’s mission is to offer hope to the small group of people who depend on him. The government’s mission should be to make clear that there is hope, even though it may not lie within the shadow of the mill.

Another uncomfortable truth is that the culture of the postwar industrial town has a thousand ways to discourage migration. People age early in industrial environments. They are parents at twenty, thinking of their pensions by thirty, old at forty-five. From Houston or Lubbock it might seem that a forty-five-year-old could just throw his family into the car and start all over again. Some forty-five-year-olds do, but for many others it’s too late.

It should be no surprise that long-distance migration has become a white-collar phenomenon: some college graduates can move anywhere and feel at home. Except for family, what they care about—things like jobs and services—can be found nationwide. For many industrial workers, though, “community” is a much smaller place. It arises from things planted in one physical location—extended family, church, job. How can you go to Texas when Aunt Maria’s seventieth-birthday party is coming up? Thirty years of stability in the steel and auto industries have erased memories of the fathers and grandfathers who moved from somewhere else.

Stephan Thernstrom argues that the nation is smaller for everyone now, steelworkers as well as young lawyers. “The cultural difference between Detroit and Houston is much less than between the farm and the Manchester mills,” he says. “The difference between living in the Berkshires and living in Lowell was pretty dramatic.” Still, a change from Detroit to Houston seems dramatic to those who are considering making it. Everything about industrial life has instilled subordination to large institutions in which collective effort is everything and the individual nothing—the church, the military, the unions, the mills.

I am suggesting not that the government undertake to remold culture but that a “people, not places” policy must be built on an understanding of why “place” matters to so many people. The buccaneers of Dallas and LA speak with withering contempt of the deadheads still stuck in Fort Wayne. But the people who chose to stay are there for a reason, and by their lights it is a noble one.

THERE IS ONE MORE UNCOMFORTABLE TRUTH, BY far the most difficult to come to grips with. What really matters in the decline of heavy industry is not one generation’s lost wage but the eclipse of an important form of economic advancement. In the most familiar version of the American dream the factory worker’s son becomes a doctor. But there is another version, in which the steelworker’s son becomes a steelworker. The way he moves ahead is through the union, which steadily bids up the wage. The first dream depends for its fulfillment on luck, sacrifice, perseverance, and more luck. The second requires discipline, endurance, a healthy industry to draw on—and the entry ticket, a union card. Since the Second World War that ticket has entitled its bearer to the chance for security. Now, because the heavy industries are floundering, it does not take anyone far.

It is impossible to spend time in Houston or Miami without seeing that opportunities are still available. But they are opportunities of a sort different from those familiar in South Chicago. In part they are the same chances as are held out to any refugee Vietnamese or Guatemalan: to scrimp like crazy for several years and emerge a success. This is a path most native-born Americans disdain. There are also opportunities for those carrying the modern union cards: college or graduate degrees. These credentials are only roughly related to raw intellectual talent, and through much of America’s past they were happily ignored. When the Irish moved into police forces and city governments, when Chinese and Jewish immigrants began to thrive in trade, they did not have to prove that they had taken forty hours toward a master’s degree. Through much of the nineteenth century, businesses recruited their future executives from among the stock clerks—not because American society was perfectly egalitarian but because no one had thought of using academic degrees as a screening device for any profession except academic ones. Even now the industries growing fastest pay the least attention to credentials. There were few MBAs in the first computer businesses, which were notably indifferent to how long you spent in school and who your grandparents were. But to a growing number of businesses controlling most of the “good” jobs the new union card is indispensable.

The problem with the new union card is that it is distributed even less fairly than the old. If you don’t get on the college track in high school, you’ll probably never go to college, and if you’re born into the wrong family, you may never know of the track at all. If we are serious about expecting people to move toward new opportunities, we must also be serious about removing the barriers that stand in their way. This is why the saga of South Chicago may constitute a strong case for nationwide funding of grade schools and high schools. Why should children be trapped just because they grow up in a city with an eroding tax base? It would be easier to recommend “people, not places” if people in all places had a fairer chance at the start. In addition to federal funding for education, another step in the right direction would be a kind of reverse Individual Retirement Account, through which people could borrow to cover the cost of higher education or specialized training, and then repay a percentage of their future lifetime earnings through the IRS.

Those are some of the hard truths about choosing mobility. No doubt there are many more. But the benefits include a number of things that most Americans value in their society, from its openness to its abundance. Do we really have to believe that the costs have finally become too high and it is time to settle down and be a mature, settled nation, like England or France? We are less likely to do so if we are honest with ourselves about our past.

Few people understand this better than Donald Hicks. Hicks grew up in South Bend, Indiana; in December of 1963, on the day the Studebaker plant closed, he was in high school. The city responded to the closing with a campaign to talk up South Bend and attract new business. Hicks watched his friends’ families move away.

In 1980, when he was thirty-three years old, Hicks took a leave from his job at the University of Texas at Dallas and came to Washington. There he became the principal author of Jimmy Carter’s Urban America report. A few weeks after the report was released, with the rain of national outrage still falling upon him, Hicks was summoned with one of his superiors to a “retreat” in Atlantic City. There he was to speak with Brendan Byrne, then the governor of New Jersey, and other state officials. After a congenial dinner Hicks was offered up for dessert.

“The questions were heated,” he said recently. “It had touched a nerve. People felt we were attacking something that was very dear to them. What was threatened was their conception of community.” One man told Hicks that his grandmother was buried in his hometown in New Jersey and that he could never bear to leave.

“I was being told all this by people whose last names would have been very much at home in Florence or Milan,” Hicks said. “They asked me how I dared to tear people away from the roots they had established. I wasn’t cheeky enough to say anything then, but it stayed with me. So many of their families had decided to migrate, and they’d found a better life. Now they were angry at the advice that America help others do the same.” □