On June 13, 1944, a few days after the 90th Infantry Division went into action against the Germans in Normandy under the command of Brigadier General Jay MacKelvie, MacKelvie’s superior officer, Major General J. Lawton Collins, went forward on foot to check on the division. “We could locate no regimental or battalion headquarters,” he recalled with dismay. “No shelling was going on, nor any fighting that we could observe.” This was an ominous sign, as the Battle of Normandy was far from decided, and the Wehrmacht was still trying to push the Americans, British, and Canadians, who had landed a week earlier, back into the sea.
Just a day earlier, the 90th’s assistant division commander, Brigadier General “Hanging Sam” Williams, had also been looking for the leader of his green division. He’d found MacKelvie sheltering from enemy fire, huddled in a drainage ditch along the base of a hedgerow. “Goddamn it, General, you can’t lead this division hiding in that goddamn hole,” Williams shouted. “Go back to the [command post]. Get the hell out of that hole and go to your vehicle. Walk to it, or you’ll have this goddamn division wading in the English Channel.” The message did not take. The division remained bogged down, veering close to passivity.
American troops were fighting to stay alive—no small feat in that summer’s bloody combat. One infantry company in the 90th began a day in July with 142 men and finished it with 32. Its battalion commander walked around babbling “I killed K Company, I killed K Company.” Later that summer, one of the 90th’s battalions, with 265 soldiers, surrendered to a German patrol of 50 men and two tanks. In six weeks of small advances, the division would use up all its infantrymen, requesting replacements totaling more than 100 percent of its infantry strength.
General Collins removed MacKelvie on the very same day that his tour revealed no fighting in progress. Collins instructed the 90th’s new commander, Major General Eugene Landrum, to fire the commanders of two of the division’s three regiments. One of those two, the West Point graduate Colonel P. D. Ginder, was considered by many to be a disaster. One man, a mortar forward observer, remembered that Ginder “almost constantly made the wrong decisions.” He had been in command of his regiment for less than a month when he was replaced.
MacKelvie’s successor, Landrum, was given a few weeks to prove he was an able commander, but by midsummer he too was judged to be wanting. Before he was relieved, Landrum fired the assistant division commander he had inherited, Sam Williams, with whom he had clashed. “I feel that a general officer of a more optimistic and calming attitude would be more beneficial to this division at this time,” Landrum wrote. General Omar Bradley, the senior American general in France at the time, concurred. He topped off the dismissal by demoting Williams to colonel.
Within a few weeks, Bradley relieved Landrum as well, and sent Brigadier General Raymond McLain, whom he had brought from Italy to England to have on tap as a replacement when someone was fired, to take over the 90th Infantry Division. “We’re going to make that division go, if we’ve got to can every senior officer in it,” Bradley told him. Two days later, McLain gave him a list of 16 field-grade officers he wanted out of the division.
The swift reliefs of World War II were not precise, and while many made way for more-capable commanders, some were clearly the wrong move. Nonetheless, their cumulative effect was striking. The 90th Division, for instance, improved radically—transforming from a problem division that First Army staff wanted to break up, into “one of the most outstanding [divisions] in the European Theater,” as Bradley later wrote. Retired Army Colonel Henry Gole, in his analysis of the 90th Division, directly credits the policy of fast relief:
Because incompetent commanders were fired and replaced by quality men at division and regiment, and because the junior officers of 1944 [who were] good at war … rose to command battalions in a Darwinian process, the division became an effective fighting force.
Generalship in combat is extraordinarily difficult, and many seasoned officers fail at it. During World War II, senior American commanders typically were given a few months to succeed, or they’d be replaced. Sixteen out of the 155 officers who commanded Army divisions in combat were relieved for cause, along with at least five corps commanders.
Since 9/11, the armed forces have played a central role in our national affairs, waging two long wars—each considerably longer than America’s involvement in World War II. Yet a major change in how our military operates has gone almost unnoticed. Relief of generals has become so rare that, as Lieutenant Colonel Paul Yingling noted during the Iraq War, a private who loses his rifle is now punished more severely than a general who loses his part of a war. In the wars of the past decade, hundreds of Army generals were deployed to the field, and the available evidence indicates that not one was relieved by the military brass for combat ineffectiveness. This change is arguably one of the most significant developments in our recent military history—and an important factor in the failure of our wars in Afghanistan and Iraq.
To a shocking degree, the Army’s leadership ranks have become populated by mediocre officers, placed in positions where they are likely to fail. Success goes unrewarded, and everything but the most extreme failure goes unpunished, creating a perverse incentive system that drives leaders toward a risk-averse middle where they are more likely to find stalemate than victory. A few high-profile successes, such as those of General David Petraeus in Iraq, may temporarily mask this systemic problem, but they do not solve it.
Ironically, our generals have grown worse as they have been lionized more and more by a society now reflexively deferential to the military. The Bush administration has been roundly (and fairly) criticized for its delusive approach to the war in Iraq and its neglect of the war in Afghanistan. Yet the serious failures of our military leaders in these conflicts have escaped almost all notice. No one is pushing those leaders to step back and examine the shortcomings of their institution. These are dangerous developments. Unaddressed, they could lead to further failures in future wars.
Generals are born, and generals are made. The promotion from colonel to brigadier (or one-star) general is one of the largest psychological leaps an officer can take. It is richly symbolic: the promoted officer removes from his or her uniform the insignia of an Army branch (the crossed rifles of infantry, for example, or the tiny triple-turreted castle of engineers) and puts on a single star. As brigadier generals, the newly promoted officers are informed that they no longer represent a part of the Army, but now are stewards of the entire service. They are expected to coordinate and control multiple branches, such as artillery, cavalry, and engineers—that is, to become generalists.
These people are given powers we accord to few: to save and take lives; to advise presidents on our most fundamental national issues; to shape their own institution by deciding how to select and groom their successors.
During World War II, top officials expected some generals to fail in combat, and were prepared to remove them when they did. The personalities of these generals mattered enormously, and the Army’s chief of staff, George C. Marshall, worked hard to find the right men for the jobs at hand. When some officers did not work out, they were removed quickly—but many were given another chance, in a different job. (Ginder, Landrum, and Williams were all given second chances, for instance—and all, to varying extents, redeemed themselves.) This hard-nosed but flexible system created a strong military, not only because the most competent were allowed to rise quickly, but also because people could learn from mistakes before the results became crippling, and because officers could find the right fit for their particular abilities.
In World War II, the firing of a general was seen as a sign that the system was working as planned. Yet now, in the rare instances when it does occur, relief tends to be seen, especially inside the Army, as a sign that the system has somehow failed. Only one high-profile relief occurred during the American invasion of Iraq, and the officer removed was not a general but a Marine colonel. Relief has become so unusual that even this firing made front-page news.
How did this transformation occur? Why has relief become so rare, and our military leadership ranks so sclerotic? The nature of the wars the nation has fought since World War II is one reason. Korea, Vietnam, and Iraq were all small, ambiguous, increasingly unpopular wars, and in each, success was harder to define than it was in World War II. Firing generals seemed to send a signal to the public that the war was going poorly.
But that is only a partial explanation. Changes in our broader society are also to blame. During the 1950s, the military, like much of the nation, became more “corporate”—less tolerant of the maverick and more likely to favor conformist “organization men.” As a large, bureaucratized national-security establishment developed to wage the Cold War, the nation’s generals also began acting less like stewards of a profession, responsible to the public at large, and more like members of a guild, looking out primarily for their own interests.
In Vietnam, the consequences of this shift in Army practices became painfully evident. Almost no generals were fired in that war, and those few who were removed were only the top men, ousted by civilian leaders in Washington—generals did not fire other generals. Not coincidentally, appropriate risk-taking diminished (the art of combat pursuit was almost lost in Vietnam), and a “cover your ass” mentality took hold.
These corrosive tendencies were reinforced by a new policy of officer rotation after six months in command, which encouraged many leaders to simply keep their heads down until they could move on—and likewise encouraged superior officers to wait out the tours of bad officers serving beneath them. Instead of weeding out bad officers, senior leaders tended to closely supervise them, encouraging habits of micromanagement that plague the Army to this day. Mediocrity also led to mendacity: Almost forgotten now is that an Army investigation of the 1968 massacre of hundreds of Vietnamese villagers by troops of the 23rd “Americal” Division concluded that 28 officers, including four colonels and two generals, appeared to have committed offenses in covering up the incident. Even after the extent of the massacre and the subsequent cover-up were revealed, Major General Samuel Koster, who had commanded the Americal and who had been implicated in the cover-up, was allowed to remain in uniform for another 23 months, and was never brought to trial (although he was eventually demoted).
The Army famously rebuilt itself after the Vietnam War ended. It improved training; reequipped itself with a new array of tanks, helicopters, and armored vehicles; and, most significant, learned how to live without a draft, relying instead on a more professional “all volunteer” force. These developments, combined with a successful offensive in the 1991 Gulf War, led to a resurgence of American pride in the military, and a newfound veneration for military leaders.