I commend Ian Frazier ("On the Rez," December Atlantic) for telling the story of SuAnne Big Crow. We all need heroes who lift us up by showing us what we can be.
But Frazier's statement "If the Iroquois hadn't resisted the French in the 1600s, the Northeast would be speaking French today; if the Comanche hadn't opposed the Spanish, the American Southwest would now be Mexico" is truly absurd. The French alienated the Iroquois, who became Britain's most important Indian allies. Their control of the Mohawk Valley was an important bulwark against French incursion, but that is it. They did not block the Lake Champlain route that the French used to threaten Massachusetts Colony; and a century later, fighting with the British against the American Colonists, they failed to take Fort Stanwix, much less Boston. In the 1600s the French did inspire raids like the one on Deerfield, but they lacked the resources and the Iroquois lacked the will to form an army that could have overrun New England.
The statement about the Comanche is even more absurd. Like the Apache and the Ute, the Comanche were formidable enemies of Spain. However, every bit of Comanche land passed from Spain to Mexico. The Mexicans lost it in the Texas Revolt and the Mexican War. Both the Iroquois and the Comanche fought heroically for their land and culture, but their struggles in no way produced the results Frazier attributes to them.
Frazier says, "I want to be an uncaught Indian like them," citing Powhatan, Joseph Brant, Red Cloud, and Sitting Bull. But these men were well grounded in their culture and expressed nationalistic preferences to remain so. His fundamental error is the statement "Indians were the original 'free'; early America was European culture reset in an Indian frame." The Indians were never "free" in that way. Like other tribal peoples, they remained bound to their clan, strictly observing the folkways of their ancestors. This precluded the development of a common language and fostered bitter tribal rivalry that made each tribe a neat bite for European expansion. (In our headlong quest for "diversity" we should remember what the lack of a common language cost the Indians.)
The individual freedom we cherish has everything to do with America but almost nothing to do with "the Indian." In 1517, not thirty years after Columbus discovered the New World, Martin Luther challenged the Catholic Church's control over men's souls. By 1776 this challenge had produced a philosophy of the individual naturally free from and equal to all other individuals. This philosophy was European, but Europe had no place for it to take root in the real world.
For several reasons it took root in Britain's American Colonies. First and foremost, the Colonies were distant from the British Crown, and by granting a charter to the first Colonies the King limited his power over them. From Jamestown and Plymouth to the closing of the frontier "the West" has been a place where government was distant and limited. As such, it has always been identified with freedom by whites. In colonial Britain owning land was the criterion for political rights. Cheap land in America transformed its society without changing the premise on which that society was based, simply by vastly increasing the percentage of landowners.
On the frontier, however, a rough egalitarian meritocracy evolved. Here men lived by their abilities or they did not live. For nearly three centuries European Americans went west in search of wealth and freedom. Although the Indians from Squanto onward facilitated the white man's move west, white men did not move west to become Indians. They moved west to practice the freedom promised by their own culture but denied to them because they lacked the wealth to claim it in the East. Thus Frazier's "freedom" belongs not to the Indian whose culture bound him as securely as any European serf but to the white and sometimes black mavericks who slipped their bonds to live as they chose just beyond the reach of government.
After the frontier closed, Hollywood kept its spirit of inclusive egalitarianism alive long enough for America to complete the long uphill passage from the Charters of Jamestown and Plymouth to the 1964 Civil Rights Act. That act finally made law of the ideal. "We hold these truths to be self-evident, that all men are created equal ..." However, that achievement may not stand. The Oglala nationalism evident in Frazier's writing is a troubling indication that those who most need opportunity no longer believe in America as a place where anyone can make it because individual will and skill trump all the arbitrary factors of race, birth, religion, and so forth. "The Rez" is about the size of Kosovo, so who is to say that America can't be balkanized if people no longer believe that the whole is greater than the sum of its parts?
Harry E. Beemer
No one with a shred of common sense, education, and ability to think logically can possibly avoid seeing Indian reservations for what they really are: internment camps for an overrun enemy. Part of our sad history as a country is that as we grew and prospered, we perpetrated atrocities on the native peoples of what is now Mexico and the United States. (Of course, our national immigration policy, our southern border, and the NAFTA-inspired "back flow" of the work-seeking peoples of Mexico are threatening a de facto repossession of vast tracts of the southwestern and southeastern United States.)
The solution to the "Indian problem" is not more reservations but fewer, along with the abolition of the Bureau of Indian Affairs. Such institutions have a vested interest in maintaining the politically correct balkanization of American culture and people. People of Indian extraction will share in our national prosperity only by becoming a part of American culture and working the system to their personal advantage, as other people of color have done. What they worship or smoke, or how they dress at home, is their business; they'll never succeed, though, if they insist on trying to change the mainstream American culture of the twenty-first century with militant insistence on public (that is, school, government, and workplace) accommodations and displays of eccentric or illegal ethnocentric behaviors not shared or valued by the majority of the American population. Reservations are not the way to achieve this, nor are other government-sanctioned "separate but equal" programs; they are doomed to failure absent a complete, naturally evolved (not government- or media-enforced) change in mainstream American culture and its accepted mores.
What a beautiful, inspiring, wonderfully brave act SuAnne Big Crow performed. She is truly an American hero.
F. F. Nakovic
The great Jewish fear in the Holocaust was that the Jewish people would be "vanished." Somehow the Oglala Sioux survive; Frazier helps me understand how. Not many magazines want to help us know these things.
Ian Frazier replies:
The fierceness of Iroquois resistance prevented the French from expanding their influence southward in the early days of New France, just as the Comanche interfered with the northward expansion of New Spain. Had those two European powers been able to establish themselves more securely and deeply in North America, history might have turned out quite differently. As for Harry Beemer's claim that the Indian's culture "bound him as securely as any European serf," we are of course all bound by our culture. In fact, from a European point of view, Indians made very unsatisfactory serfs, as would-be masters kept finding out to their frustration. To many freedom-loving people, Indians seemed less bound by place, occupation, hierarchy, and material obligations than most Europeans were. The idea of Native American freedom was certainly an important part of the allure of this continent in Europeans' eyes, as many writings of the frontier era make clear.
Allan Hull's letter speaks in the age-old voice of termination and assimilation, policies that have never worked half as well as the confidence in the voice seems to promise. Neither the government nor the media made America into a diverse place -- it just is one, as it was when only Indians lived here, as it was when almost all its believing white people were Protestants of innumerable denominations and sects. America has survived and prospered as a diverse nation, and we should not take out our unjustified fears of balkanization on Indian tribes.
After reading Bruce Katz and Jennifer Bradley's "Divided We Sprawl" (December Atlantic) it occurred to me that if Al Gore and the authors of this article have their way, it could be the end of the American Dream: the average American family could no longer aspire to a nice single-family house on a decent-sized lot in a safe neighborhood. Such a thing would be available only to the well-off. My parents live in a nice house on a nice lot in the suburbs. My wife and I have built three new houses at the edge of the built-up areas of the suburbs, but my children and grandchildren may not be allowed to do the same. Instead they may have to settle for a small townhouse or apartment in a congested area. Within built-up communities that are desirable and safe, land is very expensive, so only the well-off can afford a house, even on a small lot. Restricting expansion into undeveloped land at the edge of the suburbs would drive up the cost of desirable buildable land and of existing houses in established areas. Although Bruce Katz and Jennifer Bradley may prefer living in a small apartment near public transportation and other urban amenities, most Americans would rather have their own houses, even if it means putting up with the problems of suburban sprawl.
Floyd Dunn

Bruce Katz and Jennifer Bradley reply:
Floyd Dunn's response represents a common caricature of metropolitan thinking. Doubtless other people share these views and believe that they themselves benefit, on balance, from sprawl. Yet that is in part because they're not paying the full cost of growth and are generally able to distance themselves from the burdens of poverty and distress in other communities. Metropolitanism is about leveling the playing field between older communities, where we have billions of dollars in public and private investment (and major social challenges), and those communities that are springing up overnight on the suburban fringe. In addition, metropolitanism is not about substituting high-rise apartments in urban neighborhoods for freestanding homes on suburban lots. That is a false choice that ignores the substantial amount of vacant land that still exists in most metropolitan areas and that can accommodate the building of new single-family homes (as well as other housing) for decades to come.
Let us assume, hypothetically, that Charles Morris ("The Health-Care Economy Is Nothing to Fear," December Atlantic) is correct and that the national economy could afford to have 25 percent of our gross domestic product go to health care. America still faces a series of agonizing health-related issues.
Consider: The same dynamics that take health care to 25 percent of our GDP are likely to take it to 30 percent and continue on up. Health-care costs have grown at an average of more than twice the rate of inflation for the past thirty-five years, and such a rate is obviously unsustainable. Sooner or later we must recognize that health care in a technological, aging society has the power to absorb our entire GDP unless we realistically confront what we can afford and what we can't afford. We need to start that dialogue now.
Consider: Twenty-five percent of the GDP would be likely to give us a Taj Mahal medical system surrounded by inadequate roads, infrastructure, schools, parks, and so on. Eighty percent of U.S. health care is paid for by third-party payers (government or insurance), and any system that allows us to explore our real and imaginary diseases without economic pain is unsustainable. Americans clearly want more health care as consumers than they are willing to pay for as taxpayers. Health care cannot be allowed to crowd out all the other needs of a modern society.
Consider: From all we know about the health of societies, health care is not the best way to buy health for a country. Health care and health are not synonymous. The U.S. Department of Health and Human Services points out that of the thirty years of life expectancy we added in the past century, only five were due to allopathic medicine. Historically the great enemy of death and disease has been public health, not medicine. This continues to be true: the biggest threats to our national health today are smoking, drugs, alcohol, bad habits, and bad lifestyles, not inadequate health-care funding.
Consider: Taxpayers currently fund approximately 50 percent of U.S. health care, for the elderly, the disabled, and people on welfare. Spending 25 percent of the GDP on health care would dramatically and unacceptably raise taxes and transfer payments. Under present funding patterns 25 percent would dramatically increase taxes on the young for the benefit of the elderly. We are living longer and having fewer children and must ask, Is it wise to socialize the costs of growing old and charge today's workers, many of whom have no or inadequate health insurance themselves?
Consider: Health-care spending does not exist in a vacuum but competes with other important needs of an aging society, such as income security, long-term care, meals on wheels, respite care, and senior-citizen centers. Most of us would rather buy a decent quality of life than buy the last bit of biological life.
Do we really want to almost double the funding of a system whose important health indicators have fallen further behind those of other developed nations? Study after study has suggested that we have too many specialists, too many hospital beds, too much duplication and redundancy in medical technology, and too much defensive medicine. Our doctors earn a far greater multiple of the average per capita income than doctors in any other country, and to our shame we leave 44 million Americans without health insurance. Polls show that for all our medical miracles, no other developed country would trade its health-care system for ours. George Schieber, an international expert, has observed, "In comparison with other major industrial countries, health care in the United States costs more per person and per unit of service, is less accessible to a larger portion of its citizens, is provided at a more intensive level and offers comparatively poor gross outcomes."
Charles Morris is articulate and thoughtful, but his arguments only reinforce the status quo in U.S. health care and delay consideration of the hard issues that this nation faces. How do we cover those 44 million Americans for basic health care? Do we really want to give our present system another 11 percent of GDP? What should we pay for collectively and what individually? From a public-policy perspective, America's health-care system is technically brilliant but socially inadequate. America denies more health care to more people than any other developed country. Our aging bodies have the potential to bankrupt our children and grandchildren. We can do a lot but not everything. We face, inevitably, a series of Hobson's choices as we retire the Baby Boomers. The sooner we face those choices the better.
Richard D. Lamm
Increased health-care spending provides a better quality of life for some people. (Just ask my post-operative cataract patients.) However, high levels of spending have failed to affect high urban infant-mortality rates and have not changed overall life expectancy. "Evidence-based medicine" at Group Health of Puget Sound and other closed-panel HMOs is the exception. In fact half the heart-attack patients leaving a "high-quality" (high-cost) cardiac-care unit do not get the beta-blockers that are known to prevent second heart attacks. How does more spending solve a problem like this? Charles Morris advocates a "health-care-based economy," with a shift "toward the public and nonprofit sectors" that could consume 25 percent of GDP. He proposes funding this change by having people "retiring later, working harder, liquidating assets, and borrowing against their houses" and with "steeper payroll taxes." It is simple economics and accounting: using taxes and liquidated assets to fund this health-topia requires that spending decrease in other areas. Should we spend less on schools, environmental cleanup, or magazine subscriptions?
Currently, efficiencies in health care rarely result in lower costs to consumers. Hospitals dominate most markets and are able to avoid price competition. I propose that the health-care sector, instead of taking a growing share of GDP, increase productivity by adopting efficiencies and market mechanisms found in other parts of the economy.
Stephen E. Kraft, M.D.
I am appalled that someone claiming to write an authoritative article on health care in the United States can ignore the basic immoral fact that 44.3 million Americans have no health insurance at all -- in effect are rationed out of the health-care system. Everyone in Canada has access to health care, everyone in the United Kingdom has access to health care, everyone in Japan has access to health care -- and the list goes on.
In 1998 the Organization for Economic Cooperation and Development assembled a list of those countries that make 100 percent of their populations eligible for government-provided office and hospital coverage: Australia, Canada, the Czech Republic, Denmark, Finland, Greece, Iceland, Ireland, Italy, Japan, Korea, Luxembourg, New Zealand, Norway, Portugal, Sweden, Switzerland, and the United Kingdom. Countries providing coverage to more than 90 percent of the population include Austria (99 percent), Belgium (99 percent), France (99.5 percent), Germany (92.2 percent), Hungary (99 percent), and Spain (99.8 percent).
Morris is sanguine about the escalating health costs in the United States, saying that we can well afford them. Did he consult with businessmen alarmed over the rising costs of providing health insurance to their employees? Does he have some special insight into the political views of Congress, which has voted down all suggestions for universal health care since the days of Teddy Roosevelt? Professor Uwe Reinhardt has repeatedly pointed out that Americans already pay the total cost of the health-care system. It is not a "tax" but a distributed financial burden that ultimately falls on the public, and the public does not seem willing to tax itself further for the benefit of the large uninsured population in this country.
In response to the Commerce Department's assertion that "the United States could save about 4 percent of GDP" through health reform, the economist Sherry Glied asks, "What do they want us to spend it on?" The answer, of course, is universal access to health care.
Orville C. Green III, M.D.
Thank you for the timely and stimulating article "The Health-Care Economy Is Nothing to Fear." Charles Morris made a compelling case for medical technology but pointed out that technological innovation may increase the demand for a service faster than it reduces cost per service. When this happens, the total cost for that service increases. I wish to add, however, that technological innovation may also reduce costs for a service faster than it increases demand for that service. When this happens, the total cost for that service decreases. As competition among medical service providers increases, technological innovation by computer engineers will eventually reduce the total health-care bill for our nation.
I also wish to take issue with the argument that escalating health-care costs are not to be feared. As Paul Kennedy's analysis of escalating military spending in The Rise and Fall of the Great Powers suggests, when spending grows in one area of the economy, it usually leads to diminished savings and restricted economic growth. For example, food, housing, and medical services each account for about 15 percent of GDP. If Americans were willing to reduce spending on housing and food by a third, health-care costs would reach 25 percent of GDP without any economic impact. How many owners of $300,000 homes would be willing to move into $200,000 homes? How many households would be willing to reduce their $600-a-month grocery bills by $200 a month? My guess is not very many. Most would tap into their savings rather than reduce their standard of living.
James E. Gover
Charles Morris replies:
Richard Lamm assumes that I am correct that we can afford to have 25 percent of GDP go to health care, but he then proceeds to argue that this will impoverish us. Back in the 1960s economists said it would be "intolerable" for health care to consume more than 10 percent of GDP. But changes on this scale happen all the time over twenty-year periods. Who twenty years ago would have predicted the enormous sector of the economy devoted to personal computers and video games? And why blame health care, rather than, say, the taste for huge, very expensive, gas-guzzling new cars, for deficiencies in highways and parks? He ends with a cry that "we" should stop the health-care juggernaut. But that train has left the station. The sooner experienced, thoughtful policy analysts like Mr. Lamm face up to it, the better.
Stephen Kraft suggests in his letter that I "advocate" a system of health care that consumes 25 percent of GDP. Not true. I simply forecast it. If spending were rising because of price gouging, it would be a relatively easy problem to fix. Instead prices are probably falling, but spending is going up because new technology is creating many more useful health-care interventions that people want. That fact, coupled with Boomer aging, imparts an irresistible momentum to the continued expansion of the health-care sector. If nothing else, it underlines the importance of expanding the reach of government-supported insurance programs.
I'm afraid that Orville Green read the article he wanted to read, not the one I wrote. He is "appalled" that I ignored the large numbers of uninsured Americans. In fact I said that problem is "scandalous." And he takes Sherry Glied's quote wrenchingly out of context. She is a strong advocate of universal health-care insurance, and was arguing that health-care spending is at least as productive as buying imported video gadgets. As I suggested in the article, the rising standard of basic health care will force the expansion of redistributive, government-subsidized insurance programs, as more and more working families and their employers become unable to pay for insurance on their own. Congress, in fact, has been increasing taxes to pay for health care for more than thirty years, although protesting all the time.
James Gover is surely right that technology will occasionally reduce health-care outlays (note the disappearance of major stomach surgery for ulcers), but in most industries lower costs and better outcomes typically expand markets and increase spending. The economics of hip replacements and personal computers are much the same. And it's just not true that growth in health-care spending will mean cutting back on housing and other consumption goods. If Professor Gover does the math, he'll see that even with economic growth of only one percent or so for the next two decades, health care can grow to 20-25 percent of GDP and the rest of the economy will still grow. There's no plausible scenario that says that as a nation we will have to cut back on other necessities to afford more health care; we just can't increase our spending on other items as fast.
In his review of Ian Hacking's The Social Construction of What? ("Phony Science Wars," November Atlantic), Richard Rorty challenges the orthodox scientific view that "reality has an intrinsic structure that science accurately describes" and defends the alternative view put forward by Thomas Kuhn in The Structure of Scientific Revolutions: "Science might have done as good a job if it had never come up with either quarks or genes.... In this view, scientific theories are tools that do a job. They do it well, but some other tools might perhaps have done the same job equally well." Rorty writes that scientists find this view absurd (Hacking, he says, is "dubious" about it), but doesn't explain why -- nor does Hacking in The Social Construction of What? Let me try to do so.
The reasons have to do with the structure of scientific theories and the way they relate to experience. To understand them we don't need to consider anything as recondite as quarks. Euclid's (and Newton's) theory of physical space, Euclidean geometry, will do just as well. Near the end of the nineteenth century it became clear that Euclidean geometry, considered as a theory of physical space, has two distinct parts: a formal part, which is independent of both experience and any natural language; and an interpretation, which connects the formal part to experience. The formal part consists of abstract geometric definitions, theorems, and proofs, together with abstract logical rules and definitions. In this formal part terms such as "point," "line," and "between" -- and even logical terms, like "and," "or," and "not" -- have no assigned meanings. They do have definitions, but these are not like ordinary dictionary definitions. They tell us not what the terms mean but rather how we are allowed to use them. The intuitive meanings are just mnemonic devices -- concessions to the fact that human brains did not evolve to deal with abstract structures.
The interpretive part of the theory tells a carpenter or a surveyor how to use and test the formal structure -- what, concretely, is meant by such terms as "triangle," "angle," "right angle," "two," and "sum" -- and how to translate formal theorems, represented by meaningless strings of symbols, into testable statements. Unlike the formal part, the interpretive part relies on a natural language. But any natural language will do. And nothing of scientific importance is lost in translation from one natural language into another.
Even today Euclidean geometry fits the experience of carpenters and surveyors perfectly. Until well into the twentieth century it also fit the experience of astronomers perfectly. However, Einstein's theory of physical spacetime makes predictions that differ very slightly from those of Euclidean geometry, and delicate astronomical observations from 1919 onward have shown that deviations from the predictions of Newton's theory do indeed exist and are identical with those predicted by Einstein's theory.
Now we come to a key part of the argument: Although the formal part of Einstein's theory differs profoundly from the formal part of Newton's theory, it nevertheless contains it as a limiting case, approximately valid under specific conditions (like those of interest to carpenters and surveyors). Thus Einstein's theory did not displace Newton's; it swallowed it whole.
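One standard illustration of this limiting-case relationship (an example in modern notation, not drawn from the letter itself) is the relativistic energy of a particle, whose low-velocity expansion contains Newton's kinetic energy as its leading correction:

```latex
% Relativistic energy of a particle of mass m moving at speed v:
E \;=\; \frac{m c^{2}}{\sqrt{1 - v^{2}/c^{2}}}
\;=\; m c^{2}
\;+\; \underbrace{\tfrac{1}{2}\, m v^{2}}_{\text{Newtonian kinetic energy}}
\;+\; \tfrac{3}{8}\,\frac{m v^{4}}{c^{2}} \;+\; \cdots
```

As v/c approaches zero the correction terms vanish, so Newton's expression survives intact inside Einstein's formalism, approximately valid whenever speeds are small compared with that of light.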
Is Einstein's theory the final word? Most physicists think not. But they expect that the formal part of any theory that supersedes Einstein's will contain the formal part of Einstein's theory as a limiting case, approximately valid under specific conditions. It will not displace but swallow Einstein's theory. Even so, physicists do not expect the new theory to resemble Einstein's theory, any more than Einstein's theory resembled Newton's.
These remarks apply to other physical theories -- and, indeed, to the whole tightly interwoven theoretical fabric of natural science. When a strongly confirmed theory like Newton's is displaced by an even more strongly confirmed theory like Einstein's, the old theory lives on as a limiting case of the new theory. The domain of the new theory always includes that of the old theory (Einstein's theory, like Newton's, applies to bounded physical systems like stars and galaxies, but unlike Newton's theory, it also applies to an unbounded system, the universe), even when the two theories are radically different; and the new theory is always at least as accurate as the old theory within their common domain. There seems to be no logical reason why these historical generalizations should hold, but from the time of Archimedes to the present they always have.

Moreover, as scientific theories have laid claim to broader and broader domains, they have also become increasingly unified. Maxwell's theory of electromagnetism united previously separate theories of electricity, magnetism, and light (and in the process predicted a new phenomenon: radio waves); quantum physics united physics, chemistry, and biology. Again, there seems to be no logical reason why scientific theories should repeatedly merge in this way. (Elsewhere -- in philosophy, for example -- splitting is the rule.) Unless ... unless successive versions of the formal part of natural science -- the part that is independent of natural language and hence of human culture -- are progressively less incomplete views of an ultimate formal structure. This suggestion is, of course, an unprovable speculation; but it accounts for otherwise unexplained scientific and historical facts in a simple and natural way.
Until Richard Rorty can give us some idea of what theories that dispense with quarks and genes (but account equally well for the experimental evidence that led scientists to posit them) might be like, the orthodox view will continue to seem more plausible than its alternative -- that "science might have done as good a job if it had never come up with either quarks or genes."
David Layzer

Richard Rorty replies:
David Layzer's letter lucidly outlines the position that philosophers of science call realism -- a position centering on the claim that successive scientific theories provide "progressively less incomplete views of an ultimate formal structure." Thomas Kuhn and other critics of realism regard the idea that science is gradually closing in on what is really out there as an unhelpful metaphor -- intuitively attractive, picturesque, but non-explanatory. The debate between those who share Kuhn's dismissive attitude toward realism and those who think that the realist's "unprovable speculation" (to use Layzer's words) is essential to science's self-image is central to contemporary philosophy of science.
Re Leonard J. Leff's excellent piece "Gone With the Wind and Hollywood's Racial Politics" (December Atlantic):
In 1979, forty years after the movie opened, the TV producer George Schlatter staged a sort of second edition of his hugely successful Laugh-In, featuring thirteen quite unknown young comedy actors. One memorable skit was a bit from Gone With the Wind. A worried and distracted Scarlett was pacing up and down in Tara's ornate living room, wringing her hands as she waited for Rhett Butler to make an appearance. Finally his booming voice could be heard just offstage: "Scarlett!" She clasped her hands together and cried, "Rhett, my darling!" And in strode the manly, handsome Rhett Butler -- who was black. The part was being played by a fine young comedian named Ben Powers.
As a publicist of some twenty-five years in Hollywood, I have rarely heard a studio audience crack up the way this one did. David Selznick would have given a huge sigh of relief.
Daniel A. Jenkins
Two of the letters in the January Atlantic in response to "The Mystique of Betty Friedan," by Alan Wolfe (September Atlantic), were thoughtful critiques supporting Friedan's work. The third gave exposure to the seventeen-year-old grudge of someone whose hand didn't get shaken at a speaking engagement. The circumstances showed the airport-to-podium schedule to have been very tight. Untold thousands must have been similarly "snubbed" when Jesus, Gandhi, and Martin Luther King Jr. brushed past. (Jesus might have humbled the self-important department chair on purpose!)
Daphne Dunn Wilson
A photograph on page 70 of the January, 2000, issue was incorrectly captioned, because of an error by the photo-stock agency. The rabbi shown in the picture is Rabbi Mordechai Eliyahu.
The Atlantic Monthly; March 2000; Letters to the Editor - 00.03; Volume 285, No. 3; page 6-13.