We regret that Garrison Keillor's new book, from which last month's cover story, "Talk Radio," was drawn, was misidentified on the magazine's cover. The book is ; it was published last month by Viking Penguin.
From Ellen Ruppel Shell's article "Resurgence of a Deadly Disease" (August Atlantic) it is clear that "the international funding picture" for malaria is "dismal" indeed, and that the disease continues to wreck the lives of multitudes everywhere. But if the "roughly $85 million ... of public-sector funds" begrudged to it annually came to only "two cents for every reported case," then more than four billion people would be infected. Elsewhere Shell cites the World Health Organization's estimate that malaria "sickens as many as half a billion" -- a more plausible number, and dire enough.
Christopher S. Mazzara
Ellen Shell mentions the fruitless efforts to find a vaccine to prevent malaria and the near inevitability of contracting the disease in areas infested by the anopheles mosquito. She mentions the use of DDT during infestations in the Second World War, but she never mentions the use of atabrine.
During that war I was in New Guinea and several other South Pacific Islands for more than two years, and I never contracted malaria. Why not? I took atabrine tablets three times a day. Many men in my company contracted the disease, and it became common to say that atabrine did not help. The truth is that a great many of the soldiers refused to take atabrine, either because they did not believe in its effectiveness or because they did not like to take the pill. I believe that none of the men who religiously took the pill contracted the disease.
I would like to know why atabrine is not being used today where malaria is a problem. In time my skin became yellow, but that was better than contracting malaria.
Joseph L. Keller
The article on malaria left me with several questions. If Plasmodium is such a mutable parasite, greater success in combating malaria might be had by focusing on the insect vector of the disease. Yet the opinions of entomologists were not prominent in the article; the only entomological solution discussed was the failed, ecologically disastrous DDT campaign (and similar programs) of some forty years ago. Admittedly, entomology does not have the cachet of medical research, and entomologists do not have the political clout or charisma to elicit billions of dollars to investigate troublesome bugs. But surely there are ways other than insecticides to control or eliminate malarial mosquitoes.
For example, only females are blood feeders, while the males derive their nourishment exclusively from plants. Do the males of malarial species eat from only certain plants? If so, can these plants be eliminated? Are they repelled by any plants -- and, more critically, do the females also find these plants repellent? Perhaps one of these could serve as the basis for an earth-friendly mosquito repellent, one that could be developed into a lotion or even a soap. Or sterile males or females could be introduced en masse into the habitat to decrease the population of malarial mosquitoes. Since not every species of mosquito is capable of serving as a vector of malaria, large numbers of "malaria-resistant" mosquitoes or other insects could be released to compete with the malarial species in the same ecological niche without causing any decrease in the insectivore population that is currently feeding on malarial mosquitoes. Is there any method that could enhance the success of species that are now predators of malarial mosquitoes? A program that combined several earth-friendly strategies against mosquitoes might prove cheaper and more effective than the glamorous but elusive vaccine against the parasite.
Christopher Mazzara is correct. The accurate figure is close to twenty cents -- not two cents, as reported. My larger point, however, holds true: the amount of money spent on malaria cases worldwide is appallingly low.
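The correction above can be checked with simple arithmetic, using only the figures already cited in the article and the letters (roughly $85 million in annual public-sector funds, WHO's estimate of up to half a billion cases):

```python
# Sanity check of the figures in the exchange above; all numbers come from
# the article and letters, and this is just arithmetic, not new data.

public_funds = 85e6  # roughly $85 million per year in public-sector funds

# At two cents per reported case, the implied number of cases:
cases_at_two_cents = public_funds / 0.02
print(cases_at_two_cents)  # 4.25 billion cases -- implausibly high, as the letter notes

# Against WHO's estimate of half a billion cases, the actual per-case figure:
cents_per_case = public_funds / 500e6 * 100
print(cents_per_case)  # 17 cents -- close to the corrected "twenty cents"
```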
Atabrine, the brand name for quinacrine or mepacrine, was introduced in 1932 and was the first synthetic anti-malaria drug. It was, as Joseph Keller recalls, used in the Second World War when quinine was in short supply. (The Japanese, as I mentioned in my article, stockpiled quinine during the war.) The drug is no longer used to prevent malaria because of its toxic effects, which include gastrointestinal disorders, skin loss, and disturbances in the central nervous system. Prolonged use can result in blood disorders and psychological disturbances. The drug's most obvious side effect is a yellow discoloration of the skin and urine. Today other, equally successful anti-malaria drugs that do not cause such side effects are available.
Regarding Mary Knight's letter: According to Professor John Edman, an entomologist at the University of Massachusetts, "the sterile-male technique has been used successfully against screw worm flies and fruit flies and has been demonstrated in field studies to work against mosquitoes. However, mosquitoes are so much more numerous that a Herculean effort would be required to release enough sterile males over an entire continent like Africa." Such a plan "might be feasible on smaller islands, but would still be very expensive."
Of the two entomological options for malaria control currently being field-tested or researched with great vigor, one is the use of bed nets (which I discussed in detail in my article). The other, says Professor Edman, is "flashier and involves engineering mosquitoes to be, for example, resistant to the malaria parasite -- and then releasing them into the wild population. The practicality and sustainability of such an approach is hotly debated among entomologists and geneticists, but it is attracting a large share of the research money devoted to research on biological vectors."
Deplorable and baffling as the race-gender-class approach to literature is, I find equally baffling Frank Kermode's notion that literature can and should be read in a kind of neutral trance ("The Academy vs. the Humanities," August Atlantic). Kermode says nothing of admiration. While objecting to the political agenda of the likes of Fredric Jameson and Stanley Fish, critics who should be read with disgust, Kermode apparently believes that Ezra Pound should not be so treated simply because he was a poet. The fact is, however, that Pound was nothing if not political, and, indeed, insisted that his fascist, racist, anti-Semitic filth was inextricable from his art. How, then, read such a person without disgust and disagreement? By Pound's own admission his art and his perverted ideas were one. So long as a poet's agenda is anything but art for art's sake, devoid of ideas of any consequence, he should no more escape moral and intellectual scrutiny than should the one-note race-gender-class critics. To suppose that how something is said is more important than what is said is as absurd as the idea that "the ultimate purpose of all study is political." If Pound is to be ultimately judged by his art, it follows that Jameson and his ilk should be judged by their rhetoric. Although art should be judged primarily on artistic grounds, reason demands that one first determine whether the thing to be judged is art, propaganda, or just plain trash. Humanities teachers who can't or won't make such distinctions are not worthy of their calling. I suspect that had a sufficient number of English professors had the courage and insight to separate the wheat from the chaff, instead of simply teaching literature as literature in a safe, neutral vacuum, the race-gender-class crowd never would have gained a foothold. Whether Pound was a great poet is a matter of opinion. That much of his work is hateful, obscene propaganda is a matter of fact.
As such it is no more deserving of a place in the humanities than the mindless maundering of the devotees of political correctness.
Another look at what I wrote might persuade John Hendrickson that he is wildly wrong to say that I recommended reading without disagreement. It is true that I don't recommend instant disgust, which seems to be something he relishes. The point I was trying to make, which he was apparently too disgusted to grasp, is surely very elementary: in certain respects there will always be disagreement. If you read Lucretius, you will be considering views on physics and religion with which you cannot possibly agree; few find Milton's theology acceptable; and Dante can be in various ways ethically or politically distasteful. All three nevertheless remain great poets in the opinion of most who can read them. Hendrickson's comment could give the impression that I discussed Ezra Pound at length, possibly in some kind of a trance; let me point out that he was mentioned only in passing in my review. However, since the matter has been raised, I will certify that I, too, deplore Pound's fascism and anti-Semitism. Nevertheless, as it happens, I think Hendrickson quite wrong to suggest that everything Pound wrote should evoke principled disgust in all honest readers. He seems unable to disagree without feeling disgust, but I can hardly be expected to regard that disability as my problem.
In "The Computer Delusion" (July Atlantic), Todd Oppenheimer uses two anecdotal stories to support his argument that some educational software, simulation programs in particular, "may be of questionable relevance." To back up this statement he argues that simulations are simplified and "built on hidden assumptions."
Maxis -- the maker of the two programs he uses as examples -- would be the first to agree with his statement about simplification. Even the most complex simulations can't come close to matching the complexity of the real world. Simulations, though, are very accurate on a broad scale and give users an opportunity to observe and experiment in otherwise impossible ways.
A child can dissect a frog in a classroom to learn about its structure or take a field trip to observe a local habitat. With SimPark, a simulation, players learn about ecosystems by creating one. Players pick plants and animals to fill their park. Those who build viable food webs and take into account the local climate will succeed. If their parks fail -- perhaps species diversity was too low, or there weren't enough trees for birds to nest in -- players usually return to find out why, and keep experimenting until they get their parks just as they envisioned them.
Far from being software that "fosters passivity, ultimately dulling people's sense of what they can change in the world," as Oppenheimer states, simulations give kids a sense of power and a feeling that they can make a difference.
Most good educators realize that the best learning takes place in a multi-faceted environment, with a variety of powerful teaching materials. Technology-haters like Clifford Stoll reject computers for education because "no computer can teach what a walk through a pine forest feels like." Of course not, although I'd argue that the filmstrips and videos Stoll maligns so much do a better job of capturing the experience than a teacher lecturing at a blackboard. Nobody is arguing for replacing real experiences with computerized ones. But when it comes to a study of outer space or the Australian outback, the real thing isn't always easy to visit. That's where video footage, or an E-mail exchange with an astronaut or a group of Australian schoolchildren, can improve on what kids might learn from a print encyclopedia alone.
It's ludicrous to blame word processors for the failure of schoolchildren to link ideas or develop relationships in their writing, just as it makes no sense to reject Internet research because some of what young people find on the Net is garbage. In fact, these examples just illustrate that it is imperative for our schools to teach children (and teachers) to be discriminating citizens of the information age -- to ask good questions and refuse to be fooled by surface appearances.
Technology & Learning
San Francisco, Calif.
Todd Oppenheimer confuses the report McKinsey & Company produced for the National Information Infrastructure Advisory Council, "Connecting K-12 Schools to the Information Superhighway," with the report produced for the Clinton Administration by the council itself.
As its introduction clearly states, our report was independently prepared as a submission of information to the advisory council and as such does not represent the views or recommendations of the council or its members. We explicitly played the role of impartial fact-finder in conducting a detailed analysis of the economics of various options for deploying computer-based infrastructure in schools.
Oppenheimer dismisses our statement that the report "does not attempt to recommend specific public policy goals" or to "evaluate the relative merits of competing demands on educational funding." We deliberately did not take a stand on these issues in this report, although public policy and school-budgeting choices are clearly critical decisions. Instead our aim was to help the task force -- which was charged with making such policy recommendations -- to make better-informed decisions.
Oppenheimer also ignores our report's main conclusion: that with relatively modest increases in spending, a variety of educational benefits could be attained from the use of computers in classrooms -- provided that technology investments are closely tied to teacher professional development, clear educational performance goals, support for school administrators in planning and budgeting, and increased awareness among community leaders on how to help schools and hold them accountable for results.
The value of simulations like SimPark is questioned not only by me but also by several independent authorities on education and technology. The simplifications in the programs that Robin Harper describes should not be lightly dismissed. When children use simulations, the subtlety and unpredictability found in real life -- complications that can be honestly studied in real-world inquiries -- generally get distorted or ignored. This creates flawed experiments and bad lessons. That's what I and others mean by the dangers lying in simulations' hidden assumptions.
Judy Salpeter is partly right. Filmstrips, E-mail, the Internet, and even an occasional simulation can be helpful when used carefully and skeptically. Unfortunately, too many schools frantically gather all the machinery they can and then corrupt their curricula to squeeze it in. In truth, these technologies don't diminish the need for other study -- they intensify it, while crowding out time for it. Even good programs cause problems simply by accumulation. Technology creates an addiction to gear, as the demands for constant upgrades, maintenance, and training swallow ever larger amounts of time and money. This often sidelines old-fashioned but time-tested forms of study, particularly those that attend to human values over mechanical ones.
Michael Nevens and Margot Singer may call their document an impartial accumulation of facts. But the McKinsey report that the Clinton Administration received and circulated does not read that way. Each chapter emphasizes various ways to get more computers into classrooms, while virtually ignoring the many negative studies and problematic consequences involved.
I read with interest the article "Zero" by Dick Teresi, in the July Atlantic. Teresi raised many interesting points, among them one I hadn't thought about before, as I am not an astronomer or a historian -- namely, the difficulty that people in these two professions have in calculating years over the zero point.
I have trouble conceiving of a year named "zero." Sure, NASA launches rockets at zero, but what do they call the second immediately before or after the launch? The zero'th second? I don't think so. They probably call it the first second. Likewise, the odometer on Teresi's fine German automobile would register its first mile as mile one, not as the zero'th mile (or mile zero). Would a man introduce his second wife as his first wife, saying, "This is my new wife, June. She loves me so much -- unlike my zero'th wife, May"?
To my way of thinking, leaving things as they are makes the most sense. Zero is a void, nothingness; how can it be used to designate a period of time? The Maya thought that zero marked both the beginning and the completion of a cycle, not a segment of time. A line contains an infinite number of points. One of those points is the zero point, a mark of the beginning of the A.D. cycle and the completion of the B.C. cycle. Jesus can be found here, too, for he said, "I am the Alpha and the Omega."
Karl F. Meyer
Dick Teresi has hopelessly confounded cardinal and ordinal numbers, and his "problem" of the millennium dissolves when these concepts are properly understood.
Ordinal numbers, which the Gregorian calendar uses, indicate sequence. Thus "A.D. 1" (or the first year A.D.) refers to the year that begins at the zero point and ends one year later. Think of a carpenter's ruler, if you will; the first inch is the interval between the edge and the one-inch mark. Thus the millennium will end with the passing of the two-thousandth year, not with its inception.
Cardinal numbers, which astronomers use in their calculations, indicate quantity. Zero is a cardinal number and indicates a value; it does not name an interval. Thus "zero" indicates the division between B.C. and A.D., not the interval of the first year before or after this point. Continuing with our example, put two rulers end to end: although there is a zero point, there is no "zero'th" inch.
As it stands now, we refer to years with ordinal numbers and to ages with cardinal numbers. Thus a child less than a year old is usually said to be so many weeks or months old, rather than "zero years old." If we changed over to this system for our calendar (referring to the age of our era, rather than to the order of the year), then there would be "zero years" for both A.D. and B.C.! That is to say, the last twelve months before the birth of Christ and the first twelve months after the birth of Christ would be the years 0 B.C. and A.D. 0 respectively.
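Schultz's age-style scheme can be made concrete with a small sketch. The helper below is hypothetical (not from the letter); it simply takes the whole number of elapsed years, the way we state a person's age:

```python
import math

def era_label(t):
    """Label a moment t (in years relative to the zero point; negative means
    before it) using the age-style scheme described above. Hypothetical
    helper: the 'year' is just the whole number of elapsed years."""
    n = math.floor(abs(t))
    return f"A.D. {n}" if t >= 0 else f"{n} B.C."

print(era_label(0.5))   # A.D. 0  (first twelve months after the zero point)
print(era_label(-0.5))  # 0 B.C.  (last twelve months before it)
print(era_label(1.5))   # A.D. 1
```

Just as a child under a year old is "zero years old," the first twelve months of each era come out labeled zero.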
R. M. Schultz
Dick Teresi does a good job of explaining the history of the zero but a bad job of understanding the concept. Zero is a placeholder on a continuum. If I have no money but owe no money, you cannot describe this situation without a zero. You cannot say that I have $1 (+1) or that I owe $1 (-1).
The zero in counting time refers to the starting point. When I was born (or in some cultures when I was conceived) is the zero point. When I have lived 365 days, I am not zero years old. I have lived one year. I am one year old. To believe otherwise is to believe that zero equals one.
In the case of Jesus Christ (disregarding the four-year error made by Dionysius), the day he was born is zero. The day he was born, he began the first year of his life, not the zero year of his life. The year before he was born was the first year before Christ, not the zero year before Christ. Teresi's arguments about kings' reigns are sophistry, not mathematics.
Karl Meyer's logic -- that zero is "nothingness," and therefore cannot designate a period of time -- is classic Greek logic, which, however, is not consistent with sound mathematics.
Mr. Meyer rejects zero by subscribing to a time line in which zero is not assigned to any increment, just as in the Bede-Dionysius B.C. and A.D. numbering scheme. The increment (year) to the right of zero, Meyer says, is 1, and to the left is 1 B.C. (or -1).
Zero in this scheme becomes a numerical "continental divide," the numbers flowing east and west, as it were, and zero itself being deprived of its own increment. The flaw in this logic can best be seen by substituting another year for 0, say 1997, on his time line. Now all other years flow east and west. To the left of 1997 we number the increment 1996; to the right 1998. Voilà! I've made 1997 disappear. Using the above scheme we can "disappear" any year.
So where would we put a year 0? Mathematically, if one is to label increments on a number line (as is done on a calendar), the increments take on the number of the line to their left or right. Which you choose doesn't matter, but you must be consistent. Numbering the increments to the left, Jacques Cassini renumbered the calendar this way, renaming 1 B.C. year 0, 2 B.C. year -1, and so on.
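Cassini's renumbering is a simple shift, and the two directions of the conversion can be sketched in a few lines (the function names here are illustrative, not standard):

```python
def to_astronomical(year, era):
    """Cassini's renumbering: 1 B.C. -> 0, 2 B.C. -> -1, A.D. 1 -> 1."""
    if era == "AD":
        return year
    return -(year - 1)  # n B.C. maps to -(n - 1)

def from_astronomical(n):
    """Invert the mapping back to a (year, era) calendar label."""
    return (n, "AD") if n >= 1 else (1 - n, "BC")

assert to_astronomical(1, "BC") == 0    # 1 B.C. becomes year 0
assert to_astronomical(2, "BC") == -1   # 2 B.C. becomes year -1
assert from_astronomical(0) == (1, "BC")
```

With this shift, every integer -- zero included -- labels exactly one year, which is the consistency the letter's reply demands.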
The mistake Mr. Meyer has made is one of inconsistency. He switched his numbering system from right to left when he hit zero. Zero thus gets no increment. This is wrong. Zero is entitled to all the rights and privileges of the other integers. The Meyer line mirrors the Venerable Bede's calendar, but we can let Bede off the hook, because he had no 0 to play with.
R. M. Schultz contends that the numbers of the years are ordinal numbers, not cardinal numbers. This is true. That's the point. Ordinals are numbers in ordered succession, and what we are trying to do is put the years in an ordered sequence. It is true also that zero isn't normally considered an ordinal except in rare cases (zero'th). Neither are negative numbers. When Cassini replaced B.C. years with negative years, I suspect he was just trying to make the best of a bad situation, using negative years as a means of distinguishing them from A.D. years. He could have called them "left" and "right" or "Fred" and "Ethel." Again, ordinal numbers denote ordered succession. The Bede-Dionysius calendar violates order by skipping a number, zero.
Concerning Bill Baird's contention that zero is merely "a placeholder on a continuum": the Babylonians are credited with being the first (zero'th?) civilization to use a placeholder, but the actual naming of zero didn't take place until the first century A.D., in India and in the Mayan empire. According to the number theorist Tobias Dantzig, many cultures had counting boards -- abacuses and similar devices -- that used place systems represented by columns of numbers. But mathematics did not advance significantly until the "empty column," as Dantzig called it, was given a symbol, a name: zero.
In her reply to Mary Vassar Hitchings (Letters, August Atlantic), Katha Pollitt ridicules Ms. Hitchings's suggestion that arguments for legalized abortion could be used to support euthanasia for the sick and the old. Yet this year, when the Supreme Court was called upon to decide whether there was a constitutional right to assisted suicide, proponents of such a right relied heavily on Roe v. Wade and Planned Parenthood v. Casey, the cases in which the Court found and reaffirmed the right to abortion.
Roe and Casey proclaimed that private, autonomous choice over fundamental questions of life and death was better for both the woman (who would decide what burdens she could bear) and the entity being killed (who would be spared a poor quality of life if unwanted or disabled). Similarly, proponents of assisted suicide argued that the alleged right was necessary both to protect families from the burden of caring for the patient and to protect the patient from a poor quality of life.
The same errors corrupt this argument in each instance. First, the private choice-maker may choose death for another in bad faith, out of a selfish desire to be unencumbered by unchosen duties to others. Second, and more important, the emphasis on private choice allows society to take a cheap way out: instead of investing time and money to alleviate the problems that motivate such a terrible decision -- investing in jobs, welfare, pain management, health care -- we give people the "right" to end lives that burden themselves or others.
Since feminists like Pollitt uncritically accepted this cop-out as a solution to women's problems, abortion, rather than public assistance and private responsibility, has become the accepted cost-effective solution to unwanted pregnancy. Now the same bad arguments, for the same reasons, are being advanced to justify similar abandonment of the sick and the old. Luckily, the Court has learned from its mistakes. Why hasn't Pollitt?
Jendi B. Reiter
Whatever one thinks of assisted suicide -- which, Jendi Reiter notes, has been defended on grounds drawn from abortion-rights decisions -- it is not the same as euthanasia, much less the outright murder Ms. Hitchings posited arranging for her aged parents. To equate these acts with abortion is possible only if you believe that the fertilized egg, the embryo, and the fetus are the moral equivalent of your mother. But this is exactly the point at issue, isn't it? Millions of Americans have had or helped others to have abortions, when doing so was legal and when it was not. Hardly any have killed their parents, or anyone else. This suggests to me that having an unwanted pregnancy is not very much like being the adult child of an elderly parent, is not experienced as such by those who actually face these situations, and is not so regarded by American society.
Jendi Reiter suggests that legalizing abortions permits society to avoid addressing the reasons why women have them. Yet, interestingly, the countries that do the most for women and children -- the European social-welfare states -- all permit abortion, and even pay for it in their nationalized health plans. The countries that do the least -- most Latin American nations, for example -- forbid it. In our own country the criminalization of abortion did not lead to caring social policies in the hundred years it was in force. Why should it do so now?
The Atlantic Monthly; November 1997; Letters; Volume 280, No. 5; pages 10-23.