"Over all the years that followed, I found myself thinking from time to time of that picture, my hand over the baby's mouth. I knew then, and I still think now, that the right thing to do would have been to kill that baby."
Like a morbid time capsule from the mind of an elder New York psychiatrist, a report surfaced this week in which Dr. Fredric Neuman essentially confesses to criminal breaches of medical ethics. His essay "The Cyclops Child," which appeared on the website of the magazine Psychology Today, recounts the dishonesty and cruelty surrounding the brief existence of a child with severe birth defects. Compounding the offenses detailed in the story itself, Dr. Neuman uses dehumanizing terminology -- referring to the infant as a "monster" -- and focuses disproportionately on the hardship endured by hospital staff rather than on the dying child (or "it," in the author's words). An unfiltered dispatch from the medical community of circa 1960, "The Cyclops Child" is an offending document in itself.
I will summarize here the critical facts resurrected by Dr. Neuman, but please do read his entire essay as well. While he withholds the specifics, I surmise that the hospital was St. Vincent's (now closed) and the year 1959 or 1960 (based on my review of his curriculum vitae). The essay -- which catalogues what would today be considered kidnapping, assault, and possibly murder -- shocks our modern sensibilities. "As a person with disabilities, I find this entire post chilling," one of Dr. Neuman's readers wrote. "I hope the NY state medical board investigates the physicians involved and takes appropriate action," chimed in another.
Here's what happened. A mother gave birth to an infant with a fatal developmental defect called holoprosencephaly. Births of infants with this condition are almost unheard of today, as most women opt to end the pregnancy when the condition is identified on an early ultrasound. Fifty years ago, women didn't have this option. They were treated by obstetricians who felt that they operated on a higher plane than the rest of us, paternalistically keeping information from patients and limiting options however they saw fit. In this case, the obstetrician decided the parents should not know that the baby was born with the condition. Instead, he and the rest of the team lied to the parents, telling them their baby was dead.
A word about this baby's condition, holoprosencephaly. As humans develop in the womb from a bundle of cells into distinct tissues and organs, the nervous system emerges from a structure called the "neural tube." In rare instances, this tube doesn't form properly. In holoprosencephaly, the defect occurs at the head, and various midline structures like the brain, eyes, and mouth may not fully form. Dr. Neuman describes the deformity as "cyclops" -- actually a valid medical term, though used here outside its proper pathological context -- referring to eye tissue that did not separate into two distinct eyes. It is disturbing that the term holoprosencephaly never appears in his essay.
The baby is treated as an object and given no gender, referred to as "it." In the events that followed there is no indication that he received any palliative treatment, as would be the standard today (comfort care, including pain management).
The hospital staff expected and hoped that the newborn would soon pass away, but he did not. They left the child ignored in the back of the hospital nursery. Doctors and nurses waited for him to starve. An excruciating death watch followed, dragging on for about 13 days, as Dr. Neuman notes in the comments section of his piece. The child's cries anguished the nursery staff who kept the dark secret. Dr. Neuman wrote:
"There was a price to be paid. Dying though it might be, the staff still had to tend to it, to change it, to clean it, to hold it in repeated attempts to comfort it. The baby was suffering, and so was everyone else. Earlier, I had caught an aide crying. A couple of nurses had stayed home that day. It was at that point that I began to think about killing the baby."
Dr. Neuman did not kill the baby. But he did torture him at the direction of his senior resident, who asked him to practice a finger amputation procedure on the child:
"The way you treat a baby's extra fingers is to tie a ligature, a string, as tight as you can around the base of the finger. The blood supply is cut off, and after a while the finger falls off.
When I went over to the baby, it was lying quietly in its bed. It did not object when I picked up its hand. But when I tied the ligature around its finger and pulled tightly, it screamed."
The newborn finally died. The parents of the child never knew of the suffering or the needless procedure. Dr. Neuman still believes he should have euthanized the child:
"Over all the years that followed, I found myself thinking from time to time of that picture, my hand over the baby's mouth. I knew then, and I still think now, that the right thing to do would have been to kill that baby. It wasn't really a baby; it just sounded like a baby--that's what I tell myself. But I would like to stop thinking about it. After all, the whole thing happened over fifty years ago."
I'd compare Dr. Neuman's sickening tale to the work of Edgar Allan Poe, except that Dr. Neuman has not written a piece of creative fiction. This is the truth, we're told. So, has Psychology Today just published potential evidence in a trial for murder?
That's possible, but in no way probable, says Professor Martin Guggenheim of the New York University School of Law. Despite the fact that the statute of limitations doesn't run out on homicide, Guggenheim can't imagine a city prosecutor being interested in the case today. "Particularly because St. Vincent's is no more, I'd be more than a bit surprised if a prosecutor would do anything about this," Guggenheim says. All the other crimes - the kidnapping, the assault, the lies - are far too dated to be actionable.
Disability scholar Rebecca Garden, who teaches bioethics at Upstate Medical University, points out that despite the prevailing 1960s attitudes in this essay, deciding when a life is worth living is still a contested issue, one that disability rights advocates confront on a daily basis. In this context, she finds Dr. Neuman's blog post "distressing on many levels."
"This piece seems to be a complex and conflicted mix of confession, provocation, and defense or apologia," Professor Garden told me. Parts of "The Cyclops Child" are written in the present tense. There is a passage where Dr. Neuman suggests that an obstetrician could smother such a baby. His observation that "such things happen" isn't confined to the past, Garden observes.
How can we fathom Dr. Neuman repeatedly describing this child as a monster? According to Laurence McCullough of the Center for Medical Ethics and Health Policy at Baylor College of Medicine, we're merely witnessing equally valid discourse from another era. Our modern scientific understanding that such developmental anomalies are errors of reproductive development derived from our evolutionary biology carries little human meaning, Prof. McCullough points out. A monster was considered "a portent sent by the Gods to punish transgression."
At least that's something people can somehow grasp and justify. "What may, at first, strike us as a wrong-headed or even repellent discourse of the past... turns out to have a distinct advantage over our own," McCullough says.
Though I would like to think of "The Cyclops Child" as a dusty artifact, it nonetheless appeared on my computer in 2012, from the mind of a person living among us. I find myself trying to construct a narrative around it, to explain and contain it. Maybe Professor McCullough is right that "human scale" explanations at least offer us a framework to comprehend the things that distress us.
Alright, then. I'll believe the essay is a monstrosity published by the Gods to punish one doctor's fifty-year-old transgression.
Dean of Students John Ellison gets an A for initiative, a B-minus for execution, and extra-credit for stoking a useful debate.
When I was a heretical student at a Catholic high school deciding where to apply to college, I thrilled at the prospect of an educational institution where free inquiry would reign supreme and forceful debate would never be hemmed in by dogma.
A letter like the one that University of Chicago Dean of Students John Ellison sent last week to incoming first-year students––reminding them of the school’s “commitment to freedom of inquiry and expression," and affirming that those admitted to it “are encouraged to speak, write, listen, challenge, and learn, without fear of censorship”––would have struck me as a glorious affirmation: that robust intellectual communities truly did exist; that I would finally be free to follow my brain; that college would be a crucible that tested the strength of all my beliefs.
What do we actually know about the candidate’s health?
Cameras rolling, Manhattan gastroenterologist Harold Bornstein was confronted last week with a letter that carried his signature. In that letter, the writer “state[d] unequivocally” that Donald Trump “will be the healthiest individual ever elected to the presidency.”
Donald Trump would be the oldest individual ever elected to the presidency. He sleeps little and holds angry grudges. He purports to eat KFC and girthy slabs of red meat, and his physique doesn’t suggest any inconsistency in this. His health might be fine, but a claim to anything superlative feels off.
Bornstein might have jumped on that opportunity to get out of this mess—to say that Trump had dictated the letter, and Bornstein only signed it. Or that Trump had at least suggested phrases. Because it’s not just the facts of Trump’s life that don’t add up, but the linguistics of the letter.
The San Francisco quarterback has been attacked for refusing to stand for "The Star-Spangled Banner"—and for daring to criticize the system in which he thrived.
It was in early childhood when W.E.B. Du Bois––scholar, activist, and black radical––first noticed The Veil that separated him from his white classmates in the mostly white town of Great Barrington, Massachusetts. He and his classmates were exchanging “visiting cards,” invitations to visit one another’s homes, when a white girl refused his.
“Then it dawned upon me with a certain suddenness that I was different from the others; or like, mayhap, in heart and life and longing, but shut out from their world by a vast veil. I had thereafter no desire to tear down that veil, to creep through; I held all beyond it in common contempt, and lived above it in a region of blue sky and great wandering shadows,” Du Bois wrote in his acclaimed essay collection, The Souls of Black Folk. “That sky was bluest when I could beat my mates at examination-time, or beat them at a foot-race, or even beat their stringy heads.”
How will the show maintain its charm while unraveling its mysteries?
Stranger Things will return in 2017 for a second season with nine episodes by original writers/directors Matt and Ross Duffer, Netflix announced today. The news is about as unsurprising as, say, the idea that four Dungeons and Dragons-playing nerds in 1983 would be bullied at school. But it’s also an intriguing development—not unlike the revelation of an alternate dimension that resembles our own but has unfriendly plant-headed monsters roaming about.
The first eight episodes of the nostalgia-soaked sci-fi saga became the unpredicted breakout pop-culture conversation piece of summer 2016, spawning memes online and faux funerals in real life. Netflix doesn't reveal viewership numbers, but this week the independent data-measurement company Symphony Advanced Media estimated that the series drew an average of 14.07 million adults age 18-49 in the first 35 days of streaming. That would make it the second most-watched Netflix original of 2016, just behind Fuller House and the latest Orange Is the New Black season, both of which (unlike Stranger Things) arrived with established fan bases. Netflix's business model relies on shows doing exactly what Stranger Things has done: draw buzz to lure subscribers.
The talk-radio host claims that he never took Donald Trump seriously on immigration. He neglected to tell his immigration-obsessed listeners.
For almost a decade, I’ve been angrily documenting the way that many right-wing talk-radio hosts betray the rank-and-file conservatives who trust them for information. My late grandmother was one of those people. She deserved better than she got. With huge platforms and massive audiences, successful hosts ought to take more care than the average person to be truthful and avoid misinforming listeners. Yet they are egregiously careless on some days and willfully misleading on others.
And that matters, as we’ll come to see.
Rush Limbaugh is easily the most consequential of these hosts. He has an audience of millions. And over the years, parts of the conservative movement that ought to know better, like the Claremont Institute, have treated him like an honorable conservative intellectual rather than an intellectually dishonest entertainer. The full cost of doing so became evident this year, when a faction of populists shaped by years of talk radio, Fox News, and Breitbart.com picked Donald Trump to lead the Republican Party, a choice that makes a Hillary Clinton victory likely and is a catastrophe for movement conservatism regardless of who wins.
Richmond was once the epicenter of black finance. What happened there explains the decline of black-owned banks across the country.
On April 3, 1968, Martin Luther King Jr. gave his famous "I've Been to the Mountaintop" speech in Memphis. In it, he urged African Americans to put their money in black-owned banks. It wasn't his most famous line, but the message was clear: "We've got to strengthen black institutions. I call upon you to take your money out of the banks downtown and deposit your money in the Tri-State Bank. We want a 'bank-in' movement in Memphis … We begin the process of building a greater economic base."
The next day, King was assassinated, and his hope of harnessing black wealth remains unfulfilled. Before integration, African Americans in cities like Richmond, Chicago, and Atlanta relied on black community banks, which were largely responsible for providing loans and boosting black businesses, churches, and neighborhoods. After desegregation, black wealth started to hemorrhage from these communities: White-owned banks were forced to open their doors to African Americans, and the money that once flowed into black banks and back out to black communities ended up on Wall Street and in other banks farther away.
Practices meant to protect marginalized communities can also ostracize those who disagree with them.
Last week, the University of Chicago’s dean of students sent a welcome letter to freshmen decrying trigger warnings and safe spaces—ways for students to be warned about and opt out of exposure to potentially challenging material. While some supported the school’s actions, arguing that these practices threaten free speech and the purpose of higher education, the note also led to widespread outrage, and understandably so. Considered in isolation, trigger warnings may seem straightforwardly good. Basic human decency means professors like myself should be aware of students’ traumatic experiences, and give them a heads up about course content—photographs of dead bodies, extended accounts of abuse, disordered eating, self-harm—that might trigger an anxiety attack and foreclose intellectual engagement. Similarly, it may seem silly to object to the creation of safe spaces on campus, where members of marginalized groups can count on meeting supportive conversation partners who empathize with their life experiences, and where they feel free to be themselves without the threat of judgment or censure.
Education experts offer their thoughts on how—if at all—schools should assign, grade, and use take-home assignments.
This is the third installment in our series about school in a perfect world. Read previous entries here and here.
We asked prominent voices in education—from policy makers and teachers to activists and parents—to look beyond laws, politics, and funding and imagine a utopian system of learning. They went back to the drawing board—and the chalkboard—to build an educational Garden of Eden. We’re publishing their answers to one question each day this week. Responses have been lightly edited for clarity and length.
Today’s assignment: The Homework. Will students have homework?
Rita Pin Ahrens, the director of education policy for the Southeast Asia Resource Action Center
Homework is absolutely necessary for students to demonstrate that they are able to independently process and apply their learning. But who says homework has to be the same as it has been? Homework might include pre-reading in preparation for what will be covered in class that day, independent research on a student-chosen topic that complements the class curriculum, experiential learning through a volunteer activity or field trip, or visiting a website and accomplishing a task on it. The structure will be left to the teachers to determine, as best fits the learning objective, and should be graded—whether by the teacher or student. Students will be held accountable for their homework and understand that it is an integral part of the learning process.
A Hillary Clinton presidential victory promises to usher in a new age of public misogyny.
Get ready for the era of The Bitch.
If Hillary Clinton wins the White House in November, it will be a historic moment, the smashing of the preeminent glass ceiling in American public life. A mere 240 years after this nation’s founding, a woman will occupy its top office. America’s daughters will at last have living, breathing, pantsuit-wearing proof that they too can grow up to be president.
A Clinton victory also promises to usher in four to eight years of the kind of down-and-dirty public misogyny you might expect from a stag party at Roger Ailes's house.
You know it’s coming. As hyperpartisanship, grievance politics, and garden-variety rage shift from America’s first black commander-in-chief onto its first female one, so too will the focus of political bigotry. Some of it will be driven by genuine gender grievance or discomfort among some at being led by a woman. But in plenty of other cases, slamming Hillary as a bitch, a c**t (Thanks, Scott Baio!), or a menopausal nut-job (an enduringly popular theme on Twitter) will simply be an easy-peasy shortcut for dismissing her and delegitimizing her presidency.
In its early days, the first English settlement in America had lots of men, tobacco, and land. All it needed was women.
“First comes love, then comes marriage,” the old nursery rhyme goes, but historically, first came money. Marriage was above all an economic transaction, and in no place was this more apparent than in the early 1600s in the Jamestown colony, where a severe gender imbalance threatened the fledgling colony’s future.
The men of Jamestown desperately wanted wives, but women were refusing to immigrate. They had heard disturbing reports of dissension, famine, and disease, and had decided it simply wasn’t worth it. Consequently, barely a decade after its founding in 1607, Jamestown was almost entirely male, and because these men were unable to find wives, they were deserting the colony in droves.
An immediate influx of women was needed to save the floundering colony; its leaders suggested putting out an advertisement targeting wives. The women who responded to this marital request and agreed to marry unknown men in an unfamiliar land were in a sense America’s first mail-order brides.