So the theory has it that the universe expanded exponentially from a point, a singular space/time point, a moment/thing, some original particulate event or quantum substantive happenstance, to an extent that the word explosion is inadequate, though the theory is known as the Big Bang. What we are supposed to keep in mind, in our mind, is that the universe didn't burst out into pre-existent available space, it was the space that blew out, taking everything with it in a great expansive flowering, a silent flash into being in a second or two of the entire outrushing universe of gas and matter and darkness-light, a cosmic floop of nothing into the volume and chronology of spacetime. Okay?
And universal history since has seen a kind of evolution of star matter, of elemental dust, nebulae, burning, glowing, pulsing, everything flying away from everything else for the last fifteen or so billion years.
But what does it mean that the original singularity, or the singular originality, which included in its submicroscopic being all space, all time, that was to voluminously suddenly and monumentally erupt into concepts that we can understand, or learn—what does it mean to say that ... the universe did not blast into being through space but that space, itself a property of the universe, is what blasted out along with everything in it?
What does it mean to say that space is what expanded, stretched, flowered? Into what? The universe expanding even now its galaxies of burning suns, dying stars, metallic monuments of stone, clouds of cosmic dust, must be filling ... something. If it is expanding it has perimeters, at present far beyond any ability of ours to measure. What do things look like just at the instant's action at the edge of the universe? What is just beyond that rushing, overwhelming parametric edge before it is overwhelmed? What is being overcome, filled, enlivened, lit?
Or is there no edge, no border, but an infinite series of universes expanding into one another, all at the same time? So that the expanding expands futilely into itself, an infinitely convoluting dark matter of ghastly insensate endlessness, with no properties, no volume, no transformative elemental energies of light or force or pulsing quanta, all these being inventions of our own consciousness, and our consciousness, lacking volume and physical quality in itself, a project as finally mindless, cold, and inhuman as the universe of our illusion.
I would like to find an astronomer to talk to. I think how people numbed themselves to survive the camps. So do astronomers deaden themselves to the starry universe? I mean, seeing the universe as a job? (Not to exonerate the rest of us, who are given these painful intimations of the universal vastness and then go about our lives as if it is no more than an exhibit at the Museum of Natural History.)
Does the average astronomer doing his daily work understand that beyond the celestial phenomena given to his study, the calculations of his radiometry, to say nothing of the obligated awe of his professional life, lies a truth so monumentally horrifying—this ultimate context of our striving, this conclusion of our historical intellects so hideous to contemplate—that even one's turn to God cannot alleviate the misery of such profound, disastrous, hopeless infinitude? That's my question.
In fact if God is involved in this matter, these elemental facts, these apparent concepts, He is so fearsome as to be beyond any human entreaty for our solace, or comfort, or the redemption that would come of our being brought into His secret.
These are the first words in E.L. Doctorow's City of God. I found them on a humble, just flipping through fiction at a bookstore. I've been thinking a lot about aggression and violence lately. My favorite writers are all aggressive; they snatch you up and propel you on the strength of their confidence, on the strength of their evangelical belief that this world they are conjuring is all real talk. These writers are not waiting on you, playing with you, or apologizing to you. They don't even really need you. They are writing, and you can either come along or you can do what we all know you want to do, and go back to your life of various Real Housewives in various fake locales.
What I am saying here, in somewhat overly dramatic terms, is that my favorite writers are always daring me, always threatening me. I picked up a few other books before I got to this one, and all the writing was wimpy, limp, half-assed, and apologetic. If the writer doesn't believe in the world they are creating, why should I?
It isn't just in fiction. Half the opinionating I read begins with apology or bravado. I'd rather be punched in the face. I remember the first time I heard "Rebel Without A Pause." I hated it. But Public Enemy just compelled me.
Ta-Nehisi Coates is a national correspondent at The Atlantic, where he writes about culture, politics, and social issues. He is the author of The Beautiful Struggle and the forthcoming Between the World and Me.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
As the vice president edges toward a presidential run, is he banking on further public disclosures to discredit the frontrunner?
As Joe Biden edges closer to a presidential run, there’s no shortage of theories as to what he’s up to. Former secretary of state Hillary Clinton has built a commanding lead in the national polls, giving Biden little apparent space to gain traction. Perhaps he’s counting on the early-primary state of South Carolina to provide a critical boost. He might be banking on appearing as a stronger general-election candidate than any of his potential rivals in the primary race. Maybe after spending the past 42 years of his life running for elective office, he just can’t stop.
But there’s one intriguing theory that has so far garnered little attention: What if Biden knows something about Democratic frontrunner Hillary Clinton that the rest of us don’t?
The drug modafinil was recently found to enhance cognition in healthy people. Should you take it to get a raise?
If you could take a pill that will make you better at your job, with few or no negative consequences, would you do it?
In a meta-analysis recently published in European Neuropsychopharmacology, researchers from the University of Oxford and Harvard Medical School concluded that a drug called modafinil, which is typically used to treat sleep disorders, is a cognitive enhancer. Essentially, it can help normal people think better.
Out of all cognitive processes, modafinil was found to improve decision-making and planning the most in the 24 studies the authors reviewed. Some of the studies also showed gains in flexible thinking, combining information, or coping with novelty. The drug didn’t seem to influence creativity either way.
In 1998, Toni Morrison wrote a comment for The New Yorker arguing that “white skin notwithstanding, this is our first black President. Blacker than any actual black person who could ever be elected in our children’s lifetime.” Last week, the New York Times implicitly cited Morrison’s piece and claimed the author was giving Clinton “a compliment.” This interpretation of Morrison’s claim is as common as it is erroneous.
The popular interpretation of Morrison’s point (exhibited here) holds that, summoning all of her powers, the writer gazed into the very essence of Clinton, and found him sufficiently soulful. In fact, Morrison’s point had little to do with soul of any kind. She was not much concerned with Clinton’s knowledge of Ebonics, his style of handshake, nor whether he pledged Alpha or Q. Morrison was concerned with power.
It is not too late to strengthen the Iran deal, a prominent critic says.
It appears likely, as of this writing, that Barack Obama will be victorious in his fight to implement the Iran nuclear deal negotiated by his secretary of state, John Kerry. Republicans in Congress don’t appear to have the votes necessary to void the agreement, and Benjamin Netanyahu’s campaign to subvert Obama may be remembered as one of the more counterproductive and shortsighted acts of an Israeli prime minister since the rebirth of the Jewish state 67 years ago.
Things could change, of course, and the Iranian regime, which is populated in good part by extremists, fundamentalist theocrats, and supporters of terrorism, could do something monumentally stupid in the coming weeks that could force on-the-fence Democrats to side with their Republican adversaries (remember the Café Milano fiasco, anyone?). But, generally speaking, the Obama administration, and its European allies, seem to have a clearer path to implementation than they had at the beginning of the month.
A new study finds an algorithmic word analysis is flawless at determining whether a person will have a psychotic episode.
Although the language of thinking is deliberate—let me think, I have to do some thinking—the actual experience of having thoughts is often passive. Ideas pop up like dandelions; thoughts occur suddenly and escape without warning. People swim in and out of pools of thought in a way that can feel, paradoxically, mindless.
Most of the time, people don’t actively track the way one thought flows into the next. But in psychiatry, much attention is paid to such intricacies of thinking. For instance, disorganized thought, evidenced by disjointed patterns in speech, is considered a hallmark characteristic of schizophrenia. Several studies of at-risk youths have found that doctors are able to guess with impressive accuracy—the best predictive models hover around 79 percent—whether a person will develop psychosis based on tracking that person’s speech patterns in interviews.
84 percent favor requiring police officers to wear body cameras.
74 percent of survey respondents believe the public should have access to footage from those body cameras any time that a police officer stands accused of misconduct. A narrow majority believes that the public should have access to all footage.
As for investigations into misconduct by police officers, 79 percent believe the public should have access to the findings if there has been wrongdoing, and 64 percent believe the public should have that same access anytime a cop is even accused.
A new study shows that the field suffers from a reproducibility problem, but the extent of the issue is still hard to nail down.
No one is entirely clear on how Brian Nosek pulled it off, including Nosek himself. Over the last three years, the psychologist from the University of Virginia persuaded some 270 of his peers to channel their free time into repeating 100 published psychological experiments to see if they could get the same results a second time around. There would be no glory, no empirical eurekas, no breaking of fresh ground. Instead, this initiative—the Reproducibility Project—would be the first big systematic attempt to answer a question that has been vexing psychologists for years, if not decades: What proportion of results in their field are reliable?
A string of questionable police killings demonstrates the need to reevaluate laws that govern the use of lethal force.
On July 1, 2012, Milton Hall, a homeless man with a history of mental illness, stole a cup of coffee from a convenience store in Saginaw, Michigan. The store’s clerk called 911. When an officer arrived, Hall produced a knife with a three-inch blade and threatened her with it. She called for backup and seven other officers soon joined her, one of them with a police dog. They formed an arc around Hall and aimed their firearms—pistols and a rifle—at him. The standoff continued for several minutes, with the officers repeatedly asking Hall to put the knife down and Hall repeatedly refusing. Finally, Hall, still wielding his knife, began to walk toward the police dog and the K9 officer. After he had taken a few steps—three, by my count, as I watch video footage from a patrol car’s dashboard camera and available on YouTube—the officers shot Hall to death in a volley of 47 bullets.