Readers debate the question and related ones. (To chime in, please email firstname.lastname@example.org.) “What’s the point of college?” was also the crux of the conversation during the closing session of our Education Summit:
Some remaining thoughts from readers on the question:
This summer I accompanied my mother to her 65th college reunion. Part of the weekend’s program was a video about the Cornell University Class of 1950, the first class that came in with a large supply of veterans on the G.I. Bill. The film had some inspiring cameos about veterans who would never have gotten to college otherwise and the lives they made for themselves as a result. I wonder if our preoccupation with credentialism and the faith in the bachelor’s degree as a gateway to success and wealth is a legacy of that postwar crop of veterans.
I have observed the 20-year trend toward arbitrarily requiring college degrees for jobs that do not truly need them. I believe this goes hand-in-hand with the growth of Human Resources as a profession.
A company’s HR department usually handles recruiting functions, and it serves as the gatekeeper over which skills and credentials are required for a given position. The trouble is that they have no idea of what it takes to perform well in those positions, and they are absolutely the wrong people to create the requirements. The actual department heads who are hiring are often very busy and appreciate the HR gatekeepers because it means they have to look at fewer resumes.
I entered the professional workforce in 1979 as a general bookkeeper and later, between on-the-job training and self-study, became a controller. My husband was an electronics technician and ultimately started his own business. College-degreed professionals made up only a small percentage of the workforce, and my husband and I, along with many degreeless others, had good careers without a college degree. It was common.
In the mid-to-late 1990s I noticed that more and more jobs in finance and accounting wanted bachelor’s degrees in “a related field.” The CPA designation, once available to anyone who took the appropriate coursework, was changed to require five years of education in accounting. Only the CMA (Certified Management Accountant via the Institute of Management Accountants) was available to me—but then only if I had a baccalaureate degree.
I did go back to school, majored in history (for the love of it), and obtained my CMA. Once I had a BA, I had opportunities I never had before. My career took off. Still, even now, although I have been a CFO and now serve as a Corporate Controller for a mid-sized company, I am viewed as unqualified for many lesser accounting jobs because I do not have a bachelor’s in accounting or finance. It’s absurd.
My last two great hires have been experienced professionals without a college degree. I frequently see articles about open jobs that can’t be filled because of skill deficits and mismatches between the needs of business and the employment pool. That is also absurd. Businesses are allowing a department (HR) that doesn’t understand job requirements to set the standards for those candidates. This harms business and shuts out a lot of really talented, qualified people, relegating them to perpetual underemployment.
Keep stoking this issue. This needs to be changed for our long-term prosperity.
Another would prefer we stop stoking:
So since you’re someone who’s asking the perennial “is college worth it anymore?” question, I thought I’d ask you to look at it from a different angle. My own fascination isn’t with that question, which by my lights has been answered positively, again and again and again—here’s an absolutely massive trove of recent data on the question, for example.
No, my interest is in why journalists are so eager to ask the question over and over again despite the durability of the “yes” answer. It strikes me that our media is really predisposed to find that the answer is no, despite such large empirical confirmation of the value of college.
And I think that’s more interesting: Why do so many journalists and writers want to say that college isn’t worth it, particularly given that almost all of them went themselves?
I, for one, would not say that, especially since I actually used my B.A. in History to a practical end, meaning my first salaried job out of college was writing about history. Eleven years after graduating, I’m still paying off student loans, but they’re definitely worth it, all things considered. The question of whether an M.A. is worth it—that seems much more doubtful, especially given stats like these:
Indeed, between 2004 and 2012, the amount of debt carried by a typical borrower who had a master of arts degree rose an inflation-adjusted 70%, according to an analysis of data by the New America Foundation. The report says this surge may be thanks to a 2005 congressional move that lets grad students borrow nearly unlimited money for school.
Personally I was fortunate to slip into journalism without going to J-school and racking up more debt. Instead, I got a paid internship at The Atlantic back in ‘07, working part-time to make ends meet and living in a rickety group house. So an M.A. definitely would not have been worth it to me. If you have strong feelings about the M.A. question from your own experience, let me know. Update from a reader:
Your reader who points to a “massive trove of recent data” settling this question should perhaps go back to college himself to learn about statistical inference and the difference between correlation and causation. All the data he points to documents advantages gained by college graduates, but makes no attempt to correct for confounding variables, of which there are many plausible ones.
The most obvious would be family income: people whose parents were rich tend to go to college more than those whose parents were poor, and they tend to have higher incomes and better other outcomes later in life. Is it really likely that higher education explains all or even most of those differences? Matt Yglesias ably explains this fallacy.
Furthermore, even if we knew with certainty that college education made people more productive, we couldn’t say with any certainty that it’s worth how much we invest in it, from a social perspective. I made this argument in more detail on my blog a few weeks ago.
I think, taken holistically, it’s pretty clear that getting a college education is worthwhile for most people, but it’s a valid question, and the concern about the growing requirement of bachelor’s degrees for jobs that don’t really require them is a hugely important issue to discuss.
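The confounding argument this reader makes can be illustrated with a quick simulation. This is a sketch with made-up numbers, not real data: the point is only that when family income boosts both college attendance and later earnings, the raw graduate/non-graduate earnings gap comes out larger than the true causal effect of the degree.

```python
import random

# Purely illustrative simulation (all numbers hypothetical) of the
# confounding problem: family income raises both the odds of attending
# college and later earnings, so the raw earnings gap between graduates
# and non-graduates overstates the causal effect of the degree.
random.seed(0)

N = 100_000
TRUE_COLLEGE_PREMIUM = 10_000  # the assumed causal effect of a degree

grads, nongrads = [], []
for _ in range(N):
    family_income = random.gauss(50_000, 15_000)
    rich = family_income > 60_000
    went_to_college = random.random() < (0.7 if rich else 0.3)
    # Earnings depend on family background AND, causally, on college.
    earnings = (30_000 + 0.5 * family_income
                + (TRUE_COLLEGE_PREMIUM if went_to_college else 0)
                + random.gauss(0, 5_000))
    (grads if went_to_college else nongrads).append(earnings)

raw_gap = sum(grads) / len(grads) - sum(nongrads) / len(nongrads)
print(f"raw graduate/non-graduate gap: ${raw_gap:,.0f}")
# The printed gap comfortably exceeds the $10,000 causal premium,
# because part of it is really a family-income effect in disguise.
```

Comparing like with like (say, only people from similar family backgrounds) would shrink the measured gap back toward the true premium, which is exactly the correction the reader says the cited data never attempts.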
Marxian Economics provides an interesting view of the “value” of any degree. The profits of a company can be divided into two parts: the amount that’s needed to sustain production, and the surplus. Training employees does not directly result in production for a company, which means it must come from the surplus. But the company has many other things they want to spend the surplus on, so they would prefer if their workers were able to do a job from Day One with no training. That means the bill for education/training falls on the individual or the state—which the company also doesn’t want to pay. That’s a different problem.
The readers before me eloquently argued that universities currently have a monopoly on verification for skills; this is sadly true. Even more distressing is the fact that universities operate as companies themselves. Students must pay more money than the value of the education they receive or the system will crash, which is why—I hazard a guess here—they’re forced to take unrelated classes, instead of being speedily prepared for a career.
Now, I learned the basics of this theory from a university lecture, but I haven’t paid a penny for it.
It’s free on YouTube. Unfortunately, if I want to prove that I know what I’m talking about, I’d need a shiny degree—which, ironically, those very classes would teach me is worth less than what I paid for it!
Is this a problem? Yes, it’s a trillion dollar problem. But the universities are getting their money, the politicians work for the corporations, and the corporations only care about their bottom line in the next quarter, so it’s not a problem that’s going to be solved, even though cheaper education is better for literally the entire human race.
Another reader cites a helpful book:
David Labaree’s pessimistic take in Someone Has to Fail is worth quoting in discussions about the value of the B.A. Labaree describes a race between educational access and the demand for educational privilege, and he places it at the center of the history of movements for educational reform. He thinks it unlikely that such a core tension will be resolved in the years ahead, and he imagines an inflation in higher education degrees that will continue unabated for some time:
… consider where the current pattern of expansion is taking us. As master’s programs start filling up, which is already happening, there will be greater pressure to expand access to doctoral programs, which are becoming the new zone of special educational advantage. So it seems likely that we’re going to need to invent new forms of doctoral degree programs to meet this demand, something that universities (always on the lookout for a new marketing opportunity) are quite willing to do. When that happens, of course, there will be demand for a degree beyond the doctorate (the current terminal degree in American higher education), in order to give some people a leg up on the flood of doctoral graduates pouring into the workplace.
In some ways this has already happened to science Ph.D.’s, who have to complete an extensive postdoctoral program if they want a faculty position in an American university. We may end up going in the direction of many European universities, which require that candidates for professorships first complete a Ph.D. program and then prepare a second dissertation called a habilitation, which is in effect a super-doctorate. This puts people well into their thirties before they complete their educational preparation.
Another gets into the weeds with a previous reader:
I want to take a moment to reply to the update provided by your reader.
For the most part, he or she is correct that you must have an ABET-accredited engineering degree to take the FE exam. A few states allow work experience to count for academic experience, but it isn’t common.
Passing the FE is the first step toward obtaining a PE (Professional Engineer) license. A candidate passes the FE, is granted the title of engineer-in-training, and starts to gain work experience. After a number of years, they apply to sit for the PE exam. A number of PEs they have worked under provide professional recommendations, and the state licensing board grants the PE license.
The reason for all of this process is liability. Only a licensed Professional Engineer can approve construction plans for buildings and public works projects. This is a response to the failures and losses of life that have occurred when these things are not designed and built correctly.
Don’t get me wrong; just because a PE was involved doesn’t negate the possibility of something going wrong. The intent is to minimize that possibility. It’s for the same reasons the bar exam and the medical board exam are required.
As a result, most PEs are in the civil engineering field. Many of the rest are engineers working in related fields, e.g., HVAC, plumbing, electrical wiring, fire suppression, etc. They are working on structures and their supporting systems for construction related to buildings and roads. There are plenty of engineers who never take the FE and have very successful careers; they are either covered under the industrial exemption, or licensure simply isn’t a consideration for their work.
Mary Alice McCarthy wrote a piece for us declaring “America: Abandon Your Reverence for the Bachelor’s Degree.” A reader quotes her:
“Undergraduates are supposed to get a general education that will prepare them for training, which they will presumably get once they land a job or go to graduate school.” Au contraire:
Companies simply haven’t invested much in training their workers. In 1979, young workers got an average of 2.5 weeks of training a year. While data is not easy to come by, around 1995, several surveys of employers found that the average amount of training workers received per year was just under 11 hours, and the most common topic was workplace safety — not building new skills. By 2011, an Accenture study showed that only about a fifth of employees reported getting on-the-job training from their employers over the past five years.
Hence the great push for ever-more vocational or job-oriented college degrees. The task of training has been foisted upon higher education.
And another reader is very skeptical of the value of higher ed these days:
The Bachelor’s degree is now the equivalent of a high school diploma. No one is impressed if you have one. But if you don’t have one, they'll toss your resume aside. Colleges and universities know this, which is how they can get away with making you take classes you know you’ll never need. That’s fine for high school. But a college student shouldn’t be forced to take a sociology course or two years of foreign language, especially when he’s paying tens of thousands of dollars per year in tuition.
A Bachelor’s degree is also a convenient way for certain professions to limit their applicant pool.
In other countries, if you want to become a lawyer or a doctor, you apply directly out of high school. In this country, you need a four-year degree before you can apply to law school or med school. By the time someone finishes their undergraduate degree, they may already have $100,000 in debt to pay off. How inclined will they be to go to law school or med school and pile on even more debt?
As for employers, certain fields like IT don’t even care what you got your degree in. They just want to know about your skills and experience. Gone are the days when employers actually trained people. Now they expect you to be ready as soon as you walk in. Why? Because employers don’t want to spend time and money training people who’ll then apply for a higher paying job now that they have a stronger skill set.
What college needs to do is prove why a Bachelor’s degree is still worthwhile. If the best answer they can give is “because you won’t get a job without it,” that might be true, but it’s still pretty sad. And if that’s the case, they shouldn’t be forcing students to take classes they don’t want to take.
Another reader searches for solutions:
Four or five months ago, I was driving to work and listening to the radio when a commercial came on for a program called Grads of Life. According to the commercial, Grads of Life is a program dedicated to helping businesses hire from a pool of workers who don’t have degrees but possess skills and characteristics that would benefit employers. “That’s me!” I thought, vainly. “I am the possessor of the aforementioned skills and beneficial characteristics!”
Delusions of competence in tow, I hurriedly typed the site’s web address into my browser. I envisioned the site as what I had been waiting for: some sort of job-applicant aggregator that I could add my name to, coupled with some way to quantify those skills. For years, I’ve been crippled in the job market by my lack of a degree, particularly since my skills are in writing, where out-of-work journalism and writing majors are a dime a dozen.
It wasn’t meant to be, though: Grads of Life turned out to be a Clinton Foundation-backed PSA campaign primarily designed to appear to be doing something while in reality just letting companies re-showcase their pre-existing hiring programs for under-privileged workers, without doing any additional work. The few actual programs dedicated to job pathways were aimed at people younger than I am, and they served only a few thousand applicants a year. It was a program designed to look good and accomplish nothing.
I was disappointed, but it’s nothing new: nobody is seriously trying to establish any way for non-college educated students to find work.
But what would an effective program look like? What’s probably needed is for someone with the clout of the Clinton Foundation to convince a number of large companies to work with the government to establish a way to “test out” of certain skills that are normally certified by a diploma. An employer won’t and can’t believe an applicant who swears that he or she is smart and skilled enough for the job based on promises alone—believe me, I’ve tried that again and again. There needs to be another way to prove a minimum level of skill to hirers.
Universities hold a monopoly on the ability to certify many skills. I might have read widely and deeply and practiced long hours to become a skilled writer, but without a diploma to prove it, I’ve had hundreds and hundreds of applications rejected. If there was a way to do an end-run around the diploma process for at least some of the skills for which alternative non-university paths of development exist, the monopoly could be broken.
An important point to consider regarding university monopolies on the certification of skills is standardized tests for various career fields. The Fundamentals of Engineering (FE), CFA, and CPA exams require degree completion in the relevant field to even sit for the test. Some even require the coursework to be at the “upper division” level, eliminating the possibility of associate’s-degree holders sitting for these tests.
These careers (engineering, finance, and accounting, respectively) are three of the most lucrative careers available in the primary labor market today. They represent a clear path to the middle class. Colleges have a clear monopoly on the certifications for these fields, meaning that the cost of an undergraduate education is another barrier to entry in all of them.