Readers debate the question and related ones. (To chime in, please email email@example.com.) “What’s the point of college?” was also the crux of the conversation during the closing session of our Education Summit:
The closing panel at our Education Summit yesterday, led by Ron Brownstein, centered on the question “What’s the point of college?” The full video of the discussion is here, but here’s a snippet, which includes Jeffrey Selingo elaborating on his view that college is a credential:
Chris Sick, via the TAD group of Atlantic readers, offers a long and thoughtful take on the question at hand:
College is a very strange nexus where many of the worst facets of our society compete against each other to decide whose ox is going to be the one to get gored. There’s a large number of tenured faculty and high-ranking administrators who talk the talk of expanding the mind, training citizens to think critically, engaging intellectually with the world. They’ve been on staff forever; they went to college when a year’s tuition cost $4,500 instead of $45,000. So, y’know, they can talk that happy horseshit with a seemingly straight face.
Then there are the parents, more recent grads, and incoming students who believe that college is an experience that gives kids training wheels on adulthood, while simultaneously exposing them—possibly for the first time—to people from different walks of life. These people probably listened to “Common People” one too many times, and it’s sorta problematic.
A while back I read an article by Eric Posner insisting that college students are, basically, children who need more rules and regulations and should not be treated as full adults. This is problematic at a time when the majority of college students are nontraditional. You can imagine why an article like that would irritate someone like me, who finally pursued a bachelor’s degree in my thirties. It’s also problematic when some segment of society views this very, very, very expensive experience as some sort of essential transition to adulthood, and thereby those who can’t afford it aren’t, somehow, adults.
So you have those dual tensions of, basically, a kind of idealism: College is either about pure intellectual pursuit and knowledge, or about a sort of transitional life changing that opens the doors—and forms the networks—for adulthood.
Colleges are increasingly treating their undergraduate populations as revenue streams for other projects. Here at Columbia, the university is simultaneously underfunding and overstraining the undergraduate schools in favor of improving its global standing as a research institution. Yale, fr’instance, has seen such dramatic growth in its endowment over the last few years—while continuing to increase tuition and pursuing cost-cutting layoffs in some departments—that it’s not unfair to ask what their institutional priorities are. Do they exist to grow funding to assist students, improve research capabilities, or to just have a massive endowment to brag about to alums and trustees?
There are a lot of tensions pulling in different directions. For my entire life—so a generation, at least—college has been seen as not just a pathway to middle-class security, but the pathway. And during that same time, the cost of it to students has spiraled ever upward.
I think there’s something to the (generally speaking) pro-market/conservative argument that easy lending criteria and government support for student loans have made it easier for colleges to demand more and more money. There’s also significant evidence that the administrative bodies of universities have swollen dramatically, and colleges are spending more than ever on support offices that don’t directly impact academics, as well as on luxury offerings like upscale dorms, recreation and exercise spaces, and study-abroad opportunities.
But mostly I think it’s just that first thing: You told a generation or more of Americans that they needed to go to college to have a fighting chance in a competitive job market. What family wouldn’t mortgage everything to give their child that?
So you have an idealism of college as hallowed space training citizens for deep, critical engagement in their democracy (spoiler: It isn’t this, because it can’t be this); a more realistic view of it as training wheels for middle-class adulthood, where not only the skills, but the social networks and capital for such are developed; and employers for some time have been requiring a college degree for even menial gigs because it presents (theoretically) proof of a basic level of competency.
Meanwhile, universities are increasingly corporatizing at nearly every level. Beyond the administrator class, there’s high pay to be had for provosts and presidents who come from MBA/Fortune 500 backgrounds, while faculty pay stagnates. Students are viewed as either customers or revenue streams, and when an administrative class steeped in that thinking treats them as such, you get angry hand-wringing from the likes of Jonathan Chait insisting that the kids today are not alright.
These aren’t incidental: When you shape the university experience into something transactional and treat students as customers, you’re going to have to, y’know, treat them as customers. [CB: See David Graham’s related note, “Why Don’t Students Strike? Because They Think They’re Customers.”] And that’s going to engender an outsized sense of power on their part, not helped by the administrative class having a vested interest in keeping them happy. This flashpoint most frequently gets highlighted in messy fights over student demands, but I think grade inflation is another symptom of it that causes far more problems.
I’m just sort of rambling, now, but there are so many different things happening that all contribute to the problems our higher education system is facing right now. For what it’s worth, I tend to align my sympathies more closely with, say, Yale’s student protesters than the likes of Chait or Conor Friedersdorf (whom I had a long email exchange with about this subject); I suspect there’s yet another tension there, where more and more elite universities are actively recruiting students of color and/or from disadvantaged backgrounds but really fail to support them once they’re there. [CB: See the Notes conversation on the “mismatch theory” of affirmative action.] To say nothing of telling kids they’re whining/spoiled and illiberal for being dissatisfied going to a school named after a famous racist (pick one) that used their prodigious intellectualism to articulate why they're subhuman.
So we’re left with multiple different actors all articulating different—and frequently competitive and contradictory—ideals for what college ought to be and do. But the underlying agreement is that you should go to college: either because it will make you a good citizen, or expand your mind, or teach you to be a grown-up, or because your potential future employers would rather you take on $100k in debt to learn how to use Word than pay you for three months while they assign someone to teach you ...
I’m not one to look back on the past—either ancient or recent—with some sort of idealistic assumptions about things being better back in the day. That said, I suspect that colleges were historically more academically and intellectually rigorous than they are presently, and that past generations not only got a better education (in terms of pure intellectualism/knowledge) but got it at a significantly better price.
But I think that has far more to do with the simple fact that universities used to be significantly more elite. Only ~20 percent of the population has a bachelor’s degree or better, but that’s up from ~5 percent around the middle of the 20th century. You can’t grow the numbers that big and maintain the same standards across the board, and that’s not even taking into account the dramatic growth in international students coming to the U.S. for a degree ...
What’s the point of college? It depends. As an academic, my answer is too complicated.
Another reader agrees:
The point of college is different depending on where you come from, the social and economic expectations, the money available, your academic record or enthusiasm, and more. The point of college for a trust-funder is different from that of a poor immigrant, which is different from that of a semi-affluent suburbanite, which is different from that of a lifelong fuck-up, which is different from that of a rural first-generation college student.
And this reader conveys how the various points of college are not created equal:
Ideally, the purpose of college is to finish off a good, liberal education to broaden one’s understanding of the world: physical, social, and intellectual. Somewhat less ideally, to gain deep knowledge in a field in preparation for graduate school. A little less ideally (to me), to give one the skills necessary to start a middle-class career (or better). Less ideal yet, simply to get that piece of paper to send in with job applications. The reasons just continue going downhill from there.
On the other hand, this next reader, Leland Davis, gets much more specific with his answer:
A college degree is the stamp of the modern American middle class, the necessary badge of worthiness that one must have before any other consideration will be made. This helps keep the children of the middle class on the proper road in life—away from the trades and small-business, which might encourage unfortunate degrees of independence and inter-class solidarity, and towards the professions, whose professional standings and ethos encourage a proper deference to their betters.
I teach high school, so I see it happening. I went to grad school, and it happened to me.
This next reader, Ben, doesn’t have a college degree, and he’s found it difficult to have his voice taken seriously without one:
I didn’t finish school, and what credits I do have are of the community college variety, so my “what value there is in college” opinions are completely those of an uninformed outsider. But I agree with pretty much every reason for going to college that I’ve read here. Yeah, it’s a path to better things and to ruin depending on planning and luck. It’s a way for employers to sort applicants and a way to encourage the habit of life-long debt, too.
There’s an element of it that I’m more interested in, though, and that I’ve been thinking about for a while. I think it’s impossible or at least very difficult to have a significant voice without it.
On one level, I find most media outlets reluctant to consider non-academic sources to be expert. I’ve seen professionals with a complete and total understanding of their field ignored in favor of a scientist with only a passing familiarity with it. This is probably more out of convenience (or sloth) than malice, but verifying a non-Ph.D.’s bona fides is time-consuming, especially compared to checking for abbreviations after their name.
Policy makers similarly ignore common voices in favor of anyone writing from behind a diploma. Want to comment on that regulation? Go for it. Want them to respond to it as a significant, individual thing? Want them to even read it rather than just checking you off as another pro or anti voice in their tallies? If you are a scientist or doctor it’s a sure thing, even if you don’t have specific expertise in that field. As a layman, you are either for or against and thrown into a pile labeled “uninformed public opinion.”
I’m looking into journalistic organizations with this in mind. I’m very curious to see how many journalists in top-flight outlets got there without going through the network-at-university-while-checking-the-degree-box route, if any. I’m especially interested to see those numbers when dealing with more high-end think-piece and analysis purveyors. I don’t know but I’m very interested to find out if there’s a diversity in media issue we don’t talk about much that has nothing to do with race and gender.
It’s a question my colleague Steve Clemons is planning to discuss with a panel of experts at The Atlantic’s Education Summit next week, so to get some fodder for the discussion, I posed the question to some of our core readers in TAD, a discussion group created a few months ago by members of TNC’s old Horde.
Here are two quick answers, first from Nick: “The best thing I got out of college was having my opinions tested, learning how to justify them if possible, or correct them if I couldn’t justify them.” Reader Jim looks on the social side:
I’d say the point of college, now—and certainly a part I benefited from—is the exposure to people from a variety of walks of life. Ideally, you learn about the differences in people and come to recognize them as people despite those differences.
This reader touches on both themes:
College was probably the best time in my life (so far). I would say the benefit was twofold: growing intellectually and growing socially. A job was not on my mind as I went through college (aside from my mom's constant “Don’t you want to be a lawyer/doctor?”), so I was really focused on learning. As part of the honors program, I was given the opportunity to take small seminar classes where the students took a strong role in shaping and charting the discussion, and I undertook a serious research project clocking in at a whopping 117 pages. Will that knowledge of apocalyptic texts ever come in handy again? Probably not. But the skills gained along the way in critical thinking, writing, and discussion certainly made me a better citizen.
As far as my social life, I have never and will never be a party person. So that wasn’t my college experience. But I made some great friends from different backgrounds and formed friendships that I might not have otherwise. And my friends from college remain my closest friend circles today (sorry TADbros).
So I think viewing college as simply a means to an end is silly, although probably an unfortunate part of reality.
More on that reality from Katt:
The point of college is to put as many people into debt as possible so they have to settle for a life of mediocrity and being wage slaves to our capitalist overlords.
P.S. I am just bitter because I went to art school. Don’t send your kids to art school.
This reader would probably agree:
What’s the point of college? In the 21st century? Vocational training, credentialism, and resume-sorting disguised as “education.” Many jobs that in no way need a college education require one. It is a replacement for job training, which has been put out to pasture by shareholder demand and the general libertarian attitude of employers toward employees. (See the work of Wharton management professor Peter Cappelli for how employers across the spectrum have cut training.)
Bourree interviewed Cappelli last year on the “danger of picking a major based on where the jobs are.” Cappelli told her in a subsequent piece: “There is a long literature in psychology showing that job performance and college grades are poorly related. It is remarkable how frequently companies rely on hiring criteria for which there is no evidence of it working.” Back to our reader:
[A college education] also provides cover from an otherwise swelling unemployment rate for young adults. Resume-sorting is accomplished not just by asking “does the applicant have a college degree or not,” but also by sorting by major, GPA, institution, etc. Direct applicability of major is ever more important as actual job training vanishes. GPA is a lazy shortcut, equating academic prowess (or a strategic choice of easy classes) with professional aptitude.
Institutional sorting is how elite employers remain elite. (I’ve noted in perusing the CVs of The Atlantic’s staff that even attending something outside of the Top 15 or so colleges and universities in USNWR makes one something of an outlier. [CB:👋])
A college education is also a conduit for separating people from their money and/or money they borrow. Student loans are, at their basest form, a regressive taxation on socioeconomic mobility. How’s that for Kafkaesque?
In all of this, the classical purpose of college—to acquire deep knowledge, advanced analytical, rhetorical, and writing skills, and a deeper appreciation of the world around you—is antiquated and scorned.
This next reader relates to the “what’s the point?” question as a parent:
Funny you ask, since we are prepping our 18-year-old daughter for her freshman year this fall.
Contrary to my wife, who seems to think planning a degree is akin to prepping a resume (ironically, she is a very highly regarded and successful corporate dork with a BFA in printmaking), I believe college should be about expanding learning horizons, freeing your mind to consider others, socializing with people of different socioeconomic backgrounds who have similar goals and aspirations, and learning discipline and refining interests.
Also, having sex, doing drugs, and abdicating responsibility.
Here’s the view from a recent grad, who got his diploma from a large, public, four-year institution three years ago:
To ask what was the “point” of college is interesting. For me and the background I am coming from, I did not see any chance of upward mobility unless I joined the military or went to higher education (and to this day I think about the military quite often). For many people like me, university was a means to an end, that end being a job with some middle-income security. I had no illusions that a degree alone would provide that, but I understood it was a necessary step along that path.
I think many university students don’t know why they are there themselves, beyond just “knowing” college was what came after high school. They might share my sentiments that it will increase their future income, but they haven’t put much thought into what getting a job after university means or entails. Many arrive without the tools they need in terms of basic logic, critical thinking, and skepticism.
I’ll end it with this: the “point” of university is whatever point you set for yourself there. It is a place for you to get help learning about something you care about. When it comes to choosing a major, some are job funnels, but most are not. They are for you to find your own learning. Choose something you want to learn about in depth.
Most importantly, take responsibility for your own education. It is up to you whether or not you retain what you learn or whether you apply it after graduation.
One more reader perspective, from Canada:
Whatever else college is for, it’s about more than simply getting a job. It’s for broadening the mind, for exposure to thoughts and ideas you haven’t encountered before, and for both a broader and more in-depth education on various topics.
I, of course, am an educational elitist. I have four-year degrees in both the humanities and the sciences (the broad mind-expanding experience and the in-depth study of a topic), as well as a professional degree, since I actually wanted a good job at the end of the process.
Nonetheless, it was from my most conservative professor that I learned that higher education is about more than just a job, and more than merely learning the dominant ethos of the day. It was the place where my mind was awakened and my eyes opened.
I expect that my experience (and good fortune to have such an experience) is a rarity these days, and is generally unavailable. Who has the time or the money to spend nine years in postsecondary school, after all? Few of us, especially mere middle-class folks, do.
(Note: I went to school in Canada over 20 years ago. As such, the economic factors were very different than they are now, and tuition was—and still is, so far as I’m aware—much less than it is in the U.S. Furthermore, because the status of the school matters so much less in Canada, you can get much further ahead here without having gone to Harvard or the Ivies than you can, or so it seems, in the U.S.)
Some remaining thoughts from readers on the question:
This summer I accompanied my mother to her 65th college reunion. Part of the weekend’s program was a video about the Cornell University Class of 1950, the first class that came in with a large supply of veterans on the G.I. Bill. The film had some inspiring cameos about veterans who would never have gotten to college otherwise and the lives they made for themselves as a result. I wonder if our preoccupation with credentialism and the faith in the bachelor’s degree as a gateway to success and wealth is a legacy of that postwar crop of veterans.
I have observed the 20-year trend toward arbitrarily requiring college degrees for jobs that do not truly need them. I believe this goes hand-in-hand with the growth of Human Resources as a profession.
A company’s HR department usually handles recruiting functions, and it serves as the gatekeeper over which skills and credentials are required for a given position. The trouble is that HR staff often have no idea what it takes to perform well in those positions, and they are absolutely the wrong people to create the requirements. The actual department heads who are hiring are often very busy and appreciate the HR gatekeepers because it means they have to look at fewer resumes.
I entered the professional workforce in 1979 as a general bookkeeper and later, between on-the-job training and self-study, became a controller. My husband was an electronics technician and ultimately started his own business. College-degreed professionals were a small percentage of the workforce, and my husband and I, along with many degreeless others, had good careers without a college degree. It was common.
In the mid-to-late 1990s I noticed that more and more jobs in finance and accounting wanted bachelor’s degrees in “a related field.” The CPA designation, once available to anyone who took the appropriate coursework, was changed to require five years of education in accounting. Only the CMA (Certified Management Accountant, via the Institute of Management Accountants) was available to me—but then only if I had a baccalaureate degree.
I did go back to school, majored in history (for the love of it), and obtained my CMA. Once I had a BA, I had opportunities I never had before. My career took off. Still, even now, although I have been a CFO and now serve as corporate controller for a mid-sized company, I am viewed as unqualified for many lesser accounting jobs because I do not have a bachelor’s in accounting or finance. It’s absurd.
My last two great hires have been experienced professionals without a college degree. I frequently see articles about open jobs that can’t be filled because of skill deficits and mismatches between the needs of business and the employment pool. That is also absurd. Businesses are allowing a department (HR) that doesn’t understand job requirements to set the standards for those candidates. This harms business and shuts out a lot of really talented, qualified people, relegating them to perpetual underemployment.
Keep stoking this issue. This needs to be changed for our long-term prosperity.
Another would prefer we stop stoking:
So since you’re someone who’s asking the perennial “is college worth it anymore?” question, I thought I’d ask you to look at it from a different angle. My own fascination isn’t with that question, which to my lights has been answered positively, again and again and again—here’s an absolutely massive trove of recent data on the question, for example.
No, my interest is in why journalists are so eager to ask the question over and over again despite the durability of the “yes” answer. It strikes me that our media is really predisposed to find that the answer is no, despite such large empirical confirmation of the value of college.
And I think that’s more interesting: Why do so many journalists and writers want to say that college isn’t worth it, particularly given that almost all of them went themselves?
I, for one, would not say that, especially since I actually used my B.A. in History to a practical end, meaning my first salaried job out of college was writing about history. Eleven years after graduating, I’m still paying off student loans, but they’re definitely worth it, all things considered. Whether an M.A. is worth it seems much more doubtful, especially given stats like these:
Indeed, between 2004 and 2012, the amount of debt carried by a typical borrower who had a master of arts degree rose an inflation-adjusted 70%, according to an analysis of data by the New America Foundation. The report says this surge may be thanks to a 2005 congressional move that lets grad students borrow nearly unlimited money for school.
Personally I was fortunate to slip into journalism without going to J-school and racking up more debt. Instead, I got a paid internship at The Atlantic back in ’07, working part-time to make ends meet and living in a rickety group house. So an M.A. definitely would not have been worth it to me. If you have strong feelings about the M.A. question from your own experience, let me know. Update from a reader:
Your reader who points to a “massive trove of recent data” settling this question should perhaps go back to college himself to learn about statistical inference and the difference between correlation and causation. All the data he points to documents advantages gained by college graduates, but makes no attempt to correct for confounding variables, of which there are many plausible ones.
The most obvious would be family income: people whose parents were rich tend to go to college more than those whose parents were poor, and they tend to have higher incomes and better other outcomes later in life. Is it really likely that higher education explains all or even most of those differences? Matt Yglesias ably explains this fallacy.
Furthermore, even if we knew with certainty that college education made people more productive, we couldn’t say with any certainty that it’s worth how much we invest in it, from a social perspective. I made this argument in more detail on my blog a few weeks ago.
I think, taken holistically, it’s pretty clear that getting a college education is worthwhile for most people, but it’s a valid question, and the concern about the growing requirement of bachelor’s degrees for jobs that don’t really require them is a hugely important issue to discuss.
Marxian economics provides an interesting view of the “value” of any degree. The profits of a company can be divided into two parts: the amount that’s needed to sustain production, and the surplus. Training employees does not directly result in production for a company, which means it must come from the surplus. But the company has many other things they want to spend the surplus on, so they would prefer if their workers were able to do a job from Day One with no training. That means the bill for education/training falls on the individual or the state—which the company also doesn’t want to pay. That’s a different problem.
The readers before me eloquently argued that universities currently have a monopoly on verification for skills; this is sadly true. Even more distressing is the fact that universities operate as companies themselves. Students must pay more money than the value of the education they receive or the system will crash, which is why—I hazard a guess here—they’re forced to take unrelated classes, instead of being speedily prepared for a career.
Now, I learned the basics of this theory from a university lecture, but I haven’t paid a penny. It’s free on YouTube. Unfortunately, if I want to prove that I know what I’m talking about, I’d need to have a shiny degree—which, ironically, I would understand is worth less than what I paid for it, based on the classes I received!
Is this a problem? Yes, it’s a trillion dollar problem. But the universities are getting their money, the politicians work for the corporations, and the corporations only care about their bottom line in the next quarter, so it’s not a problem that’s going to be solved, even though cheaper education is better for literally the entire human race.
Another reader cites a helpful book:
David Labaree’s pessimistic take in Someone Has to Fail is worth quoting in discussions about the value of the B.A. Labaree describes a race between educational access and the demand for educational privilege, and he places it at the center of the history of movements for educational reform. He thinks it unlikely that such a core tension will be resolved in the years ahead, and he imagines an inflation in higher education degrees that will continue unabated for some time:
… consider where the current pattern of expansion is taking us. As master’s programs start filling up, which is already happening, there will be greater pressure to expand access to doctoral programs, which are becoming the new zone of special educational advantage. So it seems likely that we’re going to need to invent new forms of doctoral degree programs to meet this demand, something that universities (always on the lookout for a new marketing opportunity) are quite willing to do. When that happens, of course, there will be demand for a degree beyond the doctorate (the current terminal degree in American higher education), in order to give some people a leg up on the flood of doctoral graduates pouring into the workplace.
In some ways this has already happened to science Ph.D.’s, who have to complete an extensive postdoctoral program if they want a faculty position in an American university. We may end up going the direction of many European universities, which require that candidates for professorships first complete a Ph.D. program and then prepare a second dissertation called a habilitation, which is in effect a super-doctorate. This puts people well into their thirties before they complete their educational preparation.
Another gets into the weeds with a previous reader:
I want to take a moment to reply to the update provided by your reader.
For the most part, he or she is correct that you must have an ABET-accredited engineering degree to take the FE exam. A few states allow work experience to count for academic experience, but it isn’t common.
The FE is the first step toward obtaining a PE (Professional Engineer) license. A candidate passes the FE, is granted the title of engineer-in-training, and starts to gain work experience. After a number of years, they apply to sit for the PE exam. A number of PEs they have worked under provide professional recommendations, and the state licensing board grants the PE license.
The reason for all of this process is liability. Only a licensed Professional Engineer can approve construction plans for buildings and public works projects. This is a response to the failures and loss of life that have occurred when these things are not designed and built correctly.
Don’t get me wrong; just because a PE was involved doesn’t negate the possibility of something going wrong. The intent is to minimize that possibility. It’s for the same reasons the bar exam and the medical board exam are required.
As a result, most PEs are in the civil engineering field. Many of the rest are engineers working in related fields: HVAC, plumbing, electrical wiring, fire suppression, and so on. They are working on structures and their supporting systems for construction related to buildings and roads. There are plenty of engineers who never take the FE and have very successful careers; we are covered under the industrial exemption, or licensure simply isn’t a consideration.
Mary Alice McCarthy wrote a piece for us declaring “America: Abandon Your Reverence for the Bachelor’s Degree.” A reader quotes her:
“Undergraduates are supposed to get a general education that will prepare them for training, which they will presumably get once they land a job or go to graduate school.” Au contraire:
Companies simply haven’t invested much in training their workers. In 1979, young workers got an average of 2.5 weeks of training a year. While data is not easy to come by, around 1995, several surveys of employers found that the average amount of training workers received per year was just under 11 hours, and the most common topic was workplace safety — not building new skills. By 2011, an Accenture study showed that only about a fifth of employees reported getting on-the-job training from their employers over the past five years.
Hence the great push for ever-more vocational or job-oriented college degrees. The task of training has been foisted upon higher education.
And another reader is very skeptical of the value of higher ed these days:
The Bachelor’s degree is now the equivalent of a high school diploma. No one is impressed if you have one. But if you don’t have one, they'll toss your resume aside. Colleges and universities know this, which is how they can get away with making you take classes you know you’ll never need. That’s fine for high school. But a college student shouldn’t be forced to take a sociology course or two years of foreign language, especially when he’s paying tens of thousands of dollars per year in tuition.
A Bachelor’s degree is also a convenient way for certain professions to limit their applicant pool.
In other countries, if you want to become a lawyer or a doctor, you apply directly out of high school. In this country, you need a four-year degree before you can apply to law school or med school. By the time someone finishes their undergraduate degree, they may already have $100,000 in debt to pay off. How inclined will they be to go to law school or med school and pile on even more debt?
As for employers, certain fields like IT don’t even care what you got your degree in. They just want to know about your skills and experience. Gone are the days where employers actually trained people. Now they expect you to be ready as soon as you walk in. Why? Because employers don’t want to spend time and money training people who’ll then apply for a higher paying job now that they have a stronger skill set.
What college needs to do is prove why a Bachelor’s degree is still worthwhile. If the best answer they can give is “because you won’t get a job without it,” that might be true, but it’s still pretty sad. And if that’s the case, they shouldn’t be forcing students to take classes they don’t want to take.
Another reader searches for solutions:
Four or five months ago, I was driving to work and listening to the radio when a commercial came on for a program called Grads of Life. According to the commercial [another one is embedded below], Grads of Life is dedicated to helping businesses hire from a pool of workers who lack degrees but possess skills and characteristics that would benefit employers. “That’s me!” I thought, vainly. “I am the possessor of the aforementioned skills and beneficial characteristics!”
Delusions of competence in tow, I hurriedly typed the site’s web address into my browser. I envisioned the site as what I had been waiting for: some sort of job-applicant aggregator I could add my name to, coupled with some way to quantify those skills. For years, I’ve been crippled in the job market by my lack of a degree, particularly since my skills are in writing, where out-of-work journalism and writing majors are a dime a dozen.
It wasn’t meant to be, though: Grads of Life turned out to be a Clinton Foundation-fueled PSA program primarily designed to appear to be doing something, while in reality only letting companies re-showcase their preexisting programs for hiring underprivileged workers, without doing any additional work. The few actual programs dedicated to job pathways were aimed at people younger than me, and they served only a few thousand applicants a year. It was a program designed to look good and accomplish nothing.
I was disappointed, but it’s nothing new: nobody is seriously trying to establish any way for non-college educated students to find work.
But what would an effective program look like? What’s probably needed is for someone with the clout of the Clinton Foundation to convince a number of large companies to work with the government to establish a way to “test out” of certain skills that are normally certified by a diploma. An employer won’t and can’t believe an applicant who swears that he or she is smart and skilled enough for the job on promises alone—believe me, I’ve tried that again and again. There needs to be another way to prove a minimum level of skill to employers.
Universities hold a monopoly on the ability to certify many skills. I might have read widely and deeply and practiced long hours to become a skilled writer, but without a diploma to prove it, I’ve had hundreds and hundreds of applications rejected. If there were a way to do an end-run around the diploma process for at least some of the skills for which alternative, non-university paths of development exist, the monopoly could be broken.
An important point to consider regarding universities’ monopoly on certifying skills is the set of standardized tests for various career fields. The Fundamentals of Engineering (FE), CFA, and CPA exams require degree completion in the relevant field just to sit for the test. Some even require the coursework to be “upper division,” eliminating the possibility of associate’s-degree holders sitting for these exams.
These careers (engineering, finance, and accounting, respectively) are three of the most lucrative careers available in the primary labor market today. They represent a clear path to the middle class. Colleges have a clear monopoly on the certifications for these degrees, meaning that the cost of an undergraduate education is another barrier to entry in all of these fields.
Five years ago, the flight vanished into the Indian Ocean. Officials on land know more about why than they dare to say.
1. The Disappearance
At 12:42 a.m. on the quiet, moonlit night of March 8, 2014, a Boeing 777-200ER operated by Malaysia Airlines took off from Kuala Lumpur and turned toward Beijing, climbing to its assigned cruising altitude of 35,000 feet. The designator for Malaysia Airlines is MH. The flight number was 370. Fariq Hamid, the first officer, was flying the airplane. He was 27 years old. This was a training flight for him, the last one; he would soon be fully certified. His trainer was the pilot in command, a man named Zaharie Ahmad Shah, who at 53 was one of the most senior captains at Malaysia Airlines. In Malaysian style, he was known by his first name, Zaharie. He was married and had three adult children. He lived in a gated development. He owned two houses. In his first house he had installed an elaborate Microsoft flight simulator.
The backlash against the Harry Potter creator is a growing pain of her fandom.
It has taken two decades, but I am finally ready to admit that I was the world’s most annoying teenager. My parents are Catholic, and I used to delight in peppering them with trollish questions, preferably several hours into a long car journey. “Why does the Mass service refer to God as ‘he’ and ‘father’?” was a favorite. “Does God have a Y chromosome, then? Does God have, like, testicles?” I was openly dismissive about transubstantiation, by which the host is consecrated, and according to Catholic doctrine, literally turns from mere bread into the body of Christ. “But all the atoms stay the same!” I would insist. “That makes no sense!”
My parents humored me, but predictably, I didn’t find their responses satisfying. Realizing that your omniscient parents are, in fact, just regular, flawed humans is a vital part of growing up. So is learning that their values are different from yours—that they are products of a particular time and place. Ideas and beliefs that they accept without question make no sense to you, and vice versa. As the 20th century ended in the liberal West, the tenets of feminism seemed irrefutable to me: Of course I would go to university and get a job. A family would come later, if at all. (My mother, by contrast, had her first child at 25.) Gay rights were the same: Why on earth couldn’t two men get married? In my 20s, when The God Delusion came out, I bought it immediately. I was proud to call myself an atheist. Religion was nothing but a tool of patriarchal oppression.
As states ease restrictions on businesses, individuals face a psychological morass.
Reopening is a mess. Photographs of crowds jostling outside bars, patrons returning to casinos, and a tightly packed, largely maskless audience listening to President Donald Trump’s speech at Mount Rushmore all show the U.S. careening back to pre-coronavirus norms. Meanwhile, those of us watching at home are like the audience of a horror movie, yelling “Get out of there!” at our screens. As despair rises, the temptation to shame people who fail at social distancing becomes difficult to resist.
But Americans’ disgust should be aimed at governments and institutions, not at one another. Individuals are being asked to decide for themselves what chances they should take, but a century of research on human cognition shows that people are bad at assessing risk in complex situations. During a disease outbreak, vague guidance and ambivalent behavioral norms will lead to thoroughly flawed thinking. If a business is open but you would be foolish to visit it, that is a failure of leadership.
In France, where I live, the virus is under control. I can hardly believe the news coming out of the United States.
I returned to Paris with my family three months after President Emmanuel Macron had ordered one of the world’s most aggressive national quarantines, and one month after France had begun to ease itself out of it. When we exited the Gare Montparnasse into the late-spring glare, after a season tucked away in a rural village with more cows than people as neighbors, it was jarring to be thrust back into the world as we’d previously known it, to see those café terraces overflowing again with smiling faces.
My first reaction was one of confused frustration as we drove north across the river to our apartment. The city had been culled of its tourists, though it was bustling with inhabitants basking in their reclaimed freedom. Half at most wore masks; the other half evinced indifference. We were in the midst of a crisis, I complained to my wife. Why were so many people unable to maintain even minimal discipline?
Americans found out the hard way that education is essential infrastructure.
If American society is going to take one major risk in the name of reopening, ideally it should be to send children back to school. This issue is personal for me. I have three kids, one in college and two in a local public high school. It’s now early July, and we still have no idea whether or how they will be returning to classes that, ordinarily, would resume just weeks from now. My children’s summer has been idle. They have no jobs and not much summer programming to keep them busy. I try to convince myself they aren’t missing out on much. Hey, I grew up in the ’80s, I think, and all we did during the summer was hang out at the beach. Most days, I make it to about 10 a.m. before I rouse them.
Imagine if the National Transportation Safety Board investigated America’s response to the coronavirus pandemic.
Coping with a pandemic is one of the most complex challenges a society can face. To minimize death and damage, leaders and citizens must orchestrate a huge array of different resources and tools. Scientists must explore the most advanced frontiers of research while citizens attend to the least glamorous tasks of personal hygiene. Physical supplies matter—test kits, protective gear—but so do intangibles, such as “flattening the curve” and public trust in official statements. The response must be global, because the virus can spread anywhere, but an effective response also depends heavily on national policies, plus implementation at the state and community level. Businesses must work with governments, and epidemiologists with economists and educators. Saving lives demands minute-by-minute attention from health-care workers and emergency crews, but it also depends on advance preparation for threats that might not reveal themselves for many years. I have heard military and intelligence officials describe some threats as requiring a “whole of nation” response, rather than being manageable with any one element of “hard” or “soft” power or even a “whole of government” approach. Saving lives during a pandemic is a challenge of this nature and magnitude.
Our neighborhood made us sick. A Praxair industrial gas-storage facility was at one end of my block. A junkyard with exposed military airplane and helicopter parts was at the other. The fish-seasoning plant in our backyard did not smell as bad as the yeast from the Budweiser factory nearby. Car honks and fumes from Interstate 70 crept through my childhood bedroom window, where, if I stood on my toes, I could see the St. Louis arch.
Environmental toxins degraded our health, and often conspired with other violence that pervaded our neighborhood. Employment opportunities were rare, and my friends and I turned to making money under the table. I was scared of selling drugs, so I gambled. Brown-skinned boys I liked aged out of recreational activities, and, without alternatives, into blue bandanas. Their territorial disputes led to violence and 911 calls. Grown-ups fought too, stressed from working hard yet never having enough bill money or gas money or food money or day-care money. Call 911.
American conspiracy theories are entering a dangerous new phase.
If you were an adherent, no one would be able to tell. You would look like any other American. You could be a mother, picking leftovers off your toddler’s plate. You could be the young man in headphones across the street. You could be a bookkeeper, a dentist, a grandmother icing cupcakes in her kitchen. You may well have an affiliation with an evangelical church. But you are hard to identify just from the way you look—which is good, because someday soon dark forces may try to track you down. You understand this sounds crazy, but you don’t care. You know that a small group of manipulators, operating in the shadows, pull the planet’s strings. You know that they are powerful enough to abuse children without fear of retribution. You know that the mainstream media are their handmaidens, in partnership with Hillary Clinton and the secretive denizens of the deep state. You know that only Donald Trump stands between you and a damned and ravaged world.
As the pandemic has raged on, popular culture has found new ways to ask an old question: What could have been instead?
There’s a certain kind of movie that lets you down not because it’s bad, but because it could have been great. One of those movies, for me, is Sliding Doors. The 1998 rom-com has a “philosophical” premise and a double timeline: As its poster asks, “What if one split second sent your life in two completely different directions?” In the first timeline, Helen Quilley (Gwyneth Paltrow) gets fired from her job and returns home to her boyfriend—just in time to discover him cheating on her. In the second, Helen misses her train, by one split second, and therefore remains unaware of the infidelity. The two plots—two possibilities—unfurl; in the process, age-old questions about contingency and destiny are answered by way of Hallmarkian melodrama. Like I said: It could have been great. It isn’t.
For his first three years of life, Izidor lived at the hospital.
The dark-eyed, black-haired boy, born June 20, 1980, had been abandoned when he was a few weeks old. The reason was obvious to anyone who bothered to look: His right leg was a bit deformed. After a bout of illness (probably polio), he had been tossed into a sea of abandoned infants in the Socialist Republic of Romania.
In films of the period documenting orphan care, you see nurses like assembly-line workers swaddling newborns out of a seemingly endless supply; with muscled arms and casual indifference, they sling each one onto a square of cloth, expertly knot it into a tidy package, and stick it at the end of a row of silent, worried-looking babies.