A college degree may be the golden ticket to a better job, but that incentive alone isn’t enough to stop millions of students from dropping out of school. In fact, just over half of students complete their postsecondary degrees within six years. But a lack of academic preparation is not necessarily the saboteur of their success: More than 40 percent of dropouts left their studies with at least a B average, a recent analysis of 55 colleges showed.

Faced with bleak statistics such as these—in addition to scrutiny over their affordability—colleges are looking in the mirror to examine how they might do more for these students who have the talent to make it but ultimately don’t. At a growing number of schools in states like Maryland and Tennessee, the results of this soul-searching are starting to take shape as a series of digital columns and rows in spreadsheets. This reform practice even has a flashy name: predictive analytics.

Colleges have looked at student data before, but often “it’s data too late,” said Frederick Corey, the vice provost of undergraduate education at Arizona State University, who spoke at a meeting for education reporters last month.

Corey is one of the university’s main drivers of using data to improve students’ academic experiences. While we may view lists of numbers as the arch expression of a campus’s impersonal attitude toward its students, predictive-analytics evangelists believe that data collected the right way can ultimately personalize a student’s time at a large school in ways that weren’t previously possible. The result is a system of timely suggestions that prompts students to perform the tasks shown to improve their chances of completing a course, and ultimately a degree. But while the potential is high, the risks are salient, too. When does a digital nudge turn into a dictum that prevents a student from chasing her dreams? And when does the digital profile behind that nudge become a risk to the student’s privacy?

Institutions of higher learning have always gathered copious amounts of information about their students, from how many of them complete certain courses to how accurately a grade in one course predicts their success in harder classes down the line. But until recently, much of that information had been collected merely for accountability purposes—data that are shared, for example, with state and federal agencies that track how successful colleges are at graduating their students or giving them educations that allow them to earn living wages in the workplace. (It’s these data mandates that allow journalists to report on the previous year’s graduating class.)

But what about using these data to look ahead? “I just want to understand, why is the world of education obsessed with autopsy data?” Mark Milliron, the co-founder of education predictive analytics firm Civitas Learning, recalled his partner at work asking after a few months on the job. “It’s always studying data of students who are no longer [enrolled].” And often the data are sliced to show how the average student behaves, painting a picture of the typical student that actually applies to no one. “How many of us know people with 2.3 kids?” Milliron quipped.

And not all data are digital. Small colleges already have a type of predictive analytics built into their systems, explained Milliron. Thanks to small faculty-to-student ratios, professors and administrators are able to make quick judgment calls about their students’ weaknesses or points of trouble—lack of participation in class, fear of making eye contact, the tremors in the voice hiding the embarrassment of being overwhelmed—and act on those observations. But “you’re probably not going to get that same personalized experience” at larger campuses, where students are likelier to be the first in their families to vie for a bachelor’s degree and may not know how to navigate both the bureaucracies and expectations of a college education, Milliron said. With predictive analytics, “you’re connecting the dots so you’re not getting lost in the mix.”

Experts say a good predictive-analytics system avoids making recommendations based primarily on a student’s financial or cultural background. In Milliron’s experience, many colleges initially assume it’s enough to observe that low-income students or those from certain racial groups underperform; they then make assumptions about incoming students from similar backgrounds and refer them to mentoring sessions or extra time with advisers. “You’re insulting and/or stereotyping that student,” Milliron said. Worse, colleges may feel motivated either to exclude those students from their admissions or to lower their standards for issuing degrees.

A more precise predictor of success than race or family income alone is whether a student’s financial aid is adequate to cover her needs. Students stressing over holes in their finances are at greater risk of leaving college—as Temple University learned when it turned to data to boost its graduation rates. (In fact, hundreds of schools are offering emergency small loans and grants to students who may be at risk of dropping out due to diminished funds.) Another telltale sign that students may be off track is whether they have enrolled in a key prerequisite for their major by a certain point early in their college tenure. Sending alerts to them or their advisers can preempt the cascading effects of taking the necessary classes too late.

Maintaining a staff of data analysts who are able to monitor student behavior in real time across multiple variables can be expensive for colleges, but the payoffs can be huge. Corey of Arizona State University said that since his school began using predictive-analytics programs nearly a decade ago, it’s seen its graduation rate climb by 20 percent. One tool ASU has relied on is College Scheduler, a product that several hundred postsecondary institutions have used. Before they sign up for classes, students enter personal information into a dashboard program that spits out possible course schedules, taking into consideration their personal and academic obligations—like being a working parent pursuing biology who has to pick up a daughter from daycare. The tool can be valuable because many students may otherwise end up taking courses that don’t count toward their major, wasting their time and financial aid. At ASU, College Scheduler auto-populates with the courses students have to take, Corey said.

“Students, unless they’re John Nash, Jr., can never do that matching,” said Milliron, who added that College Scheduler has been shown to boost college-completion rates by more than three percent.

Still, handling all those data requires a high degree of training and security, because universities have “data points on a student encompassing almost every single aspect of that student’s life in a way that no one else does,” said Brenda Leong, a senior counsel and director of operations at the Future of Privacy Forum. Beyond the abuses of power that are potential hazards of predictive analytics, there’s also the difficulty of ensuring a student’s privacy. Leong noted that just by knowing someone’s birth date, gender, and zip code, there’s an 87 percent chance she could determine that person’s identity. Leong said she often hears boosters of big data referring to the growing amounts of student information as “fields of gold.” “That’s the kind of phrase that puts a lot of people off,” she said. “It’s not data, it’s students; it’s real people with real lives.”

How these predictive data are relayed to students matters as much as the data themselves, experts contend. Haughty notes or clinical red flags in students’ inboxes—warning that they’re jeopardizing their academic futures by skipping classes or failing to log into online portals to turn in homework—can chase them away permanently. “This phrase, ‘You’re at risk’ is highly problematic. We never say to a student, ‘you’re at risk,’” Corey said. In recent years, some scholars have been developing motivational language for students that strikes the right tone between concern and an encouraging call to action. The idea is for colleges to adopt these terms so students feel emboldened to improve rather than distraught over their limited success.

Milliron said more encouraging language would say something like, “Just so you know, the next milestone is this course. If you pass this course at this level, you’ll triple your likelihood of graduating.”

But even that approach might lead to miscues. Leong warned that if professors are the ones monitoring the student data and firing off such missives, their opinion of those students may be altered. The professor could become overly solicitous or judgmental, undoing the potential benefit of the initial concern. One workaround is to have mentoring officers who are trained to speak sympathetically with struggling students send those notes and track the data.

In other words, predictive analytics is a few Jurassic eggs along in its evolution. Getting it right will take time, Milliron said. “Anybody who says they have this all figured out doesn’t know what they’re talking about.”


This article appears courtesy of the Education Writers Association.