At Georgia State University, algorithms alert advisers when a student falls behind in class. Course-planning tools tell students the classes and majors they're likely to complete, based on the performance of other students like them. When students swipe their ID cards to attend a tutoring or financial-literacy session, the university can send attendance data to advisers and staff.
Colleges are analyzing all kinds of student data to figure out who needs extra support and when advisers and faculty should intervene. But as technology advances, and students' offline and online lives become more intertwined, data analytics—particularly predictive analytics—may raise more ethical questions.
Georgia State started using predictive analytics in 2012. It worked with the Education Advisory Board, a for-profit consulting company, to analyze millions of past course grades and create algorithms that identify signs of academic struggle—from the obvious, like failing a class, to the not-so-obvious, such as barely passing a core class required for a major. The system is designed to predict students' likelihood of success in any major.
The university has since added financial-aid data (and is slated to add card-swipe data) to the predictive model to create a more comprehensive assessment of every student's progress. All the data analysis has reportedly helped the Atlanta university target outreach to its entire population of more than 32,000 undergraduate and graduate students, many of whom are low-income, African American, or Hispanic.
Supporters say this kind of data analysis is legal and can be performed with students' interests in mind. Timothy Renick, the vice president for enrollment management and student success at Georgia State, said the university is "resoundingly confident" that it's complying with the Family Educational Rights and Privacy Act, federal legislation that addresses the collection and use of student data in higher education. Using information to benefit students, he said, "is exactly why we have access to the data in the first place."
But the practice still raises privacy and ethics concerns, according to Joel Reidenberg, the founding academic director of Fordham University's Center on Law and Information Policy. Even when colleges collect aggregate data and scrub it of personally identifiable information, that still counts as surveillance if they use it to guide individual students, he said: "You have to do the data-mining to be able to profile the individual. And you're taking action based on the data-mined profile."
Renick said that Georgia State has considered whether students should be told that universities are mining their data. If a student were to complain, he said, the university would stop tracking them. But so far, according to Renick, the university has received nothing but positive feedback from students.
It is now routine for all kinds of websites to customize users' experiences based on data analytics. Young people have different expectations of privacy than do older generations; after all, they grew up sharing personal information on social media. "What's happening now with the university's interaction with them is not that different from what's been happening on Facebook and other places—Amazon, and so forth," Renick said.
The average college has nothing close to the analytic capacity of Facebook or Amazon. But colleges in theory could data-mine almost every aspect of a student's life. Institutions can track what students say in online class forums, which students download the lecture notes, and how long they spend reviewing online material. Institutions can record when and where students swipe their ID cards to follow their physical movements, from the dining hall to the health center.
To engage in these practices, institutions typically build or purchase software—the Knewton Platform, for example—that analyzes every keystroke a student makes to figure out his or her learning style. "The NSA has nothing on the ed tech start-up known as Knewton," Politico wrote earlier this year. Some of the data these learning applications collect doesn't fall under the federal government's definition of "educational record," and thus isn't covered by the laws that restrict the kinds of information colleges can and cannot share with third parties.
But some experts are starting to question how this type of universal data collection could affect the educational experience. One of them is Matt Pittinsky, the technology entrepreneur who cofounded the learning-management system Blackboard and currently serves as the CEO of Parchment, another education technology company. Though Pittinsky, who also teaches sociology at Arizona State University, believes that most colleges analyze too little data (and thus fail to address completion and quality issues), he's sometimes troubled by what he hears at ed-tech conferences about the big-data movement. Last year, he even challenged a fellow panelist over comments about the potential of universal data collection in higher ed: "I just sort of stopped and said, 'I think you're describing a state of education where every interaction a learner is having with a faculty member and with each other [online] is tracked and used to form judgments about them, to form judgments about people like them, to form judgments about the next group of people like them.'"
"There's something worth talking about in that," Pittinsky said. He worries that the back and forth of classroom inquiry might be stifled if faculty and students knew every keystroke they made—even every word they spoke—was being recorded and used to make predictions about them.
Predictive tools could convince educators and students that the academic future is predetermined, in the same way that placing students into "honors" or "regular" grade-school classes can end up defining them. "What begins as the notion of pacing education to each learner's abilities at the time can very quickly become a solidified view of what someone is able to do and what someone is not able to do, with very heavy-handed direction given to them about what they then have access to," Pittinsky said.
Universities should be able to navigate privacy and ethical issues: They are, after all, packed with people who conduct research and ponder big questions for a living. With well-trained advisers and well-designed tools, predictive analytics needn't pigeonhole students into one major over another.
At the heart of the debate over predictive technology are two competing visions of the objective of a college education. Should college be a period when students can find their passion, make mistakes, and learn from them? Or does that approach doom some students—particularly underrepresented students—to failure?
"What we were doing in the past wasn't working," Renick said. Under Georgia State's former, less proactive advising system, students whose parents had gone to college were fine, but first-generation students floundered. "They left with high amounts of debt and no degree," he said. "That was not an acceptable program."