When Aaron Swartz, a prominent programmer and digital activist, was arrested in 2011, he was halfway through a fellowship at Harvard’s Center for Ethics. Police took him in after he entered a computer closet at MIT and downloaded enormous amounts of data from JSTOR’s extensive database of academic research.
Swartz was charged with 13 felonies, 11 of them based on the Computer Fraud and Abuse Act, or CFAA. Some of the charges hinged on violations of JSTOR’s terms of service, which prohibit bulk, automated downloads that prevent other customers from accessing its documents. Faced with the possibility of decades in prison and up to a million dollars in fines, Swartz took his own life in 2013.
The ordeal brought CFAA—a 30-year-old anti-hacking law that has been updated half a dozen times to keep up with new technology—under intense scrutiny. Digital-rights activists decried the law for its vague definitions of “unauthorized access” to a computer, and for punishing researchers and academics who analyze and pick apart computer systems they don’t own or operate. Some members of Congress tried (and then tried again) to roll back elements of the law, including those that have been used to criminalize terms-of-service violations. But a bill sponsored by Representative Zoe Lofgren and Senator Ron Wyden—a pair of tech-savvy Democrats—died in 2013, and is currently stalled after being reintroduced in 2015. The lawmakers named their bill “Aaron’s Law” after Swartz.
Now, a group of academic researchers and journalists is suing the government, challenging the constitutionality of part of CFAA. With the help of the American Civil Liberties Union, they’re targeting the portion of the law that makes it illegal to break private companies’ terms of service, claiming that the rule violates researchers’ and journalists’ rights to conduct research and investigations in the public interest, as guaranteed by the First Amendment.
Those terms are essentially contracts that users must agree to before receiving an online service like a social network, a retailer’s online catalog, or a specialized search engine. They come in the form of the seemingly endless blocks of lawyerly fine print that pop up as you sign up for a new service—the ones nearly every user clicks through without reading. But although they’re an individual’s agreement with a company, CFAA makes violating them a federal crime.
The four professors bringing the lawsuit are conducting research into racial and other discriminatory biases in online services. One pair of researchers—Christian Sandvig, of the University of Michigan, and Karrie Karahalios, of the University of Illinois—is examining real-estate websites, and the other pair—Alan Mislove and Christo Wilson of Northeastern University—is looking at hiring websites.
Both projects are trying to determine whether the sites serve different search results, listings, and ads to users who appear to belong to minority groups. To do so, the researchers are creating an army of fake profiles and tweaking them to look like they belong to a diverse set of people. But using that tactic—one that's widely used by researchers—could make the professors felons: The terms of most online services, including the largest employment and housing-search websites, prohibit creating multiple profiles, falsifying profile information, and scraping publicly available information with automated scripts.
The lawsuit argues that testing for biases is essential to enforcing the Fair Housing Act, which prohibits discrimination in home rentals or sales, and the Civil Rights Act, which bans discrimination in hiring. In the suit, ACLU lawyers point out that the government has conducted nationwide undercover testing for illegal biases in housing since the 1970s, and that courts have repeatedly upheld the rights of employment-bias testers.
They argue that online testing for biases in housing and employment is just as important as older testing models, in which actors of different races would pose as buyers or job candidates and record the responses they received. Online, individuals might not be discriminated against because of a single realtor's or hiring manager's personal bias; instead, biases baked into big-data algorithms can prove harmful to candidates and applicants.
Even if a person isn’t required to enter their race, gender, or sexual orientation into an online form, a website can often infer those details. Data brokers create profiles for people based on their online activity, and sell those profiles to companies and organizations. The researchers want to know if a user whose profile indicates she’s black will be served different apartments or jobs than one whose profile indicates he’s white.
The lawsuit also includes First Look Media, the parent company of The Intercept, a news site focused on national-security, surveillance, and civil-rights reporting. The suit says the publication’s journalists are vulnerable to the same sorts of prosecution for their investigative work as the professors are for their research.
As long as attempts to reform CFAA in Congress are stalled, challenging the law in court may be the only way to move the needle. If the case sees any success, it could even breathe life back into Aaron’s Law, or a similar bill. But until the law is changed, researchers who break companies’ terms of service for work have to keep looking over their shoulders for fast-approaching corporate lawyers—or federal prosecutors.