One thing a "People's Terms of Service" would challenge is the Objective Theory of Contract, the doctrine that attempts to ignore the context in which contracts are negotiated and agreed upon. "The law currently protects one-sided contract arrangements," the authors write, "by assuming they were fairly negotiated, and thus reflect a 'meeting of the minds' by equal parties." Yet that assumption, in a world of boilerplate jargon and pages-long disquisitions, is no longer a fair one to make. "After all," they argue, "these contracts are usually created through user confusion and one-sided demands. How can citizens even bargain with a standard, take-it-or-leave-it form?"
The People's Terms of Service Agreement, instead, would be informed by consumers' desires as well as corporations'. It would be user-friendly. It would "use plain English, not legal jargon." It would be short enough for people to realistically be able to read it. Most of all, it would embrace five values: security, confidentiality, transparency, permanency, and respect for intellectual property.
Permanency: The contract cannot be unilaterally altered, period. This seemingly obvious rule, which applies to most contracts, has been undermined by technology companies that attempt to reserve the right to alter their terms of service without meaningful consent from users. A People's Terms of Service would require meaningful opt-in from users for any material changes. Users would also retain the right to have all their materials permanently deleted if they choose to leave the site, and that provision could never be altered.
Transparency: Companies promise to be transparent and provide meaningful notice to individuals regarding their collection, use, dissemination, and maintenance of personal information.
Intellectual Property: Companies respect the value of an individual's name or likeness and the copyrights of user information and work posted on their sites, either by taking minimal licenses for administrative use or by providing some profit-sharing for advertisements or other commercial use of a user's name or likeness. (This is an important option for companies building a business model on monetizing user-generated content.)
Confidentiality: Companies promise not to disclose personal information to third parties unless users meaningfully opt in to such disclosure for each party. (As always, they would still respond to government and legal requests.) Facebook itself makes a similar promise. To ensure protection, companies also promise to contractually require that recipients of personal information respect the anonymity of any transferred data sets, and to provide users with control over data portability so it's easier to leave a given service.
Security: Companies promise to use appropriate, industry-standard security safeguards to protect all media against risks such as loss, unauthorized access or use, destruction, modification, or unintended disclosure.
Those core ideas, the scholars say, are a starting point -- values that interested users and consumer advocates could debate and then, perhaps, draft into a model contract. The goal would be to leverage the force of collective action to counter the power enjoyed by digital behemoths. "The power of even a few million social network users, when coordinated," the authors write, "could push companies to offer more consumer rights as a good business decision." (The idea was inspired, in part, by recent class-action suits against Google, Instagram, and Facebook -- the last of which accused the company of co-opting users' identities in online ads. In response to that action, Facebook revised its "Statement of Rights and Responsibilities," in addition to offering the plaintiffs a $20 million settlement.)