In a decade, cognitive enhancement may have gone mainstream. Pills can already help you stay up longer, bring more focus to your work, and who knows what else. But what might sound good on an individual level could create societal disruptions, or so Palo Alto think-tank the Institute for the Future proposes in its latest Ten-Year Forecasts.
As a result, the Institute has proposed that the world's citizens need a "Magna Cortica."
"Magna Cortica is the argument that we need to have a guidebook for both the design spec and ethical rules around the increasing power and diversity of cognitive augmentation," said IFTF distinguished fellow Jamais Cascio. "There are a lot of pharmaceutical and digital tools that have been able to boost our ability to think. Adderall, Provigil, and extra-cortical technologies."
Back in 2008, 20 percent of scientists reported using brain-enhancing drugs. And I spoke with dozens of readers who had complex regimens, including, for example, a researcher at the MIT-affiliated Whitehead Institute for Biomedical Research. "We aren't the teen clubbers popping uppers to get through a hard day running a cash register after binge drinking," the researcher told me. "We are responsible humans." Responsible humans trying to get an edge in incredibly competitive and cognitively demanding fields.
And part of Google Glass's divisiveness stems from its prospective ability to enhance one's social awareness or provide contextual help in conversations; the company Social Radar has already released an app for Glass that shows social network information for people who are in the same location as you are. A regular app called MindMeld listens to conference calls and provides helpful links based on what the software hears you talking about.
Both are one more step toward integrating digital information directly into how we think: prosthetic knowledge.
So what do we do, societally, with all these possibilities from drugs to intelligent digital agents?
"As the technology improves, the potential for abuse and damage becomes really profound," Cascio continued. "So many of the parallel examples that we would go back to have had really bad results: Are we going to treat this like doping in sports, and create a criminal culture around it? Do we treat it as another version of a cell phone?"
These are not questions that can be answered by the development of the technologies themselves. They require new social understandings. "What are the things we want to see happen?" Cascio asked. "What are the things we should and should not do?"
So, he floated five simple principles:
1. The right to self-knowledge
2. The right to self-modification
3. The right to refuse modification
4. The right to modify/refuse to modify your children
5. The right to know who has been modified
The point, as in the other IFTF forecasts, is not to resolve these complex future problems, but to shift the time when people begin thinking about them a bit closer to our present, so that technologists and critics don't begin thinking about them too late. Cascio calls this Magna Cortica version 0.1, a starting point for a deeper conversation about this strange new world.