In 2003 the iPod was a relatively new gadget for listening to music. Billboard ads showed young people dancing, iPods in hand. Few people would have pinpointed this newfangled Walkman as a powerful teaching tool.
Cathy N. Davidson, a professor at Duke University, believes that classrooms aren't keeping up with the kids. She wondered: what was the untapped educational potential of the iPod? She and her Duke colleagues worked with Apple to give every entering freshman an iPod, then sat back and watched as students and teachers developed innovative, collaborative ways to incorporate iPods into their work: med students listened to recordings of heart arrhythmias, music students uploaded their compositions and got feedback from other students, and environmental studies students interviewed families in a North Carolina community about lead paint in their town, then shared the interviews online for other students to download.
No one could have predicted all the ways the iPods enhanced learning once they were in the hands of students and teachers -- and that's a central point of Cathy Davidson's new book Now You See It. In it, Davidson argues that though our lives outside of the classroom are changing rapidly, our classrooms remain stuck in an earlier era. I asked Davidson a few questions about her book and what schools can do to prepare students for a future we can't even predict.
One of the foundational facts of your book comes early on. "By one estimate," you write, "65 percent of children entering grade school this year will end up working in careers that haven't even been invented yet." You argue that our education system is structured to produce workers for an economy that will not exist when today's students enter the workforce. But many educators object that changing curricula and standards to incorporate online collaboration, new gadgets such as iPods, and video games takes away from traditional learning. Why is that? What are they worried about?
First, I have sympathy for the in-the-trenches teacher who is constantly asked to change without a good reason for the change, especially when a school district gets a windfall like "free iPads" without any sound curricular motivation. I am against simply dumping technology into a school system. Technology alone, without a clear redesign of the learning it is meant to enhance, is not enough. The exception, perhaps, is a really terrible classroom where there isn't much of anything else going on. In that situation, a gadget such as an iPad can offer a smart kid a real opening onto a wide new world.
Second, I think one area of resistance by educators comes when we simply critique teachers for not adapting without offering them support and time for retraining and upgrading. In the business world, IBM happens to spend the equivalent of $1,700 per year per employee on retraining to help workers adjust to a rapidly changing environment. How can we expect teachers to change without similar investment in their future and in the future of our kids?
Pedagogy is tough. Relearning one's own teaching methods requires time out of the classroom dedicated to experimenting with and practicing new methods, with serious feedback from teacher-mentors to help. If we are serious about reform, we have to be serious about teacher professionalism and aid that process, not simply hurl critiques at "bad teachers."
Third, I don't think we do a very good job educating teachers to understand that they have inherited an education system mostly designed to prepare students for the focused, task-specific form of attention demanded by the late-19th-century assembly line and then, later, by the similarly hierarchical and regulated corporation.
The school bell was the symbol of public education that developed in the 19th century because teaching all humans how to arrive at a school/workplace on time, how to complete a task or "subject" in a designated amount of time, how to work on a test or a project in a specific amount of time was a new way of calculating human productivity. Teachers ought to think about how much of their system has been designed to prepare students for the punch-clock world, and reevaluate their goals and routines in light of the world kids will enter: an interactive, globalized, and contributory world.
But, finally, it is just hypocritical to think educators can make such a paradigm shift without changing all the systems of assessment that judge the success of our teachers and students. That's why I decided to devote an entire chapter to "How We Measure." All the methods of assessment we use for "quality" are actually metrics for standardization analogous to the punch clock, not measures of interactive, synthetic, and analytical thinking and problem solving. You cannot reform the content or the method of teaching without radically changing the terms of assessment. That means ending the end-of-grade tests required by No Child Left Behind. It means going beyond so many of the quite simplistic quantitative measures that ostensibly test learning but really test the ability to take tests.
I actually tracked down the archive of the PhD student who invented the multiple-choice test in 1914, specifically to address a historical moment: the convergence of new laws requiring secondary schooling, immigrants flooding into America in record numbers, men away fighting in World War I, and women working in factories. Frederick Kelly looked at Ford turning out Model Ts on assembly lines and invented the Kansas Silent Reading Test to be the Model T of testing -- not very thorough, not deep, a test only of "lower-order thinking," but easy, sound, fast, efficient.
I am not against testing -- I am against using such a crude form of testing, one that is such a disincentive to deep interactive learning, as our national standard. That's demoralizing to teachers, parents, school administrators, and, mostly, to kids. In so many ways, our educational system is an assembly line churning out kids like Model Ts. That's not what our kids need to address the constant changes and complexities of the 21st century.
Though we don't know what those 65 percent of jobs will be, you do sketch out a rough idea of what the future workplace will be like and what skills workers will need. Can you describe that future and what you recommend doing to prepare for it?
Let's not even try to imagine the future. Everywhere around us are new kinds of employment that didn't exist a decade ago. In fact, if you look back a decade to what people predicted we'd be doing now, almost everyone was wrong. Very few people, including those who invented the Internet and the World Wide Web, understood the impact crowdsourcing would have.
The term "crowdsourcing" was coined only in 2006, and Wikipedia defines it as "outsourcing tasks traditionally performed by an employee or a contractor to an undefined, large group of people or community (a 'crowd') through an open call." Wikipedia, of course, is itself an example of crowdsourcing, and no one predicted that millions of people all over the world would, as unpaid volunteers, create the largest encyclopedia the world has ever seen and work to edit it, making it better and more reliable than any encyclopedia before it. No economic, intellectual, or pedagogical theory predicted Wikipedia. As "human nature" was understood a decade ago, such a cooperative venture wasn't even possible, and yet there we have it. And so much of contemporary life is now crowdsourced -- including Mozilla's free and open-source Web browser Firefox, which has taken over nearly 30 percent of the worldwide usage share of Web browsers.
So, think about how many "jobs of the future" already exist, on every level, just related to crowdsourcing: (1) the full array of programming jobs, building computational platforms for participation ("crowdsourcing"); (2) the full array of manufacturing jobs building the hardware by which information is transmitted (cell-phone towers are structures) and received, and by which computers are assembled; (3) the full array of assessment jobs, creating, designing, and implementing relevant testing systems, such as those used by developer platforms like Top Coder, so that anyone's contribution to a crowdsourced system can be "graded"; and (4) intellectual property lawyers to sort out the complex IP issues of our age, ideally not just reforming but reconceptualizing outdated and stifling copyright and patent laws. The list goes on.
In the years ahead, we will need knowledgeable activists to fight to protect our labor laws in a new economy and to protect our personal and civil rights at a time when, from Egypt to San Francisco, authorities have been willing to turn off the cell-phone towers to quell protests. We will need new kinds of financial analysis to understand and regulate erratic market behaviors that the new computational systems already allow. And we will need new artists, writers, dancers, and musicians who can make beauty from our digital, interactive participation on the Web and beyond it.
When you think in these terms, about how drastically paradigms and potentialities have shifted in such a short time, you realize even that 65 percent figure is probably too small!
You write that "Games are unquestionably the single most important cultural form of the digital age." What do you mean by that? How could games be used to improve education?
Games are integral in human society, from ancient times to the present. Games are based on strategy and on challenge. If you do well at a game, your reward isn't "recess" or a "time out"; it's a greater challenge. When you beat a tough opponent, you seek out a tougher one. That is learning. Being able to harness the energy of games is one of our best learning tools, as any good parent knows, from patty-cake to Simon Says to musical chairs to chess or go. You can advance physical, mental, linguistic, and intellectual progress through games where the testing isn't after the fact but is intrinsic to and embedded in the very structure of play.
In the 1980s and 1990s, virtually all the research on early video games was positive, documenting benefits to everything from attention to memory. Games are still used to train pilots, the military, architects, surgeons (robotic and traditional), musicians, and engineers, and are also used for rehabilitation and to help maintain or enhance cognitive function in the elderly. But three factors shifted the focus of research on games away from learning and toward the negative effects games could have on kids. (1) Cute, abstract games like Pac-Man gave way to graphic, violent narrative games such as Grand Theft Auto. (2) Kids really took to video games; a recent Pew survey indicates 97 percent of kids play them. And (3) blame for the terrible 1999 Columbine tragedy, in which kids systematically sought out and executed their classmates, was pinned by the public and the press on rock music and video games. The actual commission that studied Columbine did not reach this conclusion, but parents and educators were understandably alarmed. After 1999, research dollars that once went to thinking about games and learning were rerouted to moralistic studies of how video games lead to violence, asocial behavior, and so forth. We lost a decade.
I am very proud to be part of the MacArthur Foundation's Digital Media and Learning Initiative, which has spent the last several years trying to reverse that trend: to do research that studies how kids actually use media, not how we fear they do, and to develop exciting games that motivate creative, challenging learning. I recently saw a demonstration of a fantastic online algebra game, for example, in which every problem is itself a test: if you don't solve a problem, the system generates a new one that steps back to more basic principles, and when you succeed, it generates a more advanced problem, and so forth. The results are amazing, because the test isn't at the end of the year; it is in everything you do, as you do it, getting not just harder and harder but more and more interesting. We know that boredom -- for the most gifted students and also for the lowest academic achievers -- is the biggest inhibitor of learning there is. Games motivate. Checkmate!
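The adaptive loop Davidson describes -- step back to more basic material when a learner fails, step forward when they succeed -- can be sketched in a few lines of Python. This is only an illustration of the idea, not the actual game's code; the difficulty levels and the 70-percent success rate are hypothetical.

```python
import random

# Hypothetical difficulty ladder for an adaptive algebra drill.
LEVELS = [
    "one-step equations",
    "two-step equations",
    "variables on both sides",
    "systems of equations",
]

def next_level(current: int, solved: bool) -> int:
    """Advance on success, fall back toward basics on failure."""
    if solved:
        return min(current + 1, len(LEVELS) - 1)
    return max(current - 1, 0)

# Simulate a learner who solves roughly 70 percent of problems.
level = 0
for _ in range(20):
    solved = random.random() < 0.7
    level = next_level(level, solved)
    # Every answer is itself the "test": difficulty adapts immediately,
    # rather than waiting for an end-of-year exam.
```

The point of the sketch is that assessment is embedded in play: each response immediately reshapes what the learner sees next.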
Are there any aspects of our educational model that you think will serve us well in the years ahead? In what ways is our world not changing?
The most fundamental conditions of a classroom are precious -- kids together in a special environment dedicated to their improvement, with the support of a teacher to guide them along the way. Structurally, there is no better situation in which a society can invest. Now, does that environment have to be face-to-face or can it be supplied online, by a virtual community? I like to say that if, as educators, we can be replaced by a computer screen, we should be. That is, if all I'm doing is standing up at the front of the room and yapping at kids, not responding to their individual needs and their collective energy, not exploiting this rare and special opportunity for us to be together, then someone should pocket my salary and use it to buy some really effective online, interactive learning system supported by an online community of active and interactive mentors and facilitators.
The astonishing online nonprofit Peer-to-Peer University offers courses on everything from computer programming to Baroque art in a crowdsourced, interactive way, proposed by individuals and taught by others. Recently Stanford announced it was offering a free, online course in Artificial Intelligence. Last I heard, over 100,000 people had signed up for it. The virtual organization of educators that I cofounded in 2002 is HASTAC (pronounced "haystack," an unwieldy acronym for Humanities, Arts, Sciences, and Technology Advanced Collaboratory). It now has more than 7,000 active members, including 200 undergraduate and graduate students on small scholarships around the world who hold online intellectual forums on topics they generate -- "Democratizing Knowledge," "Digital Storytelling," "Grading 2.0: Evaluation in the Digital Age," "Race, Ethnicity, and Diaspora in the Digital Age," or even "Critical Code Studies." These heady, online, student-generated forums have seen over 350,000 visitors in the last two years.
The point is that if the classroom experience is inferior to an online educational program, get rid of it! If you respect and honor the fact that humans love collective experiences where we cheer, fear, laugh, or learn together -- we pay to go to sports, movies, comedy clubs, concerts, and lectures -- then you can begin to rethink school as a collective event and maximize what is added by a group experiencing together.
In researching Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work and Learn, I spent a lot of time in the classrooms of gifted individuals who sometimes used very little actual technology but really thought about interaction in profound and inspiring ways. I profiled some of those in the book. And, of course, I talk about how my own teaching has been transformed by what I've learned from the lessons of the digital age we live in and, more specifically, from the marvelous teachers I've met in the course of my research on this topic.
That is the real message of the book. As long as we understand the importance of "unlearning," we can take the next step of relearning in order to be happier and more productive in a challenging world. This is true at any age. We have to first take the time to understand where our assumptions come from. History is very useful here. We have to understand and evaluate our own habits and realize that they are where our attention blindness stems from.
That which is automatic is unexamined -- and that's what gets us in trouble. Our habits cause us distress when the world is changing fast and we cannot keep up. But just as we learned our habits of attention, we can unlearn them and relearn new ones -- if we have the right partners, tools, and methods to help us do that. That's what I was able to find in doing the research for Now You See It: not ideas about the future, but real people, right now, who had found ways to maximize their own potential by working with the right partners, tools, and methods. They have inspired me to make really positive and productive changes in my own patterns -- and I hope their compelling stories will inspire my readers too.