The tech industry is officially out to remodel your kid's classroom -- and it feels like there's a good chance that it's going to succeed. After years of more or less resisting the pull of the web, both college and K-12 seem ripe to be remade for the digital age. There's political buy-in. There's investor buy-in. There's, frankly, a pervasive sense that it's just time.
But what exactly will tomorrow's schools look like after they get a Silicon Valley-style makeover? What exactly are we trying to accomplish pedagogically by integrating computers more deeply into the classroom? And how do we work more science, math, and tech education into our schools? These were some of the issues that the guests at The Atlantic's Technologies In Education Forum tackled earlier today. Here are a few of the interesting ideas I took away from the wide-ranging discussion.
Kids should play video games in school (and out)
Could video games be the future of tomorrow's classrooms? Maybe. Justin Leites, an executive at News Corp. ed-tech subsidiary Amplify, told the crowd that he used to look at the landscape of educational gaming and wonder, "How is it that all these well-intentioned, smart people have been doing this so long and it's so awful?" Today, he argued, companies are catching up by adopting the production techniques of the big commercial gaming studios. Amplify's goal is to create products that students will actually play for fun outside of school (and no, not as homework). But others are pushing their products into the classroom space itself.
Important disclosure: one of the day's two underwriters was the Entertainment Software Association, otherwise known as the video game lobby. Obviously, it has more than a bit of interest in seeing the K-12 market open up for its products.
In any event, some experimentally minded schools are already embracing video games as teaching tools. Kate Selkirk, a master teacher at New York City's Quest to Learn, talked about students using Minecraft to build roller coasters and ancient Greek and Roman architecture. She explained how she used Prezi -- which is a bit like an interactive PowerPoint for supernerds -- to build mazes that students could navigate by solving math problems. When a student hit a dead end, it served as an instant alert that they were having trouble with the work, allowing her to come over and help.
Which brings us to another one of the day's big themes. Making school more fun is an age-old desire among educators. But it might also make it more effective.
The end of industrial-style learning
It's a common complaint among education wonks that our current K-12 system is the product of a bygone industrial era, when students were processed like widgets on an assembly line. As Roberto Rodriguez, President Obama's special assistant on education, put it, "the norm persists that we treat all learners the same."
But if today's panelists are to be believed, education may finally be about to move into its post-industrial phase. The key, as always, is data. Games and digital homework tools provide instant feedback that lets teachers address specific student needs in ways they haven't been able to before. Richard Culatta, acting director of the Department of Education's Office of Educational Technology, argued that there needs to be a "GPS for learning" that will analyze students' strengths and weaknesses and point them toward ways to improve. We're not quite there yet. But maybe one day soon.
Don't punish noble failures
This was a subtle but interesting point brought up by David Pinder, principal of McKinley Technology High School in Washington, D.C. Thanks to No Child Left Behind, schools now face fairly stiff federal standards. But if teachers are going to experiment with new technologies, they need more margin for error. "Innovation requires failure," he said.
It's time to treat computer programming the way we do algebra
Despite all the education professionals milling around, my favorite insight of the day evolved out of a conversation involving a couple of high school kids and an audience comment. Essentially: It's time to stop thinking of computer programming as a specialty subject. Schools should respect it as a fundamental skill.
Wilfried Hounyo and Sam Blazes were both winners of the 2012 National STEM Video Game Challenge, a competition in which students design games that can help teach subjects in science, technology, engineering, and math (a.k.a. STEM). Blazes's game, "Battle of the Bugs: Genes Rule," asks players to breed super insects and complete challenges using the principles of genetics, while Hounyo's submission, Electrobob, teaches the science of atoms and ions by having players skip around as an electron. The two enviably intelligent teens spent much of their interview with moderator and Atlantic national correspondent Hanna Rosin explaining how they'd come to fall in love with programming. One key takeaway: Robot-building teams are the newer, infinitely more awesome version of math club for kids today. Another: As Rosin noted, these students don't think of programming as a whole separate world the way older adults tend to, but as a tool they can use to explore their interests.
That might seem obvious to some, but it's worth dwelling on in an education context. Coding is a fundamental tool, the same way writing in English and algebra are. Moreover, having a basic understanding of how technology actually functions and is developed is becoming important across more and more industries. Yet most schools don't treat it that way. They look at it as a niche. Later in the day, during a Q&A with Minnesota Senator Amy Klobuchar, an audience member expressed frustration that not every state treats computer programming as a course that can fulfill core math requirements the way, say, algebra does. Perhaps it's time to change that. Or, at more ambitious schools, maybe it's time to think of ways to work coding into other subjects, the way students exercise their writing skills in social studies or in science papers.
The MOOC revolution is heading to grad school
Last night, Georgia Tech announced that it was partnering with Silicon Valley startup Udacity to create a new, online master's degree program in computer science. In higher ed land, this is very big news. Udacity is one of the leading pioneers of massive open online courses (known to many as MOOCs), college classes that are generally free to the public and can enroll thousands upon thousands of students at a time. Recently, Udacity partnered with San Jose State University to create experimental for-credit courses. But the Georgia Tech effort appears to be the first full-on degree program created in conjunction with a MOOC provider. Students anywhere in the world will be able to take the courses using Udacity's platform, but only those who are admitted by Georgia Tech will be able to earn a degree.
And here's the twist: It was bankrolled in part by a donation from AT&T, whose Assistant Vice President for Education Leadership, Charles Herget, joined an afternoon panel to discuss college and the workforce. (I should also say here that AT&T was one of the underwriters for the day's event. But, I promise, I'd be writing an article about this development regardless.) Herget told the crowd that half of the students during the program's first year would be AT&T employees, and that the company was hoping it would be a recruiting and training resource in the future.
There are many things I find fascinating about the Georgia Tech, Udacity, and AT&T partnership, but I'm going to focus on two of them. The first is that it suggests the new wave of digital ed tech might reshape graduate programs before making waves on the undergrad level. There are a few reasons why this would make sense. First, one of the big concerns about MOOCs is that they might not be appropriate for marginal students who have trouble self-motivating. In grad school, that's not as much of an issue. Furthermore, there's already a very large and lucrative market for online graduate education. Companies like Udacity might only have to do it cheaper, better, or (in an ideal world) both in order to compete.
The second fascinating aspect has to do with the involvement of AT&T. I am a deep, deep skeptic of the argument that America is suffering a desperate shortage of tech talent. But companies constantly complain about their lack of qualified applicants for the jobs they have open. AT&T has responded proactively by putting its stamp on a program it thinks will prepare workers with the necessary skills. I'm curious whether other corporations -- say, Microsoft -- might follow suit.
Use schools to build up better broadband for everyone
In order to bring the technology of the future into public classrooms, those classrooms need a decent Internet connection. And that's going to require some investment. In the Telecommunications Act of 1996, Congress created the eRate program to expand Internet access to schools and libraries. Today, 95 percent of schools have a broadband connection, said FCC commissioner Jessica Rosenworcel. Unfortunately, she added, many of those connections are too slow to handle the high-end educational tools now being developed, or even HD video. She argued that it's time for the FCC to pursue an "eRate 2.0" to improve the quality of those connections.
It's not just students who would benefit. Extending high-speed Internet access to schools, Rosenworcel argued, makes it easier and cheaper to extend it to the surrounding communities. Schools could easily become nodes in a better national broadband infrastructure.