Return to "Where the Rubber Meets the Road," by Ralph Lombreglia.
An E-Mail Exchange with Paul LeBlanc
To: Ralph Lombreglia
From: Paul LeBlanc
Subject: Re: Marlboro Graduate Center
Your new Graduate Center offers Master's programs for people who want to integrate network technologies into business and K-12 education. These are vastly different missions, with (presumably) quite different measures of success. What are the similarities and differences of the two programs? Does the agenda of one shape the agenda of the other?
How do these worlds differ? The business world has a bottom-line orientation that is easier to quantify and assess. It has competitive realities that drive faster and more thoroughgoing implementations of technology. It is more willing to spend money on staff development and on building internal skill sets. Educational effectiveness is a hazier target, and its definition is hotly contested. Education has fewer resources and moves more slowly. Most unfortunately, it too often ignores staff development. One federal study recommends that 35 percent of school-technology budgets be devoted to teacher training. The national average is about 5 percent, which is woefully inadequate.
You're interested in the wholly new opportunities that digital technology will provide, rather than in re-implementations of existing ideas and practices. So that readers will understand what "new" means, can you give some real-world examples?
Sure. But first let me suggest that merely pouring money into new technologies in order to do more of what you have always done -- whether it's manufacturing, marketing, or teaching -- is a dubious investment. In the late eighties and early nineties, large American corporations hemorrhaged money on technology with questionable return on their investment. Remember the number of economists who challenged the investment in technology? It wasn't until they fundamentally rethought what they were doing that they saw genuine productivity gains. By the way, that "rethinking" was not top down; it came about when people who did the work were empowered (through training and education and license) to make changes and use technology tools in new ways.
Now, to the question: What kinds of new ways? Ask Barnes & Noble about Amazon.com. This upstart bookseller now has the world's largest inventory and 1.5 million customers, and over half of them are coming back for more. It isn't yet profitable (its business plan calls for this first-phase growth period), but it is blazing a new way to sell books. Remember that the first $2-million business on the Web was a florist shop. I think direct mail -- which comes to the consumer -- will soon be upended by online buying, in which the consumer comes to the vendor.
In education, anyone who is using the library budget in the old ways -- ignoring access, online journals, search engines -- is wasting money. How students do research is fundamentally changing. With the ability to connect directly to last year's Mars Pathfinder mission, I hope connected science classes were doing some new things. Because learning is fundamentally tied to how we think, it must confront the changes in our noetic economy. How knowledge is made, stored, shared, and consumed is changing in fundamental ways. Our grandchildren will think differently than we do. My bet is that what feels like information overload to us will feel very different to them, not because they will simply grow up with the white noise of information, but because they will have cognitive moves -- thinking strategies, if you will -- for making sense out of a flood of fragments. The question is not "Are there new ways to teach?" but rather "How are schools managing to keep up with the new?"
The Internet is an embryonic entity that is cell-dividing like crazy. We don't even really know what sort of animal it will become. This early in the game, what can The Graduate Center teach that will have lasting value? What can't you teach? And do you see that situation changing in the next five years or so?
It would be tempting to argue that we shouldn't teach tools because they will be out of date within a year. That's wrong. You can't play the game without facility with the tools. Moreover, learning the current tools gives students the skills to appropriate new technologies quickly. So there is value in teaching current technologies.
More importantly, we can teach ways to think about innovation, new technologies, and virtual spaces. It is an understanding of history (even a history as short as this one) and context -- the whole matrix of agents, actions, and relationships that drives technological change -- that allows us to better understand, critique, and use that which is just peeking over the horizon line. That's the real value of our programs.
There's a larger and more ambitious vision, too. The slogan for the Center is "Helping create leaders for a world online." Perhaps a bit ambitious for such a small place, but we are much less interested in students who simply want to catch up. We want students who see themselves in front of the pack.
Your programs are rigorously cross-disciplinary. But the incessant change and complexity of the Net seems to demand more specialization, not less. Are you training "digital generalists" to manage specialists? If so, how will they ultimately differ from people with conventional management degrees?
Our imagined market for the programs, supported so far by the first groups of students, is working professionals with reasonably deep knowledge in one area -- perhaps marketing, programming, or design. The program seeks to give them broad training and a macro-level view that allows them to lead their organization's effort. This means assessing organizational skill sets; knowing whom to have at the table and whom to bring in from outside; and knowing what questions to ask, how to assess quality and progress, and what best practice looks like. It is a kind of management degree, but with a very specific focus on the Internet. We also supplement the course work with workshops for those seeking deeper knowledge in areas like Java programming.
I have watched too many companies with specialists -- good graphics people, good technical people, high-quality content providers -- prove unable to develop an effective Internet strategy because no one person knew enough about all those areas and about what it means to have such a strategy. We're training those people.
Your program for educators stresses the development of "pedagogically sound" applications of the Internet and other new media technologies. Given the radical re-evaluation that the new tools demand, how do you know valid pedagogy when you see it?
Good question. It depends a lot on whom you ask. Accrediting agencies say "Have more and better outcome assessment." Some school boards rely on tradition: "Do more of what you used to do when I was a kid!" The testing industry says, "Use our tests." My own feeling is that schools need to develop a set of philosophical guidelines that make sense for their context and community. There is no one definition of good learning or good teaching for every school. We too often ignore the road children travel in our incessant worry about where they end up.
I say plant your flag and stick with it. For our program, we believe (and here I generalize for a group of faculty who cover a range of beliefs) in student-centered and collaborative learning. We gravitate to the "guide on the side" model of teaching and away from the "sage on the stage" model by which most of us were taught. We want students to be engaged, to be hands on, and to learn how to learn.
What is your favorite general scenario for networked humanity, and what aspects of our old life must we give up to get there?
I hope that the new technology delivers on its democratizing promise. We know that tyrannies have more televisions than telephones; control is strengthened through one-to-many technologies and challenged when people connect. I would like to see a kind of work flexibility that allows more telecommuting and living choices and that doesn't simply extend work hours further into family and personal time. I would like to see it empower those who are marginalized, giving poor schools better access to information, linking seniors who feel isolated, offering new opportunity.
We know from the advent of other cardinal technologies that whole classes of people can redefine themselves. The Industrial Revolution made it possible for people to rise above their station and class -- it created in many ways a truer meritocracy, certainly more so than the agrarian age that preceded it. Some things were lost (think of the midland weavers of England and the subsequent Luddite revolt), but new opportunities were gained. Without adequate efforts to ensure access and training, technology threatens to widen the poverty gap, but the need for new talent, skills, and creativity is immense.
However, I harbor no real utopian dreams for an increasingly digital age (the above notwithstanding). We will see our current human ills played out in those arenas, just as our best qualities are accentuated. Pornography, hate groups, and misogynistic advertising already reside a mere mouse click away from educational sites, list-servs for the elderly, and tours of our great museums. It's not so much that it will be better as that it will be different. Examine the claims we have made for earlier technologies -- even those that have effected so much change -- and you will hear the echoes of many contemporary claims. Indeed, there may be no more hyped age than the one we are entering. Think of the ad campaigns of Microsoft, MCI, and others. We had better use our time shaping and navigating the different rather than conjuring up the imaginary.
Copyright © 1998 by The Atlantic Monthly Company. All rights reserved.