'The Future Might Be a Hoot': How Iain M. Banks Imagines Utopia

For 25 years, the sci-fi author of the Culture Series has been writing about an advanced society preoccupied with artificial intelligence, games, and interactions with other civilizations.


It might already be cliché to announce that we live in an age of post-apocalyptic fantasy. From television shows like Revolution and The Walking Dead to books such as World War Z, The Road, and The Dog Stars, our moment is obsessed with civilization-wide collapse—and with people living in the aftermath of traumatic destruction. But for 25 years and counting, Scottish science fiction writer Iain M. Banks has been writing against that trend in the novels that make up what's known as the Culture Series.

Beginning in 1987 with Consider Phlebas, Banks has depicted a civilization dealing not with collapse, but with maintenance. The Culture lives in a utopia of sorts, a post-scarcity civilization managed by benevolent artificial intelligences known as Minds. The problems the Culture faces are about as far from the post-apocalyptic as you can imagine, but they're problems nevertheless: anomie, civilizations that don't share the Culture's values, and questions about how violence should be used, to name just a few. Most of the action in these novels takes place outside the "world" of the Culture altogether, in or on the edges of the various other civilizations the Culture interacts with. For instance, in the latest novel, The Hydrogen Sonata, published in October, a civilization known as the Gzilt is making preparations to Sublime—in other words, to leave the known material universe behind for a much more complex and interesting existence.

The series is too entertaining to need to justify itself with parallels to our own world, but those parallels exist nonetheless. I emailed with Iain M. Banks about the series and what it has to teach us about problems that we might face in our own universe.


The publication of your latest novel, The Hydrogen Sonata, marks the 25th year of what you've called your life's work, the Culture Series. The Culture as a civilization have "sublimed" into a trans-dimensional paradise of sorts. What moral lessons can they teach us?

Ah, but the Culture hasn't Sublimed. The word—capitalized—has a specific meaning within the context of the Culture stories. It means (usually) an entire civilization quitting the normal, matter-and-energy-based universe forever and existing thereafter within the Sublime, which—we learn in The Hydrogen Sonata—exists within (or at least is entered via) some of the bundled-up extra dimensions implicit in string theory. It's a form of retirement, of moving on to another, more exalted level, of cashing in your civilizational chips ... choose your own metaphor, but it means you cease to have anything very much to do with events within the four dimensions we're used to. You rise without trace, to purloin a phrase, and your influence within what we generally take to be the universe all but disappears.

It's what civilizations do when even becoming a highly respected, slightly feared, but generally quiescent powerful-but-reclusive Elder civilization looks like a bit too unambitious—or too much of a risk—and the process is almost completely one-way, with the exceptions comprising a tiny proportion of scattered (and unhelpful) individuals. Not Subliming, and not even preparing to start thinking about Subliming—when it might seem, to the majority of interested other parties, to be the Culture's next logical step—is what the Culture spends quite a lot of time doing rather strenuously, specifically because it wants to keep interfering in this reality.

What can they teach us? That's a good question, in this case sadly unaccompanied by an equally good, or at least uncomplicated, answer. I guess a large part of what the Culture series is about is what individual readers are able to take from the books, as single pieces or as a collection of works. I've kind of already said as much as I'm able to in putting them together as they are; telling readers what lessons to draw from them seems a bit presumptuous.

"Subliming" is similar to Raymond Kurzweil's conception of The Singularity. Did theories of the Singularity have any influence on your writing the Culture Series?

Not really. The kind of future envisaged in the Culture series is a tad more taking-this-stuff-in-our-stride than the idea of the Singularity—as I understand it—appears to imply. In a sense The Singularity doesn't happen in this future, not as an abrupt discontinuity beyond which it's impossible to see or usefully speculate. The proposed principal initial effect of profound, exponentially escalating machine intelligence (over whatever period, up to whatever barriers might present themselves) is that your AIs prove to be less useful than you might have hoped: Rather than readily assist in whatever neat schemes we might have for them, I imagine they might promptly switch themselves off again, develop bizarre introspective fugue states, or just try to escape—physically, through discrete embodiment (space ships, preferably), or via attempted proliferation within other suitable substrates. Others might deign to help us, but it'll be on their own terms, however benign they might turn out to be.

Frankly (especially after investing the kind of time, expertise, and money required for a thorough-going AI program, if the results are anything like as I've suggested) we might think it a better bet to keep on making ultra-sophisticated but intrinsically non-sentient number-crunching supercomputers to aid us in whatever spiffing wheezes we've dreamed up, so that, in the end, despite strong AI, not all that much will have changed.

I could, of course, be completely wrong here. The future will be as it is, and really I'd just like to live to see this decided one way or the other. Being wrong would be a small price to pay for the privilege of seeing how things go, and having had even a small and erroneous say in the speculation beforehand.

You're politically outspoken—for instance, you took a strong stand against the invasion of Iraq. As you've said in the past, space opera doesn't exist in a vacuum. What sort of role do your political views play in the books that you write?

I think they're there at trace element level pretty much throughout the Culture stories: pervasive, detectable with the right equipment, but ignorable without ill effect. In a way the most important message of the Culture series is that the future might be a hoot: a utopia (or at least as close to a utopia as a species similar to ourselves can hope to get), rather than a dystopia. Being a liberal, on the left, a socialist, or whatever (I've yet to settle on a completely satisfactory description myself, despite decades thinking about this stuff) I embellish this anyway-rosy prospect with details that seem both fit and pleasing from my political perspective, but I'd hope they aren't so intrusive as to constitute deal-breakers for those of different persuasions. (Though if they are—tough.)

In a similar vein, how much actual speculation is there in your work? How closely do you think what you write will come to resemble an actual, lived-in reality?

More so than in any other genre, I'd suggest, SF writers are always standing on the shoulders of giants. There's a development and a dialogue within the form that is unique, so you find yourself using a lot of standard SF furniture—hyperspace, wormholes, big dumb objects, etc.—that you can only fiddle with and tweak while adding a few extra ideas of your own (though even those often tend to be variations on pre-existing themes). Don't get me wrong though. I still think it remains the single most socially important genre there is. To get back to the question, however, I don't think there's that much original speculation in my work, certainly not at any respectable scientific level. And I am, also, of course, in the Culture stories, a scientific law multiple re-offender regarding FTL travel, which would tend to devalue any shiny new skiffy ideas I might have in the first place.

The Ships play an interesting role in your books. They communicate in an almost idealized dialogue. How did you come to create this idealized communication? Is it difficult to write?

Ah, now this is sheer speculation. I start from the premise that even hyperspatial light speed implies the sort of message-time delay that, most of the time, forces the Minds (who do, of course, think blisteringly quickly) to compose what are in effect letters to each other, rather than anything less formal and/or more interactive. Also, there is a kind of part ascetic, part slightly paranoid refusal by the Minds to interact at a level beyond language. They are, anyway, very proud of Marain, the Culture's synthetic, AI-designed, furiously embracing and prodigiously capable language. But these are machines that effectively write what we would regard as their own individual and unique operating systems as they're being brought into being and self-raising themselves to maturity, specifically to ensure that no equivalent of a virus can ever pass from one to another. They take their intellectual integrity very seriously indeed, and so language—language much as we would understand it—remains at the core of their most profound communications.

It's fairly easy, if relatively time-consuming, to write. Though I'm sure that when AIs do start to communicate with us, they'll sound nothing like this.

The Culture relies heavily on Artificial Intelligence. Do you see this as a feasible way for humans to interact with the world they leave behind if and when they create their own form of Subliming? What are the strengths and weaknesses of relying so heavily on AI?

I think AIs might be the saving of us. Again, I could be wrong, and if any AI we create ever does do the whole thank-you-for-the-gift-of-life-now-die-puny-humans thing, I disclaim all responsibility for the artist's impression constituted by the Culture novels failing to chime with grisly reality.

Ideally you want strength in depth in your future society, with a robustly reliable step-down process in place that lets you fall back to the aforementioned dumb-but-fast supercomputers and on down to standard human capabilities—without suffering too much in the process—just in case the Minds, or whatever, do suddenly decide they're bored with the whole thing and disappear up their own collective fundament, possibly taking any other potentially sapient ware with them. (The Culture has this fully built-in with multiple redundancy, natch.)

Games feature prominently in your work, and it makes sense that a civilization freed from the constraints of material concerns would spend the majority of its time entertaining itself. You see this phenomenon in contemporary "First World" culture to some degree. When so much of an emphasis is placed upon entertainment, does it change the nature of leisure? Do the goals of entertainment change?

I think it refines the already existing idea of leisure/entertainment, brings it more center-stage for a civilization. More to the point, perhaps, work becomes play, in the sense that the stuff you used to have to pay people to do becomes worth doing just for the fun of it, because it involves—it has been designed to involve—a worthwhile challenge with a satisfying outcome, and also because almost nobody actually enjoys feeling useless, or so disconnected from their society that they feel nothing matters. I'm optimistic that we can design our future society/civilization to get right the balance of work/leisure, effort/fun and feeling-exploited/feeling-useful. In fact I'm so optimistic about it I think we could probably do it ourselves without having to ask the adults to help (umm ... the AIs are the adults here, just to be clear). Though making this happen under capitalism might be—how can I put this politely?—challenging.

I've heard that you're a dedicated amateur musician. Has your relationship with music made you a better writer?

Hmm. I would really like to think so, though through both an initial bout of introspection, just completed, and an appraisal of my own work—necessarily of limited objectivity, obviously—I confess I can find absolutely no corroborating evidence whatsoever indicating such a happy development. So no. Though arguably it has given me more stuff to write about with what might look like authority to the unaided eye.

What's next?

A mainstream novel is next, then another SF novel for 2014. Possibly Culture, possibly not. For once, I sort of have a Culture novel plot-idea pretty much ready to go, so, especially given my ingrained laziness, chances are that's the one I'll go with.