Caught in the Flash
What's the future of memory? The answer may lie in our past

Editor's Note: In the wake of Intel Corporation's announcement last September that it had found a revolutionary way to increase computer memory by increasing the capacity of the bit -- a technological innovation known as "multilevel flash memory" -- we asked Harvey Blume, a philosophically minded former programmer who writes on culture and technology, what he thought were the implications. Blume's response went a good bit further than we had expected it would -- and took some surprising turns.


December 16, 1997

From: Harvey Blume
To: Wen Stephenson - Atlantic Unbound
Subject: Caught in the Flash

Wen,

You asked:

>>What are the implications of Intel's announcement
>>that it has found a way to double computer memory?
>>Too many even to mention?

The more I think about it, the more far-reaching the implications seem. It's not just technology that's at stake here but, even more so, culture.

What does it mean that the speed-up we've all felt -- the whoosh from the digital world, you know, like the wind from a big truck that continues to go by -- is itself being sped up?

It has been a given in computer science and the metaphors drawn from it that memory is finite. What if that ceases to be true? It seems that this latest breakthrough in "flash" technology -- the ability to shave or grade a bit into ever finer degrees of charge in the way that one can cut a line into an infinity of points -- gestures toward a world in which memory is, in effect, unlimited, or at least exists at an order of magnitude beyond our ability to use it. Memory, then, would always be an open frontier waiting to be settled rather than a densely populated region within which painful trade-offs have to be made in order to accommodate change. The implications for software engineering seem immense -- and so, too, the implications for the way we think about thinking.

But let me back up for a closer look at the new "flash" memory and how it works, starting with the bit.

Programmers think of a bit as the smallest addressable unit of memory. Multilevel flash memory cancels what has been deemed the fundamental, nearly sacred dictum of the computer age: namely, that a bit is binary. It can be on or off, 0 or 1. Nothing else, nothing in between. Intel now says, on the contrary, that a bit may store gradations of charge. A bit may in fact be not merely 0 or 1 but, say, 0, 1, 2, or 3 -- four values instead of two, which, in a single stroke, doubles memory. And that's just the start. If a bit can hold four values, it can hold eight, and if eight then perhaps sixteen, and so on to some as yet undisclosed limit.
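
For the arithmetic behind that doubling, here is a minimal, purely illustrative sketch in Python (my own back-of-the-envelope figuring, not anything from Intel's announcement): a storage element that can be read at four distinguishable charge levels encodes two binary digits' worth of information, eight levels encode three, and so on.

    import math

    def bits_per_element(levels: int) -> float:
        """How many binary bits one storage element can encode
        if it can hold `levels` distinguishable charge levels."""
        return math.log2(levels)

    # Two levels (plain on/off) encode 1 bit; four levels encode 2 bits,
    # which is the doubling described above; eight would encode 3, and so on.
    for levels in (2, 4, 8, 16):
        print(f"{levels:>2} charge levels -> {bits_per_element(levels):.0f} bits")
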
Related link:
"New Chip May Make Today's Computer Passé" (The New York Times, September 17, 1997; registration required)
"The world is no longer flat. Earth is no longer at the center of the solar system. And Moore's Law, a longstanding axiom of the computer age, is no longer true."

All of this means memory is about to increase exponentially -- or, rather, increase exponentially even faster than it already has. The New York Times said Intel's announcement implied the overthrow of Moore's law (which had held good since it was promulgated in 1965 by Gordon Moore, a co-founder of Intel), according to which "the number of microscopic transistors that could be etched onto the surface of a given piece of silicon wafer would double approximately every 18 months." Multilevel flash leaves the number of transistors unchanged; instead, it slyly doubles their capacity, immediately halving the 18-month interval between generations of memory chips. Soon, perhaps, with flash technology fermenting it, memory will be swelling constantly instead of at decent intervals that can be counted off in months.
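
To put rough numbers on what halving that interval would mean, here is a small illustrative calculation (the 18-month figure is the one quoted above; the 9-month curve is simply the hypothetical halved case, not a forecast):

    def relative_capacity(months: float, doubling_period_months: float) -> float:
        """Capacity relative to today, doubling every `doubling_period_months`."""
        return 2 ** (months / doubling_period_months)

    # Compare the quoted 18-month doubling with a hypothetical halved interval.
    for years in (3, 6, 9):
        months = years * 12
        at_18 = relative_capacity(months, 18)
        at_9 = relative_capacity(months, 9)
        print(f"after {years} years: x{at_18:.0f} at 18 months, x{at_9:.0f} at 9 months")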

Right on the heels of Intel, IBM announced its own breakthrough, a technique by which copper can be combined with silicon at a sub-micron level (described by The Times as "about a five-hundredth the width of a human hair"), thereby increasing conductivity and resulting in faster, cheaper, more abundant chips. With all this pressure, flash and otherwise, brought to bear on memory, it would seem only a matter of time until every obstacle to its indefinite expansion buckles and gives way. Someday in the not too distant future we'll be saying, Hey, man, remember memory? The kind we used to have to worry about running out of? Well, memory like that is history now. Memory's everywhere, or will be soon. It's becoming as invisible as God, who, according to Borges, got that way by wrapping Himself in omnipresence, which made Him impossible to see.

But look, here's what really grabs me. The digital world has always been defined by finitude and discrete quantities. That's what it means to be digital: you do away with continuity, you approximate it as best you can, you kludge, fudge, sample. What's a line? A collection of pixels, each separate, together creating the illusion of continuity. What's a circle? A polygon with so many sides no one looking at the screen can tell the difference.
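
To make the circle-as-polygon point concrete, here is a small illustrative sketch (my own arithmetic, not how any particular graphics system actually rasterizes): the worst-case gap between a circle and its inscribed regular polygon shrinks below the width of a single pixel once the polygon has enough sides.

    import math

    def max_gap(n_sides: int, radius_px: float) -> float:
        """Worst-case distance, in pixels, between a circle of radius_px
        and the inscribed regular polygon with n_sides sides."""
        return radius_px * (1 - math.cos(math.pi / n_sides))

    # For a circle 1000 pixels in radius, the polygon becomes
    # indistinguishable from the "true" curve at a few hundred sides.
    for n in (8, 32, 128, 512):
        print(f"{n:>3} sides: worst-case gap ~ {max_gap(n, 1000):.2f} px")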

But the splitting of the bit changes what it means to be digital. Which brings us to where the philosophical action is -- philosophical in the sense that Richard Feynman, the late great Nobel Prize-winning physicist, used that word. And it was not a word he used often or lightly.

Feynman pointed out that even though the numerical difference between the mass of an object as measured by relativistic and pre-relativistic physics is insignificant (unless that object is roaring along at something approaching the speed of light), the philosophical difference between the measurements is immense. He observed: "This is a very peculiar thing about the philosophy, or the ideas, behind the laws. Even a very small effect sometimes requires profound changes in our ideas."

I submit that the splitting of the bit is one of those "very small effects" that calls for a profound change in our ideas. Because, after all, the bit was just that very thing that couldn't be split -- not if you wanted to be digital, not if you wanted to define yourself as other than analog. But if the bit is not a binary situation, not zero or one, but more like a straight line, sliceable at any of (potentially) infinite points, hasn't the digital doubled back on itself? Haven't we curved around toward that peculiar place where the digital summons up and joins with its opposite, the continuous, the analog?

Well, what if we have, you say? What's so exciting about one more way to twiddle a bit?

To answer that let me be philosophical for just a bit more by jumping to that place in Arthur O. Lovejoy's insanely great The Great Chain of Being (1936) where Lovejoy demonstrates what can happen when the discrete and continuous engage in paradoxical fusion.

Lovejoy shows that, courtesy of Aristotle, the Middle Ages inherited the notion of the discrete: every species had its own distinct, well-bounded being. And he shows that, courtesy (once again) of Aristotle, it also inherited the absolutely contrary notion -- namely, that species were completely continuous with and always shading into one another. There were no gaps between species -- or rather, each gap was filled, of necessity, with further species, and each gap between them likewise filled, on to infinity, from lowliest "exiguities" on up to God. The "Great Chain of Being" was formed by this perfect fit of the discrete into the continuous.

As Lovejoy put it: "There are not many differences in mental habit more significant than that between the habit of thinking in discrete, well-defined, class concepts and that of thinking in terms of continuity, of infinitely delicate shadings-off of everything into something else." Having said that, he proceeded to show how the Great Chain of Being was powered for a thousand years by the contradiction installed at its core. Paradox proves to be one of the mind's more durable -- and high-energy -- forms. (And nature's, too: the paradox of light being both wave and particle seems to have come bundled with the Big Bang.) What the Middle Ages got from the collusion of the discrete and the continuous was nothing less than the curve of its cosmos and the intellectual scaffolding of its civilization.

And I'm saying it's happening again. The same paradox is being put into place, though we say "digital" instead of discrete, and "analog" instead of continuous, and though the thing is driven by electricity rather than Aristotle. The discrete and continuous modes of thought join together so deeply and intimately in electrified fusion today that you can just about bank on this rule: the more deeply digital things get, the more analog it all seems. That's really the secret given away by flash technology, whether or not it, or copper-coated silicon, turns out to be the main source of superabundant RAM. The more digital, the more analog. The more discrete, the more continuous.



During the first stage of the electronic revolution change came in discrete installments. Sure, the first people exposed to the telephone were susceptible to dissonance and disorientation; many heard ghosts and demons jabbering out of the receivers. But as a rule people were given some pause between inventions, some time to adjust. Phone, film, phonograph, radio, all had a sort of clear, well-bounded existence when they arrived on the scene. A Moore's law of sorts applied to inventions.

But the transformations taking place at the digital end of the electronic revolution seem to be multifarious reworkings of the same almost obscenely flexible digital substratum. How many things can you shape out of digital clay? The answer seems to be a very large number, all of them shading into one another: graphic user interface, desktop publishing, fax, e-mail, Internet, synthesizer, digital television, digital camera, CD, CD-ROM, smart cards, and so on -- and all so rapidly, so continuously, that it's often all we can do to catch our breath.

In the midst of this digital era it turns out that what we're good at is continuity, linkage, gradation, breaking down time-honored divisions, and making epochal connections. Obviously, no site is an island unto itself in cyberspace, where connectivity is all. Both online and off we're joining art form to art form (or is there a simpler way to describe the open-endedness of multimedia?). We're busy bridging the gap between the organic and the inorganic in cyborgs and sci-fi. And the way we use a word like "virus" shows we won't give up trying to network genetic code to computer code until the first DNA computer is shipped.

Where does this get me? Nowhere. It leaves me exactly where I am, looking into the digital headlights, hypnotized by the future. And conjecturing that for those of us who've really looked into those brights our affirmations and our objections are nothing compared to our fascination.

Which poses problems all its own.

Regards for now,

Harvey





    At a critical point in his life, Harvey Blume chose English over C and therefore writes reviews, criticism, and even the occasional book rather than computer programs. The co-author of Ota Benga: The Pygmy in the Zoo (St. Martin's), he writes about art, literature, and new media.

    Copyright © 1997 by The Atlantic Monthly Company. All rights reserved.