On (Not) Learning to Code


If you check out the homepage for Code Year -- where many thousands of people are following the weekly tutorials in JavaScript -- you'll see this line from Douglas Rushkoff: "If we don't learn to program, we risk being programmed ourselves." (He even wrote a book about it.) Similarly, the fine Mac developer Daniel Jalkut writes,

Literacy isn't about becoming a Hemingway or a Chabon. It's about learning the basic tools to get a job done. I think programming -- coding -- is much the same. You don't have to be the world's best programmer to develop a means of expressing yourself, of solving a problem, of making something happen. If you're lucky, you'll be a genius, but you start out with the basics.

Long ago, it would have been ridiculous to assume a whole society could be judged by its ability to read and write prose. It feels ridiculous now to assume that we might use computer programming as a similar benchmark. Yet it may happen.

Okay, this is all very cool. But what I want to know is this: What counts as "programming"? What counts as "knowing how to code"?

Take me as an example: what can I do that the average computer user can't do? Well, not so much, I'm sorry to say. I can

  • write basic HTML;

  • edit CSS (I can't write it from scratch, at least not without a great deal of labor, but I can sometimes fix pre-existing CSS when I dislike something about it);

  • run a few utilities from the command line, like textutil and cat (which is to say, I know the basics of Unix syntax);

  • write AppleScripts (though often with the help of Automator).
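For those who have never opened a terminal, the "basics of Unix syntax" I mean amount to roughly this: small programs that read and print text, chained together with pipes and redirection. A minimal sketch (the file name and its contents are just made-up examples):

```shell
# Make a small sample file: ">" redirects output into notes.txt.
printf 'alpha\nbeta\ngamma\n' > notes.txt

# cat prints the file; "|" pipes that output into wc -l,
# which counts the lines. This prints 3.
cat notes.txt | wc -l
```

That's more or less the ceiling of my command-line fluency: print something, pipe it somewhere, redirect the result.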

Also, in my spare time I've been going through the JavaScript lessons at Codecademy (about which experience I'll have a post or two later). And anything I write that needs to be printed I write in LaTeX, of which I am perhaps excessively fond.

So can I code? Of course not. None of this could plausibly be called "coding" or "programming." And, to be honest with myself, I'm not likely to learn all that much more than I know already.

The major impediment to my learning to program is the generosity of all the real code monkeys out there. I don't even want to think about how many times I've needed to do something from the command line and, rather than figuring out how to do it myself, just ran a quick Google search to find the appropriate script or command sequence: copy + paste + enter = DONE. I can't write in Javascript or Perl, but I've copied a bucketload of other people's scripts and use them regularly. (For instance, I wrote this post using John Gruber's Perl script to convert Markdown-formatted text to HTML.) Thanks, guys!

It's daunting, actually, to think about how hard I would have to work, and how much I would have to learn, and how many silently-and-inexplicably-failing scripts I would have to write, before I could produce code that would rival what the professionals write and put on the internet gratis.

I just can't resist all the free goodies. If knowing how to program is tomorrow's "basic literacy," I'm afraid I'm just going to have to die an illiterate code plagiarist. I am trying to feel bad about that, but so far without success.

So let's go back to Daniel Jalkut's definition of "literacy": "learning the basic tools to get a job done." Is there a kind of literacy -- knowledge worthy of that name -- that stems not from being able to use the available tools with any degree of skill, but rather from being able to find out who can use those tools and then making good use of the experts' abilities? I'd like to think so. It's not the same as being able to write your own functional code, but it just might be good enough.

Image: Shutterstock.