If you check out the homepage for Code Year -- where many thousands of people are following the weekly tutorials in JavaScript -- you'll see this line from Douglas Rushkoff: "If we don't learn to program, we risk being programmed ourselves." (He even wrote a book about it.) Similarly, the fine Mac developer Daniel Jalkut writes,
Literacy isn't about becoming a Hemingway or a Chabon. It's about learning the basic tools to get a job done. I think programming -- coding -- is much the same. You don't have to be the world's best programmer to develop a means of expressing yourself, of solving a problem, of making something happen. If you're lucky, you'll be a genius, but you start out with the basics.
Long ago, it would have been ridiculous to assume a whole society could be judged by its ability to read and write prose. It feels ridiculous now to assume that we might use computer programming as a similar benchmark. Yet it may happen.
Okay, this is all very cool. But what I want to know is this: What counts as "programming"? What counts as "knowing how to code"?
Take me as an example: what can I do that the average computer user can't do? Well, not so much, I'm sorry to say. I can
write basic HTML;
edit CSS (I can't write it from scratch, at least not without a great deal of labor, but I can sometimes fix pre-existing CSS when I dislike something about it);
run a few utilities from the command line, like textutil and cat (which is to say, I know the basics of Unix syntax);
write AppleScripts (though often with the help of Automator).
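To give a sense of the level of command-line fluency I mean, here is a minimal sketch -- the file path is just an illustration, and textutil is a macOS-only tool, so that part is shown as a comment rather than run:

```shell
# The basics of Unix syntax: a command, its options, its arguments.
# Write a line of text to a file, then read it back with cat.
echo "Hello from the command line" > /tmp/example.txt
cat /tmp/example.txt

# On a Mac, textutil converts between document formats, e.g.
# (not executed here, since it assumes a notes.rtf exists):
#   textutil -convert txt notes.rtf
```

That's roughly the whole trick: once you can string a few of these together, you've crossed from "computer user" into the shallow end of programming.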