Glorified Typewriters And Faster Horses Dept.


There is a book on programming called Oh! Pascal!, one of the first college-level programming texts I ever encountered, and its first edition begins with an exhortation that seems to have been lost somewhere along the way: "It's time we stopped treating the computer like a glorified typewriter." (The second edition seems to have excised this part, which I do miss.)

Twenty-odd years later, the computer has become split along roughly two paths. In one, it is — as someone else once put it wonderfully — an amplifier for human thought, a utility that allows unprecedented freedom of transformation and transmutation. In the other, it is a glorified typewriter — and a glorified camera, tape recorder, walkie-talkie, pager, etc., etc.

At this point some of you are shrugging and saying "So how is that a bad thing? I'm carrying more computing power in my pocket than could even be purchased twenty years ago!" True, but I'd argue that we're not so much using all that power to invent new applications for computing as we are using it to embrace, extend, and extinguish existing technologies, and to reinvent ways of doing many of the things we already did. I'm also arguing that doing so is our default tendency with computing technology, and it's a tendency we need to be skeptical of.

Perhaps that itself has been the big lesson of the PC revolution — that it didn't so much create new things as find a way to subsume so many existing, previously stagnant things under a single technological umbrella. No way to deny that's a good thing, at least at first: shift things to the digital realm, and you liberate them from physical constraints that made them only so useful. This is why, for instance, I always giggled up my sleeve at the pseudo-futuristic VR-type HCIs people were constantly dreaming up in the 1990s: why reinvent all the limitations of the physical world in a new form, when the whole point was to do away with them altogether?

Most of what goes on in mobile technology (my above-cited example) is a huge clutter of trivia and reinvention, with the occasional kernel of something genuinely new. It's striking that the one big thing everyone seems to be holding out for in mobile technology is a battery that doesn't peter out halfway through the day.

I could go on in this vein, but instead I'll cut to my prototype of a theory I've been hatching.

Human beings are irrational — that's both our saving grace and our damnation. We believe in ourselves and in the things we dream up, irrationally enough and in defiance enough of the evidence of our senses that we find ways to make those crazy things come true. That's the good part.

The bad part is that we drastically and consistently overstate the positive effects (and maybe also misinterpret the negative ones) of those inventions. We convince ourselves that a given creation will make everything perfect forever, or at least push us into a glorious new space where all the old problems don't matter anymore. Instead, all too often, we end up trading one obvious problem for five subtle ones. What's more, we are only too happy to be kept ignorant of this trade-off. We "know" the computer is dumb and only a tool, but we don't really know that.

This is why we have the tendency to treat the computer like a glorified typewriter — one, because it's easy to do that and hard to invent genuinely new things, and two, because we overestimate the real-world utility and power of many of the things we do invent with it. The first one is hard to avoid because such a mind-set provides us with a cheap, comforting vision of technology as messiah, a view not supported by reality. It allows us not to think about what we are actually doing, only the how.

It's not useless to have email or word processing or Adobe Photoshop; far from it. What we kid ourselves about, though, is how revolutionary those things actually are vs. how revolutionary we like to think of them as being. Too much of the time, the "revolution" in question simply means we become slaves to the pressures of the increased productivity facilitated by technology. The fact that we can "do more" becomes a meager replacement for doing better, for making our technological choices constructive instead of reductive. It becomes our version of "faster horses".

I mentioned Adobe Photoshop and word processing. Inventing those things made it easier to edit images and compose texts, but only easier in the sense that we were no longer as limited by those things being tied to some specific physical medium. That by itself was a big liberation, but not an unbounded one either. Remove that constraint and you still have the limitations of the person behind the keyboard or at the tablet, to say nothing of the new (often unseen) limitations imposed by said tablet or said keyboard. This borders on being a truism; the tool is only as good as the user. But a computer is not like a hammer or a typewriter; the things it can do are heavily decoupled from physical reality, and so unless we have been specifically schooled in how the PC is not magic, we let ourselves believe it is.

I wouldn't trade my copy of Microsoft Word for a Smith-Corona Galaxie, but I might well keep one of the latter around the house as a way to remind myself what it's like to write without a digital safety net — something that, in turn, forces you to actually think about writing as a skill, as a craft, and not simply as a kind of video game. The typewriter and the word processor impose different limitations; the fact that the latter feels far more open-ended than the former doesn't mean it has no limits. We just can't see them as clearly because they aren't tied to an object, one with a sticky H key and a carriage return mechanism that sounds like the opening of the world's biggest zipper.

At one point I bandied around the idea that the invention of the typewriter — first manual, then electric — and then the word processor had contributed first indirectly and then directly to the denaturing of the written word. Quantity replaced quality, and all that. It was a tempting theory; it's always fun to blame things on technology, because you're all but guaranteed to get some other atavist grouch to agree with you. (That doesn't mean you want them on your side, though.)

I eventually backed off the idea, in large part because it was trite and uninsightful, but I did get one useful auxiliary insight from it: exposure to a technology by itself has no innate liberating function unless it's accompanied by a feeling for what it means.

When a kid understands for the first time what it means to make the machine do something, a little light goes on and never goes out again. When a person indifferently pokes at their phone to find out what their friends are doing, they're thinking about their friends, not the phone. Some people would argue that's better; after all, a transparent technology is a more genuinely useful one, isn't it? But it also becomes easier to be indifferent to it, and thus more susceptible to its unseen disadvantages — more of an unquestioning believer in its magical properties than in its status as a tool.

For technology to be really revolutionary, it has to be in the hands of people who have schooled themselves in its meaning and implications. Most of us would rather use the computer to perform glorified analogues of what we already do — talk with other people, write notes, watch TV — because everything outside of that is just too nerdy. Elitist as that view is, I feel more comfortable defending it than I do defending the idea that "just enough" computing (the iOS model) is what most people need. That view has no end of existing defenders, most of whom are only too happy to sell you the latest model of tablet or phone, and who have a vested interest in ensuring you don't become too ambitious with the technology you surround yourself with.

What I'm not calling for is that everyone throw away their iPhones and install Linux. Nobody wants that anyway; even I don't want that. What I am calling for is some way for us to instill within ourselves and each other, as early on as possible, a sense of the power we are in fact commanding, and to give ourselves that many fewer excuses to insulate ourselves from it and pretend it's "not for us".

We need to develop working knowledge of these things not because everyone ought to code, but because everyone ought to be able to understand what's at stake when we do code.

Which is, as you can imagine, a whole hell of a lot more complicated than just using glorified typewriters.


Tags: computers sociology technology



