Rudolf Flesch is the man who wrote Why Johnny Can't Read; some thirty years later he wrote a follow-up, Why Johnny Still Can't Read. His premise was simple: teaching reading by phonics, rather than what he called the "look-and-say" method, is valid and appropriate even for a language as allegedly inconsistent as English. In the second book, he mentioned his dismay over a motivational technique that involved essentially bribing kids for reading well. Reading, he argued, should be (and is) its own reward.
The idea that anything should be its own reward is actually a cornerstone of Zen as well. You don't sit zazen to get some nebulous reward sometime far off in the future; you do it because sitting zazen is its own accomplishment. Practicing zazen encourages you to believe that much more in what the present moment offers, which is really all there is. The past is fiction; the future is a nebulous promise. The more equipped you are to accept what is right in front of you now, the better.
There’s also this strange, unhealthy love of sick artists not just in video games but in the wider art world; this idea that artists need to be insane or addicted or otherwise impaired to create great art. That's not to say that those artists can’t create great art (clearly, they can) or that their art is somehow lesser, but that more often than not those self-destructive tendencies interfere with their ability to complete projects that take more than one sitting.
Eric is, apart from being a good friend, one of the better writers I know on the subject of video games as culture — and also video games as forum (as opposed to just form).
I've long wrestled with, and rejected, the idea that damage or sickness is a prerequisite of good art — that the artist needs to be a screwed-up person in order for his art to be "genuine". The most obvious problem with this formulation is how it leads us to believe the reverse: that in order to become an artist, you have to get screwed up.
I have been reading Literature of the Lost Home by Hideo Kobayashi, a literary critic of Taishō-era Japan who has no reputation worth speaking of outside of his country. It's a shame, because this little volume has more quotable lines and more nuggets of insight worth mulling over per page than most anything I've encountered since my last tangle with Edmund Wilson.
There's a lot to mine from this book, and I plan to do that over a succession of posts, but one of the main insights gleaned from it is how most people no longer think of criticism as being distinct from reviewing. To review a movie means to assess its likely appeal to a given audience; to criticize a movie means to look at it in a penetrating and thoughtful way. Roger Ebert was the rare sort to write both kinds of texts. His first look at a movie was a review; his "Great Movies" columns were criticism.
Such was the advice for success given to the young Dustin Hoffman in The Graduate. I mentioned before a best-selling SF&F author (I won't name him here) who has given writer's advice of the same flavor, and which in the end was indistinguishable from marketer's advice.
Among them was something that amounted to "don't be literary" — use small words, words that everyone knows, so others will want to read you. It's difficult to dismiss the immediate flush of annoyance — no, disgust — that I felt when reading that.
Okay, sure — you write to be read, and you write for the audience that's likely to read you. But there's no mistaking the reek of anti-intellectualism there, not least of all because it's not the length of one's words that makes what you write literature, but the insight and the perspective behind them. (See: Hemingway.)
What's more, such advice is a slap in the face to the impulse that drives many people to become a writer in the first place. The power of words is what swept us up, and what compels us to sweep others up as well — and yes, you can do that without being pretentious. Telling a writer not to be literary is tantamount to telling a musician not to be musical.
The other day, I took a site I read frequently off my list of RSS feeds. It's not something anyone is likely to know about — it's this little blog curated by a guy who's a self-styled SF&F author and whose opinions about a number of things are more or less congruous to or convergent with mine. (No, I won't link to it here.)
I didn't delist him because I vehemently disagreed with what he had to say. I took him off my list because while I'm technically on the same side as he is on a number of issues, I couldn't for the life of me stand the way he defended them. I'd rather have someone who is an honest, well-defended, and scrupulous proponent of a viewpoint I don't agree with than someone who is a shabby proponent for something I do agree with.
What every economist, and for that matter every writer on any subject, needs to realize is that unless you are a powerful person and people are looking for clues about what you’ll do next, nobody has to read what you write — and lecturing them about what they’re missing doesn’t help. You have to provide the hook, the pitch, whatever you want to call it, that pulls them in. It’s part of the job.
I'm going to risk putting my foot through my keyboard and suggest, at the risk of oversimplification, that there are two kinds of writers: those who have a knack or an inclination for being a vigorous promoter of their work, and those who don't.
Avoid the engineer's and economist's fallacy: don't reason your way to a solution — observe real people. We have to take human behavior the way it is, not the way we would wish it to be.
The author is Donald A. Norman (he of The Design of Everyday Things), who has written extensively about and done tons of research on human-computer interaction. He knows his material the way Linus knows the Linux kernel.
... what does it say about our society that it seems to generate an extremely limited demand for talented poet-musicians, but an apparently infinite demand for specialists in corporate law? (Answer: if 1% of the population controls most of the disposable wealth, what we call “the market” reflects what they think is useful or important, not anybody else.) ... A world without teachers or dock-workers would soon be in trouble, and even one without science fiction writers or ska musicians would clearly be a lesser place.
Emph. mine. The article is absolutely worth a read for its main focus — why do so many of us work jobs we not only hate but subtly sense are worthless? — but these couple of sentences especially caught my attention because of what I bolded there.
Regular readers of these pages know I've long suspected that a major reason why media enterprises tend to be so unadventurous is that it's always easier to sell people an incremental variant on yesterday's experience than it is to do the real work of selling them something new. Human psychology is at work on both ends of the equation: the readers pay lip service to the idea of the new more than they do the new thing itself, and the publishers are just manifesting their own incarnation of that behavior.
The insular mentality [of comic book companies] remains. By and large the philosophy is still to create almost exclusively for the audience that’s already here or the one that used to be here. Women couldn’t possibly like superheroes (despite the gads of evidence to the contrary). Children would never buy superhero comics (despite the booming kids and all-ages comics market and kids’ almost-unanimous love of superheroes). When they’re asked why they don’t try harder in these areas, they say that they’ve tried in the past and they just never work out. Why don’t they work out? Because, no matter how well-meaning, they have usually ended up being sabotaged on some level. Budgets are miniscule, or start off reasonable and then vanish when there isn’t instant success. Almost always, the marketing is done to the same audience who has steadfastly resisted reading anything beyond superheroes or similar male-targeted fantasy/adventure. Why expect anything beyond a small percentage of crossover? ... DC and Marvel largely don’t know how to market outside the superhero audience, and when they do usually give it such a miniscule budget that penetration is minimal. Conventional wisdom would say to hire a marketing firm that does know how to reach the target demographic, but of course that requires money.
Does this sound familiar? To my ears, it does: it's the same problem I've touched on before about how it's easier to sell what you know to the people you know than it is to sell something new to anyone at all. Why? Because the latter requires, oh my gosh, work and money. More than that, though, it requires risk, and that's the one word you never want to speak out loud in front of a publisher lest you have them whip out the garlic and crosses.
One of the best comments on this article:
My fervent wish as a fan is to give me something I didn't know I wanted in the first place.
This, to me, is the reason to appreciate most anything creative. Not to get more of what you already know you want, but to be introduced to something you never knew you wanted, and to want more of it.
It happened to me with Kurosawa — I knew almost nothing about Japan or Kurosawa when I saw Ran, but after I walked out of it I wanted to know every damn thing there was to know about both the country and the director. I had never known I could have such curiosity about something, anything.
The same thing happened with Koyaanisqatsi. If I hadn't encountered that movie at an age when my understanding of films was still relatively plastic — that is, that they didn't have to be any particular kind of thing, that they could be as open-ended and all-encompassing as
One of the things I've pounded on ceaselessly in these posts is the need for people to get out of their experiential and prejudicial bubbles — especially if they consider themselves creators. It's too easy to get comfortable, and worse, to never know just how much that comfort is holding you back.
Not likely to be posting much for the next couple of weeks, as work and the final preparations for Flight of the Vajra have devoured my attention. Look for me on the other side of September.
While on the lead-up to the release of Flight of the Vajra, I've talked a lot about the way most of our talk about the future is technological and not social or personal — that we don't feel like the people we'll be in a hundred years will be appreciably different from the people we are right now. Vajra suffers, I think, from much of that as well, but at least I tried to take that awareness with me into the heart of the book and do something with it, instead of just letting it overwhelm me or laying traps.
The only way we know what a future humanity can be like is by becoming something else, even if only a little at a time. The people we are now as opposed to a hundred years ago are strikingly different — not just in terms of what we know, but what we're inclined to do, what we want out of our world, and what we're prepared to do to get it. I would like to think we've become incrementally more humane over time, even if there is still violence in our lives, and that we are on the whole less beholden to superstition and compulsive stupidity. How we go about getting to that also makes a lot of difference.
... a healthy book industry is a diverse one, in which it’s possible for a talented author to knock on several doors before resorting to self-publishing. The more gatekeepers, the better the odds ...
Too bad the book industry is going in the opposite direction, as the editorial points out. Penguin and Random House have since merged and are now 25% of the entire global market for publishing.
It's tempting to suggest these big outfits will create sub-imprints that target specific markets, in essence recapitulating the way the big labels acquired indies and let them run their own A&R, simply handling the distribution. But I suspect it'll be more akin to the way George Lucas's predictions about the multiplex went terribly wrong. Instead of having art films and blockbusters cheek-by-jowl in the same building, we ended up with back-to-back showings of the same lousy movies — except in 2D, 3D, and glorious IMAX.
Krugman's column the other day quoted Raymond Chandler, in a way that brought back to mind what I've been thinking about re: the profundity problem.
Other things being equal, which they never are, a more powerful theme will provoke a more powerful performance. Yet some very dull books have been written about God, and some very fine ones about how to make a living and stay fairly honest.
For those who missed it earlier, I once came across Tibor Fischer talking about the problem of profundity in the arts. Or rather, the problem of profundity in certain artists. It's something you either have or you don't, and striving for profundity just makes you look belabored and silly.
And a big part of why you either have it or you don't, I think, is how you look at the things you're drawn to. As Chandler hinted, you can be drawn to very grand subjects but find that you have little to say about them, because you simply don't see anything that isn't already visible to innumerable others. On top of that, you haven't found a container for expressing what you see that is also compelling by itself.
I run into this problem a lot with fledgling writers, who "just want to tell a good story". And while that by itself is fine, they always seem to believe the act of thinking too deeply about what they're doing will ruin it — an analogue of what I've called Cutting The Drum Open To See What Makes It Go Bang. But the other side of that is not engaging at all with what makes a story resonant in the first place — what compels people to go back and read it again, or even better, stand in line for when your next production comes out.
I'll have more to say about this later, as I have the final bit of Flight of the Vajra edits to put to bed this weekend.
... what I’m proposing is that finance, and indeed consumer Internet companies and all kinds of other people using giant computers, are trying to become Maxwell’s demons in an information network. ... [W]ith big computing and the ability to compute huge correlations with big data, it becomes irresistible. ... [But] what’s wrong with that is that you can’t ever really get ahead. What you’re really doing then is you’re radiating waste heat. I mean, for yourself you’ve created this perfect little business, but you’ve radiated all the risk, basically, to the society at large. And if the society was infinitely large and could absorb it, it would work. There’s nothing intrinsically faulty about your scheme except for the assumption that the society can absorb the risk.
Lanier's views have inspired a lot of ire from people I know — they react to him in the ways that, say, talking heads on TV react to bad news about anthropogenic climate change: by getting angry, or attacking him personally, or just claiming his ideas don't make any sense. And even while I disagree with the way he proposes to solve some of the problems, I agree with his diagnoses of the underlying issues. The more you build an economy around diffusing risk instead of creating things, the more you shift costs into places where they are not detected until they become impossible to sustain.
If you want a classic example of the Folly of the Quants, look no further than Microsoft and Windows 8.
For those of you who don't know the story, Microsoft ditched the classic Start Menu from Windows 8 and replaced it with a full-screen menu that makes people want to put their fists through their monitors. Me included. This was bad enough, but they then tried to justify this nonsensical decision by citing user-behavior telemetry that allegedly showed the Start Menu just wasn't used all that much.
Lies, damned lies, and statistics, said I. Sure, you could say that nobody uses the Start Menu, therefore let's ditch it — which sounds to me more like a justification in hindsight than anything else. But what about the few people who do use it? I know I get tons of use out of it, pinning to the Taskbar notwithstanding. I'm reminded of similar stats about readers: not a lot of people actually read, but those that do read, read voraciously.
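That "lies, damned lies" point is easy to demonstrate. Here's a minimal sketch, with entirely made-up numbers (not Microsoft's actual telemetry), of how an aggregate usage figure can make a feature look abandoned even when a minority of users depends on it heavily:

```python
# Hypothetical telemetry: 90 of 100 users never open the Start Menu,
# while 10 heavy users open it 50 times a week each.
launches = [0] * 90 + [50] * 10

mean = sum(launches) / len(launches)
median = sorted(launches)[len(launches) // 2]
# Fraction of all Start Menu activity coming from users who use it at all:
heavy_share = sum(n for n in launches if n > 0) / sum(launches)

print(f"mean launches per user:   {mean}")        # 5.0 -- "nobody uses it"
print(f"median launches per user: {median}")      # 0   -- "really, nobody"
print(f"share of activity from active users: {heavy_share:.0%}")  # 100%
```

The mean and median both suggest the feature is dead weight, yet every single launch comes from people for whom it's a daily tool — the same shape as the statistic about readers above.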
I've written more neutrally about this issue elsewhere, but this is my personal opinion: using all this to justify changing over to a UI organized predominantly for touch was a bad idea. Touch UIs are not the problem in themselves; lousy motives are. (And Windows 8.1 goes some distance toward ameliorating these issues.)