Do games and bad UIs account for rising IQs?

In this month's Wired, Steven Johnson talks about the fact that IQ scores have been on the rise for decades now, and seem to be accelerating. IQ testing companies need to "re-normalize" their tests every couple of years, making them harder so that the average score remains about 100. There's lots of controversy over what, if anything, IQ results mean, but Steven makes the point that IQ tests are certainly measuring something. Moreover, the areas in which the general population is testing better are precisely those tests that focus on reasoning out puzzles resembling bad user interfaces and/or video games.

Hence Steven's thesis: playing games and working with runaway, half-designed tech is making us smarter. It's a pretty cool idea:

When you take the Ravens test, you're confronted with a series of visual grids, each containing a mix of shapes that seem vaguely related to one another. Each grid contains a missing shape; to answer the implicit question posed by the test, you need to pick the correct missing shape from a selection of eight possibilities. To "solve" these puzzles, in other words, you have to scrutinize a changing set of icons, looking for unusual patterns and correlations among them.

This is not the kind of thinking that happens when you read a book or have a conversation with someone or take a history exam. But it is precisely the kind of mental work you do when you, say, struggle to program a VCR or master the interface on your new cell phone.

Over the last 50 years, we've had to cope with an explosion of media, technologies, and interfaces, from the TV clicker to the World Wide Web. And every new form of visual media – interactive visual media in particular – poses an implicit challenge to our brains: We have to work through the logic of the new interface, follow clues, sense relationships. Perhaps unsurprisingly, these are the very skills that the Ravens tests measure – you survey a field of visual icons and look for unusual patterns.

Link