To build the future, we must escape the present, or, "The bullet hole misconception"

Air force pilots in WWII got shot up like crazy and suffered appalling levels of fatalities; in an effort to save airmen, the Allies used statistical analysis to determine where the planes that limped home had taken flak and armored up those sections, which totally failed to work. That's because the planes that made it home had suffered non-critical damage, so shoring up the sections where they'd been hit had virtually no effect on the rate at which flak to critical sections of the aircraft knocked planes out of the sky. In other words, by looking at survivors rather than the dead, they were protecting the least important parts of the planes.
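The trap is a statistical one, and a toy simulation makes it concrete. This is just an illustrative sketch with invented hit and kill probabilities, not anything from the original post or talk: if you only tally damage on the planes that come back, the sections where damage is most lethal look the least hit, because planes hit there rarely return.

```python
import random

# Purely illustrative, made-up numbers: each section has some chance of
# taking flak on a sortie, and a hit on a critical section (engine,
# cockpit) is far more likely to bring the plane down.
SECTIONS = {
    # section: (probability of being hit, probability a hit downs the plane)
    "fuselage": (0.5, 0.05),
    "wings":    (0.5, 0.05),
    "engine":   (0.3, 0.60),
    "cockpit":  (0.2, 0.60),
}

def fly_sortie(rng):
    """Simulate one sortie; return (survived, list of sections hit)."""
    hits = [s for s, (p_hit, _) in SECTIONS.items() if rng.random() < p_hit]
    survived = all(rng.random() >= SECTIONS[s][1] for s in hits)
    return survived, hits

def tally(n=100_000, seed=0):
    rng = random.Random(seed)
    hits_on_returners = {s: 0 for s in SECTIONS}  # what the analysts saw
    hits_on_all = {s: 0 for s in SECTIONS}        # what actually happened
    for _ in range(n):
        survived, hits = fly_sortie(rng)
        for s in hits:
            hits_on_all[s] += 1
            if survived:
                hits_on_returners[s] += 1
    return hits_on_returners, hits_on_all

if __name__ == "__main__":
    returners, everyone = tally()
    for s in SECTIONS:
        print(f"{s:8}  hits on returning planes: {returners[s]:6}"
              f"  |  hits across all planes: {everyone[s]:6}")
```

With these invented numbers, the engine and cockpit show far fewer hits among returning planes than they actually took, which is exactly the inference trap described above.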


In a transcript of a speech given at Voxxed Days Belgrade, Daniel G Siegel describes how this "bullet hole misconception" traps technology designers: their survivor bias leads them to solve only the problems that are sufficiently unserious to survive long enough for someone to notice them. It's like a customer service department that only hears from the people whose experience is bearable enough that they complete their purchases and then complain, but hears nothing from the vastly larger number who never make it to the cash register because of some process failure.

This is particularly exacerbated by the professionalization and financialization of the tech sector. When there was no defined pipeline into computing, people came from lots of different disciplines; when the cash rewards for tech sector success were modest, people came because of their passion, not their dreams of riches. The homogenization of tech around people with engineering degrees who want to get really rich puts the focus on a very narrow class of production and innovation techniques.


What's really fascinating here is that one of the reasons why the early days of computing were so interesting was that nobody back then was a computer scientist.

Everybody who came into computing back then came with a different background, different interests, and knowledge of other domains. They didn't know anything about the computer and therefore tried everything they could think of while figuring out how to leverage it to solve their problems.

So what are we talking about here? Well, it is not about optimization, not about disruption, not about just the next thing, and not about adding armor to the wrong places on bombers. It is about rotating the point of view.

When I meet people who work at the world's leading tech companies, I ask them why they don't look at the long-term consequences of what they do to society. And I ask why they don't allow radical new ideas that augment human capabilities, or why their so-called innovation and disruption is merely recycled old stuff in new clothing. They answer that these problems are hard to solve. But let's be honest: the Googles and Facebooks and Twitters and virtually every tech company in the world solve similarly difficult technical problems every single day.

I want to suggest that this is not entirely a technological problem. Rather, it's a problem of profits and politics.

the bullet hole misconception [Daniel G Siegel]

(via Four Short Links)