
"A reason to hang him": how mass surveillance, secret courts, confirmation bias and the FBI can ruin your life


Brandon Mayfield was a US Army veteran and an attorney in Portland, OR. After the 2004 Madrid train bombing, his fingerprint was partially matched to a print belonging to one of the suspected bombers, but the match was a poor one. By this point, though, the FBI was already convinced they had their man, so they rationalized away the non-matching elements of the print and set in motion a train of events that led to Mayfield being jailed without charge; his home and office burgled by the FBI; his attorney-client privilege violated; his life upended.

At every turn, the FBI treated evidence that contradicted their theory as evidence that confirmed it. Mayfield's passport had expired and he couldn't possibly have been in Madrid? Proof that he was a terrorist: he must be using his connections with Al Qaeda to get false papers so that his own passport isn't recorded as crossing any borders. Mayfield starts to freak out once he realizes he's under surveillance? Aha! Only the guilty worry about having their homes burgled by G-men!

The FBI was so sure of their theory that they lied to a judge during their campaign against him. His story is the perfect embodiment of "confirmation bias" -- the tendency of human beings to give undue weight to evidence that confirms their existing belief and to discount evidence that rebuts it. Confirmation bias is one of the underappreciated problems of mass surveillance: gather enough facts about anyone's life and you can find facts that confirm whatever theory you have about them.

Or, as Cardinal Richelieu said: "If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him." This line is the epigraph of my story Scroogled (here's Wil Wheaton's reading of it), about the risks of automated, unaccountable attributions of guilt based on algorithms that are not subject to scrutiny. But as bad as automated attribution of guilt can be, it's nothing compared to the directed attribution of guilt from cops who are absolutely sure that they have their man.


Capitalism, casinos and free choice


Tim "Undercover Economist" Harford's column "Casinos’ worrying knack for consumer manipulation" takes a skeptical look at business and markets -- specifically their reputation for offering a fair trade between buyers and sellers. Inspired by Natasha Dow Schüll's Addiction by Design: Machine Gambling in Las Vegas, a 2012 book on the calculated means by which gamblers are inveigled to part with more money than they consciously intend to, Harford asks a fundamental question about capitalism: are markets built on fair exchanges, or on trickery?


Behavioral economics of Free to Play games

Ramin Shokrizade's "Top F2P Monetization Tricks" shows how the free-to-play world deploys practical behavioral economics to convince players to spend more than they intend to, adapting to players to hook them and then pry open their wallets wider and wider. I was very interested to learn that some games look for behaviors that mark out "spenders" and convert themselves from "skill games" (win by being good at them) to "money games" (win only by spending):


A game of skill is one where your ability to make sound decisions primarily determines your success. A money game is one where your ability to spend money is the primary determinant of your success. Consumers far prefer skill games to money games, for obvious reasons. A key skill in deploying a coercive monetization model is to disguise your money game as a skill game.

King.com's Candy Crush Saga is designed masterfully in this regard. Early game play maps can be completed by almost anyone without spending money, and they slowly increase in difficulty. This presents a challenge to the skills of the player, making them feel good when they advance due to their abilities. Once the consumer has been marked as a spender (more on this later) the game difficulty ramps up massively, shifting the game from a skill game to a money game as progression becomes more dependent on the use of premium boosts than on player skills.

If the shift from skill game to money game is done in a subtle enough manner, the brain of the consumer has a hard time realizing that the rules of the game have changed. If done artfully, the consumer will increasingly spend under the assumption that they are still playing a skill game and “just need a bit of help”. This ends up also being a form of discriminatory pricing as the costs just keep going up until the consumer realizes they are playing a money game.
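The shift Shokrizade describes can be sketched in a few lines of deliberately simplified pseudologic (this is invented for illustration, not King's actual code): difficulty ramps gently for everyone, then steeply once a player has been flagged as a spender, making premium boosts the practical path forward.

```python
# Invented sketch of a "skill game to money game" difficulty curve.
# The thresholds and multipliers here are illustrative, not real data.
def level_difficulty(level: int, is_spender: bool) -> float:
    base = 1.0 + 0.05 * level              # gentle skill curve for everyone
    if is_spender and level > 30:
        base *= 1.0 + 0.10 * (level - 30)  # steep ramp once flagged as a spender
    return base

print(level_difficulty(40, is_spender=False))  # 3.0
print(level_difficulty(40, is_spender=True))   # 6.0
```

At level 40 the flagged spender faces twice the difficulty of an identical non-spender, with no visible change in the game's rules.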

The Top F2P Monetization Tricks (via O'Reilly Radar)

(Image: image, a Creative Commons Attribution (2.0) image from 76969036@N02's photostream)

Homeless man's A/B test of generosity based on faith


Redditor Ventachinkway caught a photo of a homeless man conducting a clever exercise in behavioral economics disguised as an inquiry into the levels of spontaneous generosity as determined by religious creed or lack thereof.

When I passed him he proudly announced "The atheists are winning!" (i.imgur.com) (via Glinner)

Twenty Four Standard Causes of Human Misjudgement

A great post on Metafilter turned me on to "Twenty Four Standard Causes of Human Misjudgement," a classic 1995 speech by Charlie Munger (much cited, and transcribed here in PDF), in which Munger (a respected investor and partner to Warren Buffett) lays out, in plain language, the cognitive biases and blind spots that he views as the root of much human misery.

Munger's thinking is greatly influenced by Robert Cialdini's classic popular psychology text Influence, a title that Munger credits with laying out many of the blind spots of both economics and psychology. Munger's thinking is collected in another book: Poor Charlie's Almanack: The Wit and Wisdom of Charles T. Munger.

I converted the talk to MP3 and listened to it twice today. I think I'll return to it again -- this feels like one of those mind-dumps that contains so much to pore over that it might be a work of years.

Cyclists are safe and courteous, and your disdain for them is grounded in cognitive bias


Jim Saska is a jerky cyclist, something he cheerfully cops to (he also admits that he's a dick when he's driving a car or walking, and explains the overall pattern with a reference to his New Jersey provenance). But he's also in possession of some compelling statistics that suggest that cyclists are, on average, less aggressive and safer than they were in previous years, that the vast majority of cyclists are very safe and cautious, and that drivers who view cycling as synonymous with unsafe behavior have fallen prey to a cognitive bias that isn't supported by empirical research.

The fact is, unlike me, most bicyclists are courteous, safe, law-abiding citizens who are quite willing and able to share the road. The Bicycle Coalition of Greater Philadelphia studied rider habits on some of Philly’s busier streets, using some rough metrics to measure the assholishness of bikers: counting the number of times they rode on sidewalks or went the wrong way on one-way streets. The citywide averages in 2010 were 13 percent for sidewalks and 1 percent for one-way streets at 12 locations where cyclists were observed, decreasing from 24 percent and 3 percent in 2006. There is no reason to believe that Philly has particularly respectful bicyclists—we’re not a city known for respectfulness, and our disdain for traffic laws is nationally renowned. Perhaps the simplest answer is also the right one: Cyclists are getting less aggressive.

A recent study by researchers at Rutgers and Virginia Tech supports that hypothesis. Data from nine major North American cities showed that, despite the total number of bike trips tripling between 1977 and 2009, fatalities per 10 million bike trips fell by 65 percent. While a number of factors contribute to lower accident rates, including increased helmet usage and more bike lanes, less aggressive bicyclists probably helped, too...

...[Y]our estimate of the number of asshole cyclists and the degree of their assholery is skewed by what behavioral economists like Daniel Kahneman call the affect heuristic, which is a fancy way of saying that people make judgments by consulting their emotions instead of logic.

The affect heuristic explains how our minds take a difficult question (one that would require rigorous logic to answer) and substitutes it for an easier one. When our emotions get involved, we jump to pre-existing conclusions instead of exerting the mental effort to think of a bespoke answer. The affect heuristic helps explain why birthers still exist even though Obama released his birth certificate—it’s a powerful, negative emotional issue about which lots of people have already made up their minds. When it comes to cyclists, once some clown on two wheels almost kills himself with your car, you furiously decide that bicyclists are assholes, and that conclusion will be hard to shake regardless of countervailing facts, stats, or arguments.
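The Rutgers/Virginia Tech numbers quoted above support a quick back-of-envelope check: if trips tripled while the per-trip fatality rate fell 65 percent, the absolute number of deaths stayed roughly flat.

```python
# Back-of-envelope check of the quoted study figures: bike trips tripled
# between 1977 and 2009 while fatalities per 10 million trips fell 65%.
trips_multiplier = 3.0        # total bike trips tripled
rate_multiplier = 1 - 0.65    # per-trip fatality rate fell 65%

total_fatalities_change = trips_multiplier * rate_multiplier
print(f"Total fatalities changed by a factor of {total_fatalities_change:.2f}")
```

In other words, roughly three times as many rides produced about the same number of deaths, which is consistent with riders getting safer rather than merely scarcer.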

Why You Hate Cyclists (via Skepchick)

(Image: Cyclists Sign, a Creative Commons Attribution (2.0) image from kecko's photostream)

Tor project considers covering costs for exit nodes

The maintainers of the Tor Project -- which provides more anonymous and private Internet use by bouncing traffic around many volunteers' computers -- are considering paying $100/month to people who maintain high-speed "exit nodes." "Exit nodes" are the last hop in the Tor chain, and they sometimes attract legal threats and police attention, which makes some people reluctant to run them. As a result, there aren't enough exit nodes to provide really robust anonymity for Tor users. Tor hopes that by covering costs for organizations and individuals who are willing to provide exit nodes, they'll get more diversity in the population of exits. Darren Pauli has more in SC magazine:

"We've lined up our first funder BBG, and they're excited to have us start as soon as we can," Dingledine wrote on the Tor mailing list.

The backflip came about because exit node diversity was low: most Tor users choose one of just five of the fastest exit relays about a third of the time, from a pool of about 50 relays.

"Since extra capacity is clearly good for performance, and since we're not doing particularly well at diversity with the current approach, we're going to try [the] experiment," he said.
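To see why a handful of fast exits can dominate, here's a toy model of the bandwidth-weighted selection Tor clients use (the relay counts and bandwidth figures below are invented for illustration, not real Tor data):

```python
# Toy model: Tor clients weight relay selection by advertised bandwidth,
# so a few fast exits carry an outsized share of circuits.
# All weights here are invented for illustration.
relays = {f"relay{i}": 1.0 for i in range(45)}       # 45 ordinary exits
relays.update({f"fast{i}": 6.0 for i in range(5)})   # 5 high-bandwidth exits

total_weight = sum(relays.values())
top5_share = sum(
    w for name, w in relays.items() if name.startswith("fast")
) / total_weight
print(f"Share of circuits using the 5 fastest exits: {top5_share:.0%}")
```

With only ten percent of the relays, the fast five end up carrying a large fraction of traffic -- which is exactly the concentration the funding experiment is meant to dilute.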

Tor Project mulls $100 cheque for exit relay hosts

(Image: Counterfeit $100 Bill, a Creative Commons Attribution (2.0) image from travisgoodspeed's photostream)

Dan Ariely explains why we cheat and steal, and how we're generally wrong about this

On the occasion of the publication of a new book, behavioral economics writer Dan Ariely (a great favorite of mine) answers questions from Wired about the underlying causes of lying and cheating, and the huge gap between what the evidence tells us and what we intuitively believe.

Ariely: If you thought that crime or dishonesty is driven by a cost-benefit analysis, then you have some very basic solutions — for example, put people in prison. And people who were going to commit a crime would say, ‘Okay, I’ll go to prison, not worth it.’ I’ve been talking to big cheaters, including people who have been to prison, and I tell you, nobody I’ve talked to has ever thought about the long-term consequences of their actions. How many people who did insider trading thought about the probability of being caught and how much time they would get in prison? The number is incredibly close to zero, maybe exactly zero. What will happen if we increase the prison sentence? Basically nothing, because it’s not part of their mindset. What we need to understand is the process by which people become dishonest.

We can look at a cheater and say, we would have never been able to do that. But when we look at the long sequence of events, you see it happened over time. You can ask, did the person who was the criminal think they would take all of these actions, or did they just take one? They took one step that they could rationalize. And after they took one, they became a slightly different person. And then they took another step, and another step. And now you think very differently about dishonesty.

Why We Lie, Go to Prison and Eat Cake: 10 Questions With Dan Ariely

What metrics-driven games have a hard time measuring

Raph Koster's on a tear these days on the theory and practice of game design. Today, it's a fab little sermonette on why it's not right to sneer at data-driven, "free-to-play" games that use extensive instrumentation to make games that captivate players' attention without a lot of flair or imagination. But Koster has a codicil to his message embracing metrics-driven game design: there is much that is important about games that isn't captured by metrics:

And the more the audience divorces itself from we who make their entertainment, the more important it is that we be clear-eyed about what their tastes and behaviors actually are. And that, in turn, greatly undermines the value of “experts” — because we are, in many ways, the most likely to be hidebound and unable to see past the blinkered assumptions precisely because we built them up with hard-won experience.

But! And it’s a big but.

Sometimes, though, what works only works within the field of measurement. If it turns out one of those useless mewling babies was going to grow up to be Einstein, we would have been pretty dumb to toss him out when he was a sullen teenager (even if he did get good grades). A lot of things fall outside of the typical field of measurement.

* Anything that unfolds over a very long period of time. By the time you have true long-term data on a split-test, you’ve essentially chosen a path through inaction.
* Anything that lies in the realm of emotion is invisible — we can easily see results, but we cannot see, barring a focus group, the whys for a given action. (There are various measurement techniques, such as net promoter score, which try to get at this indirectly).
* Anything that is a short-term loss for a long-term gain. Many sorts of behaviors players might engage in may pay out when considered as a systemic aggregate, even though regarding them as a funnel may show them to be terrible. One example might be character customization — it’s an extra step that likely costs some users in an F2P funnel, but it may also yield far greater revenue over time through the attachment players develop to their customized characters.
* Anything that exists outside of the game proper, where it can be hard to tie cause and effect together. Examples include things like community development, the value of strategy websites built by players, etc.
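Koster's first and third points can be illustrated with a toy split-test (all numbers below are invented): a feature like character customization can lose on a short-term conversion funnel while winning on longer-horizon revenue, so the two metrics pick different "winners."

```python
# Invented split-test data: a feature that hurts day-1 conversion
# but pays off in 90-day revenue. Which variant "wins" depends
# entirely on the field of measurement.
variants = {
    "no_customization": {"day1_conversion": 0.050, "day90_revenue": 1.00},
    "customization":    {"day1_conversion": 0.042, "day90_revenue": 1.40},
}

short_term_winner = max(variants, key=lambda v: variants[v]["day1_conversion"])
long_term_winner = max(variants, key=lambda v: variants[v]["day90_revenue"])
print(short_term_winner, long_term_winner)  # different answers
```

A team that only watches the day-1 funnel would kill the feature before its payoff ever appeared in the data.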

Koster endeth the lesson with some constructive suggestions for fusing traditional and data-driven game design to get the best out of both.

Improving F2P

(Image: Metrics and UX, sitting in a tree, a Creative Commons Attribution Share-Alike (2.0) image from alabut's photostream)

XKCD: Why you should give bad reviews to hotels you like

Today's XKCD proposes a strangely optimal strategy for reviewing the hotels you love, provided you don't mind being a jerk. He calls it the "tragedy of you're a dick."

Who killed videogames? Beautifully written account of behavioral economics and social games

Tim Rogers's essay "Who killed videogames? (a ghost story)" is one of the most interesting pieces of technology reporting I've ever read. It's a long (long!) account of the mechanics of "social games" where psychomathematicians or behavioral economists or engagement designers (all variations on the same theme) create systems to make games compelling without being enjoyable. The sinister science of addictive game design is practiced -- in Rogers's account -- by people who don't like games or gamers, who actually hold them in contempt, and see no reason not to entrap them in awful, life-sucking systems designed to separate them from their money without giving any pleasure or service in return. I've always suspected this to be true, and Rogers's account is awfully well-written and convincing:

The larger man spoke. He gestured while doing so. “You teach the player how to play the game in one minute. Within that one minute, you give them in-game money. You make them spend all of that money to buy an investment that will begin to earn them profit. They build a thing. It says: this thing will be finished in five minutes. Spend one premium currency unit to have it now. You happen to have one free premium currency unit. The game makes you use it now. Now you have a thing. Now it says to wait three minutes to collect from that thing. So they have a reason to stick around for three minutes. When those three minutes are up, you tell them to come back in a half an hour. You say, ‘You’re done for now. Come back in a half an hour.’ The phone sends them a push notification in a half an hour. Right here, you’re telling them to wait. You’re expressing to them the importance of patience. They’re never going to forget the way it feels to wait a half an hour after playing a game for one minute. They’re going to forget the second time they wait for a half an hour, and the third time, and they’ll then not forget the first time they have to wait for four hours, then twenty-four hours. This is why they’ll start to pay to Have Things Right Now.

“So after the first half hour, they get a push notification. Their phone vibrates. It tells them their such-and-such is ready for collection.” The Other Men don’t make any sound. They have collectively folded their hands alongside their Alpine Crystal Spring Superclear Water bottles atop the glass table, collective face intent and weirdly worried, like that of a man hearing the beginning of a joke involving a rabbi, a toddler, and a lizard.

“They open the app. They collect from their such-and-such.

“Now the game tells them they’ve leveled up. It gives them some bonus coins. It tells them they’ve unlocked a new thing — a fancier thing.

As Alice notes, this is long, but the epilogue is the best part, and it loses its impact if you haven't read the rest. Keep reading.
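The escalating wait-timer loop the pitch describes can be sketched as a toy schedule (the durations and costs below are invented, not taken from any real game): each successive build takes longer, training the player to either wait or pay premium currency to finish now.

```python
# Invented sketch of the escalating wait-timer mechanic: 5 minutes,
# then half an hour, eventually a full day -- with a "skip" price
# that grows alongside the wait.
WAIT_SCHEDULE_MINUTES = [5, 30, 30, 240, 1440]  # 5 min ... 24 hours
PREMIUM_COST_PER_HOUR = 1                       # premium units to skip an hour

def skip_cost(build_index: int) -> int:
    idx = min(build_index, len(WAIT_SCHEDULE_MINUTES) - 1)
    minutes = WAIT_SCHEDULE_MINUTES[idx]
    return max(1, round(minutes / 60 * PREMIUM_COST_PER_HOUR))

print([skip_cost(i) for i in range(5)])  # [1, 1, 1, 4, 24]
```

The early skips are cheap on purpose; by the time the waits stretch to a day, the player has already been taught that paying is how waiting ends.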

(via Wonderland)