As you read this morning, an Annenberg survey of Internet users found that Americans did not feel that trading their privacy to advertisers for access to networked services was a fair trade, but they'd given up on trying to understand or protect their privacy online.
This finding runs contrary to the "revealed preferences" argument made by Internet marketers: no matter what people say about their privacy concerns, their actions reveal that they don't mind giving up their privacy online. The researchers suggest that because Internet users can't figure out how to prevent their data loss, they have resigned themselves to the creation of dossiers on their lives, habits and preferences by anonymous aggregators and Internet giants as an unavoidable fact of life.
What's more, many Americans believe a set of incorrect "facts" about the regulation of advertising that they use to comfort themselves: for example, that supermarkets must obtain their permission before selling information about their shopping, or that online stores are not allowed to offer different prices to different customers based on secret consultations with repositories of private, personal information on those shoppers.
In my view, much of this problem is the result of tools that, by default, give up private data to third parties. For more than a decade, it's been normal for browser makers to take extraordinary steps to block pop-up ads out of the box; no similar effort is made to block third-party cookies, evercookies, browser fingerprinting, and the like. Likewise, mobile OSes offer us take-it-or-leave-it permissions for our apps: we can choose to give an app access to our data, or we can choose not to install the app, but we can't choose to install the app and then block or spoof its attempts to pull that information from our devices. The same all-or-nothing model carries over to Web apps that we authorize against our Twitter, Google and Facebook accounts.
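Here's a rough sketch of what the missing option, "install the app, then block or spoof its data requests," could look like. This is purely illustrative Python: the field names, policy values and mediation layer are invented for the example and don't correspond to any real mobile-OS API.

```python
# Hypothetical sketch: a mediation layer that sits between an app's data
# requests and the device, returning plausible decoys instead of real values
# when the user opts to "install but spoof".

import random

REAL_DATA = {
    "location": (40.4406, -79.9959),          # the device's actual coordinates
    "contacts": ["alice@example.com", "bob@example.com"],
}

def spoofed(field):
    """Return plausible-but-fake data for a field the app requested."""
    if field == "location":
        # A random point anywhere on the globe.
        return (random.uniform(-90, 90), random.uniform(-180, 180))
    if field == "contacts":
        return []                              # pretend the address book is empty
    return None

def handle_request(field, policy):
    """policy maps a data field to 'allow', 'block', or 'spoof'."""
    action = policy.get(field, "block")
    if action == "allow":
        return REAL_DATA.get(field)
    if action == "spoof":
        return spoofed(field)
    raise PermissionError(f"access to {field} denied")

# The user grants real location but spoofs the address book:
policy = {"location": "allow", "contacts": "spoof"}
print(handle_request("contacts", policy))      # -> []
```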
The fight over pop-up ads is really instructive here. When pop-ups were the norm, publishers and advertisers insisted that the ad-supported Web would collapse unless browser makers deliberately withheld pop-up blocking from Internet users. When blocking pop-ups by default became the norm anyway, the use of pop-ups dwindled to nearly nothing, and what's more, advertisers stopped demanding that publishers deploy them. Why bother, when no one will see them?
I don't know if a privacy market is possible. It may be that people are just not capable of pricing the net-present value of a disclosure that will be used against them in the future.
But if privacy markets are possible, they'll only emerge when users are able to control the flows of data from their tools and devices. What we have today isn't a privacy marketplace, it's a privacy all-you-can-eat buffet. The "sellers" grab any and all private info they want from users, and the users' only remedy is to walk away, and even then it's nearly impossible to figure out whether you've really stopped doing business with them (even if you never use Facebook and have no account with the service, every page with a Facebook Like button sends information about your habits to Facebook).
The good news is that there is evidence for a different kind of privacy marketplace: one in which toolmakers offer access to existing services in ways that obfuscate, chaff and spoof those services. There are legal impediments to this — the Computer Fraud and Abuse Act and section 1201 of the DMCA, for starters — but when the majority of Americans want a product that no one is willing to sell to them, your entrepreneurial instincts should be very excited indeed.
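As a toy illustration of the "chaff" idea, here's a short Python sketch that mixes a user's real queries with randomly timed decoys, so that the profile a service could build from the stream is mostly noise. The decoy terms, ratio and timing are invented for the example, and the sketch only prints the stream rather than sending anything to a real service.

```python
# Minimal sketch of "chaffing": interleave real queries with decoys so that
# the observable stream is dominated by noise. Illustrative only.

import random
import time

DECOY_TERMS = ["weather radar", "banana bread recipe", "used bicycles",
               "flight status", "crossword clues", "local news"]

def chaffed_stream(real_queries, decoys_per_real=3):
    """Emit real queries interleaved with decoys in a shuffled order."""
    batch = [(q, True) for q in real_queries]
    batch += [(random.choice(DECOY_TERMS), False)
              for _ in range(decoys_per_real * len(real_queries))]
    random.shuffle(batch)
    for query, is_real in batch:
        # A real tool would send this to the service; here we just print it.
        print(f"{'REAL ' if is_real else 'CHAFF'} -> {query}")
        time.sleep(random.uniform(0.1, 0.5))   # irregular pacing, harder to filter out

chaffed_stream(["symptoms of gout", "divorce lawyer near me"])
```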
Marketers enthuse over the idea that people's acceptance of the general idea of tradeoffs justifies marketers' collection of enough data points about consumers to lead to the kind of personalization Yahoo calls "the pathway to advertising nirvana." Privacy advocates and researchers, by contrast, have challenged the assertion that people make rational calculations when they "opt-in" to privacy-atrophying services. These groups argue that decisions about information disclosure are not as prone to contradictions between their expressed attitudes and their actions as the privacy paradox would suggest. Instead they have argued that individuals want to manage their data but lack the necessary knowledge to make informed decisions about the consequences of entering into interactions with marketers.

Someone observing the behaviors of consumers might believe people are indulging in tradeoffs when they are actually resigned to giving up data. There need not be any differences between how individuals who are resigned and those who believe in tradeoffs act in marketing situations. Our goal in conducting this survey was to look behind the activities to examine people's beliefs about what they are doing and why. How broadly does a sense of resignation exist among Americans? How broadly does a belief in the appropriateness of tradeoffs exist among Americans? How does people's knowledge about everyday data gathering activities of marketers and the laws that govern them relate to their sense of resignation or belief in tradeoffs? And how do knowledge and resignation relate to people's decisions in a realistic supermarket scenario that offers discount coupons for data?
The Tradeoff Fallacy [PDF] [Joseph Turow, Michael Hennessy & Nora Draper/UPenn]
(via /.)
(Image: Anti Facebook Stickers, Ello, Mike Mozart, CC-BY)