/ Cory Doctorow / 7 am Mon, Jun 8 2015

    Internet users care about their privacy but have given up on safeguarding it

    It's not a fair trade, and everyone knows it.

As reported this morning, an Annenberg survey of Internet users found that Americans do not feel that trading their privacy to advertisers for access to networked services is a fair trade -- but they have given up on trying to understand or protect their privacy online.

This finding runs contrary to the "revealed preferences" argument given by Internet marketers: no matter what people say about their privacy concerns, their actions reveal that they don't mind giving up their privacy online. The researchers suggest that because Internet users can't figure out how to prevent their data loss, they have resigned themselves to the creation of dossiers on their lives, habits and preferences by anonymous aggregators and Internet giants as an unavoidable fact of life.

What's more, many Americans believe a set of incorrect "facts" about the regulation of advertising that they use to comfort themselves -- for example, that supermarkets must obtain their permission before selling information about their shopping, or that online stores are not allowed to offer different prices to different customers based on secret consultations with repositories of private, personal information on those shoppers.

    In my view, much of this problem is the result of tools that, by default, give up private data to third parties. For more than a decade, it's been normal for browser makers to take extraordinary steps to block pop-up ads out of the box; no similar effort is made to block third-party cookies, evercookies, browser profiling, etc. Likewise, mobile OSes offer us take-it-or-leave-it permissions for our apps: we can choose to give an app access to our data, or we can choose not to install the app -- but we can't choose to install the app and then block or spoof its attempts to pull that information from our devices. This carries over to Web apps that get permission from Twitter, Google, Facebook, etc.
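The missing "third option" described above -- install the app, but block or spoof its data requests -- can be made concrete with a small sketch. This is purely illustrative; the shim, its policy values, and the data keys are all invented for the example, not any real mobile OS API:

```python
# Illustrative sketch only: a hypothetical permission shim that mediates
# an app's data requests with three outcomes -- allow, deny, or spoof --
# rather than today's take-it-or-leave-it install-time choice.

import random

# Stand-in for the real data the OS holds on the user's behalf.
REAL_DATA = {
    "contacts": ["Alice", "Bob"],
    "location": (40.7128, -74.0060),
}

class PermissionShim:
    """Answers an app's data requests according to a per-key policy."""

    def __init__(self, policy):
        # policy maps a data key to "allow", "deny", or "spoof"
        self.policy = policy

    def request(self, key):
        decision = self.policy.get(key, "deny")
        if decision == "allow":
            return REAL_DATA[key]
        if decision == "spoof":
            # Return plausible but fake data, so the app keeps working
            # without learning anything real about the user.
            if key == "contacts":
                return []
            if key == "location":
                return (random.uniform(-90, 90), random.uniform(-180, 180))
        return None  # denied

shim = PermissionShim({"contacts": "spoof", "location": "spoof"})
print(shim.request("contacts"))  # the app sees an empty contact list
```

The point of the sketch is the policy table: the user, not the app developer, decides per data source whether the app gets the truth, nothing, or a convincing fake.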

    The fight over pop-up ads is really instructive here. When pop-ups were the norm, publishers and advertisers insisted that the ad-supported Web would collapse unless browser-makers deliberately withheld the ability to block them from Internet users. When pop-up blocking by default became the norm, the use of pop-ups dwindled to nearly nothing -- and what's more, advertisers stopped demanding that publishers deploy pop-ups. Why bother, when no one will see them?

    I don't know if a privacy market is possible. It may be that people are just not capable of pricing the net-present value of a disclosure that will be used against them in the future.

But if privacy markets are possible, they'll only emerge when users are able to control the flows of data from their tools and devices. What we have today isn't a privacy marketplace, it's a privacy all-you-can-eat buffet. The "sellers" grab any and all private info they want from users, and the users' only remedy is to walk away -- and even then, it's nearly impossible to tell whether you're still doing business with them (even if you never use Facebook and have no account with the service, every page with a Facebook Like button sends information about your habits to Facebook).

    The good news is that there is evidence for a different kind of privacy marketplace: one in which toolmakers offer access to existing services in ways that obfuscate, chaff and spoof those services. There are legal impediments to this -- the Computer Fraud and Abuse Act and section 1201 of the DMCA, for starters -- but when the majority of Americans want a product that no one is willing to sell to them, your entrepreneurial instincts should be very excited indeed.

    Marketers enthuse over the idea that people’s acceptance of the general idea of tradeoffs justifies marketers’ collection of enough data points about consumers to lead to the kind of personalization Yahoo calls “the pathway to advertising nirvana.” Privacy advocates and researchers, by contrast, have challenged the assertion that people make rational calculations when they “opt-in” to privacy-atrophying services. These groups argue that decisions about information disclosure are not as prone to contradictions between their expressed attitudes and their actions as the privacy paradox would suggest. Instead they have argued that individuals want to manage their data but lack the necessary knowledge to make informed decisions about the consequences of entering into interactions with marketers.

    Someone observing the behaviors of consumers might believe people are indulging in tradeoffs when they are actually resigned to giving up data. There need not be any differences between how individuals who are resigned and those who believe in tradeoffs act in marketing situations. Our goal in conducting this survey was to look behind the activities to examine people’s beliefs about what they are doing and why. How broadly does a sense of resignation exist among Americans? How broadly does a belief in the appropriateness of tradeoffs exist among Americans? How does people’s knowledge about everyday data gathering activities of marketers and the laws that govern them relate to their sense of resignation or belief in tradeoffs? And how do knowledge and resignation relate to people’s decisions in a realistic supermarket scenario that offers discount coupons for data?

    The Tradeoff Fallacy [PDF] [Joseph Turow, Michael Hennessy & Nora Draper/UPenn]

    (via /.)

    (Image: Anti Facebook Stickers, Ello Mike Mozart, CC-BY)


    Notable Replies

    1. Fact is, dealing with your privacy protection is an impossible task.

First, even if you lead a simple life, you're probably going to be dealing with dozens of different organizations, each with its own privacy agreement written in difficult-to-understand legalese, several pages long, which the organization reserves the right to reword whenever it feels like it. Who has time to keep up with that?

Second, it's not even clear whether taking the proper steps to opt out of their data sharing makes any difference at all. It's not like you have any way of monitoring the proper safeguarding of your data and violations of your agreement. Even if you could prove someone broke the privacy agreement, what sort of penalty could you exact? Most of the time, you'll be offered credit monitoring services for a few years, but what good is that going to do against advertisers or foreign governments? And if an organization goes bankrupt, quite often your data gets purchased by someone else with whom you have no agreement and over whom you have no control, and there doesn't seem to be anything you can do about it.

    2. Without even trying to hint that BB has unusually invasive practices in this regard for its size, here's where a little self-disclosure in context could go a long way. Put this in context for the people reading this on your website. What sorts of things do you know/track/see/think/remember/care about with respect to us?

      I would genuinely appreciate knowing that, in a fashion other than what might be gleaned from a close reading of some fine print somewhere--although I can appreciate that you've tried to make your fine print readable. But no matter how accessible the language you use to talk about this or that third-party data analytics sharing arrangement, it becomes difficult to comprehend for people who aren't steeped in the business end of the internet. Can you explain it as thoroughly and as accessibly as you might explain the plot of a cool YA science fiction story?

For example, and this is ultra-trivial, but I remember Antinous (of blessed memory) sarcastically apologizing to a commenter for having to live in the location that the commenter's IP address pointed to. Of course I already knew that incoming IP addresses were basically visible to people at the other end, and I suppose it followed that Antinous (whoever or whatever s/he ultimately was to the site) could have access to that information, but it was kind of amazing, in a jarring sort of way, to see that so openly turned back on a user. Yet that's not a very common sort of admission -- and I think the average web-surfer could be forgiven for forgetting that people really are peering out at her from the Panopticon.

      I assume, if only because there's been no reason not to, that BB does nothing unusually shady or invasive for a website of its traffic and for-profit status. In other words, this isn't a "gotcha!" question--I've already made my peace with the baseline. But how about a look at what it means from the perspective of people who, however fully disclosed, are collecting data on us from cookies and analytics and whatever else?

    3. we can choose to give an app access to our data, or we can choose not to install the app -- but we can't choose to install the app and then block or spoof its attempts to pull that information from our devices.

This is not entirely correct: for Android there is XPrivacy -- https://play.google.com/store/apps/details?id=biz.bokhorst.xprivacy.installer

Granted, it's only going to be usable by the small subset of people who have a recent version of Android and a rooted device. But I enjoy knowing that I'm sending back random information.

Well, I think that's part of the problem there -- the knowledge barrier to even attempting to control your data.

      I mean, I'm not a programmer, but I'm reasonably computer savvy- I know how to maintain my machines, install hardware and drivers, troubleshoot basic problems, use 5 different OSs every day, and know enough HTML and CGI to be able to put together a nice website from scratch or Drupal. By no means an expert, but I feel confident saying that I know how to use a computer.*

      I never heard of XPrivacy. I have no idea how I would even learn how to root my phone, and I'm worried it would impair my ability to use the two pieces of software I bought the phone to run in the first place. I've actually tentatively looked into it, and here's a major problem: I don't know enough about programming or hacking to even tell which instructional sites are legitimate, let alone how to fix it if I screw up.

It's a paradox I've run into before in other contexts: when the degree of knowledge you need in order to determine whether someone will be a worthwhile teacher is itself enough knowledge that you no longer need that teacher.

And the thing is, I'm sure I could learn programming, but it would mean taking time away from making music, running my business(es), and my charity work -- which my day job is already cutting into more than I can really manage.

      So, what to do?


      *Seriously, follow that link. It's a great and depressing article.

    5. This is a big one a lot of Android hackers completely neglect. For a lot of (non-Nexus) phones, the process to root them is "Hey, run this random binary that I (some random from the Internet) made on your computer and/or phone, that uses a security exploit to get root access! I totally promise I won't take advantage of that at all!" Not to mention that most ROMs, even CyanogenMod, are either impossible or at the least extremely difficult to download securely compared to good practice in the GNU/Linux world.
