Civil rights implications of Big Data

Discuss

19 Responses to “Civil rights implications of Big Data”

  1. ZikZak says:

    We live in a “post-privacy” society.  We’ve known that for a little while now.
    We also live in an unjust and oppressive society, where combinations of class, race, gender, nationality, etc. cause some people to be treated worse than others.  We’ve known that for a long time now.

    But the intersection of these observations is important.  As they used to say “On the internet, nobody knows you’re a dog”…what will oppression and injustice look like in a society where any reasonably powerful institution knows everything about you everywhere all the time?

    • Mordicai says:

      I don’t know; we live in a post-anonymity world, but privacy is a contract.  Privacy isn’t “no one knows,” privacy is “no one looks.”  Like when I take a shower, I’m naked in the shower.  That isn’t a secret.  You could just open the door.  Even if I lock it, come on, a bobby pin can open that lock, it hardly counts.  But you DON’T open the door, because…privacy!  I think there is room for privacy in the modern world, even if anonymity is gone.

      • Interesting—”privacy isn’t what they know, it’s how they act on it” is certainly true in this world. It’s like in the village millennia ago: everyone knew you, and likely your secrets, through the thin walls of the hut. But if they didn’t act differently, it didn’t matter.

        I was listening to NPR yesterday in Boston, and they were talking about Logan Airport’s screening practices, and involuntary discrimination. It’s something innate in humans to behave differently if we have subconscious biases.

        But as Jonathan Haidt so eloquently explains in The Righteous Mind, our conscious brain is like a lawyer for our moral reasoning, grabbing hold of any cue or clue to defend our reactions after the fact. The issue here is that Big Data might give that internal lawyer a whole bunch of “case law”—seemingly just, reasonable, scientific explanations that are based on predictions about a person, but aren’t accurate.

        • Mordicai says:

          Oh, I’m not trying to let anyone off the hook here– I think the case for institutional racism is clear & present on basically every stratum of society– but rather I’m pointing out what I hope is a path for reconciliation at a broader level. I think people are culpable for their bias, even when it isn’t “on purpose,” yes. I think that privacy, at a top-down level, can be a tool used to alleviate that.

  2. Teller says:

    Good article. Progressive Insurance, linked in Croll’s article, already asks for one’s gender, so obviously this is just about race. I don’t begrudge certain businesses for engaging in risk management – provided it’s not capricious. The service I’d pay for is personal privacy management – as long as I don’t have to submit an application.

  3. glaborous_immolate says:

    Aw, c’mon, OKCUPID can’t tell us controversial stuff? What’s the worst that could happen?

  4. Boundegar says:

    Alistair Croll?  Wasn’t he The Great Beast 666 Metatron?  I was wondering what ever happened to him.

  5. zibuki says:

    This reminds me of Eli Pariser’s excellent cautionary TED presentation:

    Beware Online Filter Bubbles
    http://blog.ted.com/2011/05/02/beware-online-filter-bubbles-eli-pariser-on-ted-com/

  6. ryane says:

    Hadoop and MapReduce! Thanks Apache and Google! Business Intelligence is my field of work, I love this stuff!
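    For readers unfamiliar with the MapReduce pattern the commenter mentions, here is a minimal single-machine sketch in Python (a hypothetical word count; a real Hadoop job distributes these same map, shuffle, and reduce phases across a cluster):

    ```python
    from collections import defaultdict

    def map_phase(records):
        # Map: emit a (word, 1) pair for every word in every record.
        for record in records:
            for word in record.split():
                yield word.lower(), 1

    def shuffle(pairs):
        # Shuffle: group all emitted values by their key.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        # Reduce: combine each key's values (here, by summing counts).
        return {word: sum(counts) for word, counts in groups.items()}

    records = ["big data big insight", "data beats opinion"]
    counts = reduce_phase(shuffle(map_phase(records)))
    # counts["big"] == 2 and counts["data"] == 2
    ```

    The appeal for business intelligence is that the map and reduce functions are pure and independent per key, so the framework can parallelize them over arbitrarily large datasets.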

  7. Sarge Misfit says:

    I’ve always done my best to avoid all that personalization crap. Yes, partly because I avoid giving out my info, but there’s another reason that is important to me.  I like getting odd things. I don’t like some other person or computer deciding what I should be seeing. Surprise me! There’s a ton of really cool and wild stuff out there and I don’t want to miss it because some algorithm decides I might not like it.

  8. SedanChair says:

    I got a banner ad for hair straightener today. Apparently The Internet knows that my hair is kinky, and also knows that it should be straight.

  9. Palomino says:

    I’m surprised that a recent BB article, a perfect example of this, hasn’t been mentioned, not even in the link.  How easily some of us forget.

    http://boingboing.net/2012/02/19/targets-creepy-data-mining-p.html 

  10. I happened to click on an Indochino ad once. Now I see it every goddamn place I go on the internet. I also happened once to look at security cameras after my neighbor had his house burglarized. Now Google thinks I should see security camera ads everywhere I go. This is making advertising stupider and less effective.

  11. vipulvedprakash says:

    Is credit a civil right now?

  12. vipulvedprakash says:

    The story is about a private financial institution adjusting someone’s credit limit.  Amex has the right to use whatever data they have to adjust credit limits.  Consumers have the right to not do business with them if their products are shitty.  This is hardly a civil rights issue.

    Re credit ratings, the current system is busted.  The fact that you have to repeatedly borrow and repay to become creditworthy encourages risky behavior and keeps many otherwise deserving people from getting access to credit (they can’t borrow in the first place, or don’t want to borrow often).  If diverse data were rolled up to create credit ratings, such that ratings were better reflections of risk, more people would be able to access housing and other products that require loans.

Leave a Reply