Facebook's massive psychology experiment likely illegal


Researchers from Facebook, Cornell and UCSF published a paper describing a mass-scale experiment in which Facebook users' pages were manipulated to see if this could induce and spread certain emotional states.

They say it was legal to do this without consent, because Facebook's terms of service require you to give consent for, basically, anything.

But as legal scholar James Grimmelmann points out, there's a federal law that prohibits universities from conducting this kind of experiment without explicit, separate consent (none of this burying-consent-in-the-fine-print bullshit). Two of the three researchers who worked on this were at federally funded universities with institutional review boards, and the project received federal funds.

Facebook says that it manipulates feeds all the time and that this was no different, but when Facebook does that, it is acting as a private company, not collaborating with federally funded researchers on a federally funded project. Besides, Grimmelmann further points out that there was real potential for harm in the study's protocol.

As Grimmelmann says: This is bad, even for Facebook.

Experimental evidence of massive-scale emotional contagion through social networks [Adam D. I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock/Proceedings of the National Academy of Sciences]

As Flies to Wanton Boys [James Grimmelmann/The Laboratorium]

(via Techdirt)

Notable Replies

  1. Schadenfreude, I haz it.

  2. Is this why one day nothing was on my feed besides videos of scruffy looking puppies and Sarah McLachlan's "Arms of an Angel" playing on repeat?

  3. Ygret says:

    What I want to see is a social media app that is owned by the membership and that is controlled by the membership. Each member will have total control of the data they post, period.

    The truth is Facebook is a simple app. A few programmers could cobble together the same functionality (without all the manipulation) in a few weeks. It's a joke. It's past time we create an open source social media app to replace this garbage.

  4. Ygret says:

    If we use our common sense here, instead of our legalese, we can see that emotionally manipulating people and studying their responses without consent is a hugely unethical act. I don't care what the Dr. Mengele wannabes at HHS say. Honestly, if we're going to use the decisions of government to decide what's ethical and what's not, we might as well give up on ethics completely. I mean seriously: starting unnecessary wars that kill hundreds of thousands of innocents; CIA mind control; MKUltra; the Tuskegee experiments; the housing bubble and the government's response to it; etc. The list of government's inhumanity to its own citizenry and to humanity generally is eons long. It's time to stop lawyer-balling everything and start honing our own ethical senses.

  5. Case 1: a bunch of self-proclaimed "social media experts" decide to make some semi-random changes to Facebook. These results are never made public, but people still complain about the fact that Facebook's layout keeps changing.

    Case 2: Facebook consults with academic experts in their fields to perform a focused and limited experiment and then shares the results with the rest of the world.

    Case 1 happens ALL THE TIME and we don't even know it. Case 2 suddenly pops up and people become unhinged.

    Personally, I find Case 2 to be far more ethical, even though Case 1 is well within their legal rights. I think this is a great opportunity to encourage large companies to continue to engage in fairly public research (which is something Facebook has been famous for when it comes to Comp Sci).
