If you're a dominant near-monopolist like Facebook, your first preference is to have no regulation at all — but your close second choice is to have lots of regulation that you can afford, but that potential competitors can't, sparing you the tedious exercise of buying and killing any company that might grow up to compete with you some day.
By an incredible coincidence, all of Facebook's answers to its many privacy and monopoly woes have been the kinds of "fixes" that do little to address the problem but make competing with Facebook nearly impossible. Meanwhile, lawmakers (well-meaning and otherwise) have proposed a raft of solutions (well-informed and otherwise) that sometimes get a little closer to the mark, but often at the expense of allowing new companies or projects to come along and challenge Facebook.
That brings me to the most exciting project to come out of the Electronic Frontier Foundation in recent memory: an ambitious analysis of Facebook's monopoly position and a set of proposed regulatory responses that help alternatives bloom, while heading off Facebook's worst abuses and most urgent dangers.
EFF's proposals fall into two categories: new things that governments and regulators can do to keep the public safe while preserving the open internet, and existing rules that governments should undo because they lock in Facebook's dominance in the name of reining in its conduct.
The proposals come under three headings: real data portability (including all the data that Facebook has nonconsensually gathered on its billions of users); opening up the platform for competitors who directly compete with (and possibly federate with) Facebook; and interoperability via open standards, specifically the W3C's social web protocols.
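To make the third heading concrete: ActivityPub, one of the W3C's social web protocols, has services exchange JSON-LD "activities" over HTTPS, so a post made on one service can show up on any other that speaks the protocol. Here's a minimal sketch of such an activity; the actor and service URLs are hypothetical placeholders, not anything Facebook or EFF has proposed verbatim.

```python
import json

# A hypothetical ActivityPub "Create" activity: one user publishing a
# note that any federated service could receive and display. The
# @context URL is the real ActivityStreams vocabulary; the actor URL
# is an invented example.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://social.example/users/alice",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "type": "Note",
        "content": "Posted on one service, readable on any other that federates.",
    },
}

# In ActivityPub, servers deliver activities like this to each other's
# inbox endpoints as JSON over HTTPS POST requests.
payload = json.dumps(activity)
print(payload)
```

The point of a standard like this is that no single company controls the schema: a competitor can implement it without asking Facebook's permission.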
EFF's Bennett Cyphers and Danny O'Brien go into depth on each of these, setting out concrete suggestions for next steps.
The Cambridge Analytica scandal was a result of Facebook offering extremely powerful APIs to third-party apps. Facebook made it too easy for apps to request data about users and all of their friends, and too easy for users to agree to sharing data without understanding the implications.
In response to the scandal, Facebook has tightened control over its interoperable tools across the board and removed some of the more problematic APIs altogether. However, the scandal has also given the company an excuse to make life more difficult for would-be innovators. We must disentangle the two if we're going to reduce Facebook's power.
Currently, the "platform policy" that Facebook requires developers to agree to in order to use its APIs is designed to protect Facebook's interests as much as, if not more than, its users'. For example, Section 4.2 prevents offering "experiences that change the way Facebook looks and functions." This explicitly prevents app developers from trying to improve the UI, or even allowing users to customize it for themselves. Other clauses, like "respect the limits we've placed on Facebook functionality," similarly reflect Facebook's desire to maintain tight control over the ways its users interact with their data in the platform.
Furthermore, Section 4.1 states, "Don't replicate core functionality that Facebook already provides." This gives the company grounds to reject any competitive social network that would federate its service with Facebook.
App review is an important practice, and Facebook should continue working to prevent malicious developers from leveraging its platform to harm users. However, the company should allow others to build on and differ from what it has created in meaningful ways. A platform as vast and powerful as Facebook should be a jumping-off point for innovators, not a means for the company to impose a single experience on everyone in its network.
Facing Facebook: Data Portability and Interoperability Are Anti-Monopoly Medicine
[Bennett Cyphers and Danny O'Brien/EFF]