Offering users transparency and privacy is the only way Big Tech can avoid being turned into content cops

Dan Gillmor's got an excellent point about tech platforms: the more they act as technological regulators of what we see (the more they spy on us and filter-bubble us), the more they're going to face calls to be political regulators of what we see.

The more they assert the right and the capability to uprank and downrank the things we pay attention to, the harder it will be for them to tell states and other entities that it would be technologically impossible and morally reckless to order them to shape the news to suit someone else's needs.


The more platforms are designed to let users decide what they see and who they interact with, even if the platform doesn't like those choices, the harder it will be to argue that they should be involuntarily deputized to serve as sheriffs for the whole net.

This is a realization that the platforms have been coming to for many years, with a variety of reactions. Gillmor proposes a dashboard of user controls; from his piece:


So what else should go into a dashboard? Here are just a few of the items I'd suggest to improve the actual news and community information content:

1) A filter or no-filter button on Facebook. Give me the feed of whom and what I follow, in reverse chronological order (newest first), or a curated feed. Google should offer a non-curated search that doesn't also require me to sign out.

2) A selection of ways to alter the curated feed. The platforms should give me the ability to prioritize not according to how they interpret what I do, but how I tell them to prioritize. (One alternative, of course, should be to just take their suggestions — and as we all know, the default usually wins. That's a bug, not a feature.)

3) Community-driven filtering. Give me a way to organize with others to create streams of information that we decide are useful. Yes, this could end up creating even worse filter bubbles, but that way of seeing the world — however shallow and narrow-minded — should be an option.

4) Filter bubble fixers. Give me a setting that will specifically put items in my feed that I know I'll disagree with, or that reflect worldviews different from mine. On Twitter, I specifically follow people who make my blood boil or see the world differently. Search and social algorithms need to include these capabilities. (A rough sketch of how settings like these could fit together appears after this list.)
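
None of these are exotic asks from an engineering standpoint. As a purely illustrative sketch (in Python, with made-up names and structures; this is not any platform's actual API), here's roughly what the control surface behind items 1, 2 and 4 could look like: the user explicitly picks between a reverse-chronological and a ranked feed, and can ask for a chosen share of items from outside their usual bubble.

```python
# Toy sketch of a user-controlled feed -- hypothetical names, not any platform's real API.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional
import random


@dataclass
class Item:
    author: str
    posted_at: datetime
    engagement_score: float  # whatever the platform would normally rank on
    outside_my_bubble: bool  # e.g. from sources the user doesn't follow or agree with


@dataclass
class FeedPrefs:
    mode: str = "chronological"       # "chronological" or "ranked": item 1's toggle
    contrarian_fraction: float = 0.0  # item 4: share of the feed drawn from outside the bubble


def build_feed(items: list[Item], prefs: FeedPrefs,
               rng: Optional[random.Random] = None) -> list[Item]:
    """Assemble a feed according to the user's stated preferences, not inferred ones."""
    rng = rng or random.Random()
    inside = [i for i in items if not i.outside_my_bubble]
    outside = [i for i in items if i.outside_my_bubble]

    if prefs.mode == "chronological":
        feed = sorted(inside, key=lambda i: i.posted_at, reverse=True)  # newest first
    else:
        feed = sorted(inside, key=lambda i: i.engagement_score, reverse=True)

    # Item 4: deliberately mix in a user-chosen share of filter-bubble-breaking items.
    n_contrarian = int(len(feed) * prefs.contrarian_fraction)
    if n_contrarian and outside:
        for pick in rng.sample(outside, min(n_contrarian, len(outside))):
            feed.insert(rng.randrange(len(feed) + 1), pick)
    return feed
```

The point is the shape of the controls, not the code: mode and contrarian_fraction are set by the user, and the platform's own ranking becomes one option among several rather than the unavoidable default.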

Why tech platforms should give users more control — and how they can do it [Dan Gillmor/Medium]