VidAngel is the latest attempt (along with services like ClearPlay and Sony's own filtering tool) to sell a product that allows cringing, easily triggered evangelicals to skip swear words, sex and blasphemy in the media they watch.
These tools re-emerge periodically, and every time they do, artists come out to decry them as violations of copyright or moral rights, or both (and VidAngel is no exception).
This is silly and worse than silly: it's dangerous.
The right to decide how media shows up on the screen you own is sacrosanct, for the same reason that you have the right to skip boring parts of books, or fast-forward through ads, or dial the contrast up on hard-to-read grey text to pure black, or change the fonts in your ebooks, or block ads or third-party trackers, or turn down the volume when the super-loud music plays in a movie's climactic action scene, or just re-read your favorite passages of a book (or skip to the end of a book to see if your guess about the ending is right).
Furthermore, the right to make an index of a work is also sacrosanct: indexing all the scenes with your favorite characters, the parts of a climate-denial text that are most risible, or all the scenes that annoy you (or your enemies) are all activities that are key to free expression and the enjoyment of the work.
Also: the right to share and collaborate on those indexes is sacrosanct.
Finally: the right to make a tool to help people use indexes to control the playback of media is sacrosanct. What's the point of having the right to block ads if you can't get an ad blocker?
None of this is to say that the choices people make about their media consumption should be exempt from criticism. You might choose to skip scenes that are vital, you might choose to mute the most important dialog, you might choose to block the parts of a web page that would make it more legible and easier to read.
But dumb choices are yours to make, and using copyright to force other people to use media in the way you want is an abuse of copyright.
I made all these points two years ago when Clean Reader made a bid to do this for ebooks.
I renew them now that VidAngel is in the news. In Wired, Emma Grey Ellis has made a lot of excellent points about why using a service like VidAngel is unhelpful, but misses this important point: even if it's a dumb choice to make, it's your dumb choice to make.
Filtering objectionable content out of a tidal wave of posts requires a legion of humans or an algorithm trained by a legion of humans. Either way, how those humans see the world dictates how they interpret "objectionable." "It's difficult to apply global standards to subjective information," says Kate Klonick, a lawyer at Yale who studies private platform moderation of online speech. Hence the outrage over YouTube marking innocuous videos by LGBTQ content creators as not family friendly, or Facebook and Instagram removing photos of mothers breastfeeding. And the equal but opposite backlash against Facebook for (eventually) not removing photos of gay people kissing.
Because social media users are so politically polarized, the perspectives and biases of filterers—who, ultimately, are deciding what your world looks like—matter even more. When I asked Harmon about the workers who apply the filtering tags VidAngel customers can choose from, he said, "To be candid, Christians are probably overrepresented, but I couldn't really even say. We have not done proper research on who our taggers are, other than they're in the United States and they're geographically diverse."
FILTERING YOUR WORLD IS UNDERSTANDABLE—BUT IT'S NOT HELPFUL
[Emma Grey Ellis/Wired]