Facebook provides a suite of turnkey app-building tools for Android that are widely used among the most popular Google Play apps, with billions of combined installs; naturally, these tools create incredibly data-hungry defaults in the apps that incorporate them, so that even before you do anything with an app, it has already snaffled up a titanic amount of data, tied it into your Google Ad ID (which is recycled by Facebook to join up data from different sources) and sent it to Facebook.
Needless to say, the GDPR made these practices radioactively illegal, but despite two years' warning before the GDPR came into effect last spring, Facebook dragged its feet for another six months before updating its tools, and even now those updates still haven't propagated to all the apps in Google Play.
The data harvested from phones — including, for example, which Bible verses you read using a King James Bible app, and which searches you made on Kayak — is added to your "shadow profile", and no one (outside of Facebook) knows for sure how that's used.
You can practice a little self-defense, but it's cumbersome: root your phone and you can block all network traffic to *.facebook.com; you can also reset your Ad ID periodically, which disaggregates the data coming off your phone. I've had a poke around but can't find a tool that resets the Ad ID every 10 seconds — please leave a comment if you know of one.
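For the curious, here's a rough sketch of the hosts-file approach to the first trick. On a rooted Android phone you can edit the system hosts file to null-route Facebook's hostnames; the domain list below is illustrative, not exhaustive (and a plain hosts file can't express a wildcard like *.facebook.com, so a real setup would enumerate subdomains or use a firewall app instead):

```python
# Sketch: generate hosts-file entries that null-route Facebook hostnames.
# On a rooted phone these lines could be appended to /system/etc/hosts.
# The domain list is illustrative only -- the SDK talks to several
# Facebook-owned domains, graph.facebook.com among them.

FACEBOOK_DOMAINS = [
    "facebook.com",
    "graph.facebook.com",
    "connect.facebook.net",
    "fbcdn.net",
]

def hosts_entries(domains, sinkhole="0.0.0.0"):
    """Return hosts-file lines pointing each domain (and its www
    variant) at a non-routable address, so lookups go nowhere."""
    lines = []
    for d in domains:
        lines.append(f"{sinkhole} {d}")
        lines.append(f"{sinkhole} www.{d}")
    return "\n".join(lines)

if __name__ == "__main__":
    print(hosts_entries(FACEBOOK_DOMAINS))
```

Note the big limitation: hosts entries only catch the exact names you list, which is why firewall-based blocking (e.g. iptables rules on a rooted phone) is the more thorough route.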
Frederike Kaltheuner and Christopher Weatherhead from Privacy International gave an outstanding talk on the subject at the Chaos Communication Congress in Leipzig last month; an accompanying paper gives more detail, including methods.
Kaltheuner and Weatherhead were able to gain insight into the apps' behavior by rooting an Android phone and installing a man-in-the-middle proxy that used forged certificates to intercept and decrypt data on its way to Facebook. Ominously, none of the apps they tested used certificate pinning (let alone certificate transparency) to detect/prevent this kind of man-in-the-middle activity.
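To make the missing safeguard concrete, here's a minimal sketch of the pinning check the tested apps lacked. It assumes the common approach of pinning a SHA-256 hash of the server's public key, in the "sha256/&lt;base64&gt;" format popularized by OkHttp's CertificatePinner on Android (the helper names here are my own, not from any SDK):

```python
# Sketch of a certificate-pinning check, assuming the app ships with the
# SHA-256 hash of the legitimate server's public key. A MITM proxy's
# forged certificate carries a different public key, so its hash won't
# match the pinned set and the connection is refused.

import base64
import hashlib

def spki_pin(spki_der: bytes) -> str:
    """Return the pin string for a DER-encoded SubjectPublicKeyInfo."""
    digest = hashlib.sha256(spki_der).digest()
    return "sha256/" + base64.b64encode(digest).decode("ascii")

def pin_matches(spki_der: bytes, pinned_hashes: set) -> bool:
    """True only if the key presented during the TLS handshake matches
    one of the hashes baked into the app at build time."""
    return spki_pin(spki_der) in pinned_hashes
```

Because a forged certificate has to use the proxy's own key pair, even one installed via a user-added root CA fails this check — which is exactly why the researchers' interception worked: none of the apps bothered.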
It's not clear whether the same conduct is present in apps in Apple's App Store; Apple uses unique Ad IDs similar to Google/Android's, and they could be exploited in the same way. However, Apple's DRM is designed to make this kind of research much harder. I hope the Privacy International researchers take a crack at it: perhaps they could use the simulated, cloud-based iOS devices used for developer testing.
(via Bruce Sterling)