Yale Privacy Lab and Exodus Privacy's devastating report on the dozens of invasive, dangerous "trackers" hidden in common Android apps was produced by writing code that monitored target devices' internal operations, uncovering all manner of sneaky trickery.
It would be great if we had effective regulatory oversight and the power to seek legal relief from these companies for lying to us and/or sneaking spyware into our lives; but every bit as important is the right to independently audit their actions (as Privacy Lab and Exodus have done) and to install code that overrides the undesirable functions of this spyware -- for example, by blocking its communications or chaffing it with plausible garbage data.
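The blocking half of that self-help idea is straightforward in principle: intercept outgoing requests and refuse any bound for a known tracker endpoint, the way hosts-file and DNS-based blockers already do. A minimal sketch, with hypothetical blocklist entries for illustration:

```python
# Minimal sketch of domain-based tracker blocking, the technique used by
# hosts-file and DNS blockers. The domains below are made up for illustration.

TRACKER_DOMAINS = {
    "tracking.example-analytics.com",
    "sdk.example-ads.net",
}

def should_block(hostname: str) -> bool:
    """Return True if an outgoing request targets a listed tracker domain
    or any of its subdomains."""
    return hostname in TRACKER_DOMAINS or any(
        hostname.endswith("." + d) for d in TRACKER_DOMAINS
    )

print(should_block("sdk.example-ads.net"))                  # blocked
print(should_block("cdn.tracking.example-analytics.com"))   # subdomain, blocked
print(should_block("api.example.org"))                      # allowed
```

Real-world blockers work the same way but hook in at the DNS resolver or the hosts file -- which is exactly the kind of override that requires control over your own device.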
The Exodus Privacy app's functionality is key to attaining the first goal, gathering independent evidence about the conduct of mobile firms and app providers. Without that evidentiary basis, there's no way to know you need self-help measures, nor is there any way to convince regulators to take action, nor is there the possibility of creating public clamour for competing products that would spur investors and entrepreneurs to make tools that let you reclaim control over your device.
As Exodus and Yale note, these trackers are almost certainly also present in iOS: the companies that make them advertise their iOS compatibility, for one thing. But iOS is DRM-locked, and it's a felony -- punishable by a 5-year prison sentence and a $500,000 fine for a first offense in the USA under DMCA 1201, and by similar provisions of Article 6 of the EUCD in France, where Exodus is located -- to distribute tools that bypass this DRM, even for the essential work of discovering whether billions of people are at risk due to covert spying from the platform.
It's true that the US Copyright Office granted a soon-to-expire exemption to this rule that took effect in 2016, but that exemption only allows Exodus to use such a tool; it doesn't allow Exodus to make the tool, or to distribute it so independent researchers can investigate iOS.
Capabilities of the trackers uncovered by Exodus include targeting users based on third-party data, identifying offline movement through machine learning, tracking behavior across devices, uniquely identifying and correlating users, and targeting users who abandon shopping carts. Most trackers work by deriving an identification code from your mobile device or web browser and sharing it with third parties to more specifically profile you. App makers can even tie data collected from trackers with their own profiles of individuals, including names and account details. Some tracking companies say they anonymize data, and have strict rules against sharing publicly identifiable information, but the sheer wealth of data collected can make it possible to identify users even in the face of such safeguards.
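The identification technique described above -- deriving a stable code from device or browser attributes -- can be sketched in a few lines. This is an illustrative reconstruction of the general approach, not any specific tracker's code; the attribute names are assumptions:

```python
import hashlib

def device_fingerprint(attrs: dict) -> str:
    """Derive a stable identifier by hashing device attributes.

    This illustrates the general fingerprinting technique the article
    describes: sort the attributes so ordering doesn't matter, join them
    into a canonical string, and hash. Any app embedding the same tracker
    SDK derives the same code, letting third parties correlate the user
    across apps and devices.
    """
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# The same attributes always yield the same ID, regardless of which app
# computed it -- that stability is what makes cross-app profiling possible.
a = device_fingerprint({"model": "Pixel 2", "os": "8.1", "carrier": "ExampleTel"})
b = device_fingerprint({"carrier": "ExampleTel", "os": "8.1", "model": "Pixel 2"})
print(a == b)  # same device, same code
```

This also shows why "chaffing" with plausible garbage data works as a countermeasure: feed the tracker randomized attributes and the derived code stops being stable.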
“How many people actually know that these trackers are even there?” said Michael Kwet, another visiting fellow at Yale Privacy Lab. “Exodus had to create this software to even detect that they were in there.”
A few of the trackers offer users the option to opt out via email or through their privacy settings. But tracking can resume even after this step is taken. For example, one app requires that users who clear their cache set up the opt-out again. Some opt-outs are temporary. Even if the opt-outs do end up being permanent, few users would even know to activate them in the first place.
Yale Privacy Lab - Tracker Profiles [GitHub]
Staggering Variety of Clandestine Trackers Found in Popular Android Apps [Yael Grauer/The Intercept]