The measure is billed as an anti-fraud system, and Apple claims that its surveillance is "designed so Apple cannot learn the real values on your device."
Apple doesn't provide any details on how this works, but the company has previously deployed a privacy measure called "differential privacy," which allows for some aggregate data-gathering and analysis while theoretically protecting the subjects' privacy. However, Apple's differential-privacy implementation was fatally flawed, a fact that was slow to come to light in part because of the company's notorious secrecy and its hostility to independent repair and to unauthorized analysis of its security measures.
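To give a sense of what "aggregate analysis without learning real values" means, here is a minimal sketch of randomized response, a classic local-differential-privacy technique. This is an illustration only, not Apple's actual (unpublished) mechanism; all names and the `p_truth` parameter are hypothetical:

```python
import random

def randomized_response(true_bit: bool, p_truth: float = 0.75) -> bool:
    """Report the user's true bit with probability p_truth,
    otherwise report a coin flip. The collector only ever sees
    the noisy report, never the real value (illustrative sketch)."""
    if random.random() < p_truth:
        return true_bit
    return random.random() < 0.5

def estimate_population_rate(reports, p_truth: float = 0.75) -> float:
    """Invert the noise to recover the aggregate rate:
    E[report] = p_truth * rate + (1 - p_truth) * 0.5, solved for rate."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

random.seed(1)
true_bits = [i < 300 for i in range(1000)]  # 30% of users have some trait
reports = [randomized_response(b) for b in true_bits]
print(round(estimate_population_rate(reports), 2))  # close to 0.3
```

No individual report reveals that user's real value, yet the population-level statistic is recoverable, which is the property Apple's claim appeals to. Whether its real implementation delivers that property is exactly what independent analysis would be needed to verify.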
Apple's locked-down systems are often a useful line of defense against fraud, theft and surveillance, but as the company's record in China shows, this control is a double-edged sword. By locking its iPhones to its App Store, and then capitulating to the Chinese government by banning secure VPNs from the Chinese App Store, Apple has made Chinese mass surveillance and retaliation against political dissidents much easier, and made evading surveillance and retaliation much harder.
The provision, first spotted by VentureBeat, appears in an update to the iTunes Store and Privacy page and comes ahead of the release of the iPhone XS and iPhone XS Max on Friday, 21 September.
"To help identify and prevent fraud, information about how you use your device, including the approximate number of phone calls or emails you send and receive, will be used to compute a device trust score when you attempt a purchase," the page reads.
"The submissions are designed so Apple cannot learn the real values on your device. The scores are stored for a fixed time on our servers."
Apple is quietly giving people 'trust scores' based on their iPhone data [Anthony Cuthbertson/The Independent]
(Image: Cryteria, CC-BY)