FTC says it will go after companies that ‘quietly’ change privacy policies to mine user data for AI
The Federal Trade Commission (FTC) warned Tuesday that it will closely police how artificial intelligence firms and other companies handle their terms of service and will pursue those that “quietly” change them in order to exploit user data to enhance their AI tools.
Calling data the “new oil,” a blog post from the FTC’s Division of Privacy and Identity Protection noted that businesses collect vast amounts of it, particularly AI companies, which the post described as having a “continuous appetite for more and newer data.”
The blog post noted that companies may be tempted to mine their own user base for pre-existing data to feed AI models despite having privacy and security policies on the books prohibiting the practice.
Calling companies’ potential dilemma a conflict of interest, the FTC said there are “powerful business incentives” to use existing user data to bolster AI products.
“Market participants should be on notice that any firm that reneges on its user privacy commitments risks running afoul of the law,” the blog post said.
The post said the FTC will specifically pursue cases against companies that “adopt more permissive data practices,” for example, by beginning to “share consumers’ data with third parties or using that data for AI training — and to only inform consumers of this change through a surreptitious, retroactive amendment to its terms of service or privacy policy.”
The FTC has previously gone after companies that changed their privacy policies in ways that surreptitiously undermined existing promises made to consumers.
The blog post noted that almost 20 years ago, the FTC charged the company behind “Hooked on Phonics” with failing to alert consumers or obtain their consent after it changed its privacy policy to allow the sharing of consumer data with third parties.
Last summer, the FTC alleged that the genetic testing company 1Health broke the law by altering its privacy policy to retroactively expand the types of third parties with which it shared consumer data. The company failed to alert consumers or obtain their consent before doing so, the FTC alleged.
The company settled with the FTC, agreeing to begin instructing “third-party contract laboratories” to destroy all consumer DNA samples stored for more than 180 days.
“The FTC will continue to bring actions against companies that engage in unfair or deceptive practices — including those that try to switch up the ‘rules of the game’ on consumers by surreptitiously re-writing their privacy policies or terms of service to allow themselves free rein to use consumer data for product development,” the blog post said.
“Ultimately, there’s nothing intelligent about obtaining artificial consent,” it added.
Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career, Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.