Ben Wiseman, FTC

A top FTC official on the consumer privacy message the agency is sending to industry

Under Chair Lina Khan, the Federal Trade Commission (FTC) has been far more aggressive in its pursuit of data privacy cases, including unprecedented enforcement actions against data brokers that sell geolocation information. The agency also has taken action against health care companies for sharing customer data without consent. 

Khan’s FTC has forced companies to eliminate entire business models related to data and has banned businesses from using products they have built from unlawfully collected data. The agency is now working to implement a new rule laying out marketwide restrictions for commercial surveillance — or how companies collect, analyze and monetize consumer data.

One of the top officials behind the agency’s newly muscular stance on consumer data privacy is Ben Wiseman, associate director for the Division of Privacy and Identity Protection.

Wiseman spoke to Recorded Future News about how the FTC is thinking about its work on the issues embedded in the potential new commercial surveillance rule; about why the connected-car industry should take note of a recent agency warning on its data collection and sharing practices; and about why he believes industry is closely watching and adapting to the agency’s enforcement actions.

This conversation has been edited for length and clarity.

RECORDED FUTURE NEWS: You're close to releasing a new proposed rule governing commercial surveillance. It has been reported that it will focus on data security and data minimization as well as algorithmic accountability and civil rights. Why is this rule needed now?

BEN WISEMAN: I can’t speak to any specifics. I can more broadly speak about the increased rulemaking over the past several years and I think it reflects two things. One, the Supreme Court in the AMG decision [in 2021] took away the commission's most effective tool to return money for consumers. So now, absent a rule violation, it's much more difficult to obtain redress for consumers in our cases. One reason that the commission, I believe, is moving to create rules in certain segments is to restore another avenue for us to return money to consumers. 

The second reason is I think there is this recognition that in some cases, case-by-case enforcement alone might not be enough to address these broader marketwide problems that we're seeing. 

RFN: Even if you can't discuss the specifics of the [potential commercial surveillance] rule, your recent orders and public statements make it clear that the agency is focused on data minimization and algorithmic accountability. Why have these issues been priorities?

BW: Over the past several years you've heard many of the commissioners talk about some of the real challenges of the commercial surveillance business model and the real limitations of the current notice and choice regime. In particular, notice [to consumers] is a fiction if it means having to sift through hundreds of thousands of pages of privacy policies. Consent cannot be meaningful when consumers don't have the information to make real choices and are forced to live their lives online as the digital economy becomes more and more entrenched in our everyday lives, including our lives at work. 

And then with respect to commercial surveillance, we've seen over the past several decades a business model that has incentivized collecting lots and lots of information from consumers. That has led to harms ranging from scams to the harms we've identified in some of our recent cases involving precise location data, such as the ability to identify people and their visits to highly sensitive locations like medical facilities and places of religious practice. It has also created broader systemic risk of data breaches and other cyber incidents. 

That is the ecosystem where you've seen recent commission actions really try, in our orders, to provide substantive protections for consumers' privacy. You’ve seen bans and prohibitions on specific practices, recognizing some of the challenges of notice and consent, and a focus on those substantive protections, really encouraging and requiring companies to minimize the data that they collect in the first place.

RFN: What lessons have you learned from this commercial surveillance rulemaking process? Obviously, it's taking a long time and especially if a Trump administration comes in, what could that mean for the effort?

BW: We're just going about our work. We received about 11,000 comments on the rule and our staff are closely reviewing them. 

RFN: How are you thinking about the proposed America Privacy Rights Act and the new enforcement authorities that it would give the agency?

BW: The commission has for a while supported comprehensive privacy legislation. Providing baseline protections for all consumers’ privacy would be a real positive development, both in the privacy space and as we're seeing new AI business models continue the trend of collection and overcollection of data. 

RFN: The agency recently issued its first public comment since 2018 on the enforcement of data privacy surrounding cars. I know you can't disclose whether there are any pending enforcement actions or specific investigations underway. But can you talk about why the agency is speaking up on the matter now?

BW: The blog post that came out [on May 14] speaks to some of the concerns we're seeing and also highlights the signals from our recent enforcement actions: that geolocation data is going to be sensitive, and that using sensitive data for automated decisions can be unlawful. What we put out in that post highlights our concerns in this space.

RFN: A privacy advocate said to me that the FTC doesn’t just think of this as a casual blog post — they said that industry really watches these and sees them as setting guidelines for what kind of behavior the agency expects. Do you agree with that? Do you expect that industry is taking note?

BW: I do. When we put out messages to the public we do so intentionally to signal where we have concerns with current practices in the marketplace. 

RFN: The FCC also has recently taken an interest in car data privacy, and you hold standing meetings with them to set priorities and collaborate. Can you confirm that you have those standing meetings and, beyond that, characterize any broad discussions you've had with the FCC about how it might regulate location data gathered by car companies?

BW: We have collaborative relationships with agencies across the federal government. We often are engaged in enforcement actions together, as we did with the CFPB [Consumer Financial Protection Bureau] last year on our TransUnion matter. We make sure to work closely with other agencies, and part of that is ongoing dialogue about what we're seeing in various markets and how firms are reacting to new technologies.

RFN: Can you say more about the FTC’s work with the CFPB on the TransUnion enforcement?

BW: Last year, we brought a very significant action against TransUnion, along with the CFPB, involving some of their … screening practices [for housing rentals], and charged them with violations of the Fair Credit Reporting Act. 

RFN: The FTC recently warned industry that it will go after companies that quietly change privacy policies to feed AI models. I'd love to hear more about that. I also wonder how you think about companies that publicly announce they're using user posts or materials for AI, but in doing so depart significantly from the business model their customers would have expected when they signed up for the service.

BW: You're highlighting some guidance that was on our website. We're flagging as a concerning trend companies changing their terms of service to allow the use or sale of data for machine learning or other AI purposes. And what we're doing is highlighting the cases we’ve brought in this space. We’ve brought a number of cases where companies were alleged to have violated the [FTC Act’s prohibitions of “unfair or deceptive acts or practices”] when they retroactively changed their privacy policies. What we're doing there is what we talked about earlier — providing guidance through our website and highlighting concerns we see in the marketplace as well as past actions we've brought.

RFN: What are some of the past actions in this context?

BW: Look to the Vitagene case that the FTC brought last year. Vitagene [since renamed 1Health.io] was a genetic testing company, and we alleged that it retroactively changed its privacy policy to expand the kinds of parties it would share consumers’ personal data [with]. The company did that without notifying consumers who had previously shared their personal data, and it didn't obtain any consent. This is genetic data that is highly sensitive. Look at that case and the allegations the FTC made there that that type of conduct was unlawful.

RFN: You mentioned notice and consent being broken. How should the concept be reformulated?

BW: What we're trying to do as a law enforcement agency is ensure that in the cases we bring, we're seeking and providing substantive protections for consumers' data. … Look at some of our recent work on precise geolocation — two cases against a data broker and a data aggregator. We alleged in our X-Mode case, for example, that the data broker was unlawfully selling precise geolocation information that included visits to sensitive locations like medical facilities, religious facilities, places where children visit, and homeless and domestic abuse shelters. 

We alleged that that was unfair, and with the relief we obtained in that case and in the case we brought against a data aggregator over similar allegations, we banned the practice entirely, so the companies were prohibited from disclosing precise geolocation information. That is one way we are focused on, in our orders, addressing the harmful conduct that is happening and also providing these substantive protections for consumers. 

RFN: Algorithmic disgorgement is a tool that this commission has used more than prior ones. Can you explain what that means and how often it's used in your settlements? Does it have a deterrent effect?

BW: Speaking of using every tool we have: in, I believe, about eight cases now, we have required companies not just to delete data that was unlawfully obtained or unlawfully used, but also to delete the data products that were created with that data. Two things are at work here. One goes back to trying to address the incentives that led to the harmful conduct in the first place: the incentives in some of these cases to collect as much information as possible and then use that information to create data products. 

That's why you've also seen minimization provisions in our data security cases and our privacy cases. We've had about 17 cases now with specific provisions designed to require companies to minimize the data they collect in the first place, because the less data you have to begin with, the less harm can come from a cyber incident or a breach. And it's also just a reflection of the marketplace and what we're seeing in how companies are using data that we have alleged was unlawfully collected or unlawfully used.

Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.