The head of the California Privacy Protection Agency on the future of data privacy regulation
Tom Kemp took over as the head of the trailblazing California Privacy Protection Agency (CPPA) in April. Established in 2020, the agency enforces the country’s strongest state privacy legislation, the California Consumer Privacy Act (CCPA), and other cutting-edge privacy laws.
California is seen as an incubator for privacy regulation nationally, giving Kemp a pivotal role in setting the privacy agenda for the country. A longtime advocate who has warned of the dangers posed by data brokers, and the official chiefly responsible for implementing California’s landmark Delete Act, Kemp now holds a post with major implications for the future of an industry that has most recently come under fire for contributing to the slayings of a Minnesota lawmaker and her husband.
Kemp spoke with Recorded Future News about why he believes data brokers are dangerous, whether forthcoming federal privacy legislation is likely to wipe out California’s pioneering privacy law and why a controversial package of automated decision-making and cybersecurity rules is more significant than critics believe.
This conversation has been edited for length and clarity.
Recorded Future News: You've been a longtime leader in the movement to regulate data brokers. After a Minnesota lawmaker and her husband were killed last month, it emerged that the alleged murderer had a list of data brokers in his car, along with names and addresses for dozens of targets. What does that case tell you about the risks data brokers pose?
Tom Kemp: The fundamental issue is that there are hundreds, if not thousands, of data brokers out there that do not have a direct relationship with the actual consumer, whose business model is to collect and sell information, specifically personal information, about people, and they do it at such a large scale. Consumers would literally have to spend hundreds of hours of their own time to make deletion requests to each and every one of those [brokers to have their information removed from the web].
RFN: Do you think that data brokers’ wares are actually leading to violence even more regularly than we may know?
TK: This is a societal problem where there is just all this personal information about us that is accessible, and so what we're trying to do [under the California Delete Act] is provide tools at scale for consumers who are so inclined or who fear the misuse of their personal information, enabling them to do deletions through a portal that we're building as part of the Delete Act [which will require all data brokers to remove consumer information based on a single request] … One of the big goals is to operationalize privacy rights at scale.
RFN: What would you say to those who argue that what data brokers sell is public record, and that in the Minnesota case the killer could simply have gone to a local courthouse to find that information, so data brokers’ role is negligible?
TK: There's a fundamental difference between having to drive to a courthouse, make a formal request, maybe pay 50 cents or $1 for a Xerox copy of something, have that logged and jump through the hoops, as opposed to simply typing someone's name into Google, clicking and immediately seeing some of this information. … And when you actually buy a record of an individual, it not only says this is Tom Kemp's address, his phone number, etc., but here's the phone number and address of his parents, his brother.
These huge digital dossiers are provided, and when it comes to more vulnerable members of society, like elderly folks, this enables fraudsters to reach them and say, "Oh, I'm calling on behalf of your cousin or your granddaughter, and please wire me information."
RFN: It's going beyond just people-search sites giving relatives’ names. It extends to large data broker companies taking a bunch of data from disparate sources that they can then package to sell.
TK: You could take a public record and then you could take additional information and combine it. One example of additional information is, say, that there's a data breach that occurs … for example, the Sutter Health breach here in California also revealed medical conditions associated with [individuals]. … You can combine the public records with the hacked information, and not only can you tell that this individual lives at this address and this is the phone number of their mom, but you can also know through the hacked information what medical conditions the consumer has. It’s the combination of information, and businesses are doing this at scale. They're doing this massively.
RFN: Have you heard anything about a renewed push to do more at the state level to rein in brokers after the Minnesota killings?
TK: We've actually seen here in the [California] legislature that after this assassination occurred, Assemblymember Rebecca Bauer-Kahan introduced basically the equivalent of Daniel's Law in New Jersey [which requires personal information belonging to law enforcement and judges to be removed from the web within 72 hours upon request]. The way that it's written right now, it’s for judges and lawmakers.
RFN: You mentioned the Delete Act. Enforcement of the provision of that law requiring data brokers to erase data that consumers have asked to be taken down begins in August 2026. Fines for failing to delete information once it has been requested are fairly large. How will this change the landscape in California?
TK: If you are a data broker and you don't start deleting from August 2026, it is $200 per incident. If it turns out that a broker has a very vast database of California consumers, and those California consumers register [to have data brokers delete their information], the fines can add up. Say that there's a million people that have registered … you do a million times 200 and the number is very large, and that's where things really kick in.
RFN: Are there other measures you would like to see in California relating to data privacy enforcement?
TK: We certainly think that whistleblower protections [are needed] as they relate to privacy. An insider may see the company that they work for doing something that's against the law. We would like to have protections for those types of whistleblowers. That is an example of legislation that we would like to see because it could better facilitate our enforcement and could also potentially lead to bigger fines, because it's much better to have someone on the inside tell us "Here's the scale of the privacy harms," as opposed to us having to methodically find it, research it, prove it.
RFN: Texas recently sued Allstate over its data broker subsidiary Arity's alleged sale of driver behavior data to insurers. The attorney general there also won a $1.375 billion settlement with Google over data privacy violations. Would you like to see the CPPA eventually get to the level of ambition and fines that Texas has achieved?
TK: We're not chasing after fines for the sake of the size of the fines. We’re very much focused on addressing real world privacy harms that are impacting people.
RFN: At the federal level, a congressional working group is now drafting comprehensive data privacy legislation, and many of the industry stakeholders they're working with have historically pushed for federal legislation to preempt state laws. If this happens, California's privacy laws would be wiped out by whatever Congress comes up with at the federal level. How do you see this fight playing out this time?
TK: I think we're going to probably see something comparable to what happened with artificial intelligence [the proposed congressional moratorium on state AI laws being killed in Congress]. It dawned on a lot of people on both sides that there does need to be the ability for states to innovate. Much like artificial intelligence, the technology [regarding data privacy] is so rapidly changing that you don’t want to set a law in amber and not be able to have it updated.
RFN: But given that Republicans control Congress and have long supported preemption, do you have any concern about this?
TK: It's definitely possible.
RFN: Tell me about the cybersecurity audits the CPPA has proposed a rule for.
TK: Unlike New York’s cybersecurity requirements, which apply only to financial services companies and focus on a narrow subset of personal information, [California’s proposed cybersecurity audit rule] applies to all businesses. We've really focused a lot on aligning it with prominent cybersecurity frameworks such as the NIST Cybersecurity Framework and the Center for Internet Security controls. This is a huge win: California will be the state with the most robust set of requirements for independent audits of how businesses use and protect personal information.
RFN: Privacy advocates have been disappointed in how the CPPA has, in their words, watered down proposed rules to regulate automated decision-making technology [ADMT], which typically refers to using artificial intelligence algorithms to make decisions. At a recent CPPA board meeting, it emerged that only 10% of California businesses will be regulated under the latest version of the ADMT rules proposed by the agency.
TK: That 10% figure is kind of misleading because of the removal of behavioral advertising. Yes, the rulemaking as it relates to automated decisions did decrease significantly, but the vast majority of organizations are not themselves doing behavioral advertising. They leverage Google or Facebook to facilitate that.
This rules package is not just about automated decision-making, which a lot of companies are not yet doing. The AI and automated decision revolution has just started, but the rules package also applies to risk assessments and cybersecurity, which apply to all organizations.
RFN: I take your point, but the definition [for how to determine when ADMT is used] was narrowed. Privacy advocates would also say that the proposed rules allowing companies to self-certify are a problem. They take issue with the risk assessments, saying that businesses aren't obligated to share the content of the assessments they conduct with the CPPA. So would you acknowledge that in some ways the proposed rule has been weakened?
TK: We're writing regulations based on the law that was passed by the voters in 2020, and the word AI does not show up in that [law], so we are writing regulations that reflect the will of the voters in 2020. The good news is the legislature here in California has a bunch of new laws coming out that will further empower us and other agencies to do more.
RFN: Why were the proposed rules more expansive before if the statute is a limiting factor? They've been narrowed from what was originally proposed.
TK: The key thing is that at the end of the day the regulations are approved by the board. I personally can't comment on why decisions were made a year or two ago to put certain language in. The public comment came back and said, "Hey, we think you're maybe going beyond the mandate that you have." Going through that process and listening to people, we've evolved. We're very much trying to make sure that we're aligned with the legislature, the governor and other key stakeholders.
RFN: A lot of people in the privacy community would say that CPPA bowed to pressure from industry and Gov. Newsom [who answers to a tech industry focused on AI innovation]. Do you think that's fair?
TK: We just went through a public comment period, and industry is asking for a lot more changes. Civil society groups and privacy [groups] want more changes. One side says we should do less. The other side says we should do more. I think that tells us that maybe we're at the right place, that we have that happy medium. We still have the best and the most robust set of regulations [in the country] as it relates to risk assessments, ADMT and cybersecurity, and I don't apologize for that.
Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.