Biden AI order could lead to reforms in how federal agencies work with data brokers
A little-noticed provision of the Biden administration’s recently issued executive order on artificial intelligence could lead to important reforms of the federal government’s data collection practices, experts say.
The provision orders the White House Office of Management and Budget to assess how federal agencies gather and share often non-anonymized commercially available information (CAI), with a strong focus on data that includes personally identifiable information. The order calls out data brokers as targeted peddlers of CAI and says the measure is meant to “mitigate privacy risks potentially exacerbated by AI.”
The executive order also requires OMB to study how agencies collect, process, maintain, use, disseminate and dispose of commercial data in order to shape “potential guidance” on how to reduce individual privacy risks.
“There’s never been a government-wide accounting like this of how agencies buy and use personal data from commercial vendors,” said John Davisson, director of litigation at the Electronic Privacy Information Center.
The government’s use of CAI has become increasingly controversial. Legislation introduced Tuesday to renew Section 702 of the Foreign Intelligence Surveillance Act, which permits the National Security Agency to obtain the electronic data of foreign targets, folds in language from previously introduced legislation prohibiting law enforcement from buying consumer data from brokers without a court order.
In June, the Office of the Director of National Intelligence, which oversees the CIA, NSA and portions of the FBI, revealed that intelligence agencies had been buying vast amounts of CAI from data brokers, calling the granularity of the individual data for sale “of a type and level of sensitivity that historically could have been obtained, if at all, only through targeted (and predicated) collection.”
CAI is available to any government agency that has the money to buy it, but is so “highly personal and highly sensitive that it's the sort of thing that we would want an entity like law enforcement to only obtain with a warrant or some sort of judicial oversight,” said Cody Venzke, senior policy counsel for surveillance, privacy, and technology at the American Civil Liberties Union.
A study released Monday by Duke’s Sanford School of Public Policy, for example, surfaced data brokers selling military service members’ names, addresses, and financial and health information to researchers using a ".asia" domain for as little as 12 cents per individual.
Privacy experts said the executive order’s inclusion of such a robust federal agency assessment was not anticipated and will likely uncover previously unknown information on how the federal government, including law enforcement, immigration and even health agencies, gathers data.
Davisson said the agencies’ reports will be “very revealing,” shining a light on the volume and types of data agencies gather from brokers, the companies selling it to them, and the ways data has been used and potentially abused.
The OMB’s investigation should “reinforce why we need strict limits on the data broker-government pipeline,” Davisson added.
The agency evaluation provision in the order is explicitly linked to how AI facilitates the collection and use of information about individuals “or the making of inferences about individuals.”
Those inferences can be incorrect — and harmful.
In April, a law professor discovered that the AI chatbot ChatGPT had listed him among a group of legal scholars guilty of sexual harassment. The OpenAI chatbot falsely asserted that the professor, Jonathan Turley, had made inappropriate sexual comments and tried to touch a student on a class trip.
Some fear such false inferences could be made in a law enforcement context.
In an interview at the IBM Security Summit Wednesday, Deputy Attorney General Lisa Monaco announced the Department of Justice is standing up an Emerging Technology Board to help the agency develop guidelines for the ethical use of AI.
She added that Justice is now using AI in “some of our biggest cases,” including in the January 6 insurrection probe and even in drug investigations.
Monaco said Justice is focused on setting AI best practices for both federal law enforcement and the approximately 18,000 state and local law enforcement agencies it works with.
The EO also directs Monaco’s boss, the Attorney General, to investigate how to make privacy assessments required by a federal law known as the E-Government Act more effective.
The inclusion of the E-Government Act in the order is significant, according to Samir Jain, vice president of policy at the Center for Democracy & Technology, because stronger enforcement of that law would give any new guidance to agencies more impact.
He added that the EO’s direction on CAI is not limited to the use of data for AI purposes, so it could be applied broadly.
Regardless of whether that happens, “the mere fact of transparency will be helpful, either through public pressure or through driving Congress into passing laws,” Jain said.
Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.