Image: A red computer chip (Michael Dziedzic via Unsplash)

British intelligence warns AI will cause surge in ransomware volume and impact

Ransomware attacks will increase in both volume and impact over the next two years due to artificial intelligence (AI) technologies, British intelligence has warned.

In an all-source intelligence assessment published on Wednesday — based on classified intelligence, industry knowledge, academic material and open sources — the National Cyber Security Centre (NCSC) said it was “almost certain” about the increase, the highest confidence rating used by British intelligence analysts.

Experts at the NCSC, part of the cyber and signals intelligence agency GCHQ, warned that AI tools would benefit different threat actors unevenly.

Generative AI is already being used to create a “capability uplift in reconnaissance and social engineering”, making both of these tasks “more effective, efficient, and harder to detect.”

AI is also considered likely to assist with “malware and exploit development, vulnerability research and lateral movement by making existing techniques more efficient.”

The good news, according to the intelligence experts, is that these more sophisticated uses of AI to enhance cyber operations are only likely to be available to the best-resourced threat actors, and even then are “unlikely to be realised before 2025.”

One of the limitations on the use of AI tools for sophisticated hacking is the need for the developers to have access to high-quality exploit data to train their models. Currently, it is only a realistic possibility “that highly capable states have repositories of malware that are large enough to effectively train an AI model for this purpose.”

“To 2025, training AI on quality data will remain crucial for its effective use in cyber operations. The scaling barriers for automated reconnaissance of targets, social engineering and malware are all primarily related to data,” the assessment explained.

But because these barriers are primarily about data, each successful hack gives threat actors more material with which to train increasingly sophisticated tools, in turn enabling them to steal more data, creating a positive feedback loop.

“To 2025 and beyond, as successful exfiltrations occur, the data feeding AI will almost certainly improve, enabling faster, more precise cyber operations,” the assessment states.

According to the most recent tranche of security incident trends data released by the Information Commissioner’s Office (ICO), there were 874 ransomware attacks against British organizations in the first three quarters of 2023, a surge compared with the 739 incidents recorded in all of 2022.

James Babbage, the director general for threats at the National Crime Agency, stated: “Ransomware continues to be a national security threat. As this report shows, the threat is likely to increase in the coming years due to advancements in AI and the exploitation of this technology by cyber criminals.

“AI services lower barriers to entry, increasing the number of cyber criminals, and will boost their capability by improving the scale, speed and effectiveness of existing attack methods,” warned Babbage, adding that cases of fraud and child sexual abuse would also likely be affected.

Lindy Cameron, the outgoing chief executive of the NCSC, said: “The emergent use of AI in cyber attacks is evolutionary, not revolutionary, meaning that it enhances existing threats like ransomware but does not transform the risk landscape in the near term.

“As the NCSC does all it can to ensure AI systems are secure-by-design, we urge organisations and individuals to follow our ransomware and cyber security hygiene advice to strengthen their defences and boost their resilience to cyber attacks.”


Alexander Martin is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.