UK Smarsh Study Reveals AI Communication Boom in Finance as Compliance Gaps Widen

A new study by Smarsh, the global leader in communications data and intelligence, has laid bare a growing tension at the heart of the UK financial services sector: generative AI has become a daily fixture for most professionals, yet the compliance infrastructure meant to govern it is struggling to keep up. The findings point to widening AI compliance gaps in financial services that regulators and firms alike can no longer afford to overlook.
According to the research, 61% of UK financial services and insurance professionals now use generative AI every day. Nor is adoption concentrated solely among younger workers: while those aged 25 to 34 are the most active users, with over 36% reaching for AI tools multiple times daily, nearly a third of 35–54-year-olds and more than a quarter of 55–64-year-olds report daily usage too. AI has moved well beyond the pilot stage.
What makes these findings particularly significant is where AI is being applied. Beyond routine internal tasks like briefing notes and call summaries, professionals are deploying AI to produce client communications, marketing content, and, strikingly, compliance documentation itself. A third of respondents said they use AI to draft compliance materials, yet fewer than half say they thoroughly review and make significant edits to AI-generated content before it goes out.
The scale of output is growing fast, with 69% of professionals saying AI has increased the volume of content they produce. But the systems meant to monitor that content are not keeping pace. Fewer than a third of financial services professionals believe their organisation’s surveillance tools are fully equipped to detect risks in AI-generated communications. The concern runs deepest among the youngest workers: 43% of those aged 25 to 34 flagged this as a worry, even though theirs is the same group driving the most usage. These are precisely the AI compliance gaps in financial services that could expose firms to serious regulatory scrutiny.
Paul Taylor, Vice President of Product at Smarsh, framed the stakes clearly: “Compliance leaders are now under pressure to ensure every AI-assisted interaction is transparent, supervised, and defensible. Firms need the ability to capture and govern these communications across all channels, or they risk introducing critical blind spots at a time when regulatory scrutiny is intensifying.”
The findings also reveal a counterintuitive truth about employee sentiment. Far from resisting oversight, 81% of financial services professionals said they would feel more confident using AI tools for work if they knew the outputs were being properly monitored, a 12-point rise from the same question asked a year ago, as noted in Smarsh’s previous research. Workers, especially younger ones, are not trying to dodge accountability; they are asking for the guardrails that allow them to innovate without personal liability.
For UK firms operating under the watchful eye of the Financial Conduct Authority, the message from this data is unambiguous. AI compliance gaps in financial services are not a future risk; they are a present one. Client advice, regulated communications, and compliance documentation are all being produced at volumes and speeds that legacy monitoring systems were never designed to handle. Firms that treat AI governance as a secondary concern may soon find themselves on the wrong side of a regulatory conversation they were not prepared for.