As businesses double down on AI to drive efficiency and growth, they face an emerging strategic challenge: consumer trust in how their data is used is eroding – and fast.
New research from Usercentrics reveals that 61% of British consumers feel they have “become the product”, and 58% are concerned about their data being used to train AI models.
A Pivotal Shift
This is not mere background discomfort.
It marks a pivotal shift in how consumers perceive their relationship with brands in an algorithm-driven economy.
The State of Digital Trust 2025 report, based on a survey of 10,000 frequent internet users across Europe and the US, paints a sobering picture.
Almost three-quarters (73%) do not fully understand how businesses use their data, yet transparency has become their number one demand.
44% said clarity on data use is the single most important driver of trust, ahead of security guarantees and the ability to control or limit data sharing.
In other words, consent is no longer a checkbox. It is a brand’s first moment of truth.
Rising Scepticism
This rising scepticism is compounded by AI’s growing role in shaping choices and experiences, often invisibly.
As Usercentrics CMO Adelina Peltea puts it: “This isn’t a backlash, it’s a reset. For too long, user data privacy has been defined as a trade-off between growth and compliance.
If privacy and consent aren’t placed at the heart of marketing strategies, especially as AI adoption accelerates, companies risk losing consumer trust entirely.”
Indeed, the research shows that AI is intensifying privacy concerns.
Nearly half (48%) trust AI systems less than humans with their personal data, while 67% believe society has become overly reliant on certain technologies.
The opacity of AI models – where data inputs and outcomes are rarely explained to users – is fuelling this discomfort.
Consumers are now asking sharper questions:
What is my data used for? Who profits from it? How does it influence decisions that affect me?
This sentiment is reshaping behaviour.
46% of UK consumers say they click “accept all” on cookie banners less often than they did three years ago, and 42% regularly read consent banners before engaging further.
The notion of passive consent is fading; consumers are becoming active gatekeepers of their data.
Notably, this trend is strongest among digital-savvy cohorts, suggesting that future consumers will be even more privacy-conscious.
Interestingly, heavily regulated sectors such as banking and public services retain higher trust levels, with 62% trusting banks and 47% trusting public institutions.
Social media (27%), hospitality (22%) and automotive (22%) lag far behind.
This disparity signals a competitive opportunity for brands willing to treat privacy as a strategic differentiator rather than a compliance burden.
Implications for Payments and Fintech
The implications for payments and fintech companies are profound.
Digital trust is fast becoming the ultimate currency in customer relationships.
As AI capabilities are embedded across personalisation, fraud prevention, and payment UX optimisation, firms must ensure they remain transparent about data use – not only to meet regulatory obligations, but to preserve and enhance user trust.
Privacy-led marketing, as Usercentrics frames it, is the way forward.
By designing consent interactions as brand moments, articulating data value exchange clearly, and building explainability into AI deployments, companies can convert privacy from a risk to a growth lever.
In a digital economy defined by accelerating AI innovation, trust remains the one asset that cannot be automated – it must be earned, protected, and continually renewed.