One In Three Adults Turning To AI For Emotional Support
One in three adults in the UK has used artificial intelligence (AI) for companionship, emotional support or social interaction, according to new research from a government-backed AI safety body. The finding takes on added significance over Christmas and New Year, when loneliness and mental health pressures often peak.
Frontier AI Trends
The finding comes from the first Frontier AI Trends Report published by the AI Security Institute (AISI), a body established in 2023 to help the UK government understand the risks, capabilities and societal impacts of advanced AI systems. The report draws on two years of evaluations of more than 30 frontier AI models and combines technical testing with research into how people are actually using these systems in everyday life.
Emotional Impact
While much of the report focuses on national security issues such as cyber capabilities, safeguards and the risk of loss of human control, it also highlights what AISI describes as “early signs of emotional impact on users”. One of the clearest and most surprising indicators of this is how widely conversational AI is already being used for emotional and social purposes.
How Many People Are Using AI For Emotional Support?
The AISI report highlights that “over a third of UK citizens have used AI for emotional support or social interaction”. The figure comes from a census-representative survey of 2,028 UK adults, in which 33 per cent said they had used AI models for emotional support, companionship or social interaction in the past year. Usage was not confined to occasional curiosity, either: 8 per cent of respondents said they used AI for these purposes weekly, while 4 per cent said they did so daily.
A Mix Of AI Tools
The report also notes that people were not relying solely on specialist “AI companion” products. In fact, respondents reported using a mix of general-purpose chatbots and voice assistants, suggesting that emotional and social use is emerging as a mainstream behaviour linked to widely available consumer AI tools.
It should be noted that AISI is not presenting these figures as proof of widespread harm. Instead, it frames them as an early signal that deserves attention as AI systems become more capable, more persuasive and more deeply woven into everyday routines.
What Happens When AI Companions Go Offline?
To move beyond self-reported survey data, AISI also examined behaviour in a large online community focused on AI companions. Researchers analysed activity from more than two million Reddit users and paid particular attention to what happened when AI services experienced outages.
According to the report, chatbot outages triggered “significant spikes in negative posts”. In one example, posting volumes rose to more than 30 times the hourly average. During these periods, many users described what AISI calls “symptoms of withdrawal”, including anxiety, low mood, disrupted sleep and neglect of normal responsibilities.
Again, AISI is careful not to over-interpret these findings, and it is not suggesting that most users are dependent on AI systems or that emotional reliance is inevitable. Rather, the analysis provides evidence that some users can form emotional attachments or routines around conversational AI, particularly when it acts as an always-available, non-judgemental listener.
Christmas And New Year
The timing of these findings is particularly relevant during Christmas and the New Year, when loneliness, grief and isolation often intensify across the UK. Seasonal pressures can amplify the very reasons people turn to conversational technology in the first place.
Charities have long warned that Christmas can be one of the loneliest times of the year. Shorter days, cold weather, disrupted routines and the expectation of celebration can all heighten feelings of exclusion or loss. For people who are bereaved, estranged from family, living alone or struggling financially, the festive period can magnify existing emotional strain.
Age UK has repeatedly highlighted the scale of seasonal loneliness among older people, saying that one million feel more isolated at Christmas than at any other time of year. Hundreds of thousands will spend Christmas Day without seeing or speaking to anyone, while millions eat dinner alone. Although AISI’s data focuses on adults of all ages, the festive period provides a clear context in which an always-available AI chatbot may feel like a lifeline rather than a novelty.
Mental health charities also point out that access to support can become more difficult over Christmas and New Year. For example, many services run reduced hours, GP appointments are harder to secure, and waiting lists do not pause just because it is the festive season. For people already waiting weeks or months for help, the gap can feel even wider.
In that context, it is easy to see why AI systems that respond instantly, at any hour, may appear particularly attractive. AISI’s finding that 4 per cent of UK adults use AI for emotional purposes daily suggests that, for some people, these tools are already filling gaps that become more visible during holiday periods.
The Youth Mental Health Context In The UK
The adult data from AISI becomes more striking when placed alongside evidence about young people’s mental health and their use of online support tools.
For example, research from the Youth Endowment Fund (YEF) paints a stark picture of teenage mental health in England and Wales. In its Children, Violence and Vulnerability 2025 report, YEF says: “The scale of poor mental health among teenagers is alarming.”
Using the Strengths and Difficulties Questionnaire, a standard 25-item screening tool, YEF found that more than one in four 13–17-year-olds reported high or very high levels of mental health difficulties. YEF says this is equivalent to nearly one million teenage children struggling with their well-being.
Complex and Unmet Needs
Behind this figure lie complex and often unmet needs. For example, a quarter of teenagers reported having a diagnosed mental health or neurodevelopmental condition, such as depression or ADHD. A further 21 per cent suspected they had a condition but had not been formally diagnosed, suggesting many young people are experiencing difficulties without recognition or support.
YEF also reports high levels of distress. Fourteen per cent of teenagers said they had deliberately harmed themselves in the past year, while 12 per cent said they had thought about ending their life. In total, almost one in five teenagers, around 710,000 young people, had self-harmed or experienced suicidal thoughts.
Why Many Young People Are Turning Online
YEF’s research shows that most teenagers with mental health difficulties do talk to someone they trust, usually a parent or friend, but the problem arises when it comes to professional support.
More than half of teenagers with a diagnosed mental health condition were receiving no support at all. Among those not receiving help, around half were on a waiting list, and others were neither receiving treatment nor expecting to receive it.
With services stretched and waiting times long, YEF says it is unsurprising that young people are increasingly turning online, including to AI chatbots. In fact, more than half of all teenagers reported using some form of online mental health support in the past year, rising to two-thirds among those with the highest levels of difficulty.
AI Commonly Used
One of the most striking YEF findings is how common AI chatbot use already is. YEF reports that a quarter of all teenage children had turned to AI chatbots for help, making them more widely used than traditional mental health websites or telephone helplines.
Violence
This pattern is even stronger among teenagers affected by serious violence. For example, YEF found that nine out of ten young people who had perpetrated serious violence said they had sought advice or help online, nearly twice the rate of those with no experience of violence.
Festive Pressures And Always-On Technology
Christmas and New Year can be especially challenging for teenagers as well as adults. For example, school routines are disrupted, family tensions can rise, and support services may be harder to reach. For young people already dealing with anxiety, grief or trauma, the festive period can intensify feelings of isolation.
When combined with YEF’s findings about access gaps, this seasonal pressure helps explain why AI chatbots may become a go-to source of support. Unlike helplines or appointments, they do not close for bank holidays, require no waiting, and carry no perceived judgement.
AISI’s report does not suggest that AI should replace human support. Instead, it highlights a reality that becomes particularly visible at Christmas: conversational AI is already playing an emotional role in people’s lives, not because it was designed as therapy, but because other forms of connection and support are often unavailable when they are needed most.
A Trend With Wider Implications
AISI’s emotional support findings sit alongside its broader warnings about rapidly advancing AI capabilities and uneven safeguards. The institute says AI performance is improving quickly across multiple domains, while protections remain inconsistent.
In that context, the growing emotional role of AI raises some difficult questions. As systems become more persuasive and more human-like in conversation, understanding how people use them during periods of heightened vulnerability, such as Christmas and New Year, is becoming increasingly important.
Neither AISI nor YEF presents AI as the root cause of loneliness or poor mental health; instead, both sets of research point to structural issues such as isolation, exposure to violence, long waiting lists and gaps in support. The festive season simply brings those pressures into sharper focus, at the same time as AI tools are more accessible than ever.
Taken together, the evidence suggests that, for a growing number of people in the UK, AI is less a productivity tool or a novelty, and more a part of how they cope, reflect and seek connection.
What Does This Mean For Your Business?
This evidence seems to highlight a gap between emotional need and available human support, with AI increasingly stepping into that space by default rather than by design. Neither the AI Security Institute nor the Youth Endowment Fund suggests that conversational AI is a substitute for professional care or human connection. What their findings do show, however, is that when support is slow, fragmented or unavailable, people will turn to tools that are immediate, private and always on, especially during periods like Christmas and New Year when loneliness and pressure intensify.
For UK businesses, this has practical implications that go beyond technology policy. For example, employers are already grappling with rising mental health needs, winter absenteeism and the wellbeing impact of long waiting lists for NHS and community support. If staff are increasingly relying on AI tools for emotional reassurance, that signals unmet need rather than a passing tech trend. Organisations that take mental health seriously may now need to think harder about access to support, signposting, and how seasonal pressures affect staff, customers and communities alike.
For policymakers, regulators, educators and technology developers, the challenge is achieving the right balance. AI is clearly providing something people value, particularly accessibility and responsiveness. However, the risk lies in leaving that role unexamined as systems become more persuasive and more embedded in daily life. As this research shows, the emotional use of AI is no longer hypothetical; it is already happening at scale, shaped by wider social pressures that Christmas simply makes harder to ignore.