A study released by UK-based financial services comparison platform Confused.com in January found that 59% of Brits are using AI for medical self-diagnosis – and 20% are even using ChatGPT as their mental health therapist.
In terms of use cases, 63% of people in the UK use AI to search for physical or mental symptoms, while 50% check side effects, 38% research wellbeing techniques in diet and fitness, and 30% examine treatment options such as medication and surgery.
A narrow majority find AI to be useful for addressing their health concerns, with 11% claiming AI has helped improve their condition “a great deal,” and 41% claiming it helped “somewhat.”
While many users are happy with the results, there are serious concerns about relying on AI to diagnose health conditions. One of the biggest issues is the tendency of large language models (LLMs) to hallucinate, confidently presenting verifiably false information in their outputs – with tools like ChatGPT hallucinating in as many as one in ten responses.
The era of AI self-service
The development of AI tools like ChatGPT, Gemini, and Claude has kickstarted a new generation where users have access to medical information faster than ever before – so much so that many prefer not to speak to a doctor at all.
According to the study, almost a quarter (24%) feel more comfortable not having to speak face to face with a health professional, especially among those aged 18 to 24.
“Advances in AI technology have created a new way for people to approach healthcare and self-diagnosis. More individuals are taking steps to support their own and their family’s wellbeing, getting ahead of health concerns and addressing situations as quickly as possible,” Tom Vaughan, life insurance expert at Confused.com, said in the official press release.
“While AI can be useful for initial research and gaining an understanding of the condition, it’s clear that for the ultimate peace of mind people should consult a GP or pharmacist. GPs and other medical professionals are the only people who can accurately diagnose conditions, some of which may worsen or become long-term illnesses without the proper treatment,” Vaughan added.
Risks aside, patients perceived a range of advantages over traditional treatment options, saying AI could be quicker than waiting for a doctor’s appointment, offer an understanding of future health conditions, save money on private healthcare, give peace of mind, and provide alternative medical advice.
With the average waiting time for a GP appointment sitting at 19 days in the UK, it’s no surprise that many patients are looking for alternatives, although it’s hard to see guidance from an AI chatbot as anything but a poor substitute for a physical examination performed by a qualified physician.
For many patients, though, AI chatbots provide a tool to check symptoms without having to go to a doctor’s clinic for an appointment. The only caveat is that the guidance provided might not actually be correct.
Nate MacLeitch, CEO of AI-assisted messaging features provider QuickBlox, noted how “AI can help people take a first, anonymous step toward help, especially for topics they might otherwise avoid because of stigma.”
“AI isn’t the expert, but it’s a gateway that helps people cross the threshold into real support by transforming silence into a conversation that leads to authenticated care.”
AI chatbots: self-diagnosis tools or health hazards?
AI chatbots can help diagnose health conditions, provided they are pulling data from a reliable source. There is no guarantee, however, that the information on screen is correct unless the user fact-checks it against a third-party source.
An investigation by The Guardian, for instance, found that Google’s AI Overviews had provided dangerous health advice to consumers, in one instance even advising users with pancreatic cancer to avoid high-fat foods – advice that could increase the risk of a patient dying from the disease.
Self-diagnosis also carries the risk of misdiagnosis. Chatbots might misinterpret symptoms and give the user an incorrect diagnosis – which could discourage them from seeing a doctor for the treatment they actually need.
The real concern, stakeholders note, is the unrestrained use of AI for self-diagnosis rather than the technology itself.
“While governed, non-diagnostic AI with clear guardrails and escalation protocols can help people understand their data and improve their health behaviors, diagnosis and treatment decisions should always be deferred to human clinicians,” stressed Bryan Janeczko, CEO of AI-healthspan platform ResetRX.
Users who self-diagnose medical conditions should be fully aware of the risks and logical limitations of LLMs – namely, that they are incapable of independent thought and prone to sharing incorrect information. Users who are aware of these limitations can mitigate the risks by fact-checking against a reputable source.
Beyond this, users who turn to chatbots for mental health support face unique risks. In one lawsuit, for example, a family alleged that ChatGPT encouraged 16-year-old Adam Raine to take his own life in April 2025, after Raine confided in the chatbot about his suicidal thoughts and plans.
MacLeitch advises businesses investing in or building health-related bots to “keep final decisions with the experts, ground assistants in verifiable sources, and make provenance visible in the interface.”
“When users seek guidance on decisions that can affect their wellbeing, and the assistant presents information without showing where it came from, or without reminding users that a qualified professional should make the final call, it risks misinformation and potential harm. Provenance is the ability to see the source of every recommendation, and it anchors the AI in accountability and lets users differentiate between human expertise and machine assistance.”
Once again, being aware of the potential for hallucinations and the logical limitations of chatbots can help mitigate these risks – but there is a strong argument that vulnerable users should simply steer clear of AI altogether and instead seek mental health support from an empathetic healthcare provider.
Featured image: Galina Nelyubova via Unsplash+
Disclosure: This article mentions clients of an Espacio portfolio company.
