AI in Primary Care: Humanising Healthcare Through Connection, Not Just Code

Introduction

I still remember the first time I messaged a healthcare chatbot. It was right in the thick of the pandemic, when everything felt a bit upside down. Getting an appointment with a GP? Practically impossible. So, out of sheer curiosity (and maybe a bit of desperation), I clicked on this little chat option on my doctor’s website. It asked me a handful of questions. Nothing fancy, but within five minutes, it had triaged my symptoms and offered up some genuinely useful advice. Welcome to AI in primary care.

It was not perfect, nor was it magic, but it was comforting. Quick, calm, and oddly enough, I felt heard, even though I was just typing into a screen.

That moment stuck with me because it challenged the narrative that technology always creates distance in healthcare. Maybe it can bring us closer, provided we use it well.

It is Not Just Working: It is Changing the Conversation

Fast forward a few years, and AI in primary care is not some quirky experiment in the UK. It is part of the mainstream now. From the NHS to private practices, we are seeing digital tools support triage, appointment scheduling, mental health check-ins, you name it.

For instance, a recent survey (Blease et al., 2024) examined the use of generative AI in General Practitioners' (GPs) clinics across the UK and found that at least 20% of GPs use it for documentation and another 28% for differential diagnosis, suggesting fairly widespread use of AI in primary care despite minimal guidance. The study therefore highlights the need for education and regulation around the ethical use of AI so it can be used effectively in clinical settings.

Another example is a study of Ambient Digital Scribing (ADS) tools used by clinicians for jotting patient notes. When tested across 40 real patient visits, the tool performed well overall, but at times it missed important facts and recorded new medications incorrectly. This again underlines the importance of stringent testing before such tools are used in healthcare settings (Wang et al., 2025).

It is not just about efficiency, but about well-being. As someone who looks at healthcare through the lens of Positive Psychology (Seligman, 2011), this feels like an enormous opportunity: not just to optimise care, but to humanise AI.

From Triage to Thriving: The PERMA Perspective

If you are familiar with Positive Psychology (Seligman, 2011), you may already know the PERMA model. It breaks well-being into five key pillars:

  • Positive Emotion
  • Engagement
  • Relationships
  • Meaning
  • Accomplishment

When I first used that chatbot, I felt engaged, because it responded right away; respected, because it did not rush or interrupt me; and a sense of accomplishment, because I had taken a step towards looking after myself.

That was not just a convenient shortcut. That was the start of what PERMA would call a more empowered, connected healthcare experience.

Why AI Makes Sense as a First Step

Let us think about what people need at the start of a health journey:

  • To be heard
  • To feel safe
  • To get help without shame or delay

Picture calling a GP surgery at 8:30 a.m. on a Monday. The nerves, the hold music, the worry about wasting someone's time. It is a lot.

Now compare that to typing your symptoms into a chatbot at 11 p.m., wrapped up in a blanket with a cup of tea. You are not judged or rushed; instead, you are guided, and that matters.

Chatbots That Nurture, Not Just Navigate

Let us consider a few examples of tools doing this well:

  • Ada – A symptom checker that helps people understand their bodies a bit better → Promotes engagement and encourages self-awareness (Fraser et al., 2022; Knitza et al., 2024)
  • Youper – A mental health chatbot grounded in CBT → Supports emotional regulation and reinforces positive emotion (Farzan et al., 2025; Mehta et al., 2021)
  • Woebot – A digital companion for anxiety and mood support → Builds micro-resilience and helps users reflect (Farzan et al., 2025; Fitzpatrick et al., 2017)

These tools are not just techy band-aids. They are doorways into self-care and early touchpoints for mental and physical well-being. Think of them as gentle nudges back into alignment, rather than just diagnosis machines.

We are Not Replacing GPs—We are Giving Them Room to Breathe

Let me say this loud and clear: AI is not replacing doctors, and it should not.

What it is doing is giving clinicians a bit of breathing space: handling the admin, filtering the noise, and helping patients land in the right place faster. That is not just good system design but a possible burnout prevention mechanism.

From a Positive Psychology perspective, this is gold because when doctors aren’t drowning in emails and phone queues, they have more capacity to build real relationships, to be present and connected.

Can a Bot Feel Safer Than a Human?

It sounds strange, but for some people, yes, chatbots do feel safer.

They are not going to raise an eyebrow. They are not in a rush. They let you take your time, and that sense of psychological safety is foundational to trust-building in healthcare.

If we want people to engage with care earlier, before crisis hits, we need more of these low-barrier entry points.

Designing AI for Flourishing, Not Just Function

Here is a thought: What if AI did not just help manage illness but actively supported wellness?

Imagine this:

A bot that asks about your mood as well as your temperature.

One that celebrates small wins, like sticking to your medication schedule.

One that says, “You have done three check-ins this week. Well done for showing up for yourself.”

These might feel small, but in behavioural science, they are called “micro-moments of progress.” They matter deeply for motivation, engagement, and long-term change.

Why This Matters to You

A Note for Clinics, Coaches, and Digital Health Creators: If you work in healthcare, now is the time to lean into this shift.

Think beyond efficiency. Think human:

  • Build chatbots that ask rather than tell
  • Use affirming, emotionally intelligent language
  • Track patterns of mood, not just symptoms
  • Add gentle prompts: “Would you like a 2-minute breathing exercise while you wait for results?”
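To make the checklist above concrete, here is a minimal sketch of what “relational” check-in logic could look like. This is a hypothetical, rule-based illustration only: the `CheckIn` class, the mood scale, and the thresholds are all my own assumptions, not a clinically validated design, and any real tool would need rigorous testing of the kind the studies above call for.

```python
# A toy sketch of an affirming check-in bot: celebrates showing up,
# tracks mood patterns (not just symptoms), and asks rather than tells.
# All names and thresholds are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class CheckIn:
    mood: int                                   # self-reported, 1 (low) to 5 (high)
    symptoms: list[str] = field(default_factory=list)

def respond(history: list[CheckIn]) -> str:
    """Build an affirming reply from a patient's check-in history."""
    latest = history[-1]
    parts = []
    # Celebrate the micro-moment of showing up at all.
    parts.append(f"Thanks for checking in -- that's {len(history)} check-ins so far. "
                 "Well done for showing up for yourself.")
    # Track mood trends across visits, and gently offer a next step.
    if len(history) >= 3 and all(c.mood <= 2 for c in history[-3:]):
        parts.append("Your mood has been low for a few days. "
                     "Would you like help booking a GP appointment?")
    elif latest.symptoms:
        parts.append("Would you like a 2-minute breathing exercise "
                     "while we look at your symptoms together?")
    return " ".join(parts)

# Example: three low-mood check-ins trigger a gentle escalation prompt.
history = [CheckIn(mood=2), CheckIn(mood=1), CheckIn(mood=2, symptoms=["headache"])]
print(respond(history))
```

Even in this toy form, the design choice matters: the first thing the bot does is affirm the behaviour (checking in), and escalation is phrased as an offer, not an instruction.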

This is about relational automation, not just technical automation. It is about using digital tools to bring people closer to care, not push them further away.

If you are a GP, a health coach, or someone working in digital health, this isn’t just about new technology; it’s about real people. AI tools can help you support patients earlier, ease your workload, and make space for more meaningful conversations. If you are building or delivering care through technology, this is your chance to design tools that feel human, ones that are helpful and kind.

References

  1. Blease, C.R., Locher, C., Gaab, J., Hägglund, M., Mandl, K.D., 2024. Generative artificial intelligence in primary care: an online survey of UK general practitioners. BMJ Health Care Inform. 31, e101102. https://doi.org/10.1136/bmjhci-2024-101102
  2. Farzan, M., Ebrahimi, H., Pourali, M., Sabeti, F., 2025. Artificial Intelligence-Powered Cognitive Behavioral Therapy Chatbots, a Systematic Review. Iran. J. Psychiatry 20, 102–110. https://doi.org/10.18502/ijps.v20i1.17395
  3. Fitzpatrick, K.K., Darcy, A., Vierhile, M., 2017. Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial. JMIR Ment. Health 4, e19. https://doi.org/10.2196/mental.7785
  4. Fraser, H.S.F., Cohan, G., Koehler, C., Anderson, J., Lawrence, A., Pateña, J., Bacher, I., Ranney, M.L., 2022. Evaluation of Diagnostic and Triage Accuracy and Usability of a Symptom Checker in an Emergency Department: Observational Study. JMIR MHealth UHealth 10, e38364. https://doi.org/10.2196/38364
  5. Knitza, J., Hasanaj, R., Beyer, J., Ganzer, F., Slagman, A., Bolanaki, M., Napierala, H., Schmieding, M.L., Al-Zaher, N., Orlemann, T., Muehlensiepen, F., Greenfield, J., Vuillerme, N., Kuhn, S., Schett, G., Achenbach, S., Dechant, K., 2024. Comparison of Two Symptom Checkers (Ada and Symptoma) in the Emergency Department: Randomized, Crossover, Head-to-Head, Double-Blinded Study. J. Med. Internet Res. 26, e56514. https://doi.org/10.2196/56514
  6. Mehta, A., Niles, A.N., Vargas, J.H., Marafon, T., Couto, D.D., Gross, J.J., 2021. Acceptability and Effectiveness of Artificial Intelligence Therapy for Anxiety and Depression (Youper): Longitudinal Observational Study. J. Med. Internet Res. 23, e26771. https://doi.org/10.2196/26771
  7. Seligman, M.E.P., 2011. Flourish: a new understanding of happiness and well-being, and how to achieve them, 1. publ. ed. Brealey, London.
  8. Wang, H., Yang, R., Alwakeel, M., Kayastha, A., Chowdhury, A., Biro, J.M., Sorrentino, A.D., Handley, J.L., Hantzmon, S., Bessias, S., Economou-Zavlanos, N.J., Bedoya, A., Agrawal, M., Ratwani, R.M., Poon, E.G., Pencina, M.J., Pollak, K.I., Hong, C., 2025. An evaluation framework for ambient digital scribing tools in clinical applications. Npj Digit. Med. 8, 358. https://doi.org/10.1038/s41746-025-01622-1