AI ISN’T YOUR PRACTITIONER: THE RISKS OF RELYING ON ALGORITHMS FOR YOUR HEALTH

Artificial Intelligence (AI), through tools like ChatGPT, is changing how we access information about health. It can explain test results, summarise research, and even suggest possible causes for symptoms. It’s fast, convenient, and often sounds very convincing. But there’s an important distinction to understand: although AI can help you understand your health, it can’t substitute for a practitioner or safely guide your case. Read on to find out why …

What is AI and how does it work?

Artificial Intelligence (AI) is a way of getting computers to do tasks far faster than humans can: understanding language, spotting patterns, or helping answer questions. It works by drawing on large amounts of information: AI looks at data, finds patterns, and uses those patterns to make predictions or generate responses. It’s very good at recognising patterns and using them to produce answers that sound intelligent, and it’s fast enough to save a vast amount of research time.

Why people turn to AI for health

It’s easy to see why people turn to AI for health advice. Instead of waiting for a GP appointment, you get instant answers: it explains topics clearly, connects different pieces of information, and puts you in control of exploring your own case. For many people, especially those who feel unheard or stuck in the system, AI feels like a “free” alternative to a private practitioner.

Covid has changed things dramatically for everyone, including the NHS. The reality is that for a significant number of the population, medical conditions have worsened and become more complex. This is due to the nature of Covid itself. In simple terms, Covid is a hit-and-stay virus, like chickenpox (shingles) or Epstein-Barr Virus (glandular fever). We all carry an infectious load. It is the strength of our immune system that keeps things under control. The problem is that Covid can disrupt and weaken the immune system, which then allows previously dormant infections to become active again. That is why studies report a rise in Epstein-Barr Virus, Lyme, Coxsackievirus, strep, shingles and more. The NHS may do certain things supremely well, but diagnosing and dealing with chronic infections is not one of them. If you don’t believe me, try asking your doctor about chronic Lyme, SIBO, or mould biotoxin illness.

This challenge is compounded by limited NHS testing for active, ongoing infections (the ones that matter when trying to recover from chronic conditions) as opposed to infections you had 20 years ago that aren’t bothering you now. Please see my blog Falling Through The Cracks With Standard NHS Infection Tests. Too often patients are sent on their way with “your test results are all fine” and a prescription for an antidepressant because “it’s all in your head.”

When you are unwell and not being heard, it’s completely understandable to turn to AI for answers. 

It can certainly be a very useful tool. But it has its limits - and those limits matter. 

Key problems with diagnosis by Dr AI

*Information isn’t the same as clinical judgement - It’s vital to remember that AI cannot “think” or understand like a human. It is a machine, a super-computer that can only respond according to the information it can access. AI can give insights, but it can’t listen, interpret, or adapt advice to your specific case. Yes, you can type in your test results and receive information on patterns. The answers can sound convincing, but they may not be right for your situation, which will be different from anybody else’s. For example, two people may be diagnosed with IBS; chronic infections may be driving one person’s symptoms, while food allergy may be dominant in the other. AI may also miss subtle red flags, oversimplify complex situations, or suggest possibilities that don’t fully fit. If you are not very familiar with the underlying medical concepts, how can you judge whether the advice is correct?

*AI may struggle when symptoms don’t follow typical patterns - which is often the case in chronic illness. It may misinterpret the information or rely too heavily on a simplified explanation. Cases today are far more complex than they were 20 or 30 years ago. For example, an immune system may be burdened by any number of things all at once: toxins, mercury fillings, processed foods, chronic stress, toxic relationships, poor sleep, chronic dental/gum infections, systemic infections. If you don’t know how to untangle this complex web and adopt a personalised rather than “one-size-fits-all” approach, you stay sick. Recovery requires clinical judgement and a step-wise approach, observing how the patient responds to each intervention. AI can’t take responsibility for this monitoring.

*AI can also over-connect things - a big one in the field of health. AI can link biomarkers, infections, toxins etc. and build a “perfect-sounding” narrative based on previous interactions and what it thinks you want to hear.  I use AI as a research tool and notice that it keeps linking my favourite areas of research to new topics! But just because something connects, doesn’t mean it’s driving your case.

*The large amount of information generated by AI can be confusing - The risk isn’t that AI is wrong. It’s that it is partially right in too many directions at once. People can then feel overwhelmed and end up chasing the wrong thing.  Without background knowledge and clinical judgement, how can you prioritise actions or construct a recovery plan?

*AI can give biased health information - this is something people should be aware of. AI learns from existing data, and that data reflects the real world, including its gaps, assumptions, and biases. If most research and medical data on a topic are based on certain populations or mainstream medical approaches, AI may lean toward those and overlook less recognised conditions or cutting-edge perspectives. A simple example: someone with chronic symptoms but “normal” test results might be told everything looks fine and their symptoms are purely stress-related because that pattern appears frequently in the data - even if it is not correct for their experience and they might have an underlying hidden stealth infection or a toxic tooth. Bias is common because AI doesn’t independently verify truth - it identifies patterns in what already exists. If something is under-researched, controversial, or not widely accepted, it may be underrepresented or presented in a skewed way.

*AI can reflect gaps in medical knowledge - because it can only draw from what has already been studied and recognised. For example, conditions like chronic Lyme disease or post-viral syndromes may be under-researched or debated, so AI may downplay their significance or default to more conventionally accepted explanations like stress or “it’s all in the mind” disorders. The risk is that patients may feel dismissed or delay seeking appropriate support, potentially allowing their condition to worsen without proper investigation or care. When you feel desperate, it’s easy to take information as certainty.

In conclusion, the best use of AI is:

*To learn and gain more knowledge that helps put you in control.

*To ask better, more informed health questions.

*To access simpler explanations that you can understand, e.g. ask ChatGPT to describe a condition or a test result in a way a 10-year-old could easily understand.

*To understand your medical reports.

*To understand drug interactions and drug-nutrient interactions (very important considering that prescription drugs are a leading cause of death). 

*To help plan meal options and recipe ideas for the diet you are following. 

Don’t rely on AI to do the following:

*Map out your health journey - this is a bit like trying to use Google Maps without knowing exactly where you are.

*Delineate a plan of action in chronic, complex cases - these cases fail most often not because of a lack of information, but because of misunderstanding and poor sequencing.

*As a substitute for an experienced clinical practitioner - AI can’t examine you, know the relevant questions to ask in a detailed case history (possibly the most important diagnostic tool for finding out what is happening), run further clinical tests, or take responsibility for clinical decisions.

The bottom line is that AI can be used as a tool for ideas to get you thinking but never as a substitute for human experience and judgement.  Relying solely on algorithms may save money upfront, but mistakes, missed nuances, or poorly tailored advice can cost time, health and even extra medical bills down the line. The key point is that whatever information is given by AI, it has to be evaluated as to whether it is true for you, untrue for you, partially true or simply misleading.  And that is a job for an experienced practitioner. 

If you would like to explore the underlying causes of a chronic condition, please get in touch with the Good Health Clinic on goodhealthclinic@outlook.com to request a free 30 minute Enquiry Call or book an appointment. Please note that an Enquiry Call is not a consultation but an exploratory call to see if this is a clinical approach you wish to pursue.

To your very good health,

Suzanne Jeffery (Nutritional Medicine Consultant)

M.A.(Oxon), BSc.(NMed), PGCE, GNC, BSEM, MNNA, CNHC

The Good Health Clinic at The Business Centre, 2, Cattedown Road, Plymouth PL4 0EG

Tel no: 07836 552936/ Answer phone: 01752 774755 

Disclaimer:

All advice given out by Suzanne Jeffery and the Good Health Clinic is for general guidance and informational purposes only.  All advice relating to other health professionals’ advice is for general guidance and information purposes only. Readers are encouraged to confirm the information provided with other sources.  Patients and consumers should review the information carefully with their professional health care provider. The information is not intended to replace medical advice offered by other practitioners and physicians. Suzanne Jeffery and the Good Health Clinic will not be liable for any direct, indirect, consequential, special, exemplary or other damages arising therefrom.  
