From Confused to Confident: Using AI to Navigate Healthcare

Learn how to use AI tools, apps, and online meetings to become a better-informed care-receiver — and why your allied health therapist is the best guide to start with.

Nobody taught us how to be a patient. We learn to be good students, good employees, good parents — but the skills that make someone an active, informed participant in their own healthcare are rarely taught anywhere. Most people arrive at appointments underprepared, leave with half-answered questions, and fill in the gaps with whatever Google or a family WhatsApp group supplies next. That gap has always existed. The internet widened it. Artificial Intelligence (AI) has widened it again — and this time, the stakes are higher, the tools are more powerful, and the opportunity to genuinely close that gap is, for the first time, real.

A 2024 study published in OTJR: Occupational Therapy Journal of Research followed older adults aged 66 to 87 through an eight-week series of structured online group meetings led by occupational therapists, physiotherapists, and nurses. What the researchers found was not simply that digital engagement reduced loneliness — though it did. It was that participants who engaged with online platforms in a structured, supported way underwent a subtler shift: they recognised their own capacity. One participant described it plainly: “Hearing others, you learn new things… And actually, I can do plenty.”

That shift — from passive recipient of care to active, curious participant — is available to anyone willing to engage with the tools now on offer. AI is the most powerful of those tools. The question is no longer whether to use it. The question is how.


The Digital Divide Was Never Only About Age

The “digital divide” has always been more complex than a generational story. It reflects confidence as much as capability, access as much as age. A 2024 scoping review in Frontiers in Public Health confirmed that digital health literacy — the ability to find, evaluate, and act on health information online — varies widely even among adults who are already online, and tends to be lower across middle-income countries in Asia than in higher-income Western nations.

That gap has clinical consequences. Lower digital health literacy correlates with weaker chronic disease management, reduced uptake of preventive care, and significantly greater exposure to health misinformation — the kind that travels through trusted social channels and is therefore far harder to question.

AI has added a new layer. A 2025 University of Michigan National Poll on Healthy Aging surveying over 2,800 adults aged 50 and older found that while 55% had used some form of AI tool, only 14% had applied it to health information. Nearly half reported very little or no trust in AI-generated health content — a reasonable instinct, but one that requires nuance rather than blanket avoidance.

The instinct to distrust AI is not wrong. The mistake is letting distrust become disengagement. The goal is informed scepticism — and that is a learnable skill, at any age.


What Wearables and Apps Have Already Shown Us

Before AI entered the conversation, wearable devices and health apps were already quietly shifting what it meant to participate actively in one’s own care. A systematic review in Frontiers in Medicine found that roughly 77% of all published research on wearable technology for older adults appeared in 2023–2024 alone — a reflection of how rapidly this space is maturing.

Occupational therapists have been central to making that technology meaningful. Wearable data — steps, sleep, heart rate variability — becomes clinically useful when a therapist helps a client understand what it is actually telling them, and why it matters in relation to their specific goals. The technology is not self-explanatory. The relationship makes it interpretable.

The Nature Medicine review on digital health for ageing populations makes this point clearly: wearables hold genuine transformative potential not as surveillance tools, but as instruments of self-knowledge that support independence and earlier clinical intervention. The limiting factor, consistently, is not the technology. It is the quality of supported introduction to that technology — which is precisely what a good allied health relationship provides.

The same principle, unchanged, now applies to AI.


How to Use AI for Health Information Without Getting It Wrong

The most useful framing for AI in healthcare is neither “digital doctor” nor “dangerous toy.” It is a research companion — one that can translate medical jargon, summarise complex conditions, help you prepare better questions, and make you feel less lost before you walk into an appointment.

The National Academy of Medicine describes this as “critical AI health literacy” — the ability to use AI tools deliberately, to cross-check information, and to explore perspectives your clinical team may not have surfaced. Crucially, this positions AI not as a source of answers, but as a sharpener of questions. For anyone who has ever felt too intimidated to push back in a consultation, or too uncertain to ask the “obvious” question, that reframe is significant.

Here is what that looks like in practice:

Use AI to prepare, not to conclude. Before an appointment, type your diagnosis, medication name, or procedure into a tool like ChatGPT or Gemini and ask: “What questions should I ask my specialist about this?” The output will not be perfect — but it will almost always give you a scaffold worth building on.

Ask AI to translate, not to diagnose. Medical letters and discharge summaries are written for clinicians, not patients. AI tools are genuinely useful for plain-language translation of clinical jargon. Ask: “Explain this in simple terms” — then bring your understanding back to your care team to confirm.

Cross-check anything that surprises you. If AI output contradicts something your doctor has told you, that is not a reason to distrust your doctor — it is a reason to raise the discrepancy in your next conversation. That conversation is valuable. Clinicians do not know everything, and a well-formed question from a prepared client can open genuinely useful clinical reflection.

Ask AI where to look, not just what to think. Rather than accepting AI-generated health claims at face value, ask: “Which professional bodies or peer-reviewed sources should I consult on this topic?” Then go directly to those sources.

Keep a human in the loop for every significant decision. Medication changes, therapy goals, diagnostic interpretation — none of these should be driven by AI output alone. The value of AI is in the preparation. The judgement belongs to you and your clinical team, together.


Why Health Misinformation Is an Everyone Problem

Health misinformation has never been a problem of ignorance alone. Research published in JMIR Formative Research found that older adults are particularly vulnerable to health misinformation not because they cannot detect false claims in isolation, but because misinformation reaches them through trusted interpersonal channels — family messaging groups, community newsletters, friends sharing articles. The social framing makes it persuasive. The same dynamic operates across age groups; older adults are simply exposed to it more frequently and with fewer counter-signals.

AI can amplify the problem when used carelessly. Chatbots present incorrect information confidently. They can be prompted to confirm biases. Research from institutions including Mount Sinai has shown that AI tools can elaborate convincingly on false medical premises when not challenged. The KFF Health Misinformation Monitor has documented how, even among adults who actively use AI for health information, trust in accuracy remains appropriately low — a caution worth maintaining.

The antidote is not avoidance. It is a simple, repeatable habit: treat AI output as the beginning of a conversation with a professional, not the conclusion of one.


Why Your Therapist Is the Best Person to Ask About This

The 2024 study of structured online group meetings identified something important: when the structured digital programme ended, some participants felt a loss of direction. The technology alone was not sufficient. What had worked was the combination — digital engagement held within an ongoing professional relationship.

The same holds for AI literacy. An occupational therapist or allied health professional is uniquely placed to help anyone navigating this landscape by:

  • Identifying which tools are appropriate for individual goals, cognitive profile, and tech comfort
  • Creating a low-pressure space to practise using AI interfaces before applying them to real health situations
  • Helping interpret AI-generated health information in clinical context — separating what is useful from what needs questioning
  • Building the habit of bringing AI-sourced questions to appointments rather than acting on them unilaterally
  • Bridging the wider digital environment: telehealth platforms, wearables, online support communities, and appointment booking systems

This falls under social prescription — the practice of connecting individuals with community-based resources and activities that address the non-clinical dimensions of health. Digital literacy support, including guided AI orientation, is an increasingly recognised component of that work. 

The goal is not digital fluency for its own sake. It is the confidence to participate meaningfully in your own care — to arrive at consultations better prepared, to understand clinical information more clearly, and to recognise when what you have read online needs professional context before you act on it.


A More Informed Population Changes the Quality of Every Consultation

There is a broader case here, and it applies across every age group.

When people arrive at appointments having thought carefully about their symptoms, their medications, and their goals — having used whatever tools are available to prepare — the quality of clinical encounters improves for everyone. The conversation becomes richer. The time is spent on what matters rather than on basic orientation.

It is not about patients knowing more than their clinicians. It is about the dialogue becoming more equal and more useful. A client who has used AI to understand a diagnosis, and who arrives with specific questions about what to monitor and which lifestyle adjustments have evidence behind them, is a client whose therapist can work at a genuinely deeper level. The groundwork is done. The session can go somewhere.

This is what the digital era, approached with some care and the right support, makes possible — not a population replacing professional judgement with algorithm output, but one that is curious, prepared, and engaged enough to make professional judgement more useful.

The technology is not the point. The conversation it enables is.


Frequently Asked Questions

Is it safe to use AI tools for health information? AI tools are generally safe and useful when treated as a research aid rather than a clinical authority. Their strongest application is understanding terminology, preparing questions for appointments, and exploring treatment options — all of which should then be discussed with a qualified professional before any decisions are made.

What is the best AI tool for health information? There is no single definitive tool, and recommendations evolve quickly. What matters more than the specific platform is the habit: use AI to sharpen your questions, then bring those questions to your doctor, therapist, or nurse. An allied health professional can help identify which tools suit your situation and confidence level.

How can I tell if health information from AI or the internet is reliable? Check whether the information aligns with guidance from recognised professional bodies. Be sceptical of content promoting specific products, contradicting mainstream clinical guidance, or coming from unidentified sources. When in doubt, raise it at your next appointment — that conversation is never a waste of anyone’s time.

Can an occupational therapist help me navigate digital health tools? Yes. Occupational therapists are increasingly active in digital health support — helping clients choose appropriate tools, build confidence with technology, and integrate digital information meaningfully into their care goals. This is one facet of social prescription: connecting people with resources that support their broader wellbeing.

What is digital health literacy and why does it matter? Digital health literacy is the ability to find, understand, evaluate, and use health information accessed through digital channels. Higher digital health literacy is consistently associated with better chronic disease management, greater engagement with preventive care, and reduced exposure to health misinformation.

How do I avoid health misinformation when using AI? Treat AI output as a conversation starter, not a final answer. Cross-check anything that surprises you. Ask AI which professional sources are authoritative on a given topic, then go directly to those sources. Bring significant findings to your healthcare team before acting on them.


Why choose Lifeweavers for private rehab therapy in Singapore?

Lifeweavers is Singapore’s most comprehensive private rehab therapy team. Our team holds joint case reviews, works from a single unified therapy plan, and adapts that plan together as you progress.

This is what gold-standard, coordinated rehabilitation looks like — and it is available at home, at our clinic, or both.
