At its recent seminar, the National League for Nursing (NLN) emphasized the importance of AI literacy, ethical integration, and curricular redesign. That call resonates with those of us who stand at the intersection of nursing, education, and technology. Preparing the next generation of nurses means preparing them to navigate new tools with confidence while never losing sight of the people they serve.
As Director of Mental Health and Wellness at the Global Nursing & AI Alliance (GNAA), I see this as especially critical for mental health. Technology can widen access and flag early warning signs, but only if it is guided by compassion and equity. Our mission, AI for Better Health, is built on one belief: innovation should make care feel more human, not less.

AI Literacy Through a Mental Health Lens
AI is already here. The question is: Are we ready to use it wisely?
- Ellipsis Health and Cogito Companion listen for shifts in tone that may signal stress or depression.
- Mindstrong studies smartphone use and biometrics to spot cognitive or mood changes.
- Woebot Health, Wysa, and AI features in Headspace and Calm provide accessible mental health coaching.
- The Apple Watch tracks sleep and heart rhythms, adding another layer of insight.
These innovations are powerful. But they also raise a deeper question: How do we ensure AI enhances care without replacing connection?
AI literacy is not just technical. It is emotional, ethical, and relational. It is about knowing when to trust an alert and when to look a patient in the eye and simply listen.
“AI can guide us toward earlier detection, but empathy remains the foundation of healing.”

Safeguarding Equity and Dignity
Bias in AI is not a distant possibility. It is here, and it has consequences. When data sets are too narrow, algorithms may misinterpret cultural nuances or miss warning signs in vulnerable populations. Older adults in rural areas are especially at risk.
A recent review in AI in Healthcare revealed:
- Studies often exclude rural populations.
- Diagnostic accuracy is emphasized, while lived experience is overlooked.
- Real-world barriers such as broadband access, transportation, and stigma are rarely addressed (Shiroma & Miller, 2025).
The Peterson Health Technology Institute (PHTI) report adds another layer. It found that:
- Virtual solutions can improve outcomes for mild to moderate anxiety and depression.
- Engagement is key, but hard to sustain.
- Payment models often shift costs to employers and health plans.
- Stronger evidence and outcome-based reimbursement are needed (PHTI, 2025).
As a board-certified Adult-Gerontology Nurse Practitioner, I have seen how older adults engage with technology in diverse ways. Accuracy matters, but so do trust, access, and context. If those pieces are missing, even the best-designed tools may alienate the very people they are meant to serve.
At GNAA, equity is not an afterthought. It is the center. We challenge ourselves and others to ask:
- Who is represented in the data?
- Does this tool reflect lived experience?
- Are we advancing innovation while protecting dignity?
“Bias in algorithms is not just a technology problem. It is a health equity problem.”

Rethinking How We Teach
Education must move as fast as practice. Picture this: students receive an AI-generated alert about suicide risk. What happens next? Their challenge is not only to interpret the data but to decide:
- How do I approach this patient?
- How do I validate their experience?
- How do I safeguard their dignity?
At GNAA, we design workshops that let students and professionals practice these scenarios. Our goal is simple but essential: help nurses combine technical skill with human presence.
“Imagine students practicing scenarios where AI flags a patient at risk. How they respond, both clinically and ethically, is where the true learning begins.”

Building a Community of Voices
The future of AI in healthcare will not be written by algorithms alone. It will be written by people: educators, clinicians, technologists, and patients working together.
At GNAA, we are building that space: a place where diverse voices guide how technology moves forward in service of mental health and nursing education.
Technology may extend our reach. Trust, however, is built through connection, and connection is what patients remember. Our opportunity is to ensure AI strengthens resilience, expands equity, and nurtures holistic wellbeing while keeping people at the center.
“Technology can extend reach, but trust still requires human connection.”

By: Dr. Oniel Laucella, DNP (Mental Health Division Director)

References
Ellipsis Health. (n.d.). Voice as a vital sign. https://www.ellipsishealth.com
Mindstrong. (n.d.). Mindstrong health. https://mindstrong.com
National League for Nursing. (2025). Vision statement on artificial intelligence in nursing education. https://www.nln.org
Peterson Health Technology Institute. (2025, September). Assessment: Virtual solutions for anxiety and depression. https://phti.org/assessment/virtual-solutions-anxiety-depression
Shiroma, T., & Miller, K. (2025, February 18). Lit review identifies glaring gap involving older Americans and healthcare AI. AI in Healthcare. https://aiin.healthcare/topics/artificial-intelligence/lit-review-identifies-glaring-gap-involving-older-americans-and-healthcare-ai
Woebot Health. (n.d.). Your mental health ally. https://woebothealth.com
Wysa. (n.d.). AI mental health support. https://www.wysa.io