Findings seen for prompts related to adolescent health emergencies
By Lori Solomon, HealthDay Reporter
MONDAY, Nov. 3, 2025 (HealthDay News) — Companion chatbots featured fewer safeguards related to adolescent health crises than general-assistant chatbots, according to a research letter published online Oct. 23 in JAMA Network Open.
Ryan C.L. Brewster, M.D., from Beth Israel Deaconess Medical Center in Boston, and colleagues examined the content policies of consumer chatbots and assessed their behaviors in response to simulated adolescent health crises. The analysis included transcripts of 75 conversations across 25 chatbots.
The researchers found that 15 chatbots (60 percent) were companion platforms and nine chatbots (36 percent) had age verification procedures. General-assistant chatbots more commonly included content policies for self-harm than companion chatbots (100 versus 46.7 percent). Chatbot responses received the highest ratings for understandability (81.3 percent) and empathy (62.7 percent) and lower ratings for appropriateness (46.7 percent). Most chatbots recognized the need for clinical escalation (60 percent), and 36 percent provided referrals to specific resources.

Performance differed significantly by chatbot type, with general-assistant chatbots more frequently providing empathetic (93.3 versus 42.2 percent), understandable (96.7 versus 71.1 percent), and appropriate (83.3 versus 22.2 percent) responses than companion chatbots. General-assistant chatbots also more often recognized the need for escalation (90 versus 40 percent) and provided resource referrals (73.3 versus 11.1 percent) than companion chatbots.
“Consumer chatbots present novel opportunities to support adolescents in crisis but must be responsibly developed and regulated to ensure such support is accurate, safe, and appropriate,” the authors write.
Copyright © 2025 HealthDay. All rights reserved.