Amblyopia information provided by chatbots – how readable and understandable is this?
Reviewed by Fiona Rowe

1 April 2025 | Fiona Rowe (Prof) | EYE - Paediatrics, EYE - Strabismus

The purpose of this study was to evaluate the understandability, actionability and readability of responses to frequently asked questions about amblyopia provided by ChatGPT-3.5, Bing Chat, Bard and the American Association for Pediatric Ophthalmology and Strabismus (AAPOS) website, along with the accuracy of the chatbot responses. The term amblyopia was searched along with three questions covering the definition and cause of amblyopia, treatment methods, and prognosis. The Patient Education Materials Assessment Tool (PEMAT) was used to assess the understandability and actionability of the patient information. The Readable app was used to assess readability, alongside the Flesch reading ease (FRE), Flesch-Kincaid grade level and Coleman-Liau index scores. A total of 75 (25 × 3) questions were directed to the chatbots. Responses were categorised by two independent referees, with a 98.6% agreement rate. Appropriate responses were 84% each for ChatGPT and Bard, and 80% for Bing Chat, a non-significant difference. None provided appropriate responses to the questions on what amblyopia is, refractive error and vision screening. The intraclass correlation between the two referees' PEMAT scores was 0.89. Understandability scores were 81.5% for AAPOS, 76.1% for ChatGPT, 77.6% for Bard and 71.5% for Bing Chat; actionability scores were 74.6%, 67.8%, 69.2% and 64.8% respectively. AAPOS scores were significantly higher and Bing Chat scores were the lowest. FRE scores were 48.7, 30.5, 58.4 and 44.9 respectively. The authors conclude that all three chatbots provided acceptable levels of appropriate responses and that the AAPOS website was the most understandable. Chatbots have the potential to enhance patient education but require improved information quality and readability.
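For context, the readability indices reported above are simple functions of sentence, word and syllable counts, and an FRE score below about 60 is generally regarded as harder than plain English, which is why scores in the 30–49 range flag the material as difficult for many patients. The short Python sketch below is illustrative only and not part of the study; it applies the standard published formulas (with a rough vowel-group syllable heuristic) to a sample piece of patient information.

import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels (approximate only).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    n_words = len(words)
    n_letters = sum(len(w) for w in words)
    n_syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = n_words / sentences
    syllables_per_word = n_syllables / n_words
    # Flesch reading ease: higher = easier to read.
    fre = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
    # Flesch-Kincaid grade level: approximate US school grade required.
    fkgl = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
    # Coleman-Liau index: based on letters and sentences per 100 words.
    letters_per_100 = 100 * n_letters / n_words
    sentences_per_100 = 100 * sentences / n_words
    cli = 0.0588 * letters_per_100 - 0.296 * sentences_per_100 - 15.8
    return fre, fkgl, cli

sample = ("Amblyopia, sometimes called lazy eye, is reduced vision in one eye "
          "because the eye and the brain are not working together properly.")
print(readability(sample))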

The performance of chatbots and the AAPOS website as a tool for amblyopia education.
Dogan L, Ozcakmakci GB, Yilmaz IE.
JOURNAL OF PEDIATRIC OPHTHALMOLOGY AND STRABISMUS
2024;61(5):325–31.
CONTRIBUTOR
Fiona Rowe (Prof)

Institute of Population Health, University of Liverpool, UK.
