Ethical use of Artificial Intelligence in Health Professions Education: AMEE Guide No. 158

Ken Masters*

*Corresponding author for this work

Research output: Contribution to journal · Article · peer-review

41 Citations (Scopus)

Abstract

Health Professions Education (HPE) has benefitted from the advances in Artificial Intelligence (AI) and is set to benefit more in the future. Just as any technological advance opens discussions about ethics, so the implications of AI for HPE ethics need to be identified, anticipated, and accommodated so that HPE can utilise AI without compromising crucial ethical principles. Rather than focussing on AI technology, this Guide focuses on the ethical issues likely to face HPE teachers and administrators as they encounter and use AI systems in their teaching environment. While many of the ethical principles may be familiar to readers in other contexts, they will be viewed in light of AI, and some unfamiliar issues will be introduced. They include data gathering, anonymity, privacy, consent, data ownership, security, bias, transparency, responsibility, autonomy, and beneficence. For each topic, the Guide explains the concept and its importance and gives some indication of how to cope with its complexities. Ideas are drawn from personal experience and the relevant literature. In most topics, further reading is suggested so that readers may further explore the concepts at their leisure. The aim is for HPE teachers and decision-makers at all levels to be alert to these issues and to take proactive action to be prepared to deal with the ethical problems and opportunities that AI usage presents to HPE.

Original language: English
Pages (from-to): 574-584
Number of pages: 11
Journal: Medical Teacher
Volume: 45
Issue number: 6
DOIs
Publication status: Published - Mar 13 2023

Keywords

  • artificial intelligence
  • ChatGPT
  • Ethics
  • health professions education
  • medical education
  • Privacy
  • Humans
  • Artificial Intelligence
  • Health Occupations

ASJC Scopus subject areas

  • Education
