Could artificial intelligence write mental health nursing care plans?

JOURNAL OF PSYCHIATRIC AND MENTAL HEALTH NURSING (2024)

Abstract
Background: Artificial intelligence (AI) is being increasingly used and discussed in care contexts. ChatGPT has gained significant attention in popular and scientific literature, although how it can be used in care delivery is not yet known.

Aims: To use artificial intelligence (ChatGPT) to create a mental health nursing care plan and evaluate the quality of the output against the authors' clinical experience and existing guidance.

Materials & Methods: Basic text commands were input into ChatGPT about a fictitious person called 'Emily' who presents with self-injurious behaviour. The output from ChatGPT was then evaluated against the authors' clinical experience and current (national) care guidance.

Results: ChatGPT was able to provide a care plan that incorporated some principles of dialectical behaviour therapy, but the output had significant errors and limitations; there is therefore a reasonable likelihood of harm if it were used in this way.

Discussion: AI use is increasing in direct-care contexts through chatbots and other means. However, AI can inhibit clinician-to-care-recipient engagement, 'recycle' existing stigma, and introduce error, which may diminish the ability of care to uphold personhood and thereby lead to significant avoidable harms.

Conclusion: Use of AI in this context should be avoided until policy and guidance can safeguard the wellbeing of care recipients and the sophistication of AI output has increased. Given ChatGPT's ability to produce superficially reasonable output, there is a risk that errors may go unnoticed, increasing the likelihood of patient harm. Further research evaluating AI output is needed to consider how AI may be used safely in care delivery.
Keywords
art of nursing, nursing role, quality of care, self-harm, therapeutic relationships