Pomegranate Health
Ep99: When AI goes wrong
Episode 99
August 09, 2023
The Royal Australasian College of Physicians

This is the fourth part in a series on artificial intelligence in medicine, in which we try to unpick the causes and consequences of adverse events resulting from this technology. Our guest David Lyell is a research fellow at the Australian Institute of Health Innovation (Macquarie University) who has published a first-of-its-kind audit of adverse events reported to the US regulator, the Food and Drug Administration. He breaks down which events were caused by errors in the machine learning algorithm, which by other aspects of the device, and which by user error.
 
We also discuss where these errors fit into the four stages of human information processing, and whether this can inform determinations about liability. Uncertainty around the medicolegal aspects of AI-assisted care is one of the main reasons that practitioners report discomfort about the use of this technology. It's a question that hasn’t yet been well tested in the courts, though according to academic lawyer Rita Matulionyte, AI-enhanced devices don’t change the scope of care that has been expected of practitioners in the past.

Guests
Rita Matulionyte PhD (Macquarie Law School, Macquarie University; ARC Centre of Excellence for Automated Decision Making and Society; MQ Research Centre for Agency, Values and Ethics)
David Lyell PhD (Australian Institute of Health Innovation, Macquarie University; owner, Future Echoes Business Solutions)

Production
Produced by Mic Cavazzini DPhil. Music licensed from Epidemic Sound includes ‘Kryptonite’ by Blue Steel and ‘Illusory Motion’ by Gavin Luke. Music courtesy of Free Music Archive includes ‘Impulsing’ by Borrtex. Image by EMS-Forster-Productions licensed from Getty Images.

Editorial feedback kindly provided by physicians David Arroyo, Stephen Bacchi, Aidan Tan, Ronaldo Piovezan and Rahul Barmanray, and by RACP staff member Natasa Lazarevic PhD.

Key References
More than algorithms: an analysis of safety events involving ML-enabled medical devices reported to the FDA [Lyell, J Am Med Inform Assoc. 2023]
How machine learning is embedded to support clinician decision making: an analysis of FDA-approved medical devices [Lyell, BMJ Health Care Inform. 2021]
Should AI-enabled medical devices be explainable? [Matulionyte, Int J Law Inform Tech. 2022]

Please visit the Pomegranate Health web page for a transcript and supporting references. Log in to MyCPD to record listening and reading as a prefilled learning activity. Subscribe to new episode email alerts or search for ‘Pomegranate Health’ in Apple Podcasts, Spotify, Castbox or any podcasting app.



