How One ICU and ER Doctor is Using AI Today!
Ever read about applications of artificial intelligence in medicine, only to find they weren’t methods you could actually use right now? Or that they were in use, but not in a widely applicable way?
Today that’s going to change.
We had the opportunity to chat with practising ICU and ER clinician and AI enthusiast, Dr Sameer Shaikh, who’s found many ways to use AI to save time and increase efficiency in his work. Here’s what we found.
GPTs at the bedside
One of the often-overlooked applications of artificial intelligence in medicine is custom GPTs. With a ChatGPT Plus account, you can create custom GPTs to answer questions based on data that you provide to them.
The workflow is simple: you create the GPT on your desktop computer, then access it at work via the ChatGPT mobile app on your phone. Dr Shaikh says,
It allows for the creation of personalised GPTs without any coding knowledge… We all have trusted open-access resources and personal notes; now, these can be integrated into custom GPTs within minutes. These personalised GPTs can be used at the bedside, offering immediate access to trusted information without the need to scroll through articles. I have created several GPTs myself…
Study materials for residents and junior learners
Monic.AI is a tool he likes for its ease of use to create quizzes, flashcards, and summaries on any topic, all based on whatever document you upload to the system. He says a big benefit of doing things this way is that it’s an efficient method for creating study materials that lets you “spend most of your time learning instead of creating content.”
AI-generated consultation notes
Nabla is a HIPAA-compliant AI-powered software that listens to your entire conversation with your patient (up to 3 hours long) and generates your consultation note automatically. It’s currently integrated with EHRs like NextGen and Opus. The company’s website says Epic is coming soon.
Dr Shaikh said you get 30 consultations each month for free, and that Nabla provides “a hands-on opportunity to evaluate the strengths and limitations of AI in real-time clinical settings.” (Interns and residents get unlimited consultations.) Paid plans, which offer unlimited consultations for everyone, start at 119 USD per month for the Pro plan.
AI-generated teaching cases for residents
“AI tools like ChatGPT have revolutionised the way I develop teaching cases,” said Dr Shaikh. He uses OpenAI’s ChatGPT to assist in educating residents via two methods.
Firstly, he uses ChatGPT to generate the most important learning points for clinical cases, customised to match the learner’s level. Here’s an example of the type of prompt he uses:
And here’s an example of the type of output you can expect:
Secondly, he uses ChatGPT to create simulation cases, and said that this process saves him hours. He describes how the AI “…can introduce complexities to the scenarios and suggest how the simulation should flow and create a list of debriefing points.” Dr Shaikh explains that it’s especially useful for mimicking situations you’d come across in the real world.
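If you find yourself reusing the same kind of prompt for every case, it can help to keep it as a reusable template. Here is a minimal Python sketch of that idea; the wording, parameter names, and sample case below are our own illustration, not Dr Shaikh’s actual prompts.

```python
# Illustrative sketch only: a reusable prompt template for generating
# level-appropriate learning points from a clinical case. All wording and
# parameters here are hypothetical examples, not a specific clinician's prompts.

TEMPLATE = (
    "You are an experienced {specialty} attending. For the clinical case below, "
    "list the {n_points} most important learning points, pitched at the level "
    "of a {learner_level}. Keep each point to one or two sentences.\n\n"
    "Case:\n{case_summary}"
)

def build_learning_points_prompt(case_summary, learner_level="junior resident",
                                 specialty="ICU and emergency medicine",
                                 n_points=5):
    """Fill the template with the case details and the target learner level."""
    return TEMPLATE.format(
        specialty=specialty,
        n_points=n_points,
        learner_level=learner_level,
        case_summary=case_summary.strip(),
    )

# Example usage: the same case can be re-pitched for different learners
# by changing a single argument.
prompt = build_learning_points_prompt(
    "A 68-year-old with septic shock secondary to pneumonia, now on vasopressors.",
    learner_level="medical student",
)
print(prompt)
```

The resulting string is what you would paste into ChatGPT (or send via an API); keeping the learner level as a parameter makes it easy to regenerate the same case for students, interns, or senior residents.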
He emphasises that AI is just an initial step, and that as a clinician, he still needs to refine the output based on his own experience. Also, he cautions that, “It’s important to cross-check the information with reliable sources to avoid potential inaccuracies.”
4 important considerations with applications of artificial intelligence in medicine
Dr Shaikh explained the importance of keeping the following limitations of AI in mind:
1.) You need to proofread and verify the output: He cautions that these tools can’t replace dedicated medical resources like UpToDate or DynaMed, and says their output “should be meticulously edited and fact-checked against trusted knowledge and resources.”
2.) Remember that they aren’t compliant with HIPAA: Due to the lack of HIPAA compliance, don’t share information with ChatGPT that could be connected with or identify a patient.
3.) Stay sceptical: Approach AI-generated content with scepticism—don’t accept it at face value, and always check the AI’s output for inaccuracies or errors.
4.) Be mindful of the risk for junior learners: Learners who don’t yet have a strong foundation in clinical knowledge may not be able to identify incorrect or misleading information in the AI’s output. So, he recommends that they use AI for summarising or organising information from verified and trusted sources. Don’t use them as a primary source of knowledge.
Want to become a pro at prompting, and consistently get usable results? Be sure to check out Medmastery’s AI prompting course, and learn techniques you can apply across the ever-growing range of AI tools.