September 10, 2021

AI-Powered Best Practice Recommendation Program

One radiology group leveraged artificial intelligence to significantly improve adherence to evidence-based guidelines.
  • Radiology Partners Inc. wanted to expand its best practice recommendation (BPR) program and knew it needed artificial intelligence (AI) to help.
  • With no data scientists in house at the time, the radiologists worked with an outside vendor before adding an in-house team.
  • The practice now has AI algorithms for five BPRs, helping it achieve 100% adherence to some guidelines and more than 90% adherence to others.

In 2015, Radiology Partners Inc. began rolling out a best practice recommendation (BPR) program to help its radiologists more consistently use evidence-based guidelines, such as the ACR Appropriateness Criteria, to make population-health-focused follow-up care recommendations. The group started with just three BPRs, which the radiologists manually referenced each time they read a correlating case. The approach reduced reporting variability as the radiologists made follow-up care recommendations based on guidelines that are known to improve care and decrease costs, but it was labor intensive.

The group knew that if it wanted to expand the program to include more BPRs, its radiologists would need help. That’s when it turned to artificial intelligence (AI). “We wanted the radiologists to have a digital assistant to help them use and apply the BPRs as we scaled the program,” says Nina Kottler, MD, MS, associate chief medical officer for clinical AI and vice president of clinical operations at Radiology Partners. “That meant creating an AI program that uses natural language processing to understand what the radiologists are saying as they dictate their reports and automatically identify the appropriate follow-up recommendations for each pathology. We looked around, and that kind of AI system didn’t exist, so we decided to create it.”
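Radiology Partners’ actual natural language processing system is proprietary and far more sophisticated, but the core idea Kottler describes, scanning dictated text for a finding and mapping it to a guideline-based follow-up recommendation, can be illustrated with a toy rule-based sketch. All patterns, thresholds, and recommendation strings below are hypothetical, not the actual criteria:

```python
import re

# Hypothetical guideline table: each entry pairs a finding pattern with a
# function that turns the matched measurement into a follow-up recommendation.
# These rules are illustrative only, not real clinical criteria.
BPR_RULES = [
    (re.compile(r"thyroid nodule[^.]*?(\d+(?:\.\d+)?)\s*cm", re.IGNORECASE),
     lambda size: "Recommend thyroid ultrasound follow-up."
     if float(size) >= 1.0 else "No follow-up imaging recommended."),
]

def suggest_followup(dictation: str) -> list[str]:
    """Return guideline-based suggestions triggered by the dictated text."""
    suggestions = []
    for pattern, recommend in BPR_RULES:
        match = pattern.search(dictation)
        if match:
            suggestions.append(recommend(match.group(1)))
    return suggestions
```

A production system would replace the regular expressions with a trained language model and a vetted guideline database, but the input/output contract is the same: dictated text in, candidate recommendations out.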
Nina Kottler, MD, MS, associate chief medical officer for clinical artificial intelligence (AI) and vice president of clinical operations at Radiology Partners, has led the integration of AI throughout the practice.
With the help of a team of data scientist consultants, the group began developing and implementing the AI system in 2017. Since then, the system has allowed the group to expand its BPR program to include recommendations that address abdominal aortic aneurysm, lung nodules, incidental thyroid nodules, ovarian cysts, inferior vena cava filters, COVID-19, and adrenal nodules. The program helped the group increase adherence to some guidelines by more than 80 percentage points, results that are encouraging the group to expand the program and its use of clinical AI.

“We’re using evidence-based research to determine which follow-up recommendations our radiologists should provide to referring clinicians and, ultimately, to patients. This widespread standardization of care wouldn’t be possible without AI,” says Kelly Denney, who started out as an AI consultant on the project and is now Radiology Partners’ director of data science and clinical analytics. “The technology gives our radiologists the extra support they need to drive added value in radiology.”

Forming a Partnership

When Radiology Partners decided that it needed AI to expand its BPR program, it didn’t have any data scientists on staff. Instead, the group’s information technology team recruited the help of a vendor. Working with the consultants, Kottler and her colleagues explained that they needed an AI system to identify the appropriate BPR based on each radiologist’s dictation. To start, the consultants worked on a BPR for incidental thyroid nodules, but things didn’t go as expected. “When the consultants showed us their initial presentation and the direction they were headed, I said, ‘This is all wrong,’” Kottler explains. “I talked to the head of our IT team and told him it wasn’t working and that we’re going to have to fire the vendor, because they don’t get it.”

Before dismissing the vendor and starting over, however, Kottler reconsidered how her team had been working with the consultants. “I realized that they are brilliant data scientists, but they don’t know anything about healthcare or radiology, and they probably don’t even know what the thyroid does,” Kottler says. “Instead of giving them requirements and telling them what to do, we realized that we had to create a partnership with them, so I actually flew out to their office in Columbus, Ohio, and spent time talking with them about how we work and what we were trying to achieve.”
Kelly Denney, director of data science and clinical analytics for Radiology Partners, is part of a team that helped build the group鈥檚 BPR algorithms.
Over three days, Kottler taught the consultants about anatomy, physiology, and thyroid function. She also discussed the language that radiologists use when dictating their reports and reviewed the best practice guidelines, why they were created, and their purpose. “That gave the consultants an idea of why we were creating this AI system, which was way more important than telling them specifically what to create,” Kottler says. In turn, the consultants taught the radiologists more about how natural language processing and AI work. “We all learned a lot about the necessary components to build a successful solution,” Denney says.

From there, the consulting team delivered an AI algorithm for the incidental thyroid nodule BPR that exceeded the radiologists’ expectations. “We decided from then on that was how we were going to work with the consulting team: Either I or one of my colleagues would spend a few days with them before we even started on a new BPR, orienting them to the anatomy and walking them through the best practice and why it was created,” Kottler says. “It was an iterative process of educating them about what we were doing and the meaning behind it so that they could apply that understanding to create an AI system that provided value to our work.”

The approach was such a success that after a year of working with the radiology team on a consulting basis, the AI team asked to join the radiology practice full time. “We asked Radiology Partners to hire us to continue the work on the tool that we’d created,” explains Denney, adding that the tool requires long-term management for sustained accuracy and updates as the BPRs evolve. “The collaborative working relationship we’d built with Nina and others from Radiology Partners was exactly the type of culture that made us want to come to work every day because it wasn’t really work; it was fun and rewarding.”

Radiology Partners agreed to hire the team. It now has eight data scientists and four clinical analysts on staff, and it plans to add more to further expand the BPR program. “Our chief executive officer and chief operations officer are strong believers in the value of the AI tool and understand the value of having a team of people on staff who are familiar with the tool and who can maintain it and update it appropriately,” says Kottler, noting that all AI tools require maintenance. “The business case was clear: The tool helps our radiologists adhere to our BPRs, resulting in improved population-health management and better patient follow-up.”

Integrating the Algorithms

Once Kottler and the consultants successfully developed the first AI algorithm, they turned their attention to integrating the tool into the radiologists’ workflow. This involved working with the radiology group’s IT team to ensure that the radiology workstations had the proper permissions to run the AI algorithm. “We needed to send the algorithm out to the workstations and incorporate it into the radiologists’ startup menu,” Kottler explains. “You would think that would be easy, but it actually takes some backend work to make sure your architecture is set up to deploy the algorithm. You have to get the tool on their workstations before you can roll it out clinically.”

As the team integrated the algorithm into one of Radiology Partners’ practices, Kottler began educating the radiologists about the tool. To start, she delivered a presentation to the radiology leaders at that practice. During the recorded presentation, Kottler presented slides, gave an overview of the technology, made the case for why the group urgently needed it, and answered questions. After that, she shared the recording with her team and then worked with them to conduct one-on-one, in-person training with each radiologist, making sure to follow up a few days later to see how they were doing. The approach worked well but required a lot of time and resources, Kottler says.
Kent Hutson, MD, CPE, director of innovation clinical operations for Radiology Partners, says that the algorithms integrate seamlessly into the radiologists鈥 workflow for improved patient care.
To make the training more manageable across Radiology Partners’ network of more than 60 local practices, especially during the COVID-19 pandemic, Kottler and her team began using a virtual meeting platform to train trainers who would in turn train the radiologists as the group deployed its tool to more of its local practices. “We combined a clinical trainer with an IT trainer, and they worked remotely to train each radiologist on their actual workstations,” Kottler says. Now that the radiologists are generally familiar with the tool, Kottler and her team are moving away from formal training sessions to making training videos that the radiologists watch on their own. “Each video is five to seven minutes long, and each one demonstrates how a certain feature of the AI tool works and what assumptions the tool is making,” Kottler explains. “Then after the radiologists pass a quiz, we roll the tool out for in-practice use.”

Generating Buy-In

As Kottler and her team deploy each additional algorithm, they run a pilot program to gather feedback from the radiologists about the tool’s performance, accuracy, and efficiency. A feedback mechanism within the AI system makes it easy for the radiologists to comment on the algorithms. “When we get that feedback, we reply to the radiologists to let them know that we’ve heard them and that we will make any necessary changes,” Kottler says. “We also conduct a one-question survey that asks the radiologists to rate from zero to 10 how likely they are to recommend the product to a friend, client, or colleague. We’ve gotten really high scores from the radiologists.”
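The zero-to-10 "would you recommend" question Kottler describes is the standard Net Promoter Score format, and scoring it is simple. A hypothetical sketch, not the group's actual analytics code:

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score: percent promoters (9-10) minus percent detractors (0-6).

    Passives (7-8) count toward the total but neither add nor subtract.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)
```

For example, responses of 10, 9, 8, and 3 yield two promoters, one passive, and one detractor, for a score of 25.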

In addition to soliciting feedback from the radiologists, Kottler and her team work with radiologist champions at each of Radiology Partners鈥 practices. To recruit these champions, Kottler and her team share a list of responsibilities with practice leaders and ask them to identify radiologists who they think would be a good fit for the role. The responsibilities include educating radiologists about the available programs, encouraging the radiologists to use the AI algorithms, helping to track how well each radiologist is adhering to the BPRs, and following up with radiologists who fail to consistently apply the BPRs.

“People tend to respect and listen to the people they work with directly, so you need local practice leaders to follow up with the radiologists to make sure they’re using the algorithms,” Kottler says. “To make them effective, you need to arm those champions with information about who is and who is not using the tool, specifically for those rads who could improve on BPR adherence. Then, the champions can encourage those rads to use the tool. This type of local rad-to-rad discussion tends to be very effective.”

For the radiologists, using the tool is relatively simple, says Kent Hutson, MD, CPE, director of innovation clinical operations for Radiology Partners. As the radiologists dictate their reports, the algorithms work in the background, listening for terms that correlate with the BPRs. Depending on the radiologist’s preferences, the tool’s window will slide into view as soon as it identifies a correlating BPR, or it will wait until the radiologist gets to the impression section of their report and then pop up with the BPR follow-up information. “At that point, the radiologist looks at the recommendation and clicks a thumbs-up button, indicating that they agree with the algorithm, before inserting the recommendation into their report. If they disagree with the generated recommendation, they click a thumbs-down button and leave the information out of their report,” Hutson explains.
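The thumbs-up/thumbs-down interaction Hutson describes reduces to a simple accept-or-log decision: an accepted recommendation is inserted into the report, and a declined one is recorded so the data science team can review it. A schematic sketch with hypothetical names, not the actual system:

```python
from dataclasses import dataclass, field

@dataclass
class Report:
    impression: str
    feedback_log: list = field(default_factory=list)

def apply_bpr(report: Report, recommendation: str, agrees: bool) -> Report:
    """Insert the suggested follow-up if the radiologist clicks thumbs up;
    otherwise record the disagreement for later review."""
    if agrees:
        report.impression += " " + recommendation
    else:
        report.feedback_log.append(("declined", recommendation))
    return report
```

Logging declined suggestions, as sketched here, is what makes the feedback loop described in the previous section possible: every disagreement becomes a data point for tuning the algorithm.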

Focusing on Impact

The BPR program has helped the radiologists leverage evidence-based guidelines more consistently. For instance, Kottler says that before the first Radiology Partners practice instituted the program, its radiologists followed the abdominal aortic aneurysm guidance 4% of the time, the ovarian cyst guidance 4% of the time, and the incidental thyroid nodule guidance 56% of the time. Two weeks after the first practice implemented the AI-enabled BPR program, that group's adherence to the BPR for abdominal aortic aneurysm increased to 92%, adherence to the BPR for ovarian cysts increased to 100%, and adherence to the BPR for thyroid nodules increased to 99%. “This program is helping us reduce variation and improve quality across the board for all of our radiologists,” Hutson says. “These value-enabling activities are what we need to pursue in healthcare. That’s where we want to go with medicine in general. We want quality improvement initiatives that are evidence based and applied transparently so that clinicians can act on the best evidence.”

Recognizing that it would have been unlikely to achieve these results without AI, Radiology Partners began expanding its use of the advanced technology in 2019. While the group’s internal AI team focused on the BPR program, Radiology Partners worked with a vendor to pilot two algorithms for detecting and triaging intracranial hemorrhage and pulmonary embolism, two potentially life-threatening conditions.

The results of the pilot showed that the algorithms helped the radiologists detect 2.4% more intracranial hemorrhage findings and 4.4% more pulmonary embolism findings, and the radiologists gave the tools a satisfaction rating of 8.7 out of 10. “We ran both of those detection algorithms over six months across a large part of our practice,” Kottler says. “We saw great results across the board and decided that this was something that we wanted to make available to everyone.” In the spring of 2021, the group entered into a contract to offer the vendor’s seven FDA-cleared algorithms across its network. (For a list of FDA-cleared algorithms, visit the ACR Data Science Institute’s website.)

While these new algorithms assist with imaging interpretation, Radiology Partners is also adding AI for noninterpretive activities. For instance, the group is piloting a natural language processing algorithm that automatically creates the impression section of the radiology report for X-ray, CT, and MRI, as well as an algorithm designed to help the radiologists provide more coordinated care by ensuring that cases requiring follow-up actually get that follow-up. “Up to this point, we’ve been mainly focused on using AI as an assistant to help the radiologist as they interpret exams. But now we’re looking forward and backward in that workflow, from the order all the way through to the interpretation and then even follow-up and peer learning, to see where else AI can drive value,” Denney explains. “For AI to be valuable, it has to improve patient care. It’s not just a cool use of technology; it’s what matters to our patients.”

With this in mind, Kottler encourages more radiologists to get comfortable using AI to ensure that these tools have a positive impact on patient care. “This will change our profession,” she says. “As radiologists, we will become the information experts who provide context for the massive amount of data that AI and other information-extracting technologies (radiomics, genomics, molecular imaging, etc.) present and who make the data actionable for referring physicians and patients. We should be driving that change. To do that, we must take the wheel and embrace this technology because we can’t drive anything from the backseat.”

Creative Commons

AI-Powered Best Practice Recommendation Program by the ACR is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Based on a work at www.acr.org/imaging3. Permissions beyond the scope of this license may be available at www.acr.org/Legal.


Share Your Story

Have a case study idea you’d like to share with the radiology community? To submit your idea, please click here.

Now It's Your Turn

Follow these steps to begin integrating AI into your group’s workflow for improved patient care, and tell us how you did at imaging3@acr.org or on Twitter at #imaging3.

  • Identify a use case for AI in your practice and look to see whether the tools exist for your use case.
  • Collaborate closely with data scientists to ensure the AI tools provide the value you need to improve patient care.
  • When implementing algorithms, collect radiologist feedback and respond to the feedback to help generate buy-in.

Author

Jenny Jones, Imaging 3.0 Manager

Join the Discussion


Want to join the discussion about how radiologists can use AI to lead quality improvement projects? Let us know your thoughts on Twitter at #imaging3.
