GE HealthCare, Mass General Brigham to Collaborate on Foundation Models
Chicago-based GE HealthCare and Mass General Brigham are expanding their partnership to integrate medical imaging foundation models into their artificial intelligence research work, with a strong focus on responsible AI practices.
The organizations have been working together on AI solutions since 2017, when they announced a 10-year commitment to explore the use of AI across a broad range of diagnostic and treatment paradigms through sustainable AI development.
The traditional approach to integrating AI into healthcare systems requires retraining models to accommodate the unique requirements of different patient populations and hospital settings, the organizations said. This can increase costs and complexity, and hinder the broad adoption of AI technologies in the healthcare industry. Foundation models, which have demonstrated strong capabilities across a diverse set of tasks, have the potential to transform healthcare by improving workflow efficiency and imaging diagnosis, and have emerged as a reliable, adaptable base for developing AI applications tailored to the healthcare sector.
“The relationship between Mass General Brigham’s commercial AI business (Mass General Brigham AI) and GE HealthCare has helped accelerate the introduction of AI into a range of product offerings and digital health solutions. With foundation models, we are witnessing the next wave of AI innovation, and it is already reshaping how we build, integrate and use AI,” said Keith Dreyer, Ph.D., D.O., chief data science officer, Mass General Brigham, in a statement. “I think we are all optimistic that foundation models may actually complement and enhance the work we have been doing with convolutional neural networks over the past few years. Hopefully, this work will help make healthcare delivery more efficient for our practitioners, more accessible for our patients and more equitable for our diverse communities.”
“Adding foundation models to our research work, we will be able to take the next step of digital and AI transformation to develop technology innovations that provide better patient care and outcomes,” said Parminder Bhatia, Chief AI Officer, GE HealthCare, in a statement. "Incorporating responsible AI practices into this phase, we are committed to ensuring these innovations adhere to guidelines, prioritize patient safety and privacy, and promote fairness and transparency across all applications."
In 2017, GE HealthCare and Mass General Brigham began a 10-year commitment to explore the use of AI across a broad range of diagnostic and treatment paradigms. At the time, the organizations said the initial focus of the relationship would be on developing applications aimed at improving clinician productivity and patient outcomes in diagnostic imaging. Over time, the groups said they would create new business models for applying AI to healthcare and develop products for additional medical specialties such as molecular pathology, genomics and population health.
The first AI application from the collaboration is the schedule predictions dashboard of the Radiology Operations Module (ROM), a digital imaging tool that helps optimize scheduling, reduce costs, and free providers from administrative burden, allowing more time for the clinician-patient relationship, the groups said. ROM is commercially available to healthcare institutions.
Operational AI-enabled tools can address challenges that often pose a threat to patient care, such as the cost of care and hospital inefficiencies. When a patient misses an appointment, fails to schedule a follow-up or arrives late (known as missed care opportunities, or MCOs), the impact can be significant. The co-developed algorithm is intended to predict MCOs and late arrivals, which could help better accommodate urgent, inpatient or walk-in appointments. In preliminary tests, the algorithm correctly predicted missed care opportunities at rates of up to 96 percent, with limited false positives.