Radboud University Medical Center Nijmegen
EDL P16-25 P5: DL for Human and Animal Health
Real-time monitoring of patients during MRI-guided surgery
MRI-guided surgery allows for minimally invasive procedures that precisely target pathology while preserving body integrity. It relies on the insertion of small surgical instruments that must be guided towards a targeted treatment area under MRI guidance. Navigating these instruments is challenging due to the combination of low-resolution 2D views and a recurring, slow manual viewpoint-repositioning step, performed by a technician in the control room, to keep the instruments within view.
An automated tracking system could drastically reduce the delay of the viewpoint-repositioning step and thus increase the overall efficiency of MRI-guided surgery. Deep learning has shown promising results in interpreting medical images for diagnostic purposes and could potentially perform this tracking task. However, applying deep learning to automated MRI viewpoint repositioning is not trivial: it requires real-time 3D motion prediction from 2D image streams whose spatial position must itself be adjusted on the fly.
We aim to establish a bi-directional connection between a deep learning tracking algorithm and the MRI scanner. Our research focuses on the following EDL research lines to make deep learning suitable for real-time monitoring of patients. First, R5: real-time inference, to minimize the computational delay of the automated viewpoint-repositioning step. Second, R2: integrating spatial and temporal information for 3D motion estimation. Third, R2: combining 3D patient-specific anatomical prior information with real-time 2D MRI image streams to increase tracking and navigation robustness.
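The project text does not specify an implementation, but the closed loop it describes (acquire a 2D slice, let a model estimate where the instrument tip sits relative to the slice, then recentre the scan plane) can be illustrated with a toy simulation. Everything below is a hypothetical sketch: `acquire_slice` is a simulated scanner, and `estimate_offset` is a simple intensity-based stand-in for the learned tracker, not the project's actual model.

```python
import numpy as np

def acquire_slice(tip, plane_z, size=64, thickness=4.0):
    """Simulated 2D slice: the instrument tip appears as a bright pixel
    whose intensity fades with its through-plane distance from plane_z."""
    img = np.zeros((size, size))
    dz = abs(tip[2] - plane_z)
    if dz < thickness:
        x, y = int(round(tip[0])), int(round(tip[1]))
        if 0 <= x < size and 0 <= y < size:
            img[y, x] = 1.0 - dz / thickness
    return img

def estimate_offset(img, thickness=4.0):
    """Stand-in for the learned model: infer the through-plane offset
    magnitude from the tip's peak intensity. A real tracker would
    recover the sign from temporal context; here we assume +z motion."""
    peak = img.max()
    if peak == 0.0:
        return None  # tip not visible in this slice
    return thickness * (1.0 - peak)

def track(tip_path, plane_z=0.0, thickness=4.0):
    """Closed-loop repositioning: after each slice, shift the scan plane
    by the estimated offset so the tip stays in view. Returns the
    through-plane tracking error after each update."""
    errors = []
    for tip in tip_path:
        img = acquire_slice(tip, plane_z, thickness=thickness)
        offset = estimate_offset(img, thickness)
        if offset is not None:
            plane_z += offset  # automated viewpoint repositioning
        errors.append(abs(tip[2] - plane_z))
    return errors
```

In this toy setting a tip advancing 0.5 units per frame is tracked with zero residual error, whereas a static scan plane lets the tip drift out of the 4-unit slice thickness within ten frames; the point is only to show the feedback structure, not a realistic detector.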
Real-time AI-steered MRI-guided guidewire tracking. Patrick Brand, Han Nijsink, Tristan de Boer, Jurgen Fütterer, Henkjan Huisman. Poster at ICT.Open 2021.