
RBE PhD Speaking Qualifiers
Xihan Ma
Towards Automatic Robotic Ultrasound Imaging
December 7th, 2023
1:00 PM - 2:00 PM
Location: 50 Prescott Street, Room 3610
Abstract: The global COVID-19 pandemic has accentuated the pressing need for effective respiratory disease diagnosis. Traditional freehand ultrasound (US) procedures, while valuable, pose a significant risk of infection for healthcare workers due to close patient contact. To address this, robotic US systems (RUSS) have emerged as a promising solution. RUSS not only minimize physical contact but also offer potential advantages in standardizing and automating US imaging procedures. This can lead to enhanced diagnostic consistency across patients, reducing dependence on operator expertise and freeing sonographers from repetitive tasks. Our efforts in this domain encompass three key areas.
First, a scanning target localization pipeline is developed to preoperatively recognize the patient and estimate their pose using an RGB-D camera. The pose estimate enables automatic computation of the optimal position and orientation for placing the US probe following a standard lung US (LUS) protocol. Phantom evaluation reveals high scanning target localization accuracy, facilitating successful collection of US images showing lung pathological landmarks.
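To illustrate the idea, a minimal sketch follows, assuming the RGB-D pipeline yields a 4x4 homogeneous transform of a patient torso frame in the robot base frame and that protocol-defined scan points are given in torso coordinates; all names and values are illustrative, not the authors' implementation.

# Sketch: compose a probe placement target in the robot base frame from an
# estimated patient pose and a protocol-defined scan point (hypothetical API).
import numpy as np

def probe_target_from_patient_pose(T_base_torso, p_scan_torso, R_probe_torso):
    """Return a 4x4 probe pose in the robot base frame.

    T_base_torso  : 4x4 transform, torso frame expressed in robot base frame
    p_scan_torso  : 3-vector, protocol-defined scan point in torso coordinates
    R_probe_torso : 3x3 rotation, desired probe orientation in torso coordinates
    """
    T_torso_probe = np.eye(4)
    T_torso_probe[:3, :3] = R_probe_torso
    T_torso_probe[:3, 3] = p_scan_torso
    return T_base_torso @ T_torso_probe

# Example with made-up numbers: probe axis flipped to point into the body.
T_base_torso = np.eye(4); T_base_torso[:3, 3] = [0.5, 0.0, 0.3]
p_scan = np.array([0.0, -0.10, 0.0])
R_probe = np.diag([1.0, -1.0, -1.0])
print(probe_target_from_patient_pose(T_base_torso, p_scan, R_probe))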
Second, a novel robot end-effector design is introduced to perceive the local body geometry in real time with proximity sensors, providing intraoperative probe orientation updates that compensate for patient respiration and movement. Integrated with our RUSS, this active-sensing end-effector (A-SEE) enables the probe to be autonomously kept normal to the skin surface during imaging, maintaining optimal acoustic coupling.
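A minimal sketch of the underlying idea is given below: distance sensors arranged around the probe report range to the skin, and imbalances between opposite sensors drive small corrective rotations. The sensor layout, gains, and control interface here are assumptions for illustration, not the published A-SEE design.

# Sketch: proportional normal-alignment correction from four range readings.
def normal_alignment_step(d_front, d_back, d_left, d_right, gain=0.5):
    """Return (pitch, roll) corrections in radians from four ranges in meters.

    Equal opposite readings mean the probe is already normal to the surface;
    an imbalance is treated as a proportional error signal.
    """
    pitch_err = d_front - d_back   # tilt about the probe's lateral axis
    roll_err = d_right - d_left    # tilt about the probe's longitudinal axis
    return gain * pitch_err, gain * roll_err

# Example: the front sensor reads farther than the back, so the controller
# commands a small pitch correction; roll stays zero.
print(normal_alignment_step(0.032, 0.028, 0.030, 0.030))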
Finally, to enhance imaging consistency across patients, a standardized imaging plane (SIP) navigation framework is proposed. Leveraging the scanning target localization pipeline, which positions the probe near the SIP, this framework precisely guides the probe to the SIP using US image feedback. This is achieved by extracting anatomical features from real-time images and employing non-patient-specific template matching for probe motion control. Integration with A-SEE maintains optimal probe alignment with the contact surface, preserving US signal quality. Validation through phantom and in-vivo evaluations demonstrates the framework's high precision when navigating to the SIP.
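As a rough illustration of the image-feedback step, the sketch below uses normalized cross-correlation template matching to turn the offset between a matched landmark and the image center into an error signal for in-plane probe motion. The actual SIP framework's feature extraction, templates, and motion mapping are not shown; the data here are synthetic.

# Sketch: template-matching offset as a probe motion error signal.
import numpy as np
import cv2

def template_offset(us_image, template):
    """Return (dx, dy) pixel offset of the best template match from image center."""
    result = cv2.matchTemplate(us_image, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    match_cx = max_loc[0] + template.shape[1] / 2.0
    match_cy = max_loc[1] + template.shape[0] / 2.0
    return match_cx - us_image.shape[1] / 2.0, match_cy - us_image.shape[0] / 2.0

# Synthetic example: a bright blob stands in for an anatomical landmark.
img = np.zeros((200, 200), np.float32)
cv2.circle(img, (130, 90), 10, 1.0, -1)
tmpl = img[75:105, 115:145].copy()
print(template_offset(img, tmpl))  # the offset would drive probe translation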
In conclusion, these research efforts collectively underscore the development of robotic ultrasound systems that enhance the safety, standardization, and consistency of US procedures, particularly in the diagnosis of respiratory diseases through LUS. These systems aim to revolutionize the way US is conducted, with a focus on safety, accuracy, and accessibility.
Advisor:
Professor Chris Nycz, Worcester Polytechnic Institute (WPI)
Committee:
Professor Haichong (Kai) Zhang, Worcester Polytechnic Institute (WPI)
Professor Jing Xiao, Worcester Polytechnic Institute (WPI)