RBE MS Thesis Presentation
Chronic Wound Image Segmentation and Analysis of Camera Position-based Errors
Abstract: Lower extremity chronic wounds such as diabetic, venous, arterial, or pressure ulcers require regular monitoring and treatment to ensure proper healing. Smartphone-based image analysis systems have recently emerged as helpful tools for monitoring wounds and providing actionable feedback. Wound and skin segmentation is an important step in the wound image analysis pipeline, after which various wound attributes can be computed on the wound segment. This thesis makes three main contributions. First, we systematically compare Associative Hierarchical Random Fields (AHRF), a graphical-model-based segmentation algorithm, with U-Net, a Convolutional Neural Network approach, on wound images. We find that AHRF is more accurate on smaller datasets, while U-Net is more accurate on larger datasets and makes much faster inferences. Second, we investigate the performance improvements achievable with pre-processing methods such as Contrast Limited Adaptive Histogram Equalisation (CLAHE) and with Conditional Random Fields (CRF) for post-processing segmentation masks. Third, we explore imaging errors caused by camera position by photographing an artificial wound moulage from various camera positions with a robot arm, yielding a dataset of 17,000 images, and study the errors in measured wound area due to perspective warping. Analysis of AHRF and U-Net segmentation scores on this dataset reveals that U-Net is more robust to camera position overall, and that specular reflections degrade the performance of both algorithms.
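The CLAHE pre-processing step mentioned in the abstract can be illustrated with a short sketch. This is a simplified, hypothetical version for intuition only (it applies each tile's clipped-histogram mapping directly, omitting the bilinear interpolation between tile mappings that full CLAHE uses to avoid block artifacts), and is not the implementation used in the thesis:

```python
import numpy as np

def clahe_simplified(img, tiles=8, clip=0.03, nbins=256):
    """Simplified CLAHE: per-tile, contrast-limited histogram equalization.

    img: 2-D uint8 grayscale array whose height and width are divisible
    by `tiles`. `clip` is the fraction of a tile's pixels allowed in any
    single histogram bin before the excess is redistributed.
    """
    out = np.empty_like(img)
    h, w = img.shape
    th, tw = h // tiles, w // tiles
    for i in range(tiles):
        for j in range(tiles):
            block = img[i * th:(i + 1) * th, j * tw:(j + 1) * tw]
            hist, _ = np.histogram(block, bins=nbins, range=(0, 256))
            # Contrast limiting: clip the histogram at `limit` counts and
            # spread the clipped excess uniformly over all bins.
            limit = max(1, int(clip * block.size))
            excess = np.maximum(hist - limit, 0).sum()
            hist = np.minimum(hist, limit) + excess // nbins
            # Equalize within the tile via the normalized CDF.
            cdf = hist.cumsum()
            cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1)
            lut = np.round(cdf * 255).astype(np.uint8)
            out[i * th:(i + 1) * th, j * tw:(j + 1) * tw] = lut[block]
    return out
```

Libraries such as OpenCV and scikit-image provide full CLAHE implementations; the sketch above only shows why the method boosts local contrast while the clip limit keeps noise in flat regions from being over-amplified.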
Thesis Advisor: Prof. Emmanuel Agu
Thesis Committee: Prof. Michael Gennert, Prof. Carlo Pinciroli