In September 2017, Intuitive Surgical Inc. ran four challenges at the Medical Image Computing and Computer Assisted Intervention conference (MICCAI 2017), held in Quebec City, Canada. My labmate Yun-Hsuan Su and I participated in the Robotic Instrument Segmentation Sub-Challenge, part of the larger Endoscopic Vision Challenge.
We placed 8th among the participating teams. We approached the problem with traditional computer vision, using features such as color (the opponent channels Oppo1 and Oppo2, hue, and saturation), GrabCut, blur level, edges, and depth. Drawing on prior knowledge about where instruments appear in the frame, we added a border constraint on contours to suppress noise. Our segmentation achieved an average Dice coefficient of 0.716 on the test dataset. A sample output of the process is shown below:
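For reference, the Dice coefficient used to score the challenge measures the overlap between a predicted mask and the ground-truth mask. A minimal sketch of the metric (not the challenge's official evaluation code; function and variable names here are illustrative):

```python
import numpy as np

def dice_coefficient(pred, target):
    """Dice coefficient between two binary masks (1 = instrument, 0 = background)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    denom = pred.sum() + target.sum()
    if denom == 0:
        return 1.0  # convention: two empty masks count as perfect agreement
    return 2.0 * intersection / denom

# Toy example: 3 foreground pixels in each mask, 2 of them overlapping
pred = np.zeros((4, 4), dtype=np.uint8)
target = np.zeros((4, 4), dtype=np.uint8)
pred[0, 0:3] = 1
target[0, 1:4] = 1
print(dice_coefficient(pred, target))  # 2*2 / (3+3) ≈ 0.667
```

A score of 1.0 means the predicted and ground-truth masks are identical, while 0.0 means no overlap at all; our 0.716 average sits between those extremes.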
Building on this work, we are now combining deep learning with traditional computer vision to improve performance on a limited dataset.