Instrument tracking via online learning in retinal microsurgery

Med Image Comput Comput Assist Interv. 2014;17(Pt 1):464-71. doi: 10.1007/978-3-319-10404-1_58.

Abstract

Robust visual tracking of instruments is an important task in retinal microsurgery. In this context, the instruments undergo a large variety of appearance changes due to illumination variations and other conditions during a procedure, which makes the task very challenging. Most existing methods require a sufficient amount of labelled training data, yet they perform poorly on appearance changes that are unseen in that data. To address these problems, we propose a new approach for robust instrument tracking. Specifically, we adopt an online learning technique that collects appearance samples of the instrument on the fly and gradually learns a target-specific detector. Online learning enables the detector to reinforce its model and become more robust over time. The performance of the proposed method has been evaluated on a fully annotated dataset of instruments in in-vivo retinal microsurgery and on a laparoscopy image sequence. In all experiments, the proposed tracking approach shows superior performance compared to several state-of-the-art approaches.
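
To illustrate the general idea of tracking-by-detection with online updates, a minimal sketch follows. It is not the paper's specific detector or feature set (the abstract does not spell those out); it assumes a linear scoring model over hand-supplied patch features, and the feature dimension, learning rate, and per-frame candidate extraction helper named in the comments are hypothetical placeholders.

```python
import numpy as np

class OnlineInstrumentTracker:
    """Sketch of tracking-by-detection with online appearance learning.

    A linear scoring model w is applied to candidate image patches around
    the previous instrument position; the best-scoring patch becomes the
    new estimate, and w is refined with that positive sample plus nearby
    negatives (a simple hinge-loss / SGD-style update).
    """

    def __init__(self, feature_dim, learning_rate=0.05):
        self.w = np.zeros(feature_dim)   # target-specific appearance model
        self.lr = learning_rate

    def track(self, candidate_features):
        """Pick the candidate patch whose features score highest.

        candidate_features: array of shape (num_candidates, feature_dim).
        Returns the index of the best candidate and its score.
        """
        scores = candidate_features @ self.w
        best = int(np.argmax(scores))
        return best, scores[best]

    def update(self, pos_feature, neg_features):
        """Reinforce the model with samples collected on the fly."""
        # Push the positive (tracked) sample's score up if its margin is small ...
        if pos_feature @ self.w < 1.0:
            self.w += self.lr * pos_feature
        # ... and push down nearby negative samples that still score too high.
        for f in neg_features:
            if f @ self.w > -1.0:
                self.w -= self.lr * f


# Hypothetical per-frame loop (features would come from image patches):
# tracker = OnlineInstrumentTracker(feature_dim=256)
# for frame in video:
#     feats = extract_candidate_features(frame, prev_position)  # assumed helper
#     idx, score = tracker.track(feats)
#     tracker.update(feats[idx], np.delete(feats, idx, axis=0))
#     prev_position = candidate_positions[idx]
```

Because the model is updated after every frame, appearance variations that were never present in any offline training set can still be absorbed as the sequence progresses, which is the motivation for the online formulation described above.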

MeSH terms

  • Artificial Intelligence*
  • Humans
  • Microsurgery / instrumentation
  • Microsurgery / methods*
  • Online Systems
  • Ophthalmologic Surgical Procedures / instrumentation
  • Ophthalmologic Surgical Procedures / methods*
  • Pattern Recognition, Automated / methods*
  • Reproducibility of Results
  • Retina / anatomy & histology*
  • Retina / surgery*
  • Sensitivity and Specificity
  • Surgery, Computer-Assisted / methods*