Improving instrument detection for a robotic scrub nurse using multi-view voting

dc.identifier.uri http://dx.doi.org/10.15488/15381
dc.identifier.uri https://www.repo.uni-hannover.de/handle/123456789/15501
dc.contributor.author Badilla-Solórzano, Jorge
dc.contributor.author Ihler, Sontje
dc.contributor.author Gellrich, Nils-Claudius
dc.contributor.author Spalthoff, Simon
dc.date.accessioned 2023-11-21T05:43:46Z
dc.date.available 2023-11-21T05:43:46Z
dc.date.issued 2023
dc.identifier.citation Badilla-Solórzano, J.; Ihler, S.; Gellrich, N.-C.; Spalthoff, S.: Improving instrument detection for a robotic scrub nurse using multi-view voting. In: International Journal of Computer Assisted Radiology and Surgery 18 (2023), pp. 1961-1968. DOI: https://doi.org/10.1007/s11548-023-03002-0
dc.description.abstract Purpose: A basic task of a robotic scrub nurse is surgical instrument detection. Deep learning techniques can potentially address this task; nevertheless, their performance is subject to some degree of error, which may render them unsuitable for real-world applications. In this work, we demonstrate that combining a trained instrument detector with an instance-based voting scheme that considers several frames and viewpoints yields a substantial improvement in the instrument detection task. Methods: We exploit the typical setup of a robotic scrub nurse to collect RGB data and point clouds from different viewpoints. Using trained Mask R-CNN models, we obtain predictions from each view. We propose a multi-view voting scheme based on predicted instances that combines the gathered data and predictions to produce a reliable map of the locations of the instruments in the scene. Results: Our approach reduces the number of errors by more than 82% compared with the single-view case. On average, data from five viewpoints are sufficient to infer the correct instrument arrangement with our best model. Conclusion: Our approach can drastically improve an instrument detector's performance. Our method is practical and can be applied during an actual medical procedure without negatively affecting the surgical workflow. Our implementation and data are made available to the scientific community (https://github.com/Jorebs/Multi-view-Voting-Scheme). eng
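The voting idea described in the abstract can be illustrated as per-instance majority voting across viewpoints. This is a minimal hypothetical sketch, not the paper's actual implementation: it assumes instrument instances have already been matched across views, and all names (`multi_view_vote`, the instance IDs, the labels) are illustrative.

```python
from collections import Counter

def multi_view_vote(per_view_predictions):
    """Aggregate per-instrument class predictions across viewpoints
    by simple majority voting. Each element of `per_view_predictions`
    maps an instrument instance ID (assumed already matched across
    views) to the class label predicted from that viewpoint."""
    votes = {}
    for view in per_view_predictions:
        for instance_id, label in view.items():
            votes.setdefault(instance_id, Counter())[label] += 1
    # The most-voted label per instance wins.
    return {iid: c.most_common(1)[0][0] for iid, c in votes.items()}

# One viewpoint misclassifies instrument 2; voting over three views
# recovers the majority label.
views = [
    {1: "scalpel", 2: "forceps"},
    {1: "scalpel", 2: "forceps"},
    {1: "scalpel", 2: "scissors"},
]
print(multi_view_vote(views))  # {1: 'scalpel', 2: 'forceps'}
```

Accumulating votes per instance rather than per pixel is what makes such a scheme cheap enough to run across many viewpoints without disrupting the workflow.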
dc.language.iso eng
dc.publisher Berlin ; Heidelberg [u.a.] : Springer
dc.relation.ispartofseries International Journal of Computer Assisted Radiology and Surgery 18 (2023)
dc.rights CC BY 4.0 International
dc.rights.uri https://creativecommons.org/licenses/by/4.0
dc.subject Mask R-CNN eng
dc.subject Multi-viewpoint inference eng
dc.subject Robot-assisted surgery eng
dc.subject Robotic scrub nurse eng
dc.subject Surgical instrument detection eng
dc.subject.ddc 610 | Medicine, health
dc.title Improving instrument detection for a robotic scrub nurse using multi-view voting eng
dc.type Article
dc.type Text
dc.relation.essn 1861-6429
dc.relation.doi https://doi.org/10.1007/s11548-023-03002-0
dc.bibliographicCitation.volume 18
dc.bibliographicCitation.firstPage 1961
dc.bibliographicCitation.lastPage 1968
dc.description.version publishedVersion
tib.accessRights frei zugänglich (freely accessible)

