dc.identifier.uri |
http://dx.doi.org/10.15488/3833 |
|
dc.identifier.uri |
https://www.repo.uni-hannover.de/handle/123456789/3867 |
|
dc.contributor.author |
Dietz, Armin |

dc.contributor.author |
Pösch, Andreas |

dc.contributor.author |
Reithmeier, Eduard |

dc.contributor.editor |
Zhang, Jianguo |

dc.contributor.editor |
Chen, Po-Hao |
|
dc.date.accessioned |
2018-10-11T08:42:11Z |
|
dc.date.available |
2018-10-11T08:42:11Z |
|
dc.date.issued |
2018 |
|
dc.identifier.citation |
Dietz, A.; Pösch, A.; Reithmeier, E.: Hand hygiene monitoring based on segmentation of interacting hands with convolutional networks. In: Proceedings of SPIE - The International Society for Optical Engineering 10579 (2018), 1057914. DOI: https://doi.org/10.1117/12.2294047 |
|
dc.description.abstract |
The number of healthcare-associated infections is increasing worldwide. Hand hygiene has been identified as one of the most crucial measures to prevent bacteria from spreading. However, compliance with recommended hand hygiene procedures is generally poor, even in modern, industrialized regions. We present an optical assistance system, based on machine learning, for monitoring the hygienic hand disinfection procedure. First, each hand and forearm of a person is detected in a down-sampled 96 px × 96 px depth video stream by pixelwise classification with a fully convolutional network. To gather the required amount of training data, we present a novel approach that automatically labels recorded data using colored gloves and a color video stream registered to the depth stream. The colored gloves are used to segment the depth data in the training phase; during inference, they are not required. The system detects and separates detailed hand parts of interacting, self-occluded hands within the observation zone of the sensor. Based on the location of the segmented hands, a full-resolution region of interest (ROI) is cropped. A second deep neural network classifies the ROI into ten separate process steps (gestures): nine based on the hand disinfection procedure recommended by the World Health Organization, plus an additional error class. The combined system is cross-validated with 21 subjects and predicts the currently executed gesture with an accuracy of 93.37% (±2.67%). Feedback is provided at 30 frames per second. |
eng |
dc.language.iso |
eng |
|
dc.publisher |
Bellingham, Wash. : SPIE |
|
dc.relation.ispartof |
Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications : 13-15 February 2018, Houston, Texas, United States |
|
dc.relation.ispartofseries |
Proceedings of SPIE 10579 (2018) |
|
dc.rights |
German copyright law applies. The document may be used free of charge for personal purposes, but may not be made available on the Internet or passed on to third parties. This contribution is freely accessible due to a (DFG-funded) Alliance or National License. |
|
dc.subject |
Gesture recognition |
eng |
dc.subject |
Hand hygiene |
eng |
dc.subject |
Hand tracking |
eng |
dc.subject |
Machine learning |
eng |
dc.subject |
Segmentation |
eng |
dc.subject |
Artificial intelligence |
eng |
dc.subject |
Convolution |
eng |
dc.subject |
Deep neural networks |
eng |
dc.subject |
Disinfection |
eng |
dc.subject |
Health care |
eng |
dc.subject |
Image segmentation |
eng |
dc.subject |
Learning systems |
eng |
dc.subject |
Medical imaging |
eng |
dc.subject |
Video streaming |
eng |
dc.subject |
Convolutional networks |
eng |
dc.subject |
Frames per second |
eng |
dc.subject |
Hand disinfection |
eng |
dc.subject |
Pixelwise classification |
eng |
dc.subject |
Region of interest |
eng |
dc.subject |
World Health Organization |
eng |
dc.subject |
Palmprint recognition |
eng |
dc.subject.classification |
Konferenzschrift |
ger |
dc.subject.ddc |
600 | Technik |
ger |
dc.title |
Hand hygiene monitoring based on segmentation of interacting hands with convolutional networks |
|
dc.type |
BookPart |
|
dc.type |
Text |
|
dc.relation.essn |
1996-756X |
|
dc.relation.isbn |
978-1-5106-1647-9 |
|
dc.relation.issn |
0277-786X |
|
dc.relation.doi |
https://doi.org/10.1117/12.2294047 |
|
dc.bibliographicCitation.volume |
10579 |
|
dc.bibliographicCitation.firstPage |
1057914 |
|
dc.description.version |
publishedVersion |
|
tib.accessRights |
frei zugänglich |
|