Accurate Long-Term Multiple People Tracking Using Video and Body-Worn IMUs

dc.identifier.uri http://dx.doi.org/10.15488/12652
dc.identifier.uri https://www.repo.uni-hannover.de/handle/123456789/12752
dc.contributor.author Henschel, Roberto
dc.contributor.author Marcard, Timo von
dc.contributor.author Rosenhahn, Bodo
dc.date.accessioned 2022-08-04T08:31:58Z
dc.date.available 2022-08-04T08:31:58Z
dc.date.issued 2020
dc.identifier.citation Henschel, R.; Von Marcard, T.; Rosenhahn, B.: Accurate Long-Term Multiple People Tracking Using Video and Body-Worn IMUs. In: IEEE Transactions on Image Processing 29 (2020), 9166762. DOI: https://doi.org/10.1109/TIP.2020.3013801
dc.description.abstract Most modern approaches for video-based multiple people tracking rely on human appearance to exploit similarities between person detections. Consequently, tracking accuracy degrades if this kind of information is not discriminative or if people change apparel. In contrast, we present a method to fuse video information with additional motion signals from body-worn inertial measurement units (IMUs). In particular, we propose a neural network to relate person detections with IMU orientations, and formulate a graph labeling problem to obtain a tracking solution that is globally consistent with the video and inertial recordings. The fusion of visual and inertial cues provides several advantages. The association of detection boxes in the video and IMU devices is based on motion, which is independent of a person's outward appearance. Furthermore, inertial sensors provide motion information irrespective of visual occlusions. Hence, once detections in the video are associated with an IMU device, intermediate positions can be reconstructed from corresponding inertial sensor data, which would be unstable using video only. Since no dataset exists for this new setting, we release a dataset of challenging tracking sequences, containing video and IMU recordings together with ground-truth annotations. We evaluate our approach on our new dataset, achieving an average IDF1 score of 91.2%. The proposed method is applicable to any situation that allows one to equip people with inertial sensors. © 1992-2012 IEEE. eng
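The abstract describes associating person detections with IMU devices purely by motion. As a hypothetical illustration (not the paper's actual method, which trains a neural network and solves a global graph labeling problem), a minimal sketch of motion-based association could compare detection-box velocities against IMU-derived heading vectors and greedily assign each IMU stream to the best-matching track:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two 2D vectors; 0.0 if either is zero."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)

def motion_score(track_velocities, imu_headings):
    # Average per-frame cosine similarity between a track's detection-box
    # velocity and the IMU heading direction. The score depends only on
    # motion, not on appearance, mirroring the idea in the abstract.
    sims = [cosine_similarity(v, h)
            for v, h in zip(track_velocities, imu_headings)]
    return sum(sims) / len(sims)

def associate(tracks, imus):
    # Greedy one-to-one assignment of IMU streams to tracks by highest
    # motion score. This greedy step is a simplification: the paper
    # instead obtains a globally consistent labeling over the whole
    # sequence.
    pairs, used_t, used_i = [], set(), set()
    scored = sorted(
        ((motion_score(t, m), ti, mi)
         for ti, t in enumerate(tracks)
         for mi, m in enumerate(imus)),
        reverse=True)
    for _, ti, mi in scored:
        if ti not in used_t and mi not in used_i:
            pairs.append((ti, mi))
            used_t.add(ti)
            used_i.add(mi)
    return pairs
```

Here `tracks` and `imus` are synthetic per-frame 2D vectors introduced only for this sketch; real IMU orientations would first have to be projected into the camera's motion space, which is exactly the mapping the paper's neural network learns.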
dc.language.iso eng
dc.publisher New York, NY : IEEE
dc.relation.ispartofseries IEEE Transactions on Image Processing 29 (2020)
dc.rights CC BY 4.0 International
dc.rights.uri https://creativecommons.org/licenses/by/4.0/
dc.subject Inertial navigation systems eng
dc.subject Inertial measurement unit eng
dc.subject Motion information eng
dc.subject Multiple people tracking eng
dc.subject Tracking accuracy eng
dc.subject Tracking sequence eng
dc.subject Tracking solutions eng
dc.subject Video information eng
dc.subject Visual occlusions eng
dc.subject Wearable sensors eng
dc.subject graph labeling eng
dc.subject human motion analysis eng
dc.subject IMU eng
dc.subject sensor fusion eng
dc.subject.ddc 620 | Ingenieurwissenschaften und Maschinenbau ger
dc.subject.ddc 004 | Informatik ger
dc.title Accurate Long-Term Multiple People Tracking Using Video and Body-Worn IMUs
dc.type Article
dc.type Text
dc.relation.essn 1941-0042
dc.relation.issn 1057-7149
dc.relation.doi https://doi.org/10.1109/TIP.2020.3013801
dc.bibliographicCitation.volume 29
dc.bibliographicCitation.firstPage 9166762
dc.description.version publishedVersion
tib.accessRights frei zugänglich
