Automation of the Timed-Up-and-Go test using a conventional video camera

dc.contributor.author: Savoie, Patrick
dc.contributor.author: Cameron, James A. D.
dc.contributor.author: Kaye, Mary E.
dc.contributor.author: Scheme, Erik J.
dc.date.accessioned: 2023-12-21T19:11:08Z
dc.date.available: 2023-12-21T19:11:08Z
dc.date.issued: 2019-08-09
dc.description.abstract: The Timed-Up-and-Go (TUG) test is a simple clinical tool commonly used to quickly assess the mobility of patients. Researchers have endeavored to automate the test using sensors or motion tracking systems to improve its accuracy and to extract more resolved information about its sub-phases. While some approaches have shown promise, they often require the donning of sensors or the use of specialized hardware, such as the now-discontinued Microsoft Kinect, which combines video information with depth sensors (RGBD). In this work, we leverage recent advances in computer vision to automate the TUG test using a regular RGB video camera, without the need for custom hardware or additional depth sensors. Thirty healthy participants were recorded using a Kinect V2 and a standard video feed while performing multiple trials of 3 m and 1.5 m versions of the TUG test. A Mask Region-based Convolutional Neural Network (Mask R-CNN) and a Deep Multitask Architecture for Human Sensing (DMHS) were then used together to extract global 3D poses of the participants. The timing of transitions between the six key movement phases of the TUG test was then extracted using heuristic features derived from the time series of these 3D poses. The proposed video-based vTUG system yielded the same error as the standard Kinect-based system for all six key transition points, and average errors of less than 0.15 seconds from a multi-observer, hand-labeled ground truth. This work describes a novel method of video-based automation of the TUG test using a single standard camera, removing the need for specialized equipment and facilitating the extraction of additional meaningful information for clinical use.
dc.description.copyright: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.identifier.uri: https://unbscholar.lib.unb.ca/handle/1882/37614
dc.language.iso: en
dc.publisher: IEEE
dc.relation: Natural Sciences and Engineering Research Council of Canada
dc.relation: New Brunswick Innovation Foundation
dc.relation.hasversion: https://doi.org/10.1109/JBHI.2019.2934342
dc.rights: http://purl.org/coar/access_right/c_abf2
dc.subject.discipline: Electrical and Computer Engineering
dc.title: Automation of the Timed-Up-and-Go test using a conventional video camera
dc.type: journal article
oaire.citation.endPage: 1205
oaire.citation.issue: 4
oaire.citation.startPage: 1196
oaire.citation.title: IEEE Journal of Biomedical and Health Informatics
oaire.citation.volume: 24
oaire.license.condition: other
oaire.version: http://purl.org/coar/version/c_ab4af688f83e57aa
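Note: The abstract describes a pipeline of person detection (Mask R-CNN), 3D pose estimation (DMHS), and heuristic segmentation of the six TUG phases from the pose time series. The Python sketch below illustrates only the general idea of the last stage, estimating a sit-to-stand onset from the vertical trajectory of a hip joint. It is not the authors' implementation; the joint index, vertical axis, smoothing window, and velocity threshold are assumptions chosen for clarity.

# Illustrative sketch only: the paper's heuristic features and phase
# definitions are not reproduced here. Joint index, vertical axis,
# smoothing window, and threshold are assumptions, not study values.
import numpy as np

def smooth(signal: np.ndarray, window: int = 5) -> np.ndarray:
    """Moving-average smoothing of a 1-D signal."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def sit_to_stand_onset(poses: np.ndarray, fps: float = 30.0,
                       hip_joint: int = 0, rise_speed: float = 0.10) -> float:
    """Estimate sit-to-stand onset time (s) from a 3D pose time series.

    poses: array of shape (n_frames, n_joints, 3); the vertical coordinate
    is assumed to be the last axis. Onset is taken as the first frame where
    the smoothed vertical velocity of the hip exceeds rise_speed (m/s).
    """
    hip_height = smooth(poses[:, hip_joint, 2])
    velocity = np.gradient(hip_height) * fps  # metres per second
    rising = np.flatnonzero(velocity > rise_speed)
    if rising.size == 0:
        raise ValueError("No sit-to-stand onset detected")
    return rising[0] / fps

if __name__ == "__main__":
    # Synthetic example: hip near chair height, rising smoothly around t = 2 s.
    fps, n_frames = 30.0, 120
    t = np.arange(n_frames) / fps
    hip_z = 0.45 + 0.5 / (1.0 + np.exp(-8.0 * (t - 2.0)))
    poses = np.zeros((n_frames, 1, 3))
    poses[:, 0, 2] = hip_z
    print(f"Estimated sit-to-stand onset: {sit_to_stand_onset(poses, fps):.2f} s")

Thresholding a smoothed vertical-velocity signal is one simple heuristic; the published system combines several such pose-derived features to time all six phase transitions.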

Files

Original bundle
Name: Savoie et al. JBHI 2020, Final Submission.pdf
Size: 1.1 MB
Format: Adobe Portable Document Format