Video-based motion-resilient reconstruction of 3D position for fNIRS and EEG head mounted probes

Published
May 2, 2019

Abstract

Significance: We propose a novel video-based, motion-resilient, and fast method for estimating the position of optodes on the scalp.

Aim: Measuring the exact placement of probes (e.g., electrodes, optodes) on a participant's head is a notoriously difficult step in acquiring neuroimaging data from methods that rely on scalp recordings (e.g., EEG and fNIRS), and is particularly difficult for clinical or developmental populations. Existing head-measurement methods require the participant to remain still for a lengthy period, are laborious, and demand extensive training. A fast, motion-resilient method for estimating the scalp location of probes is therefore required.

Approach: We propose an innovative video-based method for estimating the probes' positions relative to the participant's head that is fast, motion-resilient, and automatic. Our method capitalizes on the advantages, and accounts for the limitations, of cutting-edge computer vision and machine learning tools. We validate our method on 10 adult subjects and provide proof of feasibility with infant subjects.

Results: We show that our method is both reliable and valid compared to existing state-of-the-art methods, by estimating probe positions in a single measurement and by tracking their translation and consistency across sessions. Finally, we show that our automatic method can estimate the position of probes on an infant's head without lengthy offline procedures, a task that until now has been considered challenging.

Conclusions: Our proposed method allows, for the first time, the use of automated spatial co-registration methods on developmental and clinical populations, where lengthy, motion-sensitive measurement methods routinely fail.
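The abstract does not detail the paper's pipeline, but the final co-registration step it implies (expressing reconstructed 3D probe positions relative to the participant's head) is conventionally done by defining a head coordinate frame from anatomical fiducials (nasion and the left/right preauricular points). The sketch below illustrates that generic step only; it is not the authors' implementation, and the fiducial convention, function names, and example coordinates are assumptions chosen for illustration.

```python
import numpy as np

def head_coordinate_frame(nasion, lpa, rpa):
    """Build a rigid transform from camera/world space into a head-based frame.

    One common convention (assumed here, not taken from the paper):
    origin at the midpoint of the preauricular points, x toward the right
    preauricular point, y toward the nasion (orthogonalized), z completing
    a right-handed frame.
    """
    nasion, lpa, rpa = (np.asarray(p, dtype=float) for p in (nasion, lpa, rpa))
    origin = (lpa + rpa) / 2.0
    x = rpa - origin
    x /= np.linalg.norm(x)
    y = nasion - origin
    y -= x * np.dot(y, x)          # remove the component along x
    y /= np.linalg.norm(y)
    z = np.cross(x, y)             # completes the right-handed frame
    R = np.stack([x, y, z])        # rows are the head-frame axes in world coords
    return R, origin

def to_head_space(points, R, origin):
    """Express world-space probe coordinates in the head frame."""
    return (np.asarray(points, dtype=float) - origin) @ R.T

# Example with made-up coordinates (arbitrary units), standing in for the
# output of any multi-view / photogrammetry reconstruction:
nasion, lpa, rpa = [0.0, 9.0, 1.0], [-7.0, 0.0, 0.0], [7.0, 0.0, 0.0]
optodes = [[1.0, 3.0, 9.0], [-2.0, 4.0, 8.5]]
R, origin = head_coordinate_frame(nasion, lpa, rpa)
print(to_head_space(optodes, R, origin))
```

Because the head frame is defined entirely by the fiducials, probe positions expressed in it are comparable across sessions and participants regardless of how the camera or participant moved during recording.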
