Deep learning enables accurate soft tissue deformation estimation in vivo


Abstract

Image-based deformation estimation is an important tool used in a variety of engineering problems, including crack propagation, fracture, and fatigue failure. These tools have been instrumental in biomechanics research, where measuring in vitro and in vivo tissue deformation helps evaluate tissue health and disease progression. However, accurately measuring tissue deformation in vivo is particularly challenging due to limited image signal-to-noise ratio. Therefore, we created StrainNet, a novel deep-learning approach for measuring deformation from a sequence of in vivo images. Utilizing a training dataset that incorporates image artifacts, StrainNet was designed to maximize performance in challenging in vivo settings. Artificially generated image sequences of human flexor tendons undergoing known deformations were used to compare StrainNet against two conventional image-based strain measurement techniques. StrainNet outperformed the traditional techniques by nearly 90%. High-frequency ultrasound imaging was then used to acquire images of the flexor tendons engaged during contraction. Only StrainNet was able to track tissue deformations under the in vivo test conditions. Findings revealed strong correlations between tendon deformation and contraction effort, highlighting the potential for StrainNet to be a valuable tool for assessing preventative care, rehabilitation strategies, or disease progression. Additionally, by using real-world data to train our model, StrainNet was able to generalize and reveal important relationships between the effort exerted by the participant and tendon mechanics. Overall, StrainNet demonstrated the effectiveness of using deep learning for image-based strain analysis in vivo.
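
The abstract does not give implementation details, but the core idea it describes (a network that maps a pair of ultrasound frames to a dense strain field, trained on synthetically deformed images with added artifacts) can be illustrated with a minimal sketch. The architecture, layer sizes, synthetic speckle-warping routine, and noise model below are all assumptions chosen for illustration; they are not the published StrainNet design.

```python
# Minimal illustrative sketch (NOT the published StrainNet architecture):
# a small encoder-decoder CNN that takes two grayscale frames (reference and
# deformed) and regresses a dense axial strain map, trained on synthetic
# frame pairs with a known applied strain and added noise.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F


class StrainRegressor(nn.Module):
    """Toy frame-pair -> strain-field regressor (hypothetical layer sizes)."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
        )

    def forward(self, frame_pair):
        # frame_pair: (B, 2, H, W) -> predicted strain map (B, 1, H, W)
        return self.decoder(self.encoder(frame_pair))


def synthetic_pair(h=128, w=128, strain=0.02, noise=0.05, seed=None):
    """Warp a random speckle image by a known axial strain and add noise,
    loosely imitating artifact-laden ultrasound frames (assumed noise model)."""
    rng = np.random.default_rng(seed)
    ref = rng.random((h, w)).astype(np.float32)
    # Axial compression: resample reference rows at stretched coordinates.
    rows = np.clip(np.arange(h) * (1.0 + strain), 0, h - 1).astype(int)
    deformed = ref[rows, :] + noise * rng.standard_normal((h, w)).astype(np.float32)
    target = np.full((h, w), strain, dtype=np.float32)  # ground-truth strain
    return ref, deformed, target


if __name__ == "__main__":
    model = StrainRegressor()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(10):  # tiny demonstration training loop
        ref, deformed, target = synthetic_pair(strain=0.01 * (step % 5 + 1), seed=step)
        x = torch.from_numpy(np.stack([ref, deformed]))[None]   # (1, 2, H, W)
        y = torch.from_numpy(target)[None, None]                 # (1, 1, H, W)
        loss = F.mse_loss(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In practice, a model along these lines would be trained on many frame pairs spanning a range of deformations, artifact types, and noise levels before being evaluated on real ultrasound sequences, which is the general strategy the abstract describes.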
