Precise control of the tongue is necessary for drinking, eating, and vocalizing. Yet because tongue movements are fast and difficult to resolve, the neural control of lingual kinematics remains poorly understood. We combined kilohertz frame-rate imaging with a deep-learning-based artificial neural network to resolve 3D tongue kinematics in mice performing a cued lick task. Cue-evoked licks exhibited previously unobserved fine-scale movements that, like a hand searching for an unseen object, were produced after misses and were directionally biased towards remembered locations. Photoinhibition of the anterolateral motor cortex (ALM) abolished these fine-scale adjustments, resulting in well-aimed but hypometric licks that missed the spout. Our results show that cortical activity is required for online corrections during licking and reveal novel, limb-like dynamics of the mouse tongue as it reaches for, and misses, targets.