Mounting evidence indicates that information processing in the visual hierarchy follows a sequential progression from low-level perceptual to high-level conceptual features during visual perception, with the reverse order during memory retrieval. However, the nature of this hierarchical processing and its modulation by selective attention remain unclear. By integrating a deep neural network with the drift-diffusion model on reaction time data, we identified parallel, rather than sequential, processing of perceptual and conceptual features during perception. The slower reaction times observed in conceptual tasks compared with perceptual tasks were driven primarily by the larger decision boundary and longer non-decision time associated with conceptual features, which their faster evidence accumulation rates could not offset. Using single-trial multivariate decoding of magnetoencephalography (MEG) data to examine the timing of neural representations of different features, we found that selective attention reversed the onset times of perceptual and conceptual features in the occipital and parietal lobes during perception, leading to earlier detection of conceptual than perceptual features in the animacy and size tasks. During retrieval, the colour task showed earlier peak times for perceptual than conceptual features in the frontal lobe, indicating that perceptual features were reconstructed with the highest fidelity before conceptual features. These findings provide novel insights into the hierarchical nature of information processing during perception and retrieval, highlighting the crucial role of selective attention in modulating the speed of feature information accumulation.
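As a rough illustration of the drift-diffusion decomposition described above (not the authors' fitting code), the claim that a wider decision boundary and longer non-decision time can outweigh a faster evidence accumulation rate follows directly from the closed-form mean reaction time of an unbiased diffusion process. The sketch below uses this standard formula with purely hypothetical parameter values.

```python
import numpy as np

def mean_rt(v, a, ter, sigma=1.0):
    """Mean RT of an unbiased drift-diffusion model (start point a/2).

    v     : drift rate (speed of evidence accumulation)
    a     : boundary separation (decision boundary)
    ter   : non-decision time (encoding + motor), in seconds
    sigma : diffusion noise (fixed scaling parameter)
    """
    # Closed-form mean decision time for a symmetric two-boundary
    # Wiener process (as used, e.g., in the EZ-diffusion model).
    decision_time = (a / (2 * v)) * np.tanh(v * a / (2 * sigma**2))
    return ter + decision_time

# Hypothetical parameters: conceptual features accumulate faster
# (larger v) yet face a wider boundary and longer non-decision time,
# producing a slower overall RT, as the abstract reports.
print(mean_rt(v=1.0, a=1.0, ter=0.30))  # "perceptual" task: ~0.53 s
print(mean_rt(v=1.4, a=1.6, ter=0.40))  # "conceptual" task: ~0.86 s
```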