Identifying users' emotional responses to products is essential for product design and user research. Traditional methods for gathering product experience data, such as interviews and surveys, are time-consuming and resource-intensive, and often fail to capture users' genuine emotions. This article introduces an intelligent method for accurately identifying user-product emotions using multimodal physiological signals and machine learning techniques. The study designs an experiment around 63 representative product images and collects multiple physiological signals (eye movement, electrodermal activity, and pulse). Using the K-means algorithm, we establish valence–arousal assessment labels to create a product emotion dataset. Preprocessing methods, including time–frequency analysis and wavelet transformation, ensure high-quality data for signal analysis. Temporal, frequency, and time–frequency features are extracted from the multimodal physiological signals, and the Relief algorithm determines the optimal feature combination. Finally, the study evaluates the performance of five machine learning classifiers, with results indicating a recognition accuracy of 81.31% for the proposed product emotion recognition framework. This framework proves effective for objectively analyzing user emotions in product experiences.
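As a rough illustration of the labeling step, K-means can cluster per-trial valence–arousal self-ratings into discrete emotion categories. The sketch below uses scikit-learn; the rating values and the cluster count of four (one per valence–arousal quadrant) are illustrative assumptions, since the abstract does not give the actual settings.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical self-assessment ratings: one (valence, arousal) pair per trial.
ratings = np.array([
    [7.2, 6.1], [2.4, 5.8], [4.9, 2.3], [6.8, 7.0],
    [2.1, 6.5], [5.5, 3.1], [3.0, 2.8], [7.9, 5.2],
])

# Cluster the ratings into discrete emotion categories; k=4 is an
# illustrative assumption (one cluster per valence-arousal quadrant).
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(ratings)

# Each trial's cluster index serves as its emotion label in the dataset.
emotion_labels = kmeans.labels_
print(emotion_labels)
```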
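The wavelet-based preprocessing could take the form of soft-threshold denoising, as in the following sketch using PyWavelets; the wavelet family (db4), decomposition level, and universal-threshold rule are assumptions, not choices confirmed by the abstract.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold wavelet denoising of one physiological channel.
    Wavelet family and level are illustrative choices."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise scale estimated from the finest-level detail coefficients,
    # combined with the universal threshold rule.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Example: denoise a noisy pulse-like waveform sampled at 100 Hz.
t = np.linspace(0, 10, 1000)
noisy = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)
clean = wavelet_denoise(noisy)
```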
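For the feature-extraction stage, a minimal sketch of temporal, frequency, and time–frequency features per signal window is shown below using SciPy; the specific features and window parameters are illustrative, since the abstract does not list the actual feature set.

```python
import numpy as np
from scipy import signal as sps

def extract_features(x, fs=100.0):
    """Illustrative features for one signal window sampled at fs Hz."""
    # Temporal-domain statistics.
    feats = {
        "mean": float(np.mean(x)),
        "std": float(np.std(x)),
        "rms": float(np.sqrt(np.mean(x ** 2))),
    }
    # Frequency-domain features from the Welch power spectral density.
    f, pxx = sps.welch(x, fs=fs, nperseg=min(256, len(x)))
    feats["peak_freq"] = float(f[np.argmax(pxx)])
    feats["band_power"] = float(pxx.sum() * (f[1] - f[0]))
    # A time-frequency feature: mean spectrogram energy.
    _, _, sxx = sps.spectrogram(x, fs=fs, nperseg=min(128, len(x)))
    feats["tf_energy"] = float(sxx.mean())
    return feats

# Example on a synthetic pulse-like waveform.
t = np.linspace(0, 10, 1000)
x = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)
print(extract_features(x))
```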
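The Relief step can be sketched with the classic Kira–Rendell weighting rule: features that differ between an instance and its nearest miss gain weight, while features that differ from its nearest hit lose weight. The implementation below is a simplified version for illustration; the exact Relief variant and iteration count used in the study are assumptions.

```python
import numpy as np

def relief_weights(X, y, n_iter=100, rng=None):
    """Simplified Relief feature weighting (Kira & Rendell style)."""
    rng = np.random.default_rng(rng)
    # Scale features to [0, 1] so difference magnitudes are comparable.
    span = X.max(axis=0) - X.min(axis=0)
    Xn = (X - X.min(axis=0)) / np.where(span == 0, 1, span)
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        i = rng.integers(len(Xn))
        d = np.abs(Xn - Xn[i]).sum(axis=1)   # L1 distance to all samples
        d[i] = np.inf                        # exclude the sample itself
        hit = np.argmin(np.where(y == y[i], d, np.inf))   # nearest same class
        miss = np.argmin(np.where(y != y[i], d, np.inf))  # nearest other class
        w += np.abs(Xn[i] - Xn[miss]) - np.abs(Xn[i] - Xn[hit])
    return w / n_iter

# Demo: feature 0 carries the class signal, feature 1 is pure noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = np.column_stack([y + 0.1 * rng.standard_normal(200),
                     rng.standard_normal(200)])
w = relief_weights(X, y, n_iter=100, rng=0)
top = np.argsort(w)[::-1]   # feature indices ranked by Relief weight
print(w, top)
```

Ranking features by these weights and keeping the top-scoring subset is one way the "optimal feature combination" could be selected.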
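Finally, the classifier comparison might be run with cross-validation over five models, as sketched below with scikit-learn; the abstract does not name the five classifiers, so the choices here (SVM, KNN, random forest, naive Bayes, decision tree) and the placeholder data are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Placeholder data standing in for the Relief-selected feature matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 10))
y = rng.integers(0, 4, 120)   # four hypothetical emotion classes

# Five commonly used classifiers, assumed here for illustration.
classifiers = {
    "SVM": SVC(),
    "KNN": KNeighborsClassifier(),
    "Random forest": RandomForestClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Decision tree": DecisionTreeClassifier(random_state=0),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} ± {scores.std():.3f}")
```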