Thermal infrared (TIR) images capture temperature non-invasively, making them valuable for generating 3D models that reflect the spatial distribution of thermal properties within a scene. Current TIR image-based 3D reconstruction methods focus primarily on static conditions, capturing only the spatial distribution of thermal radiation while failing to represent its temporal dynamics. Two key challenges hinder progress in this field: the absence of dedicated datasets and the lack of effective methods for dynamic 3D representation. To address these challenges, we propose ThermalGS, a novel dynamic thermal 3D reconstruction method based on 3D Gaussian Splatting (3DGS). ThermalGS takes RGB and TIR images as input and adopts a data-driven approach to directly learn both the scene structure and a dynamic thermal representation. The positions, orientations, and scales of the Gaussian primitives are guided by the RGB mesh. We introduce feature encoding and embedding networks that integrate semantic and temporal information into the Gaussian primitives, allowing them to capture dynamic thermal radiation characteristics. Moreover, we construct the Thermal Scene Day-and-Night (TSDN) dataset, which comprises multi-view, high-resolution aerial RGB reference images and TIR images captured at five different times throughout the day and night, providing a benchmark for dynamic thermal 3D reconstruction. Experimental results demonstrate that the proposed method achieves state-of-the-art performance on the TSDN dataset, with an average absolute temperature error of 1 °C and the ability to predict surface temperature variations over time.
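The abstract does not specify the architecture of the feature encoding and embedding networks, but one plausible reading is a small per-Gaussian head that combines a learned semantic feature with a capture-time embedding to predict a time-dependent thermal value for each Gaussian, which a 3DGS rasterizer would then splat into a TIR image. The PyTorch sketch below illustrates only this general idea; the module names, dimensions, and time parameterization are illustrative assumptions, not the paper's actual design.

```python
import torch
import torch.nn as nn


class DynamicThermalHead(nn.Module):
    """Illustrative sketch: per-Gaussian semantic feature + time embedding -> thermal value.

    All names and sizes are assumptions for illustration; the actual encoding and
    embedding networks of ThermalGS are not described in the abstract.
    """

    def __init__(self, feat_dim: int = 32, time_dim: int = 16, hidden: int = 64):
        super().__init__()
        # Small MLP that embeds a scalar capture time (e.g., hour of day, normalized to [0, 1]).
        self.time_embed = nn.Sequential(
            nn.Linear(1, time_dim), nn.ReLU(), nn.Linear(time_dim, time_dim)
        )
        # Decoder mapping per-Gaussian features plus the time embedding to one thermal value.
        self.decoder = nn.Sequential(
            nn.Linear(feat_dim + time_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, gaussian_feats: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # gaussian_feats: (N, feat_dim) learned per-Gaussian semantic features
        # t: (1,) normalized capture time shared by all Gaussians in a frame
        time_code = self.time_embed(t.view(1, 1)).expand(gaussian_feats.shape[0], -1)
        return self.decoder(torch.cat([gaussian_feats, time_code], dim=-1))


# Usage: predict one thermal value per Gaussian at a given (hypothetical) time of day.
head = DynamicThermalHead()
feats = torch.randn(10_000, 32)              # hypothetical per-Gaussian feature vectors
thermal = head(feats, torch.tensor([0.5]))   # 0.5 ~ midday on a normalized 0-1 clock
print(thermal.shape)                         # torch.Size([10000, 1])
```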