Federated learning faces challenges associated with privacy breaches, client communication efficiency, the straggler effect, and heterogeneity. To address these challenges, this paper reformulates the optimal client selection problem as a sparse optimization task and proposes a secure and efficient optimal client selection method for federated learning, named secure orthogonal matching pursuit federated learning (SecOMPFL). Specifically, we first introduce a method to identify correlations among the local model parameters of participating clients, addressing the issue of duplicated client contributions highlighted in recent literature. Next, we establish a secure variant of the orthogonal matching pursuit (OMP) algorithm from compressed sensing using secure multiparty computation and propose a novel secure aggregation protocol. This protocol accelerates the global model's convergence through sparse optimization while preserving privacy and security, and it relies entirely on the local model parameters as inputs, minimizing client communication requirements. We also devise a client sampling strategy that requires no additional communication, resolving the communication bottleneck of the optimal client selection policy. Finally, we introduce a strict yet inclusive straggler penalty strategy to minimize the impact of stragglers. Theoretical analysis confirms the security and convergence of SecOMPFL, highlighting its resilience to the straggler effect and to system/statistical heterogeneity, along with its high client communication efficiency. Numerical experiments compare the convergence rate and client communication efficiency of SecOMPFL with those of FedAvg, FOLB, and BN2 on natural and synthetic datasets with statistical heterogeneity, considering varying numbers of clients and client sampling scales. The results demonstrate that SecOMPFL achieves a competitive convergence rate, with communication overhead 39.96% lower than that of FOLB and 28.44% lower than that of BN2. Furthermore, SecOMPFL shows good resilience to statistical heterogeneity.
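To make the sparse-optimization view of client selection concrete, the following is a minimal, non-secure sketch of the underlying idea: treat each client's flattened local update as a column of a dictionary matrix and greedily pick, via plain OMP, the k clients whose updates best reconstruct the full-participation average update. This is an assumption-laden illustration, not the paper's method: all names (select_clients_omp, updates, k) are hypothetical, and the secure multiparty computation layer, sampling strategy, and straggler penalty described above are omitted.

import numpy as np

def select_clients_omp(updates: np.ndarray, k: int):
    """updates: (d, n) matrix, one column per client's flattened local update.
    Returns indices of k selected clients and their aggregation weights.
    Hypothetical sketch of OMP-based selection; assumes k >= 1."""
    d, n = updates.shape
    # Target signal: the dense (all-client) average update.
    target = updates.mean(axis=1)
    # Normalize columns so correlation comparisons are scale-invariant.
    norms = np.linalg.norm(updates, axis=0) + 1e-12
    dictionary = updates / norms
    residual = target.copy()
    selected: list[int] = []
    for _ in range(k):
        # Greedy step: pick the client most correlated with the residual.
        # Because already-selected (and near-duplicate) directions are
        # explained by the residual update, duplicated contributions are
        # naturally suppressed.
        corr = np.abs(dictionary.T @ residual)
        corr[selected] = -np.inf
        selected.append(int(np.argmax(corr)))
        # Least-squares refit on the selected columns, then update residual.
        sub = updates[:, selected]
        weights, *_ = np.linalg.lstsq(sub, target, rcond=None)
        residual = target - sub @ weights
    return selected, weights

In SecOMPFL as described in the abstract, the correlation and least-squares steps would instead be carried out under secure multiparty computation so that no party observes the raw local model parameters.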