The bigger picture

The rapid advancement of large language models (LLMs) has significantly transformed natural language processing (NLP), enabling machines to understand and generate human-like text. However, most LLMs are predominantly English-centric, which limits their applicability in our linguistically diverse world. With over 7,000 languages spoken globally, there is a pressing need for models that can comprehend and generate text across multiple languages. Multilingual large language models (MLLMs) address this gap by processing and producing content in various languages, thereby enhancing global communication and accessibility. This survey provides a comprehensive overview of MLLMs, introducing a systematic taxonomy based on alignment strategies to deepen understanding of the field. By highlighting emerging trends and challenges, it aims to guide future research and development, fostering the creation of more inclusive and effective language models that cater to the diverse linguistic landscape of our world.

Summary

Multilingual large language models (MLLMs) leverage advanced large language models to process and respond to queries in multiple languages, achieving significant success in polyglot tasks. Despite these breakthroughs, a comprehensive survey summarizing existing approaches and recent developments has been lacking. To this end, this paper presents a unified and thorough review of the field, highlighting recent progress and emerging trends in MLLM research. The contributions of this paper are as follows. (1) Extensive survey: to our knowledge, this is the first thorough review of multilingual alignment in MLLMs. (2) Unified taxonomy: we provide a unified framework to summarize the current progress in MLLMs. (3) Emerging frontiers: we identify key emerging frontiers and discuss their associated challenges. (4) Abundant resources: we collect abundant open-source resources, including relevant papers, data corpora, and leaderboards. We hope our work provides the community with quick access to the field and spurs breakthrough research in MLLMs.