- 1Tsinghua University, Department of Civil Engineering, China (yang-s22@mails.tsinghua.edu.cn)
- 2Southeast University, School of Civil Engineering, China
Accurate simulation of crowd evacuation processes is essential for evaluating the safety and resilience of communities during disaster emergencies. Conventional agent-based evacuation models effectively capture individual movement and interactions but often rely on predefined behavioral rules, limiting their ability to represent adaptive reasoning, information exchange, and context-dependent decision-making in rapidly changing environments. This study presents an agent-based evacuation simulation framework in which large language models (LLMs) are embedded as the decision-making components of individual agents. Each agent maintains internal states, including personality attributes, environmental perceptions, and decision histories, while the LLM enables adaptive reasoning and communication based on the evolving situational context. To ensure scalability for large populations, batch prompting and parallel computation strategies are adopted to mitigate the computational cost introduced by LLM integration. The framework supports both pedestrian and vehicular agents, allowing multimodal evacuation dynamics to be examined within a unified simulation environment. A real-world disaster evacuation scenario is used to evaluate the proposed approach. Results indicate that LLM-enhanced agents exhibit more flexible, context-aware, and realistic behavioral patterns than traditional rule-based models. The proposed framework reduces dependence on manually specified behavioral assumptions and provides a scalable foundation for probabilistic evacuation performance assessment and strategy evaluation under diverse hazard conditions.
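The abstract does not include implementation details, but the described architecture (agents with internal states, an LLM as the decision component, batch prompting, and parallel computation) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class and function names (`EvacAgent`, `build_prompt`, `mock_llm_batch`, `step`), the batch size, and the use of a thread pool are all assumptions, and the LLM call is replaced by a placeholder.

```python
from dataclasses import dataclass, field
from concurrent.futures import ThreadPoolExecutor

@dataclass
class EvacAgent:
    """Hypothetical agent holding the internal states named in the abstract."""
    agent_id: int
    personality: str                          # personality attribute, e.g. "cautious"
    perception: str = ""                      # latest environmental perception
    history: list = field(default_factory=list)  # decision history

def build_prompt(agent: EvacAgent) -> str:
    # Condense the agent's state into a single prompt for the LLM.
    return (f"Agent {agent.agent_id} ({agent.personality}) perceives: "
            f"{agent.perception}. Past decisions: {agent.history}. "
            f"Choose an evacuation action.")

def mock_llm_batch(prompts):
    # Placeholder for a real batched LLM call; returns one decision per prompt.
    return [f"move-to-exit-A (decision {i})" for i, _ in enumerate(prompts)]

def step(agents, batch_size=2, workers=2):
    """One simulation step: batch the agents' prompts and query them in parallel."""
    batches = [agents[i:i + batch_size] for i in range(0, len(agents), batch_size)]

    def run_batch(batch):
        decisions = mock_llm_batch([build_prompt(a) for a in batch])
        for agent, decision in zip(batch, decisions):
            agent.history.append(decision)    # update the agent's decision history
        return decisions

    # Parallel computation across batches to amortize per-call LLM latency.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return [d for batch_out in pool.map(run_batch, batches) for d in batch_out]
```

In a real system the placeholder would be replaced by an actual LLM API call, and the returned text would be parsed into a movement command for the pedestrian or vehicular agent; the batching and thread-pool pattern shown here is one common way to keep per-step cost manageable for large populations.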
How to cite: Yang, S., Zhang, Y., and Gu, C.: Large Language Model–Enhanced Agent-Based Modeling for Intelligent Crowd Evacuation under Disaster Scenarios, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-3725, https://doi.org/10.5194/egusphere-egu26-3725, 2026.