- IT4Innovations, VSB - Technical University of Ostrava, 17. listopadu 2172/15, 708 00, Ostrava - Poruba, Czech Republic
Biodiversity research increasingly relies on digital twins: high-fidelity, data-driven simulations that replicate ecosystems across spatial and temporal scales. While these twins excel at integrating heterogeneous observations (e.g., remote sensing, genomics, citizen science), they often remain limited by static model architectures and manual parameter tuning, constraining rapid hypothesis testing and adaptive management. We propose an inference service that supports agentic Large Language Model (LLM) workflows on High Performance Computing (HPC) resources, enabling a new range of research directions in digital twins, such as accelerated development of data processing pipelines, agentic modelling, information exchange, natural language interpretation of results, and faster adoption.
The presented service facilitates use of HPC resources by exposing an OpenAI API that can be used with any OpenAI API compatible client. This opens a diverse set of possibilities while keeping the computation and data on secure infrastructure, boosting not only research but also European sovereignty. The most suitable applications of this service are those that send a large number of requests in a short amount of time, such as agentic workflows. Agents could thus interact with or enhance digital twins and create an interaction layer for the general public or non-technical stakeholders, either ingesting natural language queries or producing model results interpreted in natural language. Additionally, it allows modellers to combine classical models and LLM agents to create dynamic, semi-autonomous hyper-parameter searches or simulations.
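Because the service exposes an OpenAI-compatible API, any standard client can target it simply by pointing at a different base URL. A minimal sketch of what such a request looks like, using only the Python standard library (the endpoint URL, model name, and token below are hypothetical placeholders, not the service's actual values):

```python
import json
import urllib.request

# Hypothetical base URL of the HPC-hosted inference service (placeholder).
BASE_URL = "https://inference.example-hpc.eu/v1"

def build_chat_request(prompt: str, model: str = "example-model",
                       token: str = "YOUR_TOKEN") -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completions request (constructed, not sent)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = build_chat_request("Summarise the simulated wildfire spread for stakeholders.")
print(req.full_url)  # → https://inference.example-hpc.eu/v1/chat/completions
```

Any off-the-shelf OpenAI-compatible client (for example, the official `openai` Python package with its `base_url` parameter overridden) would work the same way, which is what lets existing agentic frameworks use the service without modification.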
We tested this service with an agentic disaster-relief simulation built upon the CREW-Wildfire framework, which simulates a forest wildfire and the actions of a rescue team. Our tests showed that the service works and that agentic frameworks such as CrewAI and LangChain can provide very good results on this benchmark, although the results also depend on the LLM models used.
Technically, the service is built upon the same technologies that were leveraged in the BioDT prototype digital twins to interact with HPC, thus providing secure access to these resources. This removes the problem of having to choose between an external LLM provider and small local models with reduced capabilities during research.
How to cite: Martinovic, T., Matus, A., Bot Goncalves, K., and Martinovic, J.: AI inference service in Digital Twins, World Biodiversity Forum 2026, Davos, Switzerland, 14–19 Jun 2026, WBF2026-553, https://doi.org/10.5194/wbf2026-553, 2026.