Digital twins and metaverse are merging: AI redefines future of architecture
The boundary between digital twins and the metaverse is rapidly collapsing, according to a pioneering study that redefines their roles in architectural innovation. The study, titled “Metaverse and Digital Twins in the Age of AI and Extended Reality”, was published in 2025 in the journal Architecture and was led by researchers at the University of Cincinnati.
The paper presents a detailed comparative framework and experimental findings across multiple architectural applications, revealing how artificial intelligence (AI), extended reality (XR), and large language models (LLMs) are fusing once-discrete paradigms.
Through real-world case studies and platform evaluations, the researchers explored how these technologies now create hybrid environments: spaces that are both mirrored and imagined, data-driven and experiential. The findings suggest that architects are poised to shape a new form of digital ecosystem, one that simultaneously optimizes operations and fosters immersive user experiences.
What are the fundamental differences between digital twins and the metaverse?
The study begins by clearly distinguishing digital twins (DTs) and the metaverse. DTs serve as data-intensive, real-time simulations of physical systems, typically applied in architecture, engineering, and construction for energy management, predictive maintenance, and operational optimization. They integrate tools like Building Information Modeling (BIM), IoT sensors, and AI-powered analytics to replicate and forecast physical building behavior.
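To give a concrete, if simplified, sense of that replicate-and-forecast loop, the Python sketch below streams invented vibration readings and flags statistical drift the way a predictive-maintenance routine might. The telemetry values and the three-sigma threshold are assumptions for illustration, not the study's tooling.

```python
# Toy sketch of a digital twin's monitoring loop: stream a sensor reading,
# compare it to a rolling baseline, and flag drift for predictive maintenance.
# The readings and the 3-sigma threshold are invented for this illustration.
from collections import deque
from statistics import mean, stdev

window = deque(maxlen=20)  # most recent pump-vibration readings (hypothetical)

def ingest(reading: float) -> None:
    """Flag a reading that deviates from the rolling baseline, then store it."""
    if len(window) == window.maxlen:
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(reading - mu) > 3 * sigma:
            print(f"anomaly: {reading:.2f} vs. baseline {mu:.2f} -- schedule inspection")
    window.append(reading)

# Simulated feed: stable readings, then a spike the loop should catch.
for value in [1.0, 1.1, 0.9, 1.0] * 5 + [4.8]:
    ingest(value)
```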
In contrast, the metaverse represents a more speculative and immersive digital realm, characterized by multi-user interaction, virtual socialization, and creative experimentation. Instead of replicating the real world, it enables new environments governed by user agency and spatial storytelling. Architectural use of the metaverse includes virtual real estate, education platforms, and design exploration free from physical constraints.
Despite these distinctions, the paper reveals a growing overlap in technology, platform use, and conceptual goals. For instance, a virtual building might operate as a DT by collecting live sensor data while also existing as a metaverse space for collaborative exploration, exhibition, or simulation-based education.
How are AI and XR blurring the lines between these environments?
The integration of AI and XR is central to the observed convergence. For digital twins, machine learning and LLMs enhance functionality by enabling natural language querying, real-time anomaly detection, predictive analytics, and intelligent decision support. Platforms like Unreal Engine, Autodesk Tandem, and NVIDIA Omniverse were tested for their LLM integration, with Unreal showing the most promise for immersive interaction and multi-agent AI deployment.
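As a rough illustration of what natural language querying of a twin involves, the sketch below pairs a hypothetical sensor snapshot with a user question and lets an LLM answer in context. The sensor feed, zone name, and the choice of the OpenAI chat API with the gpt-4o model are assumptions for the example, not the study's actual pipeline.

```python
# Minimal sketch of natural-language querying against live twin telemetry.
# The snapshot values and the use of OpenAI's chat API are illustrative
# assumptions; a real twin would pull from its BIM/IoT data layer.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def latest_telemetry() -> dict:
    # Stand-in for a real BIM/IoT feed (e.g., an MQTT topic or REST endpoint).
    return {"zone": "Lab 3", "temp_c": 27.4, "co2_ppm": 1250, "hvac_state": "on"}

def ask_twin(question: str) -> str:
    snapshot = json.dumps(latest_telemetry())
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Answer building-operations questions using this "
                        f"current sensor snapshot: {snapshot}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_twin("Is the CO2 level in Lab 3 acceptable, and should I ventilate?"))
```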
The metaverse, on the other hand, leverages AI primarily for creative generation and interactivity. Generative AI tools, such as Stable Diffusion, Midjourney, and Tripo, enable rapid asset creation from text or image prompts, while tools like ControlNet and PromeAI translate sketches into detailed 3D scenes. The study notes, however, that despite their speed and accessibility, generative tools currently lack the precision required for professional-grade architectural output and are best used in early-stage ideation.
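To give a sense of how accessible this ideation step has become, a text-to-image pass with Stable Diffusion takes only a few lines through the open-source diffusers library. The checkpoint and prompt below are examples; the output is concept imagery for early-stage exploration, consistent with the study's caveat about professional-grade precision.

```python
# Illustrative text-to-image ideation pass with Stable Diffusion via the
# Hugging Face diffusers library; checkpoint and prompt are examples only.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # a commonly used public checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # a CUDA GPU is assumed; CPU inference is far slower

prompt = "concept sketch of a mass-timber pavilion atrium, soft daylight, axonometric view"
image = pipe(prompt).images[0]      # returns a PIL.Image
image.save("pavilion_concept.png")  # mood-board material, not construction-ready geometry
```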
Large language models are also driving innovation in metaverse interaction. The researchers developed spatially aware, conversational bots embedded in Unreal Engine environments. These bots, powered by LLMs such as GPT-4o, could detect a user's location and respond contextually, enabling applications such as lab tours, virtual tutoring, and client simulation. This enhances the metaverse's utility for education, collaborative design, and user experience testing.
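The engine-side implementation is not reproduced here, but the underlying pattern is easy to sketch outside Unreal: resolve the avatar's coordinates to a named zone, then inject that zone into the model's context so replies stay location-specific. The zone map, coordinates, and model choice below are all hypothetical.

```python
# Hedged sketch of a spatially aware NPC: map the avatar's position to a named
# zone and feed that zone into the LLM prompt. Zone bounds, coordinates, and
# model are invented; in the study this logic runs inside Unreal Engine.
from openai import OpenAI

client = OpenAI()

ZONES = {  # hypothetical floor-plan regions as (x_min, x_max, y_min, y_max)
    "fabrication lab": (0.0, 10.0, 0.0, 8.0),
    "review gallery": (10.0, 25.0, 0.0, 8.0),
}

def locate(x: float, y: float) -> str:
    for name, (x0, x1, y0, y1) in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return "corridor"

def npc_reply(x: float, y: float, user_msg: str) -> str:
    zone = locate(x, y)
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": f"You are a campus tour guide. The visitor is standing "
                        f"in the {zone}. Keep your answer tied to that space."},
            {"role": "user", "content": user_msg},
        ],
    )
    return response.choices[0].message.content

print(npc_reply(4.2, 3.0, "What happens in this room?"))
```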
While AI amplifies both DT and metaverse environments, it does so differently: analytical enhancement for the former, narrative interactivity for the latter. Yet the tools and frameworks often overlap, further collapsing the distinction between the two.
What applications are emerging from this convergence?
The convergence of DTs and the metaverse is producing hybrid applications that redefine architectural workflows, pedagogy, and stakeholder engagement. In one pilot study, fourth-year architecture students used platforms like Hyperspace, Mesh, and Mozilla Hubs to build immersive educational spaces. Generative AI tools were used to prototype forms, while avatars and non-player characters (NPCs) enabled dynamic critiques and walkthroughs. These activities blended traditional studio work with virtual world-building, offering new pedagogical models for architectural education.
At the infrastructure level, DTs with VR integration were used to simulate emergency egress and behavioral responses in real time. Using IoT data and treadmill-based motion tracking, the team recreated a physical building as a DT, adding smoke and fire simulation to study human movement in evacuation scenarios. Findings suggested a marked improvement in user engagement and training efficacy compared to traditional drills.
Meanwhile, augmented reality (AR) was applied in field contexts, particularly for construction management. Using devices like Microsoft HoloLens and mobile-based AR platforms, the researchers projected BIM data onto construction sites, enabling real-time inspection, coordination, and progress tracking. Although AR support within metaverse platforms remains limited, advances in wearable devices are expected to enhance future cross-reality experiences.
A core insight is that XR technologies are no longer exclusive to the metaverse. Both DTs and metaverse environments benefit from immersive visualization, whether for user training, design validation, or collaborative interaction. This convergence is reshaping not only how environments are designed but also how they are experienced and managed.
However, barriers remain. Interoperability between BIM and IoT platforms is still limited; few systems support seamless bidirectional data exchange, and integrating LLMs into real-time environments is technically demanding. Moreover, most DT and metaverse tools are not built on standardized frameworks, limiting their scalability across industries.
FIRST PUBLISHED IN: Devdiscourse