SIERA
Spatial Intelligence for Environmental Reasoning and AI Lab
SIERA develops foundational spatial intelligence and world-modeling systems that allow AI to understand the physical world with high fidelity. The lab integrates multimodal sensing, 3D/4D spatial reasoning, and physics-informed machine learning to interpret buildings, cities, and environmental processes under real-world complexity. SIERA’s research advances three interconnected fronts: multimodal environmental perception using novel sensor fusion architectures; generative world models for reasoning about climate, materials, and spatial change; and embodied AI systems for autonomous environmental monitoring. Building on methods developed under ESI, SIERA has created generalizable segmentation models for heat and flood risk mapping across cities, as well as AI-driven biodiversity detection and classification frameworks deployed with NaCERA in Colombian biodiversity hotspots.
The lab is exploring whether AI can develop persistent world models of environmental affordances, thermal gradients, airflow patterns, and spatial comfort from visual experience alone, and is advancing retrieval-augmented generation methods for climate policy analysis, environmental governance, and cross-cultural reasoning about the experiential qualities of the built environment. As ERA’s computational and spatial-AI engine, SIERA provides the modeling infrastructure for understanding how environmental systems behave across scales, from individual building systems to planetary urban networks, revealing risks, guiding adaptation, and improving resilience. By treating world models as emerging forms of environmental intelligence, SIERA develops tools that empower communities, planners, and decision-makers to anticipate change and design more sustainable futures.
Developing multimodal, physics-aware perception systems that fuse heterogeneous environmental sensor data (thermal, multispectral, LiDAR, and satellite imagery) to characterize building performance, urban heat exposure, and ecosystem health across scales;
Advancing generative world models that integrate memory, geometry, and physical dynamics to reason about how built and natural environments respond to climate variability, enabling anticipatory planning for extreme heat, flooding, and ecological transitions;
Creating generalizable segmentation and classification methods for urban and ecological remote sensing, supporting high-resolution climate risk assessment and biodiversity monitoring in partnership with NaCERA and UMERA;
Designing embodied AI and robotic sensing systems, including UAV-based platforms, for autonomous environmental monitoring in regions facing climate-induced hazards such as landslides, deforestation, and urban heat stress;
Developing retrieval-augmented generation and large language model methods for synthesizing environmental governance documents, policy frameworks, and urban performance data into actionable intelligence for researchers, communities, and decision-makers;
Investigating how AI systems can reason about the experiential and phenomenological qualities of the built environment by formalizing knowledge from diverse global architectural languages into structured computational frameworks that advance both spatial intelligence and cross-cultural understanding;
Contributing to multi-institutional roadmaps for climate-relevant robotics and spatial AI, positioning ERA at the forefront of computational approaches to planetary environmental challenges.