AR Design


This project builds on work begun in 2014 in Singapore, and specifically on one of its outputs: the use of low-cost VR interfaces for navigating real-time building sensor data.

The context for the project is as follows:

  • It is becoming increasingly affordable to deploy large networks of sophisticated sensor and control hardware in non-residential buildings, all governed by centralized computer programs known as building management systems (BMSs).
  • In recent years, it has also become increasingly affordable to make real-time building system data transmittable to web services via standardized data transmission formats (e.g., REST APIs). This has opened the door for new technology developments in user interaction with building data and BMSs.
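To make the assumed data flow concrete, the sketch below parses a sensor-point payload of the kind such a web service might return. The payload shape, point IDs, and field names are illustrative assumptions, not the actual UBC API.

```python
import json
from dataclasses import dataclass

# Hypothetical JSON payload -- a real BMS web service will define its own
# schema; this only illustrates the kind of data the project assumes.
SAMPLE_PAYLOAD = """
{
  "points": [
    {"id": "AHU-3.SAT", "description": "Supply air temperature", "value": 13.2, "unit": "degC"},
    {"id": "RM-204.ZNT", "description": "Zone temperature", "value": 25.7, "unit": "degC"}
  ]
}
"""

@dataclass
class SensorReading:
    point_id: str
    description: str
    value: float
    unit: str

def parse_readings(payload: str) -> list[SensorReading]:
    """Turn a JSON response from the (assumed) BMS web service into typed records."""
    data = json.loads(payload)
    return [SensorReading(p["id"], p["description"], p["value"], p["unit"])
            for p in data["points"]]

readings = parse_readings(SAMPLE_PAYLOAD)
```

In a live deployment the payload would be fetched from the BMS gateway over HTTP rather than hard-coded, but the parsing step is the same.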

One particularly interesting idea is the use of AR technology to allow building managers and engineers to access building sensor data and BMS information in situ, allowing for a faster, more tactile approach to fault detection and problem rectification in building maintenance. The present-day use of BMSs in building maintenance typically proceeds as follows:

  1. An engineer receives a customer report that an air-conditioning device is failing, or perhaps several occupants are complaining that indoor conditions in several rooms are uncomfortable;
  2. The engineer arrives on site and physically inspects the air-conditioning hardware and room thermostats;
  3. Because the majority of control devices across the building’s air-conditioning system are either hidden from view or contain sensors without visible meters, the engineer must frequently alternate between inspecting the physical hardware and dissecting the system data via a laptop connected to the BMS’s user dashboard;
  4. This back-and-forth is slow and non-intuitive. The BMS interface is not designed to be used as a rapid maintenance tool (despite what large BMS providers like Siemens think). Whenever the engineer needs further data from the BMS, they must stop what they are doing, return to the laptop, retrieve the information (often through many mouse clicks), and then return to work relying on their memory of what was viewed on-screen.

The inefficiency of this process is a common grievance. The proposed solution is as follows:

  1. Imagine a wearable augmented reality (AR) device (e.g., HoloLens, the new Google Glass) worn by a building maintenance engineer;
  2. As the engineer walks through a maintenance/problem site, they receive a live stream of building sensor data, visualized in spatially appropriate locations (see conference paper above);
  3. The engineer may also have, in view, access to spatially relevant maintenance logs and memos regarding specific problem areas (e.g., “This valve was last serviced 2 weeks ago.”);
  4. The engineer may also have direct access to virtual control levers in spatially appropriate locations, where hand gestures can be used to directly manipulate control variables within the building’s BMS.

Developing a prototype of such an AR platform, within the context of the UBC Campus as a Living Lab ecosystem, is the overall objective of this work.
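The in-situ lookup the steps above describe can be sketched as a simple spatial query: given the engineer’s tracked position, surface only the sensor points (and their maintenance memos) anchored nearby. All point names, coordinates, and the radius are illustrative assumptions, not values from the actual platform.

```python
import math
from dataclasses import dataclass

@dataclass
class AnchoredPoint:
    point_id: str
    position: tuple[float, float, float]  # metres, in the building model frame
    memo: str                             # e.g., a last-service note from the log

def points_in_view(points, engineer_pos, radius_m=5.0):
    """Return anchored points within `radius_m` of the engineer's position."""
    return [p for p in points if math.dist(p.position, engineer_pos) <= radius_m]

# Hypothetical anchors for two pieces of hardware in the building model.
ANCHORS = [
    AnchoredPoint("VLV-12", (3.0, 1.5, 2.2), "Last serviced 2 weeks ago"),
    AnchoredPoint("AHU-3", (40.0, 1.5, 8.0), "Filter change overdue"),
]

# Engineer standing next to the valve: only VLV-12 should be surfaced.
nearby = points_in_view(ANCHORS, engineer_pos=(2.0, 1.5, 2.0))
```

On an actual headset the engineer’s position would come from the device’s own spatial tracking, and the query would run continuously as they move through the site.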


The proposed AR platform will significantly support teaching and learning regarding sustainable, low-carbon building design. One of the greatest shortcomings in educating students and the public about how buildings perform is that buildings are deliberately designed to hide much of their own engineering systems from view. An AR platform for interacting with live-streaming building data and metadata will be a significant learning tool: it will make visible the science underlying a building’s performance.

Over the course of developing the idea, we have solved many of the technical challenges it faces:

  1. We have a technical solution in place for making real-time BMS data and controls from UBC institutional buildings accessible via an API;
  2. We have access to a technology for generating very precise 3D models of as-built indoor environments (e.g., Matterport) to allow for superimposition of a virtual environment over the physical environment;
  3. We have tested a basic version of this idea in a VR environment.

The remaining problem to solve is the underlying software development of the AR interface, based on the selection of an appropriate AR device.
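Superimposing the scanned virtual model on the physical space ultimately requires mapping coordinates from the model’s frame into the AR device’s tracking frame. The sketch below shows the simplest case of that registration step, a rigid transform (yaw rotation plus translation); a real pipeline would estimate these parameters from shared anchor points rather than hard-code them, and all values here are illustrative.

```python
import math

def model_to_device(p, yaw_rad, translation):
    """Map a point from the building-model frame into the device frame
    via a rigid transform: rotation about the vertical (y) axis + offset."""
    x, y, z = p
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    xr = c * x - s * z  # rotate in the horizontal plane (y is up)
    zr = s * x + c * z
    tx, ty, tz = translation
    return (xr + tx, y + ty, zr + tz)

# A point 1 m along the model's x-axis, with a 90-degree yaw and a 2 m offset.
p_dev = model_to_device((1.0, 0.0, 0.0), math.pi / 2, (2.0, 0.0, 0.0))
```

Choosing a rigid (rotation + translation) transform assumes the scanned model is already at true scale, which is typically the case for metrically calibrated indoor scans.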

Project Goals

In addition to the development of peer-reviewed publications and (potentially) open-source software libraries for further adaptation of the project by the global research audience, the project’s success will be marked by two goals:

  1. Adoption of the platform by UBC Building Operations for the maintenance of at least one building on UBC’s Vancouver campus.
  2. Widespread adoption of a ‘lite’ version of the AR platform for student building site visits, as covered by the curriculum of UBC’s Master of Engineering Leadership in High-Performance Buildings.

The Team


  • Adam Rysanek, Principal Investigator, Ph.D., UBC School of Architecture and Landscape Architecture (SALA)


  • Abel Waller
  • Andrea Tang (Project Coordinator)
  • Andrew Zulaybar
  • Cheng Zhou (Design Lead)
  • Miriam Wagner
  • Natalie Nguyen