University of California, Berkeley RAPMOD (OPEN 2012)


PROJECT TITLE: Rapid Building Energy Modeler - RAPMOD
AWARD: $2,834,899
PROJECT TEAM: University of California – Berkeley (Lead), Indoor Reality, Baumann Consulting, Lawrence Berkeley National Laboratory (LBNL)
PROJECT TERM: April 2013 – November 2016


Building energy use accounts for ~40% of total U.S. energy consumption (approximately 39 quads in 2015) and is the largest cost in building operations. As much as 30% of a building’s energy consumption can be lost to inefficiencies in its operation or its original design. Retrofitting or retro-commissioning buildings for improved energy efficiency requires determining the most impactful modifications through an energy audit. While retro-commissioning can typically reduce whole-building energy consumption by ~16% and pay back in a few years, less than 5% of buildings undergo an audit and retro-commissioning due to the high upfront cost, time, and complexity of the auditing process. A fast, automated, accurate, and inexpensive process is needed to rapidly assemble building energy models and recommend the most impactful energy retrofit and recommissioning options.


Building energy audits today are performed manually, requiring extensive time (normally several days) for a skilled professional to assess the various components of large buildings, leading to high costs. Subsequent development of building energy models, used to simulate the interplay of energy-relevant components to identify problems and predict the impact of retrofits, is also a labor-intensive process. Automation of energy auditing and model generation is now possible due to improved sensors, robotics, computational power, and image processing. High-sensitivity optical sensors are now available that can provide precise information on scanned features, thereby decreasing measurement inaccuracies. Advances in computing power and optical image recognition algorithms allow rapid sensing and detection on mobile platforms. Techniques to deal with a moving sensor platform, which have been developed for robots and drones, can be adapted for energy audits, but innovation is still needed to deal with navigation in confined or uneven areas, such as closets, utility rooms, and stairwells.


The Berkeley team’s goals were to create a human-carried (backpack) audit package and to demonstrate, for a commercial building with at least three floors, no more than a 10% difference in predicted energy usage between EnergyPlus models built from a manual audit and from the backpack audit process, at a backpack capital cost under $40,000. Prior to ARPA-E support, the team at UC Berkeley had demonstrated a set of hardware and software capable of generating crude 3D indoor maps. A wearable backpack-style device had been assembled combining Light Detection and Ranging (LiDAR) scanners, inertial measurement units, and visible-light optical cameras to capture data (point clouds) as an operator walked through a building space. The associated software was able to stitch together and render the captured images, performing simultaneous localization and mapping (SLAM) while compensating for operator motion/orientation, to produce an output that mimics the Google Streetview® experience for indoor environments.

The project team built on this foundational work by enhancing the accuracy and scalability of the sensor fusion algorithms and integrating the data from added infrared cameras into the SLAM output to allow automated construction of building energy models for use by building auditors. The project team undertook three inter-related technical tasks to address each step of the automatic audit process. The first task was to synchronize the thermal imaging data with the existing SLAM data to assemble visible and infrared (IR) point clouds with improved localization accuracy and scalability. The second task involved improving sensing and detection algorithms with all optical data sets to locate energy-relevant features such as windows, lighting, and plug loads in the building. The third task was to combine and streamline the SLAM + building element data for easy import into the EnergyPlus building-modeling environment.
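The first task above, fusing thermal imagery with the SLAM output, amounts to attaching infrared readings to 3D points. A minimal sketch of one common approach (an assumed pinhole-camera projection, not the project's actual pipeline) illustrates the idea: each 3D point is projected into the IR camera's image plane, and the temperature at the corresponding pixel is assigned to the point.

```python
# Minimal sketch (assumed pinhole-camera model, NOT the project's pipeline):
# coloring a 3D point with an infrared temperature reading by projecting the
# point into the IR camera's image plane. All parameter values are made up.
import numpy as np

def project_point(point_cam, fx, fy, cx, cy):
    """Project a 3D point (in IR-camera coordinates) to a pixel (u, v)
    using pinhole intrinsics (focal lengths fx, fy; principal point cx, cy)."""
    x, y, z = point_cam
    if z <= 0:
        return None  # point is behind the camera; no valid projection
    return fx * x / z + cx, fy * y / z + cy

def sample_temperature(ir_image, pixel):
    """Look up the temperature at the nearest pixel, or None if the
    projection fell outside the thermal frame."""
    if pixel is None:
        return None
    u, v = int(round(pixel[0])), int(round(pixel[1]))
    h, w = ir_image.shape
    return float(ir_image[v, u]) if 0 <= u < w and 0 <= v < h else None

# Synthetic 640x480 thermal frame at a uniform 21.5 C (illustration only)
ir = np.full((480, 640), 21.5)
pix = project_point((0.1, 0.0, 2.0), fx=500, fy=500, cx=320, cy=240)
print(pix, sample_temperature(ir, pix))
```

In a real system the point cloud, camera intrinsics, and IR-to-LiDAR extrinsic calibration all come from the sensor rig; the sketch only shows why accurate time and pose alignment matters, since any localization error shifts the projected pixel.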

The project began with integrating and synchronizing the infrared imaging data with the visible camera data. The team developed optical recognition algorithms to identify key building features, along with data simplification and translation software to import the results into the EnergyPlus modeling environment. They demonstrated that the system could automatically calculate the window-to-wall ratio, a critical metric for building energy models, within 10% error of the value determined via a manual audit.
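The window-to-wall ratio itself is a simple quantity: total glazed area divided by gross exterior wall area. A small sketch (illustrative only, with made-up areas; not the project's code) shows the calculation and the kind of 10%-relative-error check used to compare against a manual audit.

```python
# Illustrative sketch (not the project's code): computing window-to-wall
# ratio (WWR) from hypothetical detected surface areas, and checking
# agreement with a manual-audit value to within 10% relative error.

def window_to_wall_ratio(window_areas, wall_area):
    """WWR = total glazed area / gross exterior wall area."""
    if wall_area <= 0:
        raise ValueError("wall area must be positive")
    return sum(window_areas) / wall_area

def within_tolerance(auto_value, manual_value, tol=0.10):
    """True if the relative error of the automated value vs. the
    manual-audit value is at most tol (default 10%)."""
    return abs(auto_value - manual_value) / manual_value <= tol

# Hypothetical example: three detected windows on a 120 m^2 facade,
# compared against a (made-up) manually audited WWR of 0.115.
wwr_auto = window_to_wall_ratio([4.2, 3.8, 5.0], 120.0)
print(round(wwr_auto, 4), within_tolerance(wwr_auto, 0.115))
```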

Figure 1: (a) The RAPMOD backpack captures (b) visual and infrared optical data as the operator walks by objects, which are (c) stitched together into 3D point clouds and integrated into building geometry and energy models. The resulting information is streamlined and ported into energy models (d) to recommend energy conservation measures and estimate their impact on building energy use.


The team also addressed measurements not well suited to automated detection. Specifically, to determine window U-value, the team added a commercial handheld optical scanner that the backpack operator places on building windows to capture the characteristics needed to calculate the U-value. The team likewise addressed the problem of identifying energy-consuming components (e.g., HVAC equipment, water heaters) by using a handheld device (i.e., a smartphone) to capture images of large equipment nameplates, which are time-stamped for synchronization with the location of the backpack. The degree of automated data entry depends on whether the loads are visible or concealed, and whether the load type has been trained into the recognition algorithms.
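The time-stamp synchronization described above can be sketched as a nearest-timestamp lookup: given the backpack's SLAM trajectory as a time-ordered list of poses, a photo's capture time is matched to the closest trajectory sample. This is an assumed illustration of the general technique, not the project's implementation.

```python
# Illustrative sketch (assumed approach, not the project's code): tagging a
# time-stamped nameplate photo with a backpack position by nearest-timestamp
# lookup in the SLAM trajectory.
from bisect import bisect_left

def locate_photo(trajectory, photo_time):
    """trajectory: list of (timestamp, (x, y, z)) sorted by timestamp.
    Returns the pose whose timestamp is closest to photo_time."""
    times = [t for t, _ in trajectory]
    i = bisect_left(times, photo_time)
    if i == 0:
        return trajectory[0][1]          # photo taken before the walk began
    if i == len(times):
        return trajectory[-1][1]         # photo taken after the walk ended
    before, after = trajectory[i - 1], trajectory[i]
    # Pick whichever neighboring trajectory sample is closer in time.
    return before[1] if photo_time - before[0] <= after[0] - photo_time else after[1]

# Made-up three-sample trajectory; a photo at t=1.4 s maps to the t=1.0 pose.
traj = [(0.0, (0, 0, 0)), (1.0, (1, 0, 0)), (2.0, (2, 1, 0))]
print(locate_photo(traj, 1.4))
```

A production system would interpolate between poses and account for clock offset between the phone and the backpack; nearest-neighbor lookup is the simplest version of the idea.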

After two years, the team demonstrated a backpack audit of a ~69,000 sq ft building performed in one-sixth of the time required to complete the same audit manually using established practices. Comparison of total annual building energy consumption in the EnergyPlus models created using the two audit methods showed less than 10% error for the components included in the audit. The cost of the latest version of the backpack is around $80,000, and the end-of-project cost target of $40,000 per backpack appears achievable through reductions in the number and sensitivity of sensors incorporated on the backpack.

Continuing work includes streamlining and automating the many software detection and processing algorithms that remain manually operated. In addition, audit functionality has been developed that incorporates external information about the buildings and allows the operator to dynamically adjust variables within the building model to simulate energy efficiency retrofit/retro-commissioning measures. A client-facing, web-based interactive tool has been developed for use on an unlimited number of data sets by design, construction, and audit end users. The team has also completed testing of the audit process in a cold climate region, where the heating load is a more crucial factor, with promising early results. Further demonstrations and comparisons are underway in a broader variety of climate zones to refine the energy model processing algorithms and provide assurance of accuracy to potential clients.


The team at UC Berkeley formed a startup company, Indoor Reality, to continue development and commercialization of the backpack and associated software built for indoor building mapping. Indoor Reality is targeting the architecture, engineering, and construction (AEC) industry as a first market for its building mapping technology. The preparation of accurate as-built documentation is a strong need in the AEC industry, particularly in regions requiring documentation of energy-relevant performance metrics. The team estimates that ~3.8 billion sq ft of building space is affected by benchmarking, transparency, or audit policies. The team plans to employ a service-based business model in which Indoor Reality manages the physical survey, with the online documentation and analysis software as the principal paid product.

In 2016, Indoor Reality completed indoor building scans for approximately 10 customers and has several additional projects booked, with other potential customers in its pipeline. The team has obtained $200k in follow-on funding from The House Fund, $500k in seed funding from DPR Construction, and $345k in angel investments.


The project has made substantial advances in optical recognition technology to achieve precise measurement of building geometry and identify key building elements while ignoring extraneous elements (e.g., furniture). Furthermore, scaling the mapping technology and processing the extremely large amount of data captured as point clouds was a formidable computational challenge overcome by the team. Synthesizing and simplifying these data into building dimensions, while preserving the detail needed to model building energy use more accurately than today’s audit methodologies allow, represents a significant output of the project.

Building retro-commissioning typically reduces whole-building energy consumption by ~16%, whereas energy efficiency retrofits can save on average ~45% of building energy. Indoor Reality projects that its products will significantly reduce the cost (by about a factor of 3), time (from days or weeks to hours), and complexity (from high- to low-skill labor) of energy audits, lowering the barrier to entry for many more audit providers, creating jobs in auditing, lowering costs for construction, and ultimately improving U.S. building efficiency.


As of January 2017, the project has generated two invention disclosures to ARPA-E. The UC Berkeley team has also published the scientific underpinnings of this technology in the open literature. A list of publications is provided below:

Oreifej, O., Cramer, J., & Zakhor, A. (2014). "Automatic Generation of 3D Thermal Maps of Building Interiors," ASHRAE Transactions.

Corso, N., & Zakhor, A. (2013). "Indoor Localization Algorithms for an Ambulatory Human Operated 3D Mobile Mapping System," Remote Sensing, vol. 5, no. 12, pp. 6611–6646.