
Pre-trained language model based method for building information model to building energy model transformation at metamodel level

Zhichen Wang, Mario Bergés, Burcu Akinci
Pages 17-25 (2024 Proceedings of the 41st ISARC, Lille, France, ISBN 978-0-6458322-1-1, ISSN 2413-5844)
Abstract:

Creating building energy models (BEMs) from building information models (BIMs) can save model re-creation time during the building design phase. However, current BIM-to-BEM transformation operates at the model level, so the BEM is regenerated every time the BIM changes. Since design changes happen frequently and the generated BEM requires fine-tuning, such whole-model regeneration remains time-consuming. Mapping rules between BIM and BEM are needed to achieve component-level transformation, so that only the corresponding part of the BEM, rather than the whole BEM, is updated when the BIM changes. These mapping rules can be defined explicitly using model transformation languages, but such rule-based transformation methods have limited scalability. To address this issue, this study proposes a pre-trained language model (PLM) based method to construct the mapping relationships between BIM and different types of BEM at the metamodel level. In summary, we formulate the BIM-BEM mapping as a machine translation task and solve it using a PLM. For evaluation, we collected and generated 35 pairs of BIM and BEM metamodels and preprocessed them into formatted texts readable by the PLM. The proposed method achieves 82% matching accuracy, higher than the 61% accuracy achieved by a baseline model in previous work. This paper shows the potential of PLMs to facilitate transformation from BIM to varying types of BEMs at the metamodel level. Future work will focus on realizing instance-level mapping and transformation.

Keywords: Model transformation, Building information model, Building energy model, Interoperability, Natural language processing, Pre-trained language model
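
Note: the abstract does not specify the PLM or the matching procedure. As a rough illustration of the general idea of matching BIM and BEM metamodel elements with a pre-trained language model, the sketch below serializes element descriptions to short texts, embeds them with an off-the-shelf sentence encoder, and pairs them by cosine similarity. The model name, element names, and similarity-based matching are all illustrative assumptions, not the authors' method.

    # Minimal sketch of PLM-based metamodel element matching.
    # Assumptions (not from the paper): the PLM is a sentence-transformers
    # encoder and matching is done by cosine similarity over embeddings
    # of serialized element descriptions; the element names are made up.
    import numpy as np
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")

    # Hypothetical BIM (IFC-style) and BEM metamodel elements,
    # preprocessed into short texts the PLM can read.
    bim_elements = [
        "IfcWall: vertical building element",
        "IfcWindow: transparent opening in a wall",
        "IfcSlab: horizontal structural element",
    ]
    bem_elements = [
        "BuildingSurface: opaque wall surface",
        "FenestrationSurface: window subsurface",
        "BuildingSurface: floor surface",
    ]

    # Encode both sides; normalized embeddings make the dot product
    # equal to cosine similarity.
    bim_emb = model.encode(bim_elements, normalize_embeddings=True)
    bem_emb = model.encode(bem_elements, normalize_embeddings=True)

    # Similarity matrix: each BIM element maps to its best BEM match.
    sim = bim_emb @ bem_emb.T
    for i, name in enumerate(bim_elements):
        j = int(np.argmax(sim[i]))
        print(f"{name} -> {bem_elements[j]} (sim={sim[i, j]:.2f})")

A mapping recovered this way would link metamodel elements once, so that a change to one BIM component could be propagated to only the corresponding BEM component rather than triggering whole-model regeneration, which is the component-level transformation the abstract motivates.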