About two months ago, I created an AI energy model to see how much time and effort it would require. Much to my surprise, it was faster and easier than I expected, with some interesting challenges along the way, and I will outline the process here. Note that this process works with any simulation software. I chose eQUEST because it forced me to gather data from varied text-based files; after all, most AI work involves collecting and organizing data that isn't already in Excel format.
Here's a quick rundown of how it worked:
What exactly is the AI model I created? It is a tiny calculator that can instantly predict the load for the "CAD Captivity Center" for infinite combinations of LPD, shading coefficient, and R-value. Effectively, it is a little machine brain that computes the load for the Captivity Center using three variables (and assuming EVERYTHING else stays the same). It is very good at that, and it cannot do anything else.
This approach offers plenty of practical uses. For example, we could tie the variables to a cost function (e.g., Insulation Cost = Install Fee + Factor * R-Value) and solve for the lowest-cost option that achieves a target load, without running 5,000 iterations.
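To make the cost-optimization idea concrete, here is a minimal sketch. The `predict_load` function below is a hypothetical stand-in for the trained model (the real one would be loaded from its saved file), and the fee and factor values are made up; only the cost formula itself comes from the text.

```python
# Sketch: once a trained model exists, finding the cheapest R-value that hits
# a target load is a simple search over predictions.

INSTALL_FEE = 500.0   # hypothetical fixed install cost
FACTOR = 40.0         # hypothetical cost per unit of R-value

def insulation_cost(r_value):
    # Cost function from the text: Install Fee + Factor * R-Value
    return INSTALL_FEE + FACTOR * r_value

def predict_load(lpd, shading_coeff, r_value):
    # Placeholder for the trained AI model's prediction.
    # A real model would be loaded and called here instead.
    return 120.0 - 2.5 * r_value + 30.0 * shading_coeff + 8.0 * lpd

def cheapest_option(target_load, lpd, shading_coeff, r_values):
    # Keep only R-values whose predicted load meets the target,
    # then return the lowest-cost one.
    feasible = [r for r in r_values
                if predict_load(lpd, shading_coeff, r) <= target_load]
    if not feasible:
        return None
    return min(feasible, key=insulation_cost)

r_options = [r / 2 for r in range(10, 61)]   # R-5 to R-30 in 0.5 steps
best = cheapest_option(target_load=100.0, lpd=1.0, shading_coeff=0.4,
                       r_values=r_options)
print(best, insulation_cost(best))
```

Because the model predicts instantly, this search over dozens (or thousands) of candidate R-values costs essentially nothing compared with re-running simulations.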
One of the great things about this method is its expandability. There is no strict limit on the number of input and output variables, only on the availability of a broad dataset. The model works for any combination of inputs and outputs; one simply needs enough data, reasonably defined variables, and a clear understanding of the model's limits. For example, we could make a model that predicts the energy consumption of a building based on LPD and plug loads. However, it would be limited to that one specific building, with all other variables assumed constant.
Of course, you can expand the variables as long as you have enough data! For example, to expand the load calculator for the CAD Captivity Center, we could add a fourth variable, such as glass R-value, to the initial model. This requires re-running the initial simulations with all four variables; we might need to run 80 iterations (or more) to maintain the same accuracy we obtained with three variables in 50 simulations.
1. Data Collection
The biggest hurdle is building a dataset to train the model, which involves two key steps:
In my example, the load outputs were found in the eQUEST simulation (SIM) files. To extract these values, I wrote a Python script to pull the outputs from all 50 SIM files; a few of them were tricky to parse. Another option is to enter the data into a CSV manually.
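A script like the one described might look something like this sketch. The line pattern is hypothetical: real eQUEST SIM reports vary by report type, so the regular expression would need to be matched to the specific summary being scraped.

```python
# Sketch: pulling one output value from each of many text-based SIM files
# into a single CSV for training. The LOAD_PATTERN here is a made-up
# example; the actual pattern depends on the SIM report layout.
import csv
import re
from pathlib import Path

LOAD_PATTERN = re.compile(r"TOTAL\s+COOLING\s+LOAD\s+([\d.]+)")  # hypothetical

def extract_load(sim_text):
    # Return the first matching load value in the file, or None.
    match = LOAD_PATTERN.search(sim_text)
    return float(match.group(1)) if match else None

def build_dataset(sim_dir, out_csv):
    # Walk every .SIM file and write one row per successful extraction.
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["file", "load"])
        for sim_file in sorted(Path(sim_dir).glob("*.SIM")):
            load = extract_load(sim_file.read_text(errors="ignore"))
            if load is not None:
                writer.writerow([sim_file.name, load])
            # Files that don't match (the "tricky" ones) can be
            # handled manually, as noted in the text.
```

The input variables for each row (LPD, shading coefficient, R-value) would come from however the 50 simulation runs were parameterized, joined to this CSV by filename.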
2. Model Expansion
We can expand the model, but the expanded model requires more data. If we wanted to add a complex variable such as climate zone, we'd need to simulate all iterations for each climate zone: an 8x increase if the input is Climate Zone 1-8 (not including the a, b, and c subzones). This would push the iteration requirement to a minimum of 400. In reality, location involves so many variables that it might require thousands of simulations. The point is that the approach is expandable.
Below is a screenshot of TensorFlow training on the data. I set it to run 100 "epochs," though you can choose more or fewer. The machine learns more with each epoch; here, most epochs took about 3 milliseconds, so training time is not a significant limitation.
Limitations
The model is limited to the variables on which it was trained and isn't designed to replace traditional modeling software. Instead, it acts as a powerful, complementary tool, enabling you to explore thousands of variable combinations instantly.
Once created, the AI model is a compact file, easily shareable—a small but powerful calculator. I considered embedding it here, but why would I want to answer a bunch of emails about it?
The key takeaway is that AI modeling is readily available, and it is highly specialized. With sufficient training data, it can handle certain tasks exceptionally well, and extremely fast. I see a future where robust AI simulations emerge: at first used in conjunction with traditional models, and eventually growing into full energy modeling engines. We simply need a large enough dataset.
From what I've seen, an AI model could produce "above average" results using around 20 inputs. Many human-built models are GIGO (garbage in, garbage out), and an AI engine would ensure users only input the relevant items, so it could easily replace a weak modeler. On the other hand, AI is far from competing with expert modelers.
Interested in the step-by-step procedure? If you're curious about the exact steps I took to build this AI model, feel free to connect with me on LinkedIn below.
Energy-Models.com is a site for energy modelers, building simulators, architects, and engineers who want to learn everything from the basics to advanced concepts of energy modeling. We've got online training courses and tutorials for eQUEST, Trane TRACE 700, OpenStudio, and LEED for energy modeling. All our energy modeling courses are video based. What better way to learn energy modeling software than screen-casts of exactly how things are done?
Copyright © 2010-2024 CosmoLogic LLC. TRACE 700 and eQUEST are ™ of Trane Inc. and James J. Hirsch respectively. Energy-Models.com is built in San Francisco, CA and Slinger, WI USA.