About two months ago, I created an AI energy model to see how much time and effort it would take. Much to my surprise, it was faster and easier than I expected, with some interesting challenges, and I will outline the process here. Note that this process works with any simulation software; I chose eQUEST because it required me to gather data from varying text-based files. After all, most AI work requires collecting and organizing data that isn't already in Excel format.
The Process
Here's a quick rundown of how it worked:
- Start with a specific model: I began with a file I had of an architect's office, which I named the "CAD Captivity Center."
- Generate the data set: I ran 50 simulations, varying the same three variables. For each simulation, I set each variable as follows:
- LPD (Lighting Power Density): Randomly selected between 0.5 and 1.5 W/sq ft.
- Shading Coefficient: Randomly chosen between 0.3 and 0.8.
- Insulation (R-Value): Random values between R-5 and R-50.
- Put the data into spreadsheet format: I collected the inputs and organized them into a spreadsheet, then used Python to export the load results from the simulation files into the same spreadsheet to train the machine.
- Process the data: I divided the data into two groups: 40 models for training the AI and 10 for testing it (the 80/20 rule).
- Generate the AI model: I used open-source machine learning software (TensorFlow.js) and a little Python code. I fed the data into TensorFlow, and it generated a working AI model; a sketch of the full train/test/export pipeline follows this list. The training process took about one minute!
- Testing and verification: I ran manual tests of the model by inputting an LPD, Shading Coefficient, and R-Value, then compared the AI model's load prediction to real eQUEST results. The predictions were mostly within 1-8% of the simulated values, with a few outliers: a win for a proof of concept built on limited data.
- Exporting the AI model: The trained model was then ready for export so anyone could run it.
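For readers who want to see the shape of this pipeline, here is a minimal sketch using the Python Keras API (the author used TensorFlow.js; the idea is the same). The file name simulations.csv, the column names, and the network size are assumptions for illustration, not the author's actual code.

```python
import pandas as pd
import tensorflow as tf

# Hypothetical CSV with one row per simulation: lpd, shading_coef, r_value, peak_load
df = pd.read_csv("simulations.csv")

# Shuffle, then split: 40 simulations for training, 10 for testing (the 80/20 rule)
df = df.sample(frac=1.0, random_state=42)
train, test = df.iloc[:40], df.iloc[40:]
features = ["lpd", "shading_coef", "r_value"]

# A tiny regression network: 3 inputs in, 1 predicted load out
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# 100 epochs trains in well under a minute on 40 rows
model.fit(train[features], train["peak_load"], epochs=100, verbose=0)

# Verify against the 10 held-out simulations
print("Test MSE:", model.evaluate(test[features], test["peak_load"], verbose=0))

# Export the trained model so anyone can run it
model.save("cad_captivity_center_load_model.keras")
```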
Instant Simulations
What exactly is the AI model I created? It is a tiny calculator that can instantly predict the load for the "CAD Captivity Center" for any combination of LPD, Shading Coefficient, and R-value. Effectively, it is a little machine brain that computes the load for the Captivity Center using three variables (and assuming EVERYTHING else stays the same). It is very good at that, and it cannot do anything else.
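To show what "instant" means, here is a hypothetical usage sketch, assuming the model was exported as in the training example above:

```python
import numpy as np
import tensorflow as tf

# Load the exported model (file name from the training sketch above)
model = tf.keras.models.load_model("cad_captivity_center_load_model.keras")

# Predict the load for LPD = 1.0 W/sq ft, Shading Coefficient = 0.5, R-20 walls
inputs = np.array([[1.0, 0.5, 20.0]])
print("Predicted load:", model.predict(inputs, verbose=0)[0, 0])
```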
Practical Applications
This approach offers plenty of practical uses. For example, we could attach a cost function to each variable (e.g., Insulation Cost = Install Fee + Factor * R-Value) and solve for the lowest-cost combination that achieves a target load, without running 5,000 simulations.
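A minimal sketch of that idea, assuming the exported model from above; every cost coefficient and the target load below are made up for illustration:

```python
import itertools
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("cad_captivity_center_load_model.keras")

# Hypothetical cost functions (all fees and factors are invented for illustration)
def insulation_cost(r):   return 500 + 40 * r        # Install Fee + Factor * R-Value
def lighting_cost(lpd):   return 2000 * (1.5 - lpd)  # lower LPD = pricier fixtures
def shading_cost(sc):     return 1500 * (0.8 - sc)   # lower SC = pricier glazing

TARGET_LOAD = 250_000  # made-up target, in the model's load units

# Enumerate candidate combinations over the ranges the model was trained on
grid = list(itertools.product(
    np.arange(0.5, 1.51, 0.1),    # LPD
    np.arange(0.3, 0.81, 0.05),   # Shading Coefficient
    np.arange(5, 51, 5),          # R-Value
))
loads = model.predict(np.array(grid), verbose=0).ravel()  # one batch, near-instant

# Keep combinations that hit the target load, then pick the cheapest
feasible = [v for v, load in zip(grid, loads) if load <= TARGET_LOAD]
best = min(feasible, key=lambda v: lighting_cost(v[0]) + shading_cost(v[1]) + insulation_cost(v[2]))
print("Cheapest combination meeting the target:", best)
```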
Rapid Expandability
One of the great things about this method is its expandability. There is no strict limit on the number of input and output variables, only on the availability of a broad dataset. The model works for any combination of inputs and outputs; for example, we could train it to predict energy consumption from any set of input variables. One simply needs enough data, reasonably defined variables, and an understanding of the model's limits. For example, we could build a model that predicts the energy consumption of a building based on LPD and plug loads; however, it would be limited to that one specific building, with all other variables assumed fixed.
Of course, you can expand the variables as long as you have enough data! For example, to expand the load calculator for the CAD Captivity Center, we could add a new variable, such as glass R-value, to the initial models. This requires re-running the initial simulations with all four variables; we might need 80 iterations (or more) to maintain the accuracy we obtained with three variables in 50 simulations.
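Generating the randomized inputs for such a re-run is straightforward. Below is a minimal sketch assuming uniform sampling over each range; the glass R-value range and the file name are assumptions:

```python
import csv
import random

random.seed(42)
N_RUNS = 80  # more runs to keep sample density with a fourth variable

with open("simulation_inputs.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["run", "lpd", "shading_coef", "r_value", "glass_r_value"])
    for i in range(N_RUNS):
        writer.writerow([
            i + 1,
            round(random.uniform(0.5, 1.5), 2),  # LPD (W/sq ft)
            round(random.uniform(0.3, 0.8), 2),  # Shading Coefficient
            round(random.uniform(5, 50), 1),     # wall insulation R-value
            round(random.uniform(1, 10), 1),     # glass R-value (assumed range)
        ])
```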
Challenges
1. Data Collection: The biggest hurdle is building a dataset to train the model, which involves two key steps:
- Obtaining Raw Data
- Converting Data to CSV Format
In my example, the load outputs were buried in the eQUEST simulation output files. To extract these values, I wrote a Python script to pull the outputs from all 50 SIM files; this was tricky for a few of them. Another option is to enter the data into the CSV manually.
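SIM files are plain-text reports, so the extraction is mostly pattern matching. Here is a hedged sketch of the idea; the directory, the "TOTAL LOAD" label, and the regex below are placeholders, not the author's actual script, and a real SIM report would need its own pattern:

```python
import csv
import glob
import re

rows = []
for path in sorted(glob.glob("runs/*.SIM")):
    with open(path, errors="ignore") as f:
        text = f.read()
    # Placeholder pattern: grab the first number after a "TOTAL LOAD" label.
    # Real eQUEST reports differ; adjust the label and regex to the report used.
    match = re.search(r"TOTAL LOAD\s+([\d.]+)", text)
    if match:
        rows.append([path, float(match.group(1))])
    else:
        print(f"Could not parse {path} -- extract this one manually")

# Write the results to a CSV for the training spreadsheet
with open("loads.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["sim_file", "peak_load"])
    writer.writerows(rows)
```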
2. Model Expansion: The model can be expanded, but expansion requires more data. If we wanted to add a complex variable such as climate zone, we'd need to simulate all iterations for each climate zone: an 8x increase if the input is Climate Zone 1-8 (not including a, b, and c), pushing the iteration requirement to a minimum of 400. In reality, location involves so many variables that it might require thousands of models. The point is that it is expandable.
Below is a screenshot of TensorFlow training on the data. I set it to run 100 "epochs," though you can choose more or fewer; the machine learns progressively as the epochs proceed. Here you can see that most epochs took about 3 milliseconds, so training time is not a significant limitation.
Limitations
The model is limited to the variables on which it was trained and isn't designed to replace traditional modeling software. Instead, it acts as a powerful, complementary tool, enabling one to explore thousands of variable combinations instantly.
Sharing the AI Model
Once created, the AI model is a compact file, easily shareable—a small but powerful calculator. I considered embedding it here, but why would I want to answer a bunch of emails about it?
Final Takeaway
The key takeaway is that AI modeling is readily available, and it is highly specialized. With sufficient training data, it can handle certain tasks exceptionally well, and extremely fast. I see a future where robust AI simulations emerge: at first they will be used in conjunction with traditional models, and eventually AI engines could become energy modeling software in their own right. We simply need a large enough dataset.
From what I've seen, an AI model could produce "above average" results using 20 inputs. Many traditional models I've seen suffer from GIGO (garbage in, garbage out); an AI engine would ensure users only input the relevant items. It could easily replace a weak modeler. On the other hand, AI is far from competing with expert modelers.
Interested in the Step-by-Step Procedure?
If you're curious about the exact steps I took to build this AI model, feel free to connect with me on LinkedIn below.