Optimizing Neural Networks for Modeling and Simulation (Machine Learning Blog Part 2)

Written by Ryan Dudgeon

May 19, 2023
Tags: machine learning, neural networks, simulation

Why Neural Networks are Effective in Machine Learning

Neural networks are powerful machine learning (ML) models that can capture highly nonlinear relationships between the inputs and outputs of a dataset while remaining computationally inexpensive to execute. The benefits of neural networks for modeling and simulation activities, using the simulation platform GT-SUITE, were covered in part 1 of this blog series.

In this blog, we’ll look at how to optimize the predictive accuracy and training of neural networks. One barrier to overcome when using neural networks is hyperparameters: the configuration properties that affect predictive accuracy and training time. For example, two main hyperparameters that determine the neural network configuration, or architecture, are the number of hidden layers and the number of neurons in each hidden layer. Unfortunately, the user cannot know in advance which combination of hyperparameters is best suited for a given dataset.
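
To make the idea concrete, here is a minimal sketch of those two architecture hyperparameters expressed in code. It uses scikit-learn's MLPRegressor purely as a stand-in for GT-SUITE's built-in neural networks, so everything beyond the concept itself is an assumption:

```python
from sklearn.neural_network import MLPRegressor

# Two architecture hyperparameters: the number of hidden layers and the
# number of neurons in each layer. hidden_layer_sizes=(10, 5) defines a
# network with 2 hidden layers containing 10 and 5 neurons, respectively.
model = MLPRegressor(hidden_layer_sizes=(10, 5), max_iter=2000, random_state=0)
```

Changing those numbers changes the model's capacity, training time, and ultimately its predictive accuracy, which is why they need to be searched rather than guessed.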

One common way to determine the best set of hyperparameters for a given dataset, and thereby find an optimal neural network, is to perform what is sometimes called a grid search. A grid search consists of defining a list of values to try for each hyperparameter under study, then creating a list of neural networks representing every combination of the varied hyperparameters. This set of neural networks is trained, and a metric such as validation root-mean-squared error is used to choose the final metamodel.

For example, if the user wants a neural network with two hidden layers and tests 5, 10, and 15 neurons in the first hidden layer and 4 and 8 neurons in the second, six neural networks (the 3 × 2 cross product) would be created and trained.
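
As an illustration of the same grid search performed outside of GT-SUITE (a hedged sketch only: scikit-learn's MLPRegressor and the synthetic dataset below are stand-ins, not GT-SUITE's internal implementation), the six candidates can be enumerated as the cross product of the two value lists and ranked by validation RMSE:

```python
from itertools import product

import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for a DOE dataset: 2 inputs, 1 nonlinear output
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 2))
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

layer1_options = [5, 10, 15]  # neurons in hidden layer 1
layer2_options = [4, 8]       # neurons in hidden layer 2

results = []
for n1, n2 in product(layer1_options, layer2_options):  # 3 x 2 = 6 candidates
    mlp = MLPRegressor(hidden_layer_sizes=(n1, n2), max_iter=5000, random_state=0)
    mlp.fit(X_train, y_train)
    val_rmse = np.sqrt(mean_squared_error(y_val, mlp.predict(X_val)))
    results.append(((n1, n2), val_rmse))

# Keep the candidate with the lowest validation RMSE
best_architecture, best_rmse = min(results, key=lambda item: item[1])
print(f"Best architecture: {best_architecture}, validation RMSE: {best_rmse:.4f}")
```

GT-SUITE automates exactly this bookkeeping, so the user only has to supply the lists of values to try.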

Leverage GT-SUITE’s New Neural Network Capabilities in v2023

Manually creating dozens or hundreds of neural networks with different combinations of hyperparameters in GT-SUITE’s DOE post-processor would be impractically tedious and time-consuming. Fortunately, a new tool available in GT-SUITE v2023 generates these many neural network candidates with just a few mouse clicks.

In the Create Metamodels page of the DOE post-processor, a button is available to “Create Multiple multi-layer perceptrons (MLP) Metamodels.” The main dialog of this tool, shown below, lists the hyperparameters and allows the user to choose multiple values to try for each (additional, more advanced hyperparameters are not shown here):

With the values shown in the screenshot, 108 total metamodels will be created (a quick arithmetic check of this count follows the list below). The following hyperparameters are varied:

  • Neural networks with 2 and 3 hidden layers are tested
  • 3, 6, and 9 neurons in hidden layer 1 are tested 
  • 3 and 6 neurons in hidden layer 2 are tested
  • 3 and 6 neurons in hidden layer 3 are tested
  • Normalization and standardization scaling methods are tested
  • To deal with the stochastic nature of the training process, each neural network will be trained 3 times, as entered in the Number of Repeated Trials field
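
As a quick sanity check of that count (plain arithmetic, nothing GT-SUITE-specific):

```python
# 2-hidden-layer candidates: 3 (layer 1 sizes) x 2 (layer 2 sizes) = 6
# 3-hidden-layer candidates: 3 x 2 x 2 (layer 3 sizes)             = 12
architectures = 3 * 2 + 3 * 2 * 2   # 18 distinct architectures
scaling_methods = 2                 # normalization, standardization
repeated_trials = 3                 # repeated training runs per candidate
print(architectures * scaling_methods * repeated_trials)  # 108
```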

GT users can preview a table listing the 108 metamodels before they are sequentially trained (the first 15 are shown below): 

After training the 108 metamodels, determining the best one consists of simply sorting by validation root-mean-square error in the Compare Metamodel Metrics page (assuming the dataset was partitioned into training, validation, and test samples).
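
Outside the Compare Metamodel Metrics page, the equivalent step is just sorting a results table by validation RMSE. A minimal pandas sketch with hypothetical column names and made-up illustrative values (not GT-SUITE output):

```python
import pandas as pd

# Hypothetical metrics table, one row per trained metamodel
metrics = pd.DataFrame({
    "metamodel": ["MLP_001", "MLP_002", "MLP_003"],
    "validation_rmse": [0.042, 0.031, 0.055],
    "test_rmse": [0.045, 0.033, 0.060],
})

# Sort ascending so the most accurate metamodel appears first
best_first = metrics.sort_values("validation_rmse")
print(best_first.iloc[0]["metamodel"])  # -> MLP_002
```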

Here’s a full video demonstration of creating metamodels with these MLP hyperparameters:

Learn More About Our Machine Learning Simulation Solutions

The next time a lookup table or lookup map is needed for your GT-SUITE model, consider improving the model’s accuracy by training a neural network or other ML model on that data. For a GT-SUITE model that already utilizes one or more lookups, one might also consider upgrading it with ML.

If you are interested in learning more about how you can implement machine learning in GT-SUITE, see our productivity capabilities. You can also contact us here.

New Design of Experiments and Machine Learning Training!

If you’re a model builder and an active or new GT-SUITE user who would like to learn more about GT-SUITE’s DOE and machine learning capabilities, we now have a new training course!

Click here to access the training

The machine learning training content is divided into 16 separate videos, each ranging from 3 to 9 minutes in length.

Training video breakdown:

  • The first two videos provide background information about the benefits and motivations of Design of Experiments (DOE) and machine learning
  • One video covers creating and running a DOE on a GT-SUITE model
  • The remaining videos demonstrate how to use the machine learning tool that is integrated in GT-POST

NOTE: These trainings are only for those with GT-SUITE accounts