General workflow

The general steps to perform a model calibration using the Calibration library are outlined below.

Set up a baseline experiment

A baseline experiment is used to apply the correct simulation analysis and settings. Before calibrating the model, it is therefore required to simulate it with the desired analysis and settings; these are later re-used during calibration.
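
For a Modelica model, the analysis settings of such an experiment often correspond to a standard experiment annotation (a minimal sketch; the model name is made up, and how the settings are entered depends on the tool):

```modelica
model EngineBaseline "Illustrative model carrying its simulation settings"
  // Standard Modelica annotation: simulation interval and solver tolerance
  annotation (experiment(StartTime = 0, StopTime = 10, Tolerance = 1e-6));
end EngineBaseline;
```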

Set up a reference result

The desired output values for the calibrated model are specified through a simulation result. This makes it possible to use both results from other models and data from files. The reference data can also be inspected from the analysis view.

The reference result can be generated in three ways:

  • Load a data file using the Reference file loader found in the app menu. This app can only be accessed when a model is activated. The result uploaded from the app will be assigned to the active model. Any data that is recognized as a parameter in the active model will be considered an operating condition in the reference result.
  • Simulate another model (usually a higher-fidelity model) whose result is supposed to be reproduced.
  • Use the Calibration.Generate Reference Results function to generate a reference result from a data file located in a resource folder of your active workspace.

Note

In all cases it is required that the names of reference variables, calibration parameters and operating conditions coincide between the reference result and the model to be calibrated.

In order to achieve this for a simulation model, it is highly recommended to utilize model templates and inheritance. For data files, make sure the column names in the data file match the variable names in the model.
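
For example, a reference data file could look like the following, where the column headers must match variable and parameter names in the model (the names, values, and exact file format below are made up for illustration):

```csv
time,engine.speed,engine.torque,ambient.T
0.0,104.7,250.3,293.15
1.0,110.2,252.1,293.15
2.0,115.8,254.0,293.15
```

Here, a column such as ambient.T that is recognized as a parameter in the model would be treated as an operating condition, while the remaining columns provide reference variables.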

Select calibration parameters and set bounds

The calibration parameters are the model parameters to be tuned in order to find the set that gives the smallest error when compared with the reference result. To help the algorithm progress efficiently, and to ensure that only valid parameter values are used within the calibration algorithm, bounds are given to these parameters.

  • Bounds are set by assigning min and max attributes to the parameters in the model that are to be used as calibration parameters, as in the sketch below.
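
For instance, in a Modelica model the bounds can be set directly in the parameter declarations (a minimal sketch; the model and parameter names are illustrative):

```modelica
model PipeSegment "Illustrative excerpt with calibration parameters"
  // min and max bound the range the calibration algorithm may explore
  parameter Real frictionFactor(min = 0.005, max = 0.1) = 0.02
    "Calibration parameter: friction factor";
  parameter Real heatTransferCoefficient(min = 10, max = 500) = 100
    "Calibration parameter: heat transfer coefficient";
end PipeSegment;
```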

Select reference variables and set nominals

The reference variables are the reference outputs that the model is supposed to reproduce.

  • Make sure the reference variables are present in the reference result.

When there is more than one such variable, proper scaling is needed to balance their contributions to the objective error. Otherwise, one reference variable might dominate the error, and the calibration algorithm will prioritize minimizing that contribution. This could cause the algorithm to end up at a suboptimal solution, or to not find a solution at all.

To solve this, the nominal attributes of the reference variables are re-used in the algorithm to normalize the contribution of each reference variable.
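
As an illustration of why this matters, a typical normalized least-squares objective (the library's exact formulation may differ) is:

$$
E(p) = \sum_{i}\sum_{k}\left(\frac{y_i(t_k;\,p) - y_i^{\mathrm{ref}}(t_k)}{y_i^{\mathrm{nominal}}}\right)^{2}
$$

where $p$ are the calibration parameters, $y_i(t_k;\,p)$ is the model output for reference variable $i$ at time point $t_k$, $y_i^{\mathrm{ref}}$ is the corresponding reference value, and $y_i^{\mathrm{nominal}}$ is its nominal attribute. Without the division by the nominal, a variable measured in pascals would dwarf one measured in kelvin purely because of its magnitude.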

  • Make sure proper nominal values are assigned to the reference variables, as in the sketch below.
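
In a Modelica model, the nominal value is set as an attribute on the variable declaration, just like the bounds above (an illustrative excerpt; the variable names are made up):

```modelica
// Excerpt from a model: nominal sets the expected magnitude of each variable
Real outletPressure(nominal = 1e5) "Reference variable, pressure in Pa";
Real outletTemperature(nominal = 350) "Reference variable, temperature in K";
```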

Run the calibrate function

With the above finalized, the actual calibration can be performed.

Create an experiment using the Calibration.Calibrate custom analysis. As arguments to the function, use the following items, selected or created as the result of the previous steps:

  • Baseline experiment
  • Reference result
  • Calibration parameters
  • Reference variables

Then run the custom function.

Analyzing the results

A simulation result is created when the calibration is finished. By default, it contains:

A summary case:

  • This case does not include any calculated values.
  • The log can be inspected to see statistics and info from the calibration.

A result with the model simulated using:

  • The analysis specified by the baseline experiment.
  • The optimized calibration parameters applied.
  • One case generated for every case in the reference result, applying the operating conditions (parameterization) of each of those cases.

An artifact can be found in the Artifacts tab when the calibration result is active. It contains an HTML report comparing the reference variables from the baseline, reference, and calibrated results. It can be inspected using the "preview" option.

Common issues

  • When generating a reference result using another model, it is important that any operating conditions are applied through the experiment parametrization. Parameter values stored only in the model will not be applied during the calibration.

Tips & Tricks

The 'output' parameter

The calibration function has an output parameter that controls which result is displayed in the results browser. It can be used to inspect intermediate results of the calibration, which can come in handy when debugging, for example.

  • If baseline is selected, the function will stop after running the baseline.
  • If initialization is selected, the function will stop after running the initialization simulation of the baseline experiment.

As mentioned earlier, there will be one case in the result per case in the reference result, applying the respective operating conditions (parameterization) for each of these cases.