
Define metrics for the quality of the simulation results #248

Open
phelps-sg opened this issue May 13, 2020 · 0 comments

Comments

@phelps-sg
Contributor

It would be useful to quantify the quality of the simulation results. If we can measure how good the simulation is, these metrics could be used to determine whether the model improves as new features are implemented or as issues are resolved. To assess the model's performance, we first need to state the purpose of the model (#246) so that suitable performance metrics can be defined. If the goal of the model is to make forecasts, then the quality of the simulation outputs could be assessed by some kind of MSE metric (#206).
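For illustration, a minimal sketch of what such an MSE metric could look like, assuming the simulation produces a time series that can be aligned against observed data (the names `simulated_cases` and `observed_cases` are hypothetical, not taken from the codebase):

```python
import numpy as np


def mean_squared_error(simulated, observed):
    """Return the MSE between aligned simulated and observed series."""
    simulated = np.asarray(simulated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    if simulated.shape != observed.shape:
        raise ValueError("simulated and observed series must have the same shape")
    return float(np.mean((simulated - observed) ** 2))


# Hypothetical usage, e.g. daily case counts from the model vs. reported data:
# mse = mean_squared_error(simulated_cases, observed_cases)
```

The exact metric would depend on the stated purpose of the model; RMSE, MAE, or a likelihood-based score could be substituted without changing the overall approach.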
