
Testing Procedures


CESM Component Testing Procedures


This page provides links to documentation describing the testing procedures for each component / repository that makes up CESM (or in some cases this page may provide this documentation directly). The audience is CESM software engineers who are familiar with CESM / CIME system testing in general, but who need to run testing on a component that they do not regularly work with.

Many components run different testing procedures depending on what has changed. This page is not meant to be exhaustive; rather, it aims to capture the most common procedures – those that should cover roughly 90% of cases.

Components can organize their documentation in whatever way they wish. However, they should aim to include at least the following information. Ideally, the documentation's organization will allow readers to quickly find these pieces of information. The goal is to make it relatively easy for CESM software engineers to run the testing for a different component when the need arises; needing to spend an hour reading through documentation defeats that goal.

Give a very brief overview of the testing, merging and tagging process for this component. Some example questions to cover are:

  • Is it expected that an issue is opened for every planned change?
  • Is there a ChangeLog entry for each merge to the main branch?

Does this component support a standalone checkout for testing, or does it need to be checked out in the context of CESM or some other "umbrella" repository? If the latter, describe how to determine a reasonable version of the umbrella repository to use for testing – and in particular, how to determine what versions will have baselines available for system tests of the component.
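For illustration only, testing a component from inside a CESM checkout typically follows a pattern like the one below; the tag, directory, and branch names are placeholders, and the checkout_externals step assumes the standard manage_externals setup shipped with CESM.

    # Clone CESM at an umbrella tag known to have baselines (tag name is a placeholder)
    git clone https://github.com/ESCOMP/CESM.git cesm_for_testing
    cd cesm_for_testing
    git checkout <cesm_tag_with_baselines>

    # Fetch all externals, then point the component under test at the branch being tested
    ./manage_externals/checkout_externals
    cd components/<component>
    git checkout <feature_branch_under_test>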

Give details of the exact test commands to run to achieve full system testing, and on what machine(s). This does not need to exhaustively cover all possible testing scenarios. Rather, it is meant to cover the typical cases – say, the common 90% of scenarios. The testing commands should be detailed enough that someone can essentially copy and paste the commands into a terminal (replacing some placeholder text, e.g., for the baseline tag name to be used for comparison). Different components have different procedures for how to kick off their test suites – for example, whether a separate create_test invocation is made for each compiler in the test suite – so please be very explicit in these instructions.
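As a hedged sketch of what such instructions might look like (the test category, machine, compiler, tags, and paths below are all placeholders rather than a prescription for any particular component), a CIME create_test invocation that compares against existing baselines and generates new ones looks roughly like this:

    cd cime/scripts
    ./create_test --xml-category <test_category> \
      --xml-machine cheyenne --xml-compiler intel \
      --compare <previous_baseline_tag> --generate <new_baseline_tag> \
      --baseline-root <baseline_root_dir> \
      --test-root <scratch_dir_for_this_run> \
      --project <project_code>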

If there is anything non-standard in how to check the test results, document that as well.
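For most CIME-based suites, results can be summarized with the cs.status script that create_test writes into the test root; a minimal sketch (the test id is a placeholder, and the --fails-only flag may not exist in older CIME versions):

    cd <scratch_dir_for_this_run>
    ./cs.status.<testid>                 # PASS/FAIL summary for every test phase
    ./cs.status.<testid> --fails-only    # only the failures, on CIME versions that support it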

For each test machine, give the location on disk in which baselines can be found for the above testing.

Describe how expected failures can be determined for any given tag.
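Conventions vary; as one example, some components (CTSM, for instance) keep an expected-failures list in an XML file inside the repository, in which case the list for a given tag can be read directly from git (the path below follows CTSM's layout and is only an assumption for other components):

    # Show the expected-failures list recorded at a particular component tag
    git show <component_tag>:cime_config/testdefs/ExpectedTestFails.xml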

Does the full testing described above need to be run on each set of changes? Or is it common to combine multiple changes / PRs together and have a component SE run full testing on the batch of changes? If the latter, is there some minimum expected / recommended testing on the individual changes?

State whether a feature branch needs to merge in the latest version of the main branch before running final testing. If so, there will typically be some process for tag ordering, so briefly describe that process.

Give a very rough estimate of test turnaround time and core-hour cost, to help answer the question: if I run this test suite a number of times, am I going to burn through my allocation?
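A back-of-envelope estimate is enough; for example (all of the numbers below are made up purely for illustration):

    # Hypothetical suite: 80 tests averaging 2 nodes x 36 cores for 1 hour each
    tests=80; nodes=2; cores_per_node=36; hours=1
    echo "$(( tests * nodes * cores_per_node * hours )) core-hours per full run"   # prints 5760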

Add anything else that is important but not covered above. Include any references to additional relevant documentation.

For components that have their own externals (e.g., CTSM's FATES, or CAM's CARMA, CLUBB, and PUMAS): provide links to any documentation on procedures for changing those externals.
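For reference, externals handled by manage_externals are pinned in an Externals*.cfg file in the host repository, so updating one generally amounts to editing the pinned tag and re-running checkout_externals. A rough sketch, with placeholder repository and tag names following the usual manage_externals field layout:

    # Example Externals*.cfg entry (shown here as a comment; field names follow
    # manage_externals conventions, and the repository URL and tag are placeholders):
    #
    #   [fates]
    #   local_path = src/fates
    #   protocol = git
    #   repo_url = https://github.com/NGEET/fates
    #   tag = <new_fates_tag>
    #   required = True

    # After editing the pinned tag, refresh the working copy:
    ./manage_externals/checkout_externals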

Component-specific testing documentation:

  • CESM integration testing (prealpha, prebeta)
  • CIME
  • CMEPS
  • CDEPS
  • FMS
  • CAM
  • CTSM/CLM
  • POP
  • MOM
  • CICE5
  • CISM
  • MOSART
  • RTM
  • WW3