edit Quick Start ch & misc
gspetro-NOAA committed Oct 23, 2023
1 parent edac412 · commit f13402e
Showing 4 changed files with 49 additions and 14 deletions.
@@ -133,7 +133,7 @@ The UFS Weather Model contains a number of sub-repositories, which are documented
.. COMMENT: Update link to release docs!
.. note::
-    The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS SRW Application repository. The `spack-stack <https://github.com/NOAA-EMC/spack-stack>`__ repository assembles these prerequisite libraries. Spack-stack has already been built on `preconfigured (Level 1) platforms <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__. However, it must be built on other systems. See the :doc:`spack-stack Documentation <spack-stack:index>` for details on installing spack-stack.
+    The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS SRW Application repository. The `spack-stack <https://github.com/JCSDA/spack-stack>`__ repository assembles these prerequisite libraries. Spack-stack has already been built on `preconfigured (Level 1) platforms <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__. However, it must be built on other systems. See the :doc:`spack-stack Documentation <spack-stack:index>` for details on installing spack-stack.

.. _TopLevelDirStructure:

4 changes: 4 additions & 0 deletions docs/UsersGuide/source/BuildingRunningTesting/AQM.rst
@@ -4,6 +4,10 @@
Air Quality Modeling (SRW-AQM)
=====================================

+ .. attention::

+    AQM capabilities are an unsupported feature of the SRW App. This means that it is available for users to experiment with, but assistance for AQM-related issues is limited.

The standard SRW App distribution uses the uncoupled version of the UFS Weather Model (atmosphere-only). However, users have the option to use a coupled version of the SRW App that includes the standard distribution (atmospheric model) plus the Air Quality Model (AQM).

The AQM is a UFS Application that dynamically couples the Community Multiscale Air Quality (:term:`CMAQ`) model with the UFS Weather Model (WM) through the :term:`NUOPC` Layer to simulate temporal and spatial variations of atmospheric compositions (e.g., ozone and aerosol compositions). The CMAQ model, treated as a column chemistry model, updates concentrations of chemical species (e.g., ozone and aerosol compositions) at each integration time step. The transport terms (e.g., :term:`advection` and diffusion) of all chemical species are handled by the UFS WM as tracers.
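
For reference, the coupled SRW-AQM configuration is selected when building the App. A minimal sketch (assuming the ``devbuild.sh`` application flag described in this chapter's build instructions, with the platform name left as a placeholder):

.. code-block:: console

   ./devbuild.sh -p=<machine> -a=ATMAQ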
55 changes: 43 additions & 12 deletions docs/UsersGuide/source/BuildingRunningTesting/Quickstart.rst
@@ -7,14 +7,16 @@ Quick Start Guide
This chapter provides a brief summary of how to build and run the SRW Application. The steps will run most smoothly on `Level 1 <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__ systems. Users should expect to reference other chapters of this User's Guide, particularly :numref:`Section %s: Building the SRW App <BuildSRW>` and :numref:`Section %s: Running the SRW App <RunSRW>`, for additional explanations regarding each step.


- Install the HPC-Stack
- ===========================
- SRW App users who are not working on a `Level 1 <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__ platform will need to install the prerequisite software stack via :term:`HPC-Stack` prior to building the SRW App on a new machine. Users can find installation instructions in the :doc:`HPC-Stack documentation <hpc-stack:index>`. The steps will vary slightly depending on the user's platform. However, in all cases, the process involves (1) cloning the `HPC-Stack repository <https://github.com/NOAA-EMC/hpc-stack>`__, (2) reviewing/modifying the ``config/config_<system>.sh`` and ``stack/stack_<system>.yaml`` files, and (3) running the commands to build the stack. This process will create a number of modulefiles required for building the SRW App.
+ Install the Prerequisite Software Stack
+ =========================================
+ SRW App users who are **not** working on a `Level 1 <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__ platform will need to install the prerequisite software stack via :term:`spack-stack` or :term:`HPC-Stack` prior to building the SRW App on a new machine. Users can find installation instructions in the :doc:`spack-stack documentation <spack-stack:index>` or the :doc:`HPC-Stack documentation <hpc-stack:index>`. The steps will vary slightly depending on the user's platform, but detailed instructions for a variety of platforms are available in the documentation. Users may also post questions in the `spack-stack Discussions tab <https://github.com/JCSDA/spack-stack/discussions/categories/q-a>`__ (for spack-stack) or in the `ufs-community Discussions tab <https://github.com/orgs/ufs-community/discussions/categories/q-a>`__ (for HPC-Stack).

- Once the HPC-Stack has been successfully installed, users can move on to building the SRW Application.
+ .. COMMENT: Check whether spack-stack Discussions tab of ufs-community is better.

+ Once spack-stack or HPC-Stack has been successfully installed, users can move on to building the SRW Application.
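
For orientation, a spack-stack installation on a new machine generally follows the pattern below. This is a rough sketch based on the spack-stack quick-start workflow, not a substitute for the :doc:`spack-stack documentation <spack-stack:index>`; the site, template, and environment names are placeholders that vary by platform:

.. code-block:: console

   git clone --recursive https://github.com/JCSDA/spack-stack.git
   cd spack-stack
   source setup.sh
   # Create and activate a named environment; choose a site/template for your system
   spack stack create env --site=<site> --template=unified-dev --name=srw-stack
   cd envs/srw-stack
   spack env activate .
   spack concretize 2>&1 | tee log.concretize
   spack install 2>&1 | tee log.install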

.. attention::
-    Although HPC-Stack is currently the fully-supported software stack option, UFS applications are gradually shifting to :term:`spack-stack`, which is a :term:`Spack`-based method for installing UFS prerequisite software libraries. Users are encouraged to check out `spack-stack <https://github.com/NOAA-EMC/spack-stack>`__ to prepare for the upcoming shift in support from HPC-Stack to spack-stack.
+    Most SRW App `Level 1 <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__ systems have shifted to spack-stack from HPC-Stack (with the exception of Derecho). Spack-stack is a Spack-based method for installing UFS prerequisite software libraries. Currently, spack-stack is the software stack validated by the UFS Weather Model (:term:`WM <Weather Model>`) for running regression tests. UFS applications and components are also shifting to spack-stack from HPC-Stack but are at various stages of this transition. Although users can still build and use HPC-Stack, the UFS WM no longer uses HPC-Stack for validation, and support for this option is being deprecated.

.. _QuickBuildRun:

@@ -42,7 +44,7 @@ For a detailed explanation of how to build and run the SRW App on any supported
./devbuild.sh --platform=<machine_name>
- where ``<machine_name>`` is replaced with the name of the user's platform/system. Valid values include: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``orion`` | ``wcoss2``
+ where ``<machine_name>`` is replaced with the name of the user's platform/system. Valid values include: ``derecho`` | ``gaea`` | ``hera`` | ``hercules`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``orion`` | ``wcoss2``
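
For example, a user on Hera building with the Intel compilers might run the following (the optional ``--compiler`` flag is one of ``devbuild.sh``'s documented options and defaults to ``intel``):

.. code-block:: console

   ./devbuild.sh --platform=hera --compiler=intel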

For additional details, see :numref:`Section %s <DevBuild>`, or view :numref:`Section %s <CMakeApproach>` to try the CMake build approach instead.

@@ -52,7 +54,7 @@

.. code-block:: console
- source /path/to/etc/lmod-setup.sh <platform>
+ source /path/to/ufs-srweather-app/etc/lmod-setup.sh <platform>
module use /path/to/ufs-srweather-app/modulefiles
module load wflow_<platform>
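
For example, on Hera (with the SRW App cloned to ``/path/to/ufs-srweather-app``), these commands become:

.. code-block:: console

   source /path/to/ufs-srweather-app/etc/lmod-setup.sh hera
   module use /path/to/ufs-srweather-app/modulefiles
   module load wflow_hera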
@@ -74,27 +76,56 @@
cd ush
cp config.community.yaml config.yaml
- Users will need to open the ``config.yaml`` file and adjust the experiment parameters in it to suit the needs of their experiment (e.g., date, grid, physics suite). At a minimum, users need to modify the ``MACHINE`` parameter. In most cases, users will need to specify the ``ACCOUNT`` parameter and the location of the experiment data (see :numref:`Section %s <Data>` for Level 1 system default locations). Additional changes may be required based on the system and experiment. More detailed guidance is available in :numref:`Section %s <UserSpecificConfig>`. Parameters and valid values are listed in :numref:`Chapter %s <ConfigWorkflow>`.
+ Users will need to open the ``config.yaml`` file and adjust the experiment parameters in it to suit the needs of their experiment (e.g., date, grid, physics suite). At a minimum, users need to modify the ``MACHINE`` parameter. In most cases, users will need to specify the ``ACCOUNT`` parameter and the location of the experiment data (see :numref:`Section %s <Data>` for Level 1 system default locations).

+ For example, a user on Gaea might adjust or add the following fields to run the 12-hr "out-of-the-box" case on Gaea using prestaged system data and :term:`cron` to automate the workflow:

+ .. code-block:: console

+    user:
+      MACHINE: gaea
+      ACCOUNT: hfv3gfs
+    workflow:
+      EXPT_SUBDIR: run_basic_srw
+      USE_CRON_TO_RELAUNCH: true
+      CRON_RELAUNCH_INTVL_MNTS: 3
+    task_get_extrn_ics:
+      USE_USER_STAGED_EXTRN_FILES: true
+      EXTRN_MDL_SOURCE_BASEDIR_ICS: /lustre/f2/dev/role.epic/contrib/UFS_SRW_data/v2p2/input_model_data/FV3GFS/grib2/${yyyymmddhh}
+    task_get_extrn_lbcs:
+      USE_USER_STAGED_EXTRN_FILES: true
+      EXTRN_MDL_SOURCE_BASEDIR_LBCS: /lustre/f2/dev/role.epic/contrib/UFS_SRW_data/v2p2/input_model_data/FV3GFS/grib2/${yyyymmddhh}

+ Users on a different system would update the machine, account, and data paths accordingly. Additional changes may be required based on the system and experiment. More detailed guidance is available in :numref:`Section %s <UserSpecificConfig>`. Parameters and valid values are listed in :numref:`Chapter %s <ConfigWorkflow>`.
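
For instance, a user on Hera might instead set the fields below. These values are illustrative only: ``an_account`` is a placeholder, and the data path follows the staged-data locations for Level 1 systems listed in :numref:`Section %s <Data>`, which should be verified before use:

.. code-block:: console

   user:
     MACHINE: hera
     ACCOUNT: an_account
   task_get_extrn_ics:
     USE_USER_STAGED_EXTRN_FILES: true
     EXTRN_MDL_SOURCE_BASEDIR_ICS: /scratch1/NCEPDEV/nems/role.epic/UFS_SRW_data/v2p2/input_model_data/FV3GFS/grib2/${yyyymmddhh}
   task_get_extrn_lbcs:
     USE_USER_STAGED_EXTRN_FILES: true
     EXTRN_MDL_SOURCE_BASEDIR_LBCS: /scratch1/NCEPDEV/nems/role.epic/UFS_SRW_data/v2p2/input_model_data/FV3GFS/grib2/${yyyymmddhh}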

#. Generate the experiment workflow.

.. code-block:: console
./generate_FV3LAM_wflow.py
- #. Run the workflow from the experiment directory (``$EXPTDIR``). By default, the path to this directory is ``${EXPT_BASEDIR}/${EXPT_SUBDIR}`` (see :numref:`Section %s <DirParams>` for more detail). There are several methods for running the workflow, which are discussed in :numref:`Section %s <Run>`. One possible method is summarized below. It requires the :ref:`Rocoto Workflow Manager <RocotoInfo>`.
+ #. Run the workflow from the experiment directory (``$EXPTDIR``). By default, the path to this directory is ``${EXPT_BASEDIR}/${EXPT_SUBDIR}`` (see :numref:`Section %s <DirParams>` for more detail). There are several methods for running the workflow, which are discussed in :numref:`Section %s <Run>`. Most require the :ref:`Rocoto Workflow Manager <RocotoInfo>`. For example, if the user automated the workflow using cron, run:

+ .. code-block:: console

+    cd $EXPTDIR
+    rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10

+ The user can resubmit the ``rocotostat`` command as needed to check the workflow progress.

+ If the user has Rocoto but did *not* automate the workflow using cron, run:

.. code-block:: console
cd $EXPTDIR
./launch_FV3LAM_wflow.sh
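
Note that the App does not export ``$EXPTDIR`` automatically; users typically define it to match their experiment settings. With the example configuration above (and assuming the default ``expt_dirs`` location), this might look like:

.. code-block:: console

   export EXPTDIR=/path/to/expt_dirs/run_basic_srw
   cd $EXPTDIR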
- To (re)launch the workflow and check the experiment's progress:
+ To (re)launch the workflow and check the experiment's progress, run:

.. code-block:: console
./launch_FV3LAM_wflow.sh; tail -n 40 log.launch_FV3LAM_wflow
- The workflow must be relaunched regularly and repeatedly until the log output includes a ``Workflow status: SUCCESS`` message indicating that the experiment has finished. The :term:`cron` utility may be used to automate repeated runs. The last section of the log messages from running ``./generate_FV3LAM_wflow.py`` instruct users how to use that functionality. Users may also refer to :numref:`Section %s <Automate>` for instructions.
+ The workflow must be relaunched regularly and repeatedly until the log output includes a ``Workflow status: SUCCESS`` message indicating that the experiment has finished.
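
For users who prefer to manage :term:`cron` by hand rather than via ``USE_CRON_TO_RELAUNCH``, a crontab entry along the following lines (the interval and experiment path are illustrative) accomplishes the same relaunch cycle:

.. code-block:: console

   */3 * * * * cd /path/to/expt_dirs/run_basic_srw && ./launch_FV3LAM_wflow.sh called_from_cron="TRUE"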

- Optionally, users may :ref:`configure their own grid <UserDefinedGrid>`, instead of using a predefined grid, and/or :ref:`plot the output <PlotOutput>` of their experiment(s).
+ Optionally, users may :ref:`configure their own grid <UserDefinedGrid>` or :ref:`vertical levels <VerticalLevels>` instead of using a predefined grid. Users can also :ref:`plot the output <PlotOutput>` of their experiment(s) or run :ref:`verification tasks using METplus <vxconfig>`.
2 changes: 1 addition & 1 deletion docs/UsersGuide/source/Reference/Glossary.rst
@@ -236,7 +236,7 @@ Glossary
`Spack <https://spack.readthedocs.io/en/latest/>`__ is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers, where many users and application teams share common installations of software on clusters with exotic architectures.

spack-stack
-    The `spack-stack <https://github.com/NOAA-EMC/spack-stack>`__ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) <https://ufscommunity.org/>`__ and the `Joint Effort for Data assimilation Integration (JEDI) <https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/latest/>`__ framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack.
+    The `spack-stack <https://github.com/JCSDA/spack-stack>`__ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) <https://ufscommunity.org/>`__ and the `Joint Effort for Data assimilation Integration (JEDI) <https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/latest/>`__ framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack.

tracer
According to the American Meteorological Society (AMS) `definition <https://glossary.ametsoc.org/wiki/Tracer>`__, a tracer is "Any substance in the atmosphere that can be used to track the history [i.e., movement] of an air mass." Tracers are carried around by the motion of the atmosphere (i.e., by :term:`advection`). These substances are usually gases (e.g., water vapor, CO2), but they can also be non-gaseous (e.g., rain drops in microphysics parameterizations). In weather models, temperature (or potential temperature), absolute humidity, and radioactivity are also usually treated as tracers. According to AMS, "The main requirement for a tracer is that its lifetime be substantially longer than the transport process under study."
