From 4bacf564e8c4e68375c9d74d2054ac4cccfa9f49 Mon Sep 17 00:00:00 2001 From: Tracy Date: Thu, 18 Jul 2024 20:45:02 +0000 Subject: [PATCH 01/19] Initial updates to UG --- scm/doc/TechGuide/acknow.rst | 2 +- scm/doc/TechGuide/chap_cases.rst | 103 ++++++++++++++++--------------- scm/doc/TechGuide/chap_ccpp.rst | 20 +++--- scm/doc/TechGuide/chap_intro.rst | 52 +++++++--------- scm/doc/TechGuide/chap_quick.rst | 48 +++++++------- scm/doc/TechGuide/chap_repo.rst | 41 ++++++------ scm/doc/TechGuide/index.rst | 2 +- 7 files changed, 134 insertions(+), 134 deletions(-) diff --git a/scm/doc/TechGuide/acknow.rst b/scm/doc/TechGuide/acknow.rst index c8014aba5..d950fd745 100644 --- a/scm/doc/TechGuide/acknow.rst +++ b/scm/doc/TechGuide/acknow.rst @@ -8,5 +8,5 @@ For referencing this document please use: Firl, G., D. Swales, L. Carson, L. Bernardet, D. Heinzeller, M. Harrold, T. Hertneky, and M. Kavulich, 2024. Common Community Physics Package Single Column Model v7.0.0 User and - Technical Guide. Available at https://ccpp-scm.readthedocs.io/en/latest/. + Technical Guide. Available at https://ccpp-scm.readthedocs.io/en/v7.0.0/. diff --git a/scm/doc/TechGuide/chap_cases.rst b/scm/doc/TechGuide/chap_cases.rst index b6bd183dd..92db3f54e 100644 --- a/scm/doc/TechGuide/chap_cases.rst +++ b/scm/doc/TechGuide/chap_cases.rst @@ -128,35 +128,15 @@ arguments) are: .. _`case input`: -Case input data file (CCPP-SCM format) -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -The initialization and forcing data for each case is stored in a NetCDF -(version 4) file within the ``ccpp-scm/scm/data/processed_case_input`` directory. Each file has at least two -dimensions (``time`` and ``levels``, potentially with additions for vertical snow and soil -levels) and is organized into 3 groups: scalars, initial, and forcing. -Not all fields are required for all cases. For example the fields ``sh_flux_sfc`` and ``lh_flux_sfc`` -are only needed if the variable ``sfc_flx_spec = .true.`` in the case configuration file -and state nudging variables are only required if ``thermo_forcing_type = 3`` or ``mom_forcing_type = 3`` -. Using an active LSM (Noah, NoahMP, RUC) requires many more variables -than are listed here. Example files for using with Noah and NoahMP LSMs -are included in ``ccpp-scm/scm/data/processed_case_input/fv3_model_point_noah[mp].nc``. - -.. _`case input arm`: -.. literalinclude:: arm_case_header.txt - :name: lst_case_input_netcdf_header_arm - :caption: example NetCDF file (CCPP-SCM format) header for case initialization and forcing data - Case input data file (DEPHY format) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The Development and Evaluation of Physics in atmospheric models (DEPHY) format is an internationally-adopted data format intended for use by SCM -and LESs. The initialization and forcing data for each case is stored in -a NetCDF (version 4) file, although these files are not by default -included in the CCPP SCM repository. To access these cases you need to -clone the DEPHY-SCM repository, and provide the DEPHY-SCM file location -to the SCM. For example: +and LESs. The initialization and forcing data for each case in the CCPP SCM +repository is stored in a NetCDF (version 4) file. Additional cases in DEPHY +format, not maintained by the DTC, can be cloned from the DEPHY-SCM repository, +and run by providing the DEPHY-SCM file location to the SCM. For example: .. code:: bash @@ -166,11 +146,14 @@ to the SCM. 
For example: ./run_scm.py -c MAGIC_LEG04A --case_data_dir [...]/ccpp-scm/scm/data/DEPHY-SCM/MAGIC/LEG04A -v Each DEPHY file has three dimensions (``time``, ``t0``, ``levels``) and contains the initial -conditions (``t0``, ``levels``) and forcing data (``time``, ``levels``). Just as when using the CCPP-SCM -formatted inputs, :numref:`Subsection %s `, not all fields -are required for all cases. More information on the DEPHY format -requirements can be found at -`DEPHY `__. +conditions (``t0``, ``levels``) and forcing data (``time``, ``levels``). Not all fields +are required for all cases. For example, the fields ``hfss`` and ``hfls`` are only needed if the +global attributes ``surface_forcing_temp`` or ``surface_forcing_moisture`` are set to ``surface_flux`` +and state nudging variables are only required if the ``nudging_*`` terms in the global attributes +are turned on. Using an active LSM (Noah, NoahMP, RUC) requires many more variables than are listed +here. Example files for using with Noah and NoahMP LSMs are included in +ccpp-scm/scm/data/processed_case_input/fv3_model_point_noah[mp].nc. More information on the DEPHY +format requirements can be found at `DEPHY `__. .. _`case input dephy`: .. literalinclude:: dephy_case_header.txt @@ -214,7 +197,7 @@ included for using UFS Atmosphere initial conditions: - UFS initial conditions for 38.1 N, 98.5 W (central Kansas) for 00Z on Oct. 3, 2016 with NoahMP variables on the C96 FV3 grid (``fv3_model_point_noahmp.nc``) -See :numref:`Section %s ` for information on how to generate these +See :numref:`Section %s ` for information on how to generate these files for other locations and dates, given appropriate UFS Atmosphere initial conditions and output. @@ -228,9 +211,8 @@ original format to the format that the SCM expects, listed above. An example of this type of script written in Python is included in ``ccpp-scm/scm/etc/scripts/twpice_forcing_file_generator.py``. The script reads in the data as supplied from its source, converts any necessary variables, and writes a NetCDF (version 4) file in the format -described in subsections :numref:`Subsection %s ` and -:numref:`Subsection %s `. For reference, the following -formulas are used: +described in subsection :numref:`Subsection %s `. +For reference, the following formulas are used: .. math:: \theta_{il} = \theta - \frac{\theta}{T}\left(\frac{L_v}{c_p}q_l + \frac{L_s}{c_p}q_i\right) @@ -262,7 +244,7 @@ specified for the new case, one must also include a time series of the kinematic surface sensible heat flux (K m s\ :math:`^{-1}`) and kinematic surface latent heat flux (kg kg\ :math:`^{-1}` m s\ :math:`^{-1}`). The following variables are expected as 2-dimensional -arrays (vertical levels first, time second): the geostrophic u (E-W) and +arrays (time first, vertical levels second): the geostrophic u (E-W) and v (N-S) winds (m s\ :math:`^{-1}`), and the horizontal and vertical advective tendencies of :math:`\theta_{il}` (K s\ :math:`^{-1}`) and :math:`q_t` (kg kg\ :math:`^{-1}` s\ :math:`^{-1}`), the large scale @@ -378,12 +360,12 @@ following steps: one) in ``ccpp-scm/scm/etc/case_config``. Be sure that the ``case_name`` variable points to the newly created/processed case input file from above. -.. _`UFSreplay`: +.. _`UFScasegen`: -Using UFS Output to Create SCM Cases: UFS-Replay ------------------------------------------------- +Using UFS Output to Create SCM Cases: UFS Case Generation +--------------------------------------------------------- -.. _`pydepend_replay`: +.. 
_`pydepend_casegen`: Python Dependencies ~~~~~~~~~~~~~~~~~~~ @@ -407,12 +389,12 @@ Activate environment: > conda activate env_ufsreplay -.. _`ufsicgenerator`: +.. _`ufscasegen`: -UFS_IC_generator.py -~~~~~~~~~~~~~~~~~~~ +UFS_case_gen.py +~~~~~~~~~~~~~~~ -A script exists in ``scm/etc/scripts/UFS_IC_generator.py`` to read in UFS history (output) files and their +A script exists in ``scm/etc/scripts/UFS_case_gen.py`` to read in UFS history (output) files and their initial conditions to generate a SCM case input data file, in DEPHY format. @@ -421,7 +403,7 @@ format. ./UFS_IC_generator.py [-h] (-l LOCATION LOCATION | -ij INDEX INDEX) -d DATE -i IN_DIR -g GRID_DIR -f FORCING_DIR -n CASE_NAME [-t {1,2,3,4,5,6,7}] [-a AREA] [-oc] - [-lam] [-sc] [-near] + [-lam] [-sc] [-near] [-fm] [-vm] [-wn] [-geos] Mandatory arguments: @@ -462,20 +444,31 @@ Optional arguments: #. ``--use_nearest (-near)``: flag to indicate using the nearest UFS history file gridpoint +#. ``--forcing_method (-fm)``: method used to calculate forcing (1=total tendencies from UFS dycore, + 2=advective terms calculated from UFS history files, 3=total time tendency terms calculated), default=2 + +#. ``--vertical_method (-vm)``: method used to calculate vertical advective forcing (1=vertical advective + terms calculated from UFS history files and added to total, 2=smoothed vertical velocity provided), default=2 + +#. ``--wind_nudge (-wn)``: flag to turn on wind nudging to UFS profiles + +#. ``--geostrophic (-geos)``: flag to turn on geostrophic wind forcing + .. _`ufsforcingensemblegenerator`: UFS_forcing_ensemble_generator.py ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -There is an additional script in ``scm/etc/scripts/UFS_forcing_ensemble_generator.py`` to create UFS-replay case(s) starting +There is an additional script in ``scm/etc/scripts/UFS_forcing_ensemble_generator.py`` to create UFS-caseGen case(s) starting with output from UFS Weather Model (UWM) Regression Tests (RTs). .. code:: bash UFS_forcing_ensemble_generator.py [-h] -d DIR -n CASE_NAME (-lonl LON_1 LON_2 -latl LAT_1 LAT_2 -nens NENSMEMBERS | - -lons [LON_LIST] -lats [LAT_LIST]) - [-dt TIMESTEP] [-cres C_RES] [-sdf SUITE] [-sc] [-near] + -lons [LON_LIST] -lats [LAT_LIST] | + -fxy [LON_LAT_FILE]) + [-dt TIMESTEP] [-cres C_RES] [-sdf SUITE] [-sc] [-near] [-fm] [-vm] [-wn] [-geos] Mandatory arguments: @@ -490,6 +483,8 @@ Mandatory arguments: - ``--lon_list (-lons)`` AND ``--lat_list (-lats)``: longitude and latitude of cases + - ``--lonlat_file (fxy)``: file containing longitudes and latitudes + Optional arguments: #. ``--timestep (-dt)``: SCM timestep, in seconds @@ -502,6 +497,16 @@ Optional arguments: #. ``--use_nearest (-near)``: flag to indicate using the nearest UFS history file gridpoint +#. ``--forcing_method (-fm)``: method used to calculate forcing (1=total tendencies from UFS dycore, + 2=advective terms calculated from UFS history files, 3=total time tendency terms calculated), default=2 + +#. ``--vertical_method (-vm)``: method used to calculate vertical advective forcing (1=vertical advective + terms calculated from UFS history files and added to total, 2=smoothed vertical velocity provided), default=2 + +#. ``--wind_nudge (-wn)``: flag to turn on wind nudging to UFS profiles + +#. ``--geostrophic (-geos)``: flag to turn on geostrophic wind forcing + Examples to run from within the ``scm/etc/scripts`` directory to create SCM cases starting with the output from a UFS Weather Model regression test(s): @@ -513,7 +518,7 @@ staged UWM RTs located at: .. 
_`example1`: -Example 1: UFS-replay for single point +Example 1: UFS-caseGen for single point ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ UFS regression test, ``control_c192``, for single point. @@ -533,7 +538,7 @@ The file ``scm_ufsens_control_c192.py`` is created in ``ccpp-scm/scm/bin/``, whe .. _`example2`: -Example 2: UFS-replay for list of points +Example 2: UFS-caseGen for list of points ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ UFS regression test, ``control_c384``, for multiple points. @@ -562,7 +567,7 @@ number of points provided. The contents of the file should look like: .. _`example3`: -Example 3: UFS-replay for an ensemble of points +Example 3: UFS-caseGen for an ensemble of points ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ UFS regression test, ``control_p8``, for an ensemble (10) of randomly selected points diff --git a/scm/doc/TechGuide/chap_ccpp.rst b/scm/doc/TechGuide/chap_ccpp.rst index 1f4e9f829..cc5420ee5 100644 --- a/scm/doc/TechGuide/chap_ccpp.rst +++ b/scm/doc/TechGuide/chap_ccpp.rst @@ -3,8 +3,8 @@ CCPP Interface ============== -Chapter 6 of the CCPP v6 Technical Documentation -(https://ccpp-techdoc.readthedocs.io/en/v6.0.0/) provides a wealth of +Chapter 6 of the CCPP v7 Technical Documentation +(https://ccpp-techdoc.readthedocs.io/en/v7.0.0/) provides a wealth of information on the overall process of connecting a host model to the CCPP framework for calling physics. This chapter describes the particular implementation within this SCM, including how to set up, @@ -23,7 +23,7 @@ Preparing data from the SCM ~~~~~~~~~~~~~~~~~~~~~~~~~~~ As described in sections 6.1 and 6.2 of the `CCPP Technical -Documentation `__ a host +Documentation `__ a host model must allocate memory and provide metadata for variables that are passed into and out of the schemes within the physics suite. As of this release, in practice this means that a host model must do this for all @@ -33,7 +33,7 @@ schemes are allocated and documented in the file ``ccpp-scm/scm/src/scm_type_def within the ``physics`` derived data type. This derived data type initializes its component variables in a ``create`` type-bound procedure. As mentioned in section 6.2 of the `CCPP Technical -Documentation `__, files +Documentation `__, files containing all required metadata was constructed for describing all variables in the derived data type. These files are ``scm/src/GFS_typedefs.meta,``, ``scm/src/CCPP_typedefs.meta``, and ``scm_physical_constants.meta``. Further, ``scm_type_defs.meta`` exists to provide metadata for derived data type definitions and their @@ -47,7 +47,7 @@ Editing and running ``ccpp_prebuild.py`` General instructions for configuring and running the ``ccpp_prebuild.py`` script can be found in chapter 8 of the `CCPP Technical -Documentation `__. The +Documentation `__. The script expects to be run with a host-model-dependent configuration file, passed as argument ``–config=path_to_config_file``. Within this configuration file are variables that hold paths to the variable definition files (where metadata tables can @@ -98,7 +98,7 @@ described in sections respectively. A more general description of the process for performing suite initialization and running can also be found in sections 6.4 and 6.5 of the `CCPP Technical -Documentation `__. +Documentation `__. Changing a suite ---------------- @@ -110,7 +110,7 @@ Prior to being able to swap a scheme within a suite, one must first add a CCPP-compliant scheme to the pool of available schemes in the CCPP physics repository. 
This process is described in chapter 2 of the `CCPP Technical -Documentation `__. +Documentation `__. Once a CCPP-compliant scheme has been added to the CCPP physics repository, the process for modifying an existing suite should take the @@ -129,7 +129,7 @@ following steps into account: - Do any of the new variables need to be calculated in an interstitial scheme? If so, one must be written and made CCPP-compliant itself. The `CCPP Technical - Documentation `__ + Documentation `__ will help in this endeavor, and the process outlined in its chapter 2 should be followed. @@ -157,7 +157,7 @@ following steps into account: associated interstitial ```` elements and simply replacing the scheme names to reflect their replacements. See chapter 4 of the `CCPP Technical - Documentation `__ for + Documentation `__ for further details. Modifying “groups” of parameterizations @@ -236,7 +236,7 @@ would do so: cannot be used in a physics scheme yet. For that, you’ll need to add an entry in the corresponding metadata file. See section 2.2 of the `CCPP Technical - Documentation `__ + Documentation `__ for more information regarding the format. #. On the physics scheme side, there will also be a metadata file entry diff --git a/scm/doc/TechGuide/chap_intro.rst b/scm/doc/TechGuide/chap_intro.rst index 94d134d85..1bcd768ef 100644 --- a/scm/doc/TechGuide/chap_intro.rst +++ b/scm/doc/TechGuide/chap_intro.rst @@ -15,10 +15,10 @@ parameterizations (CCPP framework). In fact, this SCM serves as perhaps the simp example for using the CCPP and its framework in an atmospheric model. This version contains all parameterizations of NOAA’s evolved operational GFS v16 suite (implemented in 2021), plus additional -developmental schemes. The schemes are grouped in six supported suites +developmental schemes. The schemes are grouped in five supported suites described in detail in the `CCPP Scientific -Documentation `__ -(GFS_v16, GFS_v17p8, RAP, HRRR, and RRFS_v1beta, and WoFS_v0). +Documentation `__ +(GFS_v16, GFS_v16_RRTMGP, GFS_v17_p8_ugwpv1, HRRR_gf, and WoFS_v0). This document serves as both the User and Technical Guides for this model. It contains a Quick Start Guide with instructions for obtaining @@ -35,39 +35,39 @@ through the CCPP infrastructure. Version Notes ------------- -The CCPP SCM v6.0.0 contains the following major and minor changes since -v5.0. +The CCPP SCM v7.0.0 contains the following major and minor changes since v6.0. Major -- Inclusion of regression testing functionality +- Ability to generate SCM cases from UFS simulations using either derived forcings + or native forcings from the dynamical core. -- Combine single- and multi-run capabilities into one script +- Support for single precision physics within the SCM. Minor -- Add RUC LSM support +- Addition of new physics schemes; RRTMGP radiation and CLM Lake Model, along with + updates to existing schemes. -- Add the GFS_v17p8, HRRR, RRFS_v1beta, and WoFS_v0 suites +- CCPP SCM support for the latest operational/research physics configurations used + across UFS applications, including the GFS_v17_p8_ugwpv1, GFS_v16_RRTMGP, and + HRRR_gf suites. -- Update the vertical coordinate code to better match latest FV3 - vertical coordinate code +- New SCM cases; MOSAiC-AMPS, MOSAiC-SS, COMBLE, and a catolog of cases in the + `GdR-DEPHY `__ repository that can be run + with CCPP SCM. -- Simplify the case configuration namelists +- Updated `Scientific Documentation `__, User's Guide, Technical Documentation, and + online tutorials. 
-- Add greater flexibility for output location (outside of bin - directory) +- Generalized plotting and visualization tools. Limitations ~~~~~~~~~~~ This release bundle has some known limitations: -- In the output file, temperature tendency variables all mistakenly - have the same description, although their variable names are correct. - This has been fixed in the development code. - -- Using the RRFS_v1beta, HRRR, and WoFS_v0 suites for cases where deep +- Using the HRRR_gf and WoFS_v0 suites for cases where deep convection is expected to be active will likely produce strange/unreliable results, unless the forcing has been modified to account for the deep convection. This is because forcing for existing @@ -101,16 +101,12 @@ This release bundle has some known limitations: LSMs for the supplied cases over land points, there should be no technical reason why they cannot be used with LSMs, however. -- As of this release, using the SCM over a land point with an LSM is +- Using the SCM over a land point with an LSM is possible through the use of UFS initial conditions (see - :numref:`Section %s `). However, advective forcing terms - are unavailable as of this release, so only short integrations using - this configuration should be employed. Using dynamical tendencies - (advective forcing terms) from the UFS will be part of a future - release. + :numref:`Section %s `). - There are several capabilities of the developmental code that have not been tested sufficiently to be considered part of the supported - release. Those include additional parameterizations. Users that want - to use experimental capabilities should refer to - :numref:`Subsection %s `. + release. Those include additional parameterizations and the CCPP + Suite Simulator. Users that want to use experimental capabilities + should refer to :numref:`Subsection %s `. diff --git a/scm/doc/TechGuide/chap_quick.rst b/scm/doc/TechGuide/chap_quick.rst index f1524079e..205ee2947 100644 --- a/scm/doc/TechGuide/chap_quick.rst +++ b/scm/doc/TechGuide/chap_quick.rst @@ -25,7 +25,7 @@ Clone the source using .. code:: bash - git clone --recursive -b v6.0.0 https://github.com/NCAR/ccpp-scm + git clone --recursive -b v7.0.0 https://github.com/NCAR/ccpp-scm The ``--recursive`` option is required to retrieve the ccpp-physics and ccpp-framework code, which are stored in separate repositories and linked to the SCM repository as submodules. @@ -129,7 +129,7 @@ Beyond the standard shell scripts, the build system relies on use of the Python scripting language, along with cmake, GNU make and date. -For the latest release, the minimum required Python version is 3.8, and CMake requires a minimum version of 3.14. +For the latest release, the minimum required Python version is 3.10, and CMake requires a minimum version of 3.23. While exact minimum required versions of other prerequisites have not been established, users can reference the list of Continuous Integration tests run on the CCPP SCM repository (see :numref:`Section %s `) for examples of known working configurations. @@ -137,7 +137,7 @@ for examples of known working configurations. Spack-stack ^^^^^^^^^^^^ -A joint effort between NOAA's Unified Forecast System (UFS) and Joint Effort for Data assimilation Integration (JEDI). +This is a joint effort between NOAA's Unified Forecast System (UFS) and Joint Effort for Data assimilation Integration (JEDI). 
It is designed to be a comprehensive, all-in-one package containing prerequisite libraries and tools needed for all software in the UFS ecosystem, including the CCPP SCM. As of the version 7, installing spack-stack is the main supported method of installing the prerequisites needed for building the SCM. The latest version of the SCM is meant @@ -146,11 +146,11 @@ contains the following set of libraries needed for building the SCM: - Netcdf-c (v4.9.2) - - Netcdf-FORTRAN (v4.6.0) + - Netcdf-FORTRAN (v4.6.1) - BACIO (v2.4.1) - Binary I/O Library - - SP (v2.3.3) - Spectral Transformation Library + - SP (v2.5.0) - Spectral Transformation Library - W3EMC (2.10.0) - GRIB decoder and encoder library @@ -203,7 +203,7 @@ Installing Libraries on Non-preconfigured Platforms ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ For users on supported platforms such as generic Linux or macOS systems -that have not been preconfigured, installing ``spack-stack`` (see :ref:`Section %s `) +that have not been preconfigured, installing ``spack-stack`` (see :numref:`Section %s `) is highly recommended, as it provides all the necessary prerequisite libraries needed for installing the SCM. The CCPP/SCM team does not support spack-stack, so users with questions or requiring help with spack-stack installation @@ -249,7 +249,7 @@ Python requirements """"""""""""""""""""" The SCM build system invokes the ``ccpp_prebuild.py`` script, and so the Python environment must be set up prior to building. -As mentioned earlier, a minimum Python version of 3.8 is required. Additionally, there are a few non-default modules required for the SCM to +As mentioned earlier, a minimum Python version of 3.10 is required. Additionally, there are a few non-default modules required for the SCM to function: ``f90nml`` (`documentation `__) and ``netcdf4`` (`documentation `__). Users can test if these are installed using this command in the shell: @@ -281,8 +281,8 @@ Compiling SCM with CCPP ----------------------- The first step in compiling the CCPP and SCM is to properly setup your -user environment as described in -sections :numref:`%s ` and :numref:`Section %s `. +user environment as described in :numref:`Section %s ` +and :numref:`Section %s `. Following this step, the top level build system will use ``cmake`` to query system parameters, execute the CCPP prebuild script to match the physics @@ -443,12 +443,12 @@ execute the following scripts: If the download step fails, make sure that your system’s firewall does not block access to GitHub. If it does, download the files ``comparison_data.tar.gz``, ``physics_input_data.tar.gz``, ``processed_case_input.tar.gz``, and ``raw_case_input.tar.gz`` -from the `SCM release page `__ using your browser and manually extract its +from the `SCM release page `__ using your browser and manually extract its contents in the directory ``scm/data``. Similarly, do the same for ``thompson_tables.tar.gz`` and ``MG_INCCN_data.tar.gz`` and extract to ``scm/data/physics_input_data/``. -New with the SCM v7 release, static data is available for running cases with GOCART climatological aerosols (where the value of ``iaer`` in the ``&gfs_physics_nml`` namelist starts with 1; see the `CCPP Scientific Documentation `__ for more information); one example of this is with the default namelist settings for the GFS_v17_HR3 scheme. This dataset is very large (~12 GB), so it is recommended only to download it if you will be using it. 
+New with the SCM v7 release, static data is available for running cases with GOCART climatological aerosols (where the value of ``iaer`` in the ``&gfs_physics_nml`` namelist starts with 1; see the `CCPP Scientific Documentation `__ for more information); one example of this is with the default namelist settings for the GFS_v17_p8_ugwpv1 scheme. This dataset is very large (~12 GB), so it is recommended only to download it if you will be using it. .. code:: bash @@ -596,17 +596,15 @@ configuration files located in ``../etc/case_config`` (*without the .nml extensi specifying a suite other than the default, the suite name used must match the value of the suite name in one of the suite definition files located in ``../../ccpp/suites`` (Note: not the filename of the suite definition file). As -part of the sixth CCPP release, the following suite names are supported: +part of the CCPP SCM v7.0.0 release, the following suite names are supported: #. SCM_GFS_v16 -#. SCM_GFS_v17p8 +#. SCM_GFS_v16_RRTMGP -#. SCM_RAP +#. SCM_GFS_v17_p8_ugwpv1 -#. SCM_HRRR - -#. SCM_RRFS_v1beta +#. SCM_HRRR_gf #. SCM_WoFS_v0 @@ -620,7 +618,7 @@ the SCM, especially when invoking ``cmake`` with the ``-DOPENMP=ON`` option. Also note that some cases require specified surface fluxes. Special suite definition files that correspond to the suites listed above have -been created and use the ``*_prescribed_surface`` decoration. It is not necessary to specify this +been created and use the ``*_ps`` decoration. It is not necessary to specify this filename decoration when specifying the suite name. If the ``spec_sfc_flux`` variable in the configuration file of the case being run is set to ``.true.``, the run script will automatically use the special suite definition file that @@ -684,7 +682,7 @@ result in a runtime error in all supported suites. #. q_rimef -A NetCDF output file is generated in an output directory located named +A NetCDF output file is generated in an output directory named with the case and suite within the run directory. If using a Docker container, all output is copied to the directory in container space for volume-mounting purposes. Any standard NetCDF file viewing or analysis @@ -710,7 +708,7 @@ from the ``bin`` directory. Additional details regarding the SCM may be found in the remainder of this guide. More information on the CCPP can be found in the CCPP Technical Documentation available at -https://ccpp-techdoc.readthedocs.io/en/v6.0.0/. +https://ccpp-techdoc.readthedocs.io/en/v7.0.0/. .. _docker: @@ -756,7 +754,7 @@ internet search. Building the Docker image ^^^^^^^^^^^^^^^^^^^^^^^^^ -The Dockerfile builds CCPP SCM v6.0.0 from source using the GNU +The Dockerfile builds CCPP SCM v7.0.0 from source using the GNU compiler. The CCPP SCM has a number of system requirements and necessary libraries @@ -801,7 +799,7 @@ and then executing the following steps: Inspect the Dockerfile if you would like to see details for how the image is built. The image will contain SCM prerequisite software from DTC, the SCM and CCPP code, and a pre-compiled executable for the SCM - with the 6 supported suites for the SCM. To view + with the 5 supported suites for the SCM. To view .. code:: bash @@ -820,7 +818,7 @@ following from the terminal where Docker is run: .. 
code:: bash - docker pull dtcenter/ccpp-scm:v6.0.0 + docker pull dtcenter/ccpp-scm:v7.0.0 To verify that it exists afterward, run @@ -898,7 +896,7 @@ Running the Docker image - ``-v`` specifies the volume mount from host directory (outside container) to inside the container. Using volumes allows you to share data between the host machine and container. For running the SCM, the - output is being mounted from inside the container to the on the + output is being mounted from inside the container to the host machine. Upon exiting the container, data mounted to the host machine will still be accessible. @@ -908,7 +906,7 @@ Running the Docker image .. note:: If you are using a prebuilt image from Dockerhub, substitute the name of the image that was pulled from Dockerhub in the commands - above; i.e. instead of ``ccpp-scm`` above, one would have ``dtcenter/ccpp-scm:v6.0.0``. + above; i.e. instead of ``ccpp-scm`` above, one would have ``dtcenter/ccpp-scm:v7.0.0``. #. To use the SCM interactively, run non-default configurations, create plots, or even develop code, issue the following command: diff --git a/scm/doc/TechGuide/chap_repo.rst b/scm/doc/TechGuide/chap_repo.rst index 36f0a10ce..fcc2a46e2 100644 --- a/scm/doc/TechGuide/chap_repo.rst +++ b/scm/doc/TechGuide/chap_repo.rst @@ -23,41 +23,42 @@ Cubed-Sphere (FV3) dynamical core. | ``ccpp-scm/`` | ``├── CMakeModules`` -| ``├── CODEOWNERS`` - list of code maintainers/developers who are automatically assigned to review Pull Requests on GitHub +| ``├── CODEOWNERS`` - List of code maintainers/developers who are automatically assigned to review Pull Requests on GitHub | ``├── LICENSE`` | ``├── README.md`` -| ``├── ccpp`` - contains the CCPP prebuild configuration file -| ``│   ├── config`` +| ``├── ccpp`` +| ``│   ├── config`` - Contains the CCPP prebuild configuration file | ``│   ├── framework`` - Contains CCPP framework submodule. See https://github.com/NCAR/ccpp-framework for contents | ``│   ├── physics`` - Contains CCPP physics submodule. 
See https://github.com/NCAR/ccpp-physics for contents -| ``│   ├── physics_namelists`` - contains physics namelist files associated with suites -| ``│   └── suites`` - contains suite definition files +| ``│   ├── physics_namelists`` - Contains physics namelist files associated with suites +| ``│   └── suites`` - Contains suite definition files | ``├── contrib`` -| ``│   ├── get_all_static_data.sh`` - script for downloading/extracting the processed SCM case data -| ``│   ├── get_mg_inccn_data.sh`` - script for downloading/extracting the Morrison-Gettelman data -| ``│   └── get_thompson_tables.sh`` - script for downloading/extracting the Thompson lookup tables +| ``│   ├── get_all_static_data.sh`` - Script for downloading/extracting the processed SCM case data +| ``│   ├── get_mg_inccn_data.sh`` - Script for downloading/extracting the Morrison-Gettelman data +| ``│   └── get_thompson_tables.sh`` - Script for downloading/extracting the Thompson lookup tables +| ``│   └── get_aerosol_climo.sh`` - Script for downloading/extracting the GOCART climatological aerosol data | ``├── docker`` -| ``│   └── Dockerfile`` - contains Docker instructions for building the CCPP SCM image +| ``│   └── Dockerfile`` - Contains Docker instructions for building the CCPP SCM image | ``├── environment-suite-sim.yml`` - Python environment dependency file for the CCPP Suite Simulator -| ``├── environment-ufsreplay.yml`` - Python environment dependency file for the UFS Replay capability +| ``├── environment-ufsreplay.yml`` - Python environment dependency file for the UFS Case Generator capability | ``├── environment.yml`` - Python environment dependency file for the SCM | ``├── scm`` | ``│   ├── LICENSE.txt`` - Contains licensing information | ``│   ├── data`` - Directory where data is staged by scripts in the ``ccpp/contrib/`` directory -| ``│   │   └── vert_coord_data`` - contains data to calculate vertical coordinates (from GSM-based GFS only) +| ``│   │   └── vert_coord_data`` - Contains data to calculate vertical coordinates (from GSM-based GFS only) | ``│   ├── doc`` | ``│   │   └── TechGuide`` - Contains source code and other files for this User’s/Technical Guide -| ``│   ├── etc`` - contains case configuration, machine setup scripts, and plotting scripts -| ``│   │   ├── CENTOS_docker_setup.sh`` - contains machine setup for Docker container -| ``│   │   ├── case_config`` - contains case configuration files -| ``│   │   ├── modules`` - contains module files for loading build environments on both pre-configured and custom platforms -| ``│   │   ├── scm_qsub_example.py`` - example ``qsub`` (LSF) run script -| ``│   │   ├── scm_slurm_example.py`` - example ``srun`` (SLURM) run script +| ``│   ├── etc`` - Contains case configuration, machine setup scripts, and plotting scripts +| ``│   │   ├── CENTOS_docker_setup.sh`` - Contains machine setup for Docker container +| ``│   │   ├── case_config`` - Contains case configuration files +| ``│   │   ├── modules`` - Contains module files for loading build environments on both pre-configured and custom platforms +| ``│   │   ├── scm_qsub_example.py`` - Example ``qsub`` (LSF) run script +| ``│   │   ├── scm_slurm_example.py`` - Example ``srun`` (SLURM) run script | ``│   │   ├── scripts`` - Python scripts for setting up cases, plotting, and the CCPP Suite Simulator | ``│   │   │   ├── ccpp_suite_sim`` - Python scripts for the CCPP Suite Simulator -| ``│   │   │   ├── plot_configs`` - plot configuration files -| ``│   │   └── tracer_config`` - tracer configuration files -| ``│  
 └── src`` - source code for SCM infrastructure, Python run script, CMakeLists.txt for the SCM, example multirun setup files, suite_info.py +| ``│   │   │   ├── plot_configs`` - Plot configuration files +| ``│   │   └── tracer_config`` - Tracer configuration files +| ``│   └── src`` - Source code for SCM infrastructure, Python run script, CMakeLists.txt for the SCM, example multirun setup files, suite_info.py | ``└── test`` - Contains scripts for regression testing, Continuous Integration tests Testing diff --git a/scm/doc/TechGuide/index.rst b/scm/doc/TechGuide/index.rst index 3d4bacb58..58e3aeb59 100644 --- a/scm/doc/TechGuide/index.rst +++ b/scm/doc/TechGuide/index.rst @@ -1,4 +1,4 @@ -CCPP Single Column Model (SCM) User and Technical Guide v6.0.0 +CCPP Single Column Model (SCM) User and Technical Guide v7.0.0 ============================ .. toctree:: From c252d69da3962646e09480238122bd65bfbd571c Mon Sep 17 00:00:00 2001 From: Tracy Date: Wed, 24 Jul 2024 21:11:17 +0000 Subject: [PATCH 02/19] update options for running cases --- ...fsreplay.yml => environment-ufscasegen.yml | 0 scm/doc/TechGuide/chap_cases.rst | 81 +++++-------------- 2 files changed, 20 insertions(+), 61 deletions(-) rename environment-ufsreplay.yml => environment-ufscasegen.yml (100%) diff --git a/environment-ufsreplay.yml b/environment-ufscasegen.yml similarity index 100% rename from environment-ufsreplay.yml rename to environment-ufscasegen.yml diff --git a/scm/doc/TechGuide/chap_cases.rst b/scm/doc/TechGuide/chap_cases.rst index 92db3f54e..b619cabb0 100644 --- a/scm/doc/TechGuide/chap_cases.rst +++ b/scm/doc/TechGuide/chap_cases.rst @@ -23,51 +23,16 @@ The ``case_config`` namelist expects the following parameters: This string must correspond to a dataset included in the directory ``ccpp-scm/scm/data/processed_case_input/`` (without the file extension). -- ``runtime`` - - - Specify the model runtime in seconds (integer). This should - correspond with the forcing dataset used. If a runtime is - specified that is longer than the supplied forcing, the forcing is - held constant at the last specified values. - -- ``thermo_forcing_type`` - - - An integer representing how forcing for temperature and moisture - state variables is applied (1 :math:`=` total advective - tendencies, 2 :math:`=` horizontal advective tendencies with - prescribed vertical motion, 3 :math:`=` relaxation to observed - profiles with vertical motion prescribed) - -- ``mom_forcing_type`` - - - An integer representing how forcing for horizontal momentum state - variables is applied (1 :math:`=` total advective tendencies; not - implemented yet, 2 :math:`=` horizontal advective tendencies with - prescribed vertical motion, 3 :math:`=` relaxation to observed - profiles with vertical motion prescribed) - - ``relax_time`` - A floating point number representing the timescale in seconds for the relaxation forcing (only used if ``thermo_forcing_type = 3`` or ``mom_forcing_type = 3``) -- ``sfc_flux_spec`` - - - A boolean set to ``.true.`` if surface flux are specified from the forcing - data (there is no need to have surface schemes in a suite - definition file if so) - - ``sfc_roughness_length_cm`` - Surface roughness length in cm for calculating surface-related fields from specified surface fluxes (only used if ``sfc_flux_spec`` is True). 
-- ``sfc_type`` - - - An integer representing the character of the surface (0 :math:`=` - sea surface, 1 :math:`=` land surface, 2 :math:`=` sea-ice - surface) - - ``reference_profile_choice`` - An integer representing the choice of reference profile to use @@ -75,22 +40,6 @@ The ``case_config`` namelist expects the following parameters: “McClatchey” profile, 2 :math:`=` mid-latitude summer standard atmosphere) -- ``year`` - - - An integer representing the year of the initialization time - -- ``month`` - - - An integer representing the month of the initialization time - -- ``day`` - - - An integer representing the day of the initialization time - -- ``hour`` - - - An integer representing the hour of the initialization time - - ``column_area`` - A list of floating point values representing the characteristic @@ -113,18 +62,28 @@ The ``case_config`` namelist expects the following parameters: - ``input_type`` - - 0 => original DTC format, 1 => DEPHY-SCM format. + - 1 => DEPHY-SCM format. Optional variables (that may be overridden via run script command line arguments) are: +- ``npz_type`` + + - Changes the type of FV3 vertical grid to produce (see src/scm_vgrid.F90 for + valid values), default=''. + - ``vert_coord_file`` - - File containing FV3 vertical grid coefficients. + - File containing FV3 vertical grid coefficients, default=''. - ``n_levels`` - - Specify the integer number of vertical levels. + - Specify the integer number of vertical levels, default=127. + +- ``dt`` + + - Specify the timestep to use (if different than the default specified in + ../../src/suite_info.py), default=600. .. _`case input`: @@ -372,22 +331,22 @@ Python Dependencies The scripts here require a few python packages that may not be found by default in all python installations. There is a YAML file with the -python environment needed to run the script in ``ccpp-scm/environment-ufsreplay.yml``. To create and activate +python environment needed to run the script in ``ccpp-scm/environment-ufscasegen.yml``. To create and activate this environment using conda: Create environment (only once): .. code:: bash - > conda env create -f environment-ufsreplay.yml + > conda env create -f environment-ufscasegen.yml -This will create the conda environment ``env_ufsreplay`` +This will create the conda environment ``env_ufscasegen`` Activate environment: .. code:: bash - > conda activate env_ufsreplay + > conda activate env_ufscasegen .. _`ufscasegen`: @@ -510,10 +469,10 @@ Optional arguments: Examples to run from within the ``scm/etc/scripts`` directory to create SCM cases starting with the output from a UFS Weather Model regression test(s): -On the supported platforms Cheyenne (NCAR) and Hera (NOAA), there are +On the supported platforms Derecho (NCAR) and Hera (NOAA), there are staged UWM RTs located at: -- Cheyenne ``/glade/scratch/epicufsrt/GMTB/CCPP-SCM/UFS_RTs`` +- Derecho ``/glade/scratch/epicufsrt/GMTB/CCPP-SCM/UFS_RTs`` - Hera ``/scratch1/BMC/gmtb/CCPP-SCM/UFS_RTs`` .. _`example1`: @@ -583,7 +542,7 @@ for more details. 
For the purposes of this example the ``control_p8`` test has already been rerun, but if starting from your own UWM RTs, you can rerun the UWM regression test, -on Cheyenne for example, by running the following command in the RT +on Derecho for example, by running the following command in the RT directory: ``qsub job_card`` Now the cases can be generated with the following command: From c900e143c0cf63ab692f9e131574808804563d7a Mon Sep 17 00:00:00 2001 From: Tracy Date: Mon, 29 Jul 2024 18:27:27 +0000 Subject: [PATCH 03/19] doc updates --- environment-ufscasegen.yml | 2 +- scm/doc/TechGuide/chap_cases.rst | 122 +++++++++---------------------- 2 files changed, 37 insertions(+), 87 deletions(-) diff --git a/environment-ufscasegen.yml b/environment-ufscasegen.yml index fc76fe24b..a89585b80 100644 --- a/environment-ufscasegen.yml +++ b/environment-ufscasegen.yml @@ -1,4 +1,4 @@ -name: env_ufsreplay +name: env_ufscasegen dependencies: - conda-forge::python=3.8.5 diff --git a/scm/doc/TechGuide/chap_cases.rst b/scm/doc/TechGuide/chap_cases.rst index b619cabb0..c1bf822e8 100644 --- a/scm/doc/TechGuide/chap_cases.rst +++ b/scm/doc/TechGuide/chap_cases.rst @@ -23,15 +23,11 @@ The ``case_config`` namelist expects the following parameters: This string must correspond to a dataset included in the directory ``ccpp-scm/scm/data/processed_case_input/`` (without the file extension). -- ``relax_time`` - - - A floating point number representing the timescale in seconds for - the relaxation forcing (only used if ``thermo_forcing_type = 3`` or ``mom_forcing_type = 3``) - - ``sfc_roughness_length_cm`` - Surface roughness length in cm for calculating surface-related - fields from specified surface fluxes (only used if ``sfc_flux_spec`` is True). + fields from specified surface fluxes (only used if surface fluxes + are specified). - ``reference_profile_choice`` @@ -85,6 +81,20 @@ arguments) are: - Specify the timestep to use (if different than the default specified in ../../src/suite_info.py), default=600. +- ``do_spinup`` + + - Set to ``.true.`` when allowing the model to spin up before the "official" + model integration starts. + +- ``spinup_timesteps`` + + - Number of timesteps to spin up when ``do_spinup`` is true + +- ``lsm_ics`` + + - Set to ``.true.`` when LSM initial conditions are included (but not all ICs from + another model) + .. _`case input`: Case input data file (DEPHY format) @@ -143,20 +153,25 @@ following observational field campaigns: (LASSO) for May 18, 2016 (with capability to run all LASSO dates - see :numref:`Section %s `) continental shallow convection +- GEWEX Atmospheric Boundary Layer Study (GABLS3) for July 1, 2006 + development of a nocturnal low-level jet + +- Multidisciplinary drifting Observatory for the Study of Arctic Climate + expedition (MOSAiC) + - SS: Strongly stably stratified boundary layer (March 2-10 2020) + - AMPS: Arctic mixed-phase stratocumuluas cloud (Oct 31 - Nov 5 2019) + +- Cold-Air Outbreaks in the Marine Boundary Layer Experiment (COMBLE) for + March 12, 2020 mixed phased clouds in the polar marine boundary layer + For the ARM SGP case, several case configuration files representing different time periods of the observational dataset are included, denoted by a trailing letter. The LASSO case may be run with different forcing applied, so three case configuration files corresponding to -these different forcing are included. 
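+
+Any of the included cases can be run by passing the name of its case
+configuration file (without the ``.nml`` extension) to the run script from the
+``ccpp-scm/scm/bin`` directory, assuming the SCM has already been built. A
+minimal invocation, using the default suite and namelist, would be:
+
+.. code:: bash
+
+   ./run_scm.py -c twpice
+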
In addition, two example cases are -included for using UFS Atmosphere initial conditions: - -- UFS initial conditions for 38.1 N, 98.5 W (central Kansas) for 00Z on - Oct. 3, 2016 with Noah variables on the C96 FV3 grid (``fv3_model_point_noah.nc``) +these different forcing are included. -- UFS initial conditions for 38.1 N, 98.5 W (central Kansas) for 00Z on - Oct. 3, 2016 with NoahMP variables on the C96 FV3 grid (``fv3_model_point_noahmp.nc``) - -See :numref:`Section %s ` for information on how to generate these +In addition, cases can be generated from UFS initial conditions See +:numref:`Section %s ` for information on how to generate these files for other locations and dates, given appropriate UFS Atmosphere initial conditions and output. @@ -166,11 +181,10 @@ How to set up new cases Setting up a new case involves preparing the two types of files listed above. For the case initialization and forcing data file, this typically involves writing a custom script or program to parse the data from its -original format to the format that the SCM expects, listed above. An -example of this type of script written in Python is included in ``ccpp-scm/scm/etc/scripts/twpice_forcing_file_generator.py``. The -script reads in the data as supplied from its source, converts any -necessary variables, and writes a NetCDF (version 4) file in the format -described in subsection :numref:`Subsection %s `. +original format to the DEPHY format, listed above. Formatting for DEPHY +is documented in the `DEPHY repository +`__. + For reference, the following formulas are used: .. math:: \theta_{il} = \theta - \frac{\theta}{T}\left(\frac{L_v}{c_p}q_l + \frac{L_s}{c_p}q_i\right) @@ -186,67 +200,6 @@ specific humidity, :math:`q_v` is the water vapor specific humidity, :math:`q_l` is the suspended liquid water specific humidity, and :math:`q_i` is the suspended ice water specific humidity. -As shown in the example NetCDF header, the SCM expects that the vertical -dimension is pressure levels (index 1 is the surface) and the time -dimension is in seconds. The initial conditions expected are the height -of the pressure levels in meters, and arrays representing vertical -columns of :math:`\theta_{il}` in K, :math:`q_t`, :math:`q_l`, and -:math:`q_i` in kg kg\ :math:`^{-1}`, :math:`u` and :math:`v` in m -s\ :math:`^{-1}`, turbulence kinetic energy in m\ :math:`^2` -s\ :math:`^{-2}` and ozone mass mixing ratio in kg kg\ :math:`^{-1}`. - -For forcing data, the SCM expects a time series of the following -variables: latitude and longitude in decimal degrees [in case the -column(s) is moving in time (e.g., Lagrangian column)], the surface -pressure (Pa) and surface temperature (K). If surface fluxes are -specified for the new case, one must also include a time series of the -kinematic surface sensible heat flux (K m s\ :math:`^{-1}`) and -kinematic surface latent heat flux (kg kg\ :math:`^{-1}` m -s\ :math:`^{-1}`). The following variables are expected as 2-dimensional -arrays (time first, vertical levels second): the geostrophic u (E-W) and -v (N-S) winds (m s\ :math:`^{-1}`), and the horizontal and vertical -advective tendencies of :math:`\theta_{il}` (K s\ :math:`^{-1}`) and -:math:`q_t` (kg kg\ :math:`^{-1}` s\ :math:`^{-1}`), the large scale -vertical velocity (m s\ :math:`^{-1}`), large scale pressure vertical -velocity (Pa s\ :math:`^{-1}`), the prescribed radiative heating rate (K -s\ :math:`^{-1}`), and profiles of u, v, T, :math:`\theta_{il}` and -:math:`q_t` to use for nudging. 
- -Although it is expected that all variables are in the NetCDF file, only -those that are used with the chosen forcing method are required to be -nonzero. For example, the following variables are required depending on -the values of ``mom_forcing_type`` and ``thermo_forcing_type`` specified in the case configuration file: - -- ``mom_forcing_type = 1`` - - - Not implemented yet - -- ``mom_forcing_type = 2`` - - - geostrophic winds and large scale vertical velocity - -- ``mom_forcing_type = 3`` - - - u and v nudging profiles - -- ``thermo_forcing_type = 1`` - - - horizontal and vertical advective tendencies of - :math:`\theta_{il}` and :math:`q_t` and prescribed radiative - heating (can be zero if radiation scheme is active) - -- ``thermo_forcing_type = 2`` - - - horizontal advective tendencies of :math:`\theta_{il}` and - :math:`q_t`, prescribed radiative heating (can be zero if - radiation scheme is active), and the large scale vertical pressure - velocity - -- ``thermo_forcing_type = 3`` - - - :math:`\theta_{il}` and :math:`q_t` nudging profiles and the large - scale vertical pressure velocity - For the case configuration file, it is most efficient to copy an existing file in ``ccpp-scm/scm/etc/case_config`` and edit it to suit one’s case. Recall from subsection :numref:`Subsection %s ` that this file is used to configure @@ -260,17 +213,14 @@ configured for the case (without the file extension). The parameter should be less than or equal to the length of the forcing data unless the desired behavior of the simulation is to proceed with the last specified forcing values after the length of the forcing data has been -surpassed. The initial date and time should fall within the forcing -period specified in the case input data file. If the case input data is +surpassed. If the case input data is specified to a lower altitude than the vertical domain, the remainder of the column will be filled in with values from a reference profile. There is a tropical profile and mid-latitude summer profile provided, although one may add more choices by adding a data file to ``ccpp-scm/scm/data/processed_case_input`` and adding a parser section to the subroutine ``get_reference_profile`` in ``scm/src/scm_input.f90``. Surface fluxes can either be specified in the case input data file or calculated using a surface scheme using -surface properties. If surface fluxes are specified from data, set ``sfc_flux_spec`` to ``.true.`` -and specify ``sfc_roughness_length_cm`` for the surface over which the column resides. Otherwise,` -specify a ``sfc_type``. In addition, one must specify a ``column_area`` for each column. +surface properties. In addition, one must specify a ``column_area`` for each column. To control the forcing method, one must choose how the momentum and scalar variable forcing are applied. 
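+
+To put the parameters above in context, a pared-down case configuration file
+might look like the following sketch. This is a hypothetical example only: the
+namelist group name, the exact set of entries, and the values shown should be
+taken from one of the existing files in ``ccpp-scm/scm/etc/case_config`` rather
+than from this sketch::
+
+   $case_config
+     case_name = 'twpice',    ! must match a dataset in processed_case_input
+     input_type = 1,          ! DEPHY-SCM format
+     n_levels = 127,
+     dt = 600.0,
+     column_area = 2.0E9,     ! illustrative value only
+   $end
+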
The three methods of Randall and From 5100474e9c3bdad094787c7fff063ea88385086c Mon Sep 17 00:00:00 2001 From: Tracy Date: Mon, 29 Jul 2024 19:32:35 +0000 Subject: [PATCH 04/19] more ug updates --- scm/doc/TechGuide/chap_cases.rst | 6 +++--- scm/doc/TechGuide/chap_function.rst | 2 +- scm/doc/TechGuide/chap_intro.rst | 2 -- scm/doc/TechGuide/chap_repo.rst | 2 +- 4 files changed, 5 insertions(+), 7 deletions(-) diff --git a/scm/doc/TechGuide/chap_cases.rst b/scm/doc/TechGuide/chap_cases.rst index 621401aaf..20cf32a99 100644 --- a/scm/doc/TechGuide/chap_cases.rst +++ b/scm/doc/TechGuide/chap_cases.rst @@ -88,12 +88,12 @@ arguments) are: - ``spinup_timesteps`` - - Number of timesteps to spin up when ``do_spinup`` is true + - Number of timesteps to spin up when ``do_spinup`` is true - ``lsm_ics`` - - Set to ``.true.`` when LSM initial conditions are included (but not all ICs from - another model) + - Set to ``.true.`` when LSM initial conditions are included (but not all ICs from + another model) .. _`case input`: diff --git a/scm/doc/TechGuide/chap_function.rst b/scm/doc/TechGuide/chap_function.rst index 15543244d..88dd9c966 100644 --- a/scm/doc/TechGuide/chap_function.rst +++ b/scm/doc/TechGuide/chap_function.rst @@ -34,7 +34,7 @@ The following steps are performed at the beginning of program execution: sets some variables within the ``scm_state`` derived type from the data that was read. -#. Call ``get_case_init()`` (or ``get_case_init_DEPHY()`` if using the DEPHY format) in the ``scm_input`` module to read in the +#. Call ``get_case_init_DEPHY()`` in the ``scm_input`` module to read in the case input data file (see :numref:`Section %s `). This subroutine also sets some variables within the ``scm_input`` derived type from the data that was read. diff --git a/scm/doc/TechGuide/chap_intro.rst b/scm/doc/TechGuide/chap_intro.rst index 1bcd768ef..a879fea7f 100644 --- a/scm/doc/TechGuide/chap_intro.rst +++ b/scm/doc/TechGuide/chap_intro.rst @@ -60,8 +60,6 @@ Minor - Updated `Scientific Documentation `__, User's Guide, Technical Documentation, and online tutorials. -- Generalized plotting and visualization tools. - Limitations ~~~~~~~~~~~ diff --git a/scm/doc/TechGuide/chap_repo.rst b/scm/doc/TechGuide/chap_repo.rst index fcc2a46e2..4e5a73cd0 100644 --- a/scm/doc/TechGuide/chap_repo.rst +++ b/scm/doc/TechGuide/chap_repo.rst @@ -40,7 +40,7 @@ Cubed-Sphere (FV3) dynamical core. 
| ``├── docker`` | ``│   └── Dockerfile`` - Contains Docker instructions for building the CCPP SCM image | ``├── environment-suite-sim.yml`` - Python environment dependency file for the CCPP Suite Simulator -| ``├── environment-ufsreplay.yml`` - Python environment dependency file for the UFS Case Generator capability +| ``├── environment-ufscasegen.yml`` - Python environment dependency file for the UFS Case Generator capability | ``├── environment.yml`` - Python environment dependency file for the SCM | ``├── scm`` | ``│   ├── LICENSE.txt`` - Contains licensing information From 32568dd0eca563f6afb4a1ff439af608eab36764 Mon Sep 17 00:00:00 2001 From: Tracy Date: Wed, 31 Jul 2024 20:21:32 +0000 Subject: [PATCH 05/19] update case-gen scripts for LAM capability --- scm/etc/scripts/UFS_case_gen.py | 34 +++++++++++-------- .../scripts/UFS_forcing_ensemble_generator.py | 2 ++ 2 files changed, 22 insertions(+), 14 deletions(-) diff --git a/scm/etc/scripts/UFS_case_gen.py b/scm/etc/scripts/UFS_case_gen.py index 54802d3a7..8e049bb4c 100755 --- a/scm/etc/scripts/UFS_case_gen.py +++ b/scm/etc/scripts/UFS_case_gen.py @@ -459,12 +459,15 @@ def find_lon_lat_of_indices(indices, dir, tile, lam): ######################################################################################## # ######################################################################################## -def find_loc_indices_UFS_history(loc, dir): +def find_loc_indices_UFS_history(loc, dir, lam): """Find the nearest neighbor UFS history file grid point given a lon/lat pair""" #returns the indices of the nearest neighbor point in the given tile, the lon/lat of the nearest neighbor, #and the distance (m) from the given point to the nearest neighbor grid cell - - filename_pattern = 'atmf000.nc' + + if lam: + filename_pattern = 'dynf000.nc' + else: + filename_pattern = 'atmf000.nc' for f_name in os.listdir(dir): if fnmatch.fnmatch(f_name, filename_pattern): @@ -657,7 +660,7 @@ def check_IC_hist_surface_compatibility(dir, i, j, surface_data, lam, old_chgres # Determine UFS history file format (tiled/quilted) if lam: - filename_pattern = '*sfcf000.tile{}.nc'.format(tile) + filename_pattern = '*phyf000.nc' else: filename_pattern = '*sfcf000.nc' @@ -738,7 +741,7 @@ def get_IC_data_from_UFS_history(dir, i, j, lam, tile): # Determine UFS history file format (tiled/quilted) if lam: - filename_pattern = '*atmf000.tile{}.nc'.format(tile) + filename_pattern = '*dynf000.nc' else: filename_pattern = '*atmf000.nc' @@ -1956,8 +1959,8 @@ def get_UFS_forcing_data_advective_tendency(dir, i, j, tile, neighbors, dx, dy, # Determine UFS history file format (tiled/quilted) if lam: - atm_ftag = 'atmf*.tile{0}.nc'.format(tile) - sfc_ftag = 'sfcf*.tile{0}.nc'.format(tile) + atm_ftag = '*dynf*.nc' + sfc_ftag = '*phyf*.nc' else: atm_ftag = '*atmf*.nc' sfc_ftag = '*sfcf*.nc' @@ -2308,8 +2311,8 @@ def get_UFS_forcing_data(nlevs, state_IC, location, use_nearest, forcing_dir, gr # Determine UFS history file format (tiled/quilted) if lam: - atm_ftag = 'atmf*.tile{0}.nc'.format(tile) - sfc_ftag = 'sfcf*.tile{0}.nc'.format(tile) + atm_ftag = '*dynf*.nc' + sfc_ftag = '*phyf*.nc' else: atm_ftag = '*atmf*.nc' sfc_ftag = '*sfcf*.nc' @@ -3625,9 +3628,12 @@ def write_comparison_file(comp_data, case_name, date, surface): ######################################################################################## # ######################################################################################## -def find_date(forcing_dir): - - atm_ftag = '*atmf*.nc' +def find_date(forcing_dir, 
lam): + + if lam: + atm_ftag = '*dynf*.nc' + else: + atm_ftag = '*atmf*.nc' atm_filenames = [] for f_name in os.listdir(forcing_dir): @@ -3668,7 +3674,7 @@ def main(): old_chgres, lam, save_comp, use_nearest, forcing_method, vertical_method, geos_wind_forcing, wind_nudge) = parse_arguments() #find indices corresponding to both UFS history files and initial condition (IC) files - (hist_i, hist_j, hist_lon, hist_lat, hist_dist_min, angle_to_hist_point, neighbors, dx, dy) = find_loc_indices_UFS_history(location, forcing_dir) + (hist_i, hist_j, hist_lon, hist_lat, hist_dist_min, angle_to_hist_point, neighbors, dx, dy) = find_loc_indices_UFS_history(location, forcing_dir, lam) (IC_i, IC_j, tile, IC_lon, IC_lat, IC_dist_min, angle_to_IC_point) = find_loc_indices_UFS_IC(location, grid_dir, lam, tile, indices) @@ -3704,7 +3710,7 @@ def main(): if not date: # date was not included on command line; look in atmf* file for initial date - date = find_date(forcing_dir) + date = find_date(forcing_dir, lam) #get grid cell area if not given if not area: diff --git a/scm/etc/scripts/UFS_forcing_ensemble_generator.py b/scm/etc/scripts/UFS_forcing_ensemble_generator.py index 26b15e89a..1cacf9ec6 100755 --- a/scm/etc/scripts/UFS_forcing_ensemble_generator.py +++ b/scm/etc/scripts/UFS_forcing_ensemble_generator.py @@ -27,6 +27,7 @@ parser.add_argument('-sdf', '--suite', help='CCPP suite definition file to use for ensemble', default = 'SCM_GFS_v16') parser.add_argument('-sc', '--save_comp', help='flag to save a file with UFS data for comparisons', action='store_true') parser.add_argument('-near', '--use_nearest', help='flag to indicate using the nearest UFS history file gridpoint, no regridding',action='store_true') +parser.add_argument('-lam', '--lam', help='flag to signal that the ICs and forcing is from a limited-area model run' ,action='store_true') parser.add_argument('-fm', '--forcing_method', help='method used to calculate forcing (1=total tendencies from UFS dycore, 2=advective terms calculated from UFS history files, 3=total time tendency terms calculated)', type=int, choices=range(1,4), default=2) parser.add_argument('-vm', '--vertical_method',help='method used to calculate vertical advective forcing (1=vertical advective terms calculated from UFS history files and added to total, 2=smoothed vertical velocity provided)', type=int, choices=range(1,3), default=2) parser.add_argument('-wn', '--wind_nudge', help='flag to turn on wind nudging to UFS profiles', action='store_true') @@ -142,6 +143,7 @@ def main(): com_config = '' if args.save_comp: com_config = com_config + ' -sc' if args.use_nearest: com_config = com_config + ' -near' + if args.lam: com_config = com_config + ' -lam' if args.forcing_method: com_config = com_config + ' -fm ' + str(args.forcing_method) if args.vertical_method: com_config = com_config + ' -vm ' + str(args.vertical_method) if args.wind_nudge: com_config = com_config + ' -wn' From f361ab167a0ef5c994dd927e848cdb4f47247e50 Mon Sep 17 00:00:00 2001 From: Tracy Date: Tue, 13 Aug 2024 23:09:20 +0000 Subject: [PATCH 06/19] Remove redundancy --- scm/doc/TechGuide/chap_quick.rst | 9 +-------- 1 file changed, 1 insertion(+), 8 deletions(-) diff --git a/scm/doc/TechGuide/chap_quick.rst b/scm/doc/TechGuide/chap_quick.rst index 205ee2947..19dbfd405 100644 --- a/scm/doc/TechGuide/chap_quick.rst +++ b/scm/doc/TechGuide/chap_quick.rst @@ -220,14 +220,7 @@ are set to the correct values so that the build system can find them, as describ Setting up compilation environment 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -For users on a pre-configured platform, you can load the spack-stack environment via one of the provided modules in ``scm/etc/modules/``. -For example, users on the NSF NCAR machine Derecho who wish to use Intel compilers can do the following: - -:: - - cd [path/to/ccpp-scm/] - module use scm/etc/modules/ - module load derecho_intel +For users on a pre-configured platform, the spack-stack environment can be loaded via one of the provided modules in ``scm/etc/modules/`` as described in :numref:`Section %s `. Additionally, for users who have installed spack-stack on their own MacOS or Linux machine can use the provided ``macos_clang`` or ``linux_gnu`` modules. From 0a58e4fd972decae13cba6c7bc66ba2d01e4adf7 Mon Sep 17 00:00:00 2001 From: Grant Firl Date: Wed, 14 Aug 2024 15:07:18 -0400 Subject: [PATCH 07/19] update docker running instructions for mpi_command that requires root --- scm/doc/TechGuide/chap_quick.rst | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/scm/doc/TechGuide/chap_quick.rst b/scm/doc/TechGuide/chap_quick.rst index 19dbfd405..78191f568 100644 --- a/scm/doc/TechGuide/chap_quick.rst +++ b/scm/doc/TechGuide/chap_quick.rst @@ -860,12 +860,12 @@ Running the Docker image #. To run the SCM, you can run the Docker container that was just created and give it the same run commands as discussed in :numref:`Section %s ` - **Be sure to remember to include the ``-d`` - include the option for all run commands**. For example, + **Be sure to remember to include the ``-d`` and ``--mpi_command "mpirun -np 1 --allow-run-as-root"`` + options for all run commands**. For example, .. code:: bash - docker run --rm -it -v ${OUT_DIR}:/home --name run-ccpp-scm ccpp-scm ./run_scm.py -c twpice -d + docker run --rm -it -v ${OUT_DIR}:/home --name run-ccpp-scm ccpp-scm ./run_scm.py -c twpice --mpi_command "mpirun -np 1 --allow-run-as-root" -d will run through the TWPICE case using the default suite and namelist and put the output in the shared directory. @@ -878,7 +878,7 @@ Running the Docker image .. code:: bash - docker run --rm -it -v ${OUT_DIR}:/home --name run-ccpp-scm ccpp-scm ./run_scm.py -f ../../test/rt_test_cases.py --runtime_mult 0.1 -d + docker run --rm -it -v ${OUT_DIR}:/home --name run-ccpp-scm ccpp-scm ./run_scm.py -f ../../test/rt_test_cases.py --runtime_mult 0.1 --mpi_command "mpirun -np 1 --allow-run-as-root" -d The options included in the above ``run`` commands are the following: From c874ad708863bb4e4bdb647766bb4e2e308ba0c5 Mon Sep 17 00:00:00 2001 From: Grant Firl Date: Wed, 14 Aug 2024 16:26:40 -0400 Subject: [PATCH 08/19] update forcing_method 1 in chap_cases --- scm/doc/TechGuide/chap_cases.rst | 24 +++++++++++++++++++++++- 1 file changed, 23 insertions(+), 1 deletion(-) diff --git a/scm/doc/TechGuide/chap_cases.rst b/scm/doc/TechGuide/chap_cases.rst index 20cf32a99..4faf7ba22 100644 --- a/scm/doc/TechGuide/chap_cases.rst +++ b/scm/doc/TechGuide/chap_cases.rst @@ -351,7 +351,8 @@ Optional arguments: #. ``--save_comp (-sc)``: flag to create UFS reference file for comparison -#. ``--use_nearest (-near)``: flag to indicate using the nearest UFS history file gridpoint +#. ``--use_nearest (-near)``: flag to indicate using the nearest UFS history file gridpoint for calculation + of forcing; only valid for use with -fm=1 or -fm=3 #. 
``--forcing_method (-fm)``: method used to calculate forcing (1=total tendencies from UFS dycore, 2=advective terms calculated from UFS history files, 3=total time tendency terms calculated), default=2 @@ -363,6 +364,27 @@ Optional arguments: #. ``--geostrophic (-geos)``: flag to turn on geostrophic wind forcing +Notes Regarding Implemented Forcing Methods + +The ``--forcing_method`` option hides some complexity that should be understood when running this script since +each method has a particular use case and produces potentially very different forcing terms. Forcing method 1 is +designed to be used in concert with the three-dimensional UFS. I.e., the UFS must be run with diagnostic tendencies +activated so that the `nophysics` term is calculated and output for all grid points. This diagnostic term +represents the tendency produced for each state variable by the UFS between calls to the "slow" physics. This +includes the tendency due to advection, but also any tendencies due to other non-physics processes, e.g. "fast" +physics, coupling to external components, data assimilation, etc. Within the SCM, this diagnostic is used as the +forcing term for each state variable. Although one can achieve results as close as possible between a UFS column +and the single column model using this method, it will NOT be bit-for-bit for many reasons. Some of these reasons +include: diagnostic output is not typically instantaneous for every timestep, the UFS' vertical coordinate is +semi-Lagrangian and includes a remapping step as the surface pressure changes for each column, whereas the SCM +uses a Eulerian vertical coordinate without the vertical remapping step, and some interpolation happens in the +UFS_case_gen.py script due to the UFS initial conditions and history files using different grids. This method +can only be used when the UFS has been configured and run with the anticipation of running the SCM using this +forcing method afterward because it requires considerable extra disk space for the additional output. + + + + .. _`ufsforcingensemblegenerator`: UFS_forcing_ensemble_generator.py From 7093c54c808953e1bf5d88af56b76c2761f903a6 Mon Sep 17 00:00:00 2001 From: "Michael J. 
Kavulich, Jr" Date: Thu, 18 Jul 2024 18:35:35 +0000 Subject: [PATCH 09/19] Upgrade modulefiles for spack-stack 1.6.0 --- scm/etc/modules/derecho_gnu.lua | 14 +++++++------- scm/etc/modules/derecho_intel.lua | 16 ++++++++-------- scm/etc/modules/hera_gnu.lua | 2 +- scm/etc/modules/hera_intel.lua | 2 +- scm/etc/modules/jet_gnu.lua | 15 +++++---------- scm/etc/modules/jet_intel.lua | 17 ++++++----------- scm/etc/modules/linux_gnu.lua | 6 ++---- scm/etc/modules/macos_clang.lua | 6 ++---- scm/etc/modules/orion_gnu.lua | 12 +++++------- scm/etc/modules/orion_intel.lua | 11 +++++------ 10 files changed, 42 insertions(+), 59 deletions(-) diff --git a/scm/etc/modules/derecho_gnu.lua b/scm/etc/modules/derecho_gnu.lua index be07155c4..70c01bd92 100644 --- a/scm/etc/modules/derecho_gnu.lua +++ b/scm/etc/modules/derecho_gnu.lua @@ -1,28 +1,28 @@ help([[ This module loads libraries for building the CCPP Single-Column Model on -the CISL machine Derecho (Cray) using Intel-classic-2023.0.0 +the CISL machine Derecho (Cray) using GNU 12.2.0 ]]) whatis([===[Loads spack-stack libraries needed for building the CCPP SCM on Derecho with GNU compilers]===]) setenv("LMOD_TMOD_FIND_FIRST","yes") load("ncarenv/23.09") -load("cmake/3.26.3") -prepend_path("MODULEPATH","/glade/work/epicufsrt/contrib/spack-stack/derecho/modulefiles") -prepend_path("MODULEPATH","/glade/work/epicufsrt/contrib/spack-stack/derecho/spack-stack-1.5.1/envs/unified-env/install/modulefiles/Core") +prepend_path("MODULEPATH","/lustre/desc1/scratch/epicufsrt/contrib/modulefiles_extra") +prepend_path("MODULEPATH","/glade/work/epicufsrt/contrib/spack-stack/derecho/spack-stack-1.6.0/envs/unified-env/install/modulefiles/Core") load("stack-gcc/12.2.0") load("stack-cray-mpich/8.1.25") load("stack-python/3.10.8") load("py-f90nml") load("py-netcdf4/1.5.8") +load("cmake/3.23.1") load("netcdf-c/4.9.2") -load("netcdf-fortran/4.6.0") +load("netcdf-fortran/4.6.1") load("bacio/2.4.1") -load("sp/2.3.3") -load("w3emc") +load("sp/2.5.0") +load("w3emc/2.10.0") setenv("CMAKE_C_COMPILER","mpicc") setenv("CMAKE_CXX_COMPILER","mpicxx") diff --git a/scm/etc/modules/derecho_intel.lua b/scm/etc/modules/derecho_intel.lua index 37f784bfe..07b8d3627 100644 --- a/scm/etc/modules/derecho_intel.lua +++ b/scm/etc/modules/derecho_intel.lua @@ -1,28 +1,28 @@ help([[ This module loads libraries for building the CCPP Single-Column Model on -the CISL machine Derecho (Cray) using Intel-classic-2023.0.0 +the CISL machine Derecho (Cray) using Intel-classic-2021.10.0 ]]) whatis([===[Loads spack-stack libraries needed for building the CCPP SCM on Derecho with Intel compilers]===]) setenv("LMOD_TMOD_FIND_FIRST","yes") load("ncarenv/23.09") -load("cmake/3.26.3") -prepend_path("MODULEPATH","/glade/work/epicufsrt/contrib/spack-stack/derecho/modulefiles") -prepend_path("MODULEPATH","/glade/work/epicufsrt/contrib/spack-stack/derecho/spack-stack-1.5.1/envs/unified-env/install/modulefiles/Core") +prepend_path("MODULEPATH","/lustre/desc1/scratch/epicufsrt/contrib/modulefiles_extra") +prepend_path("MODULEPATH", "/glade/work/epicufsrt/contrib/spack-stack/derecho/spack-stack-1.6.0/envs/unified-env/install/modulefiles/Core") load("stack-intel/2021.10.0") load("stack-cray-mpich/8.1.25") -load("stack-python/3.10.8") +load("stack-python/3.10.13") load("py-f90nml") load("py-netcdf4/1.5.8") +load("cmake/3.23.1") load("netcdf-c/4.9.2") -load("netcdf-fortran/4.6.0") +load("netcdf-fortran/4.6.1") load("bacio/2.4.1") -load("sp/2.3.3") -load("w3emc") +load("sp/2.5.0") +load("w3emc/2.10.0") 
setenv("CMAKE_C_COMPILER","cc") setenv("CMAKE_CXX_COMPILER","CC") diff --git a/scm/etc/modules/hera_gnu.lua b/scm/etc/modules/hera_gnu.lua index 10d0c82f4..5725ba371 100644 --- a/scm/etc/modules/hera_gnu.lua +++ b/scm/etc/modules/hera_gnu.lua @@ -12,10 +12,10 @@ prepend_path("MODULEPATH", "/scratch2/NCEPDEV/stmp1/role.epic/spack-stack/spack- load("stack-gcc/13.3.0") load("stack-openmpi/4.1.6") -load("cmake/3.23.1") load("stack-python/3.10.13") load("py-f90nml") load("py-netcdf4/1.5.8") +load("cmake/3.23.1") load("netcdf-c/4.9.2") load("netcdf-fortran/4.6.1") diff --git a/scm/etc/modules/hera_intel.lua b/scm/etc/modules/hera_intel.lua index e682bef4e..35e483934 100644 --- a/scm/etc/modules/hera_intel.lua +++ b/scm/etc/modules/hera_intel.lua @@ -9,10 +9,10 @@ prepend_path("MODULEPATH", "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-s load("stack-intel/2021.5.0") load("stack-intel-oneapi-mpi/2021.5.1") -load("cmake/3.23.1") load("stack-python/3.10.13") load("py-f90nml") load("py-netcdf4/1.5.8") +load("cmake/3.23.1") load("netcdf-c/4.9.2") load("netcdf-fortran/4.6.1") diff --git a/scm/etc/modules/jet_gnu.lua b/scm/etc/modules/jet_gnu.lua index 96d5a1c97..090a61af7 100644 --- a/scm/etc/modules/jet_gnu.lua +++ b/scm/etc/modules/jet_gnu.lua @@ -5,25 +5,20 @@ the NOAA RDHPC machine Jet using GNU 9.2.0 whatis([===[Loads libraries needed for building the CCPP SCM on Jet with GNU compilers ]===]) -prepend_path("MODULEPATH", "/lfs4/HFIP/hfv3gfs/spack-stack/modulefiles") - -load("cmake/3.26.4") -load("miniconda/3.9.12") - -prepend_path("MODULEPATH", "/lfs4/HFIP/hfv3gfs/role.epic/modulefiles") -prepend_path("MODULEPATH", "/mnt/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-1.5.1/envs/unified-env/install/modulefiles/Core") +prepend_path("MODULEPATH","/mnt/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-1.6.0/envs/unified-env-rocky8/install/modulefiles/Core") load("stack-gcc/9.2.0") load("stack-openmpi/3.1.4") load("stack-python/3.10.8") load("py-f90nml") load("py-netcdf4/1.5.8") +load("cmake/3.23.1") load("netcdf-c/4.9.2") -load("netcdf-fortran/4.6.0") +load("netcdf-fortran/4.6.1") load("bacio/2.4.1") -load("sp/2.3.3") -load("w3emc") +load("sp/2.5.0") +load("w3emc/2.10.0") setenv("CMAKE_C_COMPILER","mpicc") setenv("CMAKE_CXX_COMPILER","mpicxx") diff --git a/scm/etc/modules/jet_intel.lua b/scm/etc/modules/jet_intel.lua index 8c232ed26..5376fe006 100644 --- a/scm/etc/modules/jet_intel.lua +++ b/scm/etc/modules/jet_intel.lua @@ -5,25 +5,20 @@ the NOAA RDHPC machine Jet using Intel-2021.5.0 whatis([===[Loads libraries needed for building the CCPP SCM on Jet with Intel compilers ]===]) -prepend_path("MODULEPATH", "/lfs4/HFIP/hfv3gfs/spack-stack/modulefiles") - -load("cmake/3.26.4") -load("miniconda/3.9.12") - -prepend_path("MODULEPATH", "/lfs4/HFIP/hfv3gfs/role.epic/modulefiles") -prepend_path("MODULEPATH", "/mnt/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-1.5.1/envs/unified-env/install/modulefiles/Core") +prepend_path("MODULEPATH","/mnt/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-1.6.0/envs/unified-env-rocky8/install/modulefiles/Core") load("stack-intel/2021.5.0") load("stack-intel-oneapi-mpi/2021.5.1") -load("stack-python/3.10.8") +load("stack-python/3.10.13") load("py-f90nml") load("py-netcdf4/1.5.8") +load("cmake/3.23.1") load("netcdf-c/4.9.2") -load("netcdf-fortran/4.6.0") +load("netcdf-fortran/4.6.1") load("bacio/2.4.1") -load("sp/2.3.3") -load("w3emc") +load("sp/2.5.0") +load("w3emc/2.10.0") setenv("CMAKE_C_COMPILER","mpiicc") setenv("CMAKE_CXX_COMPILER","mpiicpc") diff 
--git a/scm/etc/modules/linux_gnu.lua b/scm/etc/modules/linux_gnu.lua index 1c0c65efa..488706d2e 100644 --- a/scm/etc/modules/linux_gnu.lua +++ b/scm/etc/modules/linux_gnu.lua @@ -17,13 +17,11 @@ load("stack-openmpi/4.1.6") load("cmake/3.28.3") -load("netcdf-c/4.9.2") -load("netcdf-fortran/4.6.1") - load("py-f90nml/1.4.3") load("py-netcdf4/1.5.8") - +load("netcdf-c/4.9.2") +load("netcdf-fortran/4.6.1") load("bacio/2.4.1") load("sp/2.5.0") load("w3emc/2.10.0") diff --git a/scm/etc/modules/macos_clang.lua b/scm/etc/modules/macos_clang.lua index 41aef7680..b25aad8ca 100644 --- a/scm/etc/modules/macos_clang.lua +++ b/scm/etc/modules/macos_clang.lua @@ -15,13 +15,11 @@ load("stack-openmpi/4.1.6") load("cmake/3.28.3") -load("netcdf-c/4.9.2") -load("netcdf-fortran/4.6.1") - load("py-f90nml/1.4.3") load("py-netcdf4/1.5.8") - +load("netcdf-c/4.9.2") +load("netcdf-fortran/4.6.1") load("bacio/2.4.1") load("sp/2.5.0") load("w3emc/2.10.0") diff --git a/scm/etc/modules/orion_gnu.lua b/scm/etc/modules/orion_gnu.lua index 17eb12b09..503045b80 100644 --- a/scm/etc/modules/orion_gnu.lua +++ b/scm/etc/modules/orion_gnu.lua @@ -7,22 +7,20 @@ whatis([===[Loads libraries needed for building the CCPP SCM on Orion with GNU c prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/modulefiles") -load("cmake/3.22.1") -load("python/3.9.2") - -prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.5.1/envs/unified-env/install/modulefiles/Core") +prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/unified-env-rocky9/install/modulefiles/Core") load("stack-gcc/10.2.0") load("stack-openmpi/4.0.4") load("stack-python/3.10.8") load("py-f90nml") load("py-netcdf4/1.5.8") +load("cmake/3.23.1") load("netcdf-c/4.9.2") -load("netcdf-fortran/4.6.0") +load("netcdf-fortran/4.6.1") load("bacio/2.4.1") -load("sp/2.3.3") -load("w3emc") +load("sp/2.5.0") +load("w3emc/2.10.0") setenv("CMAKE_C_COMPILER","mpicc") setenv("CMAKE_CXX_COMPILER","mpicxx") diff --git a/scm/etc/modules/orion_intel.lua b/scm/etc/modules/orion_intel.lua index 005119c5d..5be8a9abf 100644 --- a/scm/etc/modules/orion_intel.lua +++ b/scm/etc/modules/orion_intel.lua @@ -7,9 +7,6 @@ whatis([===[Loads libraries needed for building the CCPP SCM on Orion with Intel prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/modulefiles") -load("cmake/3.22.1") -load("python/3.9.2") - prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.5.1/envs/unified-env/install/modulefiles/Core") load("stack-intel/2022.0.2") @@ -17,12 +14,14 @@ load("stack-intel-oneapi-mpi/2021.5.1") load("stack-python/3.10.8") load("py-f90nml") load("py-netcdf4/1.5.8") +load("cmake/3.23.1") load("netcdf-c/4.9.2") -load("netcdf-fortran/4.6.0") +load("netcdf-fortran/4.6.1") load("bacio/2.4.1") -load("sp/2.3.3") -load("w3emc") +load("sp/2.5.0") +load("w3emc/2.10.0") + setenv("CMAKE_C_COMPILER","mpiicc") setenv("CMAKE_CXX_COMPILER","mpiicpc") From 241da127377c24f16b994e81d6bbba7d94bf2b20 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Thu, 18 Jul 2024 13:29:08 -0600 Subject: [PATCH 10/19] Remove defunct docker setup script --- scm/etc/CENTOS_docker_setup.sh | 20 -------------------- 1 file changed, 20 deletions(-) delete mode 100755 scm/etc/CENTOS_docker_setup.sh diff --git a/scm/etc/CENTOS_docker_setup.sh b/scm/etc/CENTOS_docker_setup.sh deleted file mode 100755 index c7164cfd0..000000000 --- a/scm/etc/CENTOS_docker_setup.sh +++ /dev/null @@ -1,20 +0,0 @@ 
-#!/bin/bash - -echo "Setting environment variables for SCM-CCPP on CENTOS with gcc/gfortran" - -MYDIR=$(cd "$(dirname "$(readlink -f -n "${BASH_SOURCE[0]}" )" )" && pwd -P) - -export SCM_ROOT=$MYDIR/../.. - -export CC=/opt/rh/devtoolset-9/root/usr/bin/gcc -export CXX=/opt/rh/devtoolset-9/root/usr/bin/g++ -export F77=/opt/rh/devtoolset-9/root/usr/bin/gfortran -export F90=/opt/rh/devtoolset-9/root/usr/bin/gfortran -export FC=/opt/rh/devtoolset-9/root/usr/bin/gfortran - -export NETCDF=/comsoftware/libs/netcdf - -echo "Running NCEPLIBS installation script for SCM-CCPP" -cd .. -./contrib/build_nceplibs.sh $PWD/nceplibs -cd scm From 02a92f29f5144e5a83086d5f21fdd29089bf076b Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Fri, 19 Jul 2024 11:26:41 -0600 Subject: [PATCH 11/19] Finally got Derecho GNU module working, missed some python updates in other modules --- scm/etc/modules/derecho_gnu.lua | 11 +++++------ scm/etc/modules/jet_gnu.lua | 2 +- scm/etc/modules/orion_gnu.lua | 2 +- scm/etc/modules/orion_intel.lua | 2 +- 4 files changed, 8 insertions(+), 9 deletions(-) diff --git a/scm/etc/modules/derecho_gnu.lua b/scm/etc/modules/derecho_gnu.lua index 70c01bd92..f7f64fb57 100644 --- a/scm/etc/modules/derecho_gnu.lua +++ b/scm/etc/modules/derecho_gnu.lua @@ -5,25 +5,24 @@ the CISL machine Derecho (Cray) using GNU 12.2.0 whatis([===[Loads spack-stack libraries needed for building the CCPP SCM on Derecho with GNU compilers]===]) -setenv("LMOD_TMOD_FIND_FIRST","yes") -load("ncarenv/23.09") - prepend_path("MODULEPATH","/lustre/desc1/scratch/epicufsrt/contrib/modulefiles_extra") prepend_path("MODULEPATH","/glade/work/epicufsrt/contrib/spack-stack/derecho/spack-stack-1.6.0/envs/unified-env/install/modulefiles/Core") load("stack-gcc/12.2.0") load("stack-cray-mpich/8.1.25") -load("stack-python/3.10.8") -load("py-f90nml") -load("py-netcdf4/1.5.8") +load("stack-python/3.10.13") load("cmake/3.23.1") +load("hdf5/1.14.0") load("netcdf-c/4.9.2") load("netcdf-fortran/4.6.1") load("bacio/2.4.1") load("sp/2.5.0") load("w3emc/2.10.0") +load("py-f90nml") +load("py-netcdf4/1.5.8") + setenv("CMAKE_C_COMPILER","mpicc") setenv("CMAKE_CXX_COMPILER","mpicxx") setenv("CMAKE_Fortran_COMPILER","mpif90") diff --git a/scm/etc/modules/jet_gnu.lua b/scm/etc/modules/jet_gnu.lua index 090a61af7..fc9569fe7 100644 --- a/scm/etc/modules/jet_gnu.lua +++ b/scm/etc/modules/jet_gnu.lua @@ -9,7 +9,7 @@ prepend_path("MODULEPATH","/mnt/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-st load("stack-gcc/9.2.0") load("stack-openmpi/3.1.4") -load("stack-python/3.10.8") +load("stack-python/3.10.13") load("py-f90nml") load("py-netcdf4/1.5.8") load("cmake/3.23.1") diff --git a/scm/etc/modules/orion_gnu.lua b/scm/etc/modules/orion_gnu.lua index 503045b80..2a10d5512 100644 --- a/scm/etc/modules/orion_gnu.lua +++ b/scm/etc/modules/orion_gnu.lua @@ -11,7 +11,7 @@ prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-st load("stack-gcc/10.2.0") load("stack-openmpi/4.0.4") -load("stack-python/3.10.8") +load("stack-python/3.10.13") load("py-f90nml") load("py-netcdf4/1.5.8") load("cmake/3.23.1") diff --git a/scm/etc/modules/orion_intel.lua b/scm/etc/modules/orion_intel.lua index 5be8a9abf..807f446b2 100644 --- a/scm/etc/modules/orion_intel.lua +++ b/scm/etc/modules/orion_intel.lua @@ -11,7 +11,7 @@ prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-st load("stack-intel/2022.0.2") load("stack-intel-oneapi-mpi/2021.5.1") -load("stack-python/3.10.8") +load("stack-python/3.10.13") load("py-f90nml") 
load("py-netcdf4/1.5.8") load("cmake/3.23.1") From 2bf8e24506aab760073ec069a36eb2c0655f75c3 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr." Date: Wed, 24 Jul 2024 13:54:52 -0600 Subject: [PATCH 12/19] Update quick start guide a bit --- scm/doc/TechGuide/chap_quick.rst | 13 ++++++------- 1 file changed, 6 insertions(+), 7 deletions(-) diff --git a/scm/doc/TechGuide/chap_quick.rst b/scm/doc/TechGuide/chap_quick.rst index 78a8b7155..3e119e2ea 100644 --- a/scm/doc/TechGuide/chap_quick.rst +++ b/scm/doc/TechGuide/chap_quick.rst @@ -491,10 +491,11 @@ To see the full list of available options, use the ``--help`` flag: The run script’s full set of options are described below, where optional abbreviations are included in brackets. If using the main branch, you should run the above command to ensure you have the most up-to-date list of options. +There are no required arguments, but at least one of ``--case`` or ``--file`` must be specified. - ``--case [-c]`` - - **This is the only required argument.** The provided argument should correspond to the name of a case in + - The provided argument should correspond to the name of a case in ``../etc/case_config`` (without the ``.nml`` extension). - ``--suite [-s]`` @@ -596,17 +597,15 @@ configuration files located in ``../etc/case_config`` (*without the .nml extensi specifying a suite other than the default, the suite name used must match the value of the suite name in one of the suite definition files located in ``../../ccpp/suites`` (Note: not the filename of the suite definition file). As -part of the sixth CCPP release, the following suite names are supported: +part of the seventh CCPP release, the following suite names are supported: #. SCM_GFS_v16 -#. SCM_GFS_v17p8 +#. SCM_GFS_v16_RRTMGP -#. SCM_RAP +#. SCM_GFS_v17_p8_ugwpv1 -#. SCM_HRRR - -#. SCM_RRFS_v1beta +#. SCM_HRRR_gf #. 
SCM_WoFS_v0 From 85c5ab37691a58e80f2db90deeff764909f52d2b Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Fri, 2 Aug 2024 12:57:04 -0600 Subject: [PATCH 13/19] Remove GNU modulefiles for Jet and Orion (not sure we can get these working and they arent necessary) --- scm/etc/modules/jet_gnu.lua | 26 -------------------------- scm/etc/modules/orion_gnu.lua | 28 ---------------------------- 2 files changed, 54 deletions(-) delete mode 100644 scm/etc/modules/jet_gnu.lua delete mode 100644 scm/etc/modules/orion_gnu.lua diff --git a/scm/etc/modules/jet_gnu.lua b/scm/etc/modules/jet_gnu.lua deleted file mode 100644 index fc9569fe7..000000000 --- a/scm/etc/modules/jet_gnu.lua +++ /dev/null @@ -1,26 +0,0 @@ -help([[ -This module loads libraries for building the CCPP Single-Column Model on -the NOAA RDHPC machine Jet using GNU 9.2.0 -]]) - -whatis([===[Loads libraries needed for building the CCPP SCM on Jet with GNU compilers ]===]) - -prepend_path("MODULEPATH","/mnt/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-1.6.0/envs/unified-env-rocky8/install/modulefiles/Core") - -load("stack-gcc/9.2.0") -load("stack-openmpi/3.1.4") -load("stack-python/3.10.13") -load("py-f90nml") -load("py-netcdf4/1.5.8") -load("cmake/3.23.1") - -load("netcdf-c/4.9.2") -load("netcdf-fortran/4.6.1") -load("bacio/2.4.1") -load("sp/2.5.0") -load("w3emc/2.10.0") - -setenv("CMAKE_C_COMPILER","mpicc") -setenv("CMAKE_CXX_COMPILER","mpicxx") -setenv("CMAKE_Fortran_COMPILER","mpif90") -setenv("CMAKE_Platform","jet.gnu") diff --git a/scm/etc/modules/orion_gnu.lua b/scm/etc/modules/orion_gnu.lua deleted file mode 100644 index 2a10d5512..000000000 --- a/scm/etc/modules/orion_gnu.lua +++ /dev/null @@ -1,28 +0,0 @@ -help([[ -This module loads libraries for building the CCPP Single-Column Model on -the NOAA RDHPC machine orion using GNU 10.2.0 -]]) - -whatis([===[Loads libraries needed for building the CCPP SCM on Orion with GNU compilers ]===]) - -prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/modulefiles") - -prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/unified-env-rocky9/install/modulefiles/Core") - -load("stack-gcc/10.2.0") -load("stack-openmpi/4.0.4") -load("stack-python/3.10.13") -load("py-f90nml") -load("py-netcdf4/1.5.8") -load("cmake/3.23.1") - -load("netcdf-c/4.9.2") -load("netcdf-fortran/4.6.1") -load("bacio/2.4.1") -load("sp/2.5.0") -load("w3emc/2.10.0") - -setenv("CMAKE_C_COMPILER","mpicc") -setenv("CMAKE_CXX_COMPILER","mpicxx") -setenv("CMAKE_Fortran_COMPILER","mpif90") -setenv("CMAKE_Platform","orion.gnu") From fd4e3073eb82d8b2090467255be1c79a4e5adc4d Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Fri, 2 Aug 2024 16:03:06 -0600 Subject: [PATCH 14/19] Instructions on building with macos/linux modulefiles --- scm/doc/TechGuide/chap_quick.rst | 18 ++++++++++++++---- 1 file changed, 14 insertions(+), 4 deletions(-) diff --git a/scm/doc/TechGuide/chap_quick.rst b/scm/doc/TechGuide/chap_quick.rst index 3e119e2ea..cb1757461 100644 --- a/scm/doc/TechGuide/chap_quick.rst +++ b/scm/doc/TechGuide/chap_quick.rst @@ -214,12 +214,9 @@ However, we have provided an example procedure in The main downside to spack-stack is that it contains a large number of libraries and utilities used by the whole Unified Forecast System and related applications, only a minority of which are required for the SCM. 
Users may install libraries manually if they wish, but they will need to make sure the appropriate environment variables -are set to the correct values so that the build system can find them, as described in the following chapter. +are set to the correct values so that the build system can find them, as described in the following paragraphs. -Setting up compilation environment -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - For users on a pre-configured platform, you can load the spack-stack environment via one of the provided modules in ``scm/etc/modules/``. For example, users on the NSF NCAR machine Derecho who wish to use Intel compilers can do the following: @@ -244,6 +241,19 @@ compilers (``CC``, ``CXX``, ``FC``), as well as the root directories for the lib provided Dockerfile in ``ccpp-scm/docker/``, so users can reference that file for guidance on how to install this software and set these variables. +If libraries were installed via spack-stack, users can load modules similarly to those available on pre-configured platforms. +For a user on MacOS, who has installed spack-stack with ``clang``/``gfortran`` compilers, they can set up the build environment +by setting the SPACK_STACK_DIR variable to the appropriate path, and loading the module as on pre-configured platforms described above. + +:: + + export SPACK_STACK_DIR=[/path/to/spack-stack] + cd [path/to/ccpp-scm/] + module use scm/etc/modules/ + module load macos_clang + +A module file is also provided for a generic linux platform with gnu compilers. For other platforms/combinations, you may be able +to modify the provided modulefiles to work with your spack-stack install, otherwise reference the above procedure for manually installed libraries. Python requirements """"""""""""""""""""" From c0a5d79a152837fcd53f5c47dac64e6331caf27c Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Fri, 2 Aug 2024 16:11:34 -0600 Subject: [PATCH 15/19] Remove SCM_RAP from supported suites --- scm/src/supported_suites.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/scm/src/supported_suites.py b/scm/src/supported_suites.py index 6c1f0d311..2beaff71f 100644 --- a/scm/src/supported_suites.py +++ b/scm/src/supported_suites.py @@ -1,2 +1,2 @@ -suites = ["SCM_GFS_v16","SCM_GFS_v16_RRTMGP","SCM_GFS_v17_p8_ugwpv1","SCM_RAP","SCM_HRRR_gf","SCM_WoFS_v0"] +suites = ["SCM_GFS_v16","SCM_GFS_v16_RRTMGP","SCM_GFS_v17_p8_ugwpv1","SCM_HRRR_gf","SCM_WoFS_v0"] From 41bfbbbe3c3840995d3684b17cb720932d2429c4 Mon Sep 17 00:00:00 2001 From: "Michael J. 
Kavulich, Jr" Date: Thu, 8 Aug 2024 16:43:01 -0500 Subject: [PATCH 16/19] Changes to get things working on Orion --- scm/etc/modules/orion_intel.lua | 8 +++----- 1 file changed, 3 insertions(+), 5 deletions(-) diff --git a/scm/etc/modules/orion_intel.lua b/scm/etc/modules/orion_intel.lua index 807f446b2..b95162412 100644 --- a/scm/etc/modules/orion_intel.lua +++ b/scm/etc/modules/orion_intel.lua @@ -5,12 +5,10 @@ the NOAA RDHPC machine Orion using Intel-2021.5.0 whatis([===[Loads libraries needed for building the CCPP SCM on Orion with Intel compilers ]===]) -prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/modulefiles") +prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/unified-env-rocky9/install/modulefiles/Core") -prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.5.1/envs/unified-env/install/modulefiles/Core") - -load("stack-intel/2022.0.2") -load("stack-intel-oneapi-mpi/2021.5.1") +load("stack-intel/2021.9.0") +load("stack-intel-oneapi-mpi/2021.9.0") load("stack-python/3.10.13") load("py-f90nml") load("py-netcdf4/1.5.8") From 0e51f537296a39a7a6e0cc998dceec3d2b49b1f7 Mon Sep 17 00:00:00 2001 From: "Michael J. Kavulich, Jr" Date: Tue, 13 Aug 2024 11:00:26 -0500 Subject: [PATCH 17/19] Address comments from Dustin --- scm/doc/TechGuide/chap_quick.rst | 1 - scm/src/run_scm.py | 3 +++ 2 files changed, 3 insertions(+), 1 deletion(-) diff --git a/scm/doc/TechGuide/chap_quick.rst b/scm/doc/TechGuide/chap_quick.rst index cb1757461..729635666 100644 --- a/scm/doc/TechGuide/chap_quick.rst +++ b/scm/doc/TechGuide/chap_quick.rst @@ -501,7 +501,6 @@ To see the full list of available options, use the ``--help`` flag: The run script’s full set of options are described below, where optional abbreviations are included in brackets. If using the main branch, you should run the above command to ensure you have the most up-to-date list of options. -There are no required arguments, but at least one of ``--case`` or ``--file`` must be specified. - ``--case [-c]`` diff --git a/scm/src/run_scm.py b/scm/src/run_scm.py index e610475ba..c6014e57e 100755 --- a/scm/src/run_scm.py +++ b/scm/src/run_scm.py @@ -195,6 +195,9 @@ def parse_arguments(): mpi_command = args.mpi_command stop_on_error = args.stop_on_error + if not case and not file: + parser.error('Either "--case" or "--file" must be specified. Use "--help" for more information.') + if not sdf: sdf = DEFAULT_SUITE From ce138f947a858d5800c7418e7ad4b393f98c533e Mon Sep 17 00:00:00 2001 From: Grant Firl Date: Thu, 15 Aug 2024 16:57:02 -0400 Subject: [PATCH 18/19] add more content regarding UFS_case_gen forcing methods --- scm/doc/TechGuide/chap_cases.rst | 43 +++++++++++++++++++++++++++++--- 1 file changed, 40 insertions(+), 3 deletions(-) diff --git a/scm/doc/TechGuide/chap_cases.rst b/scm/doc/TechGuide/chap_cases.rst index 4faf7ba22..a9c9bbdaa 100644 --- a/scm/doc/TechGuide/chap_cases.rst +++ b/scm/doc/TechGuide/chap_cases.rst @@ -369,7 +369,8 @@ Notes Regarding Implemented Forcing Methods The ``--forcing_method`` option hides some complexity that should be understood when running this script since each method has a particular use case and produces potentially very different forcing terms. Forcing method 1 is designed to be used in concert with the three-dimensional UFS. I.e., the UFS must be run with diagnostic tendencies -activated so that the `nophysics` term is calculated and output for all grid points. 
This diagnostic term
+activated so that the `nophysics` term is calculated and output for all grid points
+(see https://ccpp-techdoc.readthedocs.io/en/latest/ParamSpecificOutput.html#tendencies). This diagnostic term
 represents the tendency produced for each state variable by the UFS between calls to the "slow" physics. This
 includes the tendency due to advection, but also any tendencies due to other non-physics processes, e.g. "fast"
 physics, coupling to external components, data assimilation, etc. Within the SCM, this diagnostic is used as the
 forcing term for each state variable. Although one can achieve results as close as possible between a UFS column
 and the single column model using this method, it will NOT be bit-for-bit for many reasons. Some of these reasons
 include: diagnostic output is not typically instantaneous for every timestep, the UFS' vertical coordinate is
 semi-Lagrangian and includes a remapping step as the surface pressure changes for each column, whereas the SCM
 uses a Eulerian vertical coordinate without the vertical remapping step, and some interpolation happens in the
 UFS_case_gen.py script due to the UFS initial conditions and history files using different grids. This method
 can only be used when the UFS has been configured and run with the anticipation of running the SCM using this
 forcing method afterward because it requires considerable extra disk space for the additional output.
 
-
-
+The ``--forcing_method`` 2 option is the most general in the sense that the same method could apply to any three-dimensional
+model output. For a given column, it uses a configurable number of neighboring grid points to calculate the horizontal
+advective tendencies using the horizontal components of the three-dimensional wind and horizontal derivatives of the
+state variables. Note that the script performs some smoothing in the vertical profiles used to calculate the advective
+terms in order to eliminate small-scale noise in the forcing terms, and the derivatives are calculated using a second- or
+fourth-order centered difference scheme, depending on the number of neighboring points used. Vertical advective terms
+are calculated based on the specification of ``--vertical_method`` (-vm). For vertical_method 1, the vertical advective
+terms are calculated from the history files using UFS vertical velocities and the same modeled smoothed vertical profiles
+of the state variables with the upstream scheme. Note that while the horizontal terms use neighboring points, the vertical
+advective terms only use the central, chosen column. This method is sometimes referred to as "total advective forcing" and
+tends to be less "responsive" to the SCM-modeled state. I.e., an SCM run using vertical method 1 has a greater chance of
+deviating from the UFS column state and not being able to "recover". For this reason, vertical method 2 is often used
+in the literature, whereby the vertical velocity profile from the three-dimensional model is provided as forcing to the SCM
+and the vertical advective terms are calculated during the SCM integration using the SCM-modeled state variable profiles.
+
+The final forcing method, 3, uses the three-dimensional history files to calculate profiles of the total time-rate of change
+of the state variables to use as forcing for the SCM. Note that this is tantamount to strongly nudging the SCM state to the
+UFS state and already intrinsically includes both the physics and dynamics tendencies. While a simulation using this forcing
+is more-or-less guaranteed to produce an SCM simulation that closely matches the three-dimensional output of the state variables,
+it strongly minimizes the contribution of physics in the SCM simulation. Indeed, an SCM simulation without running a physics suite
+at all would still be expected to closely track the mean state of the three-dimensional column, so this method will likely be of
+limited use for physics studies.
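+
+As an illustration only, the hypothetical command lines below sketch how these options might be combined; ``[...]``
+is a placeholder for the case-specific required arguments (e.g., the chosen location/case name and the directories
+containing the UFS initial conditions and history files), which are independent of the forcing options.
+
+.. code:: bash
+
+   # Advective forcing terms calculated from neighboring UFS history file points (-fm 2),
+   # with the smoothed UFS vertical velocity provided to the SCM (-vm 2), geostrophic wind
+   # forcing turned on, and a UFS comparison file saved for evaluation:
+   ./UFS_case_gen.py [...] -fm 2 -vm 2 -geos -sc
+
+   # Total-tendency forcing from the UFS dycore diagnostics (-fm 1) at the nearest history
+   # file gridpoint (-near is only valid with -fm 1 or -fm 3), with wind nudging:
+   ./UFS_case_gen.py [...] -fm 1 -near -wn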
+
+Forcing the horizontal components of the wind can be notoriously difficult in SCMs, and the most straightforward method is to
+simply nudge them to the three-dimensional modeled state. This method is achieved by using the ``--wind_nudge`` (-wn) option and
+uses a nudging timescale of one hour. It should be possible to calculate a nudging timescale based on the magnitude of the wind
+in the neighboring grid cells, although this is not implemented yet.
+
+The second method to force the horizontal wind components is to calculate the geostrophic wind using the "large scale" pressure
+gradient from the three-dimensional model. This is achieved by using the ``--geostrophic`` (-geos) option. A horizontal length
+scale is large enough to assume geostrophic balance when the Rossby number is much less than one. The script uses a
+configurable Rossby number (around 0.1) to expand the number of neighboring grid points such that geostrophic balance can be assumed
+given the particular UFS history file grid. The geostrophic winds are calculated using the horizontal geopotential gradient and the
+local latitude-dependent Coriolis parameter. From the PBL top downward, the geostrophic winds are assumed to go to zero. In testing
+with this method, initial horizontal winds with a significant ageostrophic component (i.e., initial-condition winds appreciably
+different from the calculated geostrophic winds) often lead to spurious clockwise turning of the mean modeled winds
+with time. An option exists within the script to assume that the mean three-dimensional winds are, in fact, identical to the
+geostrophic winds as well. Using this option eliminates any spurious turning.
 
 .. _`ufsforcingensemblegenerator`:
 
 UFS_forcing_ensemble_generator.py

From 9e8ac9ba25af4c93c7422f471397d370f4479c41 Mon Sep 17 00:00:00 2001
From: Grant Firl
Date: Mon, 19 Aug 2024 16:10:19 -0400
Subject: [PATCH 19/19] update GH workflow to reflect env name change

---
 .github/workflows/ci_run_scm_ufs_replay.yml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/.github/workflows/ci_run_scm_ufs_replay.yml b/.github/workflows/ci_run_scm_ufs_replay.yml
index 508d3c490..961276180 100644
--- a/.github/workflows/ci_run_scm_ufs_replay.yml
+++ b/.github/workflows/ci_run_scm_ufs_replay.yml
@@ -40,8 +40,8 @@ jobs:
     - name: Setup python.
       uses: conda-incubator/setup-miniconda@v3
       with:
-        activate-environment: env_ufsreplay
-        environment-file: environment-ufsreplay.yml
+        activate-environment: env_ufscasegen
+        environment-file: environment-ufscasegen.yml
         use-only-tar-bz2: true
         auto-activate-base: true
         auto-update-conda: true
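For reference, the workflow above builds its Python environment from environment-ufscasegen.yml at the repository
root. A roughly equivalent local setup, assuming conda is available and that the environment name defined in that
file matches the workflow's env_ufscasegen, would be:

    cd ccpp-scm
    conda env create -f environment-ufscasegen.yml   # creates the env_ufscasegen environment
    conda activate env_ufscasegen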