diff --git a/.cicd/scripts/srw_metric_example.sh b/.cicd/scripts/srw_metric_example.sh
index 2018505735..1f587d1bf4 100755
--- a/.cicd/scripts/srw_metric_example.sh
+++ b/.cicd/scripts/srw_metric_example.sh
@@ -61,7 +61,7 @@ cd ${workspace}/tests/WE2E
 cd ${workspace}
 
 # run skill-score check
-[[ ! -f Indy-Severe-Weather.tgz ]] && wget https://noaa-ufs-srw-pds.s3.amazonaws.com/sample_cases/release-public-v2.1.0/Indy-Severe-Weather.tgz
+[[ ! -f Indy-Severe-Weather.tgz ]] && wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.1.0/METplus-vx-sample/Indy-Severe-Weather.tgz
 [[ ! -d Indy-Severe-Weather ]] && tar xvfz Indy-Severe-Weather.tgz
 [[ -f skill-score.out ]] && rm skill-score.out
 # Skill score index is computed over several terms that are defined in parm/metplus/STATAnalysisConfig_skill_score.
diff --git a/docs/UsersGuide/source/BuildingRunningTesting/AQM.rst b/docs/UsersGuide/source/BuildingRunningTesting/AQM.rst
index c4f373f4ba..21be725d29 100644
--- a/docs/UsersGuide/source/BuildingRunningTesting/AQM.rst
+++ b/docs/UsersGuide/source/BuildingRunningTesting/AQM.rst
@@ -123,7 +123,7 @@ The community AQM configuration assumes that users have :term:`HPSS` access and
    USE_USER_STAGED_EXTRN_FILES: true
    EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/data
 
-On Level 1 systems, users can find :term:`ICs/LBCs` in the usual :ref:`input data locations ` under ``FV3GFS/netcdf/2023021700`` and ``FV3GFS/netcdf/2023021706``. Users can also download the data required for the community experiment from the `UFS SRW App Data Bucket `__.
+On Level 1 systems, users can find :term:`ICs/LBCs` in the usual :ref:`input data locations ` under ``FV3GFS/netcdf/2023021700`` and ``FV3GFS/netcdf/2023021706``. Users can also download the data required for the community experiment from the `UFS SRW App Data Bucket `__.
 
 Users may also wish to change :term:`cron`-related parameters in ``config.yaml``. In the ``config.aqm.community.yaml`` file, which was copied into ``config.yaml``, cron is used for automatic submission and resubmission of the workflow:
diff --git a/docs/UsersGuide/source/BuildingRunningTesting/ContainerQuickstart.rst b/docs/UsersGuide/source/BuildingRunningTesting/ContainerQuickstart.rst
index 6d17dab6e7..8aaba501d2 100644
--- a/docs/UsersGuide/source/BuildingRunningTesting/ContainerQuickstart.rst
+++ b/docs/UsersGuide/source/BuildingRunningTesting/ContainerQuickstart.rst
@@ -218,8 +218,8 @@ The SRW App requires input files to run. These include static datasets, initial
 
    .. code-block:: console
 
-      wget https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data/fix_data.tgz
-      wget https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data/gst_data.tgz
+      wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/fix_data.tgz
+      wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/gst_data.tgz
       tar -xzf fix_data.tgz
       tar -xzf gst_data.tgz
diff --git a/docs/UsersGuide/source/BuildingRunningTesting/RunSRW.rst b/docs/UsersGuide/source/BuildingRunningTesting/RunSRW.rst
index a0065dc553..7e162b1b29 100644
--- a/docs/UsersGuide/source/BuildingRunningTesting/RunSRW.rst
+++ b/docs/UsersGuide/source/BuildingRunningTesting/RunSRW.rst
@@ -724,7 +724,7 @@ the same cycle starting date/time and forecast hours. Other parameters may diffe
 
 Cartopy Shapefiles
 `````````````````````
-The Python plotting tasks require a path to the directory where the Cartopy Natural Earth shapefiles are located. The medium scale (1:50m) cultural and physical shapefiles are used to create coastlines and other geopolitical borders on the map. On `Level 1 `__ systems, this path is already set in the system's machine file using the variable ``FIXshp``. Users on other systems will need to download the shapefiles and update the path of ``$FIXshp`` in the machine file they are using (e.g., ``$SRW/ush/machine/macos.yaml`` for a generic MacOS system, where ``$SRW`` is the path to the ``ufs-srweather-app`` directory). The subset of shapefiles required for the plotting task can be obtained from the `SRW Data Bucket `__. The full set of medium-scale (1:50m) Cartopy shapefiles can be downloaded `here `__.
+The Python plotting tasks require a path to the directory where the Cartopy Natural Earth shapefiles are located. The medium scale (1:50m) cultural and physical shapefiles are used to create coastlines and other geopolitical borders on the map. On `Level 1 `__ systems, this path is already set in the system's machine file using the variable ``FIXshp``. Users on other systems will need to download the shapefiles and update the path of ``$FIXshp`` in the machine file they are using (e.g., ``$SRW/ush/machine/macos.yaml`` for a generic MacOS system, where ``$SRW`` is the path to the ``ufs-srweather-app`` directory). The subset of shapefiles required for the plotting task can be obtained from the `SRW Data Bucket `__. The full set of medium-scale (1:50m) Cartopy shapefiles can be downloaded `here `__.
 
 Task Configuration
 `````````````````````
diff --git a/docs/UsersGuide/source/BuildingRunningTesting/Tutorial.rst b/docs/UsersGuide/source/BuildingRunningTesting/Tutorial.rst
index 78ed48091a..fef1ce4699 100644
--- a/docs/UsersGuide/source/BuildingRunningTesting/Tutorial.rst
+++ b/docs/UsersGuide/source/BuildingRunningTesting/Tutorial.rst
@@ -45,12 +45,12 @@ On `Level 1 `
-This tar file contains :term:`IC/LBC ` files, observation data, model/forecast output, and MET verification output for the sample forecast. Users who have never run the SRW App on their system before will also need to download (1) the fix files required for SRW App forecasts and (2) the NaturalEarth shapefiles required for plotting. Users can download the fix file data from a browser at https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data/fix_data.tgz or visit :numref:`Section %s ` for instructions on how to download the data with ``wget``. NaturalEarth files are available at https://noaa-ufs-srw-pds.s3.amazonaws.com/NaturalEarth/NaturalEarth.tgz. See the :numref:`Section %s ` for more information on plotting.
+This tar file contains :term:`IC/LBC ` files, observation data, model/forecast output, and MET verification output for the sample forecast. Users who have never run the SRW App on their system before will also need to download (1) the fix files required for SRW App forecasts and (2) the NaturalEarth shapefiles required for plotting. Users can download the fix file data from a browser at https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/fix_data.tgz or visit :numref:`Section %s ` for instructions on how to download the data with ``wget``. NaturalEarth files are available at https://noaa-ufs-srw-pds.s3.amazonaws.com/develop-20240618/NaturalEarth/NaturalEarth.tgz. See the :numref:`Section %s ` for more information on plotting.
 
 After downloading ``Indy-Severe-Weather.tgz`` using one of the three methods above, untar the downloaded compressed archive file:
diff --git a/docs/UsersGuide/source/CustomizingTheWorkflow/InputOutputFiles.rst b/docs/UsersGuide/source/CustomizingTheWorkflow/InputOutputFiles.rst
index 6312aabf7e..b508daed02 100644
--- a/docs/UsersGuide/source/CustomizingTheWorkflow/InputOutputFiles.rst
+++ b/docs/UsersGuide/source/CustomizingTheWorkflow/InputOutputFiles.rst
@@ -219,14 +219,14 @@ A set of input files, including static (fix) data and raw initial and lateral bo
 
 Static Files
 --------------
-Static files are available in the `"fix" directory `__ of the SRW App Data Bucket. Users can download the full set of fix files as a tar file:
+Static files are available in the `"fix" directory `__ of the SRW App Data Bucket. Users can download the full set of fix files as a tar file:
 
 .. code-block:: console
 
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data/fix_data.tgz
+   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/fix_data.tgz
    tar -xzf fix_data.tgz
 
-Alternatively, users can download the static files individually from the `"fix" directory `__ of the SRW Data Bucket using the ``wget`` command for each required file. Users will need to create an appropriate directory structure for the files when downloading them individually. The best solution is to download the files into directories that mirror the structure of the `Data Bucket `__.
+Alternatively, users can download the static files individually from the `"fix" directory `__ of the SRW Data Bucket using the ``wget`` command for each required file. Users will need to create an appropriate directory structure for the files when downloading them individually. The best solution is to download the files into directories that mirror the structure of the `Data Bucket `__.
 
 The environment variables ``FIXgsm``, ``FIXorg``, and ``FIXsfc`` indicate the path to the directories where the static files are located. After downloading the experiment data, users must set the paths to the files in ``config.yaml``. Add the following code to the ``task_run_fcst:`` section of the ``config.yaml`` file, and alter the variable paths accordingly:
@@ -246,7 +246,7 @@ To download the model input data for the 12-hour "out-of-the-box" experiment con
 
 .. code-block:: console
 
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data/gst_data.tgz
+   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/gst_data.tgz
    tar -xzf gst_data.tgz
 
 To download data for different dates, model types, and formats, users can explore the ``input_model_data`` section of the data bucket and replace the links above with ones that fetch their desired data.
@@ -312,7 +312,7 @@ Default Initial and Lateral Boundary Conditions
 -----------------------------------------------
 The default initial and lateral boundary condition files are set to be a severe weather case from June 15, 2019 (20190615) at 18 UTC. FV3GFS GRIB2 files are the default model and file format. A tar file
-(``gst_data.tgz``) containing the model data for this case is available in the `UFS SRW App Data Bucket `__.
+(``gst_data.tgz``) containing the model data for this case is available in the `UFS SRW App Data Bucket `__.
 
 Running the App for Different Dates
 -----------------------------------
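
For reference, a minimal sketch of staging the relocated out-of-the-box data by hand, using the new bucket paths introduced in the hunks above; the ``srw_staging`` directory name is only an illustration and is not part of the patch:

.. code-block:: console

   # create a staging area (name is illustrative) and fetch the v2.2.0 out-of-the-box data
   mkdir -p srw_staging && cd srw_staging
   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/fix_data.tgz
   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/gst_data.tgz
   # unpack the static (fix) files and the sample initial/boundary condition data
   tar -xzf fix_data.tgz
   tar -xzf gst_data.tgz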
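
Similarly, a sketch of fetching the Natural Earth shapefiles needed by the plotting tasks, assuming the ``develop-20240618`` path shown in the Tutorial.rst hunk above; the extraction location is illustrative:

.. code-block:: console

   # download and unpack the subset of Cartopy Natural Earth shapefiles used by the plotting tasks
   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/develop-20240618/NaturalEarth/NaturalEarth.tgz
   tar -xzf NaturalEarth.tgz
   # on non-Level 1 systems, point FIXshp in the machine file
   # (e.g., $SRW/ush/machine/macos.yaml) at the directory just extracted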