
Add ability to read increment files on native cubed sphere grid #342

Merged

Conversation


@DavidNew-NOAA DavidNew-NOAA commented May 17, 2024

Description

This PR and its companion FV3 PR, #837, add the ability to read increments on the native cubed sphere grid, either within the IAU framework or at restart. The option is controlled by a new input namelist option, "increment_file_on_native_grid", which defaults to .false.
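As a sketch, enabling the option could look like the namelist fragment below. The namelist group and the other settings shown are assumptions for illustration only; the one thing this PR adds is "increment_file_on_native_grid":

```fortran
&fv_core_nml
  read_increment = .true.                 ! existing option (illustrative)
  res_latlon_dynamics = 'fv_increment.nc' ! existing option (illustrative value)
  increment_file_on_native_grid = .true.  ! new option added by this PR; defaults to .false.
/
```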

The motivation for this PR is that we would like to read cubed sphere increments in the Global Workflow used by NOAA-EMC for the new JEDI DA system. However, we cannot yet produce IAU increments with JEDI, so we need to apply increments for now at restart. This PR is based off of another PR, #274, which has been largely rewritten to apply increments at restart in addition to IAU. (cc @mark-a-potts)

Overview of code changes:

A new module, "cubed_sphere_inc_mod.F90", has been added to the "tools" directory. It contains a subroutine, "read_cubed_sphere_inc", that reads the increment into an "increment_data_type" structure. This structure was formerly the "iau_internal_data_type" in "fv_iau_mod.F90"; it has been moved into the new module to avoid circular dependencies and renamed to reflect the fact that it is no longer just for IAU.

"read_cubed_sphere_inc" is called in either "fv_iau_mod.F90" or "fv_treat_da_inc.F90", depending on whether or not IAU is used. If IAU is not used, a new subroutine in "fv_treat_da_inc.F90", "read_da_inc_cubed_sphere", is called from "fv_restart.F90" to read and apply the increment. This involves some halo updates and a physics call so that the A-grid wind increment can be applied to the D-grid wind.

Note: "cubed_sphere_inc_mod.F90" could possibly be merged into "fv_treat_da_inc.F90", so I'd be interested to hear feedback about that.

Fixes # (issue)

N/A

How Has This Been Tested?

I ran a few sets of experiments, each involving three C96, 6-hour forecasts with identical initial conditions. These experiments were run on RDHPCS Hera.

First, I used JEDI to produce two increments, identical except that one was on the native grid and one was interpolated to the Gaussian grid, and then I ran two forecasts (without IAU), one for each increment. In the third experiment, I ran the forecast without an increment. I then subtracted the free (no-increment) forecast model fields from those of the forecasts using the increments. These residuals allowed me to isolate the impact of the increments on the forecast. Finally, I plotted the residuals and their differences and verified visually that the impact of the two increments was identical in terms of large-scale patterns. Naturally there are differences between the two residuals because of the interpolation of the increment, but these appeared to be mostly noise. The fact that there are (expected) differences is why I had to verify my results visually.
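The residual comparison described above can be sketched as follows. This is a minimal illustration; the field values are made up and do not come from the actual experiments:

```python
# Hedged sketch of the residual comparison described above; the field values
# below are hypothetical, purely for illustration.
def residual(forecast, free_forecast):
    """Subtract the free (no-increment) forecast field from an incremented one."""
    return [a - b for a, b in zip(forecast, free_forecast)]

fc_free     = [300.0, 301.0, 302.0]  # hypothetical field values (no increment)
fc_native   = [300.5, 301.4, 302.2]  # forecast using the native-grid increment
fc_gaussian = [300.6, 301.3, 302.2]  # forecast using the interpolated increment

res_native   = residual(fc_native, fc_free)
res_gaussian = residual(fc_gaussian, fc_free)

# The large-scale signal should agree; the difference is interpolation noise.
noise = [a - b for a, b in zip(res_native, res_gaussian)]
```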

The following is a plot of "o3mr" at model level 20 (stratosphere), using OMI satellite observations for the increment:

Figure_1

Here is another plot of "ua" at model level 20, using satwind observations for the increment:

Figure_2

Checklist:

Check all that apply:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • Any dependent changes have been merged and published in downstream modules

@DavidNew-NOAA DavidNew-NOAA changed the title Feature/cubed sphere inc Add ability to read increment files on native cubed sphere grid May 20, 2024
@DavidNew-NOAA DavidNew-NOAA marked this pull request as ready for review May 20, 2024 13:34
Contributor

@lharris4 lharris4 left a comment


I think this is good. One thing I do recommend is checking for the consistency of the increments for both delp and the tracers; this should be OK as long as JEDI doesn't do any transformations on these variables.

(Inline review comments on tools/fv_treat_da_inc.F90; resolved.)
@DavidNew-NOAA
Author

Are there any other obstacles to getting this PR reviewed and merged?

Contributor

@lharris4 lharris4 left a comment


I think this is good. Thanks for submitting this.

Comment on lines 43 to 64
call open_ncfile( trim(filename), ncid )

! Read increments
call get_var5_r4( ncid, 'u_inc',    isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%ua_inc )
call get_var5_r4( ncid, 'v_inc',    isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%va_inc )
call get_var5_r4( ncid, 'T_inc',    isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%temp_inc )
call get_var5_r4( ncid, 'delp_inc', isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%delp_inc )
if ( .not. Atm%flagstruct%hydrostatic ) then
   call get_var5_r4( ncid, 'delz_inc', isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%delz_inc )
end if

! Read tracer increments
do itracer = 1,ntracers
   call get_tracer_names(MODEL_ATMOS, itracer, tracer_name)
   call check_var_exists(ncid, trim(tracer_name)//'_inc', ncerr)
   if ( ncerr == 0 ) then
      call get_var5_r4( ncid, trim(tracer_name)//'_inc', isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%tracer_inc(:,:,:,itracer) )
   end if
end do

! Close increment file
call close_ncfile(ncid)
Contributor


You may get a complaint from operations that this is too disruptive to the filesystem, especially for large numbers of MPI ranks. You have every core opening the file, performing reads, and then closing the file. While I know you are using these embedded I/O routines from the dycore, they were originally put in as hacks a long time ago and generally won't play nice now that we've scaled up to thousands of cores on distributed filesystems.

You'd be better off using the fms2_io routines which are already domain aware (suited for native grid) and will read in the correct increment for a given mpi-rank. The fms2_io routines have recently been modified by UFS contributors to use MPI-IO for reads of NetCDF4 files (FMS PR#1477).

@DavidNew-NOAA
Author

@bensonr See one of my original comments:

Note: There was a suggestion in the comments of #274 (cc lharris4 bensonr) to use FMS calls for NetCDF IO, but I was unable to get that to work. I tried using the "read_data" FMS2 subroutine to read the increments, passing the "corner" and "edge_lengths" arguments for start and count, but for some reason it would always read the first tile even when I specified the nth tile. However, when I fed the "corner" and "edge_lengths" arguments directly into the "nf90_get_var" subroutine from the NetCDF library, this problem disappeared. This makes me think there may be a bug in the 5D "read_data" subroutine, unless I'm missing something. In the end I decided to extend the "sim_nc_mod.F90" module with a new subroutine, since up until now that module, not FMS, has been used to read increments.

I would really like to use fms2_io, but in addition to the above problem, I could not find an example to work from where someone does a 5D read from a cubed-sphere history file, let alone using domain awareness features. Every example I found was for reading FMS restart files.

Moreover, as of right now, Gaussian grid increment files are read, both IAU and non-IAU, using the sim_nc_mod.F90 module routines anyway.

@bensonr Are you aware of an example of such a 5D read with FMS2 using domain awareness capability?

@lharris4
Contributor

@DavidNew-NOAA I am curious: what is the use case for a 5D I/O statement? Usually 4D data (x, y, z, tracer_index) is sufficient, and the time dimension is read one slice at a time.

Thanks
Lucas

@DavidNew-NOAA
Author

DavidNew-NOAA commented Jun 13, 2024

@lharris4 Increment files in cubed-sphere history format have dimensions (x,y,z,tile,time). The time dimension is always length=1, but it exists. Tiles are read one at a time.
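To make the layout concrete, here is a small sketch of the dimension ordering and per-tile reads described above. This is not code from the PR; the grid sizes are hypothetical (a C96 grid with 127 levels is assumed):

```python
# Fortran dimension order (x, y, z, tile, time) corresponds to NetCDF/C order
# (time, tile, z, y, x); the time dimension always has length 1.
dims = {"time": 1, "tile": 6, "z": 127, "y": 96, "x": 96}  # hypothetical sizes

def tile_start_count(dims, tile):
    """Hypothetical 0-based start/count for reading one tile's (z, y, x) slab."""
    start = (0, tile, 0, 0, 0)
    count = (1, 1, dims["z"], dims["y"], dims["x"])
    return start, count

# Tiles are read one at a time:
slabs = [tile_start_count(dims, t) for t in range(dims["tile"])]
```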

@DavidNew-NOAA
Author

DavidNew-NOAA commented Jul 30, 2024

@laurenchilutti @lharris4 @bensonr @XiaqiongZhou-NOAA I have updated this PR as requested to use FMS calls to read the increment files. The increment files are now split into one file per tile. When the namelist parameters read_increment and increment_file_on_native_grid are both set to .true., res_latlon_dynamics becomes the prefix for the increment file names rather than the full name itself. For example, if res_latlon_dynamics='atminc', the increment files will be named atminc.tile1.nc, atminc.tile2.nc, etc.
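The per-tile naming convention described above can be sketched as follows. This is a hypothetical helper for illustration, not code from the PR:

```python
def native_increment_filenames(prefix, ntiles=6):
    """Hypothetical helper: with read_increment and increment_file_on_native_grid
    both .true., res_latlon_dynamics acts as a per-tile filename prefix."""
    return [f"{prefix}.tile{t}.nc" for t in range(1, ntiles + 1)]

files = native_increment_filenames("atminc")
```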

Can you please review these changes? Thanks

Contributor

@bensonr bensonr left a comment


@DavidNew-NOAA - thanks for being so accommodating with the suggestions to use the existing IO layers within FMS.

@bensonr
Contributor

bensonr commented Aug 8, 2024

According to the comments, this may address issue #188 (close when merged).

@DavidNew-NOAA
Author

@laurenchilutti Is there anything I need to do to move this PR along now that everything is approved?

@laurenchilutti
Contributor

Once approvals are complete, testing needs to be done on the UFS PR associated with this change. Then someone usually tells me to merge when the UFS PR is ready to merge. @jkbk2004 is there anything @DavidNew-NOAA needs to do to get this merged?

@DavidNew-NOAA
Author

DavidNew-NOAA commented Aug 8, 2024

If I'm not mistaken, the following error in testing is due to 7c3102f:

https://github.com/NOAA-EMC/fv3atm/actions/runs/10304307448/job/28522464673?pr=837#step:6:2384

Does anyone have some insight on this?

@laurenchilutti
Contributor

The commit you reference was just merged in today and is accompanied by an FV3atm PR that was also merged today: NOAA-EMC/fv3atm#798. I would make sure that your FV3atm PR has merged in the latest from the develop branch.

@jkbk2004

jkbk2004 commented Aug 8, 2024

@DavidNew-NOAA We need to update the namelist file option (increment_file_on_native_grid) in the weather model regression test cases. Can you create a pull request to https://github.com/ufs-community/ufs-weather-model ?

@jkbk2004

jkbk2004 commented Aug 9, 2024

I missed that. @DavidNew-NOAA has a draft PR, ufs-community/ufs-weather-model#2304. I think we can make that PR ready with the namelist parameter update and pre-testing.

@DavidNew-NOAA
Author

@jkbk2004 I added the namelist parameter in NOAA-EMC/fv3atm#837 for FV3. Where do I need to modify UFS? I was able to test successfully with the namelist parameter in FV3.

@zach1221

@bensonr testing is complete on UFS-WM PR #2389. Can you please merge this PR for us?

@zach1221

Copying @laurenchilutti as well. We're ready to merge this PR.

@laurenchilutti laurenchilutti merged commit ac3055e into NOAA-GFDL:dev/emc Aug 27, 2024
zach1221 pushed a commit to ufs-community/ufs-weather-model that referenced this pull request Aug 27, 2024
…ng in rt.sh (2388) + Add ability to read increment files on native cubed sphere grid (2304) (#2389)

* UFSWM - Unify CDEPS gfs, cfsr, and gefs datm datamodes. Add ability to read increment files on native cubed sphere grid. Update rt scripts to fix checking the job exit status from Slurm and PBS schedulers.
  * FV3 - Add namelist parameter for cubed sphere increment read and update atmos_cubed_sphere hash for GFDL Atmospheric Cubed Sphere PR #342 (NOAA-GFDL/GFDL_atmos_cubed_sphere#342)
    * atmos_cubed_sphere - Add ability to read increments from files on native cubed sphere grid
  * CDEPS - Unify CDEPS gfs, cfsr, and gefs datm datamodes. Rename CDEPS-related FILENAME_BASE to FILEBASE_DATM in tests. Sunset gfs_hafs datamode.