Running MOM6 in a coarser resolution #252

Open · whuiskamp opened this issue Nov 1, 2018 · 52 comments

@whuiskamp

Hi everyone,
I was wondering if anyone has experience running MOM6, or a C-grid model in general, at resolutions coarser than 1 degree. My group is looking to use the wetting and drying scheme in MOM6 for deglacial simulations, but these necessitate a coarse resolution (we're currently using MOM5 at ~3 degrees) to maintain speed. Does anyone have experience here, or could anyone advise on whether this is feasible?
Thanks.
Willem

@gustavo-marques
Member

gustavo-marques commented Nov 1, 2018

Hi Willem,

The current resolutions maintained by GFDL and NCAR are equal to or finer than 1 degree. I have a nominal 3-degree global configuration that has not been scientifically validated, and I am happy to share it with you. I believe it will take some trial and error (and perhaps some development) to make the solutions look reasonable. Some of the options used in the examples provided by GFDL will have to be changed.

@whuiskamp
Author

Hi Gustavo,
That would be great, thank you!

@gustavo-marques
Member

Hi Willem,
Please send me an email (gmarques[at]ucar.edu) so we can coordinate this offline.

@ashao
Collaborator

ashao commented Nov 5, 2018

I know that there was a coupled configuration, CM2Mc, using MOM5 as the ocean component with a nominal resolution of 3 degrees. It looks like Eric Galbraith and his group are still using it; they recently published a paper and data based on it: https://earthsystemdynamics.org/models/cm2mc-simulation-library/

Of course, MOM5 and MOM6 are very different models, but perhaps @gustavo-marques, you could use some of the choices made for that model (like where to start the tripolar grid and the equatorial enhancement) as a baseline?

@whuiskamp
Author

So, the grid we are currently using with MOM5 is this nominally 3-degree CM2Mc grid, with the resolution enhancement around the equator. I plan to use this same grid, but the difficulty will of course be setting the model up to use it.

@whuiskamp
Author

I thought I would post an update on this model configuration here. I currently have a functioning configuration of MOM6 (ocean only) with a nominally 3 degree horizontal grid and 28 vertical levels. This includes re-gridded forcing files for things like buoyancy restoring, wind stress etc.

While stable (I've run it for a few decades), the ocean state looks bad. As someone with no experience tuning an ocean model, I would be grateful for tips on which parameterisations may need changing at this horizontal/vertical resolution.

In case of interest, model performance is 1 model year in approx. 11 minutes on 16 PEs with a DT of 7200 s.

@ashao
Collaborator

ashao commented Jan 18, 2019 via email

@ashao
Collaborator

ashao commented Jan 18, 2019 via email

@whuiskamp
Author

Yes, this is currently running in ocean stand-alone mode without SIS or SIS2. I've attached a few figures showing surface zonal velocity, global overturning and a zonally averaged temperature profile from the end of a 20 year run (the figures are derived from annual diagnostic output, 'prog.nc', not the regridded version in z* coords).

Surface currents look heterogeneous and far too strong, and the overturning circulation is also clearly unrealistic. The temperature profile also suggests that vertical mixing is excessive.

gmoc_mom6
temp_mom6
u_mom6

MOM_parameter_doc_all.txt

@ashao
Collaborator

ashao commented Jan 18, 2019

There's nothing immediately jumping out at me with your MOM_parameter_doc.all. I have two suggestions that are more sanity checks than anything else.

  1. Try this coupled to SIS2 and use the CORE2 normal-year forcing. You should just be able to use the data_table in ice_ocean_SIS2/OM4_025 without changing anything (an example of the data_table entry format is sketched after this list).
  2. Use your forcing in ocean-only mode but for a 1-degree test case. There's one in the GFDL repository https://github.com/NOAA-GFDL/MOM6-examples/tree/dev/gfdl/land_ice_ocean_LM3_SIS2/OM_360x320_C180, but because of the US government shutdown it'd be difficult to get you the input files that you need. @gustavo-marques: since he's already in contact with you, could you get him NCAR's 1-degree configuration?
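
For reference, each data_table entry maps a field the coupler needs onto a field in a forcing file; the columns are the component name, the field name expected by the model, the field name in the file, the file path, the interpolation method, and a scale factor. The lines below are only a sketch with made-up field and file names, not the actual OM4_025 entries:

"ATM", "t_bot",  "T_10_MOD", "./INPUT/t_10_mod.clim.nc", "bilinear", 1.0
"ATM", "p_surf", "SLP",      "./INPUT/slp.clim.nc",      "bilinear", 1.0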

@gustavo-marques
Member

I see a few issues with your MOM_parameter_doc.all. The values you are using are meant to work for a 1-degree configuration. I think you should first try to simplify your choice of parameters, so that you can understand what the model is doing. Try the following:

RESOLN_SCALED_KH = False
RESOLN_SCALED_KHTH = False
KH = 2500.0
KH_VEL_SCALE = 0.0
SMAGORINSKY_AH = False
USE_LAND_MASK_FOR_HVISC = True
KHTH = 1800.0
KHTH_MAX = 2700.0
KHTR = 1800.0
KHTR_MAX = 2700.0

To help us understand what the model is doing, add daily averages of the following fields to your list of diagnostics (file diag_table; a sketch of the corresponding entries follows this list):

  • KHTH_t
  • Ahh
  • Khh
  • SST
  • SSS
  • SSU
  • SSV
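
A minimal sketch of what those daily-average entries might look like in the diag_table, assuming a daily output file named "ocean_daily" is defined as shown; the exact field names must match the diagnostics registered by your MOM6 version:

"ocean_daily", 1, "days", 1, "days", "time"
"ocean_model", "KHTH_t", "KHTH_t", "ocean_daily", "all", "mean", "none", 2
"ocean_model", "Ahh",    "Ahh",    "ocean_daily", "all", "mean", "none", 2
"ocean_model", "Khh",    "Khh",    "ocean_daily", "all", "mean", "none", 2
"ocean_model", "SST",    "SST",    "ocean_daily", "all", "mean", "none", 2
"ocean_model", "SSS",    "SSS",    "ocean_daily", "all", "mean", "none", 2
"ocean_model", "SSU",    "SSU",    "ocean_daily", "all", "mean", "none", 2
"ocean_model", "SSV",    "SSV",    "ocean_daily", "all", "mean", "none", 2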

Other parameters that could be changed:

  1. Time steps: you have DT = DT_THERM = DT_FORCING = 7200 s. Try this:

DT_THERM = 14400.0 ! i.e., 2 x DT
DT_FORCING = 14400.0

  2. Minimum depth: you have MINIMUM_DEPTH = 0.5 m. What's the minimum depth in your topography file? 0.5 m seems very shallow for a 3-degree configuration.

@whuiskamp
Author

whuiskamp commented Jan 21, 2019

Thanks for the tips. I tried another year-long test run with these new settings, and there have been some improvements. Zonal velocities have decreased to more reasonable magnitudes (see attached fig.), though equatorial currents still seem too strong to me, and there are some strange artefacts in the Arctic. With regard to the variables you suggested adding to the diagnostic output, would it help to upload some figures showing the mean, variance or min/max of these values across the year? Some, like Khh, are simply fixed values (as prescribed).
usurf_mom6

@ashao
Collaborator

ashao commented Jan 21, 2019

I agree that the parameterization settings were geared more towards a 1-degree configuration, but I suppose I would have expected them to be 'good enough' in some sense.

With regard to the diagnostics: because you have MEKE on, you should still be getting some variability in Khh. For KHTH and KHTR, your mapping factors would not lead to MEKE mapping onto KHTH and KHTR, so unless the resolution scaling function kicks in, those should be constant.

I would imagine that the biggest sources of your model bias come from

  1. Not having a sea-ice component
  2. Some inconsistency with the forcing.

@gustavo-marques
Member

@ashao, you might be right that the model should have been 'good enough' with the settings from the 1-degree configuration. However, all the different parameters acting on Khh (e.g., MEKE, latitude dependency, scaling function, etc.) make the tuning process more difficult, and that's why I suggested a simpler choice of parameters for now.

By the way, I would also recommend turning off MEKE (MEKE = False). With these options, Khh should be 2500.0 m^2/s everywhere. I also just noticed that you have VARIABLE_WINDS = False. I think you want that to be TRUE.

After you re-run the model for ~1 year, it would be helpful if you could post your netCDF file, with all the variables I mentioned above, as well as all your forcing files somewhere we can download them (perhaps an ftp server). Prescribing consistent forcing in the absence of a sea ice model is challenging, which is why @ashao suggested coupling to SIS2 and using CORE2 forcing.

@ashao
Collaborator

ashao commented Jan 22, 2019

@gustavo-marques: Oh yes I definitely agree that your suggestions were good and could simplify the tuning process. Also good catch on VARIABLE_WINDS, that definitely seems like it could be a big part of the problem.

@ashao
Collaborator

ashao commented Jan 22, 2019

Along the same lines as @gustavo-marques' suggestions, all of the RESOLN_SCALED_* parameters should also explicitly be set to False.

@whuiskamp
Author

Ok, so with this in mind, I think I will set up a configuration coupled to SIS2 and implement the changes you have both suggested, namely turning variable winds back on (well spotted, I missed this) and disabling MEKE. As far as I can see, the resolution-dependent parametrisation options are already all set to False.

@whuiskamp
Author

I've set up and done several test runs with a configuration based upon the /ice_ocean_SIS2/SIS2/ test case. I've configured it as closely to the previous ocean-only run as I could, without it crashing (getting it to run without too many velocity truncation errors was tricky), though I suspect that there will be some options that I've missed and/or set incorrectly - apologies in advance.

The output from my most recent test can be found at http://www.pik-potsdam.de/~huiskamp/mom6_output/MOM_SIS2_test.tar.gz and contains the additional forcing files I've added (all the rest are from the standard SIS2 test case; I didn't include those as the archive is already very large). I have also not included any ice diagnostic output.

The main problem I'm running into in this configuration is unusual overturning cells on either side of the equator, driven by enormous drops in SST. Perhaps there is something wrong with my forcing settings.
If I can provide extra information, or change how I've provided the data to make it more convenient for you, let me know. Thanks again.

@whuiskamp
Author

Hi @gustavo-marques, @ashao - just in case you try to access the data I've uploaded and find the link dead: it will be unavailable for the next two weeks while our computing cluster is down for maintenance.

@gustavo-marques
Member

Thanks for letting us know. Looking at this is on my to-do list.

@whuiskamp
Author

Cheers. I should add that if any of you (or the wider MOM community) will be at EGU, I will be there and hope to talk about this coarse-res setup and its potential for sea-level-rise experiments.

@whuiskamp
Author

Update on this configuration. I have not been able to diagnose what is causing the strange circulation artefacts (see attached figs) at the equator in my MOM6-SIS2 configuration. Essentially, there is a cold strip in the region where the latitudinal resolution of the model reaches 0.6 degrees, accompanied by a ~400 Sv overturning that extends to about 1000 m depth and by enormous heat fluxes into and out of the ocean (weirdly, the 'forcing' diagnostic shows a positive longwave flux into the ocean...). This artefact does not occur when using the ocean-only solo driver with the same parametrisations for viscosity/diffusion (for example). Is this a mosaic/grid issue? A resolution issue? I'd appreciate any thoughts; I'm completely stumped.
@StephenGriffies @Hallberg-NOAA @adcroft ?
lw_down
Psi
sst

@gustavo-marques
Member

gustavo-marques commented Mar 21, 2019

I see a few issues with your setup, and I have tried to group them into the six categories listed below. I suspect that 1) to 3) may help with the issue at the equator.

1) GM and along-isopycnal diffusivities
KHTH and KHTR are very small for a nominal 3-deg configuration. Also, I think you missed some options needed to enable these schemes. Add the following:

KHTH_USE_EBT_STRUCT = True 
KHTH_USE_FGNV_STREAMFUNCTION = True
KHTH_SLOPE_CFF = 0.25
KHTR_SLOPE_CFF = 0.25
CHECK_DIFFUSIVE_CFL = True     
MAX_TR_DIFFUSION_CFL = 2.0
KHTH = 3000.0 
KHTR = 3000.0

2) Coupling time-step
In input.nml, dt_cpld and dt_atmos (= 3600 sec) are much smaller than the baroclinic time-step (DT = 7200 sec). You should try either a) dt_cpld = dt_atmos = 1.44E+04 (you will miss the diurnal cycle in this case) or b) set THERMO_SPANS_COUPLING = True.
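
As a sketch, option a) would look something like this in the coupler_nml block of input.nml (other entries omitted):

&coupler_nml
    dt_cpld  = 14400,
    dt_atmos = 14400,
/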

3) Horizontal viscosity
I think you are using small viscosity coefficients for a nominal 3-deg configuration. Add the following:

KH_VEL_SCALE = 0.05
KH = 4000.0
AH_VEL_SCALE = 0.1 

4) Surface pressure
Variable p_surf looks strange. Add the following:

MAX_P_SURF = -1.0
USE_LIMITED_PATM_SSH = True

5) SSS restoring
Add the following:

SRESTORE_AS_SFLUX = True
MAX_DELTA_SRESTORE = 5.0

6) Other suggestions

COORD_CONFIG = "none"
USE_TRIPOLAR_GEOLONB_BUG = False
GRID_ROTATION_ANGLE_BUGS = False
USE_NET_FW_ADJUSTMENT_SIGN_BUG = False
VERTEX_SHEAR = True
PEN_SW_SCALE = 15.0             
PEN_SW_FRAC = 0.42           
PEN_SW_NBANDS = 1

I noticed you are using CHANNEL_CONFIG = "list". Are you confident that your MOM_channel_list is configured properly? Lastly, please start saving the surface boundary layer depth (i.e., add "ePBL_h_ML" in the surface section of your diag_table).

@whuiskamp
Author

whuiskamp commented Mar 22, 2019

I've implemented these changes and run a new test (http://www.pik-potsdam.de/~huiskamp/mom6_output/MOM6_coarse_updated.tar.gz) - unfortunately, the problem remains. I've gone back and looked at the viscosity values used in the equivalent MOM5 simulations I've run; they are much higher (~160,000 m2/s) and increase towards this higher value with decreasing latitude, which seems to be the opposite of what is implemented by default in MOM6 (if RESOLN_SCALED_KH is enabled). I'm currently running another test with these higher values to see what the impact is.

Edit: this completed. The result is the same (http://www.pik-potsdam.de/~huiskamp/mom6_output/00010101.prog.nc); the increased viscosity simply suppresses the emergence of the cold anomaly.

As for the channels, I'm fairly confident these are correct, though they may need fine tuning (in terms of channel width).

@ashao
Collaborator

ashao commented Mar 22, 2019

An idealized, 2-degree global configuration that I had developed showed large abyssal overturning cells just off the equator. Some of this came from the grid not having a T/U-point right at latitude = 0. Can you confirm whether your grid does?

@whuiskamp
Author

My GEOLAT (tracer grid) points sit at ±0.29 degrees on either side of the equator, and I therefore have meridional velocity points at 0 latitude. I'll re-generate the grid, test, and report back.
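
For anyone checking this on their own grid, here is a minimal sketch, assuming the usual MOM6 supergrid layout in ocean_hgrid.nc (tracer points on odd supergrid rows and columns, meridional-velocity points on even rows and odd columns):

# Sketch: check whether any meridional-velocity (v) points sit on the equator.
import numpy as np
import netCDF4 as nc

grid = nc.Dataset("ocean_hgrid.nc")
y = grid.variables["y"][:]          # supergrid latitudes, shape (2*nj+1, 2*ni+1)

lat_h = y[1::2, 1::2]               # tracer (h) points
lat_v = y[0::2, 1::2]               # meridional velocity (v) points

print("closest h-point latitude to 0:", np.abs(lat_h).min())
print("closest v-point latitude to 0:", np.abs(lat_v).min())  # 0.0 => v-points on the equator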

@whuiskamp
Author

Hi guys,
With the new grid and increasing Kh to 25000, the abyssal overturning at the equator is gone and all prognostic fields are back within normal ranges. There's still a bit of tuning to do, but otherwise I think it looks good. Thanks very much to both of you for your help.

@StephenGriffies
Contributor

Wow! Wonderful.

What was the key modification? Kh change or y=0 change?

In general, I hope you document this experience in your "working notes" for this model! This is really a lesson worth teaching others...

@whuiskamp
Author

I think the key was re-generating the ocean_hgrid to remove meridional velocity points from the equator. I should note that I encountered some strange problems with the make_solo_mosaic tool in this process, which resulted in later errors in the exchange grid and therefore in the forcing fields. I got around this simply by using the test-case ocean mosaic; I'm still not sure what was going wrong here, though.

As I mentioned, the model will need re-tuning; I think a lot of the parameter changes were simply suppressing the issue with the grid (excess viscosity has now all but eliminated deep convection at the poles, so this needs to be scaled back). Once I re-tune and validate everything, I'm happy to provide it as another MOM6 example case along with a readme (I've kept detailed notes on the process).

@Zhi-Liang

Zhi-Liang commented Apr 5, 2019 via email

@whuiskamp
Author

whuiskamp commented Apr 5, 2019

Hi Zhi, yes of course.

My ocean h_grid was generated with the following:
make_hgrid --grid_type tripolar_grid --nxbnd 2 --nybnd 7 --xbnd -285,75 --ybnd -81,-49.5,-18,0,18,60,90 --dlon 3.0,3.0 --dlat 3.0,1.5,3.0,0.6,3.0,1.421052,3.194332 --lat_join 66 --grid_name MOM6_grid --center t_cell

The ocean_mosaic was then generated with (I renamed the grid file to its correct name before generating the mosaic):
make_solo_mosaic --num_tiles 1 --dir ./ --mosaic_name ocean_mosaic --tile_file ocean_hgrid.nc --periodx 360

The problem (I think) was in the coupler mosaics, specifically the atmosXocean mosaic. The only thing I changed was replacing the ocean_mosaic I created above with the mosaic from the SIS2 test case, after which everything functioned properly. This is strange, as the only differences I could find were the contact indices listed in the ocean_mosaic (it also struck me as strange that the model should work with incorrect contact indices). It is possible I made an error in the creation of the coupler mosaics, but if so, I have not been able to identify it.

@Zhi-Liang

Zhi-Liang commented Apr 5, 2019 via email

@Zhi-Liang

Zhi-Liang commented Apr 5, 2019 via email

@whuiskamp
Author

Hi Zhi,
No, as I mentioned above, I renamed it before running the mosaic tool, and it works fine for me; even make_coupler_mosaic functioned. The issue was that when the model runs, large parts of the model domain are apparently undefined for the coupler, and (for example) wind stress has zero values there. Could you tell me your email address? I will send you the SIS2 test case mosaic.

@Zhi-Liang

Zhi-Liang commented Apr 5, 2019 via email

@Zhi-Liang

Zhi-Liang commented Apr 5, 2019 via email

@whuiskamp
Author

Hi everyone. I thought I'd follow up by mentioning that I've recently been working on tuning the model scientifically and now have a 'decent'-looking ocean state (figures attached for the AMOC, GMOC and PIMOC), though AABW is a little too strong and doesn't really extend into the NH. I've attached my parameter file for those interested - if you have any thoughts on how to improve the overturning structure, please let me know.

MOM_parameter_doc.all.txt

psi_atl
psi_glob
psi_pi


@jiandewang

jiandewang commented Mar 23, 2020 via email

@Hallberg-NOAA
Member

Hallberg-NOAA commented Mar 23, 2020 via email

@adcroft
Member

adcroft commented Mar 23, 2020

@whuiskamp @jiandewang There's a script in tools/analysis/meridional_overturning.py that plots the AMOC. Could you compare the figures? @whuiskamp's plot has a good deep AMOC (maybe, as he suggests, because AABW is not reaching far enough), but we've had the experience that details in the diagnostic scripts can matter. In particular, the direction of summation of transports and the precise positioning of the streamfunction values in the vertical can make things look better or worse.
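
For a quick independent check, the overturning can also be reconstructed directly from the vmo diagnostic (meridional mass transport). A rough sketch follows, assuming vmo is in a file like prog.nc with dimensions (time, zl, yq, xh); the file name, the time averaging, and the reference density are assumptions, and, as noted above, the direction of the vertical summation changes how the figure looks:

# Rough sketch: global meridional overturning streamfunction from vmo [kg/s].
import xarray as xr
import matplotlib.pyplot as plt

ds = xr.open_dataset("prog.nc")
vmo = ds["vmo"].mean("time")                 # time-mean meridional mass transport
psi = vmo.sum("xh").cumsum("zl")             # zonal sum, then cumulative sum downward
psi_sv = psi / (1035.0 * 1.0e6)              # kg/s -> Sv, assuming rho_0 = 1035 kg/m3
psi_sv.plot(x="yq", yincrease=False)
plt.show()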

@whuiskamp We just discussed this on our dev call. Your run is looking very nice - I wasn't sure it would work at all well at this resolution (because of the C-grid). @MJHarrison-GFDL noted you have tides turned on! In case you didn't know...

@whuiskamp
Author

@Hallberg-NOAA @adcroft - Yes, I was surprised how good the configuration looks with minimal tweaking, though it's important to remember that approaching the equator, the resolution becomes finer (down to 0.6 degrees). I will note, however, that there is an interesting problem that arises if I attempt to use a DT larger than 1 hour: the entire Southern Ocean becomes one large deep-convection pool (see the attached mixed layer depth figure). If you have any thoughts on why this is occurring I'd be very interested, as it's not ideal for us to be working with such a short time step.

I was aware tides were on, haha. I have an input file from a MOM5 configuration for tidal amplitude and wanted to see if turning on tides had a noticeable effect on the ocean state. Haven't looked into this at all - all I know is that it didn't break anything!

One more thing - the python script for calculating AMOC doesn't seem to be working for me (some numpy error), but I'll post the resulting figure if I get it working.

MLD_003

@Hallberg-NOAA
Member

Hallberg-NOAA commented Mar 24, 2020 via email

@whuiskamp
Author

Hi Bob,

Thanks very much for the explanation - incredibly helpful.

@StephenGriffies
Contributor

StephenGriffies commented Mar 24, 2020 via email

@whuiskamp
Author

Hi all,

A PhD student in our group and I are currently trying to optimise this configuration, and we have noticed that it's considerably slower than MOM5 (running at roughly 25-30% of MOM5's speed). I'm aware that the numerics of the two models are considerably different, but would you expect the discrepancy to be this large? (I understand if there's no easy answer here - performance doesn't scale linearly with resolution.)
My current configuration is running with DT = 5400 s (limited to about 1 hr 50 min, based on Bob's explanation above), DT_THERM = 21600 s, DTBT = 0 (i.e., calculated optimally by the model), and a coupling time step of 21600 s.
While I believe I've optimised the setup regarding time-stepping and diagnostic output, am I missing some other options where I might claw back some performance?

Thanks,
Willem
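
For reference, the time-stepping setup Willem describes above corresponds roughly to the following entries; this is only a sketch using the quoted values, not a recommendation, with the coupling step set via dt_cpld in input.nml:

DT = 5400.0             ! baroclinic dynamics time step [s]
DT_THERM = 21600.0      ! thermodynamic/tracer time step [s]
DTBT = 0.0              ! 0 => barotropic time step estimated automatically by the model

&coupler_nml
    dt_cpld = 21600,
/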

@adcroft
Member

adcroft commented Jul 31, 2020

Willem,

The performance question is something we've just begun to examine. You are not alone: we have been aware of a general slow-down over time during the development of the model, and we're not sure why. The code is more non-linear than a fixed-coordinate model, so we always knew the model had more work to do than others, but the numbers used to add up in favor of this algorithm. We have no easy fix right now, but it would be worth checking the module-level clocks to make sure that nothing unexpected is happening. For instance, although you need diagnostics, a poorly written diagnostic can sometimes cost a lot of time, so turning off all diagnostics is an obvious test. We have several ideas about what might be limiting the performance, but it is too early to suggest any one of them as a leading candidate right now.

Concerning scaling, we find that it falls off most noticeably when the tile size is in the teens, say 15x15, although this varies from platform to platform. This has also changed over time; we used to maintain scaling down to smaller tile sizes. Again, this is something we are investigating.

Sorry we don't have a quick fix (pun intended) for you (yet).

-Alistair

@Hallberg-NOAA
Member

Hi Willem,

Following up on Alistair's e-mail, we have a couple of more specific questions that might help identify where the slowdown in your configuration is coming from.

  1. Please verify that this is still the 28-vertical-level z*-coordinate configuration you were discussing originally, or, if not, let us know how the vertical coordinate of your configuration differs.

  2. How many tracers are you using in your configuration (in addition to temperature and salinity)?

  3. What equation of state are you using?

- Bob Hallberg

@whuiskamp
Author

Hi Alistair; Bob,

Thanks for the replies. The runtime statistics at the end of the run suggest to me that diagnostics (at least alone) cannot be responsible for the slow-down. I've attached the runtime stats here in case they are of interest: the 'diagnostics framework' is responsible for about 7% of the runtime (I'm exporting diagnostics in the native model layers, z, and rho space).

MOM_stats.txt

To answer your questions, Bob;

  1. Yes, this is still the same configuration, with an 80x120 grid and 28 vertical levels in z*.
  2. The only additional tracer is the ideal age tracer package.
  3. We're using the default Wright EOS (the model description paper suggests the others were not ready yet?).
    I've attached the parameter_doc.all file here as well, in case this adds value.
    MOM_parameter_doc_all.txt

I'm happy to run tests with my configuration for you in search of speed optimisations, so please let me know if I can assist.

Cheers,
Willem

@whuiskamp
Author

Hi all,
Quick question: I'm generating a new grid for the model and am in the process of creating the sub-grid topographic roughness file. Is there a script in this repository that performs this? The description of the process in the OM4 paper was somewhat ambiguous to me.

Thanks.

@ashao
Collaborator

ashao commented Aug 7, 2020

If my memory serves me correctly (sketchy assumption), it should just be the 'std' field made by https://github.com/NOAA-GFDL/MOM6-examples/blob/dev/gfdl/ice_ocean_SIS2/OM4_025/preprocessing/create_topo.py
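
In case that script is hard to adapt, the basic idea is just the standard deviation of the high-resolution source topography within each model cell. A crude sketch follows, assuming a rectilinear source dataset and the usual MOM6 supergrid; the file and variable names are made up, and it ignores the longitude wrap and the tripolar fold, so it is not the exact create_topo.py workflow:

# Crude sketch of sub-grid topographic roughness: std of source topography in each cell.
import numpy as np
import netCDF4 as nc

src = nc.Dataset("source_topo.nc")            # hypothetical high-resolution topography
lon, lat = src["lon"][:], src["lat"][:]       # 1-D coordinates
depth = src["depth"][:]                       # 2-D field on (lat, lon)

grd = nc.Dataset("ocean_hgrid.nc")
xq = grd["x"][::2, ::2]                       # cell corners sit at even supergrid indices
yq = grd["y"][::2, ::2]

nj, ni = xq.shape[0] - 1, xq.shape[1] - 1
h2 = np.zeros((nj, ni))                       # roughness field (the 'std' output)
for j in range(nj):
    for i in range(ni):
        in_lon = (lon >= xq[j, i]) & (lon < xq[j, i + 1])
        in_lat = (lat >= yq[j, i]) & (lat < yq[j + 1, i])
        cell = depth[np.ix_(in_lat, in_lon)]
        if cell.size:
            h2[j, i] = cell.std()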

@ashao
Collaborator

ashao commented Aug 7, 2020

Also, with regard to the speedup part, I wonder if using wide halos in the barotropic solver might buy you a little gain? It's a tradeoff between message passing and doing more calculations, so it's definitely not guaranteed to be an improvement.

! === module MOM_barotropic ===
BT_USE_WIDE_HALOS = True        !   [Boolean] default = True
                                ! If true, use wide halos and march in during the barotropic time stepping for
                                ! efficiency.
BTHALO = 0                      ! default = 0
                                ! The minimum halo size for the barotropic solver.

(Note: this doesn't show up in MOM_parameter_doc.all, only in MOM_parameter_doc.layout, but you should be able to add it to MOM_input or MOM_override as usual. We should probably change that at some point.)
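
For example, in MOM_override the usual syntax would be the following (a sketch; the BTHALO value is only a placeholder to experiment with):

#override BT_USE_WIDE_HALOS = True
#override BTHALO = 4            ! placeholder halo width - tune for your processor layout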

@whuiskamp
Author

Hi everyone,
I have created an updated grid (it now extends to 90S, with the intention of one day performing Antarctic melt experiments) and run the model to equilibrium over 1400 years. Attached are some figures of global overturning and mixed layer depth from the 'best' state I've been able to achieve. The AMOC is weaker than I would like/expect, while in the Southern Ocean we see strong Weddell Sea convection but an absence of any in the Ross Sea. I'll add that I've only been able to achieve this by disabling THICKNESSDIFFUSE and using very low background diffusivities, which is surprising given the resolution (and makes me a little nervous). If anyone has suggestions on how I might improve this, I'd be grateful - my tests have so far been unsuccessful (re-enabling THICKNESSDIFFUSE yields an AMOC of ~10 Sv).
mld
gmoc

MOM_parameter_doc_all.txt
