Direct insertion of NASA Airborne Snow Observatory-derived snow depth time-series into the iSnobal energy balance snow model

Water Resources Research

Andrew Hedrick@#, Danny Marks@, Scott Havens@, Mark Robertson@, Micah Johnson@, Micah Sandusky@, Hans-Peter Marshall#, Patrick Kormos^, Kat J. Bormann*, and Thomas H. Painter*

@ USDA-ARS Northwest Watershed Research Center, Boise, Idaho, USA.
# Department of Geosciences, Boise State University, Boise, Idaho, USA.
^ Colorado Basin River Forecast Center, National Weather Service, Salt Lake City, Utah, USA.
* Jet Propulsion Laboratory/California Institute of Technology, Pasadena, California, USA.

Corresponding author: Andrew Hedrick ([email protected])

Background / Considerations

This Docker image contains all the necessary software for reproducing the model results presented in the above-titled scientific manuscript. This application of the iSnobal snow model (Marks et al., 1999) produces daily estimates of 19 snowpack parameters over the Tuolumne River Basin in California. In addition, this software produces all 10 of the 50-meter resolution hourly forcing grids required as input to the snow model. These forcing grids are in netCDF format and take up a considerable amount of disk space: each variable file is approximately 65 GB for an entire year, so the full set of forcing grids for each year occupies around 650 GB on the local filesystem where the /data/ directory is mounted. It is suggested that the user mount the /data/ directory directly onto an external hard drive or server system with at least 3 TB of available storage.

In addition to the software contained in this Docker image, any person seeking to reproduce the manuscript results would need to download the data and configuration scripts located at https://doi.org/10.5281/zenodo.1228399. This dataset contains:

  1. the hourly meteorological measurements from the weather stations in vector format (.csv files),
  2. the configuration files for executing the AWSM automated protocol,
  3. the static 50-meter grids for the Tuolumne Basin,
  4. the ASO 50-meter snow depth products for all 36 updates presented in the manuscript,
  5. and the shell scripts for restarting iSnobal with each updated initialization file.
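As a rough sketch, the Zenodo archive can be retrieved with standard command-line tools and unpacked somewhere under the mounted /data/ directory so the container can see it. The archive filename below is a placeholder; check the record page resolved by the DOI above for the actual file names.

  # Download and unpack the Zenodo dataset (run on the host, outside the container).
  # <archive-name> is a placeholder; see https://doi.org/10.5281/zenodo.1228399 for
  # the actual file names in the record.
  wget https://zenodo.org/record/1228399/files/<archive-name>.zip
  unzip <archive-name>.zip -d /data/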

The initial model run does not contain any updates from the Airborne Snow Observatory snow depths. These updates must be manually initiated from the command line. Outputs from the updating shell scripts will be placed in the /data/ directory alongside the non-updated results.
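A minimal sketch of initiating one of these updates from inside the container, assuming the shell scripts from the Zenodo dataset (item 5 above) have been placed under /data/; the script name is hypothetical and should be replaced by the script corresponding to the desired ASO flight date:

  # Run one of the restart scripts provided in the Zenodo dataset.
  # <restart_script_for_flight_date> is a placeholder for the actual script name.
  cd /data
  bash ./<restart_script_for_flight_date>.sh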

Contents

/data/ This is the directory that should be mounted to your computer's local filesystem. All model outputs are directed into this folder.

/code/ipw This directory contains the compiled source code for the Image Processing Workbench (IPW), which houses the iSnobal model and other associated programs.

/code/smrf This directory holds the Spatial Modeling for Resources Framework (SMRF) described in detail by Havens et al., 2017. SMRF is responsible for distributing the point meteorological data over the modeling domain in order to create the hourly forcing grids required by iSnobal.

/code/awsm Within this directory is the Automated Water Supply Model (AWSM), which automates the previously ad hoc process for running SMRF and iSnobal.

Running with Docker

To mount a data volume, so that you can share data between the local filesystem and the Docker container, the -v option must be used. For a more in-depth discussion and tutorial, read https://docs.docker.com/engine/userguide/containers/dockervolumes/. The container has a shared data volume at /data through which the container can access the local filesystem.

When the image is run, it will open a Python terminal within the container. Within this terminal, AWSM can be imported. The command /bin/bash can be appended to the end of the docker run command to enter the container's Bash terminal for full control. It will start in the /data location, with IPW code in /code/ipw, SMRF code in /code/smrf, and AWSM code in /code/awsm.
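For example, once inside the container's Bash terminal, a quick way to confirm that the AWSM package is importable (assuming the installed Python module is named awsm, matching the /code/awsm directory) is:

  # Verify that the AWSM Python package can be imported inside the container.
  # The module name `awsm` is assumed from the /code/awsm directory.
  python -c "import awsm; print(awsm.__file__)"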

For Linux: docker run -v <path>:/data -it usdaaranwrc/wrr_2018_hedrick

For MacOSX: docker run -v /Users/<path>:/data -it usdaaranwrc/wrr_2018_hedrick

For Windows: docker run -v /c/Users/<path>:/data -it usdaaranwrc/wrr_2018_hedrick
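Putting the pieces together, a host path on an external drive can be mounted as /data and the container opened directly into Bash; the host path below is purely illustrative.

  # Mount an external drive (example path) as /data and start a Bash shell in the
  # container instead of the default Python terminal.
  docker run -v /mnt/external_drive/tuolumne:/data -it usdaaranwrc/wrr_2018_hedrick /bin/bash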

References

Havens, S., Marks, D., Kormos, P., & Hedrick, A. (2017). Spatial Modeling for Resources Framework (SMRF): A modular framework for developing spatial forcing data for snow modeling in mountain basins. Computers & Geosciences, 109(September 2016), 295–304. https://doi.org/10.1016/j.cageo.2017.08.016

Marks, D., Domingo, J., Susong, D., Link, T. E., & Garen, D. C. (1999). A spatially distributed energy balance snowmelt model for application in mountain basins. Hydrological Processes, 13(12–13), 1935–1959. https://doi.org/10.1002/(SICI)1099-1085(199909)13:12/13<1935::AID-HYP868>3.0.CO;2-C
