
Incorporating Fluctuating Water Levels into rLakeAnalyzer #62

Open · brdeemer opened this issue Aug 18, 2015 · 10 comments

@brdeemer
Hello,

I am working with thermistor string data from a reservoir that undergoes a large water-level drawdown. I have written a loop to incorporate shifting water levels into my calculation of Schmidt stability, and I am now working to incorporate shifting water levels into my estimates of metalimnion extent/depth. I have a loop set up:

#set up an empty matrix for max and min metalimnion depth
metadeps <- matrix(ncol = 2, nrow = 0)
colnames(metadeps) <- c("max", "min")

for (i in 1:19300){
  #pull out the temp & depth data from the thermistor string at a single date/time
  filterwtr <- filter(subsetdata, subsetdata$datetime == dt$datetime[i])
  #name the temp and depth data; depth is corrected for the dropping water level,
  #so the depth vector is different at each time step
  wtr <- filterwtr$wtr
  dep <- filterwtr$depsurlvl
  #calculate max and min metalimnion depth and append it to the matrix
  metadeps <- rbind(metadeps, meta.depths(wtr, dep, seasonal = TRUE, mixed.cutoff = 1))
}

I get output that looks somewhat messy, though, and I think it would be better if I could use the temporal layer averaging described in the Read et al. 2011 Environmental Modelling & Software paper (I found it in the appendix: A.2.1.3 and A.2.1.4). The graph below, which shows the current output for maximum metalimnion depth, gives an idea of what I am working with. The index numbers are ordered by time: the earliest are during the summer stratified period, the latest during turnover.

Any help that you can offer would be sincerely appreciated.

Thanks so much in advance,

Bridget
[Figure: maximum metalimnion depth output over time (maximum_metalimnion_deemer)]

@lawinslow (Member)

Hi @brdeemer,

Good question. The R version doesn't yet have any of the downsampling or averaging routines that were in the Matlab version. We've often discussed adding them, but just haven't gotten there yet.

I did some digging around to figure out exactly what the Matlab version is doing (this is detailed so I can refer back later if we decide to implement it). Basically, it implements a simple one-pass moving-average filter on the calculated layer depths ([code here](https://github.com/GLEON/Lake-Analyzer/blob/master/Source/Run_LA.m#L428-L457)). At the start of the time series, it pads with filter_window - 1 values, each an expanding-window mean of the initial estimates.

Doing this in R is fairly straightforward:

#create a moving-average filter; the filter window is in timesteps
filter_window = 5
b = rep(1/filter_window, filter_window)

#fake data: a linear trend plus noise
x = seq(1, 100, by = 1) + 3*rnorm(100)

#pad the front of the series with expanding-window means of the first
#few values, as the Matlab version does
padx = rep(NA, filter_window - 1)
for(i in 1:(filter_window - 1)){
    padx[i] = mean(x[1:i])
}

x_padded = c(padx, x)

newx = filter(x_padded, b, method = 'convolution', sides = 1)

#now drop the NA's on the front introduced by the window
newx = newx[filter_window:length(newx)]

plot(x)
lines(newx)

This is pretty much the same algorithm as is used in the Matlab version. Do you want to give this a try?

Note: This doesn't do any downsampling first, which might be helpful with high-resolution buoy data. Also, I just arbitrarily selected 5 for the averaging window; a number like 24 (a daily averaging window) might make more sense if you have hourly data.
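
If you did want to downsample first, a rough sketch would be something like the following (df and value are illustrative names, not rLakeAnalyzer objects; df is assumed to have a POSIXct datetime column). Average within each hour, then run the moving-average filter on the hourly series.

#downsample to hourly means before filtering (hypothetical data frame)
df$hour = as.POSIXct(trunc(df$datetime, units = 'hours'))
hourly  = aggregate(value ~ hour, data = df, FUN = mean)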

-Luke

@brdeemer (Author)

Hi Luke,

Thanks for your insight & help. I gave your code a try, but ran into some problems along the way. Specifically, the code spit back an error:

newx = filter(x_padded,b,method='convolution',sides=1)
Error in UseMethod("filter_") :
no applicable method for 'filter_' applied to an object of class "c('double', 'numeric')"

Here my x was metadeps[,'max'] (see the code above).

Also, it seems like this code only averages at a single time step? Or maybe it would be placed as a loop within a loop?

My "metadeps[,'max'] is very high resolution (19,300 rows long) and spans about 2 months. I came up with another solution for a moving average filter by using the "zoo" package. I applied a moving window to the data output from the meta.depths function and everything looks much cleaner now. This is what I did:

library(zoo)

#get 24-hour moving averages of metalimnion depths & associated datetimes;
#data resolution is every 3-5 minutes, hence the large width of 339 data points
metarollmin <- rollapply(metadeps[,'min'], width = 339, by = 1, FUN = mean, align = "left")
metarollmax <- rollapply(metadeps[,'max'], width = 339, by = 1, FUN = mean, align = "left")
dt.mov.av <- rollapply(dt$datetime, width = 339, by = 1, FUN = mean, align = "left")

plot(metarollmin ~ dt.mov.av)
plot(metarollmax ~ dt.mov.av)

This smooths out some of the anomalously high and/or low depths and gives me data that look like this:
[Figure: smoothed maximum metalimnion depth (metarollmax)]

Do you think this is functionally similar to what your Matlab code did (and that it makes sense)? Specifically, is it OK to calculate the metalimnion depths first (with the high-resolution data) and then smooth the results?

Thank you again for all your help & advice. It is exciting to be getting some data output!
Bridget

@lawinslow (Member)

I'm going to guess that you have previously loaded dplyr, am I right? It also has a filter function, which masks the stats::filter function.

And yeah, rollapply is also a very good (if not better) way of doing the same thing. The only minor nuance (if I'm understanding the rollapply documentation correctly) is that the result from the Matlab version is right-aligned. But both left and right alignment introduce a slight phase delay, which is inconsequential if you're averaging to a daily timestep.
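
If you want to keep using my earlier snippet, the fix is just to qualify the call so the masked stats version is used:

#explicitly call the stats time-series filter that dplyr::filter masks
newx = stats::filter(x_padded, b, method = 'convolution', sides = 1)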

@brdeemer (Author)

Yes, you are right. I had dplyr loaded (and used it for the loop I posted at the top of this thread). I'm not sure what you mean by left vs. right alignment, though? Glad to hear that this will work for daily averaging.

Thanks again.

@lawinslow (Member)

Right - each value is the average of that observation and the N-1 before it
Left - each value is the average of that observation and the N-1 after it
Center - each value is the average of a window centered on the observation, half past and half future

Something like that. I don't think it matters for your purposes.
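
A quick toy illustration of the three alignments on made-up data:

library(zoo)
x = 1:10
rollapply(x, width = 3, FUN = mean, align = 'right', fill = NA)
# NA NA  2  3  4  5  6  7  8  9  (each value averages the point and the 2 before it)
rollapply(x, width = 3, FUN = mean, align = 'left', fill = NA)
#  2  3  4  5  6  7  8  9 NA NA  (each value averages the point and the 2 after it)
rollapply(x, width = 3, FUN = mean, align = 'center', fill = NA)
# NA  2  3  4  5  6  7  8  9 NA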

@brdeemer (Author)

Ok, thanks!

@brdeemer (Author)

Hello again,

One other question I had about incorporating shifting water levels into rLakeAnalyzer concerns plotting. Do you know of any way to depict changing depth in the contour time-series plots? Or perhaps you know of another program that would do that well?

Thanks so much!

Bridget

@lawinslow (Member)

There's no functionality available at the moment to do that. The Matlab version will do the level-varying plotting you're talking about.

@brdeemer (Author)

Thanks, I will have a look.

Just ran into one more thing. For the Wedderburn number, how would you suggest incorporating the shifting lake level into Ao? Specifically, can you suggest a way to link into the bathy file to calculate an estimated Ao as the surface level changes?

@lawinslow (Member)

Hmmm, not super easy.

You'd need to change Ao to be time-varying:
https://github.com/GLEON/rLakeAnalyzer/blob/master/R/timeseries.functions.R#L222

The function sketch below passes Ao in as a vector. You could actually calculate it from the level-adjusted hypsometry data (interpolate the shifted hypsometry to depth zero at each timestep); a rough sketch of that interpolation follows.
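
Something like this (assuming lvl is the water-level drop in metres at each timestep; the hypsometry numbers are just the example values from further below):

#shift the hypsometry by the water level at each timestep, then
#interpolate the cross-sectional area at the new surface (depth zero)
bthA = c(1000, 900, 864, 820, 200, 10)  #areas
bthD = c(0, 2.3, 2.5, 4.2, 5.8, 7)      #depths (m)
lvl  = c(0, 0.5, 1.2)                   #hypothetical water-level drops (m)

Ao = sapply(lvl, function(l) approx(bthD - l, bthA, xout = 0)$y)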

Max depth would also need to vary with time in the layer density calculations
https://github.com/GLEON/rLakeAnalyzer/blob/master/R/timeseries.functions.R#L212

And you'd need to adjust the bathymetry based on water level:

bthA   <- c(1000, 900, 864, 820, 200, 10)
bthD   <- c(0, 2.3, 2.5, 4.2, 5.8, 7)
wtr    <- c(28, 27, 26.4, 26, 25.4, 24, 23.3)
depths <- c(0, 1, 2, 3, 4, 5, 6)
cat('Schmidt stability for input is: ')
cat(schmidt.stability(wtr, depths, bthA, bthD))
cat('\n')
cat('Schmidt stability with level 1 m lower:', schmidt.stability(wtr, depths, bthA, bthD - 1))

I roughed this out quickly. It's not working yet, but it's something along these lines:

ts.wedderburn.number.level <- function(wtr, wnd, wnd.height, bathy, Ao, lvl, seasonal=TRUE, na.rm=TRUE){

    depths = get.offsets(wtr)

    # make sure the data frames match by date/time
    all.data = merge(wtr, wnd, by='datetime')

    cols = ncol(all.data)
    wtr = all.data[,-cols]
    wnd = all.data[,c(1, cols)]

    n = nrow(wtr)
    w.n = rep(NA, n)

    wtr.mat = as.matrix(wtr[,-1])
    dimnames(wtr.mat) <- NULL

    for(i in 1:n){
        # check we have all the data necessary
        if(any(is.na(wtr.mat[i,])) || is.na(wnd[i,2])){
            next
        }

        m.d = meta.depths(wtr.mat[i,], depths, seasonal=seasonal)
        if(any(is.na(m.d))){
            next
        }

        # need epi and hypo density for the wedderburn.number calc;
        # shift the bathymetry depths by the water level at each timestep
        temp = wtr.mat[i,]
        notNA = !is.na(temp)

        epi.dens = layer.density(0, m.d[1], temp[notNA], depths[notNA], bathy$areas, bathy$depths - lvl[i])
        hypo.dens = layer.density(m.d[2], max(depths[notNA]), temp[notNA], depths[notNA], bathy$areas, bathy$depths - lvl[i])

        uS = uStar(wnd[i,2], wnd.height, epi.dens)

        w.n[i] = wedderburn.number(hypo.dens - epi.dens, m.d[1], uS, Ao[i], hypo.dens)
    }

    output = data.frame(datetime=get.datetime(wtr), wedderburn.number=w.n)

    return(output)
}
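
A hypothetical call would look something like this (wtr.df and wnd.df stand in for rLakeAnalyzer-style data frames with a datetime column; lvl and Ao need one value per timestep -- illustrative names only, not tested):

w = ts.wedderburn.number.level(wtr.df, wnd.df, wnd.height = 2,
                               bathy = bathy, Ao = Ao, lvl = lvl)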
