
Multi-Resolution Time Series Analysis #2345

Open · wants to merge 19 commits into base: devel

Changes from 11 of 19 commits
ed77ac5
new classes for multiresolutionTSA and discrete wavelet transform algo
GabrielSoto-INL May 14, 2024
b89d1a0
updating submodules to match devel
GabrielSoto-INL Jul 24, 2024
95a0f25
renaming DWT to FilterBankDWT, adding class attrib for MRA
GabrielSoto-INL Jul 24, 2024
19bd128
allowing macroSteps (e.g., years) looping in training
GabrielSoto-INL Jul 24, 2024
7cbb350
formalizing getters for trained MRA params
GabrielSoto-INL Jul 25, 2024
746a852
fixing evaluate for multiple macrosteps
GabrielSoto-INL Jul 29, 2024
d2d1205
fixing levels error when requested exceeds max
GabrielSoto-INL Jul 29, 2024
0c0973d
multiresolution tests
GabrielSoto-INL Jul 29, 2024
045bc72
pywavelets not optional, but imported with lazy loading
GabrielSoto-INL Jul 29, 2024
5921128
fixing typo in pywt call, gold dirnames, add docstrings
GabrielSoto-INL Jul 29, 2024
c2e8911
fixing other typos
GabrielSoto-INL Jul 30, 2024
84d3222
fixing docstrings
GabrielSoto-INL Aug 5, 2024
8a16c42
addressing reviewer comments, fixing writeXML and additional getters
GabrielSoto-INL Sep 11, 2024
038c71c
fixing test files and regolding romMeta
GabrielSoto-INL Sep 11, 2024
31bfc5a
updating descriptions and some fixes to levels input
GabrielSoto-INL Sep 17, 2024
521ae78
matching TEAL plugin to remote
GabrielSoto-INL Sep 17, 2024
0fd5ae0
changing training data and MRA tests
GabrielSoto-INL Sep 18, 2024
48c5e1b
regolding
GabrielSoto-INL Sep 18, 2024
ee4a358
Merge branch 'devel' into multiResolutionTSA
wangcj05 Sep 19, 2024
2 changes: 1 addition & 1 deletion dependencies.xml
@@ -74,7 +74,7 @@ Note all install methods after "main" take
   <line_profiler optional='True'/>
   <!-- <ete3 optional='True'/> -->
   <statsforecast/>
-  <pywavelets optional='True'>1.2</pywavelets>
+  <pywavelets>1.2</pywavelets>
   <python-sensors source="pip"/>
   <numdifftools source="pip">0.9</numdifftools>
   <fmpy optional='True'/>
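Commit 045bc72 makes pywavelets a required dependency but notes it is "imported with lazy loading," so the import cost is deferred until the module is first used. A minimal sketch of the standard-library lazy-import pattern (demonstrated with `json` so it runs without pywavelets installed; the real code would target `pywt`):

```python
import importlib.util
import sys

def lazy_import(name):
    """Return a module whose actual import is deferred until first attribute access."""
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)  # registers the lazy module; no real import yet
    return module

# e.g. pywt = lazy_import('pywt') at module scope; stdlib stand-in here:
json_mod = lazy_import('json')
print(json_mod.dumps({'a': 1}))  # prints {"a": 1}
```

Whether RAVEN uses this exact mechanism or a simple deferred `import pywt` inside the methods that need it is not shown in this diff.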
2 changes: 1 addition & 1 deletion plugins/HERON
Collaborator: Is this intentional?

Collaborator (Author): it... was not. a little unsure what's happening; on my local it says they're at the same commit as the plugins in the RAVEN repo... I'll dig into it a little more.

2 changes: 1 addition & 1 deletion plugins/TEAL
Collaborator: Same here?

3 changes: 2 additions & 1 deletion ravenframework/Models/ROM.py
@@ -43,7 +43,8 @@ class ROM(Dummy):
   interfaceFactory = factory
   segmentNameToClass = {'segment': 'Segments',
                         'cluster': 'Clusters',
-                        'interpolate': 'Interpolated'}
+                        'interpolate': 'Interpolated',
+                        'decomposition': 'Decomposition'}
   @classmethod
   def getInputSpecification(cls, xml=None):
     """
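The `segmentNameToClass` map is how `ROM` dispatches a segmentation XML node tag (here the new `<decomposition>` node) to the matching `ROMCollection` class name. A minimal sketch of that name-to-class dispatch, using empty stand-in classes rather than the actual RAVEN collections:

```python
# Hypothetical stand-ins for the ROMCollection classes referenced in the diff
class Segments: pass
class Clusters: pass
class Interpolated: pass
class Decomposition: pass

segmentNameToClass = {'segment': 'Segments',
                      'cluster': 'Clusters',
                      'interpolate': 'Interpolated',
                      'decomposition': 'Decomposition'}

def collectionFor(nodeTag):
    """Map an XML node tag (e.g. 'decomposition') to its collection class."""
    className = segmentNameToClass[nodeTag]
    return globals()[className]

print(collectionFor('decomposition').__name__)  # prints Decomposition
```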
3 changes: 2 additions & 1 deletion ravenframework/SupervisedLearning/Factory.py
Original file line number Diff line number Diff line change
Expand Up @@ -28,12 +28,13 @@
from .NDinvDistWeight import NDinvDistWeight
from .NDspline import NDspline
from .SyntheticHistory import SyntheticHistory
from .MultiResolutionTSA import MultiResolutionTSA
from .pickledROM import pickledROM
from .PolyExponential import PolyExponential
from .DynamicModeDecomposition import DMD
from .DynamicModeDecompositionControl import DMDC
from .ARMA import ARMA
from .ROMCollection import Segments, Clusters, Interpolated
from .ROMCollection import Segments, Clusters, Interpolated, Decomposition

## Tensorflow-Keras Neural Network Models
from .KerasMLPClassifier import KerasMLPClassifier
Expand Down
183 changes: 183 additions & 0 deletions ravenframework/SupervisedLearning/MultiResolutionTSA.py
@@ -0,0 +1,183 @@
# Copyright 2017 Battelle Energy Alliance, LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Created on July 29, 2024

@author: sotogj
Class to specifically handle multi-resolution time series analysis training
"""
# internal libraries
from .SupervisedLearning import SupervisedLearning
from .SyntheticHistory import SyntheticHistory
#
#
#
#
class MultiResolutionTSA(SupervisedLearning):
""" Class to specifically handle multi-resolution time series analysis training and evaluation."""

@classmethod
def getInputSpecification(cls):
"""
Method to get a reference to a class that specifies the input data for
class cls.
@ In, cls, the class for which we are retrieving the specification
@ Out, inputSpecification, InputData.ParameterInput, class to use for
specifying input of cls.
"""
spec = super().getInputSpecification()
spec.description = r"""Class to specifically handle multi-resolution time series analysis training and evaluation."""
spec = SyntheticHistory.addTSASpecs(spec)
return spec

def __init__(self):
"""
Constructor.
@ In, kwargs, dict, initialization options
Collaborator: Method has no **kwargs argument.

@ Out, None
"""
super().__init__()
self.printTag = 'Multiresolution Synthetic History'
self._globalROM = SyntheticHistory()
self._decompParams = {}
self.decompositionAlgorithm = None
Collaborator: It is better to provide some descriptions for the self variables.

Collaborator (Author): fixed, actually realized I wasn't using _decompParams so it helped clean up the code!


def _handleInput(self, paramInput):
"""
Function to handle the common parts of the model parameter input.
@ In, paramInput, InputData.ParameterInput, the already parsed input.
@ Out, None
"""
super()._handleInput(paramInput)
self._globalROM._handleInput(paramInput)
self._dynamicHandling = True # This ROM is able to manage the time-series on its own.
Collaborator: I think this should be moved to the init method.

Collaborator (Author): fixed


# check that there is a multiresolution algorithm
allAlgorithms = self._globalROM._tsaAlgorithms
allAlgorithms.extend(self._globalROM._tsaGlobalAlgorithms)
foundMRAalgorithm = False
for algo in allAlgorithms:
if algo.canTransform():
if algo.isMultiResolutionAlgorithm():
foundMRAalgorithm = True
self.decompositionAlgorithm = algo.name
break
if not foundMRAalgorithm:
msg = 'The MultiResolutionTSA ROM class requires a TSA algorithm capable of '
msg += ' multiresolution time series analysis. None were found. Example: FilterBankDWT.'
self.raiseAnError(IOError, msg)
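`FilterBankDWT`, the discrete wavelet transform algorithm added in this PR, is the kind of multiresolution transformer the check above looks for. As a rough illustration of what a filter-bank DWT does (not the PR's implementation, which builds on pywavelets), here is a self-contained Haar decomposition into per-level detail signals plus a coarse approximation, with exact reconstruction:

```python
def haar_decompose(signal, levels):
    """Split a signal into per-level detail coefficients plus a coarse approximation."""
    approx = list(signal)
    details = []
    for _ in range(levels):
        a, d = [], []
        for x, y in zip(approx[0::2], approx[1::2]):
            a.append((x + y) / 2.0)  # low-pass branch: pairwise average (coarser trend)
            d.append((x - y) / 2.0)  # high-pass branch: pairwise difference (detail)
        details.append(d)
        approx = a
    return approx, details

def haar_reconstruct(approx, details):
    """Invert haar_decompose exactly (perfect reconstruction)."""
    signal = list(approx)
    for d in reversed(details):
        merged = []
        for a, dd in zip(signal, d):
            merged.extend([a + dd, a - dd])
        signal = merged
    return signal

sig = [4.0, 2.0, 5.0, 7.0, 1.0, 3.0, 6.0, 2.0]
approx, details = haar_decompose(sig, levels=2)
assert haar_reconstruct(approx, details) == sig
```

Note that each level halves the signal length, so the usable number of levels is bounded by roughly log2 of the signal length; commit d2d1205 ("fixing levels error when requested exceeds max") addresses exactly this kind of cap in the real implementation.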

def _train(self, featureVals, targetVals):
"""
Perform training on input database stored in featureVals.
@ In, featureVals, array, shape=[n_timeStep, n_dimensions], an array of input data # Not used for ARMA training
@ In, targetVals, array, shape = [n_timeStep, n_dimensions], an array of time series data
@ Out, None
"""
self._globalROM.trainTSASequential(targetVals)

def _getMRTrainedParams(self):
"""
Get trained params for multi-resolution time series analysis. Returns number of decomposition levels
and a sorted (or re-indexed) version of the trained params dictionary.
@ In, None
@ Out, numLevels, int, number of decomposition levels
@ Out, sortedTrainedParams, dict, dictionary of trained parameters
"""
# get all trained parameters from final algorithm (should be multiresolution transformer)
trainedParams = list(self._globalROM._tsaTrainedParams.items())
mrAlgo, mrTrainedParams = trainedParams[-1]

# eh, maybe this should live in TSAUser in the future...
# extract settings used for that last algorithm (should have some sort of "levels")
numLevels = mrAlgo._getDecompositionLevels()

# reformat the trained params
sortedTrainedParams = mrAlgo._sortTrainedParamsByLevels(mrTrainedParams)
return numLevels, sortedTrainedParams
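`_sortTrainedParamsByLevels` is internal to the transformer and not shown in this diff. As an illustrative guess at the reshaping it performs, here is a hypothetical reindexing of a per-target coefficient dictionary into a per-level layout, so each decomposition level can be handled as its own sub-ROM (names and data layout are assumptions, not RAVEN's actual structures):

```python
def sort_by_levels(trained_params):
    """Regroup {target: {level: coeffs}} into {level: {target: coeffs}}."""
    by_level = {}
    for target, levels in trained_params.items():
        for level, coeffs in levels.items():
            by_level.setdefault(level, {})[target] = coeffs
    return by_level

params = {'signalA': {0: [0.1, 0.2], 1: [0.3]},
          'signalB': {0: [0.5, 0.6], 1: [0.7]}}
print(sort_by_levels(params)[0])
```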

def _updateMRTrainedParams(self, params):
"""
Method to update trained parameter dictionary from this class using imported `params` dictionary
containing multiple-decomposition-level trainings
@ In, params, dict, dictionary of trained parameters from previous decomposition levels
@ Out, None
"""
# get all trained parameters from final algorithm (should be multiresolution transformer)
trainedParams = list(self._globalROM._tsaTrainedParams.items())
mrAlgo, mrTrainedParams = trainedParams[-1]

mrAlgo._combineTrainedParamsByLevels(mrTrainedParams, params)


def writeXML(self, writeTo, targets=None, skip=None):
"""
Allows the SVE to put whatever it wants into an XML to print to file.
Overload in subclasses.
@ In, writeTo, xmlUtils.StaticXmlElement, entity to write to
@ In, targets, list, optional, unused (kept for compatibility)
@ In, skip, list, optional, unused (kept for compatibility)
@ Out, None
"""

Collaborator: Is this function completely empty? even not an empty return or pass? Is it an abstract one that has to be there?

Collaborator (Author): I missed the return statement. this one is empty for now, need to find an efficient way to print out XML meta

Collaborator (Author): I actually found an easy way of reporting back info, this method is now filled in!

def __evaluateLocal__(self, featureVals):
"""
Evaluate algorithms for ROM generation
Collaborator: two extra spaces for this line.

Collaborator (Author): fixed, good catch!

@ In, featureVals, float, a scalar feature value is passed as scaling factor
@ Out, rlz, dict, realization dictionary of values for each target
"""
rlz = self._globalROM.evaluateTSASequential()
return rlz





### ESSENTIALLY UNUSED ###
def _localNormalizeData(self,values,names,feat):
"""
Overwrites default normalization procedure, since we do not desire normalization in this implementation.
@ In, values, unused
@ In, names, unused
@ In, feat, feature to normalize
@ Out, None
"""
self.muAndSigmaFeatures[feat] = (0.0,1.0)

def __confidenceLocal__(self,featureVals):
"""
This method is currently not needed for ARMA
"""
pass

def __resetLocal__(self,featureVals):
"""
After this method the ROM should be described only by the initial parameter settings
Currently not implemented for ARMA
"""
pass

def __returnInitialParametersLocal__(self):
"""
there are no possible default parameters to report
"""
localInitParam = {}
return localInitParam

def __returnCurrentSettingLocal__(self):
"""
override this method to pass the set of parameters of the ROM that can change during simulation
Currently not implemented for ARMA
"""
pass
Collaborator: Do you need to keep these methods? I assume they are also defined in the base class.

Collaborator (Author): it seems we didn't need _localNormalizeData but the others are abstract methods in SupervisedLearning so they need to be defined here
