This is a framework for running accelerated sampling with data-augmented autoencoders. It depends on the following packages and tools:
- OpenMM simulation package: https://github.com/pandegroup/openmm
- ANN_Force biasing force package: https://github.com/weiHelloWorld/ANN_Force
- Keras: https://github.com/fchollet/keras
- PyTorch: https://pytorch.org
- MDAnalysis: https://github.com/MDAnalysis/mdanalysis
- Nose testing framework: https://github.com/nose-devs/nose
- PLUMED (ANN included): https://github.com/plumed/plumed2 + https://github.com/weiHelloWorld/plumed_additional
- Cluster management: https://github.com/weiHelloWorld/cluster_management
- PLUMED helper: https://github.com/weiHelloWorld/plumed_helper
- OpenMM-PLUMED force plugin: https://github.com/peastman/openmm-plumed
- Bayes WHAM free energy calculation package: https://bitbucket.org/andrewlferguson/bayeswham_python
Some other Python scientific computing packages (e.g. seaborn, pandas, scikit-learn) are also needed; it is recommended to install them with Anaconda: https://www.continuum.io/downloads
No installation is required: simply install all of the dependencies listed above and check out this repository.
It is highly recommended to run the tests before running any code, to make sure all packages are correctly installed. This package uses the nose testing framework. To run the tests:

```bash
root_dir=MD_simulation_on_alanine_dipeptide/current_work
cd ${root_dir}/tests
make test
```
The tests include numerical unit tests (for components with clear expected results) and figure plots (for everything else, such as neural network training).
Modify the configuration file `${root_dir}/src/config.py` as needed, then run

```bash
python main_work.py
```

For more options, run

```bash
python main_work.py --help
```
A typical autoencoder consists of an encoder ANN and a decoder ANN: the encoder maps the inputs to a small number of collective variables (CVs) in the encoding layer, and the decoder tries to reconstruct the inputs (or some variant of the inputs) from the CVs.
A typical 5-layer structure is given below:
For traditional autoencoders, we minimize the reconstruction error

$$E = \frac{1}{N}\sum_{n=1}^{N} \left\lVert A(\mathbf{x}_n) - \mathbf{x}_n \right\rVert^2 + \text{regularization}$$

where $\mathbf{x}_n$ is the $n$-th input configuration, $A(\mathbf{x}_n)$ is its reconstruction by the autoencoder, and $N$ is the number of training configurations.
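As a concrete (hypothetical) illustration only, such a 5-layer autoencoder could be written with Keras, one of the supported backends; the layer sizes, activations, and optimizer below are placeholders rather than the settings used by this package:

```python
from keras.models import Sequential
from keras.layers import Dense

input_dim = 76  # placeholder: number of input features for the molecule

# 5-layer structure: input -> hidden -> encoding (CV) layer -> hidden -> output
model = Sequential([
    Dense(40, activation='tanh', input_dim=input_dim),  # encoder hidden layer
    Dense(2, activation='tanh'),                        # encoding layer: 2 CVs
    Dense(40, activation='tanh'),                       # decoder hidden layer
    Dense(input_dim, activation='linear'),              # reconstruction of the input
])
model.compile(optimizer='adam', loss='mse')             # mean squared reconstruction error
# model.fit(x_train, x_train, ...)  # for a traditional autoencoder, targets = inputs
```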
To remove external degrees of freedom (overall translation and rotation), we instead use data-augmented autoencoders, which minimize

$$E = \frac{1}{N}\sum_{n=1}^{N} \left\lVert A(\mathbf{x}_n) - L(\mathbf{x}_n) \right\rVert^2 + \text{regularization}$$

where $L(\mathbf{x}_n)$ is the configuration $\mathbf{x}_n$ aligned (translated and rotated) onto a fixed reference configuration, i.e. the network is asked to reconstruct the aligned configuration rather than the raw input.
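A minimal sketch of how such alignment targets could be constructed with MDAnalysis (the helper function and variable names here are illustrative, not the package's actual implementation):

```python
import numpy as np
from MDAnalysis.analysis.align import rotation_matrix

def aligned_target(coords, ref_coords):
    """Align one frame (n_atoms x 3) onto a reference by removing translation and rotation."""
    mobile = coords - coords.mean(axis=0)      # remove translation
    ref = ref_coords - ref_coords.mean(axis=0)
    R, _rmsd = rotation_matrix(mobile, ref)    # optimal rotation of `mobile` onto `ref`
    return np.asarray(mobile.dot(R.T))         # aligned coordinates

# Training then uses aligned_target(x_n, x_ref).flatten() in place of the raw x_n
# as the output target for each configuration x_n.
```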
To reduce the dependence on any specific reference, we can apply multiple references to the data-augmented autoencoder; the corresponding error function is

$$E = \frac{1}{N}\sum_{n=1}^{N} \sum_{j=1}^{M} \left\lVert A_j(\mathbf{x}_n) - L_j(\mathbf{x}_n) \right\rVert^2 + \text{regularization}$$

where $L_j(\mathbf{x}_n)$ is the configuration aligned onto the $j$-th of $M$ reference configurations and $A_j(\mathbf{x}_n)$ is the corresponding group of output units.
If we want to see the relative importance of these CVs, we construct multiple outputs, with each output taking contributions from a subset of the CVs in the encoding layer. Two possible types of network topology are given below:
The corresponding error function is then

$$E = \frac{1}{N}\sum_{n=1}^{N} \sum_{k=1}^{K} \left\lVert A^{(k)}(\mathbf{x}_n) - L(\mathbf{x}_n) \right\rVert^2 + \text{regularization}$$

where $A^{(k)}(\mathbf{x}_n)$ is the $k$-th of $K$ output groups (each reconstructed from its own subset of CVs) and $L(\mathbf{x}_n)$ is the aligned target configuration as above.
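A sketch of the multi-output idea in the Keras functional API (all sizes and names are again illustrative); comparing the reconstruction errors of the two outputs indicates how much the second CV contributes:

```python
from keras.layers import Input, Dense, Lambda
from keras.models import Model

input_dim = 76                                            # placeholder
inputs = Input(shape=(input_dim,))
hidden = Dense(40, activation='tanh')(inputs)
cvs = Dense(2, activation='tanh')(hidden)                 # encoding layer with 2 CVs
cv1 = Lambda(lambda z: z[:, :1], output_shape=(1,))(cvs)  # first CV only

def make_decoder(code):
    h = Dense(40, activation='tanh')(code)
    return Dense(input_dim, activation='linear')(h)

out_both = make_decoder(cvs)                              # reconstruction from both CVs
out_cv1 = make_decoder(cv1)                               # reconstruction from CV 1 alone
model = Model(inputs=inputs, outputs=[out_both, out_cv1])
# total error = sum of the two reconstruction errors (against the aligned targets)
model.compile(optimizer='adam', loss='mse')
```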
See slides for more information: (TODO)
Directories are arranged as follows:
- `${root_dir}/src`: source code
- `${root_dir}/target`: output of simulation data (pdb files and coordinate files)
- `${root_dir}/resources`: training results (autoencoders) and reference configuration files (pdb files)
- `${root_dir}/tests`: test source code
To apply this framework to a new molecule:

- Create a subclass of `Sutils` for the molecule and implement the corresponding methods in `${root_dir}/src/molecule_spec_sutils.py` (a skeleton is sketched after this list).
- Include molecule-specific information in the configuration file `${root_dir}/src/config.py`, and modify the corresponding configuration settings.
- Modify the biased-simulation file (`${root_dir}/src/biased_simulation_general.py`) for the new molecule.
- Add molecule-related statements to `${root_dir}/src/ANN_simulation.py` and `${root_dir}/src/autoencoders.py` wherever `Trp_cage` appears.
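The skeleton referenced in the first item above might look like the following; the class name and method names (apart from `Sutils` itself) are placeholders, so consult `Sutils` and its existing subclasses in `${root_dir}/src/molecule_spec_sutils.py` for the actual interface:

```python
from molecule_spec_sutils import Sutils

class My_new_molecule(Sutils):          # hypothetical molecule-specific subclass
    @staticmethod
    def get_input_features(coordinates):
        """Placeholder: convert raw Cartesian coordinates into the features
        (e.g. sines/cosines of backbone dihedrals, or pairwise distances)
        on which the autoencoder for this molecule is trained."""
        raise NotImplementedError

    @staticmethod
    def get_biasing_definitions():
        """Placeholder: return the PLUMED/ANN_Force definitions needed to bias
        simulations of this molecule along the learned CVs."""
        raise NotImplementedError
```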
To implement a new network structure or backend:

- Create a subclass of `autoencoder` for the new structure/backend and implement it; note that all abstract methods (`@abc.abstractmethod`) must be implemented (a skeleton is sketched after this list).
- Include the new network information in the configuration file `${root_dir}/src/config.py`.
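A purely illustrative skeleton for such a subclass; the method names below are placeholders, while the actual required methods are those marked with `@abc.abstractmethod` in `${root_dir}/src/autoencoders.py`:

```python
from autoencoders import autoencoder

class autoencoder_new_backend(autoencoder):   # hypothetical subclass name
    def train(self, data):
        """Placeholder: build the network with the new structure/backend and fit it."""
        raise NotImplementedError

    def get_PCs(self, data):
        """Placeholder: map inputs to the values of the encoding (CV) layer."""
        raise NotImplementedError

    def get_outputs(self, data):
        """Placeholder: map inputs to the reconstructed outputs."""
        raise NotImplementedError
```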
To change how poorly sampled regions (boundary points) in CV space are selected for the next round of biased simulations, modify the method `Sutils.get_boundary_points()` in `${root_dir}/src/molecule_spec_sutils.py`; a sketch of the underlying idea is given below.
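This is a hedged sketch, not the actual implementation of `Sutils.get_boundary_points()`: bin the CV values sampled so far and pick poorly sampled bins that border well-sampled regions as the centers for the next round of biased simulations.

```python
import numpy as np

def pick_boundary_centers(cv_samples, n_bins=10, n_centers=5):
    """cv_samples: (n_frames, 2) array of CV values collected from previous simulations."""
    hist, xedges, yedges = np.histogram2d(cv_samples[:, 0], cv_samples[:, 1], bins=n_bins)
    xc = 0.5 * (xedges[:-1] + xedges[1:])      # bin centers along CV 1
    yc = 0.5 * (yedges[:-1] + yedges[1:])      # bin centers along CV 2
    candidates = []
    for i in range(n_bins):
        for j in range(n_bins):
            # an empty bin adjacent to sampled bins lies on the boundary of the explored region
            if hist[i, j] == 0:
                neighbors = hist[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
                if neighbors.sum() > 0:
                    candidates.append((neighbors.sum(), xc[i], yc[j]))
    candidates.sort(reverse=True)              # prefer empty bins next to well-sampled regions
    return [(x, y) for _, x, y in candidates[:n_centers]]
```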
To change how the biased simulations themselves are run, modify `biased_simulation.py` or `biased_simulation_general.py`; a sketch of attaching a PLUMED-defined bias to an OpenMM simulation is given below.
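This sketch uses the openmm-plumed plugin; the file name, force field, and atom indices are placeholders, and the real scripts build the PLUMED input from the trained autoencoder CVs rather than the simple torsion restraint shown here:

```python
from simtk import openmm, unit
from simtk.openmm import app
from openmmplumed import PlumedForce

pdb = app.PDBFile('alanine_dipeptide.pdb')                 # hypothetical input structure
forcefield = app.ForceField('amber99sb.xml')
system = forcefield.createSystem(pdb.topology, nonbondedMethod=app.NoCutoff,
                                 constraints=app.HBonds)

# In the real workflow this script would define the learned CVs (e.g. via the ANN
# function / ANN_Force machinery) and bias them; a torsion restraint stands in here.
plumed_script = """
phi: TORSION ATOMS=5,7,9,15
RESTRAINT ARG=phi AT=-1.0 KAPPA=500.0
"""
system.addForce(PlumedForce(plumed_script))

integrator = openmm.LangevinIntegrator(300 * unit.kelvin, 1 / unit.picosecond,
                                       2 * unit.femtoseconds)
simulation = app.Simulation(pdb.topology, system, integrator)
simulation.context.setPositions(pdb.positions)
simulation.minimizeEnergy()
simulation.step(1000)
```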
If you use this code in your work, please cite:
- Chen, Wei, and Andrew L. Ferguson. "Molecular enhanced sampling with autoencoders: On-the-fly collective variable discovery and accelerated free energy landscape exploration." Journal of Computational Chemistry 39.25 (2018): 2079-2102.
- Chen, Wei, Aik Rui Tan, and Andrew L. Ferguson. "Collective variable discovery and enhanced sampling using autoencoders: Innovations in network architecture and error function design." The Journal of Chemical Physics 149.7 (2018): 072312.
For any questions, feel free to contact [email protected] or open a GitHub issue.