Releases · michaelhush/M-LOOP
M-LOOP 3.0.0
- Documentation for the neural net learner (see the sketch after this list).
- Several improvements to visualization features.
- Documentation fixes.
- Support for TensorFlow 2.
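As a rough illustration of the newly documented neural net learner and the visualization tools, the following sketch runs a toy optimization through the Python API. The `mloop.interfaces`/`mloop.controllers`/`mloop.visualizations` modules, `create_controller`, and the `controller_type='neural_net'` keyword reflect my reading of the M-LOOP documentation; the toy interface, its quadratic cost, and the chosen bounds and run limit are invented for the example.

```python
# Hedged sketch: driving the neural net learner from the Python API.
# The toy interface and its quadratic cost are placeholders, not part of M-LOOP.
import numpy as np
import mloop.interfaces as mli
import mloop.controllers as mlc
import mloop.visualizations as mlv

class ToyInterface(mli.Interface):
    """Pretend 'experiment' that just reports a quadratic cost."""
    def get_next_cost_dict(self, params_dict):
        params = params_dict['params']
        cost = float(np.sum(params ** 2))  # minimum at the origin
        return {'cost': cost, 'uncer': 0.0, 'bad': False}

interface = ToyInterface()
controller = mlc.create_controller(
    interface,
    controller_type='neural_net',  # the learner documented in this release
    num_params=2,
    min_boundary=[-1.0, -1.0],
    max_boundary=[1.0, 1.0],
    max_num_runs=100,
)
controller.optimize()
print('Best parameters found:', controller.best_params)

# Produce the standard set of plots (parameters, costs, and learner predictions).
mlv.show_all_default_visualizations(controller)
```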
M-LOOP 2.2.0
- Made a variety of small bug fixes to the visualisations.
- The find-local-minima option has been removed, as it was rarely used, slow, and of little value when attempting to learn about the cost landscape.
- Neural nets have been added as a machine learning option. The functionality is still undocumented but is available to interested users. Full documentation will come with the 3.0 series release.
M-LOOP 2.1.1
Fixed some issues with:
- Halting conditions.
- Appropriate processing of bad runs.
Documentation has been updated.
M-LOOP 2.1.0
M-LOOP now has some new options and a different default behaviour.
- A differential evolution algorithm has been added. It can be used on its own or as a trainer for the Gaussian process (GP), and it is now the default trainer for the GP.
- Added a shell-based interface.
- Improved the documentation: added a tutorial for using M-LOOP as a Python API (see the sketch after this list) and improved the installation instructions. New features have also been appropriately documented.
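To make the Python API route concrete, here is a minimal sketch in the spirit of that tutorial, using the new differential evolution algorithm as a standalone optimizer. The module names, `create_controller`, and the `controller_type='differential_evolution'`, `max_num_runs`, and `target_cost` keywords follow my understanding of the M-LOOP API; the stand-in experiment and the specific bounds and halting values are illustrative assumptions.

```python
# Hedged sketch: running the new differential evolution controller via the Python API.
# The stand-in experiment below is a placeholder for a real experiment or simulation.
import numpy as np
import mloop.interfaces as mli
import mloop.controllers as mlc

class StandInExperiment(mli.Interface):
    """Reports the cost of a simple shifted quadratic for each parameter set."""
    def get_next_cost_dict(self, params_dict):
        params = params_dict['params']
        cost = float(np.sum((params - 0.3) ** 2))  # toy optimum at params = 0.3
        return {'cost': cost, 'uncer': 0.0, 'bad': False}

controller = mlc.create_controller(
    StandInExperiment(),
    controller_type='differential_evolution',  # new in this release; also the default GP trainer
    num_params=3,
    min_boundary=[-1.0] * 3,
    max_boundary=[1.0] * 3,
    max_num_runs=200,   # halting condition: stop after this many runs
    target_cost=1e-3,   # halting condition: stop once the cost drops this low
)
controller.optimize()
print('Best cost:', controller.best_cost)
print('Best parameters:', controller.best_params)
```

The shell-based interface drives the same machinery from a plain-text configuration file of keyword = value pairs (conventionally named exp_config.txt, if I recall the documentation correctly), so the keywords above carry over directly.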
M-LOOP 2.0.3
Added support for Python 2.
M-LOOP 2.0
M-LOOP has been significantly upgraded.
The new version allows you to automatically optimize your quantum experiments or numerical simulations faster and more easily than before. New features include:
- Multithreaded execution with no downtime between runs.
- Visualizations of your optimization process and cost landscape.
- Documentation with easy-to-follow installation instructions.
Go to http://m-loop.readthedocs.io/ and follow the installation instructions to get the latest version.
Pre-release of M-LOOP
More complete documentation. Testing new documentation hosting.
Pre-release of M-LOOP
Preliminary release of M-LOOP 2, now with threading, automated testing and documentation.