
v1.2.0

@ashleve released this 19 Nov 12:32 · 198 commits to main since this release · abce7bc

List of changes:

  • Update template for compatibility with lightning v1.5 and pytorch v1.10
  • General documentation improvements
  • Move LICENSE to README.md
  • Add manual resetting of metrics at the end of every epoch, to prevent hard-to-spot calculation mistakes (see the model sketch after this list)
  • Add experiment mode to all experiment configs
  • Improve logging paths for experiment mode
  • Add MaxMetric to the model, for tracking the best-so-far validation accuracy
  • Add RichProgressBar to default callbacks for a nicely formatted progress bar
  • Get rid of the trick for preventing automatic hparam logging, since lightning now supports this via self.save_hyperparameters(logger=False)
  • Add self.save_hyperparameters() to datamodule since lightning now supports it
  • Deprecate Apex support since native pytorch mixed-precision is better
  • Deprecate bash script for conda setup since installation commands change too often to maintain it
  • Change trainer.terminate_on_nan debug option to trainer.detect_anomaly for compatibility with lightning v1.5
  • Specify model and datamodule during trainer.test(), for compatibility with lightning v1.5 (see the trainer sketch after this list)
  • Remove configs/trainer/all_params.yaml
  • Make hyperparameter optimization compatible with lightning v1.5
  • Specify that EarlyStopping patience is counted in validation epochs and not in training epochs
  • Add a new way of accessing datamodule attributes to the README.md
  • Make debug mode automatically set the level of all command-line loggers to DEBUG
  • Make debug mode automatically set the trainer config to debug.yaml
  • Add a generator seed to the data split in datamodule.setup(), to prevent test data from leaking into train data when no global seed is set (see the datamodule sketch after this list)
  • Move Dockerfile to dockerfiles branch
  • Modify configs/trainer/debug.yaml to enable some debug options
  • Remove the unused if config.get("debug"): check in extras
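
For illustration, here is a minimal LightningModule sketch, written against the lightning v1.5 / torchmetrics API of that time. The class name and toy network are illustrative, not the template's actual model; it shows three of the changes above: self.save_hyperparameters(logger=False), MaxMetric for best-so-far validation accuracy, and manual metric resetting at the end of every epoch:

```python
import torch
from pytorch_lightning import LightningModule
from torchmetrics import Accuracy, MaxMetric


class LitModel(LightningModule):
    def __init__(self, lr: float = 0.001):
        super().__init__()
        # lightning v1.5 can store hparams without sending them to the logger
        self.save_hyperparameters(logger=False)
        self.net = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))
        self.criterion = torch.nn.CrossEntropyLoss()
        self.val_acc = Accuracy()
        self.val_acc_best = MaxMetric()  # best validation accuracy seen so far

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = self.criterion(self.net(x), y)
        self.log("train/loss", loss)
        return loss

    def validation_step(self, batch, batch_idx):
        x, y = batch
        preds = torch.argmax(self.net(x), dim=1)
        self.val_acc.update(preds, y)

    def validation_epoch_end(self, outputs):
        acc = self.val_acc.compute()          # accuracy over the whole epoch
        self.val_acc_best.update(acc)         # keep the running maximum
        self.log("val/acc", acc, prog_bar=True)
        self.log("val/acc_best", self.val_acc_best.compute(), prog_bar=True)

    def on_epoch_end(self):
        # reset metrics manually so values cannot carry over between epochs
        self.val_acc.reset()

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)
```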
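Likewise, a minimal datamodule sketch showing self.save_hyperparameters() in a datamodule and a fixed generator seed for the split in setup(); the dataset, split sizes, and names here are illustrative assumptions:

```python
import torch
from pytorch_lightning import LightningDataModule
from torch.utils.data import DataLoader, random_split
from torchvision.datasets import MNIST
from torchvision.transforms import ToTensor


class LitDataModule(LightningDataModule):
    def __init__(self, data_dir: str = "data/", batch_size: int = 64):
        super().__init__()
        # datamodules can now store their init args as self.hparams too
        self.save_hyperparameters()
        self.data_train = None
        self.data_val = None
        self.data_test = None

    def setup(self, stage=None):
        dataset = MNIST(self.hparams.data_dir, train=True, download=True, transform=ToTensor())
        # the fixed generator seed keeps the split reproducible even when no global
        # seed is set, so test samples cannot end up in the train set between runs
        self.data_train, self.data_val, self.data_test = random_split(
            dataset, [50_000, 5_000, 5_000], generator=torch.Generator().manual_seed(42)
        )

    def train_dataloader(self):
        return DataLoader(self.data_train, batch_size=self.hparams.batch_size, shuffle=True)

    def val_dataloader(self):
        return DataLoader(self.data_val, batch_size=self.hparams.batch_size)

    def test_dataloader(self):
        return DataLoader(self.data_test, batch_size=self.hparams.batch_size)
```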
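Finally, a short sketch of the trainer-side changes, reusing the illustrative classes above: detect_anomaly in place of the deprecated terminate_on_nan, and model and datamodule passed explicitly to trainer.test():

```python
from pytorch_lightning import Trainer

model = LitModel()            # from the model sketch above
datamodule = LitDataModule()  # from the datamodule sketch above

# detect_anomaly replaces the terminate_on_nan flag deprecated in lightning v1.5
trainer = Trainer(max_epochs=3, detect_anomaly=True)
trainer.fit(model=model, datamodule=datamodule)

# pass the model and datamodule explicitly, as required for lightning v1.5
trainer.test(model=model, datamodule=datamodule)
```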

Special thanks for PRs to: @CharlesGaydon, @eungbean, @gscriva