Releases: linjing-lab/easy-pytorch

perming-1.9.3

23 Nov 15:38

Enhancement:

  • replace parallel_backend with parallel_config to improve the parallel-processing configuration, adding a prefer option to train_val.

Users can set backend='threading' together with prefer='threads' (or 'processes') when a run with backend='loky' fails, or fall back to v1.6.1. Users who don't need this fine-grained control over the backend configuration can download v1.9.2, which keeps all the improved traits and avoids the runtime cost of parallel_config.__init__.

download:

!pip install perming==1.9.3 # in jupyter
pip install perming==1.9.3 # in cmd

perming-1.9.2

15 Nov 08:51

Optimized Details:

  • stop returning outputs_val from self.model in _val_acc and move the validation-set prediction into self.criterion, reducing the values kept in CUDA memory.

New Training Outputs:

import perming
main = perming.Box(8, 29, (60,), batch_size=256, activation='relu', inplace_on=True, solver='sgd', learning_rate_init=0.01)
main.print_config()
main.data_loader(features, labels, random_seed=0)
main.train_val(num_epochs=1, interval=100, early_stop=True) # set n_jobs > 1, up to the number of processes
'''
Epoch [1/1], Step [100/3277], Training Loss: 2.5334, Validation Loss: 0.3205
Epoch [1/1], Step [200/3277], Training Loss: 1.7961, Validation Loss: 0.2379
Epoch [1/1], Step [300/3277], Training Loss: 1.3085, Validation Loss: 0.1575
Epoch [1/1], Step [400/3277], Training Loss: 0.8955, Validation Loss: 0.1160
Epoch [1/1], Step [500/3277], Training Loss: 0.7074, Validation Loss: 0.0901
Process stop at epoch [1/1] with patience 10 within tolerance 0.001
'''

For users who are not working with large datasets of over 1 million samples and a large batch_size, versions from v1.7.0 to v1.9.1 are also suitable.
download:

!pip install perming==1.9.2 # in jupyter
pip install perming==1.9.2 # in cmd

perming-1.9.1

06 Nov 05:11

Upgraded Details:

  • changed the criterion of common/Binarier from BCELoss to CrossEntropyLoss, making the annotated cases in tests executable.
  • dropped BCELoss from the allowed criteria, so users no longer need to manually apply torch.nn.Sigmoid() to outputs in the train_val module.

Binarier Configuration:

import perming
# main = perming.Box(23, 2, (50,), batch_size=8, activation='relu', inplace_on=True, solver='adam', learning_rate_init=0.01)
main = perming.Binarier(23, (50,), batch_size=8, activation='relu', solver='adam', learning_rate_init=0.01)

download:

!pip install perming==1.9.1 # in jupyter
pip install perming==1.9.1 # in cmd

perming-1.8.3

29 Oct 15:44

Improved Content:

  • corrected the model name in the README for the Multi-classification task and deleted the assertion raised when num_classes < 2.
  • added guidance on downloading previous versions, with and without O(1) early_stop support.

download:

!pip install perming==1.8.3 # in jupyter
pip install perming==1.8.3 # in cmd

perming-1.8.2

29 Oct 14:55

Upgraded Trait:

  • improved annotations for more model initialization attributes, such as self.input, self.num_classes, self.batch_size, self.lr, self.solver, self.lr_scheduler.
  • documented the similarities and differences between the stable versions, with suggested download information.

download:

!pip install perming==1.8.2 # in jupyter
pip install perming==1.8.2 # in cmd

perming-1.8.1

28 Oct 12:59

Improved Trait:

  • moved the assignment of self.val_container into self._set_container, reducing the memory burden in the train_val module by completing the validation-set assignment in local scope. Users will no longer encounter jupyter kernel crashes, though there were no bugs in the programming environment before v1.6.1.

Run trials and experiments on any task, including Multi-classification, Multi-outputs, Binary-classification, and Regression. For example, with n_jobs=-1 and early_stop=True, the maximum dataset size that CUDA memory can support depends on the user's programming environment.

download:

!pip install perming==1.8.1 # in jupyter
pip install perming==1.8.1 # in cmd

See tests in the main branch for more equivalent trials, such as the models from the common file.

perming-1.8.0

28 Oct 10:10

Supported Module:

  • added a set_freeze module for freezing assigned layers of a pretrained model according to the serial numbers of self.model

Example:

import perming
main = perming.Box(8, 29, (60,), batch_size=256, activation='relu', inplace_on=True, solver='sgd', learning_rate_init=0.01)
main.data_loader(features, labels, random_seed=0)
# first run train_val, test, and save on main to train the full network
main.load(False, dir='../models/bitcoin.ckpt')
main.set_freeze({0:False}) # freeze the first layer of `self.model`
main.train_val(num_epochs=1, interval=100, early_stop=True)
main.test()
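A minimal sketch of what freezing by serial number can look like in plain PyTorch; the Sequential model and the set_freeze_by_idx helper below are hypothetical stand-ins, not perming's actual implementation:

```python
import torch

# stand-in for perming's self.model (hypothetical layer layout)
model = torch.nn.Sequential(
    torch.nn.Linear(8, 60),
    torch.nn.ReLU(inplace=True),
    torch.nn.Linear(60, 29),
)

def set_freeze_by_idx(model, layers):
    # layers maps a serial number in the Sequential to a trainable flag;
    # requires_grad=False excludes those parameters from gradient updates
    for idx, trainable in layers.items():
        for param in model[idx].parameters():
            param.requires_grad = trainable

set_freeze_by_idx(model, {0: False})  # freeze the first layer
print(all(not p.requires_grad for p in model[0].parameters()))  # True
```

Frozen parameters keep their pretrained weights while the remaining layers continue to train.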

download:

!pip install perming==1.8.0 # in jupyter
pip install perming==1.8.0 # in cmd

perming-1.7.1

26 Oct 17:43

Improved Trait:

  • replaced [*iter(self.val_loader)] with [x for x in self.val_loader] to reduce the memory burden on the jupyter kernel when encountering big datasets, like Multi-classification.ipynb
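Both forms materialize the loader into the same list of batches; a small sketch with a hypothetical make_loader standing in for self.val_loader:

```python
def make_loader():
    # stand-in for self.val_loader: a generator yielding mini-batches
    return ([i, i + 1] for i in range(0, 6, 2))

unpacked = [*iter(make_loader())]          # form used before v1.7.1
comprehended = [x for x in make_loader()]  # form adopted in v1.7.1

print(unpacked == comprehended)  # True
```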

download:

!pip install perming==1.7.1 # in jupyter
pip install perming==1.7.1 # in cmd

perming-1.7.0

25 Oct 17:32

Upgraded Traits:

  • added Parallel and delayed from joblib to improve parallel processing over the previous versions, see the main branch at released_box/perming/L247.
  • disabled _set_container and replaced it with [*iter(self.val_loader)] to put Python built-in rules first.
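A minimal joblib-only sketch of the Parallel/delayed pattern, with a hypothetical batch_loss standing in for per-batch validation work (the batches and values are illustrative, not from perming):

```python
from joblib import Parallel, delayed

def batch_loss(batch):
    # stand-in for evaluating self.criterion on one validation batch
    return sum(batch)

batches = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

# delayed wraps the call; Parallel fans the batches out across workers
losses = Parallel(n_jobs=2, backend='loky')(
    delayed(batch_loss)(b) for b in batches
)
print(losses)  # [6, 15, 24]
```

Results come back in submission order, so accumulating them afterwards is deterministic.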

Examples and all core processes remain the same as in previous versions.

download:

!pip install perming==1.7.0 # in jupyter
pip install perming==1.7.0 # in cmd

perming-1.6.1

25 Oct 14:47

Fixed Bug:

  • corrected the backend name from joblib.parallel_backend in the __doc__ of _set_container and in the val_loss accumulation stage.

Example with setting the backend:

import perming
main = perming.Box(10, 3, (30,), batch_size=8, activation='relu', inplace_on=True, solver='sgd', criterion='MultiLabelSoftMarginLoss', learning_rate_init=0.01)
main.data_loader(X, y, random_seed=0)
main.train_val(num_epochs=60, interval=25, tolerance=1e-4, patience=10, backend='loky', early_stop=True)
main.test()

download:

!pip install perming==1.6.1 # in jupyter
pip install perming==1.6.1 # in cmd