
TextBrewer 0.2.0

@airaria released this 30 Jul 01:15

New Features

  • Now supports distributed data-parallel training with torch.nn.parallel.DistributedDataParallel! You can pass local_rank to the TrainingConfig to set up distributed training (a sketch follows below this list). Detailed usage of DistributedDataParallel can be found in the PyTorch docs.

  • We also added an example (a Chinese NER task) demonstrating how to use TextBrewer with distributed data-parallel training.