GreekBERT Distillation

This repository contains code for my diploma thesis: "Knowledge Distillation into BiLSTM Networks for the Compression of the Greek-BERT Model".
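
For context, a standard setup for this kind of distillation pairs a soft-target loss against the teacher's temperature-scaled logits with the usual cross-entropy on the gold labels (Hinton et al., 2015). The sketch below is illustrative only and is not the thesis code; the class and all names (`BiLSTMStudent`, `distillation_loss`, `T`, `alpha`) are assumptions:

```python
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMStudent(nn.Module):
    """Hypothetical student: embedding -> BiLSTM -> linear head."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        hidden, _ = self.lstm(self.embed(token_ids))
        # Mean-pool the BiLSTM states over time, then classify.
        return self.fc(hidden.mean(dim=1))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft-target term: KL divergence between the temperature-scaled
    # teacher and student distributions, scaled by T^2.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In this scheme the teacher logits would come from a fine-tuned Greek-BERT (e.g. loaded via the transformers library), typically computed once over the training set and cached.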

Requirements

  • click
  • numpy
  • pandas
  • PyTorch
  • scikit-learn
  • spacy (with the el_core_news_sm Greek model; see the setup sketch below)
  • tqdm
  • transformers
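
A minimal environment setup might look like the following; the package names are the PyPI equivalents of the list above, and the spaCy Greek model is downloaded separately (assuming pip and Python 3):

```sh
pip install click numpy pandas torch scikit-learn spacy tqdm transformers
python -m spacy download el_core_news_sm
```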

License and Citation

The code is licensed under the MIT License.

If you find the code useful in your work, please cite the following publication:

Goulas, A., Malamas, N., Symeonidis, A.L. (2022). A Methodology for Enabling NLP Capabilities on Edge and Low-Resource Devices. In Natural Language Processing and Information Systems. NLDB 2022. Lecture Notes in Computer Science, vol 13286 (pp. 197–208). Springer, Cham.

BibTeX:

@inproceedings{goulas2022methodology,
    author    = {Goulas, Andreas and Malamas, Nikolaos and Symeonidis, Andreas L.},
    title     = {A Methodology for Enabling NLP Capabilities on Edge and Low-Resource Devices},
    booktitle = {Natural Language Processing and Information Systems},
    series    = {Lecture Notes in Computer Science},
    volume    = {13286},
    month     = jun,
    year      = {2022},
    pages     = {197--208},
    publisher = {Springer International Publishing},
    address   = {Cham}
}
