This repository contains code for my diploma thesis: "Knowledge Distillation into BiLSTM Networks for the Compression of the Greek-BERT Model".
The code is written in Python and depends on the following packages:

- click
- numpy
- pandas
- PyTorch
- scikit-learn
- spacy (el_core_news_sm)
- tqdm
- transformers
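Assuming a standard Python 3 environment, the dependencies above can be installed with pip; the exact package versions are not pinned here, so this is only a sketch:

```shell
# Install the required Python packages (versions unpinned)
pip install click numpy pandas torch scikit-learn spacy tqdm transformers

# Download the Greek spaCy model used by the code
python -m spacy download el_core_news_sm
```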
The code is licensed under the MIT License.
If you find the code useful in your work, please cite the following publication:
Goulas, A., Malamas, N., Symeonidis, A.L. (2022). A Methodology for Enabling NLP Capabilities on Edge and Low-Resource Devices. In: Natural Language Processing and Information Systems (NLDB 2022), Lecture Notes in Computer Science, vol. 13286, pp. 197–208. Springer, Cham.
BibTeX:

```bibtex
@inproceedings{goulas2022methodology,
  author    = {Goulas, Andreas and Malamas, Nikolaos and Symeonidis, Andreas L.},
  title     = {A Methodology for Enabling NLP Capabilities on Edge and Low-Resource Devices},
  booktitle = {Natural Language Processing and Information Systems},
  month     = {06},
  year      = {2022},
  pages     = {197--208},
  publisher = {Springer International Publishing},
  address   = {Cham}
}
```