
Could you share pretrained model weights? #2

Open
vhargitai opened this issue Nov 27, 2019 · 1 comment

Comments

@vhargitai
Hi @Smerity, could you share the pretrained SHA-RNN weights from your WikiText-103 experiments? I'd like to run some fine-tuning experiments with them for text classification. (It would be a huge help, as I only have access to an oven for a limited time. :)) Thank you!


PiotrCzapla commented Dec 5, 2019

Steven, it would be useful to have an LSTM-based network shown on the WikiText-103 benchmark on sotabench. If you don't have the pre-trained weights anymore, I can run the training, but it would be good to host them here in this repo.
https://sotabench.com/benchmarks/language-modelling-on-wikitext-103
