Support incremental downloads of large files somehow. #18
Comments
Sounds like a nice thing to have (I would also enjoy it). I have no clue how hard that would be, but it seems more like an issue for the upstream hf-hub crate. @Narsil, what do you think?
Definitely for upstream. Resumable downloads would be harder, as they would require using temporary files for parts and validating parts during resume (something
Would you be willing to write a PR for it, @crlf0710?
I'd be glad to if there's some mentoring... I haven't gotten myself familiar with the code base yet.
Transferred issue. The code here is relatively simple; everything should be in
@crlf0710 If you have not already begun working on this and would not mind, I should be able to create a PR this weekend to introduce the same retry capabilities to sync that the tokio version already has. @Narsil Regarding resumable downloads, could we create our own version of .incomplete (perhaps in the tmp folder)? We could use the etag for identification and create two files for each download: the partial file itself, and a meta file containing the chunk size and the successfully downloaded chunks (as a 'checklist' to keep it async). We could then assume that all successfully downloaded chunks are correct and only fetch the missing ones.
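For illustration only, here is a rough sketch of what such a meta file could look like, assuming serde/serde_json for serialization. The names (`DownloadMeta`, the field layout, the `<etag>.meta` naming) are hypothetical, not an agreed design for the crate:

```rust
use serde::{Deserialize, Serialize};
use std::collections::BTreeSet;

/// Sidecar metadata stored next to the partial file, e.g. `<etag>.meta`.
/// The chunk "checklist" records which chunks finished, so out-of-order
/// (async) completion is fine and only missing chunks are re-fetched on resume.
#[derive(Serialize, Deserialize)]
struct DownloadMeta {
    etag: String,
    total_size: u64,
    chunk_size: u64,
    completed_chunks: BTreeSet<u64>,
}

impl DownloadMeta {
    /// Chunk indices that still need downloading after a resume.
    fn missing_chunks(&self) -> Vec<u64> {
        let n_chunks = (self.total_size + self.chunk_size - 1) / self.chunk_size;
        (0..n_chunks)
            .filter(|i| !self.completed_chunks.contains(i))
            .collect()
    }

    /// Inclusive byte range covered by a chunk, usable as an HTTP `Range` value.
    fn byte_range(&self, chunk: u64) -> (u64, u64) {
        let start = chunk * self.chunk_size;
        let end = (start + self.chunk_size).min(self.total_size) - 1;
        (start, end)
    }
}
```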
For larger models (maybe >= 3.0 GB), there's a higher chance on slower networks that the download gets aborted by a network failure. I wonder if it's possible to make downloads incremental, so a restart from the beginning is not needed?
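For context, a minimal sketch of how resuming could work at the HTTP level, assuming the server honors `Range` requests and that the partial file on disk is a valid prefix of the remote content (e.g. verified via etag). The function name and paths are placeholders, and this is not the crate's actual implementation:

```rust
use std::fs::OpenOptions;
use std::io::{self, Seek, SeekFrom};

/// Resume a download by requesting only the bytes we do not have yet.
fn resume_download(url: &str, partial_path: &str) -> io::Result<()> {
    let mut file = OpenOptions::new()
        .create(true)
        .append(true)
        .open(partial_path)?;
    let already = file.seek(SeekFrom::End(0))?;

    // Ask the server for the remaining bytes only.
    let resp = ureq::get(url)
        .set("Range", &format!("bytes={already}-"))
        .call()
        .map_err(|e| io::Error::new(io::ErrorKind::Other, e))?;

    // 206 Partial Content means the server honored the range;
    // a plain 200 means it ignored it and we would have to start over.
    if resp.status() == 206 {
        io::copy(&mut resp.into_reader(), &mut file)?;
    }
    Ok(())
}
```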