
More informative download error messages #1373

Merged
merged 1 commit into from
Apr 30, 2024

Conversation

Collaborator

@rasbt rasbt commented Apr 29, 2024

This adds more informative error messages for the various access token issues to address #1363.

Also, I tried to reduce the stack trace, @lantiga. Instead of

Traceback (most recent call last):
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
    response.raise_for_status()
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/api/models/meta-llama/llama-2-7b

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/teamspace/studios/this_studio/litgpt/litgpt/scripts/download.py", line 125, in gated_repo_catcher
    yield
  File "/teamspace/studios/this_studio/litgpt/litgpt/scripts/download.py", line 115, in find_weight_files
    info = repo_info(repo_id, token=access_token)
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 119, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/huggingface_hub/hf_api.py", line 2418, in repo_info
    return method(
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 119, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/huggingface_hub/hf_api.py", line 2228, in model_info
    hf_raise_for_status(r)
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 321, in hf_raise_for_status
    raise GatedRepoError(message, response) from e
huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-663001a8-2fe0554741634ef222a41c49;b05217eb-e0a2-4270-9e65-d894b9c30f92)

Cannot access gated repo for url https://huggingface.co/api/models/meta-llama/llama-2-7b.
Access to model meta-llama/Llama-2-7b is restricted. You must be authenticated to access it.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/zeus/miniconda3/envs/cloudspace/bin/litgpt", line 8, in <module>
    sys.exit(main())
  File "/teamspace/studios/this_studio/litgpt/litgpt/__main__.py", line 143, in main
    fn(**kwargs)
  File "/teamspace/studios/this_studio/litgpt/litgpt/scripts/download.py", line 54, in download_from_hub
    bins, safetensors = find_weight_files(repo_id, access_token)
  File "/teamspace/studios/this_studio/litgpt/litgpt/scripts/download.py", line 114, in find_weight_files
    with gated_repo_catcher(repo_id, access_token):
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/teamspace/studios/this_studio/litgpt/litgpt/scripts/download.py", line 141, in gated_repo_catcher
    raise ValueError(
ValueError: https://huggingface.co/meta-llama/llama-2-7b requires authentication. The access token provided by `HF_TOKEN=your_token` environment variable or `--access_token=your_token` may not have sufficient access rights. Please visit https://huggingface.co/meta-llama/llama-2-7b for more information.

It's now just

Traceback (most recent call last):
  File "/home/zeus/miniconda3/envs/cloudspace/bin/litgpt", line 8, in <module>
    sys.exit(main())
  File "/teamspace/studios/this_studio/litgpt/litgpt/__main__.py", line 143, in main
    fn(**kwargs)
  File "/teamspace/studios/this_studio/litgpt/litgpt/scripts/download.py", line 54, in download_from_hub
    bins, safetensors = find_weight_files(repo_id, access_token)
  File "/teamspace/studios/this_studio/litgpt/litgpt/scripts/download.py", line 114, in find_weight_files
    with gated_repo_catcher(repo_id, access_token):
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/teamspace/studios/this_studio/litgpt/litgpt/scripts/download.py", line 141, in gated_repo_catcher
    raise ValueError(
ValueError: https://huggingface.co/meta-llama/llama-2-7b requires authentication. The access token provided by `HF_TOKEN=your_token` environment variable or `--access_token=your_token` may not have sufficient access rights. Please visit https://huggingface.co/meta-llama/llama-2-7b for more information.
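The shorter trace is consistent with suppressing Python's exception chaining via `raise ... from None` inside the context manager. A minimal sketch of that pattern (hypothetical, simplified names; the real `gated_repo_catcher` in `litgpt/scripts/download.py` handles several token scenarios):

```python
from contextlib import contextmanager


class GatedRepoError(Exception):
    """Stand-in for huggingface_hub.utils.GatedRepoError (hypothetical)."""


@contextmanager
def gated_repo_catcher(repo_id: str):
    try:
        yield
    except GatedRepoError:
        # `from None` suppresses the chained huggingface_hub/requests
        # traceback, so the user only sees this concise ValueError.
        raise ValueError(
            f"https://huggingface.co/{repo_id} requires authentication."
        ) from None


try:
    with gated_repo_catcher("meta-llama/llama-2-7b"):
        raise GatedRepoError("401 Client Error")
except ValueError as e:
    print(e)
```

Without `from None`, the interpreter would print the original `GatedRepoError` traceback followed by "The above exception was the direct cause of the following exception:".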

If we don't want to raise a ValueError at all, we could print the message and call quit(), for example. But I remember that @carmocca had some arguments against that, which is why we prefer raising errors.

Fixes #1363

Contributor

@carmocca carmocca left a comment


LGTM.

Exceptions are very useful for those who call our code outside of the CLI, as they can catch and reason about them. They are also useful when there's an actual underlying issue (in our code or the code of our dependencies). But if it's preferable to hide the exceptions for product reasons, then let's do it.

@rasbt rasbt merged commit 4744512 into main Apr 30, 2024
9 checks passed
@rasbt rasbt deleted the stacktrace branch April 30, 2024 15:38

Successfully merging this pull request may close these issues.

litgpt download doesn't work