
Commit 4531aa2
Apply suggestions from code review
Co-authored-by: Jingya HUANG <[email protected]>
dacorvo and JingyaHuang authored Jan 31, 2024
Parent: 322524b, commit: 4531aa2
Showing 2 changed files with 2 additions and 2 deletions.
optimum/neuron/generation/token_selector.py (1 addition, 1 deletion)

@@ -71,7 +71,7 @@ def create(
     The model provides the internal helpers allowing to select the logits processors and stopping criterias.
 max_seq_length (`int`):
     The maximum number of input + generated tokens for this model. It depends on the model compilation parameters.
-stopping_criteria (`Optional[transformers.generation.StoppingCriteriaList]):
+stopping_criteria (`Optional[transformers.generation.StoppingCriteriaList], defaults to `None`):
     Custom stopping criteria that complement the default stopping criteria built from arguments and a
     generation config.
 Return:
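Note (not part of the commit): the `stopping_criteria` argument documented in this hunk expects a `transformers.StoppingCriteriaList`, i.e. a list of callables that decide whether generation should stop. Below is a minimal sketch of a custom criterion, assuming only the standard `transformers.StoppingCriteria` interface; the class name `StopOnTokenId` and the chosen token id are illustrative.

import torch
from transformers import StoppingCriteria, StoppingCriteriaList


class StopOnTokenId(StoppingCriteria):
    """Stop generation as soon as the last generated token matches a given id."""

    def __init__(self, stop_token_id: int):
        self.stop_token_id = stop_token_id

    def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs) -> bool:
        # input_ids has shape (batch_size, sequence_length); check the last generated token.
        return bool(input_ids[0, -1].item() == self.stop_token_id)


# Several criteria can be combined; generation stops when any of them fires,
# on top of the default criteria built from the generation config.
custom_criteria = StoppingCriteriaList([StopOnTokenId(stop_token_id=2)])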
optimum/neuron/modeling.py (1 addition, 1 deletion)

@@ -749,7 +749,7 @@ def generate(
     priority: 1) from the `generation_config.json` model file, if it exists; 2) from the model
     configuration. Please note that unspecified parameters will inherit [`~transformers.generation.GenerationConfig`]'s
     default values, whose documentation should be checked to parameterize generation.
-stopping_criteria (`Optional[transformers.generation.StoppingCriteriaList]):
+stopping_criteria (`Optional[transformers.generation.StoppingCriteriaList], defaults to `None`):
     Custom stopping criteria that complement the default stopping criteria built from arguments and a
     generation config.
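Note (not part of the commit): a minimal usage sketch showing how such a list could be passed through the `stopping_criteria` argument of `generate` documented above. The model id "gpt2" and the compilation arguments (`export`, `batch_size`, `sequence_length`) are illustrative assumptions; `MaxTimeCriteria` is a stock transformers criterion. When `stopping_criteria` is left at its default `None`, only the criteria built from the generation arguments and the generation config apply.

from transformers import AutoTokenizer, MaxTimeCriteria, StoppingCriteriaList
from optimum.neuron import NeuronModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
# Illustrative compilation parameters; per the docstring above, they determine
# the maximum number of input + generated tokens the compiled model supports.
model = NeuronModelForCausalLM.from_pretrained(
    "gpt2", export=True, batch_size=1, sequence_length=128
)

inputs = tokenizer("Hello, my name is", return_tensors="pt")
# Stop after at most 5 seconds of generation, in addition to the default criteria.
outputs = model.generate(
    **inputs,
    stopping_criteria=StoppingCriteriaList([MaxTimeCriteria(max_time=5.0)]),
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))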
