Using the real model name instead of hard-coded "model" (#231)
* Using the real model name instead of "model"

* Fix CLI multiple line backslash

* Switch NeuronDecoderConfig from ExportConfig to NeuronConfig

* Revert to ExportConfig
davidshtian authored Sep 15, 2023
1 parent 0cb9880 commit b538488
Showing 2 changed files with 4 additions and 3 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -98,7 +98,7 @@ trainer = Trainer(
 You can compile and export your 🤗 Transformers models to a serialized format before inference on Neuron devices:
 ```bash
-optimum-cli export neuron
+optimum-cli export neuron \
   --model distilbert-base-uncased-finetuned-sst-2-english \
   --batch_size 1 \
   --sequence_length 32 \
5 changes: 3 additions & 2 deletions optimum/exporters/neuron/__main__.py
@@ -205,8 +205,9 @@ def main_export(
         neuron_config = neuron_config_constructor(model.config, dynamic_batch_size=dynamic_batch_size, **input_shapes)
         if atol is None:
             atol = neuron_config.ATOL_FOR_VALIDATION
-        output_model_names = {"model": "model.neuron"}
-        models_and_neuron_configs = {"model": (model, neuron_config)}
+        model_name = model.name_or_path.split("/")[-1]
+        output_model_names = {model_name: "model.neuron"}
+        models_and_neuron_configs = {model_name: (model, neuron_config)}
         maybe_save_preprocessors(model, output.parent)
 
     if is_stable_diffusion:
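For readers skimming the diff, here is a minimal, self-contained sketch of what the new key derivation does. The checkpoint id below is an illustrative example, not something taken from this commit; in the exporter itself the value comes from `model.name_or_path` on the loaded 🤗 Transformers model.

```python
# Illustrative sketch: derive the export key the same way the new code does,
# by taking the last path segment of the model's name_or_path.
name_or_path = "distilbert-base-uncased-finetuned-sst-2-english"  # example Hub id

# A Hub id like "org/name" or a local path both reduce to their last segment.
model_name = name_or_path.split("/")[-1]

# Before this commit the key was the hard-coded string "model";
# now it reflects the actual checkpoint name.
output_model_names = {model_name: "model.neuron"}
print(output_model_names)
# {'distilbert-base-uncased-finetuned-sst-2-english': 'model.neuron'}
```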
