
Use convert_attribute=True by default for onnx.save #1738

Merged (2 commits) on Mar 1, 2024

Conversation

fxmarty (Contributor) commented Mar 1, 2024

As per title, see https://github.com/onnx/onnx/blob/b48d763a2d22574540f51b6bc5ce1a537fb65024/onnx/__init__.py#L310-L312

Having this set to False may raise "ValueError: The proto size is larger than the 2 GB limit. Please use save_as_external_data to save tensors separately from the model file." despite save_as_external_data=True.

Related: onnx/onnx#5949

We did not hit errors previously because weights are normally initializers, but Brevitas uses (not sure why) Constant nodes for weights cc @Giuseppe5

Giuseppe5 commented Mar 1, 2024

I suspect the reason we have constants is that the integer value of the weights is computed at runtime, but the original weights are never "replaced" with their integer counterparts.

As far as ONNX export is concerned, those weights are constant values that are not part of the original model and have no relation to the original weights.

In any case, thanks for fixing this!

fxmarty (Contributor, Author) commented Mar 1, 2024

I am not sure how torchscript -> onnx decides what is Initializer and what is Constant, to be honest.

fxmarty (Contributor, Author) commented Mar 1, 2024

Getting:

  File "/home/felix/miniconda3/envs/fx/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/felix/miniconda3/envs/fx/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/felix/optimum/optimum/exporters/onnx/__main__.py", line 412, in <module>
    main()
  File "/home/felix/optimum/optimum/exporters/onnx/__main__.py", line 388, in main
    main_export(
  File "/home/felix/optimum/optimum/exporters/onnx/__main__.py", line 351, in main_export
    onnx_export_from_model(
  File "/home/felix/optimum/optimum/exporters/onnx/convert.py", line 1157, in onnx_export_from_model
    _, onnx_outputs = export_models(
  File "/home/felix/optimum/optimum/exporters/onnx/convert.py", line 768, in export_models
    export(
  File "/home/felix/optimum/optimum/exporters/onnx/convert.py", line 902, in export
    config.fix_dynamic_axes(output, device=device, input_shapes=input_shapes, dtype=dtype)
  File "/home/felix/optimum/optimum/exporters/onnx/base.py", line 307, in fix_dynamic_axes
    session = InferenceSession(model_path.as_posix(), providers=providers, sess_options=session_options)
  File "/home/felix/miniconda3/envs/fx/lib/python3.9/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/home/felix/miniconda3/envs/fx/lib/python3.9/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 472, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from /tmp/tmpo6_v2ag3/encoder_model.onnx failed:Node (/Unsqueeze_1) Op (Unsqueeze) [ShapeInferenceError] Cannot parse data from external tensors. Please load external data into raw data for tensor: /Constant_2_output_0

fxmarty (Contributor, Author) commented Mar 1, 2024

Merging as tests pass.

fxmarty merged commit ebb63f8 into main on Mar 1, 2024 (38 of 47 checks passed).
fxmarty deleted the onnx-convert_attribute branch on March 1, 2024 at 12:49.
young-developer pushed a commit to young-developer/optimum that referenced this pull request May 10, 2024
* use convert_attribute=True by default for onnx.save

* leave some constants in the model.onnx