littlestone

Reputation: 11

How to convert onnx with onnx.data to openvino IR format

I am using mo to convert an ONNX model to OpenVINO IR format, but when the model consists of a model.onnx plus a model.onnx.data external-data file, the conversion reports an error.

mo --input_model G:\convert_model\onnx-fp16\text_encoder\model.onnx --input_shape [1,77]

[ INFO ] MO command line tool is considered as the legacy conversion API as of OpenVINO 2023.2 release.
In 2025.0 MO command line tool and openvino.tools.mo.convert_model() will be removed. Please use OpenVINO Model Converter (OVC) or openvino.convert_model(). OVC represents a lightweight alternative of MO and provides simplified model conversion API.
Find more information about transition from MO to OVC at https://docs.openvino.ai/2023.2/openvino_docs_OV_Converter_UG_prepare_model_convert_model_MO_OVC_transition.html
[ ERROR ]  -------------------------------------------------
[ ERROR ]  ----------------- INTERNAL ERROR ----------------
[ ERROR ]  Unexpected exception happened.
[ ERROR ]  Please contact Model Optimizer developers and forward the following information:
[ ERROR ]  Check 'onnx_node.get_outputs_size() <= outputs_size' failed at src/frontends/onnx/frontend/src/core/graph.cpp:400:
FrontEnd API failed with GeneralFailure:
Expected output number of SkipLayerNormalization node is 4 while the implementation provides 1 outputs

[ ERROR ]  Traceback (most recent call last):
  File "C:\Users\stone\.conda\envs\dragdiff\lib\site-packages\openvino\tools\mo\convert_impl.py", line 892, in _convert
    ov_model, legacy_path = driver(argv, {"conversion_parameters": non_default_params})
  File "C:\Users\stone\.conda\envs\dragdiff\lib\site-packages\openvino\tools\mo\convert_impl.py", line 552, in driver
    graph, ngraph_function = prepare_ir(argv)
  File "C:\Users\stone\.conda\envs\dragdiff\lib\site-packages\openvino\tools\mo\convert_impl.py", line 406, in prepare_ir
    ngraph_function = moc_pipeline(argv, moc_front_end)
  File "C:\Users\stone\.conda\envs\dragdiff\lib\site-packages\openvino\tools\mo\moc_frontend\pipeline.py", line 285, in moc_pipeline
    ov_model = moc_front_end.convert(input_model)
  File "C:\Users\stone\.conda\envs\dragdiff\lib\site-packages\openvino\frontend\frontend.py", line 18, in convert
    converted_model = super().convert(model)
openvino._pyopenvino.GeneralFailure: Check 'onnx_node.get_outputs_size() <= outputs_size' failed at src/frontends/onnx/frontend/src/core/graph.cpp:400:
FrontEnd API failed with GeneralFailure:
Expected output number of SkipLayerNormalization node is 4 while the implementation provides 1 outputs
[ ERROR ]  ---------------- END OF BUG REPORT --------------
[ ERROR ]  -------------------------------------------------

It works on an ONNX model without an onnx.data file, so I think something goes wrong when the ONNX model is converted to FP16.

mo --input_model G:\convert_model\onnx\text_encoder\model.onnx --input_shape [1,77]

[ INFO ] MO command line tool is considered as the legacy conversion API as of OpenVINO 2023.2 release.
In 2025.0 MO command line tool and openvino.tools.mo.convert_model() will be removed. Please use OpenVINO Model Converter (OVC) or openvino.convert_model(). OVC represents a lightweight alternative of MO and provides simplified model conversion API.
Find more information about transition from MO to OVC at https://docs.openvino.ai/2023.2/openvino_docs_OV_Converter_UG_prepare_model_convert_model_MO_OVC_transition.html
[ INFO ] Generated IR will be compressed to FP16. If you get lower accuracy, please consider disabling compression explicitly by adding argument --compress_to_fp16=False.
Find more information about compression to FP16 at https://docs.openvino.ai/2023.0/openvino_docs_MO_DG_FP16_Compression.html
[ SUCCESS ] Generated IR version 11 model.
[ SUCCESS ] XML file: G:\convert_model\model.xml
[ SUCCESS ] BIN file: G:\convert_model\model.bin

But the unet is over 2 GB, so it has to be stored as model.onnx plus model.onnx.data (ONNX keeps weights above the 2 GB protobuf limit in an external data file). Thanks for help.

Upvotes: 0

Views: 94

Answers (0)