reza

Reputation: 63

OpenCV with OpenVINO Backend: Problem with Dynamic Batch Size

I am using OpenCV (version 4.10.0) compiled with the OpenVINO (2023.0.1) backend to load and run a deep learning model. I successfully converted a model from the Open Model Zoo to the OpenVINO IR format using omz_downloader and ovc. The conversion worked fine, but I'm hitting an issue when importing the model into OpenCV for inference.

Problem:

The model is converted with a dynamic batch size ([-1, 3, 112, 112]). When I try to load this model in OpenCV using the cv::dnn::readNetFromModelOptimizer() function, I get an exception in this part of the OpenCV source code:

NetImplOpenVINO::createNetworkFromModelOptimizer(std::shared_ptr<ov::Model>& ieNet)
{
    // ...
    for (auto& it : ieNet->get_parameters())
    {
        inputsNames.push_back(it->get_friendly_name());
        std::vector<size_t> dims = it->get_shape();  // Exception occurs here
        inp_shapes.push_back(std::vector<int>(dims.begin(), dims.end()));
    }
    // ...
}

The exception occurs when calling it->get_shape() in the OpenCV code, likely because the model has a dynamic input shape: get_shape() can only return a fully static shape, so it throws when any dimension is dynamic.

Question: How can I handle models with dynamic batch sizes in OpenCV's DNN module when using the OpenVINO backend? Is there a workaround for loading models with dynamic input shapes in OpenCV, or should I manage dynamic batching directly with OpenVINO's Inference Engine?

Environment:

OpenCV 4.10.0
OpenVINO 2023.1
Windows 11
Language: C++

Static Batch Size: I converted the model with a static batch size ([1, 3, 112, 112]), and it works fine. However, I need to handle dynamic batch sizes for my application.

Backend Setup: I am setting the backend to OpenVINO with:

`net.setPreferableBackend(cv::dnn::DNN_BACKEND_INFERENCE_ENGINE);`
`net.setPreferableTarget(cv::dnn::DNN_TARGET_CPU);`

Any help or insights on handling dynamic batch sizes in OpenCV with OpenVINO would be appreciated!

Upvotes: 0

Views: 68

Answers (0)
