fautei

Reputation: 1

OpenVINO runtime weight change

I'm trying to reimplement the DaSiamRPN tracker from OpenCV, but using OpenVINO for inference. In the init method, some layer parameters are supposed to be replaced by the tensors produced by the r1 and cls1 heads:

    siamRPN.setInput(blob);
    cv::Mat out1;
    siamRPN.forward(out1, "63");

    siamKernelCL1.setInput(out1);
    siamKernelR1.setInput(out1);

    cv::Mat cls1 = siamKernelCL1.forward();
    cv::Mat r1 = siamKernelR1.forward();
    std::vector<int> r1_shape = { 20, 256, 4, 4 }, cls1_shape = { 10, 256, 4, 4 }; //same shape as conv layers 65 and 68

    siamRPN.setParam(siamRPN.getLayerId("65"), 0, r1.reshape(0, r1_shape));
    siamRPN.setParam(siamRPN.getLayerId("68"), 0, cls1.reshape(0, cls1_shape));

but I couldn't find an API or any other way to do this in OpenVINO. Has anyone faced such a problem?

What I'm trying to do:

I suppose the weights are stored in these two nodes:

    <layer id="31" name="new_layer_2.weight" type="Const" version="opset1">
        <data element_type="f32" shape="10, 256, 4, 4" offset="17349120" size="163840"/>
        <rt_info>
            <attribute name="fused_names" version="0" value="new_layer_2.weight"/>
        </rt_info>
        <output>
            <port id="0" precision="FP32" names="new_layer_2.weight">
                <dim>10</dim>
                <dim>256</dim>
                <dim>4</dim>
                <dim>4</dim>
            </port>
        </output>
    </layer>
    <layer id="38" name="new_layer_1.weight" type="Const" version="opset1">
        <data element_type="f32" shape="20, 256, 4, 4" offset="19873280" size="327680"/>
        <rt_info>
            <attribute name="fused_names" version="0" value="new_layer_1.weight"/>
        </rt_info>
        <output>
            <port id="0" precision="FP32" names="new_layer_1.weight">
                <dim>20</dim>
                <dim>256</dim>
                <dim>4</dim>
                <dim>4</dim>
            </port>
        </output>
    </layer>

I can view these nodes in the model's ops:

    auto ops = model->get_ops();

but I have no idea how to change their weight data. Is there a way to change it at runtime?
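
For reference, here is a minimal sketch (not part of the original post) of how the Constant nodes can be listed with their friendly names and shapes, assuming model was obtained from ov::Core::read_model:

    #include <openvino/openvino.hpp>
    #include <iostream>

    // Print every Constant node with its friendly name and shape, so the two
    // weight nodes from the IR above can be identified programmatically.
    void list_const_nodes(const std::shared_ptr<ov::Model>& model)
    {
        for (const auto& op : model->get_ops()) {
            if (auto c = ov::as_type_ptr<ov::op::v0::Constant>(op))
                std::cout << c->get_friendly_name() << " " << c->get_shape() << "\n";
        }
    }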

Upvotes: 0

Views: 317

Answers (2)

fautei

Reputation: 1

I have found this solution:

    // Build new Constant nodes from the cv::Mat data produced by the two heads.
    // ops[111] and ops[127] are the Const weight nodes in this particular model.
    auto ops = _siamRPN->get_ordered_ops();
    auto cls_weight = std::make_shared<ov::opset1::Constant>(ov::element::f32, ov::Shape{ 10, 256, 4, 4 }, cls1.data());
    cls_weight->set_friendly_name(ops[111]->get_friendly_name());        // keep the original operation name
    cls_weight->output(0).set_names(ops[111]->output(0).get_names());
    auto r1_weight = std::make_shared<ov::opset1::Constant>(ov::element::f32, ov::Shape{ 20, 256, 4, 4 }, r1.data());
    r1_weight->set_friendly_name(ops[127]->get_friendly_name());         // keep the original operation name
    r1_weight->output(0).set_names(ops[127]->output(0).get_names());

    // Swap the old Const nodes for the new ones, revalidate and recompile.
    _siamRPN->replace_node(ops[111], cls_weight);
    _siamRPN->replace_node(ops[127], r1_weight);
    _siamRPN->validate_nodes_and_infer_types();

    compiled_siamRPN = std::make_shared<ov::CompiledModel>(core->compile_model(_siamRPN, "CPU"));

but I think it's not the best solution. Is there a cleaner and faster way to do this?
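
One possible cleanup (my own sketch, not part of the original answer) is to look the Const nodes up by friendly name instead of the hard-coded indices 111 and 127; the names new_layer_2.weight and new_layer_1.weight are taken from the IR snippet in the question and may differ for another export of the model:

    #include <openvino/openvino.hpp>
    #include <openvino/core/graph_util.hpp>

    // Find an operation in the model by its friendly name, or return nullptr.
    static std::shared_ptr<ov::Node> find_by_friendly_name(
        const std::shared_ptr<ov::Model>& model, const std::string& name)
    {
        for (const auto& op : model->get_ops())
            if (op->get_friendly_name() == name)
                return op;
        return nullptr;
    }

    // Usage, assuming cls1 holds contiguous float data of shape 10x256x4x4:
    //   auto old_cls = find_by_friendly_name(_siamRPN, "new_layer_2.weight");
    //   auto new_cls = std::make_shared<ov::opset1::Constant>(
    //       ov::element::f32, ov::Shape{ 10, 256, 4, 4 }, cls1.data());
    //   new_cls->set_friendly_name(old_cls->get_friendly_name());
    //   new_cls->output(0).set_names(old_cls->output(0).get_names());
    //   ov::replace_node(old_cls, new_cls);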

Upvotes: 0

Zul_Intel

Reputation: 66

You can refer to the inference pipeline for inferring a model with OpenVINO Runtime; it shows the steps you need to perform in your application code.
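
For completeness, a minimal sketch of that pipeline; the model path, device name and input handling below are placeholders rather than code from the documentation:

    #include <openvino/openvino.hpp>

    int main() {
        ov::Core core;
        // 1. Read the model (IR / ONNX)
        std::shared_ptr<ov::Model> model = core.read_model("model.xml");
        // 2. Compile it for a target device
        ov::CompiledModel compiled = core.compile_model(model, "CPU");
        // 3. Create an inference request
        ov::InferRequest request = compiled.create_infer_request();
        // 4. Fill the input tensor with preprocessed data
        ov::Tensor input = request.get_input_tensor();
        // ... copy image data into input.data<float>() ...
        // 5. Run inference and read the output
        request.infer();
        ov::Tensor output = request.get_output_tensor();
        return 0;
    }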

To read multiple networks in an application, you may refer to the Pedestrian Tracker C++ Demo.
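
A small sketch of the idea with placeholder file names, using a single ov::Core instance for both networks:

    #include <openvino/openvino.hpp>

    void load_networks() {
        ov::Core core;
        // One ov::Core instance can read and compile several networks;
        // the file names here are placeholders, not from the demo.
        ov::CompiledModel siam_rpn  = core.compile_model("siam_rpn.xml", "CPU");
        ov::CompiledModel kernel_r1 = core.compile_model("kernel_r1.xml", "CPU");
    }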

Upvotes: 0
