Hariharan

How can I configure OpenVINO to automatically use GPU, CPU, or NPU for model inference?

1) I am working on an AI/ML project on Windows with an Intel processor. My model runs on OpenVINO. How can I load the model so that it automatically detects the available hardware (CPU, GPU, or NPU) and selects the appropriate device by default?

2) Are any dependencies needed, such as drivers, since it is a Windows system?
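A minimal sketch of what automatic device selection can look like with OpenVINO's `AUTO` plugin, which picks the best available device and falls back to CPU on its own. The model path `"model.xml"` here is only a placeholder, not a file from the question:

```python
# Sketch: compile a model on the best available device via OpenVINO's
# "AUTO" virtual device (prefers NPU/GPU when present, falls back to CPU).
def compile_on_best_device(model_path: str):
    import openvino as ov  # requires: pip install openvino

    core = ov.Core()
    # Lists devices OpenVINO can see, e.g. ['CPU', 'GPU', 'NPU'];
    # GPU/NPU only appear if the matching Windows drivers are installed.
    print("Available devices:", core.available_devices)

    # "AUTO" handles device detection and selection internally,
    # so no manual CPU/GPU/NPU branching is needed.
    return core.compile_model(model_path, "AUTO")


# Example usage (placeholder path):
# compiled_model = compile_on_best_device("model.xml")
```

On Windows, the CPU plugin works out of the box, while GPU and NPU inference additionally require the Intel graphics driver and Intel NPU driver, respectively.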

I tried code suggested by ChatGPT; it detects the hardware, but the model does not load.
