deep learning - How can I configure OpenVINO to automatically use GPU, CPU, or NPU for model inference? - Stack Overflow

admin, 2025-02-19

1) I am working on an AI/ML project on a Windows system with an Intel processor. My model runs on OpenVINO. How can I load the model so that it automatically detects the available hardware (CPU, GPU, or NPU) and selects the appropriate device by default?

2) Are any dependencies needed, such as drivers, given that it is a Windows system?

I tried code suggested by ChatGPT; it detects the hardware, but the model does not load.
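A minimal sketch of what automatic device selection can look like. OpenVINO's own answer to this is the `"AUTO"` virtual device, which picks the best available accelerator and falls back to CPU; the `pick_device` helper below is only an illustration of that priority order, and the model path `"model.xml"` is an assumption you would replace with your own file.

```python
def pick_device(available):
    """Pick NPU, then GPU, then CPU from a list of OpenVINO device names.

    This mirrors the priority OpenVINO's built-in "AUTO" plugin applies;
    it is shown here only to make the selection logic explicit.
    """
    for preferred in ("NPU", "GPU", "CPU"):
        # Device names may carry an index suffix, e.g. "GPU.0".
        if any(d.split(".")[0] == preferred for d in available):
            return preferred
    return "CPU"  # CPU is always present as a fallback


if __name__ == "__main__":
    try:
        from openvino import Core  # pip install openvino
    except ImportError:
        print("openvino is not installed; showing selection logic only")
    else:
        core = Core()
        print("Available devices:", core.available_devices)
        # Simplest option: let OpenVINO choose via the "AUTO" plugin.
        # "model.xml" is a placeholder path, not part of the question.
        compiled = core.compile_model("model.xml", device_name="AUTO")
```

On the dependency question: the `openvino` pip package is enough for CPU inference, but on Windows the GPU and NPU targets will only appear in `core.available_devices` if the corresponding Intel graphics and NPU drivers are installed.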
