[Detector Support]: OpenVINO and Intel integrated NPU support #13248
-
Bumping this one up...
-
Hi guys. I solved this by installing the Intel NPU drivers and updating OpenVINO to the latest version with pip, and I did get the NPU working once. Unfortunately I didn't take any screenshots to show how it works. The CPU is an Intel Core Ultra 5 125H. I installed all the drivers inside the Docker container but didn't figure out how to commit the changes and launch it as the HA add-on, so after a container restart I lost all progress. In my opinion it would be great to add NPU support: it is really easy, works out of the box, and the NPU can be really useful for some specific cases.
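Roughly, the steps looked like this (a sketch from memory, assuming the container is named frigate and pip is available in the image; note that docker commit only snapshots the container locally and does not feed back into the HA add-on workflow, which is the part I never figured out):

# open a shell inside the running Frigate container (container name is an assumption)
docker exec -it frigate /bin/bash

# inside the container: upgrade OpenVINO to the latest release from PyPI
pip3 install --upgrade openvino
# (the NPU driver .deb packages also need to be installed inside the container;
#  see the install sketch later in this thread)

# back on the host: snapshot the modified container as a local image so the
# changes survive a restart of that image
docker commit frigate frigate:npu-test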
-
I'm having the same issue. ASUS NUC 14 Pro with an Intel Core Ultra 7 155H processor.
frigate log
frigate config
docker compose yaml
I seem to have no problem accessing libopenvino_intel_npu_plugin.so from inside the container (via docker exec).
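A quick way to go one step further than checking that the file exists is to ask the dynamic linker whether the plugin's own dependencies resolve; a sketch, assuming the container is named frigate and ldd is available in the image:

# list the NPU plugin's shared-library dependencies from inside the container
docker exec frigate ldd /usr/local/lib/python3.9/dist-packages/openvino/libs/libopenvino_intel_npu_plugin.so
# a missing Level Zero loader shows up as a line like:
#   libze_loader.so.1 => not found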
Is anyone else having the same issue, and has anyone found a solution?
-
So I can confirm that the Intel NPU works fine for me. Here are the steps for how I got it working.
I'm not a pro at GitHub usage, but I'll try to create a pull request to add the Intel NPU drivers to the dev container, because for this you just need to install one package from Debian bookworm and four packages from Intel (a rough sketch of the install is at the end of this comment). The main two problems I ran into are:
Also, it seems the detector CPU load shows incorrect data, because htop shows the host load at no more than 18%. It would be awesome to add NPU support, since it has real potential for some cases; there is also a feature mentioned somewhere here about adding the possibility to use a separate detector for a specific camera.
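For reference, a minimal sketch of the driver install inside a Debian bookworm based container, assuming the .deb assets published on https://github.com/intel/linux-npu-driver/releases (the exact package names and versions below are illustrative, taken from that release page rather than from this thread):

# the one runtime dependency available directly from Debian bookworm
apt-get update && apt-get install -y libtbb12

# the four packages from Intel: download the .deb assets matching your release
# from https://github.com/intel/linux-npu-driver/releases, then install them together
dpkg -i intel-driver-compiler-npu_*.deb \
        intel-fw-npu_*.deb \
        intel-level-zero-npu_*.deb \
        level-zero*.deb

# refresh the dynamic linker cache so OpenVINO can find libze_loader.so.1
ldconfig

On top of that, /dev/accel/accel0 has to be passed through to the container, as shown in the report below.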
-
Describe the problem you are having
Hello,
I have been using Frigate 0.14 with YoloNAS-M (256x256) with great satisfaction. For my use case, I have no false positives and an inference speed below 10 ms on an Intel Core Ultra 7 155H with the integrated Arc GPU.
I have been exploring the possibility of using the NPU to offload this task from the GPU.
1. I have successfully downloaded the NPU drivers from:
https://github.com/intel/linux-npu-driver/releases
2. The VPU initializes correctly, as shown by:
dmesg | grep vpu
[ 2.001761] intel_vpu 0000:00:0b.0: enabling device (0000 -> 0002)
[ 2.014536] intel_vpu 0000:00:0b.0: [drm] Firmware: intel/vpu/vpu_37xx_v0.0.bin, version: 20240611MTL_CLIENT_SILICON-release0003ci_tag_ud202424_vpu_rc_20240611_0003f3e8a8f2747
[ 2.166758] [drm] Initialized intel_vpu 1.0.0 20230117 for 0000:00:0b.0 on minor 0
3. I also have an accel0 device passthrough working:
ls -l /dev/accel
total 0
crw-rw---- 1 root ssl-cert 261, 0 Aug 21 07:00 accel0
4. When I try to set the detector device to NPU in the Frigate config, I get the following message (see the log below and the note at the end of this report):
Cannot load library '/usr/local/lib/python3.9/dist-packages/openvino/libs/libopenvino_intel_npu_plugin.so': libze_loader.so.1: cannot open shared object file: No such file or directory
5. However, when looking inside the container, the plugin seems present:
ls -l /usr/local/lib/python3.9/dist-packages/openvino/libs/libopenvino_intel_npu_plugin.so
-rw-r--r-- 1 root root 1116121 Jul 23 14:43 /usr/local/lib/python3.9/dist-packages/openvino/libs/libopenvino_intel_npu_plugin.so
I observe the same behavior when using:
model:
  path: /openvino-model/ssdlite_mobilenet_v2.xml
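For what it's worth, the error in step 4 is not about the OpenVINO plugin file itself: libze_loader.so.1 is the oneAPI Level Zero loader library, and the plugin fails to load because that dependency is missing inside the container. A hedged sketch of how to confirm and work around this from inside the container (the level-zero .deb name is an assumption based on the intel/linux-npu-driver release assets):

# confirm which of the plugin's dependencies fail to resolve
ldd /usr/local/lib/python3.9/dist-packages/openvino/libs/libopenvino_intel_npu_plugin.so | grep "not found"

# install the Level Zero loader .deb published alongside the NPU driver packages
# at https://github.com/intel/linux-npu-driver/releases, then refresh the linker cache
dpkg -i level-zero*.deb
ldconfig

Installing the loader only clears the dlopen error; the NPU user-mode driver packages (see the install sketch in the comment above) still need to be present for the device to actually be usable.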
Has anyone tried the integrated Intel NPU with OpenVINO before?
Many thanks,
Philippe
Version
0.14.0-da913d8
Frigate config file
docker-compose file or Docker CLI command
Relevant log output
Operating system
Proxmox
Install method
Docker Compose
Object Detector
OpenVINO
Any other information that may be helpful
ASUS NUC with an Intel Core Ultra 7 155H and integrated Arc GPU.