# intel-npu

Here are 4 public repositories matching this topic.
- A high-performance Docker container that runs OpenAI's Whisper model. Optimized for CPU, Intel NPU, Intel Arc/iGPU, and NVIDIA CUDA GPUs.
  Topics: docker, cuda, speech-to-text, hardware-acceleration, whisper, asr, uvr, openvino, bazarr, huggingface, vocal-isolation, faster-whisper, ctranslate2, media-automation, whisper-asr, intel-npu
  Updated Feb 1, 2026 · Python
- A complete guide and scripts for setting up, running, and testing the performance of AI models (ONNX) on NPU architectures (Qualcomm QNN & Intel OpenVINO) on Windows.
  Updated Dec 12, 2025 · Python
- A professional benchmarking framework for Intel NPU, CPU, and GPU inference using OpenVINO.
  Topics: benchmark, machine-learning, computer-vision, deep-learning, inference, pytorch, openvino, edge-ai, ai-acceleration, intel-npu, core-ultra, neural-processing-unit
  Updated Jan 25, 2026 · Python