# BuildingForWindows
The software was validated on:
- Microsoft Windows 10 (64-bit) with Visual Studio 2019

Contents:

- Software Requirements
- Build Steps
- Additional Build Options
- Building Inference Engine with Ninja* Build System

## Software Requirements
- CMake* 3.14 or higher
- Microsoft* Visual Studio 2019, version 16.8 or later
- (Optional) Intel® Graphics Driver for Windows* (30.0) driver package.
- Python 3.6 or higher for OpenVINO Runtime Python API
- Git for Windows*

## Build Steps

- Clone submodules:

  ```sh
  git clone https://github.com/openvinotoolkit/openvino.git
  cd openvino
  git submodule update --init --recursive
  ```

- Create build directory:

  ```sh
  mkdir build && cd build
  ```

NOTE: By default, the build enables the Inference Engine GPU plugin to infer models on your Intel® Processor Graphics. This requires you to download and install the Intel® Graphics Driver for Windows (26.20) driver package before running the build. If you don't want to use the GPU plugin, use the `-DENABLE_INTEL_GPU=OFF` CMake build option and skip the installation of the Intel® Graphics Driver.
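For example, assuming the x64 Visual Studio generator used in the next step, a configuration without the GPU plugin could look like this (illustrative only):

```sh
cmake -G "Visual Studio 16 2019" -A x64 -DCMAKE_BUILD_TYPE=Release -DENABLE_INTEL_GPU=OFF ..
```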
- In the `build` directory, run `cmake` to fetch project dependencies and generate a Visual Studio solution.

  For Microsoft* Visual Studio 2019 x64 architecture:

  ```sh
  cmake -G "Visual Studio 16 2019" -A x64 -DCMAKE_BUILD_TYPE=Release ..
  ```

  For Microsoft* Visual Studio 2019 ARM architecture:

  ```sh
  cmake -G "Visual Studio 16 2019" -A ARM -DCMAKE_BUILD_TYPE=Release ..
  ```

  For Microsoft* Visual Studio 2019 ARM64 architecture:

  ```sh
  cmake -G "Visual Studio 16 2019" -A ARM64 -DCMAKE_BUILD_TYPE=Release ..
  ```
- Build the generated solution in Visual Studio, or run the following to build from the command line (note that this process may take some time):

  ```sh
  cmake --build . --config Release --verbose -j8
  ```
- Before running the samples, add the paths to the Threading Building Blocks (TBB) and OpenCV binaries used for the build to the `%PATH%` environment variable (see the example below). By default, TBB binaries are downloaded by the CMake-based script to the `<openvino_repo>/inference-engine/temp/tbb/bin` folder, and OpenCV binaries to the `<openvino_repo>/inference-engine/temp/opencv_4.5.0/opencv/bin` folder.
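One way to do this in a Command Prompt session, with `<openvino_repo>` standing in for the path of your checkout (a placeholder; adjust it and the package versions to what was actually downloaded for your build):

```sh
set "PATH=<openvino_repo>\inference-engine\temp\tbb\bin;%PATH%"
set "PATH=<openvino_repo>\inference-engine\temp\opencv_4.5.0\opencv\bin;%PATH%"
```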

## Additional Build Options

- Internal JIT GEMM implementation is used by default.
- Threading Building Blocks (TBB) is used by default. To build the Inference Engine with OpenMP threading, set the `-DTHREADING=OMP` option.
- Required versions of the TBB and OpenCV packages are downloaded automatically by the CMake-based script. If you want to use the automatically downloaded packages but have already installed TBB or OpenCV packages configured in your environment, you may need to clean the `TBBROOT` and `OpenCV_DIR` environment variables before running the `cmake` command (see the example after this list); otherwise they won't be downloaded and the build may fail if incompatible versions were installed.
- If the CMake-based build script cannot find and download the OpenCV package that is supported on your platform, or if you want to use a custom build of the OpenCV library, refer to the Use Custom OpenCV Builds section for details.
- To switch the CPU and GPU plugins off or on, use the `cmake` options `-DENABLE_INTEL_CPU=ON/OFF` and `-DENABLE_INTEL_GPU=ON/OFF` respectively (see the combined configuration example after this list).
- To build the OpenVINO Runtime Python API:
  - First, install all additional packages (e.g., cython and opencv) listed in the `src\bindings\python\src\compatibility\openvino\requirements-dev.txt` file:

    ```sh
    pip install -r requirements-dev.txt
    ```

  - Second, enable the `-DENABLE_PYTHON=ON` option in the CMake step above. To specify an exact Python version, use the following options:

    ```sh
    -DPYTHON_EXECUTABLE="C:\Program Files\Python37\python.exe" ^
    -DPYTHON_LIBRARY="C:\Program Files\Python37\libs\python37.lib" ^
    -DPYTHON_INCLUDE_DIR="C:\Program Files\Python37\include"
    ```
- OpenVINO runtime compilation options:
  - `-DENABLE_OV_ONNX_FRONTEND=ON` enables the building of the ONNX importer.
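If `TBBROOT` or `OpenCV_DIR` are set in your environment and you want the build to use the automatically downloaded packages instead, one way to clear them for the current Command Prompt session before running `cmake` is the following (a sketch; only needed if these variables are actually set):

```sh
set TBBROOT=
set OpenCV_DIR=
```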
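For illustration, several of the options above can be combined in a single configuration command. The values below are examples only; pick the options you actually need:

```sh
cmake -G "Visual Studio 16 2019" -A x64 -DCMAKE_BUILD_TYPE=Release ^
      -DENABLE_INTEL_CPU=ON -DENABLE_INTEL_GPU=OFF ^
      -DENABLE_OV_ONNX_FRONTEND=ON ^
      -DENABLE_PYTHON=ON ..
```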
call "C:\Program Files (x86)\Microsoft Visual Studio\2019\Professional\VC\Auxiliary\Build\vcvars64.bat"
cmake -G Ninja -Wno-dev -DCMAKE_BUILD_TYPE=Release ..
cmake --build . --config Release