Building for Raspbian* Stretch OS
NOTE: Only the MYRIAD plugin is supported.
- Hardware Requirements
- Native Compilation
- Cross Compilation Using Docker*
- Additional Build Options
- (Optional) Additional Installation Steps for the Intel® Neural Compute Stick 2
Hardware Requirements
Raspberry Pi* 2 or 3 with Raspbian* Stretch OS (32-bit). Check that its CPU supports the ARMv7 instruction set (the uname -m command returns armv7l).
NOTE: Although the Raspberry Pi* CPU is ARMv8, the 32-bit OS detects the ARMv7 CPU instruction set. The default gcc compiler applies the ARMv6 architecture flag for compatibility with older boards. For more information, run the gcc -Q --help=target command and refer to the description of the -march= option.
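For example, both checks can be run directly on the board (a quick sanity check; the grep filter is only a convenience and assumes gcc is already installed):
# Should print armv7l on 32-bit Raspbian
uname -m
# Show the default architecture flag applied by gcc
gcc -Q --help=target | grep -- '-march='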
You can compile the Inference Engine for Raspberry Pi* in one of two ways:
- Native Compilation, which is the simplest way, but time-consuming
- Cross Compilation Using Docker*, which is the recommended way
Native Compilation
Native compilation of the Inference Engine is the most straightforward solution. However, it might take at least one hour to complete on Raspberry Pi* 3.
- Install dependencies:
sudo apt-get update
sudo apt-get install -y git cmake libusb-1.0-0-dev
- Go to the cloned openvino repository:
cd openvino
- Initialize submodules:
git submodule update --init --recursive
- Create a build folder:
mkdir build && cd build
- Build the Inference Engine:
cmake -DCMAKE_BUILD_TYPE=Release -DTHREADING=SEQ .. && make
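For convenience, the whole native build can also be run as one script (a sketch of the steps above, assuming the openvino repository is already cloned into the current directory):
sudo apt-get update && sudo apt-get install -y git cmake libusb-1.0-0-dev
cd openvino
git submodule update --init --recursive
mkdir -p build && cd build
cmake -DCMAKE_BUILD_TYPE=Release -DTHREADING=SEQ .. && make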
Cross Compilation Using Docker*
To cross-compile using a pre-configured Dockerfile, you can follow the instruction Build OpenCV, OpenVINO™ and the plugin using pre-configured Dockerfile. Otherwise, follow the steps below.
This compilation was tested on the following configuration:
- Host: Ubuntu* 18.04 (64-bit, Intel® Core™ i7-6700K CPU @ 4.00GHz × 8)
- Target: Raspbian* Stretch (32-bit, ARMv7, Raspberry Pi* 3)
- Install Docker*:
sudo apt-get install -y docker.io
- Add the current user to the docker group:
sudo usermod -a -G docker $USER
Log out and log in for this to take effect.
- Create a directory named ie_cross_armhf and add a text file named Dockerfile with the following content:
FROM debian:stretch
USER root
RUN dpkg --add-architecture armhf && \
apt-get update && \
apt-get install -y --no-install-recommends \
build-essential \
crossbuild-essential-armhf \
git \
wget \
libusb-1.0-0-dev:armhf \
libgtk-3-dev:armhf \
libavcodec-dev:armhf \
libavformat-dev:armhf \
libswscale-dev:armhf \
libgstreamer1.0-dev:armhf \
libgstreamer-plugins-base1.0-dev:armhf \
libpython3-dev:armhf \
python3-pip \
python-minimal \
python-argparse
RUN wget https://www.cmake.org/files/v3.14/cmake-3.14.3.tar.gz && \
tar xf cmake-3.14.3.tar.gz && \
(cd cmake-3.14.3 && ./bootstrap --parallel=$(nproc --all) && make --jobs=$(nproc --all) && make install) && \
rm -rf cmake-3.14.3 cmake-3.14.3.tar.gz
RUN git config --global user.name "Your Name" && \
git config --global user.email "[email protected]"
It uses the Debian* Stretch (Debian 9) OS for compilation because it is the base of Raspbian* Stretch.
- Build a Docker* image:
docker image build -t ie_cross_armhf ie_cross_armhf
- Run Docker* container with mounted source code folder from host:
docker run -it -v /absolute/path/to/openvino:/openvino ie_cross_armhf /bin/bash
- While in the container:
- Go to the cloned openvino repository:
cd openvino
- Create a build folder:
mkdir build && cd build
- Build the Inference Engine:
cmake -DCMAKE_BUILD_TYPE=Release \
      -DCMAKE_TOOLCHAIN_FILE="../cmake/arm.toolchain.cmake" \
      -DTHREADS_PTHREAD_ARG="-pthread" ..
make --jobs=$(nproc --all)
- Press Ctrl+D to exit from Docker. You can find the resulting binaries in the openvino/bin/armv7l/ directory and the OpenCV* installation in openvino/inference-engine/temp.
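To use the cross-compiled binaries on the board, copy them over, for example with rsync (the pi@raspberrypi host and target path below are placeholders for your own setup):
rsync -a openvino/bin/armv7l/ pi@raspberrypi:/home/pi/openvino/bin/armv7l/
rsync -a openvino/inference-engine/temp/ pi@raspberrypi:/home/pi/openvino/inference-engine/temp/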
NOTE: Native applications that link to the cross-compiled Inference Engine library require an extra compilation flag -march=armv7-a.
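For illustration, a native application could pass the flag like this (a sketch only; the source file name, include and library paths are placeholders, and the library name assumes the default libinference_engine.so):
g++ -march=armv7-a my_app.cpp -o my_app \
    -I /path/to/openvino/inference-engine/include \
    -L /path/to/openvino/bin/armv7l/Release/lib \
    -linference_engine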
Additional Build Options
You can use the following additional build options:
- Required versions of OpenCV packages are downloaded automatically by the CMake-based script. If you want to use the automatically downloaded packages but you have already installed OpenCV packages configured in your environment, you may need to clean the OpenCV_DIR environment variable before running the cmake command; otherwise they won't be downloaded and the build may fail if incompatible versions were installed.
- If the CMake-based build script cannot find and download the OpenCV package that is supported on your platform, or if you want to use a custom build of the OpenCV library, see how to Use Custom OpenCV Builds.
- To build the Python API wrapper, install the libpython3-dev:armhf and python3-pip packages using apt-get; then install the numpy and cython python modules via pip3 and add the following options (see the combined example after this list):
-DENABLE_PYTHON=ON \
-DPYTHON_EXECUTABLE=/usr/bin/python3.5 \
-DPYTHON_LIBRARY=/usr/lib/arm-linux-gnueabihf/libpython3.5m.so \
-DPYTHON_INCLUDE_DIR=/usr/include/python3.5
- nGraph-specific compilation options:
-DNGRAPH_ONNX_IMPORT_ENABLE=ON enables the building of the nGraph ONNX importer.
-DNGRAPH_DEBUG_ENABLE=ON enables additional debug prints.
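Put together, a cross-compilation configure step with the Python wrapper and the nGraph ONNX importer enabled might look like this (a sketch combining the options above; adjust the Python paths to your image):
cmake -DCMAKE_BUILD_TYPE=Release \
      -DCMAKE_TOOLCHAIN_FILE="../cmake/arm.toolchain.cmake" \
      -DTHREADS_PTHREAD_ARG="-pthread" \
      -DENABLE_PYTHON=ON \
      -DPYTHON_EXECUTABLE=/usr/bin/python3.5 \
      -DPYTHON_LIBRARY=/usr/lib/arm-linux-gnueabihf/libpython3.5m.so \
      -DPYTHON_INCLUDE_DIR=/usr/include/python3.5 \
      -DNGRAPH_ONNX_IMPORT_ENABLE=ON \
      ..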
(Optional) Additional Installation Steps for the Intel® Neural Compute Stick 2
NOTE: These steps are only required if you want to perform inference on the Intel® Neural Compute Stick 2 using the Inference Engine MYRIAD Plugin. See also Intel® Neural Compute Stick 2 Get Started.
- Add the current Linux user to the users group; you will need to log out and log in for it to take effect:
sudo usermod -a -G users "$(whoami)"
- To perform inference on the Intel® Neural Compute Stick 2, install the USB rules as follows:
cat <<EOF > 97-myriad-usbboot.rules
SUBSYSTEM=="usb", ATTRS{idProduct}=="2485", ATTRS{idVendor}=="03e7", GROUP="users", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1"
SUBSYSTEM=="usb", ATTRS{idProduct}=="f63b", ATTRS{idVendor}=="03e7", GROUP="users", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1"
EOF
sudo cp 97-myriad-usbboot.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules
sudo udevadm trigger
sudo ldconfig
rm 97-myriad-usbboot.rules
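After re-plugging the device and logging in again, you can verify the setup (a quick check; lsusb comes from the usbutils package and may need to be installed):
groups "$(whoami)"    # should now list the users group
lsusb | grep -i 03e7  # 03e7 is the vendor ID used in the rules above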