# CMake Build Option Spec

| NAME | VALUE | DEFAULT | REMARK |
| :--- | :--- | :--- | :--- |
| MMDEPLOY_SHARED_LIBS | {ON, OFF} | ON | Switch to build shared libs |
| MMDEPLOY_BUILD_SDK | {ON, OFF} | OFF | Switch to build MMDeploy SDK |
| MMDEPLOY_BUILD_SDK_MONOLITHIC | {ON, OFF} | OFF | Build the SDK as a single library |
| MMDEPLOY_BUILD_TEST | {ON, OFF} | OFF | Switch to build MMDeploy SDK unit test cases |
| MMDEPLOY_BUILD_SDK_PYTHON_API | {ON, OFF} | OFF | Switch to build MMDeploy SDK Python package |
| MMDEPLOY_BUILD_SDK_CSHARP_API | {ON, OFF} | OFF | Build C# SDK API |
| MMDEPLOY_BUILD_SDK_JAVA_API | {ON, OFF} | OFF | Build Java SDK API |
| MMDEPLOY_SPDLOG_EXTERNAL | {ON, OFF} | OFF | Build against the spdlog package installed on the system |
| MMDEPLOY_ZIP_MODEL | {ON, OFF} | OFF | Enable SDK models in zip format |
| MMDEPLOY_COVERAGE | {ON, OFF} | OFF | Build for C++ code coverage report |
| MMDEPLOY_TARGET_DEVICES | {"cpu", "cuda"} | cpu | Enable target device(s). You can enable more than one by passing a semicolon-separated list of device names to `MMDEPLOY_TARGET_DEVICES`, e.g. `-DMMDEPLOY_TARGET_DEVICES="cpu;cuda"` |
| MMDEPLOY_TARGET_BACKENDS | {"trt", "ort", "pplnn", "ncnn", "openvino", "torchscript", "snpe", "coreml", "tvm"} | N/A | Enable inference engines. By default, no target inference engine is set, since it highly depends on the use case. When more than one engine is specified, set the option to a semicolon-separated list of backend names, e.g. `-DMMDEPLOY_TARGET_BACKENDS="trt;ort;pplnn;ncnn;openvino"`. After specifying an inference engine, its package path has to be passed to cmake as follows:<br>1. trt: TensorRT. `TENSORRT_DIR` and `CUDNN_DIR` are needed.<br>`-DTENSORRT_DIR=${TENSORRT_DIR}`<br>`-DCUDNN_DIR=${CUDNN_DIR}`<br>2. ort: ONNX Runtime. `ONNXRUNTIME_DIR` is needed.<br>`-DONNXRUNTIME_DIR=${ONNXRUNTIME_DIR}`<br>3. pplnn: PPL.NN. `pplnn_DIR` is needed.<br>`-Dpplnn_DIR=${PPLNN_DIR}`<br>4. ncnn: ncnn. `ncnn_DIR` is needed.<br>`-Dncnn_DIR=${NCNN_DIR}/build/install/lib/cmake/ncnn`<br>5. openvino: OpenVINO. `InferenceEngine_DIR` is needed.<br>`-DInferenceEngine_DIR=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/share`<br>6. torchscript: TorchScript. `Torch_DIR` is needed.<br>`-DTorch_DIR=${Torch_DIR}`<br>7. snpe: Qualcomm SNPE. The environment variable `SNPE_ROOT` must be set, because the backend works in client/server (C/S) mode.<br>8. coreml: CoreML. `Torch_DIR` is required.<br>`-DTorch_DIR=${Torch_DIR}`<br>9. tvm: TVM. `TVM_DIR` is required.<br>`-DTVM_DIR=${TVM_DIR}`<br>(See the example configure commands after the table.) |
| MMDEPLOY_CODEBASES | {"mmpretrain", "mmdet", "mmseg", "mmagic", "mmocr", "all"} | all | Enable codebases' postprocessing modules. You can provide a semicolon-separated list of codebase names to enable them, e.g. `-DMMDEPLOY_CODEBASES="mmpretrain;mmdet"`, or pass `all` to enable them all, i.e. `-DMMDEPLOY_CODEBASES=all` |
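
Putting the options together, a typical configure command looks like the sketch below: an SDK build with the Python API for CUDA with the TensorRT backend and only mmdet postprocessing enabled. It assumes an out-of-source build directory, and `${TENSORRT_DIR}` / `${CUDNN_DIR}` are placeholders for your local TensorRT and cuDNN installations.

```bash
# Sketch: SDK build for CUDA with the TensorRT backend.
# ${TENSORRT_DIR} and ${CUDNN_DIR} must point at your local installations.
cmake .. \
    -DMMDEPLOY_BUILD_SDK=ON \
    -DMMDEPLOY_BUILD_SDK_PYTHON_API=ON \
    -DMMDEPLOY_TARGET_DEVICES="cuda" \
    -DMMDEPLOY_TARGET_BACKENDS="trt" \
    -DMMDEPLOY_CODEBASES="mmdet" \
    -DTENSORRT_DIR=${TENSORRT_DIR} \
    -DCUDNN_DIR=${CUDNN_DIR}
```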
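
Similarly, a minimal CPU-only configuration with ONNX Runtime, packaged as a single monolithic library, could look like the following sketch; `${ONNXRUNTIME_DIR}` again stands in for your ONNX Runtime install path.

```bash
# Sketch: CPU-only SDK build with the ONNX Runtime backend, packaged
# as one monolithic library (shared libs are turned off for this).
cmake .. \
    -DMMDEPLOY_BUILD_SDK=ON \
    -DMMDEPLOY_SHARED_LIBS=OFF \
    -DMMDEPLOY_BUILD_SDK_MONOLITHIC=ON \
    -DMMDEPLOY_TARGET_DEVICES="cpu" \
    -DMMDEPLOY_TARGET_BACKENDS="ort" \
    -DONNXRUNTIME_DIR=${ONNXRUNTIME_DIR}
```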