Inference

The following tutorials will help you learn how to deploy MXNet models for inference applications.

GluonCV Models in a C++ Inference Application (https://gluon-cv.mxnet.io/build/examples_deployment/cpp_inference.html)

An example application that works with an exported MXNet GluonCV YOLO model.
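For context, the export step that produces the model files consumed by such a C++ application is done in Python. The following is a minimal sketch, assuming GluonCV is installed and using yolo3_darknet53_voc as an example model name; the tutorial's own export code may differ.

```python
import mxnet as mx
from gluoncv import model_zoo

# Load a pretrained GluonCV YOLO model (the model name here is an example choice)
net = model_zoo.get_model('yolo3_darknet53_voc', pretrained=True)

# Hybridize and run one dummy forward pass so the symbolic graph is traced
net.hybridize()
net(mx.nd.zeros((1, 3, 416, 416)))

# Writes yolo3-symbol.json and yolo3-0000.params,
# which a C++ inference application can then load
net.export('yolo3')
```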

Inference with Quantized Models (https://gluon-cv.mxnet.io/build/examples_deployment/int8_inference.html)

How to use quantized (INT8) GluonCV models for higher-performance inference on Intel Xeon processors.
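As a rough sketch of what loading a quantized model looks like in Python, assuming an *_int8 variant such as resnet50_v1_int8 is available in the installed GluonCV model zoo and that 'example.jpg' is a local test image:

```python
import mxnet as mx
from gluoncv import model_zoo, data

# The model name is an assumption -- check the GluonCV model zoo for the *_int8 variants it ships
net = model_zoo.get_model('resnet50_v1_int8', pretrained=True)

# Standard ImageNet evaluation preprocessing; 'example.jpg' is a placeholder path
img = data.transforms.presets.imagenet.transform_eval(mx.image.imread('example.jpg'))

# On Intel Xeon CPUs with an MKL-DNN (oneDNN) build of MXNet, INT8 kernels are used here
pred = net(img)
print('top-5 class indices:', mx.nd.topk(pred, k=5)[0].asnumpy())
```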

C++ (cpp.html)

How to use MXNet models in a C++ environment.

Image Classification on Jetson (image_classification_jetson.html)

Example of running a pretrained image classification model on a Jetson module.
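For a sense of what that tutorial involves, here is a minimal Python sketch of classifying a local image with a pretrained GluonCV model on a GPU context; 'example.jpg' and the mobilenet1.0 model choice are placeholders, not the tutorial's exact code.

```python
import mxnet as mx
from gluoncv import model_zoo, data

# Use the Jetson's GPU if MXNet was built with CUDA support, otherwise fall back to CPU
ctx = mx.gpu(0) if mx.context.num_gpus() > 0 else mx.cpu()

# A lightweight pretrained classifier is a reasonable fit for an embedded device
net = model_zoo.get_model('mobilenet1.0', pretrained=True, ctx=ctx)

# 'example.jpg' is a placeholder path for a local test image
img = data.transforms.presets.imagenet.transform_eval(mx.image.imread('example.jpg'))
pred = net(img.as_in_context(ctx))

top1 = int(mx.nd.topk(pred, k=1)[0][0].asscalar())
print('predicted class:', net.classes[top1])
```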

Object Detection on Jetson (https://gluon-cv.mxnet.io/build/examples_detection/demo_jetson.html)

Example of running a pretrained object detection model on a Jetson module.
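A minimal Python sketch of the same idea, assuming GluonCV is installed and 'example.jpg' is a local test image; the ssd_512_mobilenet1.0_voc model is an example choice, not necessarily the one the tutorial uses.

```python
import mxnet as mx
from gluoncv import model_zoo, data, utils
from matplotlib import pyplot as plt

# Use the Jetson's GPU if available, otherwise fall back to CPU
ctx = mx.gpu(0) if mx.context.num_gpus() > 0 else mx.cpu()

# SSD with a MobileNet backbone is a reasonable fit for an embedded GPU
net = model_zoo.get_model('ssd_512_mobilenet1.0_voc', pretrained=True, ctx=ctx)

# 'example.jpg' is a placeholder path for a local test image
x, img = data.transforms.presets.ssd.load_test('example.jpg', short=512)
class_IDs, scores, bounding_boxes = net(x.as_in_context(ctx))

# Draw the detections above the default confidence threshold
utils.viz.plot_bbox(img, bounding_boxes[0], scores[0], class_IDs[0], class_names=net.classes)
plt.show()
```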