Inference

The following tutorials will help you learn how to deploy MXNet models for inference applications.

[GluonCV Models in a C++ Inference Application](https://gluon-cv.mxnet.io/build/examples_deployment/cpp_inference.html)

An example application that works with an exported MXNet GluonCV YOLO model.

[Inference with Quantized Models](https://gluon-cv.mxnet.io/build/examples_deployment/int8_inference.html)

How to run quantized GluonCV models for higher-performance inference on Intel Xeon processors.

[Scala and Java](scala.html)

How to use MXNet models in a Scala or Java environment.

[C++](cpp.html)

How to use MXNet models in a C++ environment.

[Raspberry Pi](wine_detector.html)

An example of running a wine detector on a Raspberry Pi.