ONNX Runtime JavaScript
Introduction
ONNX Runtime provides a JavaScript API so that neural network inference can be performed on the user front end, directly in the browser.
In this blog post, I would like to quickly discuss the ONNX Runtime JavaScript API using an MNIST classifier as an example.
MNIST Classifier
The MNIST classifier uses the pre-trained MNIST model from the ONNX Model Zoo.
[Interactive demo: draw a digit on the canvas with a configurable line width and color. The table below the canvas reports the predicted digit, the confidence, and the inference latency in milliseconds.]
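To give a rough idea of the preprocessing involved, the drawing on the canvas has to be resized to the 28 x 28 grayscale input that the MNIST model expects. The sketch below is only an illustration under that assumption; the helper name, the grayscale conversion, and the tensor layout [1, 1, 28, 28] are mine rather than the demo's actual code.

// Sketch: convert an HTML5 canvas drawing into an MNIST input tensor.
// Assumes onnxruntime-web is loaded as the global `ort` and the model
// expects a float32 tensor of shape [1, 1, 28, 28].
function canvasToMnistTensor(canvas) {
    // Downscale the drawing to 28 x 28 pixels using an offscreen canvas.
    const small = document.createElement("canvas");
    small.width = 28;
    small.height = 28;
    const ctx = small.getContext("2d");
    ctx.drawImage(canvas, 0, 0, 28, 28);
    const { data } = ctx.getImageData(0, 0, 28, 28);
    // Average the RGB channels into a single grayscale value in [0, 1].
    const input = new Float32Array(28 * 28);
    for (let i = 0; i < 28 * 28; ++i) {
        input[i] = (data[4 * i] + data[4 * i + 1] + data[4 * i + 2]) / (3 * 255);
    }
    return new ort.Tensor("float32", input, [1, 1, 28, 28]);
}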
ONNX Runtime JavaScript API
The basic usage of the ONNX Runtime JavaScript API is as follows.
<script src="https://cdn.jsdelivr.net/npm/onnxruntime-web@1.12.1/dist/ort.min.js"></script>
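Loading ort.min.js exposes a global ort object. As a minimal sketch, assuming a model file served next to the page and a single input, a session can then be created and run as follows; the path model.onnx, the input name input, and the shape [1, 1, 28, 28] are placeholders for illustration.

// Minimal sketch of creating a session and running inference with the
// global `ort` object. The model path, input name, and shape are placeholders.
async function runInference() {
    const session = await ort.InferenceSession.create("./model.onnx");
    const inputData = new Float32Array(1 * 1 * 28 * 28);
    const inputTensor = new ort.Tensor("float32", inputData, [1, 1, 28, 28]);
    // The feeds object maps the model's input names to tensors.
    const results = await session.run({ input: inputTensor });
    // Outputs are returned as a map from output name to tensor.
    console.log(results[session.outputNames[0]].data);
}
runInference();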
It is recommended to load the ONNX model and create the inference session only once, and to reuse the session for subsequent inferences, because session creation downloads and initializes the model. Selecting the executionProviders is also critical for the inference latency.
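For example, the session could be created once, with an execution provider preference, and cached for all later predictions. The caching pattern below is just a sketch; "webgl" and "wasm" are the execution providers shipped with onnxruntime-web, and which one is faster depends on the model and the device.

// Sketch: create the inference session once, with an executionProviders
// preference, and reuse it for every prediction.
let cachedSession = null;

async function getSession() {
    if (cachedSession === null) {
        cachedSession = await ort.InferenceSession.create("./model.onnx", {
            // Try WebGL first and fall back to WebAssembly.
            executionProviders: ["webgl", "wasm"],
        });
    }
    return cachedSession;
}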
References
- ONNX Runtime Inference Examples - JavaScript
- ONNX Runtime Web Demo
- ONNX Runtime Web Demo - GitHub
- Draw on a HTML5 Canvas with a Mouse
- Resize Image Data - GitHub