Learning ONNX for trading - page 9

 

Using ONNX w/ Qualcomm powered devices from smartphones to the cloud edge and everything in between




Qualcomm's adoption of the ONNX interchange format across its product range lets a single model target every device it supports. Supporting many devices, model variants, and feature requirements means dealing with challenging architectures, but ONNX helps Qualcomm scale across verticals, device classes, and geographies. Qualcomm has worked with Microsoft to create an ONNX Runtime execution provider that allows ONNX models to run on Qualcomm-powered devices, including those running Windows. The unified software stack includes a library called the AI Engine that can dynamically route an ONNX model to different accelerators for the best performance, with additional tools available such as profilers, compilers, and analyzers for optimizing models.

  • 00:00:00 In this section, the speaker from Qualcomm explains how they use ONNX to support their range of devices, from tiny earbuds to laptops, security cameras, and even automotive equipment. The ONNX interchange format lets them target a model once and use it on all the devices they support. They also discuss the challenging architectures they must deal with to support different devices along with different models, implementations, and feature requirements. As an example, they describe the depth-sensor authentication technology that Apple introduced on mobile phones, and how the same technology has since been integrated into security cameras and automobiles.

  • 00:05:00 In this section, the speaker discusses the scalability challenge the AI industry faces today, how Qualcomm has addressed it, and the benefits of using ONNX as an interchange format to achieve scale. Moving algorithms from the CPU to dedicated AI accelerators lets devices scale easily, and the multi-core architecture delivers the higher performance needed to work with live video streams. The interchange format also saves considerable time, since there is no need to deal with multiple frameworks. Finally, the speaker explains that ONNX helps Qualcomm scale across verticals, across small and powerful devices, and across geographies.

  • 00:10:00 In this section, the speaker discusses how Qualcomm is working with Microsoft to create an ONNX Runtime execution provider for their AI accelerator. This allows ONNX models to run on a variety of Qualcomm-powered devices, including mobile and automotive devices, as well as those running Windows. Qualcomm has developed a unified software stack that supports a variety of operating systems and includes a library called the AI Engine that can dynamically route the ONNX model to different accelerators for the best performance. They also offer a range of additional tools for their customers, such as profilers, compilers, and analyzers, for building and optimizing models for specific devices.
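The dynamic-routing idea described above maps onto how ONNX Runtime actually selects backends: a session is given an ordered list of execution providers and falls through to the next one when an accelerator is unavailable, with the CPU as the final fallback. The helper below is a minimal, hypothetical sketch of that preference-ordered selection; the provider strings mirror real ONNX Runtime identifiers (`QNNExecutionProvider` is Qualcomm's), but `choose_providers` itself is an illustration, not part of any Qualcomm or Microsoft API.

```python
def choose_providers(available, preferred):
    """Return the preferred execution providers that are actually
    available on this device, in preference order, mimicking ONNX
    Runtime's fall-through behavior when an accelerator is missing."""
    chosen = [p for p in preferred if p in available]
    # ONNX Runtime always keeps the CPU as the final fallback.
    if "CPUExecutionProvider" not in chosen:
        chosen.append("CPUExecutionProvider")
    return chosen

# A device with Qualcomm's NPU provider available:
print(choose_providers(
    available={"QNNExecutionProvider", "CPUExecutionProvider"},
    preferred=["QNNExecutionProvider", "CUDAExecutionProvider"],
))  # → ['QNNExecutionProvider', 'CPUExecutionProvider']

# A device with no accelerator silently falls back to CPU:
print(choose_providers(
    available={"CPUExecutionProvider"},
    preferred=["QNNExecutionProvider"],
))  # → ['CPUExecutionProvider']
```

In the real API, the same idea appears as the `providers=[...]` argument to `onnxruntime.InferenceSession`, which is what lets one ONNX model run unchanged across accelerated and CPU-only devices.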
Using ONNX w/ Qualcomm powered devices from smartphones to the cloud edge and everything in between
  • 2022.07.13
  • www.youtube.com
Whenever our clients target high performant AI cloud inferencing servers, create new and exciting AI based experiences on mobile phones or improve our lives ...
 

ONNX Runtime IoT Deployment on Raspberry Pi




In this video, titled "ONNX Runtime IoT Deployment on Raspberry Pi", the presenter demonstrates how to deploy a computer vision model with ONNX Runtime on a Raspberry Pi, using a MobileNet model optimized for the device. The video covers connecting to the Raspberry Pi with VNC Viewer, configuring it, and running a camera test with OpenCV and Python. The presenter captures an image, runs inference, and prints the top five predicted classes, which correctly identify the fountain pen in the image. Overall, the video is a helpful guide to deploying ONNX Runtime on a Raspberry Pi for computer vision applications.
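The "top five predicted classes" step can be reproduced with a few lines of plain Python: apply a softmax to the model's output logits and take the five highest-scoring indices. This is a generic sketch rather than the video's exact code; the label list and logit values are made-up stand-ins (a real MobileNet emits 1000 ImageNet logits).

```python
import math

def top_k(logits, labels, k=5):
    """Softmax the raw logits and return the k highest-scoring
    (label, probability) pairs, as in a typical classification demo."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    return [(labels[i], probs[i]) for i in ranked[:k]]

# Stand-in logits for a 6-class toy example:
labels = ["fountain pen", "ballpoint", "pencil", "stapler", "mug", "mouse"]
logits = [9.1, 6.3, 5.8, 2.0, 1.1, 0.4]
for label, p in top_k(logits, labels):
    print(f"{label}: {p:.3f}")
```

In the video's setting, `logits` would be the array returned by the ONNX Runtime session for the captured camera frame, and `labels` the ImageNet class names.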

ONNX Runtime IoT Deployment on Raspberry Pi
  • 2023.02.01
  • www.youtube.com
Learn how to perform image classification on the edge using ONNX Runtime and a Raspberry Pi, taking input from the device’s camera and sending the classifica...
 

How to install ONNX Runtime on Raspberry Pi




The video provides a detailed guide to installing ONNX Runtime on a Raspberry Pi. After downloading and installing Raspbian Stretch on the Raspberry Pi, the user installs Docker and the qemu-user-static package, creates a build directory, and runs a command that produces the ONNX Runtime wheel package, which can then be installed via pip. The video also shows how to test ONNX Runtime with a deep neural network trained on the MNIST dataset and how to measure the time taken to run an inference session on a single image. The speaker notes that the process can be lengthy and complicated, but is worth it for the ability to deploy and test neural networks on edge devices.
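The overall flow can be sketched as a shell session. Treat it as an outline under stated assumptions rather than exact commands: the Dockerfile name, the container path, and the wheel filename all vary by ONNX Runtime version and platform, and the cross-build depends on qemu-user-static being registered with the kernel's binfmt handler.

```shell
# Install Docker and QEMU user-mode emulation (Debian/Raspbian):
sudo apt-get update
sudo apt-get install -y docker.io qemu-user-static

# Create a build directory and place the ARM32v7 Dockerfile from the
# onnxruntime repository in it (filename is illustrative):
mkdir -p ~/onnxruntime-build && cd ~/onnxruntime-build
# ... copy Dockerfile.arm32v7 here, edited as shown in the video ...

# Build the image; the wheel is produced inside the container:
docker build -t onnxruntime-arm32v7 -f Dockerfile.arm32v7 .

# Copy the wheel out of a container and install it
# (the container path and wheel name below are assumptions):
docker run --rm -v "$PWD":/out onnxruntime-arm32v7 \
    sh -c 'cp /code/onnxruntime/build/*/Release/dist/*.whl /out/'
pip3 install ./onnxruntime-*.whl
```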

  • 00:00:00 In this section, the speaker explains that Raspberry Pi OS (formerly Raspbian) Stretch can be downloaded from the Raspberry Pi website. Once Docker is installed on the Raspberry Pi, the next step is to go to the ONNX Runtime GitHub repository and find the Dockerfile for the ARM32v7 platform; Dockerfiles exist for several platforms, but ARM32v7 is the one a Raspberry Pi needs. With the correct Dockerfile in hand, the user can follow the provided instructions to install ONNX Runtime. The process can be a bit lengthy and complicated, but it is well worth it for the ability to deploy and test neural networks on edge devices.

  • 00:05:00 In this section, the speaker walks through the installation. First, the user downloads the Raspbian Stretch image and uses the Raspberry Pi Imager to make the SD card bootable. Once Raspbian Stretch is running on the Raspberry Pi, the user installs Docker and the qemu-user-static package. After creating a build directory, the user saves the updated Dockerfile there and runs the build command to produce the ONNX Runtime wheel package. Once the wheel is installed with pip and tested, it can be imported, and ONNX Runtime can be used on the Raspberry Pi with Python 3.8.

  • 00:10:00 In this section, the speaker shows how to use ONNX Runtime on the Raspberry Pi to test a trained model. The speaker has already trained a deep neural network for digit classification on the MNIST dataset and provides a link to the code and model. After importing the necessary libraries, the speaker writes a script that runs an inference session with ONNX Runtime. The script initially hits a segmentation fault; after some research into the three levels of graph optimization that ONNX Runtime can apply, the speaker disables them and successfully runs the script, obtaining the predicted output.

  • 00:15:00 In this section, the speaker explains how they measured the time the Raspberry Pi takes to run an inference session on a single image. A separate script uses the time library to measure how long it takes to generate output and run the model inside ONNX Runtime: about 0.06 seconds, which the speaker notes is very good for a simple model and small image size. They encourage viewers to install ONNX Runtime on their own Raspberry Pi and to reach out with any questions or concerns.
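The timing measurement generalizes to any model: wrap the inference call in a zero-argument function and average several runs with `time.perf_counter`. The harness below is a plain-Python sketch; the lambda is a stand-in workload where the video would call `session.run(...)` on the MNIST model.

```python
import time

def time_inference(run_fn, n_warmup=2, n_runs=10):
    """Average the wall-clock time of a zero-arg inference callable.
    Warm-up runs are excluded, since first calls often pay one-time
    setup costs (caches, lazy allocation)."""
    for _ in range(n_warmup):
        run_fn()
    start = time.perf_counter()
    for _ in range(n_runs):
        run_fn()
    return (time.perf_counter() - start) / n_runs

# Stand-in workload; replace with e.g.
#   lambda: session.run(None, {"input": image})
avg = time_inference(lambda: sum(i * i for i in range(10_000)))
print(f"average time per run: {avg:.4f} s")
```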
How to install ONNX Runtime on Raspberry Pi
  • 2020.09.24
  • www.youtube.com
This video explains how to install Microsoft's deep learning inference engine ONNX Runtime on Raspberry Pi.Jump to a section:0:19 - Introduction to ONNX Runt...
 

Image Classification working on Raspberry Pi with various MobileNet ONNX models



Perform image classification on a Raspberry Pi 4 with ONNX Runtime, using three variants of the MobileNet V1 ONNX model:

  1. Depth 1.00 & 224x224
  2. Depth 0.50 & 160x160
  3. Depth 0.25 & 128x128

Classification completes in as little as 7 ms, depending on the model used.
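The three variants differ along MobileNet V1's two shrinking knobs: the depth (width) multiplier scales every layer's channel count, and the input resolution scales the spatial size, so multiply-add cost falls roughly with the square of each. The quick estimate below applies that rule of thumb from the MobileNet V1 paper; it is an approximation of compute cost, not a prediction of the measured latencies.

```python
def relative_cost(alpha, resolution, base_resolution=224):
    """Approximate multiply-add cost relative to the full model
    (alpha=1.0 at 224x224), using the ~alpha^2 * rho^2 scaling
    described in the MobileNet V1 paper."""
    rho = resolution / base_resolution
    return (alpha ** 2) * (rho ** 2)

for alpha, res in [(1.00, 224), (0.50, 160), (0.25, 128)]:
    print(f"depth {alpha:.2f} @ {res}x{res}: "
          f"~{relative_cost(alpha, res):.1%} of the full model's MACs")
```

By this estimate the 0.25-depth, 128x128 variant needs only about 2% of the full model's multiply-adds, which is consistent with the large speedups the smaller variants show on the Pi.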

Image Classification working on Raspberry Pi with various MobileNet ONNX models
  • 2020.05.26
  • www.youtube.com
Perform image classification on Raspberry Pi 4 at ONNX Runtime using 3 pattern of MobileNet V1 ONNX models.1) Depth 1.00 & 224x2242) Depth 0.50 & 160x1603) D...
 

SSDLite Mobilenet V2 on ONNX Runtime working on Raspberry Pi 4



SSDLite Mobilenet V2 on ONNX Runtime working on Raspberry Pi 4 without hardware acceleration.

SSDLite Mobilenet V2 on ONNX Runtime working on Raspberry Pi 4
  • 2020.04.26
  • www.youtube.com
SSDLite Mobilenet V2 on ONNX Runtime working on Raspberry Pi 4 without hardware acceleration
 

SSDLite Mobilenet V1 0.75 depth on ONNX Runtime working on Raspberry Pi 4




SSDLite Mobilenet V1 0.75 depth on ONNX Runtime working on Raspberry Pi 4 without hardware acceleration.

SSDLite Mobilenet V1 0.75 depth on ONNX Runtime working on Raspberry Pi 4
  • 2020.04.26
  • www.youtube.com
SSDLite Mobilenet V1 0.75 depth on ONNX Runtime working on Raspberry Pi 4 without hardware acceleration.
 

Tiny-YOLOv3 on ONNX Runtime working on Raspberry Pi 4




Tiny-YOLOv3 on ONNX Runtime working on Raspberry Pi 4 without hardware acceleration.

Tiny-YOLOv3 on ONNX Runtime working on Raspberry Pi 4
  • 2020.04.08
  • www.youtube.com
Tiny-YOLOv3 on ONNX Runtime working on Raspberry Pi 4 without hardware acceleration
 

Raspberry Pi 4 Classification and Object Detection with Optimized ONNX Runtime




Perform image classification and object detection on a Raspberry Pi 4 with ONNX Runtime:

  1. Classification using MobileNet V3;
  2. Detection using SSDLite MobileNet V2.
Raspberry Pi 4 Classification and Object Detection with Optimized ONNX Runtime
  • 2020.08.06
  • www.youtube.com
Perform image classification on Raspberry Pi 4 at ONNX Runtime1) Classification using MobileNet V32) Detection using SSDLite MobileNet V2
 

Raspberry Pi 4 Object Detection with Optimized ONNX Runtime (Late 2020)




Hardware : Raspberry Pi 4B
OS : Raspberry Pi OS (32bit)
Software : ONNX Runtime 1.4.0 with custom execution provider (CPU accelerated)
Models:

Raspberry Pi 4 Object Detection with Optimized ONNX Runtime (Late 2020)
  • 2020.12.15
  • www.youtube.com
Hardware : Raspberry Pi 4BOS : Raspberry Pi OS (32bit)Software : ONNX Runtime 1.4.0 with custom execution provider (CPU accelerated)ModelsMobileNetV1 SSD 0.7...
 

Tiny-YOLOv3 on ONNX Runtime working on Raspberry Pi 4



Tiny-YOLOv3 on ONNX Runtime working on Raspberry Pi 4 without hardware acceleration.

Tiny-YOLOv3 on ONNX Runtime working on Raspberry Pi 4
  • 2020.04.11
  • www.youtube.com
Tiny-YOLOv3 on ONNX Runtime working on Raspberry Pi 4 without hardware acceleration