EAS/SW Stack Test v3
Edge AI SDK Software Stack
| Layer | Component |
|---|---|
| Edge AI SDK | |
| Intel Utilities | |
| Intel AI SDK | OpenVINO 2023.0 |
| Driver | Intel UHD/HD/Arc GPU |
| OS / Kernel | Ubuntu 22.04 (5.15.0-1027-intel-iotg) |
OpenVINO
OpenVINO™ toolkit is an open-source solution for optimizing and deploying AI inference in domains such as computer vision, automatic speech recognition, natural language processing, and recommendation systems. With its plug-in architecture, OpenVINO allows developers to write once and deploy anywhere. The OpenVINO 2023.0 release introduces a range of new features, improvements, and deprecations aimed at enhancing the developer experience.
- Enables the use of models trained with popular frameworks, such as TensorFlow and PyTorch.
- Optimizes inference of deep learning models with techniques such as post-training quantization, with no model retraining or fine-tuning required.
- Supports heterogeneous execution across Intel hardware, using a common API for the Intel CPU, Intel Integrated Graphics, Intel Discrete Graphics, and other commonly used accelerators.
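To illustrate the post-training quantization mentioned above, here is a minimal conceptual sketch of the underlying idea: mapping float weights to 8-bit integers using a scale derived from the observed value range, with no retraining involved. This is not the OpenVINO API (OpenVINO performs quantization through its own tooling); it only shows the principle.

```python
# Conceptual sketch of symmetric post-training quantization:
# floats are mapped to signed 8-bit integers via a shared scale,
# derived from the observed range -- no retraining needed.
# (Illustration only; not the OpenVINO quantization API.)

def quantize(values, num_bits=8):
    """Map floats to signed ints with a shared scale factor."""
    qmax = 2 ** (num_bits - 1) - 1            # 127 for int8
    max_abs = max(abs(v) for v in values) or 1.0
    scale = max_abs / qmax                    # one float step per int step
    return [round(v / scale) for v in values], scale

def dequantize(q, scale):
    """Recover approximate floats from the quantized ints."""
    return [x * scale for x in q]

weights = [0.5, -1.27, 0.03, 1.0]
q, scale = quantize(weights)
restored = dequantize(q, scale)
# Each restored value lies within one quantization step of the original.
assert all(abs(a - b) <= scale for a, b in zip(weights, restored))
```

The benefit in practice is smaller models and faster integer arithmetic at inference time, at the cost of a bounded rounding error per weight.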
OpenVINO Runtime SDK
Overall updates
- Proxy & hetero plugins have been migrated to API 2.0, providing enhanced compatibility and stability.
- Symbolic shape inference preview is now available, leading to improved performance for LLMs.
- OpenVINO's graph representation has been upgraded to opset12, introducing a new set of operations that offer enhanced functionality and optimizations.
Applications
Edge AI SDK / Vision Application
| Application | Model |
|---|---|
| Object Detection | yolov3 (tf) |
| Person Detection | person-detection-retail-0013 |
| Face Detection | faceboxes-pytorch |
| Pose Estimation | human-pose-estimation-0001 |
Edge AI SDK / GenAI Application
| Application | Model |
|---|---|
| Chatbot | Llama-2-7b |
Benchmark
You can refer to the link to test performance with the benchmark_app tool.
benchmark_app
The OpenVINO benchmark setup consists of a single system with the OpenVINO™ toolkit and the benchmark application installed. It measures the time spent on actual inference (excluding any pre- or post-processing) and reports inferences per second (or frames per second, FPS).
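The measurement described above can be sketched in a few lines: only the inference calls are timed, and throughput is the iteration count divided by that elapsed time. `run_inference` below is a hypothetical stand-in for an actual model call, used only to make the sketch runnable.

```python
# Sketch of how a throughput benchmark reports FPS: time only the
# inference loop (pre/post-processing excluded), then divide count
# by elapsed time. `run_inference` is a placeholder, not a real model.
import time

def run_inference():
    time.sleep(0.002)  # pretend one inference takes ~2 ms

def measure_fps(num_iterations=50):
    start = time.perf_counter()
    for _ in range(num_iterations):
        run_inference()                 # timed: inference only
    elapsed = time.perf_counter() - start
    return num_iterations / elapsed     # inferences per second (FPS)

print(f"Throughput: {measure_fps():.2f} FPS")
```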
Examples
```shell
$ cd /opt/Advantech/EdgeAISuite/Intel_Standard/benchmark

# CPU
$ ./benchmark_app -m ../model/mobilenet-ssd/FP16/mobilenet-ssd.xml -i car.png -t 8 -d CPU

# iGPU
$ ./benchmark_app -m ../model/mobilenet-ssd/FP16/mobilenet-ssd.xml -i car.png -t 8 -d GPU.0

# dGPU
$ ./benchmark_app -m ../model/mobilenet-ssd/FP16/mobilenet-ssd.xml -i car.png -t 8 -d GPU.1
```
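When comparing the CPU, iGPU, and dGPU runs in a script, it can help to extract the throughput figure from each run's summary. benchmark_app prints a summary line of the form `Throughput: <number> FPS`; the helper below is a hypothetical convenience for parsing that line, not part of the SDK.

```python
# Hypothetical helper: pull the FPS number out of benchmark_app's
# summary output so device runs (CPU / GPU.0 / GPU.1) can be compared.
import re

def parse_throughput(output: str) -> float:
    """Return the FPS value from a 'Throughput: <n> FPS' summary line."""
    match = re.search(r"Throughput:\s*([\d.]+)\s*FPS", output)
    if match is None:
        raise ValueError("no throughput line found in benchmark output")
    return float(match.group(1))

sample = "Count: 800 iterations\nDuration: 6500.00 ms\nThroughput: 123.08 FPS"
print(parse_throughput(sample))  # 123.08
```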
Utility
XPU-SMI
Intel® XPU Manager (official link) is a free and open-source solution for local and remote monitoring and management of Intel® Data Center GPUs. It is designed to simplify administration, maximize reliability and uptime, and improve utilization.
Intel XPU System Management Interface (SMI) is a command-line utility for local XPU management.
Key features
Monitoring GPU utilization and health, getting job-level statistics, running comprehensive diagnostics, controlling power, policy management, firmware updating, and more.
xpu-smi can show basic GPU information. For more details, refer to the xpu-smi documentation.