NXP eIQ
NXP i.MX series
The i.MX 8M Plus family focuses on a neural processing unit (NPU) and vision system, advanced multimedia, and industrial automation with high reliability.
- The Neural Processing Unit (NPU) of the i.MX 8M Plus operates at up to 2.3 TOPS
NXP Demo Experience
- Preinstalled on NXP-provided demo Linux images
- The imx-image-full image must be used
- Supported from Yocto 3.3 (5.10.52_2.1.0) to Yocto 4.2 (6.1.1_1.0.0)
- An internet connection is required
Start the demo launcher by clicking the NXP logo displayed in the top left-hand corner of the screen
Machine Learning Demos
- NNStreamer demos
- Object classification
- Object detection
- Pose detection
- Brand detection
- ML gateway
- OpenCV demos
- Face recognition
- Selfie segmentation
NNStreamer Demo: Object Detection
Click the "Object Detection " and Launch Demo
Set the following parameters:
- Source: Select the camera to use, or use the example video.
- Backend: Select whether to use the NPU (if available) or CPU for inferences.
- Height: Select the input height of the video if using a camera.
- Width: Select the input width of the video if using a camera.
- Label Color: Select the color of the overlay labels.
The result of NPU object detection
NXP Demo Experience - Text User Interface (TUI)
- Command: demoexperience tui
PyeIQ - A Python Framework for eIQ on i.MX Processors
PyeIQ is written on top of the eIQ™ ML Software Development Environment and provides a set of Python classes
that allow the user to run machine learning applications in a simplified and efficient way, without spending time on
cross-compilation, deployment, or reading extensive guides.
Installation
- Method 1: Use the pip3 tool to install the package from the PyPI repository:
$ pip3 install pyeiq
- Method 2: Get the latest tarball from the download files listed below and copy it to the board:
$ pip3 install <tarball>
pyeiq tarball:
- pyeiq-3.0.0.tar.gz
- pyeiq-3.1.0.tar.gz
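To verify the installation, a quick check such as the one below can be run on the board. This is a minimal sketch that only assumes the package was installed under the PyPI name pyeiq used above (Python 3.8 or newer for importlib.metadata):

# check_pyeiq.py - confirm the pyeiq package is installed (sketch)
from importlib.metadata import version, PackageNotFoundError

try:
    # "pyeiq" is the distribution name used with pip3 above
    print("pyeiq version:", version("pyeiq"))
except PackageNotFoundError:
    print("pyeiq is not installed - run 'pip3 install pyeiq' first")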
How to Run Samples
- Start the manager tool:
$ pyeiq
- The above command returns the PyeIQ manager tool options:
Manager Tool Command | Description | Example |
pyeiq --list-apps | List the available applications. | |
pyeiq --list-demos | List the available demos. | |
pyeiq --run <app_name/demo_name> | Run the application or demo. | # pyeiq --run object_detection_tflite |
pyeiq --info <app_name/demo_name> | Application or demo short description and usage. | # pyeiq --info object_detection_tflite |
pyeiq --clear-cache | Clear cached media generated by demos. | # pyeiq --clear-cache |
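For scripted use, the same manager tool options listed above can also be driven from a small Python helper. The sketch below is only a thin wrapper around the documented pyeiq command line, not a PyeIQ API:

# pyeiq_wrapper.py - thin wrapper around the pyeiq manager tool (sketch; uses only
# the CLI options documented in the table above)
import subprocess

def pyeiq(*args):
    """Run the pyeiq CLI with the given arguments and return its output."""
    result = subprocess.run(["pyeiq", *args], capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    print(pyeiq("--list-demos"))                       # list the available demos
    print(pyeiq("--info", "object_detection_tflite"))  # short description and usage
    pyeiq("--run", "object_detection_tflite")          # run the demo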
Run Applications and Demos
- Applications
Application Name | Framework | i.MX Board | BSP Release | Inference Core | Status |
Switch Classification Image | TFLite:2.1.0 | RSB-3720 | 5.4.24_2.1.0 | CPU, GPU, NPU | PASS |
Switch Detection Video | TFLite:2.1.0 | RSB-3720 | 5.4.24_2.1.0 | CPU, GPU, NPU | PASS |
- Demos
Demo Name | Framework | i.MX Board | BSP Release | Inference Core | Status |
Object Classification | TFLite:2.1.0 | RSB-3720 | 5.4.24_2.1.0 | GPU, NPU | PASS |
Object Detection SSD | TFLite:2.1.0 | RSB-3720 | 5.4.24_2.1.0 | GPU, NPU | PASS |
Object Detection YOLOv3 | TFLite:2.1.0 | RSB-3720 | 5.4.24_2.1.0 | GPU, NPU | PASS |
Object Detection DNN | OpenCV:4.2.0 | RSB-3720 | 5.4.24_2.1.0 | CPU | PASS |
Facial Expression Detection | TFLite:2.1.0 | RSB-3720 | 5.4.24_2.1.0 | GPU, NPU | PASS |
Fire Classification | TFLite:2.1.0 | RSB-3720 | 5.4.24_2.1.0 | GPU, NPU | PASS |
Fire Classification | ArmNN:19.08 | RSB-3720 | 5.4.24_2.1.0 | GPU, NPU | PASS |
Pose Detection | TFLite:2.1.0 | RSB-3720 | 5.4.24_2.1.0 | GPU, NPU | PASS |
Face/Eyes Detection | OpenCV:4.2.0 | RSB-3720 | 5.4.24_2.1.0 | GPU, NPU | PASS |
Applications Example - Switch Detection Video
This application offers a graphical interface for users to run an object detection demo using either CPU or GPU/NPU to perform inference on a video file.
- Run the Switch Detection Video demo using the following line:
$ pyeiq --run switch_video
- Type CPU or GPU/NPU in the terminal to switch between cores (see the sketch after this list for how core selection maps to TensorFlow Lite delegates).
- This runs inference on a default video:
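Under the hood, switching between cores comes down to which TensorFlow Lite delegate the interpreter is created with. The sketch below is not the PyeIQ implementation; the model path is a placeholder and the VX delegate library path (/usr/lib/libvx_delegate.so) is an assumption that applies to recent i.MX BSPs and may differ on older releases:

# core_switch.py - sketch of selecting the inference core for a TFLite model
# (model and delegate paths are assumptions; adjust them for your BSP)
from tflite_runtime.interpreter import Interpreter, load_delegate

MODEL = "/path/to/model.tflite"              # placeholder model path
VX_DELEGATE = "/usr/lib/libvx_delegate.so"   # GPU/NPU delegate on recent i.MX BSPs (assumption)

def make_interpreter(core="NPU"):
    if core == "CPU":
        return Interpreter(model_path=MODEL)  # plain CPU execution
    # Offload supported operators to the GPU/NPU through the VX delegate
    return Interpreter(model_path=MODEL,
                       experimental_delegates=[load_delegate(VX_DELEGATE)])

interpreter = make_interpreter("NPU")
interpreter.allocate_tensors()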
Demos Example - Running Object Detection SSD
Object detection is a computer technology related to computer vision and image processing that deals with detecting instances of semantic objects of a certain class (such as humans, buildings, or cars) in digital images and videos. Well-researched domains of object detection include face detection and pedestrian detection. Object detection has applications in many areas of computer vision, including image retrieval and video surveillance.
- Run the Object Detection Default Image demo using the following line:
$ pyeiq --run object_detection_tflite
- This runs inference on a default image:
- Run the Object Detection Custom Image demo using the following line:
$ pyeiq --run object_detection_tflite --image=/path_to_the_image
- Run the Object Detection Video File using the following line:
$ pyeiq --run object_detection_tflite --video_src=/path_to_the_video
- Run the Object Detection Video Camera or Webcam using the following line:
$ pyeiq --run object_detection_tflite --video_src=/dev/video<index>
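For reference, the sketch below shows how the outputs of a typical post-processed SSD .tflite model (bounding boxes, class indices, scores) can be read with tflite_runtime. It is a generic example rather than the PyeIQ demo code; the model and image paths are placeholders, and the output tensor order can vary between models:

# ssd_sketch.py - generic example of running an SSD .tflite model on one image
# (paths are placeholders; this is not the PyeIQ demo implementation)
import numpy as np
from PIL import Image
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="/path/to/ssd_model.tflite")  # placeholder
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
_, height, width, _ = inp["shape"]

# Resize the input image to the model's expected size (assumes a uint8-quantized model)
image = Image.open("/path/to/image.jpg").convert("RGB").resize((width, height))
interpreter.set_tensor(inp["index"], np.expand_dims(np.asarray(image, dtype=np.uint8), axis=0))
interpreter.invoke()

# Post-processed SSD models usually expose boxes, classes, scores, and a detection count
out = interpreter.get_output_details()
boxes = interpreter.get_tensor(out[0]["index"])[0]    # [ymin, xmin, ymax, xmax], normalized
classes = interpreter.get_tensor(out[1]["index"])[0]  # class indices
scores = interpreter.get_tensor(out[2]["index"])[0]   # confidence scores
for box, cls, score in zip(boxes, classes, scores):
    if score > 0.5:
        print(f"class {int(cls)} score {score:.2f} box {box}")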