Difference between revisions of "NXP eIQ"

From ESS-WIKI
 
===== <span style="color:#0070c0">Run Applications and Demos</span> =====
 
 
 
*Applications
 
 
 
{| border="1" cellpadding="1" cellspacing="1" style="width: 599px;"
|-
| style="width: 172px;" | '''Application Name'''
| style="width: 80px;" | '''Framework'''
| style="width: 77px;" | '''i.MX Board'''
| style="width: 86px;" | '''BSP Release'''
| style="width: 112px;" | '''Inference Core'''
| style="width: 44px;" | '''Status'''
|-
| Switch Classification Image
| TFLite:2.1.0
| RSB-3720
| 5.4.24_2.1.0
| CPU, GPU, NPU
| PASS
|-
| Switch Detection Video
| TFLite:2.1.0
| RSB-3720
| 5.4.24_2.1.0
| CPU, GPU, NPU
| PASS
|}
 
 
 
*Demos  
 
 
 
{| border="1" cellpadding="1" cellspacing="1" style="width: 616px;"
|-
| style="width: 179px;" | '''Demo&nbsp;Name'''
| style="width: 92px;" | '''Framework'''
| style="width: 81px;" | '''i.MX Board'''
| style="width: 92px;" | '''BSP Release'''
| style="width: 103px;" | '''Inference Core'''
| style="width: 44px;" | '''Status'''
|-
| Object Classification
| TFLite:2.1.0
| RSB-3720
| 5.4.24_2.1.0
| GPU, NPU
| PASS
|-
| Object Detection SSD
| TFLite:2.1.0
| RSB-3720
| 5.4.24_2.1.0
| GPU, NPU
| PASS
|-
| Object Detection YOLOv3
| TFLite:2.1.0
| RSB-3720
| 5.4.24_2.1.0
| GPU, NPU
| PASS
|-
| Object Detection DNN
| OpenCV:4.2.0
| RSB-3720
| 5.4.24_2.1.0
| CPU
| PASS
|-
| Facial Expression Detection
| TFLite:2.1.0
| RSB-3720
| 5.4.24_2.1.0
| GPU, NPU
| PASS
|-
| Fire Classification
| TFLite:2.1.0
| RSB-3720
| 5.4.24_2.1.0
| GPU, NPU
| PASS
|-
| Fire Classification
| ArmNN:19.08
| RSB-3720
| 5.4.24_2.1.0
| GPU, NPU
| PASS
|-
| Pose Detection
| TFLite:2.1.0
| RSB-3720
| 5.4.24_2.1.0
| GPU, NPU
| PASS
|-
| Face/Eyes Detection
| OpenCV:4.2.0
| RSB-3720
| 5.4.24_2.1.0
| GPU, NPU
| PASS
|}
 
 
 
====== <span style="color:#0070c0">Applications Example - Switch Detection Video</span> ======
 
  
This application provides a graphical interface that lets the user run an object detection demo on a video file, performing inference on either the CPU or the GPU/NPU.
*Run the&nbsp;''Switch Detection Video''&nbsp;demo using the following line: <pre>$ pyeiq --run switch_video</pre>
*Type&nbsp;'''CPU'''&nbsp;or&nbsp;'''GPU'''/'''NPU'''&nbsp;in the terminal to switch between cores.
 
**This runs inference on a default video: 
 
 
 
[[File:Switch detection resized logo.gif|RTENOTITLE]]
 
 
 

Revision as of 11:24, 27 September 2023

=== <span style="color:#0070c0">NXP i.MX series</span> ===

The i.MX 8M Plus family focuses on a neural processing unit (NPU) and vision system, advanced multimedia, and industrial automation with high reliability.

*The Neural Processing Unit (NPU) of the i.MX 8M Plus operates at up to 2.3 TOPS.
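For context, 1 TOPS is 10<sup>12</sup> operations per second. As an illustrative sanity check only (the clock rate and MAC count below are assumed values, not published specifications), such a rating can be related to a number of parallel multiply-accumulate (MAC) units:

```python
# Illustrative arithmetic only: the clock and MAC count are assumptions.
clock_hz = 1.0e9      # assumed NPU clock of 1 GHz
ops_per_mac = 2       # one multiply-accumulate counts as two operations
mac_units = 1150      # hypothetical number of parallel INT8 MAC units
tops = clock_hz * ops_per_mac * mac_units / 1e12
print(tops)  # 2.3
```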

==== <span style="color:#0070c0">NXP Demo Experience</span> ====

*Preinstalled on NXP-provided demo Linux images
*The imx-image-full image must be used
*Supported on Yocto 3.3 (5.10.52_2.1.0) through Yocto 4.2 (6.1.1_1.0.0)
*An internet connection is required

Start the demo launcher by clicking the NXP logo displayed in the top left-hand corner of the screen:

[[File:2023-09-27 155355.png|400px]]

[[File:2023-09-27 155552.png|400px]]

==== <span style="color:#0070c0">Machine Learning Demos</span> ====

*NNStreamer demos
**Object classification
**Object detection
**Pose detection
**Brand detection
**ML gateway

[[File:2023-09-27 155836.png|400px]]

*OpenCV demos
**Face recognition
**Selfie segmentation

[[File:2023-09-27 162615.png|400px]]

===== <span style="color:#0070c0">NNStreamer Demo: Object Detection</span> =====

Click "Object Detection", then click Launch Demo.

[[File:111111111333.png|400px]]

Set the parameters:

[[File:22222222222.png|400px]]

*Source: Select the camera to use, or use the example video.
*Backend: Select whether to use the NPU (if available) or the CPU for inference.
*Height: Select the input height of the video if using a camera.
*Width: Select the input width of the video if using a camera.
*Label Color: Select the color of the overlay labels.

The result of NPU object detection:

[[File:3333333333.png|400px]]
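These demos are built on NNStreamer, which runs the model inside a GStreamer pipeline. The helper below is a rough sketch, not the demo's actual code: the element layout follows the common NNStreamer pattern, while the model path and the external-delegate option are assumptions for illustration. It shows how the Source/Backend/Width/Height parameters could map onto a pipeline description:

```python
def build_pipeline(source="/dev/video0", backend="NPU", width=640, height=480,
                   model="/usr/share/models/detection.tflite"):
    """Compose a gst-launch-style NNStreamer pipeline string from the demo parameters.

    The delegate-library name below is an assumption (an external delegate such
    as the VX delegate is one way the NPU/GPU can be selected on i.MX boards).
    """
    accel = ("custom=Delegate:External,ExtDelegateLib:libvx_delegate.so"
             if backend in ("NPU", "GPU") else "")
    filter_part = f"tensor_filter framework=tensorflow-lite model={model} {accel}".rstrip()
    return (
        f"v4l2src device={source} ! "
        f"video/x-raw,width={width},height={height} ! videoconvert ! "
        "tensor_converter ! "
        + filter_part + " ! "
        "tensor_sink"
    )
```

For example, `build_pipeline(backend="CPU")` drops the external-delegate option so inference falls back to the CPU.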

===== <span style="color:#0070c0">NXP Demo Experience - Text User Interface (TUI)</span> =====

*Start the TUI with the command: <pre>demoexperience tui</pre>

[[File:2023-09-27 170604.png|400px]]

==== <span style="color:#0070c0">eIQ - A Python Framework for eIQ on i.MX Processors</span> ====

PyeIQ is written on top of the eIQ™ ML Software Development Environment and provides a set of Python classes that allow the user to run machine learning applications in a simplified and efficient way, without spending time on cross-compilation, deployment, or reading extensive guides.

===== <span style="color:#0070c0">Installation</span> =====

*Method 1: Use the pip3 tool to install the package from the PyPI repository: <pre>$ pip3 install pyeiq</pre>
*Method 2: Download the latest tarball ([[:File:pyeiq-3.1.0.tar.gz]]) and copy it to the board, then install it: <pre>$ pip3 install <tarball></pre>

===== <span style="color:#0070c0">How to Run Samples</span> =====

*Start the manager tool: <pre>$ pyeiq</pre>
*The above command returns the PyeIQ manager tool options:

{| border="1" cellpadding="1" cellspacing="1"
|-
| '''Manager Tool Command'''
| '''Description'''
| '''Example'''
|-
| pyeiq --list-apps
| List the available applications.
| 
|-
| pyeiq --list-demos
| List the available demos.
| 
|-
| pyeiq --run <app_name/demo_name>
| Run the application or demo.
| # pyeiq --run object_detection_tflite
|-
| pyeiq --info <app_name/demo_name>
| Application or demo short description and usage.
| # pyeiq --info object_detection_tflite
|-
| pyeiq --clear-cache
| Clear cached media generated by demos.
| 
|}
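The manager tool can also be driven from a Python script on the board. The wrapper below is a small sketch (not part of PyeIQ itself) that assembles the documented command lines and hands them to `subprocess`:

```python
import subprocess

def pyeiq_cmd(action, name=None, **options):
    """Build an argv list for the PyeIQ manager tool (e.g. run, info, list-demos)."""
    cmd = ["pyeiq", f"--{action}"]
    if name is not None:
        cmd.append(name)
    for key, value in options.items():
        cmd.append(f"--{key}={value}")
    return cmd

def pyeiq(action, name=None, **options):
    # Requires a board with PyeIQ installed; raises on a non-zero exit code.
    return subprocess.run(pyeiq_cmd(action, name, **options), check=True)
```

For example, `pyeiq_cmd("run", "object_detection_tflite", image="/path_to_the_image")` reproduces the custom-image command line documented for the object detection demo.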

===== <span style="color:#0070c0">PyeIQ Demos</span> =====

*covid19_detection
*object_classification_tflite
*object_detection_tflite
====== <span style="color:#0070c0">Demos Example - Running Object Detection</span> ======

Object detection is a computer technology related to computer vision and image processing that deals with detecting instances of semantic objects of a certain class (such as humans, buildings, or cars) in digital images and videos. Well-researched domains of object detection include face detection and pedestrian detection. Object detection has applications in many areas of computer vision, including image retrieval and video surveillance.

*Run the&nbsp;''Object Detection''&nbsp;'''Default Image'''&nbsp;demo using the following line: <pre>$ pyeiq --run object_detection_tflite</pre>
**This runs inference on a default image:

[[File:2023-09-27_172321.png|400px]]

[[File:Image eiqobjectdetection resized logo.gif|RTENOTITLE]]

*Run the&nbsp;''Object Detection''&nbsp;'''Custom Image'''&nbsp;demo using the following line: <pre>$ pyeiq --run object_detection_tflite --image=/path_to_the_image</pre>
*Run the&nbsp;''Object Detection''&nbsp;'''Video File'''&nbsp;demo using the following line: <pre>$ pyeiq --run object_detection_tflite --video_src=/path_to_the_video</pre>
*Run the&nbsp;''Object Detection''&nbsp;'''Video Camera or Webcam'''&nbsp;demo using the following line: <pre>$ pyeiq --run object_detection_tflite --video_src=/dev/video<index></pre>
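All of the detection demos above end with the same post-processing step: discard boxes below a confidence threshold and suppress overlapping duplicates (non-maximum suppression). A minimal, framework-independent sketch of that logic, with illustrative threshold values:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(detections, score_thresh=0.5, iou_thresh=0.45):
    """Greedy non-maximum suppression over (box, score) pairs."""
    kept = []
    for box, score in sorted(detections, key=lambda d: d[1], reverse=True):
        if score < score_thresh:
            continue  # below the confidence threshold
        if all(iou(box, k) < iou_thresh for k, _ in kept):
            kept.append((box, score))  # no strong overlap with a kept box
    return kept
```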