<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>https://ess-wiki.advantech.com.tw/wiki/index.php?action=history&amp;feed=atom&amp;title=Edge_AI_SDK%2FAI_Framework%2FRK3576</id>
		<title>Edge AI SDK/AI Framework/RK3576 - Revision history</title>
		<link rel="self" type="application/atom+xml" href="https://ess-wiki.advantech.com.tw/wiki/index.php?action=history&amp;feed=atom&amp;title=Edge_AI_SDK%2FAI_Framework%2FRK3576"/>
		<link rel="alternate" type="text/html" href="https://ess-wiki.advantech.com.tw/wiki/index.php?title=Edge_AI_SDK/AI_Framework/RK3576&amp;action=history"/>
		<updated>2026-05-02T13:20:44Z</updated>
		<subtitle>Revision history for this page on the wiki</subtitle>
		<generator>MediaWiki 1.28.3</generator>

	<entry>
		<id>https://ess-wiki.advantech.com.tw/wiki/index.php?title=Edge_AI_SDK/AI_Framework/RK3576&amp;diff=42142&amp;oldid=prev</id>
		<title>Zhihao.zhu at 01:18, 30 April 2026</title>
		<link rel="alternate" type="text/html" href="https://ess-wiki.advantech.com.tw/wiki/index.php?title=Edge_AI_SDK/AI_Framework/RK3576&amp;diff=42142&amp;oldid=prev"/>
				<updated>2026-04-30T01:18:00Z</updated>
		
		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class='diff-marker' /&gt;
				&lt;col class='diff-content' /&gt;
				&lt;col class='diff-marker' /&gt;
				&lt;col class='diff-content' /&gt;
				&lt;tr style='vertical-align: top;' lang='en'&gt;
				&lt;td colspan='2' style=&quot;background-color: white; color:black; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan='2' style=&quot;background-color: white; color:black; text-align: center;&quot;&gt;Revision as of 01:18, 30 April 2026&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l51&quot; &gt;Line 51:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 51:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f9f9f9; color: #333333; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #e6e6e6; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;| style=&amp;quot;width: 154px;&amp;quot; | Application&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f9f9f9; color: #333333; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #e6e6e6; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;| style=&amp;quot;width: 154px;&amp;quot; | Application&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f9f9f9; color: #333333; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #e6e6e6; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;| style=&amp;quot;width: 179px;&amp;quot; | Model&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f9f9f9; color: #333333; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #e6e6e6; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;| style=&amp;quot;width: 179px;&amp;quot; | Model&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color:black; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;| style=&amp;quot;width: 179px;&amp;quot; | AOM-&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;3821 &lt;/del&gt;FPS (video file)&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color:black; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;| style=&amp;quot;width: 179px;&amp;quot; | AOM-&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;3841 &lt;/ins&gt;FPS (video file)&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color:black; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;| style=&amp;quot;width: 179px;&amp;quot; | &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;ASR&lt;/del&gt;-&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;A501 &lt;/del&gt;FPS (video file)&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color:black; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;| style=&amp;quot;width: 179px;&amp;quot; | &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;AOM&lt;/ins&gt;-&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;5841 &lt;/ins&gt;FPS (video file)&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f9f9f9; color: #333333; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #e6e6e6; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;|-&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f9f9f9; color: #333333; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #e6e6e6; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;|-&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f9f9f9; color: #333333; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #e6e6e6; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;| style=&amp;quot;width: 154px;&amp;quot; | Object Detection&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f9f9f9; color: #333333; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #e6e6e6; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;| style=&amp;quot;width: 154px;&amp;quot; | Object Detection&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Zhihao.zhu</name></author>	</entry>

	<entry>
		<id>https://ess-wiki.advantech.com.tw/wiki/index.php?title=Edge_AI_SDK/AI_Framework/RK3576&amp;diff=42140&amp;oldid=prev</id>
		<title>Zhihao.zhu: Created page with &quot; = RKNN SDK =  RKNN SDK （[https://pan.baidu.com/s/1R0DhNJU56Uhp4Id7AzNfgQ Baidu] Password: a887）include two parts:  *rknpu2 (on the Board End)  *rknn-toolkit2 (on the PC)...&quot;</title>
		<link rel="alternate" type="text/html" href="https://ess-wiki.advantech.com.tw/wiki/index.php?title=Edge_AI_SDK/AI_Framework/RK3576&amp;diff=42140&amp;oldid=prev"/>
				<updated>2026-04-30T01:10:35Z</updated>
		
		<summary type="html">&lt;p&gt;Created page with &amp;quot; = RKNN SDK =  RKNN SDK （[https://pan.baidu.com/s/1R0DhNJU56Uhp4Id7AzNfgQ Baidu] Password: a887）include two parts:  *rknpu2 (on the Board End)  *rknn-toolkit2 (on the PC)...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;&lt;br /&gt;
= RKNN SDK =&lt;br /&gt;
&lt;br /&gt;
RKNN SDK ([https://pan.baidu.com/s/1R0DhNJU56Uhp4Id7AzNfgQ Baidu] Password: a887) includes two parts:&lt;br /&gt;
&lt;br /&gt;
*rknpu2 (on the board) &lt;br /&gt;
*rknn-toolkit2 (on the PC) &lt;br /&gt;
&amp;lt;pre&amp;gt;├── rknpu2&lt;br /&gt;
│   ├── Driver&lt;br /&gt;
│   └── RKNPU2 Environment&lt;br /&gt;
└── rknn-toolkit2&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== RKNPU2 ==&lt;br /&gt;
&lt;br /&gt;
RKNPU2 includes the driver and runtime environment that help you quickly develop AI applications using RKNN models (*.rknn). For more information, refer to&amp;amp;nbsp;[https://ess-wiki.advantech.com.tw/view/RK_Platform_NPU_SDK RK_Platform_NPU_SDK]&lt;br /&gt;
&lt;br /&gt;
=== RKNPU2 Driver ===&lt;br /&gt;
&lt;br /&gt;
All official board firmware images have the RKNPU2 driver installed.&amp;lt;br/&amp;gt; You can run the following command on the board to query the RKNPU2 driver version:&amp;lt;br/&amp;gt; dmesg | grep -i rknpu&lt;br /&gt;
&lt;br /&gt;
=== RKNPU2 Environment ===&lt;br /&gt;
&lt;br /&gt;
Here are two basic concepts in the RKNPU2 environment:&lt;br /&gt;
&lt;br /&gt;
*'''RKNN Server''': A background proxy service running on the development board. Its main function is to call the corresponding board-side runtime interfaces to process data sent from the PC over USB, and to return the processing results to the PC. &lt;br /&gt;
&lt;br /&gt;
*'''RKNPU2 Runtime library (librknnrt.so)''': Its main responsibility is to load RKNN models on the system and run model inference by driving the dedicated neural processing unit (NPU). &lt;br /&gt;
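&lt;br /&gt;
Below is a minimal board-side Python sketch of driving this runtime. It assumes the rknn-toolkit-lite2 package (whose RKNNLite class wraps librknnrt.so) is installed on the board; the file names model.rknn and test.jpg are illustrative placeholders, not files referenced on this page.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;# Board-side inference sketch; assumes rknn-toolkit-lite2 is installed.&lt;br /&gt;
import cv2&lt;br /&gt;
from rknnlite.api import RKNNLite&lt;br /&gt;
&lt;br /&gt;
rknn_lite = RKNNLite()&lt;br /&gt;
rknn_lite.load_rknn('model.rknn')        # load a converted RKNN model (placeholder name)&lt;br /&gt;
rknn_lite.init_runtime()                 # bind to the NPU through librknnrt.so&lt;br /&gt;
&lt;br /&gt;
img = cv2.imread('test.jpg')             # placeholder input image&lt;br /&gt;
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)&lt;br /&gt;
outputs = rknn_lite.inference(inputs=[img])&lt;br /&gt;
print([o.shape for o in outputs])        # inspect the output tensors&lt;br /&gt;
&lt;br /&gt;
rknn_lite.release()&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;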
&lt;br /&gt;
== RKNN-TOOLKIT2 ==&lt;br /&gt;
&lt;br /&gt;
RKNN-Toolkit2 is a development kit that provides model conversion, inference, and performance evaluation on PC platforms. Through the Python interface provided by the tool, users can easily perform the following functions (a minimal Python sketch is shown below the list):&lt;br /&gt;
&lt;br /&gt;
#&amp;lt;span style=&amp;quot;color:#0000ff;&amp;quot;&amp;gt;'''Model conversion'''&amp;lt;/span&amp;gt;: supports converting Caffe / TensorFlow / TensorFlow Lite / ONNX / Darknet / PyTorch models to the RKNN format, and supports RKNN model import/export, so converted models can later be deployed on the Rockchip NPU platform. &lt;br /&gt;
#&amp;lt;span style=&amp;quot;color:#0000ff;&amp;quot;&amp;gt;'''Quantization'''&amp;lt;/span&amp;gt;: supports converting floating-point models to quantized models; the currently supported quantization methods include asymmetric quantization (asymmetric_quantized-8) and hybrid quantization. &lt;br /&gt;
#&amp;lt;span style=&amp;quot;color:#0000ff;&amp;quot;&amp;gt;'''Model inference'''&amp;lt;/span&amp;gt;: can simulate the NPU to run the RKNN model on the PC and obtain inference results; the tool can also dispatch the RKNN model to a specified NPU device and collect the inference results from it. &lt;br /&gt;
#&amp;lt;span style=&amp;quot;color:#0000ff;&amp;quot;&amp;gt;'''Performance &amp;amp; Memory evaluation'''&amp;lt;/span&amp;gt;: dispatches the RKNN model to a specified NPU device and evaluates the model's performance and memory consumption on the actual device. &lt;br /&gt;
#&amp;lt;span style=&amp;quot;color:#0000ff;&amp;quot;&amp;gt;'''Quantization error analysis'''&amp;lt;/span&amp;gt;: reports the Euclidean or cosine distance between each layer's inference results before and after quantization; this helps analyze where quantization error arises and provides ideas for improving the accuracy of quantized models. &lt;br /&gt;
#&amp;lt;span style=&amp;quot;color:#0000ff;&amp;quot;&amp;gt;'''Model encryption'''&amp;lt;/span&amp;gt;: encrypts the RKNN model as a whole using the specified encryption method. &lt;br /&gt;
&lt;br /&gt;
For more information, refer to [https://github.com/airockchip/rknn-toolkit2 rknn-toolkit2] and [https://ess-wiki.advantech.com.tw/view/RK_Platform_NPU_SDK#rknn-toolkit2 RK_Platform_NPU_SDK].&lt;br /&gt;
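&lt;br /&gt;
Below is a minimal PC-side Python sketch of the conversion flow described above. It assumes rknn-toolkit2 is installed; the ONNX model name, mean/std values, and calibration dataset file are illustrative placeholders rather than values taken from this page, and exact parameters may vary between toolkit versions.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;# PC-side conversion sketch; assumes rknn-toolkit2 is installed (file names are placeholders).&lt;br /&gt;
from rknn.api import RKNN&lt;br /&gt;
&lt;br /&gt;
rknn = RKNN(verbose=True)&lt;br /&gt;
&lt;br /&gt;
# 1. Configure preprocessing and the target platform.&lt;br /&gt;
rknn.config(mean_values=[[0, 0, 0]], std_values=[[255, 255, 255]], target_platform='rk3576')&lt;br /&gt;
&lt;br /&gt;
# 2. Model conversion: load an ONNX model (Caffe/TensorFlow/TFLite/Darknet/PyTorch loaders also exist).&lt;br /&gt;
rknn.load_onnx(model='model.onnx')&lt;br /&gt;
&lt;br /&gt;
# 3. Quantization: build with asymmetric int8 quantization, using a calibration list file.&lt;br /&gt;
rknn.build(do_quantization=True, dataset='dataset.txt')&lt;br /&gt;
&lt;br /&gt;
# 4. Export the converted model for deployment on the board.&lt;br /&gt;
rknn.export_rknn('model.rknn')&lt;br /&gt;
&lt;br /&gt;
# 5. Model inference on the PC simulator (pass target='rk3576' to run on a connected board).&lt;br /&gt;
rknn.init_runtime()&lt;br /&gt;
# outputs = rknn.inference(inputs=[input_array])&lt;br /&gt;
# (rknn.accuracy_analysis can additionally report per-layer quantization error.)&lt;br /&gt;
&lt;br /&gt;
rknn.release()&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;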
&lt;br /&gt;
&amp;amp;nbsp;&lt;br /&gt;
&lt;br /&gt;
= Applications =&lt;br /&gt;
&lt;br /&gt;
== Edge AI SDK&amp;amp;nbsp;/ Vision Application ==&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;1&amp;quot; cellspacing=&amp;quot;1&amp;quot; style=&amp;quot;width: 900px;&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | Application&lt;br /&gt;
| style=&amp;quot;width: 179px;&amp;quot; | Model&lt;br /&gt;
| style=&amp;quot;width: 179px;&amp;quot; | AOM-3821 FPS (video file)&lt;br /&gt;
| style=&amp;quot;width: 179px;&amp;quot; | ASR-A501 FPS (video file)&lt;br /&gt;
|-&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | Object Detection&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | yolov10&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | 25&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | 25&lt;br /&gt;
|-&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | Person Detection&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | yolov5&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | 25&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | 25&lt;br /&gt;
|-&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | Face Detection&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | retinaface&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | 25&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | 25&lt;br /&gt;
|-&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | Pose Estimation&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | yolov8_pose&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | 25&lt;br /&gt;
| style=&amp;quot;width: 154px;&amp;quot; | 25&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&amp;amp;nbsp;&lt;br /&gt;
&lt;br /&gt;
= Benchmark =&lt;br /&gt;
&lt;br /&gt;
RK3576 can deploy a wide range of popular DNN models and ML frameworks to the edge with high-performance inference, for tasks such as real-time classification, object detection, pose estimation, semantic segmentation, and natural language processing (NLP). For more information, refer to&amp;amp;nbsp;[https://github.com/airockchip https://github.com/airockchip]&lt;br /&gt;
&lt;br /&gt;
Advantech has wrapped the rknn_common_test command in the npu_stress_test.sh script, which can be used to stress-test the NPU.&lt;br /&gt;
&lt;br /&gt;
&amp;amp;nbsp;&lt;br /&gt;
&lt;br /&gt;
= Utility =&lt;br /&gt;
&lt;br /&gt;
*'''sysstat''': This SDK provides the mpstat utility from the sysstat package, which reports processor and memory usage for the device. For more information, refer to&amp;amp;nbsp;[https://github.com/sysstat/sysstat Link]&amp;lt;br/&amp;gt; &amp;amp;nbsp; &lt;br /&gt;
*'''OpenCV4.6''': OpenCV (Open Source Computer Vision Library: [http://opencv.org http://opencv.org]) is an open-source library that includes several hundred computer vision algorithms. For more information, refer to&amp;amp;nbsp;[https://github.com/opencv/opencv/wiki/OpenCV-Change-Logs-v2.2‐v4.10 Link]&amp;lt;br/&amp;gt; &amp;amp;nbsp;&lt;/div&gt;</summary>
		<author><name>Zhihao.zhu</name></author>	</entry>

	</feed>