Kneron toolchain not converting the YOLOv5s to KL520
Hi,
I'm using the kneron/toolchain:v0.28.0 docker image to convert the YOLOv5s noupsample model to .nef for the KL520. The steps I'm following are:
1 Run the container, train the model, and export to ONNX
franklinls@MLOPS-Machine:~$ sudo docker run -it --rm --name=kneron kneron/toolchain
(onnx1.13) root@dc4f5de59e3d:/workspace# conda deactivate
(base) root@dc4f5de59e3d:/workspace# cd /workspace/ai_training/detection/yolov5/yolov5/
(base) root@dc4f5de59e3d:/workspace/ai_training/detection/yolov5/yolov5# wget https://raw.githubusercontent.com/kneron/Model_Zoo/main/detection/yolov5/yolov5s-noupsample/best.pt
(base) root@dc4f5de59e3d:/workspace/ai_training/detection/yolov5/yolov5# CUDA_VISIBLE_DEVICES='0' python3 train.py --data coco128.yaml --cfg yolov5s-noupsample.yaml --weights 'best.pt' --batch-size 4 --epoch 2 --workers 2
...
Optimizer stripped from runs/train/exp/weights/best.pt, 13.8MB
2 epochs completed in 0.032 hours.
(base) root@dc4f5de59e3d:/workspace/ai_training/detection/yolov5/yolov5# python3 ../exporting/yolov5_export.py --data ../yolov5/data/model_paths_520_coco128.yaml
self.vanish_point 0.0
Starting ONNX export with onnx 1.7.0...
****onnx file**** ./yolov5s-noupsample-coco128.onnx
...
ONNX export success, saved as ./yolov5s-noupsample-coco128.onnx
2 Model conversion from ONNX to NEF (KL520)
(base) root@dc4f5de59e3d:/workspace/ai_training/detection/yolov5/yolov5# conda activate onnx1.13
(onnx1.13) root@dc4f5de59e3d:/workspace/ai_training/detection/yolov5/yolov5# python onnx_to_nef.py
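Here onnx_to_nef.py follows the standard ktc flow (km.evaluate → km.analysis → ktc.compile). A minimal sketch of that flow is shown below; the model ID/version, calibration image list, and preprocessing are illustrative placeholders, not the exact script:

import ktc
import onnx
import numpy as np
from PIL import Image

# load the exported ONNX and register it for the KL520 (kdp520) target
onnx_model = onnx.load("./yolov5s-noupsample-coco128.onnx")
km = ktc.ModelConfig(32769, "0001", "520", onnx_model=onnx_model)  # model ID/version are placeholders

eval_result = km.evaluate()  # section 2.1: NPU/CPU node report and fps estimate

def preprocess(path):
    # resize to the model input size and normalize to [0, 1] (assumed preprocessing)
    img = Image.open(path).convert("RGB").resize((640, 640))
    return np.array(img) / 255.0

calib_images = [preprocess(p) for p in ["img001.jpg", "img002.jpg"]]  # placeholder image list
bie_path = km.analysis({"images": calib_images})  # section 2.2: fixed-point analysis -> .bie
                                                  # the dict key must match the ONNX input tensor name

nef_path = ktc.compile([km])  # section 2.3: compile the .bie into a .nef
print(nef_path)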
2.1 km.evaluate() output
docker_version: kneron/toolchain:v0.28.0
comments:
kdp520/input bitwidth: int8
kdp520/output bitwidth: int8
kdp520/cpu bitwidth: int8
kdp520/datapath bitwidth: int8
kdp520/weight bitwidth: int8
kdp520/ip_eval/fps: 4.64802
kdp520/ip_eval/ITC(ms): 215.145 ms
kdp520/ip_eval/RDMA bandwidth GB/s: 0.8
kdp520/ip_eval/WDMA bandwidth GB/s: 0.8
kdp520/ip_eval/GETW bandwidth GB/s: 0.8
kdp520/ip_eval/cpu_node: Sigmoid: Sigmoid_129, Sigmoid_131, Sigmoid_133
gen fx model report: model_fx_report.html
gen fx model json: model_fx_report.json
2.2 km.analysis() output
Failure for model "input/input" when running "kdp520/unimplemented feature"
Fixed-point analysis done. Saved bie model to '/data1/kneron_flow/input.kdp520.scaled.bie'
2.3 km.compile() output
[common][error][exceptions.cc:41][KneronException] UnimplementedFeature: undefined CPU op [Sigmoid] of node [Sigmoid_129]
[tool][error][batch_compile.cc:528][BatchCompile] Failed to compile input.kdp520.scaled.bie
As shown above, the conversion fails because the Sigmoid operation in node Sigmoid_129 is mapped to the CPU. So, how can I fix this?
Comments
Hi,
The nodes labeled as "cpu_node" are nodes that are not supported by the KL520 NPU:
kdp520/ip_eval/cpu_node: Sigmoid: Sigmoid_129, Sigmoid_131, Sigmoid_133
Since the KL520 doesn't support Sigmoid nodes, you will need to cut them from your ONNX model and then add them back in your post-processing.
You can find a list of supported nodes here: Hardware Supported Operators - Document Center
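For illustration, if the three Sigmoid nodes sit directly in front of the graph outputs (as in the standard YOLOv5 heads), a rough sketch of cutting them with the onnx Python package and re-applying the activation in post-processing could look like this; the graph layout is an assumption about your exported model:

import numpy as np
import onnx

def cut_output_sigmoids(in_path, out_path):
    """Remove Sigmoid nodes that feed the graph outputs and expose their
    input tensors as the new outputs (shapes are unchanged, Sigmoid is elementwise)."""
    model = onnx.load(in_path)
    graph = model.graph
    original_outputs = {o.name for o in graph.output}
    removed = []
    for node in list(graph.node):
        if node.op_type == "Sigmoid" and node.output[0] in original_outputs:
            for out in graph.output:
                if out.name == node.output[0]:
                    out.name = node.input[0]  # rewire the graph output to the Sigmoid's input
            graph.node.remove(node)
            removed.append(node.name)
    onnx.save(model, out_path)
    return removed  # e.g. ['Sigmoid_129', 'Sigmoid_131', 'Sigmoid_133']

def sigmoid(x):
    # re-apply the cut activation on the raw NPU outputs during post-processing
    return 1.0 / (1.0 + np.exp(-x))

After cutting, re-running km.evaluate() should show an empty cpu_node list before you compile again.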
Hi Maria,
Thanks for your reply.
I've found the sigmoid in the post-processing of the kl520_kn-model-zoo_generic_image_inference_post_yolov5 example.
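For anyone who finds this thread later: a rough sketch of the standard YOLOv5 decode that re-applies the sigmoid on one raw output head is below; the exact tensor layout in the Kneron example may differ, so treat the shapes and anchor handling here as assumptions:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode_head(raw, anchors, stride):
    # raw: (ny, nx, num_anchors, 5 + num_classes) output of one cut head
    # anchors: (num_anchors, 2) anchor sizes in pixels for this stride
    ny, nx = raw.shape[:2]
    out = sigmoid(raw)  # the activation that was removed from the ONNX graph
    gy, gx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    grid = np.stack((gx, gy), axis=-1)[:, :, None, :]   # (ny, nx, 1, 2)
    xy = (out[..., 0:2] * 2.0 - 0.5 + grid) * stride    # box centers in pixels
    wh = (out[..., 2:4] * 2.0) ** 2 * anchors            # box sizes in pixels
    conf = out[..., 4:5] * out[..., 5:]                  # objectness * class scores
    return np.concatenate([xy, wh, conf], axis=-1)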
Hi again Maria,
In my project, I'm using the Banana Pi P2 Zero with Armbian to access the KL520.
So, I'm interested in using the Kneron DFUT on this board to flash the firmware, but only the Raspberry Pi binary is available. How can I proceed in this case?
Hi Franklin,
These are the platforms you can build and run Kneron PLUS on:
If you are not using one of the above platforms, you could download Kneron PLUS and try building the DFUT console to flash the KL520 firmware:
Build with DFUT console: Build Kneron PLUS - Document Center
You could then use the console to switch the KL520 to USB Boot mode or Flash Boot mode and upload the firmware.
DFUT console: Run Examples - Document Center
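As a side note, if the dongle is set to USB Boot mode, the firmware is loaded from the host at runtime rather than from flash. A minimal sketch with the Kneron PLUS Python API (kp) is below; the firmware file paths are assumptions based on the files shipped with Kneron PLUS:

import sys
import kp

# USB port ID of the KL520, e.g. obtained from the device-scan example in Kneron PLUS
usb_port_id = int(sys.argv[1])

# connect to the device
device_group = kp.core.connect_devices(usb_port_ids=[usb_port_id])

# in USB Boot mode the SCPU/NCPU firmware must be loaded from the host after every power-up
kp.core.load_firmware_from_file(device_group=device_group,
                                scpu_fw_path="fw_scpu.bin",   # assumed: KL520 firmware files from the PLUS release
                                ncpu_fw_path="fw_ncpu.bin")
print("Firmware loaded.")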
Hi Maria,
Thanks for your reply. I'll follow this build process.
Hi again Maria,
Is there any hardware library (symbol and footprint) available for the KL520 TFBGA 8x8 159-I/O package? For reference, we're using KiCad.
Hi again Franklin,
For KL520 hardware information, you could refer to these documents available on our Developer Center: Developers | Kneron – Full Stack Edge AI
Hi Maria,
The 96 board/main board files are not available to me. Could you send them to me, please?
Hi Franklin,
The files are only provided to a limited set of users. Please contact your purchase window to get access rights.
Hi Ethon,
I've tried to contact info@kneron.us, but without success so far. Do you know of another contact (email or Skype)?
Hi Franklin,
Did you purchase any Kneron devices, and where did you get them? Please contact your purchase window instead of 'info@kneron.us.' We will provide download access for the product you bought.
If you intend to purchase but have not bought the products yet, please contact our sales representative, Brian (brian.lin@kneron.us).
Hi Ethon,
At my company, we're testing the KL520 USB dongle with the intention of integrating it into our product. We purchased it from Mouser for testing, and now we're interested in designing our own board based on the KL520 chip for ongoing production. Because of this, we need the footprint and symbol for KiCad.
Would it be better to go through the purchase window, or should I reach out to Brian for assistance? Is Brian the purchase-window contact?
Hi Franklin,
Yes, you should reach out to Brian (brian.lin@kneron.us), our sales representative. Thank you for your interest!
You could also fill out the form on our website: Contact us | Kneron – Full Stack Edge AI
Hi Maria,
I've tried reaching out to Brian and through the Contact us | Kneron – Full Stack Edge AI form, but unfortunately I haven't received a response. Do you have another contact I could try? My issue is simple to resolve, but the lack of communication is delaying our production.
Thanks,
Hi Franklin,
I'm sorry for their lack of communication. Could you contact Jeffrey (jeffrey-yc.chen@kneron.us) instead and cc Brian just in case? Thank you!