KL520 toolchain on Docker
So far I have set up the Docker environment and run `docker pull kneron/toolchain:latest`.
# Login to the docker and mount host machine /mnt/docker into the docker at /docker_mount.
docker run --rm -it -v /mnt/docker:/docker_mount kneron/toolchain:latest
I already have a trained TFLite model. What are the concrete steps to convert the TFLite model to ONNX format and then to a .nef file? When I ran onnx_tflite.py, I got a "tensorflow not found" error. Is TensorFlow not installed by default in the Docker image? If not, which version does the toolchain support?
The discussion has been closed due to inactivity. To continue with the topic, please feel free to post a new discussion.
Comments
@陳柏錩
Hi 柏錩,
You can try entering the base environment by typing the following command inside the Docker container:
conda activate base
Then try your step again after entering the base environment, as shown in the image below.
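For example (a minimal sketch; whether TensorFlow is importable, and which version, depends on what the base environment of your toolchain image ships):

```shell
# Inside the kneron/toolchain container: switch to the base conda
# environment, which has the converter's dependencies installed.
conda activate base

# Optional sanity check that TensorFlow is now importable:
python -c "import tensorflow as tf; print(tf.__version__)"
```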
You can refer to the following links:
- [3.1.4. TF Lite to ONNX](https://doc.kneron.com/docs/#toolchain/manual_3_onnx/)
- [5. TF Lite to ONNX](https://doc.kneron.com/docs/#toolchain/appendix/converters/)
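As a concrete sketch of that manual section (the `ktc.onnx_optimizer` function names follow the toolchain manual linked above; the paths are examples, and this only runs inside the kneron/toolchain container's base environment):

```python
# Sketch: TFLite -> ONNX with the toolchain's Python API, per the
# "TF Lite to ONNX" sections linked above. Paths are examples; run
# this inside the kneron/toolchain container (conda base environment).
import ktc
import onnx

# Convert the TFLite model to an ONNX model.
onnx_model = ktc.onnx_optimizer.tflite2onnx_flow("/docker_mount/model.tflite")

# Run the general ONNX optimization pass the manual recommends afterwards.
onnx_model = ktc.onnx_optimizer.onnx2onnx_flow(onnx_model)

onnx.save(onnx_model, "/docker_mount/model.onnx")
```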
By the way, if your model is a TensorFlow model rather than a TFLite model, note that currently only TensorFlow 1.x is supported.
I can't view those pages at the moment.
Hi,
Here are the links again:
- [3.1.4. TF Lite to ONNX] 3. Floating-Point Model Preparation - Document Center (kneron.com)
- [5. TF Lite to ONNX] ONNX Converter - Document Center (kneron.com)
After successfully converting to ONNX format, how do I then convert it to .nef?
Please follow the workflow described in the documentation to run quantization, compilation, and the remaining steps:
https://doc.kneron.com/docs/#toolchain/manual_1_overview/#12-workflow-overview
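For orientation, the overall flow in that overview looks roughly like the sketch below (the `ktc` calls follow the workflow overview linked above; the model ID/version, paths, the input name "images", and the random calibration data are all placeholders, and this only runs inside the toolchain container):

```python
# Sketch of the KL520 flow from the workflow overview:
# evaluate -> quantize (analysis) -> compile to .nef.
# Placeholders: model ID/version, paths, the "images" input name,
# and the random calibration data (use real preprocessed images).
import ktc
import numpy as np

km = ktc.ModelConfig(32769, "0001", "520",
                     onnx_path="/docker_mount/model.onnx")

# IP evaluation report for the floating-point model.
eval_result = km.evaluate()

# Fixed-point analysis (quantization) over a few preprocessed
# calibration inputs keyed by the model's input name; this is the
# step that produces the report under /data1/kneron_flow/.
calib_images = [np.random.rand(1, 224, 224, 3).astype(np.float32)]
bie_path = km.analysis({"images": calib_images})

# Batch compile to a .nef file.
nef_path = ktc.compile([km])
print(nef_path)
```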
I got to the `w3m /data1/kneron_flow/model_fx_report.html` step, but model_fx_report.html did not appear.
I get an error when I run this step.
@陳柏錩
Hi 柏錩,
From your error message, something went wrong in the E2E (end-to-end) stage.
When you call ktc.kneron_inference, please check whether the input_names of the ONNX model you converted is "images".
We suggest referring to the following links:
https://doc.kneron.com/docs/#toolchain/appendix/app_flow_manual/ (Python API Inference, Necessary items)
https://doc.kneron.com/docs/#toolchain/manual_1_overview/ (1.2. Workflow Overview)
I have successfully generated the .nef file. What do I need to do next to run inference on the KL520?
Hello,
If you are using the KL520 USB dongle, you can refer to the documentation on our website and build your application with our development framework, Kneron PLUS.
After setting up the PLUS build environment, you can follow the documentation and run inference through the generic_inference API.
If you do not have a C development environment, you can also use the Python environment instead.
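As a rough sketch of the Python route (directory layout and example script names vary between Kneron PLUS releases, so treat the paths and the script name below as assumptions and check the generic inference example that ships with your PLUS version):

```shell
# Sketch: run a KL520 generic inference example from the Kneron PLUS
# Python package with your own .nef. The paths and the example script
# name are assumptions -- check the examples in your PLUS release.
cd kneron_plus/python/example
pip install -r ../requirements.txt
python KL520DemoGenericInference.py   # point it at your .nef model
```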