Hi, could you zip up the entire inference_model directory that produced the error and upload it here, so we can reproduce the issue?
Sure, here is the compressed archive: yolov8n_paddle_model.zip
After upgrading Paddle Lite from 2.12 to 2.13rc0, the conversion now succeeds. Root cause: Paddle Lite version. Fix: pip install paddlelite==2.13rc0
Good to hear. I also verified with the opt tool built from the latest develop branch, and the model converts correctly. Since the problem is resolved, this issue can be closed.
Version / environment info:
1) Paddle Lite version: 2.12
2) Host environment: macOS 13.0.1
Model info:
1) yolov8n.pt
2) https://github.com/ultralytics/assets/releases/download/v8.2.0/yolov8n.pt
Steps to reproduce:
1) Export the YOLO model to Paddle format:
yolo export model=./yolov8n.pt format=paddle
```python
import paddlelite.lite as lite

opt = lite.Opt()
opt.set_model_file("./inference_model/model.pdmodel")
opt.set_param_file("./inference_model/model.pdiparams")
opt.set_optimize_out("./yolov8_output_inference/yolov8_opt_x86_int16_new")
opt.set_model_type("naive_buffer")
opt.set_valid_places("x86")
opt.set_quant_model(True)
opt.enable_fp16()
opt.set_quant_type("QUANT_INT16")
opt.run()
```
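For anyone reproducing this, roughly the same conversion can also be attempted through the paddle_lite_opt command-line tool installed alongside the Python package. This is a hedged sketch, not a verified invocation: the flag names below follow the Paddle Lite opt tool's documented options, and the paths are taken from the script above.

```shell
# Rough CLI equivalent of the Python Opt() script above
# (assumption: paddle_lite_opt is on PATH after `pip install paddlelite`)
paddle_lite_opt \
  --model_file=./inference_model/model.pdmodel \
  --param_file=./inference_model/model.pdiparams \
  --optimize_out=./yolov8_output_inference/yolov8_opt_x86_int16_new \
  --optimize_out_type=naive_buffer \
  --valid_targets=x86 \
  --quant_model=true \
  --quant_type=QUANT_INT16
```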
Error: This model is not supported, because 1 ops are not supported on 'x86'. These unsupported ops are: 'silu'.
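For background on the failing op: silu is the activation used throughout YOLOv8, defined as silu(x) = x · sigmoid(x). Paddle Lite 2.12's x86 target evidently has no kernel for it, which is presumably why upgrading to 2.13rc0 (see below) resolves the conversion. A minimal NumPy sketch of what the op computes, showing that it decomposes into two ops (sigmoid and elementwise multiply) that backends commonly do support:

```python
import numpy as np

def sigmoid(x):
    # plain logistic sigmoid; fine for illustration
    return 1.0 / (1.0 + np.exp(-x))

def silu(x):
    # SiLU (a.k.a. swish): x * sigmoid(x), expressed as a
    # sigmoid followed by an elementwise multiply
    return x * sigmoid(x)

x = np.array([-2.0, 0.0, 2.0])
print(silu(x))  # silu(0) == 0; silu(x) approaches x for large positive x
```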