I have tried these two models:
https://github.com/TexasInstruments/edgeai-yolov5/tree/master/pretrained_models/models/yolov5s6_640_ti_lite/weights/yolov5s6_640_ti_lite_37p4_56p0.onnx
https://github.com/TexasInstruments/edgeai-yolov5/tree/master/pretrained_models/models/yolov5s6_640_ti_lite/weights/best.pt
I converted best.pt to best.onnx using the command below:
python export.py --weights pretrained_models/yolov5s6_640_ti_lite/weights/best.pt --img 640 --batch 1 --simplify --export-nms --opset 11
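Before feeding the exported file to the TIDL import tool, the model can be sanity-checked with a short script like the one below. This is only a rough sketch: it assumes the onnx Python package is installed and that export.py wrote the model to the path shown.

import onnx

# Assumed output path; adjust to wherever export.py actually wrote the model.
model_path = "pretrained_models/yolov5s6_640_ti_lite/weights/best.onnx"

model = onnx.load(model_path)
onnx.checker.check_model(model)  # raises an exception if the graph is structurally invalid

# The graph inputs and outputs are what the TIDL import config and the
# prototxt ultimately have to agree with.
print("Inputs :", [i.name for i in model.graph.input])
print("Outputs:", [o.name for o in model.graph.output])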
When I used ./out/tidl_model_import.out to compile yolov5s6_640_ti_lite_37p4_56p0.onnx and best.onnx into .bin files, I found that yolov5s6_640_ti_lite_37p4_56p0.onnx compiled successfully while best.onnx failed. The picture below shows the error log.
My question is: if I want to train on my own dataset and run the model on TDA4, I need a .pt file that can be exported to ONNX, and that ONNX file must compile to a .bin file successfully. Where is such a .pt file located?
Hi, we have received your problem and escalated it to E2E; please expect a response. Thanks.
Hello,
The prototxt needs to be modified to match the ONNX model. If you can share the ONNX model that you have generated, we can then help with the prototxt.
The .pt file is shared here:
You can go through this video from 24:00 onwards to get an idea of how to define the prototxt.
You can pull the latest code from https://github.com/TexasInstruments/edgeai-yolov5. When you export an ONNX model with it, the corresponding prototxt file is exported as well. This will probably be helpful.
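For example, once you have pulled the latest code and re-run the export command, you can quickly confirm that both artifacts were written next to the weights. This is only a rough sketch; the exact file names below are assumptions based on the weight file name, so adjust the paths as needed.

from pathlib import Path

# Assumed location and names: export.py typically writes the ONNX model and
# the prototxt next to the .pt weights, reusing the same base name.
weights_dir = Path("pretrained_models/yolov5s6_640_ti_lite/weights")
onnx_file = weights_dir / "best.onnx"
prototxt_file = weights_dir / "best.prototxt"

for f in (onnx_file, prototxt_file):
    print(f, "exists" if f.exists() else "MISSING")

# The prototxt is plain text, so you can review it before passing it to the
# TIDL import tool.
if prototxt_file.exists():
    print(prototxt_file.read_text())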
Thanks,
Cherry
Thank you for your reply. This method is very effective. The new export.py can export the prototxt file that is needed to compile the ONNX model file. Thank you very much.
Hi,
Nice to hear it works! Thanks, and if you run into any other problems, feel free to ask us for help at any time!
Thanks and Best Regards,
Cherry