This thread has been locked.

[Reference translation] SK-AM62A-LP: edgeai model maker error during compilation

Guru**** 1788580 points
Note: The content below is a machine translation of the English original and may contain errors. For the authoritative version, please refer to the English thread at the link below.

https://e2e.ti.com/support/processors-group/processors/f/processors-forum/1407430/sk-am62a-lp-edgeai-model-maker-error-during-compilation

Part Number: SK-AM62A-LP

Tool/software:

Hello,

I am trying to build a custom detection model for the AM62A device based on yolox_s_lite.
I am using edgeai-tensorlab on the main branch with the Docker option.

Training works fine, but I hit this error during the compilation step:

INFO:20240831-044036: infer  - od-8220 - this may take some time...Traceback (most recent call last):
  File "/opt/code/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 203, in _run_pipeline
    result = cls._run_pipeline_impl(basic_settings, pipeline_config, description)
  File "/opt/code/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 176, in _run_pipeline_impl
    accuracy_result = accuracy_pipeline(description)
  File "/opt/code/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 87, in __call__
    param_result = self._run(description=description)
  File "/opt/code/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 138, in _run
    output_list = self._infer_frames(description)
  File "/opt/code/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 192, in _infer_frames
    is_ok = session.start_infer()
  File "/opt/code/edgeai-benchmark/edgeai_benchmark/sessions/onnxrt_session.py", line 77, in start_infer
    super().start_infer()
  File "/opt/code/edgeai-benchmark/edgeai_benchmark/sessions/basert_session.py", line 171, in start_infer
    raise FileNotFoundError(error_message)
FileNotFoundError: ERROR:20240831-044036: artifacts_folder is missing, please run import (on pc) - /opt/code/edgeai-modelmaker/data/projects/resized_640_640_weather_night_aug_3k/run/20240829-150353/yolox_s_lite/compilation/AM62A/work/od-8220/artifacts
ERROR:20240831-044036: artifacts_folder is missing, please run import (on pc) - /opt/code/edgeai-modelmaker/data/projects/resized_640_640_weather_night_aug_3k/run/20240829-150353/yolox_s_lite/compilation/AM62A/work/od-8220/artifacts
TASKS                                                       | 100%|██████████|| 1/1 [00:01<00:00,  1.13s/it]


packaging artifacts to /opt/code/edgeai-modelmaker/data/projects/resized_640_640_weather_night_aug_3k/run/20240829-150353/yolox_s_lite/compilation/AM62A/pkg please wait...
WARNING:20240831-044036: could not package - /opt/code/edgeai-modelmaker/data/projects/resized_640_640_weather_night_aug_3k/run/20240829-150353/yolox_s_lite/compilation/AM62A/work/od-8220
Traceback (most recent call last):
  File "/opt/code/edgeai-modelmaker/./scripts/run_modelmaker.py", line 141, in <module>
    main(config)
  File "/opt/code/edgeai-modelmaker/./scripts/run_modelmaker.py", line 80, in main
    model_runner.run()
  File "/opt/code/edgeai-modelmaker/edgeai_modelmaker/ai_modules/vision/runner.py", line 187, in run
    self.model_compilation.run()
  File "/opt/code/edgeai-modelmaker/edgeai_modelmaker/ai_modules/vision/compilation/edgeai_benchmark.py", line 279, in run
    edgeai_benchmark.interfaces.package_artifacts(self.settings, self.work_dir, out_dir=self.package_dir, custom_model=True)
  File "/opt/code/edgeai-benchmark/edgeai_benchmark/interfaces/run_package.py", line 271, in package_artifacts
    with open(os.path.join(out_dir,'artifacts.yaml'), 'w') as fp:
FileNotFoundError: [Errno 2] No such file or directory: '/opt/code/edgeai-modelmaker/data/projects/resized_640_640_weather_night_aug_3k/run/20240829-150353/yolox_s_lite/compilation/AM62A/pkg/artifacts.yaml'

Can you help?

  • Hello, any ideas on how to resolve this issue?

  • Hi Mickael,

    We are happy to help. Apologies for the delay, as yesterday was a public holiday in the US.

    There is not much to conclude from the data provided. Please consider the following:

    If these suggestions do not resolve your issue, please include your config yaml file in your next reply.

    Regards,

    Qutaiba

  • Hello, thank you for your answer.

    Yes, I run this line in edgeai-modelmaker:

    ./run_modelmaker.sh AM62A config_detection.yaml

    Contents of config_detection.yaml:

    common:
        target_module: 'vision'
        task_type: 'detection'
        target_device: 'AM62A'
        # run_name can be any string, but there are some special cases:
        # {date-time} will be replaced with datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
        # {model_name} will be replaced with the name of the model
        run_name: '{date-time}/{model_name}'
    
    dataset:
        # enable/disable dataset loading
        enable: True #False
        # max_num_files: [750, 250] #None
    
        # Object Detection Dataset Examples:
        # -------------------------------------
        # Example 1, (known datasets): 'widerface_detection', 'pascal_voc0712', 'coco_detection', 'udacity_selfdriving', 'tomato_detection', 'tiscapes2017_driving'
        # dataset_name: widerface_detection
        # -------------------------------------
        # Example 2, give a dataset name and input_data_path.
        # input_data_path could be a path to zip file, tar file, folder OR http, https link to zip or tar files
        # for input_data_path these are provided with this repository as examples:
        #    'software-dl.ti.com/.../tiscapes2017_driving.zip'
        #    'software-dl.ti.com/.../animal_detection.zip'
        # -------------------------------------
        # Example 3, give image folders with annotation files (require list with values for both train and val splits)
        # dataset_name: coco_detection
        # input_data_path: ["./data/projects/coco_detection/dataset/train2017",
        #                        "./data/projects/coco_detection/dataset/val2017"]
        # input_annotation_path: ["./data/projects/coco_detection/dataset/annotations/instances_train2017.json",
        #                        "./data/projects/coco_detection/dataset/annotations/instances_val2017.json"]
        # -------------------------------------
        dataset_name: parking_ploum_aug_5k
        input_data_path: '../../datasets/parking_weather_night_aug_5k.zip'
    
    training:
        # enable/disable training
        enable: True #False
    
        # Object Detection model chosen can be changed here if needed
        # options are: 'yolox_s_lite', 'yolox_tiny_lite', 'yolox_nano_lite', 'yolox_pico_lite', 'yolox_femto_lite'
        model_name: 'yolox_s_lite'
    
        training_epochs: 30 #30
        batch_size: 8 #32
        learning_rate: 0.001
        num_gpus: 1 #1 #4
    
    compilation:
        # enable/disable compilation
        enable: True #False
        tensor_bits: 8 #16 #32

    I switched the edgeai-tensorlab branch to r9.2.

    Same error. Model training works fine (on the GPU version), but generating the model artifacts fails with this error...

  • Hi Mickael,

    The configuration file you are using looks fine. The error message indicates that there are no artifacts at "/opt/code/edgeai-modelmaker/data/projects/resized_640_640_weather_night_aug_3k/run/20240829-150353/yolox_s_lite/compilation/AM62A/work/od-8220/artifacts". I suspect there may be a directory-naming issue here. I am trying to reproduce the issue on my side. In the meantime, could you provide the directory tree under /opt/code/edgeai-modelmaker/data/projects/?
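
    For example, something like this can capture it (a rough sketch; if the `tree` utility is not installed in the container, the `find` fallback below gives the same information):

    # dump the project directory layout to a file you can paste or attach in a reply
    tree -L 6 /opt/code/edgeai-modelmaker/data/projects/ > projects_tree.txt \
      || find /opt/code/edgeai-modelmaker/data/projects/ -maxdepth 6 -print > projects_tree.txt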

    Regards,

    Qutaiba

  • Hi Mickael,

    Thanks for the additional data. From the log it looks like a library is missing: "cannot import name 'onnx_model_opt' from 'osrt_model_tools.onnx_tools'". A quick way to get these files is to clone edgeai-tidl-tools and run its setup script in the same virtual environment you are currently using. Link to edgeai-tidl-tools: https://github.com/TexasInstruments/edgeai-tidl-tools/tree/master. From there, follow the instructions in the README: https://github.com/TexasInstruments/edgeai-tidl-tools/tree/master?tab=readme-ov-file#setup-on-x86_pc
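
    Roughly like this (just a sketch - the exact setup script name and environment variables should be taken from the edgeai-tidl-tools README; SOC=am62a below is an assumption for your device):

    # inside the same virtual environment that modelmaker uses
    git clone https://github.com/TexasInstruments/edgeai-tidl-tools.git
    cd edgeai-tidl-tools
    export SOC=am62a   # device selection; confirm the variable name/value in the README
    source ./setup.sh  # installs the python tool packages, including osrt_model_tools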

    Once that is done, you can go back to modelmaker and start training/compilation as usual.

    If you are interested, there is another path: Edge AI Studio - Model Composer (https://dev.ti.com/modelcomposer) to train/compile models. This online tool provides roughly the same functionality as modelmaker. You can upload your dataset and start training online.

    Please let me know how it goes.

    Regards,

    Qutaiba

  • Hello,

    I followed your procedure with edgeai-tidl-tools, but the result is the same...

    Yes, the online solution would probably be simpler, but it is not compatible with SDK 9.2 or 10.

  • @Qutaiba Saleh:

    I have exactly the same problem as Mickael. However, a few weeks ago this issue did not exist. Could you provide an update of edgeai-modelmaker for r9.2 and r9.1?

  • edgeai-tensorlab/edgeai-benchmark depends on a Python package from edgeai-tidl-tools, and that package was being used without a tag/version.

    A tag has now been added here so that a specific version is used:

    https://github.com/TexasInstruments/edgeai-tensorlab/blob/main/edgeai-benchmark/requirements_pc.txt#L4

    The r8.6, r9.0, r9.1, r9.2 and main branches have been updated with the correct fix.

    Note: as of now, the main branch points to r9.2.

  • Hello, I updated requirements_pc.txt with your fix (using r9.2),
    activated the pyenv py310 environment,
    and launched

    ./setup_pc.sh

    but I get the same error...

    SUCCESS: ModelMaker - Training completed.
    
    INFO:20240910-125717: model import is in progress - please see the log file for status.
    configs to run: ['od-8220']
    number of configs: 1
    
    INFO:20240910-125717: parallel_run - parallel_processes:1 parallel_devices=[0]
    TASKS                                                       |          |     0% 0/1| [< ]
    INFO:20240910-125718: starting process on parallel_device - 0   0%|          || 0/1 [00:00<?, ?it/s]
    
    INFO:20240910-125718: starting - od-8220
    INFO:20240910-125718: model_path - /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/training/model.onnx
    INFO:20240910-125718: model_file - /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/compilation/AM62A/work/od-8220/model/model.onnx
    INFO:20240910-125718: quant_file - /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/compilation/AM62A/work/od-8220/model/model_qparams.prototxt
    Downloading 1/1: /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/training/model.onnx
    Download done for /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/training/model.onnx
    Downloading 1/1: /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/training/model.onnx
    Download done for /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/training/model.onnx
    Traceback (most recent call last):
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 203, in _run_pipeline
        result = cls._run_pipeline_impl(basic_settings, pipeline_config, description)
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 176, in _run_pipeline_impl
        accuracy_result = accuracy_pipeline(description)
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 68, in __call__
        self.session.start()
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/sessions/onnxrt_session.py", line 45, in start
        super().start()
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/sessions/basert_session.py", line 145, in start
        self.get_model()
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/sessions/basert_session.py", line 524, in get_model
        apply_input_optimization = self._optimize_model(model_file,
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/sessions/basert_session.py", line 569, in _optimize_model
        from osrt_model_tools.onnx_tools import onnx_model_opt as onnxopt
    ImportError: cannot import name 'onnx_model_opt' from 'osrt_model_tools.onnx_tools' (/home/ubuntu/.pyenv/versions/py310/lib/python3.10/site-packages/osrt_model_tools/onnx_tools/__init__.py)
    cannot import name 'onnx_model_opt' from 'osrt_model_tools.onnx_tools' (/home/ubuntu/.pyenv/versions/py310/lib/python3.10/site-packages/osrt_model_tools/onnx_tools/__init__.py)
    TASKS                                                       | 100%|██████████|| 1/1 [00:00<00:00,  4.42it/s]
    
    
    
    INFO:20240910-125718: model inference is in progress - please see the log file for status.
    configs to run: ['od-8220']
    number of configs: 1
    
    INFO:20240910-125718: parallel_run - parallel_processes:1 parallel_devices=[0]
    TASKS                                                       |          |     0% 0/1| [< ]
    INFO:20240910-125718: starting process on parallel_device - 0   0%|          || 0/1 [00:00<?, ?it/s]
    
    INFO:20240910-125718: starting - od-8220
    INFO:20240910-125718: model_path - /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/training/model.onnx
    INFO:20240910-125718: model_file - /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/compilation/AM62A/work/od-8220/model/model.onnx
    INFO:20240910-125718: quant_file - /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/compilation/AM62A/work/od-8220/model/model_qparams.prototxt
    
    INFO:20240910-125718: running - od-8220
    INFO:20240910-125718: pipeline_config - {'task_type': 'detection', 'dataset_category': 'coco', 'calibration_dataset': <edgeai_benchmark.datasets.modelmaker_datasets.ModelMakerDetectionDataset object at 0x7f93ca8e2bf0>, 'input_dataset': <edgeai_benchmark.datasets.modelmaker_datasets.ModelMakerDetectionDataset object at 0x7f93cb35f670>, 'preprocess': <edgeai_benchmark.preprocess.PreProcessTransforms object at 0x7f92c6317a00>, 'session': <edgeai_benchmark.sessions.onnxrt_session.ONNXRTSession object at 0x7f92c6317a60>, 'postprocess': <edgeai_benchmark.postprocess.PostProcessTransforms object at 0x7f92c6317d00>, 'metric': {'label_offset_pred': 1}, 'model_info': {'metric_reference': {'accuracy_ap[.5:.95]%': None}, 'model_shortlist': 10}}
    INFO:20240910-125718: infer  - od-8220 - this may take some time...Traceback (most recent call last):
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 203, in _run_pipeline
        result = cls._run_pipeline_impl(basic_settings, pipeline_config, description)
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 176, in _run_pipeline_impl
        accuracy_result = accuracy_pipeline(description)
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 87, in __call__
        param_result = self._run(description=description)
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 138, in _run
        output_list = self._infer_frames(description)
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 192, in _infer_frames
        is_ok = session.start_infer()
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/sessions/onnxrt_session.py", line 77, in start_infer
        super().start_infer()
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/sessions/basert_session.py", line 171, in start_infer
        raise FileNotFoundError(error_message)
    FileNotFoundError: ERROR:20240910-125718: artifacts_folder is missing, please run import (on pc) - /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/compilation/AM62A/work/od-8220/artifacts
    ERROR:20240910-125718: artifacts_folder is missing, please run import (on pc) - /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/compilation/AM62A/work/od-8220/artifacts
    TASKS                                                       | 100%|██████████|| 1/1 [00:00<00:00,  4.63it/s]
    
    
    packaging artifacts to /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/compilation/AM62A/pkg please wait...
    WARNING:20240910-125718: could not package - /home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/compilation/AM62A/work/od-8220
    Traceback (most recent call last):
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/./scripts/run_modelmaker.py", line 141, in <module>
        main(config)
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/./scripts/run_modelmaker.py", line 80, in main
        model_runner.run()
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/edgeai_modelmaker/ai_modules/vision/runner.py", line 187, in run
        self.model_compilation.run()
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/edgeai_modelmaker/ai_modules/vision/compilation/edgeai_benchmark.py", line 279, in run
        edgeai_benchmark.interfaces.package_artifacts(self.settings, self.work_dir, out_dir=self.package_dir, custom_model=True)
      File "/home/ubuntu/TI/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/interfaces/run_package.py", line 271, in package_artifacts
        with open(os.path.join(out_dir,'artifacts.yaml'), 'w') as fp:
    FileNotFoundError: [Errno 2] No such file or directory: '/home/ubuntu/TI/edgeai-tensorlab/edgeai-modelmaker/data/projects/parking_ploum_aug_5k/run/20240910-125317/yolox_s_lite/compilation/AM62A/pkg/artifacts.yaml'

  • I am not sure why it is not working for you.

    Could you create a completely fresh Python 3.10 environment and try the following line: https://github.com/TexasInstruments/edgeai-tensorlab/blob/r9.2/edgeai-benchmark/requirements_pc.txt#L4

    i.e.:

    pip install git+github.com/.../edgeai-tidl-tools.git@09_02_09_00

    After that, launch a Python console and type the exact line that shows the error in the output you pasted above:

    from osrt_model_tools.onnx_tools import onnx_model_opt as onnxopt
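
    Put together, the whole check could look like this (a sketch only - it assumes a plain `python -m venv` instead of pyenv, and the pip line should be copied verbatim from requirements_pc.txt if the form there differs):

    # brand-new Python 3.10 environment
    python3.10 -m venv ~/py310_fresh
    source ~/py310_fresh/bin/activate
    # pinned package from edgeai-tidl-tools (tag as referenced in requirements_pc.txt)
    pip install "git+https://github.com/TexasInstruments/edgeai-tidl-tools.git@09_02_09_00"
    # the exact import that fails in the log above
    python -c "from osrt_model_tools.onnx_tools import onnx_model_opt as onnxopt; print('import ok')"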

  • OK, here is the result:

    (py310_TI) ubuntu@t2-le-45-gra11:~$ python
    Python 3.10.14 (main, Sep  2 2024, 15:58:16) [GCC 11.4.0] on linux
    Type "help", "copyright", "credits" or "license" for more information.
    >>> from osrt_model_tools.onnx_tools import onnx_model_opt as onnxopt
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/ubuntu/.pyenv/versions/py310_TI/lib/python3.10/site-packages/osrt_model_tools/onnx_tools/onnx_model_opt.py", line 58, in <module>
        import onnx
    ModuleNotFoundError: No module named 'onnx'
    

  • Running setup_pc.sh should have installed all the dependencies. You can try running it again. (I checked it yesterday and it worked for me - after I pushed that update.)

    Alternatively, you can look at those dependencies manually and install them - see the setup file in the branch named r9.2:

    https://github.com/TexasInstruments/edgeai-tensorlab/blob/r9.2/edgeai-benchmark/setup_pc.sh#L93
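
    In other words, something along these lines (a sketch; the paths match the checkout shown in your logs, adjust as needed):

    # in the same venv as before
    cd ~/TI/edgeai-tensorlab/edgeai-benchmark
    ./setup_pc.sh                                    # re-installs the PC dependencies
    cd ../edgeai-modelmaker
    ./run_modelmaker.sh AM62A config_detection.yaml  # then retry training/compilation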

  • Hello, thank you!

    With a fresh Python venv and setup_pc.sh it now works!
    Can you tell me which branch I need to use if I want a custom model for SDK 10.00.00.08?

    Regards

  • Could you also tell me where I can change the compilation parameters, as shown in Model Composer?

  • These parameters are in this file: github.com/.../settings_base.yaml
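
    For example, a quick way to see the commonly tuned compilation knobs in that file (a sketch; the key names in the grep are the usual ones, but check the file itself):

    cd ~/TI/edgeai-tensorlab/edgeai-benchmark
    grep -nE 'tensor_bits|calibration_frames|calibration_iterations|detection_thr|detection_top_k' settings_base.yaml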

  • Thank you!

  • Hello,

    Is it possible to directly compile a model that has already been trained?
    I think I need to use this jupyter notebook:
    https://github.com/TexasInstruments/edgeai-tensorlab/blob/r9.2/edgeai-benchmark/tutorials/tutorial_detection.ipynb

    Can you tell me what I need to modify in order to compile a model stored in the edgeai-modelmaker folder?

  • Hello Michael,

    Since you are using edgeai-modelmaker, it handles both training and compilation, so you do not need to compile separately.

    (But if you are not using edgeai-modelmaker and you have already trained/exported an onnx model, you can use edgeai-benchmark to compile the model.)

  • Yes, I understand. But what if I want to recompile the model for SDK 10? Or change some parameters, such as the detection top_k...

  • detection_top_k can be changed in edgeai-benchmark and will take effect in modelmaker:

    https://github.com/TexasInstruments/edgeai-tensorlab/blob/main/edgeai-benchmark/settings_base.yaml#L97

    If you want to compile separately, you can look at the provided examples:

    https://github.com/TexasInstruments/edgeai-tensorlab/blob/r9.2/edgeai-benchmark/tutorials/tutorial_detection.ipynb

    And the configs here:

    https://github.com/TexasInstruments/edgeai-tensorlab/tree/main/edgeai-benchmark/configs

    Note that every model is different and there is no common rule that applies to all of them. If you are adventurous, you can study these and do the compilation yourself - but it will take time to understand.

  • There is another e2e thread that discusses model compilation: e2e.ti.com/.../sk-am62a-lp-edgeai-model-maker-custom-model-compilation