
TDA4VM: SIGBUS, SIGSEGV when using python edgeai-tidl-tools with RTOS SDK 8.6


https://e2e.ti.com/support/processors-group/processors/f/processors-forum/1208475/tda4vm-sigbus-sigsev-when-using-python-edgeai-tidl-tools-with-rtos-sdk-8-6

Part Number: TDA4VM

Hello,

I am using edgeai-tidl-tools to compile TIDL artifacts for the TDA4 from an ONNX file. Since migrating to RTOS SDK 8.6 (and edgeai 8.6), SIGBUS and SIGSEGV are thrown when I try to compile.

Here is a shortened version of my script:

import os

import onnxruntime as rt
import onnx

# ...

onnx.shape_inference.infer_shapes_path(onnx_model_path, onnx_model_path)
compile_options = {
    'tidl_tools_path': os.environ['TIDL_TOOLS_PATH'],
    'artifacts_folder': artifacts_folder,
    'tensor_bits': 16,
    'accuracy_level': 0,
    'advanced_options:calibration_frames': 1,
    'advanced_options:calibration_iterations': 1,
    'advanced_options:quantization_scale_type': 1,
    'debug_level': 1,
    "platform": "J7"
}
delegate_options = {}
delegate_options.update(compile_options)

so = rt.SessionOptions()
EP_list = ['TIDLCompilationProvider', 'CPUExecutionProvider']
sess = rt.InferenceSession(onnx_model_path, providers=EP_list,
                           provider_options=[delegate_options, {}],
                           sess_options=so)

# ...

output = sess.run(None, {...})

Specifically, when I call sess.run(...), the python script stops with "Bus error (core dumped)". Here is the backtrace when running the script with gdb python3:

Thread 1 "python3" received signal SIGBUS, Bus error.
0x00007ffff7b70ded in ?? () from /lib/x86_64-linux-gnu/libc.so.6
(gdb) backtrace
#0 0x00007ffff7b70ded in ?? () from /lib/x86_64-linux-gnu/libc.so.6
#1 0x00007fff40f09f06 in vxCreateUserDataObject () from /devel/edgeai-tidl-tools/tidl_tools/libvx_tidl_rt.so
#2 0x00007fff40f0214c in TIDLRT_create () from /devel/edgeai-tidl-tools/tidl_tools/libvx_tidl_rt.so
#3 0x00007fff4cc65cab in TIDL_subgraphRtCreate () from /devel/edgeai-tidl-tools/tidl_tools/tidl_model_import_onnx.so
#4 0x00007fff4cc43dc6 in TIDL_computeImportFunc () from /devel/edgeai-tidl-tools/tidl_tools/tidl_model_import_onnx.so

If I comment out sess.run(...), the script stops with "Segmentation fault (core dumped)". Here is its backtrace:

Thread 1 "python3" received signal SIGSEGV, Segmentation fault.
0x00007fff40f1707c in ownReleaseReferenceInt () from /devel/edgeai-tidl-tools/tidl_tools/libvx_tidl_rt.so
(gdb) backtrace
#0 0x00007fff40f1707c in ownReleaseReferenceInt () from /devel/edgeai-tidl-tools/tidl_tools/libvx_tidl_rt.so
#1 0x00007fff40f02ba7 in TIDLRT_delete () from /devel/edgeai-tidl-tools/tidl_tools/libvx_tidl_rt.so
#2 0x00007fff4cc660bf in TIDL_subgraphRtDelete () from /devel/edgeai-tidl-tools/tidl_tools/tidl_model_import_onnx.so

I have tested removing 'TIDLCompilationProvider' and keeping only 'CPUExecutionProvider', and with that I can compile my artifacts and use them on my TDA4. I think this shows that there is nothing wrong with the ONNX model or my script.
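
For reference, a minimal sketch of the CPU-only variant used for that test, reusing the variables from the script above:

# Same session setup as above, but without the TIDL compilation provider.
# This path runs to completion without SIGBUS/SIGSEGV.
EP_list = ['CPUExecutionProvider']
sess = rt.InferenceSession(onnx_model_path, providers=EP_list,
                           sess_options=so)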

Is this a known issue? Does it look like a bad environment on my side, or something like that?
Here are my environment variables after running edgeai-tidl-tools/setup.sh:
# echo $LD_LIBRARY_PATH
:/devel/edgeai-tidl-tools/tidl_tools:/devel/edgeai-tidl-tools/tidl_tools/osrt_deps
# echo $ARM64_GCC_PATH
/devel/edgeai-tidl-tools/gcc-arm-9.2-2019.12-x86_64-aarch64-none-linux-gnu
# echo $TIDL_TOOLS_PATH
/devel/edgeai-tidl-tools/tidl_tools
# echo $SOC
am68pa
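
For completeness, a minimal sketch of how these variables can be sanity-checked from Python before creating the session (the check itself is only illustrative):

# Print the environment the compile script relies on; the TIDL delegate
# loads libvx_tidl_rt.so and the import libraries from TIDL_TOOLS_PATH.
import os

for var in ("TIDL_TOOLS_PATH", "ARM64_GCC_PATH", "SOC", "LD_LIBRARY_PATH"):
    print(var, "=", os.environ.get(var, "<unset>"))

assert os.path.isdir(os.environ["TIDL_TOOLS_PATH"])
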
Thanks,
Fred
  • Hi Fred,

    Apologies for the delayed reply. Could you confirm the following about the setup you are using:

    • Does the error appear only on SDK 8.6 with this model, i.e. does no error show up when the same model is used with a previous SDK such as 8.4?
    • Were the model and artifacts recompiled with the 8.6 edgeai-tidl-tools, or copied over as-is from a previous SDK? I ran a quick experiment in which I copied a model + artifacts from the 8.4 SDK into the 8.6 SDK and that produced errors, so models compiled with the 8.4 SDK do not appear to be compatible with 8.6.

    Regards,

    Takuma.

  • Hi Takuma, thanks for your reply.

    The input ONNX file is a quantized model output from the external TIDL torch tools.

    That is, using the same model with SDK 8.4 works, while using it with 8.6 produces the crashes.

    The model and artifacts were not copied from 8.4 to 8.6; I am trying to generate those artifacts.

  • Hi Takuma,

    These errors seem to have come from a bad Docker environment on my side; I don't get them anymore.

    However, if I set 'debug_level': 3, I get a SEGFAULT. Can you reproduce the issue on your side?
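
    For clarity, the only change relative to the compile options in my first post is the debug level; a minimal sketch reusing the same option names:

    # Identical to the compile_options from the original script, except debug_level.
    compile_options = {
        'tidl_tools_path': os.environ['TIDL_TOOLS_PATH'],
        'artifacts_folder': artifacts_folder,
        'tensor_bits': 16,
        'accuracy_level': 0,
        'advanced_options:calibration_frames': 1,
        'advanced_options:calibration_iterations': 1,
        'advanced_options:quantization_scale_type': 1,
        'debug_level': 3,  # was 1; 3 is what triggers the SEGFAULT here
        "platform": "J7"
    }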

  • Hi Fred,

    Unfortunately, I was not able to reproduce the error with debug_level set to 3. The experiment I did was to set up a Docker container with the 8.6 release tag of edgeai-tidl-tools and run the example with "python3 onnxrt_ep.py -c", with debug_level set to 3 in the compilation options.

    To clarify, does the SEGFAULT happen when running compilation on a PC inside the Docker container, and would you be able to share the logs leading up to the error (or are they the same as the ones shared in the initial post)?

    Regards,

    Takuma.

  • Hi Takuma,

    I managed to reproduce the issue in Docker with edgeai-tidl-tools 8.6, using the example script you mentioned.

    You need to apply the following patch to the script:

    diff --git a/examples/osrt_python/ort/onnxrt_ep.py b/examples/osrt_python/ort/onnxrt_ep.py
    index 03eb9f1..9ad065d 100644
    --- a/examples/osrt_python/ort/onnxrt_ep.py
    +++ b/examples/osrt_python/ort/onnxrt_ep.py
    @@ -20,7 +20,8 @@ from model_configs import *
     
     required_options = {
     "tidl_tools_path":tidl_tools_path,
    -"artifacts_folder":artifacts_folder
    +"artifacts_folder":artifacts_folder,
    +"debug_level":3
     }
     
     parser = argparse.ArgumentParser()
    @@ -144,8 +145,8 @@ def run_model(model, mIdx):
             test_images = seg_test_images
         
         delegate_options = {}
    -    delegate_options.update(required_options)
         delegate_options.update(optional_options)   
    +    delegate_options.update(required_options)
     
         # stripping off the ss-ort- from model namne
         delegate_options['artifacts_folder'] = delegate_options['artifacts_folder'] + '/' + model + '/' #+ 'tempDir/' 
    

    delegate_options.update(optional_options) needs to be called before delegate_options.update(required_options); otherwise debug_level gets overwritten back to 0.
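
    A minimal sketch of why the ordering matters (illustrative values; for duplicate keys, the later update() call wins):

    required_options = {"artifacts_folder": "artifacts", "debug_level": 3}
    optional_options = {"debug_level": 0, "tensor_bits": 8}

    # Original ordering: optional_options is applied last and resets debug_level.
    delegate_options = {}
    delegate_options.update(required_options)
    delegate_options.update(optional_options)
    print(delegate_options["debug_level"])  # -> 0

    # Patched ordering: required_options is applied last, so debug_level survives.
    delegate_options = {}
    delegate_options.update(optional_options)
    delegate_options.update(required_options)
    print(delegate_options["debug_level"])  # -> 3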

    There is nothing very useful in the logs, but here they are. I cannot paste the full log because it is too long, but the run on deeplabv3lite_mobilenetv2.onnx is the one that seems to fail.

    Running_Model :  ss-ort-deeplabv3lite_mobilenetv2  
    
    
    Running shape inference on model ../../../models/public/deeplabv3lite_mobilenetv2.onnx 
    
    tidl_tools_path                                 = /opt/edgeai-tidl-tools/tidl_tools 
    artifacts_folder                                = ../../../model-artifacts//ss-ort-deeplabv3lite_mobilenetv2/ 
    tidl_tensor_bits                                = 8 
    debug_level                                     = 3 
    num_tidl_subgraphs                              = 16 
    tidl_denylist                                   = 
    tidl_denylist_layer_name                        = 
    tidl_denylist_layer_type                         = 
    tidl_allowlist_layer_name                        = 
    model_type                                      =  
    tidl_calibration_accuracy_level                 = 7 
    tidl_calibration_options:num_frames_calibration = 2 
    tidl_calibration_options:bias_calibration_iterations = 5 
    mixed_precision_factor = -1.000000 
    model_group_id = 0 
    power_of_2_quantization                         = 2 
    enable_high_resolution_optimization             = 0 
    pre_batchnorm_fold                              = 1 
    add_data_convert_ops                          = 3 
    output_feature_16bit_names_list                 =  
    m_params_16bit_names_list                       =  
    reserved_compile_constraints_flag               = 1601 
    ti_internal_reserved_1                          = 
    
     ****** WARNING : Network not identified as Object Detection network : (1) Ignore if network is not Object Detection network (2) If network is Object Detection network, please specify "model_type":"OD" as part of OSRT compilation options******
    
    Supported TIDL layer type ---            Cast --  
    Supported TIDL layer type ---             Add --  
    Supported TIDL layer type ---             Mul --  
    Supported TIDL layer type ---            Conv -- encoder.features.0.0 
    Supported TIDL layer type ---            Relu -- 369 
    Supported TIDL layer type ---            Conv -- encoder.features.1.conv.0.0 
    Supported TIDL layer type ---            Relu -- 372 
    Supported TIDL layer type ---            Conv -- encoder.features.1.conv.1 
    Supported TIDL layer type ---            Conv -- encoder.features.2.conv.0.0 
    Supported TIDL layer type ---            Relu -- 377 
    Supported TIDL layer type ---            Conv -- encoder.features.2.conv.1.0 
    Supported TIDL layer type ---            Relu -- 380 
    Supported TIDL layer type ---            Conv -- encoder.features.2.conv.2 
    Supported TIDL layer type ---            Conv -- encoder.features.3.conv.0.0 
    Supported TIDL layer type ---            Relu -- 385 
    Supported TIDL layer type ---            Conv -- encoder.features.3.conv.1.0 
    Supported TIDL layer type ---            Relu -- 388 
    Supported TIDL layer type ---            Conv -- encoder.features.3.conv.2 
    Supported TIDL layer type ---             Add -- 391 
    Supported TIDL layer type ---            Conv -- encoder.features.4.conv.0.0 
    Supported TIDL layer type ---            Relu -- 394 
    Supported TIDL layer type ---            Conv -- encoder.features.4.conv.1.0 
    Supported TIDL layer type ---            Relu -- 397 
    Supported TIDL layer type ---            Conv -- encoder.features.4.conv.2 
    Supported TIDL layer type ---            Conv -- encoder.features.5.conv.0.0 
    Supported TIDL layer type ---            Relu -- 402 
    Supported TIDL layer type ---            Conv -- encoder.features.5.conv.1.0 
    Supported TIDL layer type ---            Relu -- 405 
    Supported TIDL layer type ---            Conv -- encoder.features.5.conv.2 
    Supported TIDL layer type ---             Add -- 408 
    Supported TIDL layer type ---            Conv -- encoder.features.6.conv.0.0 
    Supported TIDL layer type ---            Relu -- 411 
    Supported TIDL layer type ---            Conv -- encoder.features.6.conv.1.0 
    Supported TIDL layer type ---            Relu -- 414 
    Supported TIDL layer type ---            Conv -- encoder.features.6.conv.2 
    Supported TIDL layer type ---             Add -- 417 
    Supported TIDL layer type ---            Conv -- encoder.features.7.conv.0.0 
    Supported TIDL layer type ---            Relu -- 420 
    Supported TIDL layer type ---            Conv -- encoder.features.7.conv.1.0 
    Supported TIDL layer type ---            Relu -- 423 
    Supported TIDL layer type ---            Conv -- encoder.features.7.conv.2 
    Supported TIDL layer type ---            Conv -- encoder.features.8.conv.0.0 
    Supported TIDL layer type ---            Relu -- 428 
    Supported TIDL layer type ---            Conv -- encoder.features.8.conv.1.0 
    Supported TIDL layer type ---            Relu -- 431 
    Supported TIDL layer type ---            Conv -- encoder.features.8.conv.2 
    Supported TIDL layer type ---             Add -- 434 
    Supported TIDL layer type ---            Conv -- encoder.features.9.conv.0.0 
    Supported TIDL layer type ---            Relu -- 437 
    Supported TIDL layer type ---            Conv -- encoder.features.9.conv.1.0 
    Supported TIDL layer type ---            Relu -- 440 
    Supported TIDL layer type ---            Conv -- encoder.features.9.conv.2 
    Supported TIDL layer type ---             Add -- 443 
    Supported TIDL layer type ---            Conv -- encoder.features.10.conv.0.0 
    Supported TIDL layer type ---            Relu -- 446 
    Supported TIDL layer type ---            Conv -- encoder.features.10.conv.1.0 
    Supported TIDL layer type ---            Relu -- 449 
    Supported TIDL layer type ---            Conv -- encoder.features.10.conv.2 
    Supported TIDL layer type ---             Add -- 452 
    Supported TIDL layer type ---            Conv -- encoder.features.11.conv.0.0 
    Supported TIDL layer type ---            Relu -- 455 
    Supported TIDL layer type ---            Conv -- encoder.features.11.conv.1.0 
    Supported TIDL layer type ---            Relu -- 458 
    Supported TIDL layer type ---            Conv -- encoder.features.11.conv.2 
    Supported TIDL layer type ---            Conv -- encoder.features.12.conv.0.0 
    Supported TIDL layer type ---            Relu -- 463 
    Supported TIDL layer type ---            Conv -- encoder.features.12.conv.1.0 
    Supported TIDL layer type ---            Relu -- 466 
    Supported TIDL layer type ---            Conv -- encoder.features.12.conv.2 
    Supported TIDL layer type ---             Add -- 469 
    Supported TIDL layer type ---            Conv -- encoder.features.13.conv.0.0 
    Supported TIDL layer type ---            Relu -- 472 
    Supported TIDL layer type ---            Conv -- encoder.features.13.conv.1.0 
    Supported TIDL layer type ---            Relu -- 475 
    Supported TIDL layer type ---            Conv -- encoder.features.13.conv.2 
    Supported TIDL layer type ---             Add -- 478 
    Supported TIDL layer type ---            Conv -- encoder.features.14.conv.0.0 
    Supported TIDL layer type ---            Relu -- 481 
    Supported TIDL layer type ---            Conv -- encoder.features.14.conv.1.0 
    Supported TIDL layer type ---            Relu -- 484 
    Supported TIDL layer type ---            Conv -- encoder.features.14.conv.2 
    Supported TIDL layer type ---            Conv -- encoder.features.15.conv.0.0 
    Supported TIDL layer type ---            Relu -- 489 
    Supported TIDL layer type ---            Conv -- encoder.features.15.conv.1.0 
    Supported TIDL layer type ---            Relu -- 492 
    Supported TIDL layer type ---            Conv -- encoder.features.15.conv.2 
    Supported TIDL layer type ---             Add -- 495 
    Supported TIDL layer type ---            Conv -- encoder.features.16.conv.0.0 
    Supported TIDL layer type ---            Relu -- 498 
    Supported TIDL layer type ---            Conv -- encoder.features.16.conv.1.0 
    Supported TIDL layer type ---            Relu -- 501 
    Supported TIDL layer type ---            Conv -- encoder.features.16.conv.2 
    Supported TIDL layer type ---             Add -- 504 
    Supported TIDL layer type ---            Conv -- encoder.features.17.conv.0.0 
    Supported TIDL layer type ---            Relu -- 507 
    Supported TIDL layer type ---            Conv -- encoder.features.17.conv.1.0 
    Supported TIDL layer type ---            Relu -- 510 
    Supported TIDL layer type ---            Conv -- encoder.features.17.conv.2 
    Supported TIDL layer type ---            Conv -- decoders.0.aspp.aspp_bra3.0.0 
    Supported TIDL layer type ---            Relu -- 533 
    Supported TIDL layer type ---            Conv -- decoders.0.aspp.aspp_bra3.1.0 
    Supported TIDL layer type ---            Relu -- 536 
    Supported TIDL layer type ---            Conv -- decoders.0.aspp.aspp_bra2.0.0 
    Supported TIDL layer type ---            Relu -- 527 
    Supported TIDL layer type ---            Conv -- decoders.0.aspp.aspp_bra2.1.0 
    Supported TIDL layer type ---            Relu -- 530 
    Supported TIDL layer type ---            Conv -- decoders.0.aspp.aspp_bra1.0.0 
    Supported TIDL layer type ---            Relu -- 521 
    Supported TIDL layer type ---            Conv -- decoders.0.aspp.aspp_bra1.1.0 
    Supported TIDL layer type ---            Relu -- 524 
    Supported TIDL layer type ---            Conv -- decoders.0.aspp.conv1x1.0 
    Supported TIDL layer type ---            Relu -- 518 
    Supported TIDL layer type ---          Concat -- 537 
    Supported TIDL layer type ---            Conv -- decoders.0.aspp.aspp_out.0 
    Supported TIDL layer type ---            Relu -- 540 
    Supported TIDL layer type ---          Resize -- 571 
    Supported TIDL layer type ---            Conv -- decoders.0.shortcut.0 
    Supported TIDL layer type ---            Relu -- 515 
    Supported TIDL layer type ---          Concat -- 516 
    Supported TIDL layer type ---            Conv -- decoders.0.pred.0.0 
    Supported TIDL layer type ---            Conv -- decoders.0.pred.1.0 
    Supported TIDL layer type ---          Resize -- 576 
    Supported TIDL layer type ---          ArgMax -- 565 
    Supported TIDL layer type ---            Cast --  
    
    Preliminary subgraphs created = 1 
    Final number of subgraphs created are : 1, - Offloaded Nodes - 124, Total Nodes - 124 
    INFORMATION -- [TIDL_ResizeLayer]  Any resize ratio which is power of 2 and greater than 4 will be placed by combination of 4x4 resize layer and 2x2 resize layer. For example a 8x8 resize will be replaced by 4x4 resize followed by 2x2 resize.  
    INFORMATION -- [TIDL_ResizeLayer]  Any resize ratio which is power of 2 and greater than 4 will be placed by combination of 4x4 resize layer and 2x2 resize layer. For example a 8x8 resize will be replaced by 4x4 resize followed by 2x2 resize.  
    Running runtimes graphviz - /opt/edgeai-tidl-tools/tidl_tools/tidl_graphVisualiser_runtimes.out ../../../model-artifacts//ss-ort-deeplabv3lite_mobilenetv2//allowedNode.txt ../../../model-artifacts//ss-ort-deeplabv3lite_mobilenetv2//tempDir/graphvizInfo.txt ../../../model-artifacts//ss-ort-deeplabv3lite_mobilenetv2//tempDir/runtimes_visualization.svg 
    *** In TIDL_createStateImportFunc *** 
    Compute on node : TIDLExecutionProvider_TIDL_0_0
      0,            Cast, 1, 1, input.1Net_IN, TIDL_cast_in
      1,             Add, 2, 1, TIDL_cast_in, TIDL_Scale_In
      2,             Mul, 2, 1, TIDL_Scale_In, input.1
      3,            Conv, 3, 1, input.1, 369
      4,            Relu, 1, 1, 369, 370
      5,            Conv, 3, 1, 370, 372
      6,            Relu, 1, 1, 372, 373
      7,            Conv, 3, 1, 373, 375
      8,            Conv, 3, 1, 375, 377
      9,            Relu, 1, 1, 377, 378
     10,            Conv, 3, 1, 378, 380
     11,            Relu, 1, 1, 380, 381
     12,            Conv, 3, 1, 381, 383
     13,            Conv, 3, 1, 383, 385
     14,            Relu, 1, 1, 385, 386
     15,            Conv, 3, 1, 386, 388
     16,            Relu, 1, 1, 388, 389
     17,            Conv, 3, 1, 389, 391
     18,             Add, 2, 1, 383, 392
     19,            Conv, 3, 1, 392, 515
     20,            Relu, 1, 1, 515, 516
     21,            Conv, 3, 1, 392, 394
     22,            Relu, 1, 1, 394, 395
     23,            Conv, 3, 1, 395, 397
     24,            Relu, 1, 1, 397, 398
     25,            Conv, 3, 1, 398, 400
     26,            Conv, 3, 1, 400, 402
     27,            Relu, 1, 1, 402, 403
     28,            Conv, 3, 1, 403, 405
     29,            Relu, 1, 1, 405, 406
     30,            Conv, 3, 1, 406, 408
     31,             Add, 2, 1, 400, 409
     32,            Conv, 3, 1, 409, 411
     33,            Relu, 1, 1, 411, 412
     34,            Conv, 3, 1, 412, 414
     35,            Relu, 1, 1, 414, 415
     36,            Conv, 3, 1, 415, 417
     37,             Add, 2, 1, 409, 418
     38,            Conv, 3, 1, 418, 420
     39,            Relu, 1, 1, 420, 421
     40,            Conv, 3, 1, 421, 423
     41,            Relu, 1, 1, 423, 424
     42,            Conv, 3, 1, 424, 426
     43,            Conv, 3, 1, 426, 428
     44,            Relu, 1, 1, 428, 429
     45,            Conv, 3, 1, 429, 431
     46,            Relu, 1, 1, 431, 432
     47,            Conv, 3, 1, 432, 434
     48,             Add, 2, 1, 426, 435
     49,            Conv, 3, 1, 435, 437
     50,            Relu, 1, 1, 437, 438
     51,            Conv, 3, 1, 438, 440
     52,            Relu, 1, 1, 440, 441
     53,            Conv, 3, 1, 441, 443
     54,             Add, 2, 1, 435, 444
     55,            Conv, 3, 1, 444, 446
     56,            Relu, 1, 1, 446, 447
     57,            Conv, 3, 1, 447, 449
     58,            Relu, 1, 1, 449, 450
     59,            Conv, 3, 1, 450, 452
     60,             Add, 2, 1, 444, 453
     61,            Conv, 3, 1, 453, 455
     62,            Relu, 1, 1, 455, 456
     63,            Conv, 3, 1, 456, 458
     64,            Relu, 1, 1, 458, 459
     65,            Conv, 3, 1, 459, 461
     66,            Conv, 3, 1, 461, 463
     67,            Relu, 1, 1, 463, 464
     68,            Conv, 3, 1, 464, 466
     69,            Relu, 1, 1, 466, 467
     70,            Conv, 3, 1, 467, 469
     71,             Add, 2, 1, 461, 470
     72,            Conv, 3, 1, 470, 472
     73,            Relu, 1, 1, 472, 473
     74,            Conv, 3, 1, 473, 475
     75,            Relu, 1, 1, 475, 476
     76,            Conv, 3, 1, 476, 478
     77,             Add, 2, 1, 470, 479
     78,            Conv, 3, 1, 479, 481
     79,            Relu, 1, 1, 481, 482
     80,            Conv, 3, 1, 482, 484
     81,            Relu, 1, 1, 484, 485
     82,            Conv, 3, 1, 485, 487
     83,            Conv, 3, 1, 487, 489
     84,            Relu, 1, 1, 489, 490
     85,            Conv, 3, 1, 490, 492
     86,            Relu, 1, 1, 492, 493
     87,            Conv, 3, 1, 493, 495
     88,             Add, 2, 1, 487, 496
     89,            Conv, 3, 1, 496, 498
     90,            Relu, 1, 1, 498, 499
     91,            Conv, 3, 1, 499, 501
     92,            Relu, 1, 1, 501, 502
     93,            Conv, 3, 1, 502, 504
     94,             Add, 2, 1, 496, 505
     95,            Conv, 3, 1, 505, 507
     96,            Relu, 1, 1, 507, 508
     97,            Conv, 3, 1, 508, 510
     98,            Relu, 1, 1, 510, 511
     99,            Conv, 3, 1, 511, 513
    100,            Conv, 3, 1, 513, 518
    101,            Relu, 1, 1, 518, 519
    102,            Conv, 3, 1, 513, 521
    103,            Relu, 1, 1, 521, 522
    104,            Conv, 3, 1, 522, 524
    105,            Relu, 1, 1, 524, 525
    106,            Conv, 3, 1, 513, 527
    107,            Relu, 1, 1, 527, 528
    108,            Conv, 3, 1, 528, 530
    109,            Relu, 1, 1, 530, 531
    110,            Conv, 3, 1, 513, 533
    111,            Relu, 1, 1, 533, 534
    112,            Conv, 3, 1, 534, 536
    113,            Relu, 1, 1, 536, 537
    114,          Concat, 4, 1, 519, 538
    115,            Conv, 3, 1, 538, 540
    116,            Relu, 1, 1, 540, 541
    117,          Resize, 3, 1, 541, 551
    118,          Concat, 2, 1, 551, 552
    119,            Conv, 3, 1, 552, 554
    120,            Conv, 2, 1, 554, 555
    121,          Resize, 3, 1, 555, 565
    122,          ArgMax, 1, 1, 565, 566
    123,            Cast, 1, 1, 566, 566TIDL_cast_out
    
    Input tensor name -  input.1Net_IN 
    Output tensor name - 566TIDL_cast_out 
    In TIDL_onnxRtImportInit subgraph_name=566TIDL_cast_out
    Layer 0, subgraph id 566TIDL_cast_out, name=566TIDL_cast_out
    Layer 1, subgraph id 566TIDL_cast_out, name=input.1Net_IN
    In TIDL_runtimesOptimizeNet: LayerIndex = 126, dataIndex = 125 
    Warning : Requested Output Data Convert Layer is not Added to the network, It is currently not Optimal
    
     ************** Frame index 1 : Running float import ************* 
    In TIDL_runtimesPostProcessNet 
    INFORMATION: [TIDL_ResizeLayer] 571 Any resize ratio which is power of 2 and greater than 4 will be placed by combination of 4x4 resize layer and 2x2 resize layer. For example a 8x8 resize will be replaced by 4x4 resize followed by 2x2 resize.
    INFORMATION: [TIDL_ResizeLayer] 576 Any resize ratio which is power of 2 and greater than 4 will be placed by combination of 4x4 resize layer and 2x2 resize layer. For example a 8x8 resize will be replaced by 4x4 resize followed by 2x2 resize.
    ****************************************************
    **          2 WARNINGS          0 ERRORS          **
    ****************************************************
    ************ in TIDL_subgraphRtCreate ************ 
     The soft limit is 2048
    The hard limit is 2048
    MEM: Init ... !!!
    MEM: Init ... Done !!!
     56.768821s:  VX_ZONE_INIT:Enabled
     56.768824s:  VX_ZONE_ERROR:Enabled
     56.768826s:  VX_ZONE_WARNING:Enabled
     56.769548s:  VX_ZONE_INIT:[tivxInit:184] Initialization Done !!!
    ************ TIDL_subgraphRtCreate done ************ 
     *******   In TIDL_subgraphRtInvoke  ******** 
       0         1.00000        13.00000       255.00000 6
       1         1.00000        13.00000       255.00000 6
       2         1.00000         0.00000         1.70939 6
       3         1.00000         0.00000         6.63638 6
       4         1.00000        -6.26595         4.70380 6
       5         1.00000         0.00000         4.82374 6
       6         1.00000         0.00000         3.33056 6
       7         1.00000        -2.43808         3.36087 6
       8         1.00000         0.00000         1.01476 6
       9         1.00000         0.00000         1.82469 6
      10         1.00000        -3.23967         2.83533 6
      11         1.00000        -4.95676         4.71945 6
      12         1.00000         0.00000         1.35142 6
      13         1.00000         0.00000         1.91816 6
      14         1.00000         0.00000         2.74106 6
      15         1.00000        -2.92971         2.87725 6
      16         1.00000         0.00000         0.92410 6
      17         1.00000         0.00000         1.58639 6
      18         1.00000        -2.13769         2.07801 6
      19         1.00000        -3.14827         3.24599 6
      20         1.00000         0.00000         1.27750 6
      21         1.00000         0.00000         1.47342 6
      22         1.00000        -2.48598         2.16254 6
      23         1.00000        -4.13893         3.83608 6
      24         1.00000         0.00000         1.87032 6
      25         1.00000         0.00000         3.24930 6
      26         1.00000        -2.22035         2.01417 6
      27         1.00000         0.00000         0.79210 6
      28         1.00000         0.00000         1.27267 6
      29         1.00000        -2.13012         1.73467 6
      30         1.00000        -2.59007         2.31123 6
      31         1.00000         0.00000         0.69329 6
      32         1.00000         0.00000         1.40499 6
      33         1.00000        -1.67600         1.13149 6
      34         1.00000        -2.67766         2.79643 6
      35         1.00000         0.00000         0.98111 6
      36         1.00000         0.00000         1.68029 6
      37         1.00000        -1.26448         1.36133 6
      38         1.00000        -3.29024         3.32031 6
      39         1.00000         0.00000         1.53051 6
      40         1.00000         0.00000         3.13890 6
      41         1.00000        -1.89363         2.02331 6
      42         1.00000         0.00000         0.89560 6
      43         1.00000         0.00000         1.77731 6
      44         1.00000        -1.41292         1.19110 6
      45         1.00000        -1.90091         2.41924 6
      46         1.00000         0.00000         0.86644 6
      47         1.00000         0.00000         1.44425 6
      48         1.00000        -1.69233         1.44870 6
      49         1.00000        -2.41101         2.40814 6
      50         1.00000         0.00000         1.50448 6
      51         1.00000         0.00000         2.77799 6
      52         1.00000        -1.51747         1.50009 6
      53         1.00000         0.00000         3.22712 6
      54         1.00000         0.00000         1.32348 6
      55         1.00000        -1.28662         1.12000 6
      56         1.00000        -1.57279         1.41987 6
      57         1.00000         0.00000         1.23090 6
      58         1.00000         0.00000         1.38300 6
      59         1.00000        -0.90853         1.93517 6
      60         1.00000        -1.72649         1.73335 6
      61         1.00000         0.00000         0.75028 6
      62         1.00000         0.00000         3.64279 6
      63         1.00000        -1.76884         1.59342 6
      64         1.00000         0.00000         0.69980 6
      65         1.00000         0.00000         1.14739 6
      66         1.00000         0.00000         0.94319 6
      67         1.00000         0.00000         2.97372 6
      68         1.00000         0.00000         1.28226 6
      69         1.00000         0.00000         3.15015 6
      70         1.00000         0.00000         2.12302 6
      71         1.00000         0.00000         2.12302 6
      72         1.00000         0.00000         9.21194 6
      73         1.00000         0.00000         9.21098 6
      74         1.00000         0.00000         9.21098 6
      75         1.00000       -34.97995         7.52468 6
      76         1.00000       -51.73018        16.70612 6
      77         1.00000       -51.72625        16.53502 6
      78         1.00000         0.00000        21.00000 6
     Layer,   Layer Cycles,kernelOnlyCycles, coreLoopCycles,LayerSetupCycles,dmaPipeupCycles, dmaPipeDownCycles, PrefetchCycles,copyKerCoeffCycles,LayerDeinitCycles,LastBlockCycles, paddingTrigger,    paddingWait,LayerWithoutPad,LayerHandleCopy,   BackupCycles,  RestoreCycles,
    
         1,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
         2,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
         3,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
         4,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
         5,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
         6,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
         7,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
         8,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
         9,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        10,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        11,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        12,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        13,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        14,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        15,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        16,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        17,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        18,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        19,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        20,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        21,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        22,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        23,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        24,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        25,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        26,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        27,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        28,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        29,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        30,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        31,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        32,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        33,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        34,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        35,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        36,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        37,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        38,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        39,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        40,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        41,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        42,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        43,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        44,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        45,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        46,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        47,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        48,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        49,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        50,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        51,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        52,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        53,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        54,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        55,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        56,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        57,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        58,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        59,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        60,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        61,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        62,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        63,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        64,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        65,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        66,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        67,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        68,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        69,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        70,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        71,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        72,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        73,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        74,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        75,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        76,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        77,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
        78,              0,              0,              0,              0,              0,                 0,              0,                 0,              0,              0,              0,              0,              0,              0,              0,              0,
    
     Sum of Layer Cycles 0 
    Segmentation fault (core dumped)
    root@mypc:/opt/edgeai-tidl-tools/examples/osrt_python/ort# 

  • Hi Fred,

    Thank you for your patience. Yes, I was able to reproduce the error. We will look into this more deeply on our side.

    Regarding the original issue, you mentioned earlier that you were able to resolve it, but is this follow-up issue blocking your development? I am mainly asking in order to gauge the priority.

    Regards,

    Takuma.

  • It is not blocking, thanks for verifying.