
[Reference translation] TDA4VM: IPC_TEST exception occurs in version 7.03 RTOS+QNX



Please note: this content is machine translated and may contain grammatical or other translation errors; it is provided for reference only. For the accurate content, please refer to the English original at the link below or translate it yourself.

https://e2e.ti.com/support/processors-group/processors/f/processors-forum/1014568/tda4vm-ipc_test-exception-occured-in-version-7-03-rtos-qnx

Part Number: TDA4VM

References:

https://software-dl.ti.com/jacinto7/esd/processor-sdk-qnx-jacinto7/07_03_00_02/exports/docs/qnx_sdk_components_j721e.html#example-application

https://software-dl.ti.com/jacinto7/esd/processor-sdk-rtos-jacinto7/07_03_00_07/exports/docs/pdk_jacinto_07_03_00_29/docs/userguide/jacinto/modules/ipc.html#build

Perform the following steps to run ipc_test.

1.

cd $/ti-processor-sdk-rtos-j721e-evm-07_03_00_07/pdk_jacinto_07_03_00_29/packages/ti/build

make -s -j build_profile=release Board=j721e_evm core=mcu1_0 ex02_BIOS_multicore_echo_test

make -s -j build_profile=release Board=j721e_evm core=mcu2_0 ex02_BIOS_multicore_echo_test

make -s -j build_profile=release Board=j721e_evm core=mcu2_1 ex02_BIOS_multicore_echo_test

make -s -j build_profile=release Board=j721e_evm core=c66xdsp_1 ex02_BIOS_multicore_echo_test

make -s -j build_profile=release Board=j721e_evm core=c66xdsp_2 ex02_BIOS_multicore_echo_test

make -s -j build_profile=release Board=j721e_evm core=c7x_1 ex02_BIOS_multicore_echo_test
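The six build invocations above differ only in the core name, so they can be sketched as one loop (a dry run that prints each command; the core names are taken from the commands above and assumed to be the PDK build-system names):

```shell
# Print the build command for each remote core (dry run).
# Pipe the output to `sh` to actually run the builds.
CORES="mcu1_0 mcu2_0 mcu2_1 c66xdsp_1 c66xdsp_2 c7x_1"
for CORE in $CORES; do
  echo "make -s -j build_profile=release Board=j721e_evm core=$CORE ex02_BIOS_multicore_echo_test"
done
```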

2.

cd $/ti-processor-sdk-rtos-j721e-evm-07_03_00_07/pdk_jacinto_07_03_00_29/packages/ti/binary

cp ex02_BIOS_multicore_echo_test_c7x_1_release.xe71 /media/tda4/rootfs/lib/firmware/j7-c71_0-fw
cp ex02_BIOS_multicore_echo_test_c66xdsp_2_release.xe66 /media/tda4/rootfs/lib/firmware/j7-c66_1-fw
cp ex02_BIOS_multicore_echo_test_c66xdsp_1_release.xe66 /media/tda4/rootfs/lib/firmware/j7-c66_0-fw
cp ex02_BIOS_multicore_echo_test_mcu2_1_release.xer5f /media/tda4/rootfs/lib/firmware/j7-main-r5f0_1-fw
cp ex02_BIOS_multicore_echo_test_mcu2_0_release.xer5f /media/tda4/rootfs/lib/firmware/j7-main-r5f0_0-fw
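Step 2 is a fixed mapping from built binary to the firmware name the remote-processor loader expects, which can be sketched as a small helper (a dry run; binary names are assumed to match step 1's build output, and the mount point is taken from the commands above):

```shell
# Map each echo-test binary to its firmware name under the SD-card rootfs.
FW_DIR=/media/tda4/rootfs/lib/firmware
copy_fw() {
  # $1 = built binary, $2 = firmware name expected by the loader
  echo "cp $1 $FW_DIR/$2"   # drop the echo to actually copy
}
copy_fw ex02_BIOS_multicore_echo_test_c7x_1_release.xe71     j7-c71_0-fw
copy_fw ex02_BIOS_multicore_echo_test_c66xdsp_2_release.xe66 j7-c66_1-fw
copy_fw ex02_BIOS_multicore_echo_test_c66xdsp_1_release.xe66 j7-c66_0-fw
copy_fw ex02_BIOS_multicore_echo_test_mcu2_1_release.xer5f   j7-main-r5f0_1-fw
copy_fw ex02_BIOS_multicore_echo_test_mcu2_0_release.xer5f   j7-main-r5f0_0-fw
```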

3. Boot the EVM and run 'ipc_test -v'.

U-Boot SPL 2020.01-g2781231a33 (Apr 10 2021 - 01:08:23 +0000)
SYSFW ABI: 3.1 (firmware rev 0x0015 '21.1.1--v2021.01a (Terrific Lla')
Trying to boot from MMC2
Loading Environment from MMC... *** Warning - No MMC card found, using default environment

Starting ATF on ARM64 core...

NOTICE:  BL31: v2.4(release):07.03.00.005-dirty
NOTICE:  BL31: Built : 00:15:40, Apr 10 2021

U-Boot SPL 2020.01-g2781231a33 (Apr 10 2021 - 00:17:14 +0000)
SYSFW ABI: 3.1 (firmware rev 0x0015 '21.1.1--v2021.01a (Terrific Lla')
Detected: J7X-BASE-CPB rev E3
Detected: J7X-VSC8514-ETH rev E2
Trying to boot from MMC2


U-Boot 2020.01-g2781231a33 (Apr 10 2021 - 00:17:14 +0000)

SoC:   J721E SR1.0
Model: Texas Instruments K3 J721E SoC
Board: J721EX-PM2-SOM rev E7
DRAM:  4 GiB
not found for dev hbmc-mux
Flash: 0 Bytes
MMC:   sdhci@4f80000: 0, sdhci@4fb0000: 1
Loading Environment from MMC... OK
In:    serial@2800000
Out:   serial@2800000
Err:   serial@2800000
Detected: J7X-BASE-CPB rev E3
Detected: J7X-VSC8514-ETH rev E2
Net:   K3 CPSW: nuss_ver: 0x6BA00101 cpsw_ver: 0x6BA80100 ale_ver: 0x00293904 Ports:1 mdio_freq:1000000

Warning: ethernet@46000000 using MAC address from ROM
eth0: ethernet@46000000
Hit any key to stop autoboot:  0
switch to partitions #0, OK
mmc1 is current device
SD/MMC found on device 1
526 bytes read in 3 ms (170.9 KiB/s)
Loaded env from uEnv.txt
Importing environment from mmc1 ...
Running uenvcmd ...
Core 1 is already in use. No rproc commands work
Core 2 is already in use. No rproc commands work
4286448 bytes read in 91 ms (44.9 MiB/s)
Load Remote Processor 2 with data@addr=0x82000000 4286448 bytes: Success!
4286420 bytes read in 91 ms (44.9 MiB/s)
Load Remote Processor 3 with data@addr=0x82000000 4286420 bytes: Success!
5444692 bytes read in 116 ms (44.8 MiB/s)
Load Remote Processor 6 with data@addr=0x82000000 5444692 bytes: Success!
5444708 bytes read in 115 ms (45.2 MiB/s)
Load Remote Processor 7 with data@addr=0x82000000 5444708 bytes: Success!
11479872 bytes read in 37 ms (295.9 MiB/s)
Load Remote Processor 8 with data@addr=0x82000000 11479872 bytes: Success!
8176868 bytes read in 173 ms (45.1 MiB/s)
## Starting application at 0x80080000 ...
MMU: 16-bit ASID 44-bit PA TCR_EL1=b5183519
cpu0: MPIDR=80000000
cpu0: MIDR=411fd080 Cortex-A72 r1p0
cpu0: CWG=4 ERG=4 Dminline=4 Iminline=4 PIPT
cpu0: CLIDR=a200023 LoUU=1 LoC=2 LoUIS=1
cpu0: L1 Icache 48K linesz=64 set/way=256/3
cpu0: L1 Dcache 32K linesz=64 set/way=256/2
cpu0: L2 Unified 1024K linesz=64 set/way=1024/16
Display set to R5
Loading IFS...decompressing...done
cpu1: MPIDR=80000001
cpu1: MIDR=411fd080 Cortex-A72 r1p0
cpu1: CWG=4 ERG=4 Dminline=4 Iminline=4 PIPT
cpu1: CLIDR=a200023 LoUU=1 LoC=2 LoUIS=1
cpu1: L1 Icache 48K linesz=64 set/way=256/3
cpu1: L1 Dcache 32K linesz=64 set/way=256/2
cpu1: L2 Unified 1024K linesz=64 set/way=1024/16

System page at phys:0000000080011000 user:ffffff8040254000 kern:ffffff8040251000
Starting next program at vffffff8060086e10
All ClockCycles offsets within tolerance
Welcome to QNX Neutrino 7.1.0 on the TI J721E EVM Board!!
Starting random service ...
start serial driver
Starting MMC/SD memory card driver... eMMC
Starting MMC/SD memory card driver... SD
Starting XHCI driver on USB3SS0 and USB3SS1
Path=0 - am65x
 target=0 lun=0     Direct-Access(0) - SDMMC: S0J56X Rev: 1.0
Setting environment variables...
done..
Mounting the sd ..
Looking for user script to run: /ti_fs/scripts/user.sh
Running user script...
user.sh called...
Setting additional environment variables...
Starting tisci-mgr..
Starting shmemallocator..
Starting tiipc-mgr..
Mailbox_plugInterrupt: interrupt Number 489, arg 0x4C48B018
Mailbox_plugInterrupt: interrupt Number 490, arg 0x4C48B1B8
Mailbox_plugInterrupt: interrupt Number 491, arg 0x4C48B358
Mailbox_plugInterrupt: interrupt Number 492, arg 0x4C48B4F8
Mailbox_plugInterrupt: interrupt Number 493, arg 0x4C48B698

Process 57360 (tiipc-mgr) terminated SIGSEGV code=1 fltno=11 ip=000000384c43f798(/ti_fs/tibin/tiipc-mgr@lose+0x0000000000003438) mapaddr=000000000000f798. ref=0000005230448bf0
Memory fault (core dumped)
Starting tiudma-mgr..
Start screen..
screen started with dss_on_r5 configuration..
done...
J7EVM@QNX:/# ipc_test -v
IPC_echo_test (core : ) .....
responderFxn will stay active. Please use ctrl-c to exit the test when finished.
RecvTask: Failed to create endpoint
SendTask 1: Failed to create message endpoint
SendTask 2: Failed to create message endpoint
SendTask 3: Failed to create message endpoint
SendTask 4: Failed to create message endpoint
SendTask 5: Failed to create message endpoint
SendTask 6: Failed to create message endpoint
SendTask 7: Failed to create message endpoint
SendTask 8: Failed to create message endpoint
SendTask 9: Failed to create message endpoint

From the log, ipc_test does not work. I then modified $/ti-processor-sdk-rtos-j721e-evm-07_03_00_07/pdk_jacinto_07_03_00_29/packages/ti/drv/ipc/examples/common/src/ipc_setup.h:

/* Set the base address for each device */
#ifdef SOC_AM65XX
#define VRING_BASE_ADDRESS 0xA2000000U
#elif defined (SOC_J7200)
#define VRING_BASE_ADDRESS 0xA4000000U
#elif defined (SOC_AM64X)
#define VRING_BASE_ADDRESS 0xA5000000U
#else
//#define VRING_BASE_ADDRESS 0xAA000000U
#define VRING_BASE_ADDRESS 0xB0000000U
#endif
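The effect of the #ifdef chain can be illustrated with a small helper that returns the VRING base for a given SoC macro (a sketch that mirrors the header's logic, including the modified default branch; it does not invoke the PDK toolchain):

```shell
# Mirror ipc_setup.h's #ifdef chain: print the VRING base for a SoC macro.
vring_base() {
  case "$1" in
    SOC_AM65XX) echo 0xA2000000 ;;
    SOC_J7200)  echo 0xA4000000 ;;
    SOC_AM64X)  echo 0xA5000000 ;;
    *)          echo 0xB0000000 ;;   # default branch after the modification above
  esac
}
vring_base SOC_J721E   # J721E takes the default branch
```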

Boot the EVM again.

U-Boot SPL 2020.01-g2781231a33 (Apr 10 2021 - 01:08:23 +0000)
SYSFW ABI: 3.1 (firmware rev 0x0015 '21.1.1--v2021.01a (Terrific Lla')
Trying to boot from MMC2
Loading Environment from MMC... *** Warning - No MMC card found, using default environment

Starting ATF on ARM64 core...

NOTICE:  BL31: v2.4(release):07.03.00.005-dirty
NOTICE:  BL31: Built : 00:15:40, Apr 10 2021

U-Boot SPL 2020.01-g2781231a33 (Apr 10 2021 - 00:17:14 +0000)
SYSFW ABI: 3.1 (firmware rev 0x0015 '21.1.1--v2021.01a (Terrific Lla')
Detected: J7X-BASE-CPB rev E3
Detected: J7X-VSC8514-ETH rev E2
Trying to boot from MMC2


U-Boot 2020.01-g2781231a33 (Apr 10 2021 - 00:17:14 +0000)

SoC:   J721E SR1.0
Model: Texas Instruments K3 J721E SoC
Board: J721EX-PM2-SOM rev E7
DRAM:  4 GiB
not found for dev hbmc-mux
Flash: 0 Bytes
MMC:   sdhci@4f80000: 0, sdhci@4fb0000: 1
Loading Environment from MMC... OK
In:    serial@2800000
Out:   serial@2800000
Err:   serial@2800000
Detected: J7X-BASE-CPB rev E3
Detected: J7X-VSC8514-ETH rev E2
Net:   K3 CPSW: nuss_ver: 0x6BA00101 cpsw_ver: 0x6BA80100 ale_ver: 0x00293904 Ports:1 mdio_freq:1000000

Warning: ethernet@46000000 using MAC address from ROM
eth0: ethernet@46000000
Hit any key to stop autoboot:  0
switch to partitions #0, OK
mmc1 is current device
SD/MMC found on device 1
526 bytes read in 3 ms (170.9 KiB/s)
Loaded env from uEnv.txt
Importing environment from mmc1 ...
Running uenvcmd ...
Core 1 is already in use. No rproc commands work
Core 2 is already in use. No rproc commands work
4286448 bytes read in 91 ms (44.9 MiB/s)
Load Remote Processor 2 with data@addr=0x82000000 4286448 bytes: Success!
4286420 bytes read in 91 ms (44.9 MiB/s)
Load Remote Processor 3 with data@addr=0x82000000 4286420 bytes: Success!
5444692 bytes read in 115 ms (45.2 MiB/s)
Load Remote Processor 6 with data@addr=0x82000000 5444692 bytes: Success!
5444708 bytes read in 116 ms (44.8 MiB/s)
Load Remote Processor 7 with data@addr=0x82000000 5444708 bytes: Success!
11479872 bytes read in 37 ms (295.9 MiB/s)
Load Remote Processor 8 with data@addr=0x82000000 11479872 bytes: Success!
8176868 bytes read in 173 ms (45.1 MiB/s)
## Starting application at 0x80080000 ...
MMU: 16-bit ASID 44-bit PA TCR_EL1=b5183519
cpu0: MPIDR=80000000
cpu0: MIDR=411fd080 Cortex-A72 r1p0
cpu0: CWG=4 ERG=4 Dminline=4 Iminline=4 PIPT
cpu0: CLIDR=a200023 LoUU=1 LoC=2 LoUIS=1
cpu0: L1 Icache 48K linesz=64 set/way=256/3
cpu0: L1 Dcache 32K linesz=64 set/way=256/2
cpu0: L2 Unified 1024K linesz=64 set/way=1024/16
Display set to R5
Loading IFS...decompressing...done
cpu1: MPIDR=80000001
cpu1: MIDR=411fd080 Cortex-A72 r1p0
cpu1: CWG=4 ERG=4 Dminline=4 Iminline=4 PIPT
cpu1: CLIDR=a200023 LoUU=1 LoC=2 LoUIS=1
cpu1: L1 Icache 48K linesz=64 set/way=256/3
cpu1: L1 Dcache 32K linesz=64 set/way=256/2
cpu1: L2 Unified 1024K linesz=64 set/way=1024/16

System page at phys:0000000080011000 user:ffffff8040254000 kern:ffffff8040251000
Starting next program at vffffff8060086e10
All ClockCycles offsets within tolerance
Welcome to QNX Neutrino 7.1.0 on the TI J721E EVM Board!!
Starting random service ...
start serial driver
Starting MMC/SD memory card driver... eMMC
Starting MMC/SD memory card driver... SD
Starting XHCI driver on USB3SS0 and USB3SS1
Path=0 - am65x
 target=0 lun=0     Direct-Access(0) - SDMMC: S0J56X Rev: 1.0
Setting environment variables...
done..
Mounting the sd ..
Looking for user script to run: /ti_fs/scripts/user.sh
Running user script...
user.sh called...
Setting additional environment variables...
Starting tisci-mgr..
Starting shmemallocator..
Starting tiipc-mgr..
Mailbox_plugInterrupt: interrupt Number 489, arg 0x88A71018
Mailbox_plugInterrupt: interrupt Number 490, arg 0x88A711B8
Mailbox_plugInterrupt: interrupt Number 491, arg 0x88A71358
Mailbox_plugInterrupt: interrupt Number 492, arg 0x88A714F8
Mailbox_plugInterrupt: interrupt Number 493, arg 0x88A71698
Starting TI IPC Resmgr
Starting tiudma-mgr..
Start screen..
screen started with dss_on_r5 configuration..
done...
J7EVM@QNX:/# ipc_test -v
IPC_echo_test (core : mpu1_0) .....
responderFxn will stay active. Please use ctrl-c to exit the test when finished.
SendTask3: Sending "ping 0" from mpu1_0 to mcu2_0...
SendTask9: Sending "ping 0" from mpu1_0 to C7X_1...
SendTask4: Sending "ping 0" from mpu1_0 to mcu2_1...
SendTask7: Sending "ping 0" from mpu1_0 to C66X_1...
SendTask8: Sending "ping 0" from mpu1_0 to C66X_2...
SendTask9: RPMessage_recv failed with code -1
SendTask9: Received "ping 0" len 7 from C7X_1 endPt 13
SendTask4: RPMessage_recv failed with code -1
SendTask9: Sending "ping 1" from mpu1_0 to C7X_1...
SendTask4: Received "ping 0" len 7 from mcu2_1 endPt 13
SendTask4: Sending "ping 1" from mpu1_0 to mcu2_1...
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: RPMessage_recv failed with code -1
SendTask7: Received "ping 0" len 7 from C66X_1 endPt 13
SendTask8: RPMessage_recv failed with code -1
SendTask7: Sending "ping 1" from mpu1_0 to C66X_1...
SendTask9: RPMessage_recv failed with code -1
SendTask8: Received "ping 0" len 7 from C66X_2 endPt 13
SendTask4: RPMessage_recv failed with code -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: Received "ping 1" len 7 from C7X_1 endPt 13
SendTask8: Sending "ping 1" from mpu1_0 to C66X_2...
SendTask4: Received "ping 1" len 7 from mcu2_1 endPt 13
SendTask7: RPMessage_recv failed with code -1
SendTask9: Sending "ping 2" from mpu1_0 to C7X_1...
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: Sending "ping 2" from mpu1_0 to mcu2_1...
SendTask7: Received "ping 1" len 7 from C66X_1 endPt 13
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: RPMessage_recv failed with code -1
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: Sending "ping 2" from mpu1_0 to C66X_1...
SendTask9: RPMessage_recv failed with code -1
SendTask8: Received "ping 1" len 7 from C66X_2 endPt 13
SendTask4: RPMessage_recv failed with code -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: Received "ping 2" len 7 from C7X_1 endPt 13

Process 57360 (tiipc-mgr) terminated SIGSEGV code=1 fltno=11 ip=0000002588a22e60(/ti_fs/tibin/tiipc-mgr@lose+0x0000000000000b00) mapaddr=000000000000ce60. ref=0000002bf5bf04f0
SendTask8: Sending "ping 2" from mpu1_0 to C66X_2...
SendTask4: Received "ping 2" len 7 from mcu2_1 endPt 13
SendTask7: RPMessage_recv failed with code -1
SendTask9: Sending "ping 3" from mpu1_0 to C7X_1...
SendTask3: RPMessage_recv failed with code -1
SendTask4: Sending "ping 3" from mpu1_0 to mcu2_1...
SendTask7: Received "ping 2" len 7 from C66X_1 endPt 13
SendTask1: Sending "ping 0" from mpu1_0 to mcu1_0...
SendTask2: Sending "ping 0" from mpu1_0 to mcu1_1...
SendTask3: Received "ping 0" len 7 from mcu2_0 endPt 13
SendTask6: Sending "ping 0" from mpu1_0 to mcu3_1...
SendTask5: Sending "ping 0" from mpu1_0 to mcu3_0...
SendTask7: Sending "ping 3" from mpu1_0 to C66X_1...
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Sending "ping 1" from mpu1_0 to mcu2_0...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask4: RPMessage_recv failed with code -1
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask8: Received "ping 2" len 7 from C66X_2 endPt 13
SendTask7: RPMessage_recv failed with code -1
SendTask9: Received "ping 3" len 7 from C7X_1 endPt 13
SendTask4: Received "ping 3" len 7 from mcu2_1 endPt 13
SendTask3: RPMessage_recv failed with code -1
SendTask1: Received "ping 0" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 0" len 7 from mcu1_1 endPt 13
SendTask6: Received "ping 0" len 7 from mcu3_1 endPt 13
SendTask5: Received "ping 0" len 7 from mcu3_0 endPt 13
SendTask8: Sending "ping 3" from mpu1_0 to C66X_2...
SendTask7: Received "ping 3" len 7 from C66X_1 endPt 13
SendTask9: Sending "ping 4" from mpu1_0 to C7X_1...
SendTask4: Sending "ping 4" from mpu1_0 to mcu2_1...
SendTask3: Received "ping 1" len 7 from mcu2_0 endPt 13
SendTask1: Sending "ping 1" from mpu1_0 to mcu1_0...
SendTask2: Sending "ping 1" from mpu1_0 to mcu1_1...
SendTask6: Sending "ping 1" from mpu1_0 to mcu3_1...
SendTask5: Sending "ping 1" from mpu1_0 to mcu3_0...
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: Sending "ping 4" from mpu1_0 to C66X_1...
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Sending "ping 2" from mpu1_0 to mcu2_0...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: RPMessage_recv failed with code -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: RPMessage_recv failed with code -1
SendTask4: RPMessage_recv failed with code -1
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask8: Received "ping 3" len 7 from C66X_2 endPt 13
SendTask7: RPMessage_recv failed with code -1
SendTask9: Received "ping 4" len 7 from C7X_1 endPt 13
SendTask4: Received "ping 4" len 7 from mcu2_1 endPt 13
SendTask3: RPMessage_recv failed with code -1
SendTask1: Received "ping 1" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 1" len 7 from mcu1_1 endPt 13
SendTask6: Received "ping 1" len 7 from mcu3_1 endPt 13
SendTask5: Received "ping 1" len 7 from mcu3_0 endPt 13
SendTask8: Sending "ping 4" from mpu1_0 to C66X_2...
SendTask7: Received "ping 4" len 7 from C66X_1 endPt 13
SendTask9: Sending "ping 5" from mpu1_0 to C7X_1...
SendTask4: Sending "ping 5" from mpu1_0 to mcu2_1...
SendTask3: Received "ping 2" len 7 from mcu2_0 endPt 13
SendTask1: Sending "ping 2" from mpu1_0 to mcu1_0...
SendTask2: Sending "ping 2" from mpu1_0 to mcu1_1...
SendTask6: Sending "ping 2" from mpu1_0 to mcu3_1...
SendTask5: Sending "ping 2" from mpu1_0 to mcu3_0...
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: Sending "ping 5" from mpu1_0 to C66X_1...
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Sending "ping 3" from mpu1_0 to mcu2_0...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: RPMessage_recv failed with code -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: RPMessage_recv failed with code -1
SendTask4: RPMessage_recv failed with code -1
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask8: Received "ping 4" len 7 from C66X_2 endPt 13
SendTask7: RPMessage_recv failed with code -1
SendTask9: Received "ping 5" len 7 from C7X_1 endPt 13
SendTask4: Received "ping 5" len 7 from mcu2_1 endPt 13
SendTask3: RPMessage_recv failed with code -1
SendTask1: Received "ping 2" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 2" len 7 from mcu1_1 endPt 13
SendTask6: Received "ping 2" len 7 from mcu3_1 endPt 13
SendTask5: Received "ping 2" len 7 from mcu3_0 endPt 13
SendTask8: Sending "ping 5" from mpu1_0 to C66X_2...
SendTask7: Received "ping 5" len 7 from C66X_1 endPt 13
SendTask9: Sending "ping 6" from mpu1_0 to C7X_1...
SendTask4: Sending "ping 6" from mpu1_0 to mcu2_1...
SendTask3: Received "ping 3" len 7 from mcu2_0 endPt 13
SendTask1: Sending "ping 3" from mpu1_0 to mcu1_0...
SendTask2: Sending "ping 3" from mpu1_0 to mcu1_1...
SendTask6: Sending "ping 3" from mpu1_0 to mcu3_1...
SendTask5: Sending "ping 3" from mpu1_0 to mcu3_0...
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: Sending "ping 6" from mpu1_0 to C66X_1...
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Sending "ping 4" from mpu1_0 to mcu2_0...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: RPMessage_recv failed with code -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: RPMessage_recv failed with code -1
SendTask4: RPMessage_recv failed with code -1
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask8: Received "ping 5" len 7 from C66X_2 endPt 13
SendTask7: RPMessage_recv failed with code -1
SendTask9: Received "ping 6" len 7 from C7X_1 endPt 13
SendTask4: Received "ping 6" len 7 from mcu2_1 endPt 13
SendTask3: RPMessage_recv failed with code -1
SendTask1: Received "ping 3" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 3" len 7 from mcu1_1 endPt 13
SendTask6: Received "ping 3" len 7 from mcu3_1 endPt 13
SendTask5: Received "ping 3" len 7 from mcu3_0 endPt 13
SendTask8: Sending "ping 6" from mpu1_0 to C66X_2...
SendTask7: Received "ping 6" len 7 from C66X_1 endPt 13
SendTask9: Sending "ping 7" from mpu1_0 to C7X_1...
SendTask4: Sending "ping 7" from mpu1_0 to mcu2_1...
SendTask3: Received "ping 4" len 7 from mcu2_0 endPt 13
SendTask1: Sending "ping 4" from mpu1_0 to mcu1_0...
SendTask2: Sending "ping 4" from mpu1_0 to mcu1_1...
SendTask6: Sending "ping 4" from mpu1_0 to mcu3_1...
SendTask5: Sending "ping 4" from mpu1_0 to mcu3_0...
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: Sending "ping 7" from mpu1_0 to C66X_1...
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Sending "ping 5" from mpu1_0 to mcu2_0...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: RPMessage_recv failed with code -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: RPMessage_recv failed with code -1
SendTask4: RPMessage_recv failed with code -1
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask8: Received "ping 6" len 7 from C66X_2 endPt 13
SendTask7: RPMessage_recv failed with code -1
SendTask9: Received "ping 7" len 7 from C7X_1 endPt 13
SendTask4: Received "ping 7" len 7 from mcu2_1 endPt 13
SendTask3: RPMessage_recv failed with code -1
SendTask1: Received "ping 4" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 4" len 7 from mcu1_1 endPt 13
SendTask6: Received "ping 4" len 7 from mcu3_1 endPt 13
SendTask5: Received "ping 4" len 7 from mcu3_0 endPt 13
SendTask8: Sending "ping 7" from mpu1_0 to C66X_2...
SendTask7: Received "ping 7" len 7 from C66X_1 endPt 13
SendTask9: Sending "ping 8" from mpu1_0 to C7X_1...
SendTask4: Sending "ping 8" from mpu1_0 to mcu2_1...
SendTask3: Received "ping 5" len 7 from mcu2_0 endPt 13
SendTask1: Sending "ping 5" from mpu1_0 to mcu1_0...
SendTask2: Sending "ping 5" from mpu1_0 to mcu1_1...
SendTask6: Sending "ping 5" from mpu1_0 to mcu3_1...
SendTask5: Sending "ping 5" from mpu1_0 to mcu3_0...
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: Sending "ping 8" from mpu1_0 to C66X_1...
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Sending "ping 6" from mpu1_0 to mcu2_0...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: RPMessage_recv failed with code -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: RPMessage_recv failed with code -1
SendTask4: RPMessage_recv failed with code -1
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask8: Received "ping 7" len 7 from C66X_2 endPt 13
SendTask7: RPMessage_recv failed with code -1
SendTask9: Received "ping 8" len 7 from C7X_1 endPt 13
SendTask4: Received "ping 8" len 7 from mcu2_1 endPt 13
SendTask3: RPMessage_recv failed with code -1
SendTask1: Received "ping 5" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 5" len 7 from mcu1_1 endPt 13
SendTask6: Received "ping 5" len 7 from mcu3_1 endPt 13
SendTask5: Received "ping 5" len 7 from mcu3_0 endPt 13
SendTask8: Sending "ping 8" from mpu1_0 to C66X_2...
SendTask7: Received "ping 8" len 7 from C66X_1 endPt 13
SendTask9: Sending "ping 9" from mpu1_0 to C7X_1...
SendTask4: Sending "ping 9" from mpu1_0 to mcu2_1...
SendTask3: Received "ping 6" len 7 from mcu2_0 endPt 13
SendTask1: Sending "ping 6" from mpu1_0 to mcu1_0...
SendTask2: Sending "ping 6" from mpu1_0 to mcu1_1...
SendTask6: Sending "ping 6" from mpu1_0 to mcu3_1...
SendTask5: Sending "ping 6" from mpu1_0 to mcu3_0...
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: Sending "ping 9" from mpu1_0 to C66X_1...
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Sending "ping 7" from mpu1_0 to mcu2_0...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: RPMessage_recv failed with code -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: RPMessage_recv failed with code -1
SendTask4: RPMessage_recv failed with code -1
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask8: Received "ping 8" len 7 from C66X_2 endPt 13
SendTask7: RPMessage_recv failed with code -1
SendTask9: Received "ping 9" len 7 from C7X_1 endPt 13
SendTask4: Received "ping 9" len 7 from mcu2_1 endPt 13
SendTask3: RPMessage_recv failed with code -1
SendTask1: Received "ping 6" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 6" len 7 from mcu1_1 endPt 13
SendTask6: Received "ping 6" len 7 from mcu3_1 endPt 13
SendTask5: Received "ping 6" len 7 from mcu3_0 endPt 13
SendTask8: Sending "ping 9" from mpu1_0 to C66X_2...
SendTask7: Received "ping 9" len 7 from C66X_1 endPt 13
SendTask9: mpu1_0 <--> C7X_1, Ping- 10, pong - 10 completed
SendTask4: mpu1_0 <--> mcu2_1, Ping- 10, pong - 10 completed
SendTask3: Received "ping 7" len 7 from mcu2_0 endPt 13
SendTask1: Sending "ping 7" from mpu1_0 to mcu1_0...
SendTask2: Sending "ping 7" from mpu1_0 to mcu1_1...
SendTask6: Sending "ping 7" from mpu1_0 to mcu3_1...
SendTask5: Sending "ping 7" from mpu1_0 to mcu3_0...
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: mpu1_0 <--> C66X_1, Ping- 10, pong - 10 completed
SendTask3: Sending "ping 8" from mpu1_0 to mcu2_0...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: RPMessage_recv failed with code -1
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask8: Received "ping 9" len 7 from C66X_2 endPt 13
SendTask3: RPMessage_recv failed with code -1
SendTask1: Received "ping 7" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 7" len 7 from mcu1_1 endPt 13
SendTask6: Received "ping 7" len 7 from mcu3_1 endPt 13
SendTask5: Received "ping 7" len 7 from mcu3_0 endPt 13
SendTask8: mpu1_0 <--> C66X_2, Ping- 10, pong - 10 completed
SendTask3: Received "ping 8" len 7 from mcu2_0 endPt 13
SendTask1: Sending "ping 8" from mpu1_0 to mcu1_0...
SendTask2: Sending "ping 8" from mpu1_0 to mcu1_1...
SendTask6: Sending "ping 8" from mpu1_0 to mcu3_1...
SendTask5: Sending "ping 8" from mpu1_0 to mcu3_0...
SendTask3: Sending "ping 9" from mpu1_0 to mcu2_0...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask3: RPMessage_recv failed with code -1
SendTask1: Received "ping 8" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 8" len 7 from mcu1_1 endPt 13
SendTask6: Received "ping 8" len 7 from mcu3_1 endPt 13
SendTask5: Received "ping 8" len 7 from mcu3_0 endPt 13
SendTask3: Received "ping 9" len 7 from mcu2_0 endPt 13
SendTask1: Sending "ping 9" from mpu1_0 to mcu1_0...
SendTask2: Sending "ping 9" from mpu1_0 to mcu1_1...
SendTask6: Sending "ping 9" from mpu1_0 to mcu3_1...
SendTask5: Sending "ping 9" from mpu1_0 to mcu3_0...
SendTask3: mpu1_0 <--> mcu2_0, Ping- 10, pong - 10 completed
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask1: Received "ping 9" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 9" len 7 from mcu1_1 endPt 13
SendTask6: Received "ping 9" len 7 from mcu3_1 endPt 13
SendTask5: Received "ping 9" len 7 from mcu3_0 endPt 13
SendTask1: mpu1_0 <--> mcu1_0, Ping- 10, pong - 10 completed
SendTask2: mpu1_0 <--> mcu1_1, Ping- 10, pong - 10 completed
SendTask6: mpu1_0 <--> mcu3_1, Ping- 10, pong - 10 completed
SendTask5: mpu1_0 <--> mcu3_0, Ping- 10, pong - 10 completed
A fault occurred and killed tiipc-mgr.

1. How can this problem be resolved?

2. From the log, IPC between the MPU and (mcu3_0, mcu3_1, mcu1_0, mcu1_1) works, yet there is no corresponding firmware in /media/tda4/rootfs/lib/firmware. Why is that?

I then followed:

TDA4VM: IPC fails in QNX environment - Processors forum - Processors - TI E2E support forums

It still fails.

U-Boot SPL 2020.01-g2781231a33 (Apr 10 2021 - 01:08:23 +0000)
SYSFW ABI: 3.1 (firmware rev 0x0015 '21.1.1--v2021.01a (Terrific Lla')
Trying to boot from MMC2
Loading Environment from MMC... *** Warning - No MMC card found, using default environment

Starting ATF on ARM64 core...

NOTICE:  BL31: v2.4(release):07.03.00.005-dirty
NOTICE:  BL31: Built : 00:15:40, Apr 10 2021

U-Boot SPL 2020.01-g2781231a33 (Apr 10 2021 - 00:17:14 +0000)
SYSFW ABI: 3.1 (firmware rev 0x0015 '21.1.1--v2021.01a (Terrific Lla')
Detected: J7X-BASE-CPB rev E3
Detected: J7X-VSC8514-ETH rev E2
Trying to boot from MMC2


U-Boot 2020.01-g2781231a33 (Apr 10 2021 - 00:17:14 +0000)

SoC:   J721E SR1.0
Model: Texas Instruments K3 J721E SoC
Board: J721EX-PM2-SOM rev E7
DRAM:  4 GiB
not found for dev hbmc-mux
Flash: 0 Bytes
MMC:   sdhci@4f80000: 0, sdhci@4fb0000: 1
Loading Environment from MMC... OK
In:    serial@2800000
Out:   serial@2800000
Err:   serial@2800000
Detected: J7X-BASE-CPB rev E3
Detected: J7X-VSC8514-ETH rev E2
Net:   K3 CPSW: nuss_ver: 0x6BA00101 cpsw_ver: 0x6BA80100 ale_ver: 0x00293904 Ports:1 mdio_freq:1000000

Warning: ethernet@46000000 using MAC address from ROM
eth0: ethernet@46000000
Hit any key to stop autoboot:  0
switch to partitions #0, OK
mmc1 is current device
SD/MMC found on device 1
526 bytes read in 3 ms (170.9 KiB/s)
Loaded env from uEnv.txt
Importing environment from mmc1 ...
Running uenvcmd ...
Core 1 is already in use. No rproc commands work
Core 2 is already in use. No rproc commands work
4286416 bytes read in 92 ms (44.4 MiB/s)
Load Remote Processor 2 with data@addr=0x82000000 4286416 bytes: Success!
4286388 bytes read in 92 ms (44.4 MiB/s)
Load Remote Processor 3 with data@addr=0x82000000 4286388 bytes: Success!
4286388 bytes read in 91 ms (44.9 MiB/s)
Load Remote Processor 4 with data@addr=0x82000000 4286388 bytes: Success!
4286388 bytes read in 92 ms (44.4 MiB/s)
Load Remote Processor 5 with data@addr=0x82000000 4286388 bytes: Success!
5444692 bytes read in 115 ms (45.2 MiB/s)
Load Remote Processor 6 with data@addr=0x82000000 5444692 bytes: Success!
5444708 bytes read in 116 ms (44.8 MiB/s)
Load Remote Processor 7 with data@addr=0x82000000 5444708 bytes: Success!
11479872 bytes read in 36 ms (304.1 MiB/s)
Load Remote Processor 8 with data@addr=0x82000000 11479872 bytes: Success!
8176868 bytes read in 173 ms (45.1 MiB/s)
## Starting application at 0x80080000 ...
MMU: 16-bit ASID 44-bit PA TCR_EL1=b5183519
cpu0: MPIDR=80000000
cpu0: MIDR=411fd080 Cortex-A72 r1p0
cpu0: CWG=4 ERG=4 Dminline=4 Iminline=4 PIPT
cpu0: CLIDR=a200023 LoUU=1 LoC=2 LoUIS=1
cpu0: L1 Icache 48K linesz=64 set/way=256/3
cpu0: L1 Dcache 32K linesz=64 set/way=256/2
cpu0: L2 Unified 1024K linesz=64 set/way=1024/16
Display set to R5
Loading IFS...decompressing...done
cpu1: MPIDR=80000001
cpu1: MIDR=411fd080 Cortex-A72 r1p0
cpu1: CWG=4 ERG=4 Dminline=4 Iminline=4 PIPT
cpu1: CLIDR=a200023 LoUU=1 LoC=2 LoUIS=1
cpu1: L1 Icache 48K linesz=64 set/way=256/3
cpu1: L1 Dcache 32K linesz=64 set/way=256/2
cpu1: L2 Unified 1024K linesz=64 set/way=1024/16

System page at phys:0000000080011000 user:ffffff8040254000 kern:ffffff8040251000
Starting next program at vffffff8060086e10
All ClockCycles offsets within tolerance
Welcome to QNX Neutrino 7.1.0 on the TI J721E EVM Board!!
Starting random service ...
start serial driver
Starting MMC/SD memory card driver... eMMC
Starting MMC/SD memory card driver... SD
Starting XHCI driver on USB3SS0 and USB3SS1
Path=0 - am65x
 target=0 lun=0     Direct-Access(0) - SDMMC: S0J56X Rev: 1.0
Setting environment variables...
done..
Mounting the sd ..
Looking for user script to run: /ti_fs/scripts/user.sh
Running user script...
user.sh called...
Setting additional environment variables...
Starting tisci-mgr..
Starting shmemallocator..
Starting tiipc-mgr..
Mailbox_plugInterrupt: interrupt Number 489, arg 0x6F942018
Mailbox_plugInterrupt: interrupt Number 490, arg 0x6F9421B8
Mailbox_plugInterrupt: interrupt Number 491, arg 0x6F942358
Mailbox_plugInterrupt: interrupt Number 492, arg 0x6F9424F8
Mailbox_plugInterrupt: interrupt Number 493, arg 0x6F942698
Starting TI IPC Resmgr
Starting tiudma-mgr..
Start screen..
screen started with dss_on_r5 configuration..
done...
J7EVM@QNX:/#
J7EVM@QNX:/#
J7EVM@QNX:/# ipc_test -v
IPC_echo_test (core : mpu1_0) .....
responderFxn will stay active. Please use ctrl-c to exit the test when finished.
SendTask3: Sending "ping 0" from mpu1_0 to mcu2_0...
SendTask6: Sending "ping 0" from mpu1_0 to mcu3_1...
SendTask9: Sending "ping 0" from mpu1_0 to C7X_1...
SendTask4: Sending "ping 0" from mpu1_0 to mcu2_1...
SendTask5: Sending "ping 0" from mpu1_0 to mcu3_0...
SendTask7: Sending "ping 0" from mpu1_0 to C66X_1...
SendTask8: Sending "ping 0" from mpu1_0 to C66X_2...

Process 57360 (tiipc-mgr) terminated SIGSEGV code=1 fltno=11 ip=0000005c6f8f3e60(/ti_fs/tibin/tiipc-mgr@lose+0x0000000000000b00) mapaddr=000000000000ce60. ref=00000063deaddcf0
SendTask8: RPMessage_recv failed with code -1
SendTask8: Received "ping 0" len 7 from C66X_2 endPt 13
SendTask7: RPMessage_recv failed with code -1
SendTask8: Sending "ping 1" from mpu1_0 to C66X_2...
SendTask5: RPMessage_recv failed with code -1
SendTask7: Received "ping 0" len 7 from C66X_1 endPt 13
SendTask4: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask5: Received "ping 0" len 7 from mcu3_0 endPt 13
SendTask6: RPMessage_recv failed with code -1
SendTask7: Sending "ping 1" from mpu1_0 to C66X_1...
SendTask3: RPMessage_recv failed with code -1
SendTask4: Received "ping 0" len 7 from mcu2_1 endPt 13
SendTask9: Received "ping 0" len 7 from C7X_1 endPt 13
SendTask1: Sending "ping 0" from mpu1_0 to mcu1_0...
SendTask5: Sending "ping 1" from mpu1_0 to mcu3_0...
SendTask2: Sending "ping 0" from mpu1_0 to mcu1_1...
SendTask6: Received "ping 0" len 7 from mcu3_1 endPt 13
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Received "ping 0" len 7 from mcu2_0 endPt 13
SendTask4: Sending "ping 1" from mpu1_0 to mcu2_1...
SendTask9: Sending "ping 1" from mpu1_0 to C7X_1...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: Sending "ping 1" from mpu1_0 to mcu3_1...
SendTask8: RPMessage_recv failed with code -1
SendTask7: RPMessage_recv failed with code -1
SendTask3: Sending "ping 1" from mpu1_0 to mcu2_0...
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: Received "ping 1" len 7 from C66X_2 endPt 13
SendTask7: Received "ping 1" len 7 from C66X_1 endPt 13
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask1: Received "ping 0" len 7 from mcu1_0 endPt 13
SendTask5: Received "ping 1" len 7 from mcu3_0 endPt 13
SendTask2: Received "ping 0" len 7 from mcu1_1 endPt 13
SendTask6: RPMessage_recv failed with code -1
SendTask8: Sending "ping 2" from mpu1_0 to C66X_2...
SendTask7: Sending "ping 2" from mpu1_0 to C66X_1...
SendTask3: RPMessage_recv failed with code -1
SendTask4: Received "ping 1" len 7 from mcu2_1 endPt 13
SendTask9: Received "ping 1" len 7 from C7X_1 endPt 13
SendTask1: Sending "ping 1" from mpu1_0 to mcu1_0...
SendTask5: Sending "ping 2" from mpu1_0 to mcu3_0...
SendTask2: Sending "ping 1" from mpu1_0 to mcu1_1...
SendTask6: Received "ping 1" len 7 from mcu3_1 endPt 13
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Received "ping 1" len 7 from mcu2_0 endPt 13
SendTask4: Sending "ping 2" from mpu1_0 to mcu2_1...
SendTask9: Sending "ping 2" from mpu1_0 to C7X_1...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: Sending "ping 2" from mpu1_0 to mcu3_1...
SendTask8: RPMessage_recv failed with code -1
SendTask7: RPMessage_recv failed with code -1
SendTask3: Sending "ping 2" from mpu1_0 to mcu2_0...
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: Received "ping 2" len 7 from C66X_2 endPt 13
SendTask7: Received "ping 2" len 7 from C66X_1 endPt 13
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask1: Received "ping 1" len 7 from mcu1_0 endPt 13
SendTask5: Received "ping 2" len 7 from mcu3_0 endPt 13
SendTask2: Received "ping 1" len 7 from mcu1_1 endPt 13
SendTask6: RPMessage_recv failed with code -1
SendTask8: Sending "ping 3" from mpu1_0 to C66X_2...
SendTask7: Sending "ping 3" from mpu1_0 to C66X_1...
SendTask3: RPMessage_recv failed with code -1
SendTask4: Received "ping 2" len 7 from mcu2_1 endPt 13
SendTask9: Received "ping 2" len 7 from C7X_1 endPt 13
SendTask1: Sending "ping 2" from mpu1_0 to mcu1_0...
SendTask5: Sending "ping 3" from mpu1_0 to mcu3_0...
SendTask2: Sending "ping 2" from mpu1_0 to mcu1_1...
SendTask6: Received "ping 2" len 7 from mcu3_1 endPt 13
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Received "ping 2" len 7 from mcu2_0 endPt 13
SendTask4: Sending "ping 3" from mpu1_0 to mcu2_1...
SendTask9: Sending "ping 3" from mpu1_0 to C7X_1...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: Sending "ping 3" from mpu1_0 to mcu3_1...
SendTask8: RPMessage_recv failed with code -1
SendTask7: RPMessage_recv failed with code -1
SendTask3: Sending "ping 3" from mpu1_0 to mcu2_0...
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: Received "ping 3" len 7 from C66X_2 endPt 13
SendTask7: Received "ping 3" len 7 from C66X_1 endPt 13
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask1: Received "ping 2" len 7 from mcu1_0 endPt 13
SendTask5: Received "ping 3" len 7 from mcu3_0 endPt 13
SendTask2: Received "ping 2" len 7 from mcu1_1 endPt 13
SendTask6: RPMessage_recv failed with code -1
SendTask8: Sending "ping 4" from mpu1_0 to C66X_2...
SendTask7: Sending "ping 4" from mpu1_0 to C66X_1...
SendTask3: RPMessage_recv failed with code -1
SendTask4: Received "ping 3" len 7 from mcu2_1 endPt 13
SendTask9: Received "ping 3" len 7 from C7X_1 endPt 13
SendTask1: Sending "ping 3" from mpu1_0 to mcu1_0...
SendTask5: Sending "ping 4" from mpu1_0 to mcu3_0...
SendTask2: Sending "ping 3" from mpu1_0 to mcu1_1...
SendTask6: Received "ping 3" len 7 from mcu3_1 endPt 13
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Received "ping 3" len 7 from mcu2_0 endPt 13
SendTask4: Sending "ping 4" from mpu1_0 to mcu2_1...
SendTask9: Sending "ping 4" from mpu1_0 to C7X_1...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: Sending "ping 4" from mpu1_0 to mcu3_1...
SendTask8: RPMessage_recv failed with code -1
SendTask7: RPMessage_recv failed with code -1
SendTask3: Sending "ping 4" from mpu1_0 to mcu2_0...
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: Received "ping 4" len 7 from C66X_2 endPt 13
SendTask7: Received "ping 4" len 7 from C66X_1 endPt 13
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask1: Received "ping 3" len 7 from mcu1_0 endPt 13
SendTask5: Received "ping 4" len 7 from mcu3_0 endPt 13
SendTask2: Received "ping 3" len 7 from mcu1_1 endPt 13
SendTask6: RPMessage_recv failed with code -1
SendTask8: Sending "ping 5" from mpu1_0 to C66X_2...
SendTask7: Sending "ping 5" from mpu1_0 to C66X_1...
SendTask3: RPMessage_recv failed with code -1
SendTask4: Received "ping 4" len 7 from mcu2_1 endPt 13
SendTask9: Received "ping 4" len 7 from C7X_1 endPt 13
SendTask1: Sending "ping 4" from mpu1_0 to mcu1_0...
SendTask5: Sending "ping 5" from mpu1_0 to mcu3_0...
SendTask2: Sending "ping 4" from mpu1_0 to mcu1_1...
SendTask6: Received "ping 4" len 7 from mcu3_1 endPt 13
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Received "ping 4" len 7 from mcu2_0 endPt 13
SendTask4: Sending "ping 5" from mpu1_0 to mcu2_1...
SendTask9: Sending "ping 5" from mpu1_0 to C7X_1...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: Sending "ping 5" from mpu1_0 to mcu3_1...
SendTask8: RPMessage_recv failed with code -1
SendTask7: RPMessage_recv failed with code -1
SendTask3: Sending "ping 5" from mpu1_0 to mcu2_0...
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: Received "ping 5" len 7 from C66X_2 endPt 13
SendTask7: Received "ping 5" len 7 from C66X_1 endPt 13
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask1: Received "ping 4" len 7 from mcu1_0 endPt 13
SendTask5: Received "ping 5" len 7 from mcu3_0 endPt 13
SendTask2: Received "ping 4" len 7 from mcu1_1 endPt 13
SendTask6: RPMessage_recv failed with code -1
SendTask8: Sending "ping 6" from mpu1_0 to C66X_2...
SendTask7: Sending "ping 6" from mpu1_0 to C66X_1...
SendTask3: RPMessage_recv failed with code -1
SendTask4: Received "ping 5" len 7 from mcu2_1 endPt 13
SendTask9: Received "ping 5" len 7 from C7X_1 endPt 13
SendTask1: Sending "ping 5" from mpu1_0 to mcu1_0...
SendTask5: Sending "ping 6" from mpu1_0 to mcu3_0...
SendTask2: Sending "ping 5" from mpu1_0 to mcu1_1...
SendTask6: Received "ping 5" len 7 from mcu3_1 endPt 13
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Received "ping 5" len 7 from mcu2_0 endPt 13
SendTask4: Sending "ping 6" from mpu1_0 to mcu2_1...
SendTask9: Sending "ping 6" from mpu1_0 to C7X_1...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: Sending "ping 6" from mpu1_0 to mcu3_1...
SendTask8: RPMessage_recv failed with code -1
SendTask7: RPMessage_recv failed with code -1
SendTask3: Sending "ping 6" from mpu1_0 to mcu2_0...
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: Received "ping 6" len 7 from C66X_2 endPt 13
SendTask7: Received "ping 6" len 7 from C66X_1 endPt 13
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask1: Received "ping 5" len 7 from mcu1_0 endPt 13
SendTask5: Received "ping 6" len 7 from mcu3_0 endPt 13
SendTask2: Received "ping 5" len 7 from mcu1_1 endPt 13
SendTask6: RPMessage_recv failed with code -1
SendTask8: Sending "ping 7" from mpu1_0 to C66X_2...
SendTask7: Sending "ping 7" from mpu1_0 to C66X_1...
SendTask3: RPMessage_recv failed with code -1
SendTask4: Received "ping 6" len 7 from mcu2_1 endPt 13
SendTask9: Received "ping 6" len 7 from C7X_1 endPt 13
SendTask1: Sending "ping 6" from mpu1_0 to mcu1_0...
SendTask5: Sending "ping 7" from mpu1_0 to mcu3_0...
SendTask2: Sending "ping 6" from mpu1_0 to mcu1_1...
SendTask6: Received "ping 6" len 7 from mcu3_1 endPt 13
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Received "ping 6" len 7 from mcu2_0 endPt 13
SendTask4: Sending "ping 7" from mpu1_0 to mcu2_1...
SendTask9: Sending "ping 7" from mpu1_0 to C7X_1...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: Sending "ping 7" from mpu1_0 to mcu3_1...
SendTask8: RPMessage_recv failed with code -1
SendTask7: RPMessage_recv failed with code -1
SendTask3: Sending "ping 7" from mpu1_0 to mcu2_0...
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: Received "ping 7" len 7 from C66X_2 endPt 13
SendTask7: Received "ping 7" len 7 from C66X_1 endPt 13
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask1: Received "ping 6" len 7 from mcu1_0 endPt 13
SendTask5: Received "ping 7" len 7 from mcu3_0 endPt 13
SendTask2: Received "ping 6" len 7 from mcu1_1 endPt 13
SendTask6: RPMessage_recv failed with code -1
SendTask8: Sending "ping 8" from mpu1_0 to C66X_2...
SendTask7: Sending "ping 8" from mpu1_0 to C66X_1...
SendTask3: RPMessage_recv failed with code -1
SendTask4: Received "ping 7" len 7 from mcu2_1 endPt 13
SendTask9: Received "ping 7" len 7 from C7X_1 endPt 13
SendTask1: Sending "ping 7" from mpu1_0 to mcu1_0...
SendTask5: Sending "ping 8" from mpu1_0 to mcu3_0...
SendTask2: Sending "ping 7" from mpu1_0 to mcu1_1...
SendTask6: Received "ping 7" len 7 from mcu3_1 endPt 13
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Received "ping 7" len 7 from mcu2_0 endPt 13
SendTask4: Sending "ping 8" from mpu1_0 to mcu2_1...
SendTask9: Sending "ping 8" from mpu1_0 to C7X_1...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: Sending "ping 8" from mpu1_0 to mcu3_1...
SendTask8: RPMessage_recv failed with code -1
SendTask7: RPMessage_recv failed with code -1
SendTask3: Sending "ping 8" from mpu1_0 to mcu2_0...
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: Received "ping 8" len 7 from C66X_2 endPt 13
SendTask7: Received "ping 8" len 7 from C66X_1 endPt 13
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask1: Received "ping 7" len 7 from mcu1_0 endPt 13
SendTask5: Received "ping 8" len 7 from mcu3_0 endPt 13
SendTask2: Received "ping 7" len 7 from mcu1_1 endPt 13
SendTask6: RPMessage_recv failed with code -1
SendTask8: Sending "ping 9" from mpu1_0 to C66X_2...
SendTask7: Sending "ping 9" from mpu1_0 to C66X_1...
SendTask3: RPMessage_recv failed with code -1
SendTask4: Received "ping 8" len 7 from mcu2_1 endPt 13
SendTask9: Received "ping 8" len 7 from C7X_1 endPt 13
SendTask1: Sending "ping 8" from mpu1_0 to mcu1_0...
SendTask5: Sending "ping 9" from mpu1_0 to mcu3_0...
SendTask2: Sending "ping 8" from mpu1_0 to mcu1_1...
SendTask6: Received "ping 8" len 7 from mcu3_1 endPt 13
SendTask8: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask7: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask3: Received "ping 8" len 7 from mcu2_0 endPt 13
SendTask4: Sending "ping 9" from mpu1_0 to mcu2_1...
SendTask9: Sending "ping 9" from mpu1_0 to C7X_1...
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask5: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: Sending "ping 9" from mpu1_0 to mcu3_1...
SendTask8: RPMessage_recv failed with code -1
SendTask7: RPMessage_recv failed with code -1
SendTask3: Sending "ping 9" from mpu1_0 to mcu2_0...
SendTask4: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask9: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask1: RPMessage_recv failed with code -1
SendTask5: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask6: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask8: Received "ping 9" len 7 from C66X_2 endPt 13
SendTask7: Received "ping 9" len 7 from C66X_1 endPt 13
SendTask3: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask4: RPMessage_recv failed with code -1
SendTask9: RPMessage_recv failed with code -1
SendTask1: Received "ping 8" len 7 from mcu1_0 endPt 13
SendTask5: Received "ping 9" len 7 from mcu3_0 endPt 13
SendTask2: Received "ping 8" len 7 from mcu1_1 endPt 13
SendTask6: RPMessage_recv failed with code -1
SendTask8: mpu1_0 <--> C66X_2, Ping- 10, pong - 10 completed
SendTask7: mpu1_0 <--> C66X_1, Ping- 10, pong - 10 completed
SendTask3: RPMessage_recv failed with code -1
SendTask4: Received "ping 9" len 7 from mcu2_1 endPt 13
SendTask9: Received "ping 9" len 7 from C7X_1 endPt 13
SendTask1: Sending "ping 9" from mpu1_0 to mcu1_0...
SendTask5: mpu1_0 <--> mcu3_0, Ping- 10, pong - 10 completed
SendTask2: Sending "ping 9" from mpu1_0 to mcu1_1...
SendTask6: Received "ping 9" len 7 from mcu3_1 endPt 13
SendTask3: Received "ping 9" len 7 from mcu2_0 endPt 13
SendTask4: mpu1_0 <--> mcu2_1, Ping- 10, pong - 10 completed
SendTask9: mpu1_0 <--> C7X_1, Ping- 10, pong - 10 completed
SendTask1: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask2: rpmsg_senderFxn: RPMessage_send  failed status -1
SendTask6: mpu1_0 <--> mcu3_1, Ping- 10, pong - 10 completed
SendTask3: mpu1_0 <--> mcu2_0, Ping- 10, pong - 10 completed
SendTask1: RPMessage_recv failed with code -1
SendTask2: RPMessage_recv failed with code -1
SendTask1: Received "ping 9" len 7 from mcu1_0 endPt 13
SendTask2: Received "ping 9" len 7 from mcu1_1 endPt 13
SendTask1: mpu1_0 <--> mcu1_0, Ping- 10, pong - 10 completed
SendTask2: mpu1_0 <--> mcu1_1, Ping- 10, pong - 10 completed

  • Note: this content was machine translated and may contain grammatical or other errors. It is provided for reference only; for accurate content, please see the English original at the link, or translate it yourself.

    Then I verified IPC_TEST in version 7.02 (RTOS+QNX). It works.

    U-Boot SPL 2020.01-g2781231a33 (Apr 10 2021 - 01:08:23 +0000)
    SYSFW ABI: 3.1 (firmware rev 0x0015 '21.1.1--v2021.01a (Terrific Lla')
    Trying to boot from MMC2
    Loading Environment from MMC... *** Warning - No MMC card found, using default environment
    
    Starting ATF on ARM64 core...
    
    NOTICE:  BL31: v2.4(release):07.03.00.005-dirty
    NOTICE:  BL31: Built : 00:15:40, Apr 10 2021
    
    U-Boot SPL 2020.01-g2781231a33 (Apr 10 2021 - 00:17:14 +0000)
    SYSFW ABI: 3.1 (firmware rev 0x0015 '21.1.1--v2021.01a (Terrific Lla')
    Detected: J7X-BASE-CPB rev E3
    Detected: J7X-VSC8514-ETH rev E2
    Trying to boot from MMC2
    
    
    U-Boot 2020.01-g2781231a33 (Apr 10 2021 - 00:17:14 +0000)
    
    SoC:   J721E SR1.0
    Model: Texas Instruments K3 J721E SoC
    Board: J721EX-PM2-SOM rev E7
    DRAM:  4 GiB
    not found for dev hbmc-mux
    Flash: 0 Bytes
    MMC:   sdhci@4f80000: 0, sdhci@4fb0000: 1
    Loading Environment from MMC... OK
    In:    serial@2800000
    Out:   serial@2800000
    Err:   serial@2800000
    Detected: J7X-BASE-CPB rev E3
    Detected: J7X-VSC8514-ETH rev E2
    Net:   K3 CPSW: nuss_ver: 0x6BA00101 cpsw_ver: 0x6BA80100 ale_ver: 0x00293904 Ports:1 mdio_freq:1000000
    
    Warning: ethernet@46000000 using MAC address from ROM
    eth0: ethernet@46000000
    Hit any key to stop autoboot:  0
    switch to partitions #0, OK
    mmc1 is current device
    SD/MMC found on device 1
    530 bytes read in 3 ms (171.9 KiB/s)
    Loaded env from uEnv.txt
    Importing environment from mmc1 ...
    Running uenvcmd ...
    Core 1 is already in use. No rproc commands work
    Core 2 is already in use. No rproc commands work
    4274368 bytes read in 90 ms (45.3 MiB/s)
    Load Remote Processor 2 with data@addr=0x82000000 4274368 bytes: Success!
    4274352 bytes read in 90 ms (45.3 MiB/s)
    Load Remote Processor 3 with data@addr=0x82000000 4274352 bytes: Success!
    4274348 bytes read in 91 ms (44.8 MiB/s)
    Load Remote Processor 4 with data@addr=0x82000000 4274348 bytes: Success!
    4274352 bytes read in 91 ms (44.8 MiB/s)
    Load Remote Processor 5 with data@addr=0x82000000 4274352 bytes: Success!
    5433728 bytes read in 116 ms (44.7 MiB/s)
    Load Remote Processor 6 with data@addr=0x82000000 5433728 bytes: Success!
    5433740 bytes read in 114 ms (45.5 MiB/s)
    Load Remote Processor 7 with data@addr=0x82000000 5433740 bytes: Success!
    11460416 bytes read in 36 ms (303.6 MiB/s)
    Load Remote Processor 8 with data@addr=0x82000000 11460416 bytes: Success!
    8176360 bytes read in 173 ms (45.1 MiB/s)
    ## Starting application at 0x80080000 ...
    MMU: 16-bit ASID 44-bit PA TCR_EL1=b5183519
    cpu0: MPIDR=80000000
    cpu0: MIDR=411fd080 Cortex-A72 r1p0
    cpu0: CWG=4 ERG=4 Dminline=4 Iminline=4 PIPT
    cpu0: CLIDR=a200023 LoUU=1 LoC=2 LoUIS=1
    cpu0: L1 Icache 48K linesz=64 set/way=256/3
    cpu0: L1 Dcache 32K linesz=64 set/way=256/2
    cpu0: L2 Unified 1024K linesz=64 set/way=1024/16
    Display set to R5
    Loading IFS...decompressing...done
    cpu1: MPIDR=80000001
    cpu1: MIDR=411fd080 Cortex-A72 r1p0
    cpu1: CWG=4 ERG=4 Dminline=4 Iminline=4 PIPT
    cpu1: CLIDR=a200023 LoUU=1 LoC=2 LoUIS=1
    cpu1: L1 Icache 48K linesz=64 set/way=256/3
    cpu1: L1 Dcache 32K linesz=64 set/way=256/2
    cpu1: L2 Unified 1024K linesz=64 set/way=1024/16
    
    System page at phys:0000000080011000 user:ffffff8040254000 kern:ffffff8040251000
    Starting next program at vffffff8060086e10
    All ClockCycles offsets within tolerance
    Welcome to QNX Neutrino 7.1.0 on the TI J721E EVM Board!!
    Starting random service ...
    start serial driver
    Starting MMC/SD memory card driver... eMMC
    Starting MMC/SD memory card driver... SD
    Starting XHCI driver on USB3SS0 and USB3SS1
    Path=0 - am65x
     target=0 lun=0     Direct-Access(0) - SDMMC: S0J56X Rev: 1.0
    Setting environment variables...
    done..
    Looking for user script to run: /ti_fs/scripts/user.sh
    Running user script...
    user.sh called...
    Setting additional environment variables...
    Starting tisci-mgr..
    Starting shmemallocator..
    Starting tiipc-mgr..
    Mailbox_plugInterrupt: interrupt Number 489, arg 0xF0B05EA0
    Mailbox_plugInterrupt: interrupt Number 490, arg 0xF0B06040
    Mailbox_plugInterrupt: interrupt Number 491, arg 0xF0B061E0
    Mailbox_plugInterrupt: interrupt Number 492, arg 0xF0B06380
    Mailbox_plugInterrupt: interrupt Number 493, arg 0xF0B06520
    Starting TI IPC Resmgr
    Starting tiudma-mgr..
    Start screen..
    screen started with dss_on_r5 configuration..
    done...
    J7EVM@QNX:/# ipc_test
    IPC_echo_test (core : mpu1_0) .....
    responderFxn will stay active. Please use ctrl-c to exit the test when finished.
    SendTask9: mpu1_0 <--> C7X_1, Ping- 10, pong - 10 completed
    SendTask7: mpu1_0 <--> C66X_1, Ping- 10, pong - 10 completed
    SendTask8: mpu1_0 <--> C66X_2, Ping- 10, pong - 10 completed
    SendTask6: mpu1_0 <--> mcu3_1, Ping- 10, pong - 10 completed
    SendTask4: mpu1_0 <--> mcu2_1, Ping- 10, pong - 10 completed
    SendTask5: mpu1_0 <--> mcu3_0, Ping- 10, pong - 10 completed
    SendTask3: mpu1_0 <--> mcu2_0, Ping- 10, pong - 10 completed
    

    But there are no SendTasks between the MPU and mcu1. How can this be resolved?

  •

    Hello,

    For running IPC_TEST on PSDK 7.3, please see the default setup instructions at the link below; no code modifications are needed:

    (+) [FAQ] TDA4VM: IPC_Test on PSDK QNX 7.2 / PSDK QNX 7.3 - Processors forum - Processors - TI E2E support forums

    Regarding MPU-to-MCU1 communication: because an MCU1 image is already running, the "IPC echo test" firmware image is not loaded. This thread will be updated with suggested steps to work around the issue.

    KB

  •

    Hello KB,

      Thanks for your reply.

      1. Based on the IPC_TEST example, it appears MPU-to-MCU1 communication is not implemented in version 7.02 or 7.03. In addition, the system memory map differs from earlier versions.

      From the thread:

       TDA4VM: After building MCU1_0, running the Vision Apps demos for display produces an IPC error. - Processors forum - Processors - TI E2E support forums

      The firmware image that Vision Apps generates for MCU1_0 with IPC enabled is known not to work in version 7.03.

      So how can MCU1_0-to-MPU IPC communication be demonstrated?

      2. MPU-to-MCU1 IPC communication is an urgent development task in our project. We tried to meet this requirement with cddipc, but it failed.

         To use cddipc in the project as a reference, IPC communication must first work.

       I would like to know whether the cddipc module is a viable way to achieve MPU-to-MCU1 IPC communication.

      3. If cddipc does not work, what does TI recommend for implementing MPU-to-MCU1 IPC communication?

  •

    Hello,

    The issue is related to the firmware image running on MCU1. For the IPC test to succeed, the MCU1 firmware image needs to:

    • Have the IPC software running and able to respond to the specific test being run
    • Have a memory map that matches the rest of the system

    As noted in the previous response, TI will update this thread with suggested workaround steps.

    Regards,

    KB

  •

    Hello KB,

      Which release will implement the IPC_TEST functionality for the MCU? Alternatively, in which earlier release does the MCU IPC_TEST work correctly?

  •

    Hello,

    Refer to the FAQ for PSDK QNX 7.1:

    (+) [FAQ] PROCESSOR-SDK-DRA8X-TDA4X: How to use the IPC example application echo test on PSDKQA 7.0 - Processors forum - Processors - TI E2E support forums

    The log there shows mcu1_0 responding to the IPC echo test:

    "SendTask1: mpu1_0 <--> mcu1_0, Ping- 10, pong - 10 completed"

    Regards,

    KB

  •

    Hello KB,

      Thanks for your reply.

      Following your thread:

       (+) [FAQ] TDA4VM: IPC_Test on PSDK QNX 7.2 / PSDK QNX 7.3 - Processors forum - Processors - TI E2E support forums

      I got the same results as you.

  •

    Hello,

    The patch PSDK_QNX_07_03_03_patch is available from PROCESSOR-SDK-QNX-J721E_07.03.00.02 | TI.com and includes a README with instructions.

    After applying PSDK_QNX_07_03_03_patch and following the README instructions, MCU1_0 communication in the IPC echo test is expected to work correctly.

    Regards,

    KB