This thread has been locked.

TDA4VH-Q1: Mosaic error on SDK 8.6

Part Number: TDA4VH-Q1


Hi TI,

We are using the TDA4VH with SDK 8.6 and the mosaic node to stitch four 1920x1080 camera streams into a single 3840x2160 output. The following errors appear during stitching. How can we resolve this? Thank you.

[MCU2_0]     76.678657 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     76.711825 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     76.745147 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     76.778350 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     76.811661 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     76.844832 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     76.878109 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     76.911438 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     76.944598 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     76.977914 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.011137 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.044405 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.077614 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.110888 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.144195 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.177410 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.210707 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.243887 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.277194 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.310391 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.343669 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.376967 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.410175 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.443499 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.476675 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.509961 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.543172 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.576462 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.609755 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.642964 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.676259 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request
[MCU2_0]     77.709430 s:  VX_ZONE_ERROR:[tivxKernelImgMosaicMscDrvSubmit:1074] Failed to Submit Request

  • Hello, we are using Sony cameras. The input format is UYVY, and we want the mosaic node to convert the data to NV12 and then stitch the four streams.

  • Could you share your code?

  • Hello, the code is below. LDC and VISS are not enabled; the mosaic input is the capture node's UYVY output. Can the mosaic node directly accept UYVY data and convert it to NV12?

    /*
     *
     * Copyright (c) 2020 Texas Instruments Incorporated
     *
     * All rights reserved not granted herein.
     *
     * Limited License.
     *
     * Texas Instruments Incorporated grants a world-wide, royalty-free, non-exclusive
     * license under copyrights and patents it now or hereafter owns or controls to make,
     * have made, use, import, offer to sell and sell ("Utilize") this software subject to the
     * terms herein.  With respect to the foregoing patent license, such license is granted
     * solely to the extent that any such patent is necessary to Utilize the software alone.
     * The patent license shall not apply to any combinations which include this software,
     * other than combinations with devices manufactured by or for TI ("TI Devices").
     * No hardware patent is licensed hereunder.
     *
     * Redistributions must preserve existing copyright notices and reproduce this license
     * (including the above copyright notice and the disclaimer and (if applicable) source
     * code license limitations below) in the documentation and/or other materials provided
     * with the distribution
     *
     * Redistribution and use in binary form, without modification, are permitted provided
     * that the following conditions are met:
     *
     * *       No reverse engineering, decompilation, or disassembly of this software is
     * permitted with respect to any software provided in binary form.
     *
     * *       any redistribution and use are licensed by TI for use only with TI Devices.
     *
     * *       Nothing shall obligate TI to provide you with source code for the software
     * licensed and provided to you in object code.
     *
     * If software source code is provided to you, modification and redistribution of the
     * source code are permitted provided that the following conditions are met:
     *
     * *       any redistribution and use of the source code, including any resulting derivative
     * works, are licensed by TI for use only with TI Devices.
     *
     * *       any redistribution and use of any object code compiled from the source code
     * and any resulting derivative works, are licensed by TI for use only with TI Devices.
     *
     * Neither the name of Texas Instruments Incorporated nor the names of its suppliers
     *
     * may be used to endorse or promote products derived from this software without
     * specific prior written permission.
     *
     * DISCLAIMER.
     *
     * THIS SOFTWARE IS PROVIDED BY TI AND TI'S LICENSORS "AS IS" AND ANY EXPRESS
     * OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
     * OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
     * IN NO EVENT SHALL TI AND TI'S LICENSORS BE LIABLE FOR ANY DIRECT, INDIRECT,
     * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
     * BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
     * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
     * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
     * OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
     * OF THE POSSIBILITY OF SUCH DAMAGE.
     *
     */
    
    #include <utils/draw2d/include/draw2d.h>
    #include <utils/perf_stats/include/app_perf_stats.h>
    #include <utils/console_io/include/app_get.h>
    #include <utils/grpx/include/app_grpx.h>
    #include <VX/vx_khr_pipelining.h>
    
    #include "app_common.h"
    #include "app_sensor_module.h"
    #include "app_capture_module.h"
    #include "app_viss_module.h"
    #include "app_aewb_module.h"
    #include "app_ldc_module.h"
    #include "app_img_mosaic_module.h"
    #include "app_display_module.h"
    #include "app_test.h"
    
    #include <signal.h>
    #include <unistd.h>
    #include "camera_server.h"
    #include "camera_request.h"
    
    #define APP_BUFFER_Q_DEPTH   (8)
    #define APP_PIPELINE_DEPTH   (10)
    
    #define MOSAIC_OUT_WIDTH   (1920)
    #define MOSAIC_OUT_HEIGHT   (1080)
    
    typedef struct {
    
        SensorObj     sensorObj;
        CaptureObj    captureObj;
        VISSObj       vissObj;
        AEWBObj       aewbObj;
        LDCObj        ldcObj;
        ImgMosaicObj  imgMosaicObj;
        DisplayObj    displayObj;
    
        vx_char output_file_path[APP_MAX_FILE_PATH];
    
        /* OpenVX references */
        vx_context context;
        vx_graph   graph;
    
        vx_int32 en_out_img_write;
        vx_int32 test_mode;
    
        vx_uint32 is_interactive;
    
        vx_uint32 num_frames_to_run;
    
        vx_uint32 num_frames_to_write;
        vx_uint32 num_frames_to_skip;
    
        tivx_task task;
        tivx_task camera_task;
        vx_uint32 stop_task;
        vx_uint32 stop_task_done;
    
        app_perf_point_t total_perf;
        app_perf_point_t fileio_perf;
        app_perf_point_t draw_perf;
    
        int32_t enable_ldc;
        int32_t enable_viss;
        int32_t enable_aewb;
        int32_t enable_mosaic;
        int32_t enable_display;
    
        int32_t pipeline;
    
        int32_t enqueueCnt;
        int32_t dequeueCnt;
    
        int32_t write_file;
    
    } AppObj;
    
    AppObj gAppObj;
    
    static void app_parse_cmd_line_args(AppObj *obj, vx_int32 argc, vx_char *argv[]);
    static vx_status app_init(AppObj *obj);
    static void app_deinit(AppObj *obj);
    static vx_status app_create_graph(AppObj *obj);
    static vx_status app_verify_graph(AppObj *obj);
    static vx_status app_run_graph(AppObj *obj);
    static vx_status app_run_graph_interactive(AppObj *obj);
    static void app_delete_graph(AppObj *obj);
    static void app_default_param_set(AppObj *obj);
    static void app_update_param_set(AppObj *obj);
    static void app_pipeline_params_defaults(AppObj *obj);
    static void add_graph_parameter_by_node_index(vx_graph graph, vx_node node, vx_uint32 node_parameter_index);
    static vx_int32 calc_grid_size(vx_uint32 ch);
    static void set_img_mosaic_params(ImgMosaicObj *imgMosaicObj, vx_uint32 in_width, vx_uint32 in_height, vx_int32 numCh);
    
    static void app_show_usage(vx_int32 argc, vx_char* argv[])
    {
        printf("\n");
        printf(" Camera Demo - (c) Texas Instruments 2020\n");
        printf(" ========================================================\n");
        printf("\n");
        printf(" Usage,\n");
        printf("  %s --cfg <config file>\n", argv[0]);
        printf("\n");
    }
    
    static char menu[] = {
        "\n"
        "\n ========================="
        "\n Demo : Camera Demo"
        "\n ========================="
        "\n"
        "\n s: Save CSIx, VISS and LDC outputs"
        "\n"
        "\n p: Print performance statistics"
        "\n"
        "\n x: Exit"
        "\n"
        "\n Enter Choice: "
    };
    
    
    static void app_run_task(void *app_var)
    {
        AppObj *obj = (AppObj *)app_var;
        vx_status status = VX_SUCCESS;
        while((!obj->stop_task) && (status == VX_SUCCESS))
        {
            status = app_run_graph(obj);
        }
        obj->stop_task_done = 1;
    }
    
    static int32_t app_run_task_create(AppObj *obj)
    {
        tivx_task_create_params_t params;
        vx_status status;
    
        tivxTaskSetDefaultCreateParams(&params);
        params.task_main = app_run_task;
        params.app_var = obj;
    
        obj->stop_task_done = 0;
        obj->stop_task = 0;
    
        status = tivxTaskCreate(&obj->task, &params);
    
        return status;
    }
    
    static void app_run_task_delete(AppObj *obj)
    {
        while(obj->stop_task_done==0)
        {
            tivxTaskWaitMsecs(100);
        }
    
        tivxTaskDelete(&obj->task);
        tivxTaskDelete(&obj->camera_task);
    }
    
    static int run_flag = 1;
    
    static void SignalHandler(int sig)
    {
        run_flag = 0;
        printf("get Signal exit \n");
    }
    static vx_status app_run_graph_interactive(AppObj *obj)
    {
        vx_status status;
        uint32_t done = 0;
    
        char ch;
        FILE *fp;
        app_perf_point_t *perf_arr[1];
    
        signal(SIGTERM, SignalHandler);
        signal(SIGHUP, SignalHandler);
        signal(SIGUSR1, SignalHandler);
        signal(SIGQUIT, SignalHandler);
        signal(SIGINT, SignalHandler);
        signal(SIGKILL, SignalHandler);
    
        status = app_run_task_create(obj);
        if(status == VX_FAILURE)
        {
            printf("app_tidl: ERROR: Unable to create task\n");
        }
        else
        {
            appPerfStatsResetAll();
            while(!done)
            {
                printf(menu);
                ch = getchar();
                printf("\n");
    
                switch(ch)
                {
                    case 'p':
                        appPerfStatsPrintAll();
                        status = tivx_utils_graph_perf_print(obj->graph);
                        appPerfPointPrint(&obj->fileio_perf);
                        appPerfPointPrint(&obj->total_perf);
                        printf("\n");
                        appPerfPointPrintFPS(&obj->total_perf);
                        appPerfPointReset(&obj->total_perf);
                        printf("\n");
    
                        vx_reference refs[1];
                        refs[0] = (vx_reference)obj->captureObj.raw_image_arr[0];
                        if (status == VX_SUCCESS)
                        {
                            status = tivxNodeSendCommand(obj->captureObj.node, 0u,
                                        TIVX_CAPTURE_PRINT_STATISTICS,
                                        refs, 1u);
                        }
                        break;
                    case 'e':
                        perf_arr[0] = &obj->total_perf;
                        fp = appPerfStatsExportOpenFile(".", "basic_demos_app_multi_cam");
                        if (NULL != fp)
                        {
                            appPerfStatsExportAll(fp, perf_arr, 1);
                            if (status == VX_SUCCESS)
                            {
                                status = tivx_utils_graph_perf_export(fp, obj->graph);
                            }
                            appPerfStatsExportCloseFile(fp);
                            appPerfStatsResetAll();
                        }
                        else
                        {
                            printf("fp is null\n");
                        }
                        break;
                    case 's':
                        obj->write_file = 1;
                        break;
                    case 'x':
                        obj->stop_task = 1;
                        done = 1;
                        break;
                }
            }
            app_run_task_delete(obj);
        }
        return status;
    }
    
    static void app_set_cfg_default(AppObj *obj)
    {
        snprintf(obj->captureObj.output_file_path,APP_MAX_FILE_PATH, ".");
        snprintf(obj->vissObj.output_file_path,APP_MAX_FILE_PATH, ".");
        snprintf(obj->ldcObj.output_file_path,APP_MAX_FILE_PATH, ".");
    
        obj->captureObj.en_out_capture_write = 0;
        obj->vissObj.en_out_viss_write = 0;
        obj->ldcObj.en_out_ldc_write = 0;
    
        obj->num_frames_to_write = 0;
        obj->num_frames_to_skip = 0;
    }
    
    static void app_parse_cfg_file(AppObj *obj, vx_char *cfg_file_name)
    {
        FILE *fp = fopen(cfg_file_name, "r");
        vx_char line_str[1024];
        vx_char *token;
    
        if(fp==NULL)
        {
            printf("# ERROR: Unable to open config file [%s]\n", cfg_file_name);
            exit(0);
        }
    
        while(fgets(line_str, sizeof(line_str), fp)!=NULL)
        {
            vx_char s[]=" \t";
    
            if (strchr(line_str, '#'))
            {
                continue;
            }
    
            /* get the first token */
            token = strtok(line_str, s);
            if(token != NULL)
            {
                if(strcmp(token, "sensor_index")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        obj->sensorObj.sensor_index = atoi(token);
                    }
                }
                else
                if(strcmp(token, "num_frames_to_run")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        obj->num_frames_to_run = atoi(token);
                    }
                }
                else
                if(strcmp(token, "enable_error_detection")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        obj->captureObj.enable_error_detection = atoi(token);
                    }
                }
                else
                if(strcmp(token, "enable_ldc")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        obj->sensorObj.enable_ldc = atoi(token);
                        if(obj->sensorObj.enable_ldc > 1)
                            obj->sensorObj.enable_ldc = 1;
                    }
                }
                else
                if(strcmp(token, "en_out_img_write")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        obj->en_out_img_write = atoi(token);
                        if(obj->en_out_img_write > 1)
                            obj->en_out_img_write = 1;
                    }
                }
                else
                if(strcmp(token, "en_out_capture_write")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        obj->captureObj.en_out_capture_write = atoi(token);
                        if(obj->captureObj.en_out_capture_write > 1)
                            obj->captureObj.en_out_capture_write = 1;
                    }
                }
                else
                if(strcmp(token, "en_out_viss_write")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        obj->vissObj.en_out_viss_write = atoi(token);
                        if(obj->vissObj.en_out_viss_write > 1)
                            obj->vissObj.en_out_viss_write = 1;
                    }
                }
                else
                if(strcmp(token, "en_out_ldc_write")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        obj->ldcObj.en_out_ldc_write = atoi(token);
                        if(obj->ldcObj.en_out_ldc_write > 1)
                            obj->ldcObj.en_out_ldc_write = 1;
                    }
                }
                else
                if(strcmp(token, "output_file_path")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        token[strlen(token)-1]=0;
                        strcpy(obj->captureObj.output_file_path, token);
                        strcpy(obj->vissObj.output_file_path, token);
                        strcpy(obj->ldcObj.output_file_path, token);
                        strcpy(obj->output_file_path, token);
                    }
                }
                else
                if(strcmp(token, "display_option")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        obj->displayObj.display_option = atoi(token);
                        if(obj->displayObj.display_option > 1)
                            obj->displayObj.display_option = 1;
                    }
                }
                else
                if(strcmp(token, "is_interactive")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        token[strlen(token)-1]=0;
                        obj->is_interactive = atoi(token);
                        if(obj->is_interactive > 1)
                        {
                            obj->is_interactive = 1;
                        }
                    }
                    obj->sensorObj.is_interactive = obj->is_interactive;
                }
                else
                if(strcmp(token, "num_cameras_enabled")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        token[strlen(token)-1]=0;
                        obj->sensorObj.num_cameras_enabled = atoi(token);
                    }
                }
                else
                if(strcmp(token, "usecase_option")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        token[strlen(token)-1]=0;
                        obj->sensorObj.usecase_option = atoi(token);
                    }
                }
                else
                if(strcmp(token, "num_frames_to_write")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        token[strlen(token)-1]=0;
                        obj->num_frames_to_write = atoi(token);
                    }
                }
                else
                if(strcmp(token, "num_frames_to_skip")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        token[strlen(token)-1]=0;
                        obj->num_frames_to_skip = atoi(token);
                    }
                }
                else
                if(strcmp(token, "test_mode")==0)
                {
                    token = strtok(NULL, s);
                    if(token != NULL)
                    {
                        obj->test_mode = atoi(token);
                        obj->captureObj.test_mode = atoi(token);
                    }
                }
            }
        }
    
        fclose(fp);
    
    }
    
    static void app_parse_cmd_line_args(AppObj *obj, vx_int32 argc, vx_char *argv[])
    {
        vx_int32 i;
        vx_int16 num_test_cams = 0xFF, sensor_override = 0xFF;
        vx_bool set_test_mode = vx_false_e;
    
        app_set_cfg_default(obj);
    
        if(argc==1)
        {
            app_show_usage(argc, argv);
            exit(0);
        }
    
        for(i=0; i<argc; i++)
        {
            if(strcmp(argv[i], "--cfg")==0)
            {
                i++;
                if(i>=argc)
                {
                    app_show_usage(argc, argv);
                }
                app_parse_cfg_file(obj, argv[i]);
            }
            else
            if(strcmp(argv[i], "--help")==0)
            {
                app_show_usage(argc, argv);
                exit(0);
            }
            else
            if(strcmp(argv[i], "--test")==0)
            {
                set_test_mode = vx_true_e;
                // check to see if there is another argument following --test
                if (argc > i+1)
                {
                    num_test_cams = atoi(argv[i+1]);
                    // increment i again to avoid this arg
                    i++;
                }
            }
            else
            if(strcmp(argv[i], "--sensor")==0)
            {
                // check to see if there is another argument following --sensor
                if (argc > i+1)
                {
                    sensor_override = atoi(argv[i+1]);
                    // increment i again to avoid this arg
                    i++;
                }
            }
        }
    
        if (set_test_mode == vx_true_e)
        {
            obj->test_mode = 1;
            obj->captureObj.test_mode = 1;
            obj->is_interactive = 0;
            obj->sensorObj.is_interactive = 0;
            // set the number of test cams from cmd line
            if (num_test_cams != 0xFF)
            {
                obj->sensorObj.num_cameras_enabled = num_test_cams;
            }
            if (sensor_override != 0xFF)
            {
                obj->sensorObj.sensor_index = sensor_override;
            }
        }
    
        return;
    }
    
    static void camera_fault_run_task(void *app_var)
    {
        AppObj *obj = (AppObj *)app_var;
        vx_reference refs[1];
        vx_user_data_object capture_stats_obj;
        vx_map_id capture_stats_map_id;
        tivx_capture_statistics_t *capture_stats_struct;
        uint32_t *data_ptr;
        int camera_mask = 15;
        uint32_t reset_num = 0;
        uint32_t reset_flag = 0;
        uint32_t timeout = 3000;
        uint32_t node_report = 0;
    
    
        capture_stats_obj = vxCreateUserDataObject(obj->context, "tivx_capture_statistics_t" ,sizeof(tivx_capture_statistics_t), NULL);
        refs[0] = (vx_reference)capture_stats_obj;
    
        while (run_flag)
        {
            tivxTaskWaitMsecs(timeout);
            tivxNodeSendCommand(obj->captureObj.node, 0,TIVX_CAPTURE_GET_STATISTICS, refs, 1u);
            vxMapUserDataObject(
                    (vx_user_data_object)refs[0],
                    0,
                    sizeof(tivx_capture_statistics_t),
                    &capture_stats_map_id,
                    (void **)&data_ptr,
                    VX_READ_ONLY,
                    VX_MEMORY_TYPE_HOST,
                    0
                );
            capture_stats_struct = (tivx_capture_statistics_t*)data_ptr;
    
            //printf("capture_stats_struct->activeChannelMask = %d [%d]\n",capture_stats_struct->activeChannelMask,timeout);
            if (capture_stats_struct->activeChannelMask != camera_mask)
            {
                //camera_fault_deal(camera_mask,capture_stats_struct->activeChannelMask);
            }
    
            if ((capture_stats_struct->activeChannelMask != 15) && (reset_num < 5)) reset_flag = 1;
            else timeout = 3000;
    
            if ((reset_flag) && (reset_num < 5))
            {
                printf("Request to reset sensor [%d]\n",reset_num);
                //appResetLink(obj->sensorObj.sensor_name, ((1 << obj->sensorObj.num_cameras_enabled) - 1));
                reset_num++;
                //timeout = 1000;
            }
    
            vxUnmapUserDataObject((vx_user_data_object)refs[0], capture_stats_map_id);
            camera_mask = capture_stats_struct->activeChannelMask;
            if ((capture_stats_struct->activeChannelMask == 15)) {reset_flag = 0;reset_num = 0;timeout = 3000;}
    
            if (node_report == 0)
            {
                vx_status node_status=(vx_status)VX_FAILURE;
                vxQueryNode(obj->imgMosaicObj.node, (vx_enum)VX_NODE_STATUS, &node_status, sizeof(vx_status));
                if (node_status != 0)
                {
                    node_report = 1;
                    //camera_fault_report(1 /*mask*/, 0 /*index*/);
                }
            }
        }
        vxReleaseUserDataObject(&capture_stats_obj);
    }
    
    static int32_t camera_fault_register(AppObj *obj)
    {
        tivx_task_create_params_t params;
        vx_status status;
    
        tivxTaskSetDefaultCreateParams(&params);
        params.task_main = camera_fault_run_task;
        params.app_var = obj;
    
        status = tivxTaskCreate(&obj->camera_task, &params);
        camera_fault_init();
    
        return status;
    }
    
    vx_int32 app_multi_cam_main(vx_int32 argc, vx_char* argv[])
    {
        vx_status status = VX_SUCCESS;
    
        AppObj *obj = &gAppObj;
    
        /*Optional parameter setting*/
        app_default_param_set(obj);
    
        /*Config parameter reading*/
        app_parse_cmd_line_args(obj, argc, argv);
    
        /* Query sensor parameters */
        status = app_querry_sensor(&obj->sensorObj);
        if(1 == obj->sensorObj.sensor_out_format)
        {
            printf("YUV Input selected. VISS, AEWB and Mosaic nodes will be bypassed. \n");
            obj->enable_viss = 0;
            obj->enable_aewb = 0;
            obj->enable_mosaic = 1;
        }
        else
        {
            obj->enable_viss = 1;
            obj->enable_aewb = 1;
            obj->enable_mosaic = 1;
        }
    
        /* Update parameters after the config file is read */
        app_update_param_set(obj);
    
        if (status == VX_SUCCESS)
        {
            status = app_init(obj);
        }
        if(status == VX_SUCCESS)
        {
            APP_PRINTF("App Init Done! \n");
    
            status = app_create_graph(obj);
    
            if(status == VX_SUCCESS)
            {
                APP_PRINTF("App Create Graph Done! \n");
    
                status = app_verify_graph(obj);
    
                if(status == VX_SUCCESS)
                {
                    APP_PRINTF("App Verify Graph Done! \n");
    
                    if (status == VX_SUCCESS)
                    {
                        APP_PRINTF("App Send Error Frame Done! \n");
                        if(obj->is_interactive)
                        {
                            status = app_run_graph_interactive(obj);
                        }
                        else
                        {
                            status = app_run_graph(obj);
                        }
                    }
                }
            }
    
            APP_PRINTF("App Run Graph Done! \n");
        }
    
        app_delete_graph(obj);
    
        APP_PRINTF("App Delete Graph Done! \n");
    
        app_deinit(obj);
    
        APP_PRINTF("App De-init Done! \n");
    
        if(obj->test_mode == 1)
        {
            if((vx_false_e == test_result) || (status != VX_SUCCESS))
            {
                printf("\n\nTEST FAILED\n\n");
                print_new_checksum_structs();
                status = (status == VX_SUCCESS) ? VX_FAILURE : status;
            }
            else
            {
                printf("\n\nTEST PASSED\n\n");
            }
        }
        return status;
    }
    
    static vx_status app_init(AppObj *obj)
    {
        vx_status status = VX_SUCCESS;
        ImgMosaicObj  *pimgMosaicObj;
    
        /* Create OpenVx Context */
        obj->context = vxCreateContext();
        status = vxGetStatus((vx_reference)obj->context);
        APP_PRINTF("Creating context done!\n");
        if (status == VX_SUCCESS)
        {
            tivxHwaLoadKernels(obj->context);
            tivxImagingLoadKernels(obj->context);
            tivxFileIOLoadKernels(obj->context);
            APP_PRINTF("Kernel loading done!\n");
        }
    
        /* Initialize modules */
        printf("sensorObj->ch_mask = %d\n",obj->sensorObj.ch_mask);
        app_init_sensor(&obj->sensorObj, "sensor_obj");
    
        if (status == VX_SUCCESS)
        {
            APP_PRINTF("Sensor init done!\n");
            status = app_init_capture(obj->context, &obj->captureObj, &obj->sensorObj, "capture_obj", APP_BUFFER_Q_DEPTH);
        }
    
        if((1 == obj->enable_viss) && (status == VX_SUCCESS))
        {
            status = app_init_viss(obj->context, &obj->vissObj, &obj->sensorObj, "viss_obj", obj->sensorObj.num_cameras_enabled);
            APP_PRINTF("VISS init done!\n");
        }
    
        if((1 == obj->enable_aewb) && (status == VX_SUCCESS))
        {
            status = app_init_aewb(obj->context, &obj->aewbObj, &obj->sensorObj, "aewb_obj", 0, obj->sensorObj.num_cameras_enabled);
            APP_PRINTF("AEWB init done!\n");
        }
    
        if((obj->sensorObj.enable_ldc == 1) && (status == VX_SUCCESS))
        {
            status = app_init_ldc(obj->context, &obj->ldcObj, &obj->sensorObj, "ldc_obj", obj->sensorObj.num_cameras_enabled);
            APP_PRINTF("LDC init done!\n");
        }
    
    
        if((obj->enable_mosaic == 1) && (status == VX_SUCCESS))
        {
            status = app_init_img_mosaic(obj->context, &obj->imgMosaicObj, "img_mosaic_obj", APP_BUFFER_Q_DEPTH);
            APP_PRINTF("Img Mosaic init done!\n");
    
            pimgMosaicObj = &obj->imgMosaicObj;
    
            vx_int32 q;
            vx_imagepatch_addressing_t  image_addr;
            vx_rectangle_t              rect;
            vx_map_id                   map_id;
            void *data_ptr1, *data_ptr2;
    
            // for(q = 0; q < APP_BUFFER_Q_DEPTH; q++)
            // {
            //         rect.start_x = 0;
            //         rect.start_y = 0;
            //         rect.end_x = pimgMosaicObj->out_width;
            //         rect.end_y = pimgMosaicObj->out_height;
    
            //         status = vxMapImagePatch(pimgMosaicObj->output_image[q],
            //             &rect,
            //             0,
            //             &map_id,
            //             &image_addr,
            //             &data_ptr1,
            //             (vx_enum)VX_READ_ONLY,
            //             (vx_enum)VX_MEMORY_TYPE_HOST,
            //             (vx_enum)VX_NOGAP_X
            //             );
    
            //         vxUnmapImagePatch(pimgMosaicObj->output_image[q], map_id);
    
            //         rect.start_x = 0;
            //         rect.start_y = 0;
            //         rect.end_x = pimgMosaicObj->out_width;
            //         rect.end_y = pimgMosaicObj->out_height/2;
    
            //         status = vxMapImagePatch(pimgMosaicObj->output_image[q],
            //             &rect,
            //             1,
            //             &map_id,
            //             &image_addr,
            //             &data_ptr2,
            //             (vx_enum)VX_READ_ONLY,
            //             (vx_enum)VX_MEMORY_TYPE_HOST,
            //             (vx_enum)VX_NOGAP_X
            //             );
    
            //         vxUnmapImagePatch(pimgMosaicObj->output_image[q], map_id);
            // }
    
    
            // ExportOutputImageFd(pimgMosaicObj->output_image, APP_BUFFER_Q_DEPTH);
        }
    
        if((obj->enable_display == 1) && (status == VX_SUCCESS))
        {
            status = app_init_display(obj->context, &obj->displayObj, "display_obj");
            APP_PRINTF("Display init done!\n");
        }
    
        appPerfPointSetName(&obj->total_perf , "TOTAL");
        appPerfPointSetName(&obj->fileio_perf, "FILEIO");
        return status;
    }
    
    static void app_deinit(AppObj *obj)
    {
        app_deinit_sensor(&obj->sensorObj);
        APP_PRINTF("Sensor deinit done!\n");
    
        app_deinit_capture(&obj->captureObj, APP_BUFFER_Q_DEPTH);
        APP_PRINTF("Capture deinit done!\n");
    
        if(1 == obj->enable_viss)
        {
            app_deinit_viss(&obj->vissObj);
            APP_PRINTF("VISS deinit done!\n");
        }
    
        if(1 == obj->enable_aewb)
        {
            app_deinit_aewb(&obj->aewbObj);
            APP_PRINTF("AEWB deinit done!\n");
        }
    
        if(obj->sensorObj.enable_ldc == 1)
        {
            app_deinit_ldc(&obj->ldcObj);
            APP_PRINTF("LDC deinit done!\n");
        }
    
        if(obj->enable_mosaic == 1)
        {
            app_deinit_img_mosaic(&obj->imgMosaicObj, APP_BUFFER_Q_DEPTH);
            APP_PRINTF("Img Mosaic deinit done!\n");
        }
    
        if(obj->enable_display == 1)
        {
            app_deinit_display(&obj->displayObj);
            APP_PRINTF("Display deinit done!\n");
        }
    
        tivxHwaUnLoadKernels(obj->context);
        tivxImagingUnLoadKernels(obj->context);
        tivxFileIOUnLoadKernels(obj->context);
        APP_PRINTF("Kernels unload done!\n");
    
        vxReleaseContext(&obj->context);
        APP_PRINTF("Release context done!\n");
    }
    
    static void app_delete_graph(AppObj *obj)
    {
        app_delete_capture(&obj->captureObj);
        APP_PRINTF("Capture delete done!\n");
    
        app_delete_viss(&obj->vissObj);
        APP_PRINTF("VISS delete done!\n");
    
        app_delete_aewb(&obj->aewbObj);
        APP_PRINTF("AEWB delete done!\n");
    
        if(obj->sensorObj.enable_ldc == 1)
        {
            app_delete_ldc(&obj->ldcObj);
            APP_PRINTF("LDC delete done!\n");
        }
    
        app_delete_img_mosaic(&obj->imgMosaicObj);
        APP_PRINTF("Img Mosaic delete done!\n");
    
        if(1 == obj->enable_display)
        {
            app_delete_display(&obj->displayObj);
            APP_PRINTF("Display delete done!\n");
        }
    
        vxReleaseGraph(&obj->graph);
        APP_PRINTF("Graph delete done!\n");
    }
    
    static vx_status app_create_graph(AppObj *obj)
    {
        vx_status status = VX_SUCCESS;
        vx_graph_parameter_queue_params_t graph_parameters_queue_params_list[2];
        vx_int32 graph_parameter_index;
    
        obj->graph = vxCreateGraph(obj->context);
        status = vxGetStatus((vx_reference)obj->graph);
        if (status == VX_SUCCESS)
        {
            status = vxSetReferenceName((vx_reference)obj->graph, "app_multi_cam_graph");
            APP_PRINTF("Graph create done!\n");
        }
    
        if(status == VX_SUCCESS)
        {
            status = app_create_graph_capture(obj->graph, &obj->captureObj);
            APP_PRINTF("Capture graph done!\n");
        }
    
        if(1 == obj->enable_viss)
        {
            if(status == VX_SUCCESS)
            {
                status = app_create_graph_viss(obj->graph, &obj->vissObj, obj->captureObj.raw_image_arr[0], TIVX_TARGET_VPAC_VISS1);
                APP_PRINTF("VISS graph done!\n");
            }
        }
    
        if(1 == obj->enable_aewb)
        {
            if(status == VX_SUCCESS)
            {
                status = app_create_graph_aewb(obj->graph, &obj->aewbObj, obj->vissObj.h3a_stats_arr);
    
                APP_PRINTF("AEWB graph done!\n");
            }
        }
    
        vx_int32 idx = 0;
        if(obj->sensorObj.enable_ldc == 1)
        {
            vx_object_array ldc_in_arr;
            if(1 == obj->enable_viss)
            {
                ldc_in_arr = obj->vissObj.output_arr;
            }
            else
            {
                ldc_in_arr = obj->captureObj.raw_image_arr[0];
            }
            if (status == VX_SUCCESS)
            {
                status = app_create_graph_ldc(obj->graph, &obj->ldcObj, ldc_in_arr, TIVX_TARGET_VPAC_LDC1);
                APP_PRINTF("LDC graph done!\n");
            }
            obj->imgMosaicObj.input_arr[idx++] = obj->ldcObj.output_arr;
        }
        else
        {
            vx_object_array mosaic_in_arr;
            if(1 == obj->enable_viss)
            {
                mosaic_in_arr = obj->vissObj.output_arr;
            }
            else
            {
                mosaic_in_arr = obj->captureObj.raw_image_arr[0];
            }
    
            obj->imgMosaicObj.input_arr[idx++] = mosaic_in_arr;
        }
    
        vx_image display_in_image;
        if(obj->enable_mosaic == 1)
        {
            obj->imgMosaicObj.num_inputs = idx;
    
            if(status == VX_SUCCESS)
            {
                status = app_create_graph_img_mosaic(obj->graph, &obj->imgMosaicObj,NULL);
                APP_PRINTF("Img Mosaic graph done!\n");
            }
            display_in_image = obj->imgMosaicObj.output_image[0];
        }
        else
        {
            display_in_image = (vx_image)vxGetObjectArrayItem(obj->captureObj.raw_image_arr[0], 0);
        }
    
        if((obj->enable_display == 1) && (status == VX_SUCCESS))
        {
            status = app_create_graph_display(obj->graph, &obj->displayObj, display_in_image);
            APP_PRINTF("Display graph done!\n");
        }
    
        if(status == VX_SUCCESS)
        {
            graph_parameter_index = 0;
            add_graph_parameter_by_node_index(obj->graph, obj->captureObj.node, 1);
            obj->captureObj.graph_parameter_index = graph_parameter_index;
            graph_parameters_queue_params_list[graph_parameter_index].graph_parameter_index = graph_parameter_index;
            graph_parameters_queue_params_list[graph_parameter_index].refs_list_size = APP_BUFFER_Q_DEPTH;
            graph_parameters_queue_params_list[graph_parameter_index].refs_list = (vx_reference*)&obj->captureObj.raw_image_arr[0];
            graph_parameter_index++;
    
            if((obj->en_out_img_write == 1) || (obj->test_mode == 1))
            {
                add_graph_parameter_by_node_index(obj->graph, obj->imgMosaicObj.node, 1);
                obj->imgMosaicObj.graph_parameter_index = graph_parameter_index;
                graph_parameters_queue_params_list[graph_parameter_index].graph_parameter_index = graph_parameter_index;
                graph_parameters_queue_params_list[graph_parameter_index].refs_list_size = APP_BUFFER_Q_DEPTH;
                graph_parameters_queue_params_list[graph_parameter_index].refs_list = (vx_reference*)&obj->imgMosaicObj.output_image[0];
                graph_parameter_index++;
            }
    
            status = vxSetGraphScheduleConfig(obj->graph,
                    VX_GRAPH_SCHEDULE_MODE_QUEUE_AUTO,
                    graph_parameter_index,
                    graph_parameters_queue_params_list);
    
            if (status == VX_SUCCESS)
            {
                status = tivxSetGraphPipelineDepth(obj->graph, APP_PIPELINE_DEPTH);
            }
            if((obj->enable_viss == 1) && (status == VX_SUCCESS))
            {
                status = tivxSetNodeParameterNumBufByIndex(obj->vissObj.node, 6, APP_BUFFER_Q_DEPTH);
    
                if (status == VX_SUCCESS)
                {
                    status = tivxSetNodeParameterNumBufByIndex(obj->vissObj.node, 9, APP_BUFFER_Q_DEPTH);
                }
                if (status == VX_SUCCESS)
                {
                    status = tivxSetNodeParameterNumBufByIndex(obj->aewbObj.node, 4, APP_BUFFER_Q_DEPTH);
                }
            }
            if((obj->sensorObj.enable_ldc == 1) && (status == VX_SUCCESS))
            {
                status = tivxSetNodeParameterNumBufByIndex(obj->ldcObj.node, 7, APP_BUFFER_Q_DEPTH);
            }
            if((obj->enable_mosaic == 1) && (status == VX_SUCCESS))
            {
                if(!((obj->en_out_img_write == 1) || (obj->test_mode == 1)))
                {
                    status = tivxSetNodeParameterNumBufByIndex(obj->imgMosaicObj.node, 1, APP_BUFFER_Q_DEPTH);
                    APP_PRINTF("Pipeline params setup done!\n");
                }
            }
        }
    
        return status;
    }
    
    static vx_status app_verify_graph(AppObj *obj)
    {
        vx_status status = VX_SUCCESS;
    
        status = vxVerifyGraph(obj->graph);
    
        if(status == VX_SUCCESS)
        {
            APP_PRINTF("Graph verify done!\n");
        }
    
        #if 0
        if(VX_SUCCESS == status)
        {
          status = tivxExportGraphToDot(obj->graph,".", "vx_app_multi_cam");
        }
        #endif
    
        if (((obj->captureObj.enable_error_detection) || (obj->test_mode)) && (status == VX_SUCCESS))
        {
            status = app_send_error_frame(&obj->captureObj);
        }
    
        /* wait a while for prints to flush */
        tivxTaskWaitMsecs(100);
    
        return status;
    }
    
    vx_int32 write_output_image_fp(FILE * fp, vx_image out_image)
    {
        vx_uint32 width, height;
        vx_df_image df;
        vx_imagepatch_addressing_t image_addr;
        vx_rectangle_t rect;
        vx_map_id map_id1, map_id2;
        void *data_ptr1, *data_ptr2;
        vx_uint32 num_bytes_per_4pixels;
        vx_uint32 num_luma_bytes_written_to_file=0;
        vx_uint32 num_chroma_bytes_written_to_file=0;
        vx_uint32 num_bytes_written_to_file=0;
        vx_uint32 imgaddr_width, imgaddr_height, imgaddr_stride;
        int i;
    
        vxQueryImage(out_image, VX_IMAGE_WIDTH, &width, sizeof(vx_uint32));
        vxQueryImage(out_image, VX_IMAGE_HEIGHT, &height, sizeof(vx_uint32));
        vxQueryImage(out_image, VX_IMAGE_FORMAT, &df, sizeof(vx_df_image));
    
        // printf("%d\n",VX_DF_IMAGE_VIRT);
        // printf("%d\n",VX_DF_IMAGE_RGB );
        // printf("%d\n",VX_DF_IMAGE_RGBX);
        // printf("%d\n",VX_DF_IMAGE_NV12);
        // printf("%d\n",VX_DF_IMAGE_NV21);
        // printf("%d\n",VX_DF_IMAGE_UYVY);
        // printf("%d\n",VX_DF_IMAGE_YUYV);
        // printf("%d\n",VX_DF_IMAGE_IYUV);
        // printf("%d\n",VX_DF_IMAGE_YUV4);
        // printf("=========%d======\n",df);
    
    
    
        if(VX_DF_IMAGE_NV12 == df)
        {
            num_bytes_per_4pixels = 4;
        }
        else if(TIVX_DF_IMAGE_NV12_P12 == df)
        {
            num_bytes_per_4pixels = 6;
        }
        else
        {
            num_bytes_per_4pixels = 8;
        }
    
        rect.start_x = 0;
        rect.start_y = 0;
        rect.end_x = width;
        rect.end_y = height;
    
        vxMapImagePatch(out_image,
            &rect,
            0,
            &map_id1,
            &image_addr,
            &data_ptr1,
            VX_WRITE_ONLY,
            VX_MEMORY_TYPE_HOST,
            VX_NOGAP_X
            );
    
        if(!data_ptr1)
        {
            printf("data_ptr1 is NULL \n");
            return -1;
        }
    
        imgaddr_width  = image_addr.dim_x;
        imgaddr_height = image_addr.dim_y;
        imgaddr_stride = image_addr.stride_y;
        printf("imgaddr_width = %d \n", imgaddr_width);
        printf("imgaddr_height = %d \n", imgaddr_height);
        printf("imgaddr_stride = %d \n", imgaddr_stride);
        printf("width = %d \n", width);
        printf("height = %d \n", height);
    
        num_luma_bytes_written_to_file = 0;
    
        for(i=0;i<height;i++)
        {
            num_luma_bytes_written_to_file  += fwrite(data_ptr1, 1, width*num_bytes_per_4pixels/4, fp);
            data_ptr1 += imgaddr_stride;
        }
        vxUnmapImagePatch(out_image, map_id1);
    
        fflush(fp);
    
    
        if(VX_DF_IMAGE_NV12 == df || TIVX_DF_IMAGE_NV12_P12 == df)
        {
            vxMapImagePatch(out_image,
                &rect,
                1,
                &map_id2,
                &image_addr,
                &data_ptr2,
                VX_WRITE_ONLY,
                VX_MEMORY_TYPE_HOST,
                VX_NOGAP_X
                );
    
            if(!data_ptr2)
            {
                printf("data_ptr2 is NULL \n");
                return -1;
            }
    
            imgaddr_width  = image_addr.dim_x;
            imgaddr_height = image_addr.dim_y;
            imgaddr_stride = image_addr.stride_y;
    
            num_chroma_bytes_written_to_file = 0;
            for(i=0;i<imgaddr_height/2;i++)
            {
                num_chroma_bytes_written_to_file  += fwrite(data_ptr2, 1, imgaddr_width*num_bytes_per_4pixels/4, fp);
                data_ptr2 += imgaddr_stride;
            }
    
            fflush(fp);
    
            vxUnmapImagePatch(out_image, map_id2);
        }
    
        num_bytes_written_to_file = num_luma_bytes_written_to_file + num_chroma_bytes_written_to_file;
        printf("Written %d bytes \n", num_bytes_written_to_file);
    
        return num_bytes_written_to_file;
    }
    
    vx_int32 write_output_image_yuv422_8bit(char * file_name, vx_image out_yuv)
    {
        FILE * fp = fopen(file_name, "wb");
        if(!fp)
        {
            APP_PRINTF("Unable to open file %s\n", file_name);
            return -1;
        }
        vx_uint32 len1 = write_output_image_fp(fp, out_yuv);
        fclose(fp);
        APP_PRINTF("%d bytes written to %s\n", len1, file_name);
        return len1;
    }
    
    vx_int32 write_output_image_nv12_8bit(char * file_name, vx_image out_nv12)
    {
        FILE * fp = fopen(file_name, "ab+");
        if(!fp)
        {
            APP_PRINTF("Unable to open file %s\n", file_name);
            return -1;
        }
        vx_uint32 len1 = write_output_image_fp(fp, out_nv12);
        fclose(fp);
        APP_PRINTF("%d bytes written to %s\n", len1, file_name);
        return len1;
    }
    
    static vx_status app_run_graph_for_one_frame_pipeline(AppObj *obj, vx_int32 frame_id)
    {
        vx_status status = VX_SUCCESS;
        int32_t index;
    
        printf("app_run_graph_for_one_pipeline: frame %d beginning\n", frame_id);
        appPerfPointBegin(&obj->total_perf);
    
        ImgMosaicObj *imgMosaicObj = &obj->imgMosaicObj;
        CaptureObj *captureObj = &obj->captureObj;
    
        /* checksum_actual is the checksum determined by the realtime test
            checksum_expected is the checksum that is expected to be the pipeline output */
        uint32_t checksum_actual = 0;
    
        /* This is the number of frames required for the pipeline AWB and AE algorithms to stabilize
            (note that 15 is only required for the 6-8 camera use cases - others converge quicker) */
        uint8_t stability_frame = 15;
    
        if(obj->pipeline < 0)
        {
            /* Enqueue outputs */
            if ((obj->en_out_img_write == 1) || (obj->test_mode == 1))
            {
                status = vxGraphParameterEnqueueReadyRef(obj->graph, imgMosaicObj->graph_parameter_index, (vx_reference*)&imgMosaicObj->output_image[obj->enqueueCnt], 1);
            }
    
            /* Enqueue inputs; during pipe-up the graph does not execute */
            if (status == VX_SUCCESS)
            {
                status = vxGraphParameterEnqueueReadyRef(obj->graph, captureObj->graph_parameter_index, (vx_reference*)&obj->captureObj.raw_image_arr[obj->enqueueCnt], 1);
            }
            obj->enqueueCnt++;
            obj->enqueueCnt   = (obj->enqueueCnt  >= APP_BUFFER_Q_DEPTH)? 0 : obj->enqueueCnt;
            obj->pipeline++;
        }
    
        if((obj->pipeline == 0) && (status == VX_SUCCESS))
        {
            if((obj->en_out_img_write == 1) || (obj->test_mode == 1))
            {
                status = vxGraphParameterEnqueueReadyRef(obj->graph, imgMosaicObj->graph_parameter_index, (vx_reference*)&imgMosaicObj->output_image[obj->enqueueCnt], 1);
            }
    
            /* Execute 1st frame */
            if(status == VX_SUCCESS)
            {
                status = vxGraphParameterEnqueueReadyRef(obj->graph, captureObj->graph_parameter_index, (vx_reference*)&obj->captureObj.raw_image_arr[obj->enqueueCnt], 1);
            }
            obj->enqueueCnt++;
            obj->enqueueCnt   = (obj->enqueueCnt  >= APP_BUFFER_Q_DEPTH)? 0 : obj->enqueueCnt;
            obj->pipeline++;
        }
    
        if((obj->pipeline > 0) && (status == VX_SUCCESS))
        {
            vx_object_array capture_input_arr;
            vx_image mosaic_output_image;
            uint32_t num_refs;
            vx_image capture_image0;
            uint64_t timestamp = 0;
    
            /* Dequeue input */
            status = vxGraphParameterDequeueDoneRef(obj->graph, captureObj->graph_parameter_index, (vx_reference*)&capture_input_arr, 1, &num_refs);
            if((obj->en_out_img_write == 1) || (obj->test_mode == 1))
            {
                //vx_char output_file_name[APP_MAX_FILE_PATH];
    
                /* Dequeue output */
                if (status == VX_SUCCESS)
                {
                    status = vxGraphParameterDequeueDoneRef(obj->graph, imgMosaicObj->graph_parameter_index, (vx_reference*)&mosaic_output_image, 1, &num_refs);
                }
                if ((status == VX_SUCCESS) && (obj->test_mode == 1) && (frame_id > TEST_BUFFER))
                {
                    /* calculate the checksum of the mosaic output */
    
                    if ((app_test_check_image(mosaic_output_image, checksums_expected[obj->sensorObj.sensor_index][obj->sensorObj.num_cameras_enabled-1],
                                            &checksum_actual) != vx_true_e) && (frame_id > stability_frame))
                    {
                        test_result = vx_false_e;
                        /* in case test fails and needs to change */
                        populate_gatherer(obj->sensorObj.sensor_index, obj->sensorObj.num_cameras_enabled-1, checksum_actual);
                    }
                }
    
                if (obj->en_out_img_write == 1) {
                    appPerfPointBegin(&obj->fileio_perf);
    
                    index = app_img_mosaic_getimageindex(&obj->imgMosaicObj,mosaic_output_image,APP_BUFFER_Q_DEPTH);
                    capture_image0 = (vx_image) vxGetObjectArrayItem(capture_input_arr, 0);
                    vxQueryReference((vx_reference)capture_image0, TIVX_REFERENCE_TIMESTAMP, &timestamp, sizeof(timestamp));
                    // SendFrameData_Server(index, timestamp);
    
                    // write_output_image_nv12_8bit("./out.yuv", (vx_image) vxGetObjectArrayItem(capture_input_arr, 0));
                    // write_output_image_nv12_8bit("./out.yuv", (vx_image) vxGetObjectArrayItem(capture_input_arr, 1));
                    // write_output_image_nv12_8bit("./out.yuv", (vx_image) vxGetObjectArrayItem(capture_input_arr, 2));
                    // write_output_image_nv12_8bit("./out.yuv", (vx_image) vxGetObjectArrayItem(capture_input_arr, 3));
                    //snprintf(output_file_name, APP_MAX_FILE_PATH, "%s/mosaic_output_%010d_%dx%d.yuv", obj->output_file_path, (frame_id - APP_BUFFER_Q_DEPTH), imgMosaicObj->out_width, imgMosaicObj->out_height);
                    if (status == VX_SUCCESS)
                    {
                        //status = writeMosaicOutput(output_file_name, mosaic_output_image);
                    }
                    appPerfPointEnd(&obj->fileio_perf);
                }
                /* Enqueue output */
                if (status == VX_SUCCESS)
                {
                    status = vxGraphParameterEnqueueReadyRef(obj->graph, imgMosaicObj->graph_parameter_index, (vx_reference*)&mosaic_output_image, 1);
                }
            }
    
            /* Enqueue input - start execution */
            if (status == VX_SUCCESS)
            {
                status = vxGraphParameterEnqueueReadyRef(obj->graph, captureObj->graph_parameter_index, (vx_reference*)&capture_input_arr, 1);
            }
            obj->enqueueCnt++;
            obj->dequeueCnt++;
    
            obj->enqueueCnt = (obj->enqueueCnt >= APP_BUFFER_Q_DEPTH)? 0 : obj->enqueueCnt;
            obj->dequeueCnt = (obj->dequeueCnt >= APP_BUFFER_Q_DEPTH)? 0 : obj->dequeueCnt;
        }
    
        appPerfPointEnd(&obj->total_perf);
        return status;
    }
    
    static vx_status app_run_graph(AppObj *obj)
    {
        vx_status status = VX_SUCCESS;
    
        SensorObj *sensorObj = &obj->sensorObj;
        vx_int32 frame_id = 0;
    
        //camera_reply_register();
        //camera_fault_register(obj);
        //camera_health_register();
        app_pipeline_params_defaults(obj);
        APP_PRINTF("app_pipeline_params_defaults returned\n");
    
        if(NULL == sensorObj->sensor_name)
        {
            printf("sensor name is NULL \n");
            return VX_FAILURE;
        }
    
        // if test_mode is enabled, don't fail the program if the sensor init fails
        if(obj->test_mode)
        {
            appStartImageSensor(sensorObj->sensor_name, ((1 << sensorObj->num_cameras_enabled) - 1));
        }
        else
        {
            status = appStartImageSensor(sensorObj->sensor_name, ((1 << sensorObj->num_cameras_enabled) - 1));
            APP_PRINTF("appStartImageSensor returned with status: %d\n", status);
        }
    
        if(0 == obj->enable_viss)
        {
            obj->vissObj.en_out_viss_write = 0;
        }
    
        if (obj->test_mode == 1) {
            // The buffer allows AWB/AE algos to converge before checksums are calculated
            obj->num_frames_to_run = TEST_BUFFER + 30;
        }
    
        //for(frame_id = 0; frame_id < obj->num_frames_to_run; frame_id++)
        while (run_flag == 1)
        {
            frame_id++;
    
            if(obj->write_file == 1)
            {
                if((obj->captureObj.en_out_capture_write == 1) && (status == VX_SUCCESS))
                {
                    status = app_send_cmd_capture_write_node(&obj->captureObj, frame_id, obj->num_frames_to_write, obj->num_frames_to_skip);
                }
                if((obj->vissObj.en_out_viss_write == 1) && (status == VX_SUCCESS))
                {
                    status = app_send_cmd_viss_write_node(&obj->vissObj, frame_id, obj->num_frames_to_write, obj->num_frames_to_skip);
                }
                if((obj->ldcObj.en_out_ldc_write == 1) && (status == VX_SUCCESS))
                {
                    status = app_send_cmd_ldc_write_node(&obj->ldcObj, frame_id, obj->num_frames_to_write, obj->num_frames_to_skip);
                }
                obj->write_file = 0;
            }
    
            if (status == VX_SUCCESS)
            {
                status = app_run_graph_for_one_frame_pipeline(obj, frame_id);
                //camera_health_report();
            }
    
            /* user asked to stop processing */
            if(obj->stop_task)
              break;
        }
    
        if (status == VX_SUCCESS)
        {
            status = vxWaitGraph(obj->graph);
        }
        obj->stop_task = 1;
    
        if (status == VX_SUCCESS)
        {
            status = appStopImageSensor(obj->sensorObj.sensor_name, ((1 << sensorObj->num_cameras_enabled) - 1));
        }
    
        return status;
    }
    
    static void set_display_defaults(DisplayObj *displayObj)
    {
        displayObj->display_option = 1;
    }
    
    static void app_pipeline_params_defaults(AppObj *obj)
    {
        obj->pipeline       = -APP_BUFFER_Q_DEPTH + 1;
        obj->enqueueCnt     = 0;
        obj->dequeueCnt     = 0;
    }
    
    static void set_sensor_defaults(SensorObj *sensorObj)
    {
        strcpy(sensorObj->sensor_name, SENSOR_SONY_IMX390_UB953_D3);
    
        sensorObj->num_sensors_found = 0;
        sensorObj->sensor_features_enabled = 0;
        sensorObj->sensor_features_supported = 0;
        sensorObj->sensor_dcc_enabled = 0;
        sensorObj->sensor_wdr_enabled = 0;
        sensorObj->sensor_exp_control_enabled = 0;
        sensorObj->sensor_gain_control_enabled = 0;
    }
    
    static void app_default_param_set(AppObj *obj)
    {
        set_sensor_defaults(&obj->sensorObj);
    
        if(obj->enable_display == 1)
        {
            set_display_defaults(&obj->displayObj);
        }
    
        app_pipeline_params_defaults(obj);
    
        obj->is_interactive = 1;
        obj->test_mode = 0;
        obj->write_file = 0;
    
        obj->sensorObj.enable_ldc = 0;
        obj->sensorObj.num_cameras_enabled = 1;
        obj->sensorObj.usecase_option = APP_SENSOR_FEATURE_CFG_UC0;
    }
    
    static vx_int32 calc_grid_size(vx_uint32 ch)
    {
        if(0==ch)
        {
            return -1;
        }
        else if(1==ch)
        {
            return 1;
        }
        else if(4>=ch)
        {
            return 2;
        }
        else if(9>=ch)
        {
            return 3;
        }
        else if(16>=ch)
        {
            return 4;
        }else
        {
            return -1;
        }
    }
    
    static void set_img_mosaic_params(ImgMosaicObj *imgMosaicObj, vx_uint32 in_width, vx_uint32 in_height, vx_int32 numCh)
    {
        vx_int32 idx, ch;
        vx_int32 grid_size = calc_grid_size(numCh);
    
        imgMosaicObj->out_width    = MOSAIC_OUT_WIDTH;
        imgMosaicObj->out_height   = MOSAIC_OUT_HEIGHT;
        imgMosaicObj->num_inputs   = 1;
        printf("imgMosaicObj->out_width = %d %d\n",imgMosaicObj->out_width,numCh);
    
        idx = 0;
    
        tivxImgMosaicParamsSetDefaults(&imgMosaicObj->params);
    
        for(ch = 0; ch < numCh; ch++)
        {
            vx_int32 winX = ch%grid_size;
            vx_int32 winY = ch/grid_size;
    
            imgMosaicObj->params.windows[idx].startX  = (winX * (in_width/grid_size));
            imgMosaicObj->params.windows[idx].startY  = (winY * (in_height/grid_size));
            imgMosaicObj->params.windows[idx].width   = in_width/grid_size;
            imgMosaicObj->params.windows[idx].height  = in_height/grid_size;
            imgMosaicObj->params.windows[idx].input_select   = 0;
            imgMosaicObj->params.windows[idx].channel_select = ch;
            idx++;
        }
    
        imgMosaicObj->params.num_windows  = idx;
    
        /* Number of times to clear the output buffer before it gets reused */
        imgMosaicObj->params.clear_count  = APP_BUFFER_Q_DEPTH;
        //imgMosaicObj->params.enable_overlay = 0;
    }
    
    static void app_update_param_set(AppObj *obj)
    {
    
        vx_uint16 resized_width, resized_height;
        appIssGetResizeParams(obj->sensorObj.image_width, obj->sensorObj.image_height, MOSAIC_OUT_WIDTH, MOSAIC_OUT_HEIGHT, &resized_width, &resized_height);
    
        set_img_mosaic_params(&obj->imgMosaicObj, resized_width, resized_height, obj->sensorObj.num_cameras_enabled);
    
    }
    
    /*
     * Utility API used to add a graph parameter from a node, node parameter index
     */
    static void add_graph_parameter_by_node_index(vx_graph graph, vx_node node, vx_uint32 node_parameter_index)
    {
        vx_parameter parameter = vxGetParameterByIndex(node, node_parameter_index);
    
        vxAddParameterToGraph(graph, parameter);
        vxReleaseParameter(&parameter);
    }
    

  • Please see the reply below from the engineer on the English E2E forum.

    The mosaic node uses the Multi-Scaler Component (MSC) in the VPAC HWA, which does not support UYVY as an input format.

    The customer could use the DSS M2M node to convert from UYVY to NV12 format.

    TDA4VH-Q1: mosaic node: Failed to Submit Request - Processors forum - Processors - TI E2E support forums
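
    A minimal sketch of that suggestion, assuming the SDK provides a tivxDisplayM2MNode kernel that takes a configuration object, a UYVY input image and an NV12 output image, plus a tivx_display_m2m_params_t config struct and a TIVX_TARGET_DISPLAY_M2M1 target; these names and the struct fields are assumptions to verify against TI/tivx_nodes.h and the DSS M2M demo in your SDK 8.6 installation:

    /* Hedged sketch: convert the UYVY capture output to NV12 with a DSS M2M
     * node so the mosaic (MSC) node receives a supported input format.
     * tivxDisplayM2MNode, tivx_display_m2m_params_t and TIVX_TARGET_DISPLAY_M2M1
     * are assumed names - check them against the SDK headers. */
    static vx_status add_uyvy_to_nv12_m2m(vx_context context, vx_graph graph,
                                          vx_image capture_uyvy_image,
                                          vx_image *nv12_for_mosaic)
    {
        vx_status status;
        tivx_display_m2m_params_t m2m_params;
        vx_user_data_object m2m_config;
        vx_image nv12_image;
        vx_node m2m_node;

        memset(&m2m_params, 0, sizeof(m2m_params));
        /* Fill in the DSS instance / write-back pipe fields according to the
         * tivx_display_m2m_params_t definition in your SDK. */

        m2m_config = vxCreateUserDataObject(context, "tivx_display_m2m_params_t",
                                            sizeof(tivx_display_m2m_params_t), &m2m_params);

        /* NV12 buffer that becomes the mosaic node's input */
        nv12_image = vxCreateImage(context, 1920, 1080, VX_DF_IMAGE_NV12);

        /* DSS M2M node: reads UYVY, writes NV12 through the display write-back path */
        m2m_node = tivxDisplayM2MNode(graph, m2m_config, capture_uyvy_image, nv12_image);
        status = vxGetStatus((vx_reference)m2m_node);
        if (status == VX_SUCCESS)
        {
            status = vxSetNodeTarget(m2m_node, VX_TARGET_STRING, TIVX_TARGET_DISPLAY_M2M1);
        }

        *nv12_for_mosaic = nv12_image;
        return status;
    }

    Since the posted graph feeds the mosaic with a vx_object_array of four capture channels, such a conversion would have to be done per channel (or with a node replicated over the object array) before the NV12 images reach the mosaic input.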

  • Hello, I see the following in the existing code; the mosaic node can apparently receive the capture output directly, and the capture output is UYVY data:

    mosaic_in_arr = obj->captureObj.raw_image_arr[0];

  • Hello, I see the following in the existing code; the mosaic node can apparently receive the capture output directly, and the capture output is UYVY data:

    Where did you see this code? Is it from a TI example?

  • It is from a TI example, for instance:

    vision_apps/apps/basic_demos/app_multi_cam/main.c

    app_create_graph

  • Please see the engineer's reply below.

    In the TI example, obj->enable_mosaic is set to 0 for the YUV422 format, as shown below.

    Hence, in that case the mosaic node is not created at all.

    The graph goes directly from the capture node to the display node.
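
    For reference, the corresponding logic in the stock vision_apps/apps/basic_demos/app_multi_cam/main.c looks roughly like this (reconstructed from the code posted above and the engineer's reply; the code posted above changes enable_mosaic to 1 for the YUV path, which is what routes UYVY frames into the MSC-based mosaic node):

    /* Stock example: for a YUV (UYVY) sensor, VISS, AEWB and mosaic are disabled */
    if(1 == obj->sensorObj.sensor_out_format)
    {
        printf("YUV Input selected. VISS, AEWB and Mosaic nodes will be bypassed. \n");
        obj->enable_viss   = 0;
        obj->enable_aewb   = 0;
        obj->enable_mosaic = 0;   /* the code posted above sets this to 1 instead */
    }

    /* Later, in app_create_graph(): with enable_mosaic == 0 the mosaic node is
     * never created and the display is fed directly from the capture output */
    if(obj->enable_mosaic == 1)
    {
        display_in_image = obj->imgMosaicObj.output_image[0];
    }
    else
    {
        display_in_image = (vx_image)vxGetObjectArrayItem(obj->captureObj.raw_image_arr[0], 0);
    }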