{ "cells": [ { "cell_type": "markdown", "id": "5b1cf409", "metadata": {}, "source": [ "# Triton on SageMaker - NLP Bert\n", "\n", "[Amazon SageMaker](https://aws.amazon.com/sagemaker/) is a fully managed service for data science and machine learning workflows. It helps data scientists and developers to prepare, build, train, and deploy high-quality ML models quickly by bringing together a broad set of capabilities purpose-built for ML.\n", "\n", "Now, [NVIDIA Triton Inference Server](https://github.com/triton-inference-server/server/) can be used to serve models for inference in Amazon SageMaker. Thanks to the new NVIDIA Triton container image, you can easily serve ML models and benefit from the performance optimizations, dynamic batching, and multi-framework support provided by NVIDIA Triton. Triton helps maximize the utilization of GPU and CPU, further lowering the cost of inference.\n", "\n", "This notebook was tested with the `conda_python3` kernel on an Amazon SageMaker notebook instance of type `g4dn`." ] }, { "cell_type": "markdown", "id": "13be192c", "metadata": {}, "source": [ "## Contents\n", "1. [Introduction to NVIDIA Triton Server](#Introduction-to-NVIDIA-Triton-Server)\n", "1. [Set up the environment](#Set-up-the-environment)\n", "1. [Add utility methods for preparing request payload](#Add-utility-methods-for-preparing-request-payload)\n", "1. [Basic: PyTorch NLP-Bert](#PyTorch-NLP-Bert)\n", " 1. [PyTorch: Packaging model files and uploading to s3](#PyTorch:-Packaging-model-files-and-uploading-to-s3)\n", " 1. [PyTorch: Create SageMaker Endpoint](#PyTorch:-Create-SageMaker-Endpoint)\n", " 1. [PyTorch: Run inference](#PyTorch:-Run-inference)\n", " 1. [PyTorch: Terminate endpoint and clean up artifacts](#PyTorch:-Terminate-endpoint-and-clean-up-artifacts)\n", "1. [Advanced: TensorRT NLP-Bert](#TensorRT-NLP-Bert)\n", " 1. [TensorRT: Packaging model files and uploading to s3](#TensorRT:-Packaging-model-files-and-uploading-to-s3)\n", " 1. [TensorRT: Create SageMaker Endpoint](#TensorRT:-Create-SageMaker-Endpoint)\n", " 1. [TensorRT: Run inference](#TensorRT:-Run-inference)\n", " 1. [TensorRT: Terminate endpoint and clean up artifacts](#TensorRT:-Terminate-endpoint-and-clean-up-artifacts)" ] }, { "cell_type": "markdown", "id": "a4fef1b8", "metadata": {}, "source": [ "## Introduction to NVIDIA Triton Server\n", "\n", "[NVIDIA Triton Inference Server](https://github.com/triton-inference-server/server/) was developed specifically to enable scalable, cost-effective, and easy deployment of models in production. NVIDIA Triton Inference Server is open-source inference serving software that simplifies the inference serving process and provides high inference performance.\n", "\n", "Some key features of Triton are:\n", "* **Support for Multiple frameworks**: Triton can be used to deploy models from all major frameworks. Triton supports TensorFlow GraphDef, TensorFlow SavedModel, ONNX, PyTorch TorchScript, TensorRT, RAPIDS FIL for tree based models, and OpenVINO model formats. \n", "* **Model pipelines**: Triton model ensemble represents a pipeline of one or more models or pre/post processing logic and the connection of input and output tensors between them. 
A single inference request to an ensemble will trigger the execution of the entire pipeline.\n", "* **Concurrent model execution**: Multiple models (or multiple instances of the same model) can run simultaneously on the same GPU or on multiple GPUs for different model management needs.\n", "* **Dynamic batching**: For models that support batching, Triton has multiple built-in scheduling and batching algorithms that combine individual inference requests together to improve inference throughput. These scheduling and batching decisions are transparent to the client requesting inference.\n", "* **Diverse CPUs and GPUs**: The models can be executed on CPUs or GPUs for maximum flexibility and to support heterogeneous computing requirements.\n", "\n", "**Note**: This initial release of NVIDIA Triton on SageMaker will only support a single model. Future releases will have multi-model support. A minimal `config.pbtxt` configuration file is **required** in the model artifacts. This release doesn't support inferring the model config automatically." ] }, { "cell_type": "markdown", "id": "37b4264f", "metadata": {}, "source": [ "## Set up the environment\n", "\n", "Installs the dependencies required to package the model and run inferences using Triton server.\n", "\n", "Also define the IAM role that will give SageMaker access to the model artifacts and the NVIDIA Triton ECR image." ] }, { "cell_type": "code", "execution_count": 65, "id": "bbae2cd8", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com\n", "Requirement already satisfied: nvidia-pyindex in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (1.0.9)\n", "Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com\n", "Requirement already satisfied: tritonclient[http] in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (2.17.0)\n", "Requirement already satisfied: python-rapidjson>=0.9.1 in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from tritonclient[http]) (1.5)\n", "Requirement already satisfied: numpy>=1.19.1 in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from tritonclient[http]) (1.19.5)\n", "Requirement already satisfied: geventhttpclient>=1.4.4 in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from tritonclient[http]) (1.5.3)\n", "Requirement already satisfied: gevent>=0.13 in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from geventhttpclient>=1.4.4->tritonclient[http]) (21.1.2)\n", "Requirement already satisfied: brotli in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from geventhttpclient>=1.4.4->tritonclient[http]) (1.0.9)\n", "Requirement already satisfied: six in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from geventhttpclient>=1.4.4->tritonclient[http]) (1.15.0)\n", "Requirement already satisfied: certifi in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from geventhttpclient>=1.4.4->tritonclient[http]) (2021.5.30)\n", "Requirement already satisfied: zope.event in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from gevent>=0.13->geventhttpclient>=1.4.4->tritonclient[http]) (4.5.0)\n", "Requirement already satisfied: greenlet<2.0,>=0.4.17 in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from gevent>=0.13->geventhttpclient>=1.4.4->tritonclient[http]) (0.4.17)\n", "Requirement already satisfied: 
setuptools in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from gevent>=0.13->geventhttpclient>=1.4.4->tritonclient[http]) (49.6.0.post20210108)\n", "Requirement already satisfied: zope.interface in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from gevent>=0.13->geventhttpclient>=1.4.4->tritonclient[http]) (5.2.0)\n" ] } ], "source": [ "!pip install -qU pip awscli boto3 sagemaker transformers==4.9.1\n", "!pip install nvidia-pyindex\n", "!pip install tritonclient[http]" ] }, { "cell_type": "code", "execution_count": 66, "id": "01f893aa", "metadata": {}, "outputs": [], "source": [ "import boto3, json, sagemaker, time\n", "from sagemaker import get_execution_role\n", "\n", "sess = boto3.Session()\n", "sm = sess.client(\"sagemaker\")\n", "sagemaker_session = sagemaker.Session(boto_session=sess)\n", "role = get_execution_role()\n", "client = boto3.client(\"sagemaker-runtime\")" ] }, { "cell_type": "code", "execution_count": 67, "id": "70daca03", "metadata": {}, "outputs": [], "source": [ "account_id_map = {\n", " 'us-east-1': '785573368785',\n", " 'us-east-2': '007439368137',\n", " 'us-west-1': '710691900526',\n", " 'us-west-2': '301217895009',\n", " 'eu-west-1': '802834080501',\n", " 'eu-west-2': '205493899709',\n", " 'eu-west-3': '254080097072',\n", " 'eu-north-1': '601324751636',\n", " 'eu-south-1': '966458181534',\n", " 'eu-central-1': '746233611703',\n", " 'ap-east-1': '110948597952',\n", " 'ap-south-1': '763008648453',\n", " 'ap-northeast-1': '941853720454',\n", " 'ap-northeast-2': '151534178276',\n", " 'ap-southeast-1': '324986816169',\n", " 'ap-southeast-2': '355873309152',\n", " 'cn-northwest-1': '474822919863',\n", " 'cn-north-1': '472730292857',\n", " 'sa-east-1': '756306329178',\n", " 'ca-central-1': '464438896020',\n", " 'me-south-1': '836785723513',\n", " 'af-south-1': '774647643957'\n", "}" ] }, { "cell_type": "code", "execution_count": 68, "id": "122a2f7d", "metadata": {}, "outputs": [], "source": [ "region = boto3.Session().region_name\n", "if region not in account_id_map:\n", " raise ValueError(\"UNSUPPORTED REGION\")" ] }, { "cell_type": "code", "execution_count": 70, "id": "af3046d2", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "'785573368785.dkr.ecr.us-east-1.amazonaws.com/sagemaker-tritonserver:21.08-py3'" ] }, "execution_count": 70, "metadata": {}, "output_type": "execute_result" } ], "source": [ "base = \"amazonaws.com.cn\" if region.startswith(\"cn-\") else \"amazonaws.com\"\n", "triton_image_uri = \"{account_id}.dkr.ecr.{region}.{base}/sagemaker-tritonserver:21.08-py3\".format(\n", " account_id=account_id_map[region], region=region, base=base\n", ")\n", "triton_image_uri" ] }, { "cell_type": "markdown", "id": "43cfe13e", "metadata": {}, "source": [ "## Add utility methods for preparing request payload\n", "\n", "The following methods transform the sample text we will be using for inference into the payload that can be sent to the Triton server." ] }, { "cell_type": "markdown", "id": "2637e931", "metadata": {}, "source": [ "The `tritonclient` package provides utility methods to generate the payload without having to know the details of the specification. We'll use the following methods to convert our inference request into a binary format, which provides lower latency for inference."
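, "\n", "\n", "As a rough sketch of how the helpers defined in the next cell are typically used (illustrative only; the endpoint name is a placeholder for an endpoint created later in this notebook), the returned `header_length` is carried in a Triton-specific content type so the server can split the JSON header from the binary tensor data:\n", "\n", "```python\n", "# Illustrative sketch, not executed here: `client` is the sagemaker-runtime client\n", "# created above, and \"<triton-endpoint-name>\" is a placeholder endpoint name.\n", "text_triton = \"Triton Inference Server provides a cloud and edge inferencing solution.\"\n", "request_body, header_length = get_sample_tokenized_text_binary_pt(text_triton)\n", "\n", "# The json-header-size attribute tells the SageMaker Triton container how many\n", "# bytes of the body are the JSON header; the remainder is raw tensor data.\n", "response = client.invoke_endpoint(\n", "    EndpointName=\"<triton-endpoint-name>\",\n", "    ContentType=\"application/vnd.sagemaker-triton.binary+json;json-header-size={}\".format(\n", "        header_length\n", "    ),\n", "    Body=request_body,\n", ")\n", "```"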
] }, { "cell_type": "code", "execution_count": 72, "id": "ce3487af", "metadata": {}, "outputs": [], "source": [ "import tritonclient.http as httpclient\n", "from transformers import BertTokenizer\n", "import numpy as np\n", "\n", "\n", "def tokenize_text(text):\n", " enc = BertTokenizer.from_pretrained(\"bert-base-uncased\")\n", " encoded_text = enc(text, padding=\"max_length\", max_length=128)\n", " return encoded_text[\"input_ids\"], encoded_text[\"attention_mask\"]\n", "\n", "\n", "def _get_sample_tokenized_text_binary(text, input_names, output_names):\n", " inputs = []\n", " outputs = []\n", " inputs.append(httpclient.InferInput(input_names[0], [1, 128], \"INT32\"))\n", " inputs.append(httpclient.InferInput(input_names[1], [1, 128], \"INT32\"))\n", " indexed_tokens, attention_mask = tokenize_text(text)\n", "\n", " indexed_tokens = np.array(indexed_tokens, dtype=np.int32)\n", " indexed_tokens = np.expand_dims(indexed_tokens, axis=0)\n", " inputs[0].set_data_from_numpy(indexed_tokens, binary_data=True)\n", "\n", " attention_mask = np.array(attention_mask, dtype=np.int32)\n", " attention_mask = np.expand_dims(attention_mask, axis=0)\n", " inputs[1].set_data_from_numpy(attention_mask, binary_data=True)\n", "\n", " outputs.append(httpclient.InferRequestedOutput(output_names[0], binary_data=True))\n", " outputs.append(httpclient.InferRequestedOutput(output_names[1], binary_data=True))\n", " request_body, header_length = httpclient.InferenceServerClient.generate_request_body(\n", " inputs, outputs=outputs\n", " )\n", " return request_body, header_length\n", "\n", "\n", "def get_sample_tokenized_text_binary_pt(text):\n", " return _get_sample_tokenized_text_binary(\n", " text, [\"INPUT__0\", \"INPUT__1\"], [\"OUTPUT__0\", \"1634__1\"]\n", " )\n", "\n", "\n", "def get_sample_tokenized_text_binary_trt(text):\n", " return _get_sample_tokenized_text_binary(text, [\"token_ids\", \"attn_mask\"], [\"output\", \"1634\"])" ] }, { "cell_type": "markdown", "id": "9807ad6c", "metadata": {}, "source": [ "### You do not need to run the below cell, unless you want to make changes to the current onnx_exporter.py and pt_exporter.py files with an updated model" ] }, { "cell_type": "code", "execution_count": 73, "id": "2451f34b", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "=============\n", "== PyTorch ==\n", "=============\n", "\n", "NVIDIA Release 21.08 (build 26011915)\n", "PyTorch Version 1.10.0a0+3fd9dcf\n", "\n", "Container image Copyright (c) 2021, NVIDIA CORPORATION. All rights reserved.\n", "\n", "Copyright (c) 2014-2021 Facebook Inc.\n", "Copyright (c) 2011-2014 Idiap Research Institute (Ronan Collobert)\n", "Copyright (c) 2012-2014 Deepmind Technologies (Koray Kavukcuoglu)\n", "Copyright (c) 2011-2012 NEC Laboratories America (Koray Kavukcuoglu)\n", "Copyright (c) 2011-2013 NYU (Clement Farabet)\n", "Copyright (c) 2006-2010 NEC Laboratories America (Ronan Collobert, Leon Bottou, Iain Melvin, Jason Weston)\n", "Copyright (c) 2006 Idiap Research Institute (Samy Bengio)\n", "Copyright (c) 2001-2004 Idiap Research Institute (Ronan Collobert, Samy Bengio, Johnny Mariethoz)\n", "Copyright (c) 2015 Google Inc.\n", "Copyright (c) 2015 Yangqing Jia\n", "Copyright (c) 2013-2016 The Caffe contributors\n", "All rights reserved.\n", "\n", "NVIDIA Deep Learning Profiler (dlprof) Copyright (c) 2021, NVIDIA CORPORATION. All rights reserved.\n", "\n", "Various files include modifications (c) NVIDIA CORPORATION. 
All rights reserved.\n", "\n", "This container image and its contents are governed by the NVIDIA Deep Learning Container License.\n", "By pulling and using the container, you accept the terms and conditions of this license:\n", "https://developer.nvidia.com/ngc/nvidia-deep-learning-container-license\n", "\n", "NOTE: Legacy NVIDIA Driver detected. Compatibility mode ENABLED.\n", "\n", "NOTE: MOFED driver for multi-node communication was not detected.\n", " Multi-node communication performance may be reduced.\n", "\n", "NOTE: The SHMEM allocation limit is set to the default of 64MB. This may be\n", " insufficient for PyTorch. NVIDIA recommends the use of the following flags:\n", " nvidia-docker run --ipc=host ...\n", "\n", "Collecting transformers==4.9.1\n", " Downloading transformers-4.9.1-py3-none-any.whl (2.6 MB)\n", "\u001b[K |████████████████████████████████| 2.6 MB 26.4 MB/s eta 0:00:01\n", "\u001b[?25hRequirement already satisfied: sacremoses in /opt/conda/lib/python3.8/site-packages (from transformers==4.9.1) (0.0.45)\n", "Requirement already satisfied: filelock in /opt/conda/lib/python3.8/site-packages (from transformers==4.9.1) (3.0.12)\n", "Requirement already satisfied: tqdm>=4.27 in /opt/conda/lib/python3.8/site-packages (from transformers==4.9.1) (4.62.1)\n", "Requirement already satisfied: pyyaml>=5.1 in /opt/conda/lib/python3.8/site-packages (from transformers==4.9.1) (5.4.1)\n", "Collecting huggingface-hub==0.0.12\n", " Downloading huggingface_hub-0.0.12-py3-none-any.whl (37 kB)\n", "Requirement already satisfied: numpy>=1.17 in /opt/conda/lib/python3.8/site-packages (from transformers==4.9.1) (1.21.2)\n", "Collecting tokenizers<0.11,>=0.10.1\n", " Downloading tokenizers-0.10.3-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (3.3 MB)\n", "\u001b[K |████████████████████████████████| 3.3 MB 92.5 MB/s eta 0:00:01\n", "\u001b[?25hRequirement already satisfied: packaging in /opt/conda/lib/python3.8/site-packages (from transformers==4.9.1) (21.0)\n", "Requirement already satisfied: requests in /opt/conda/lib/python3.8/site-packages (from transformers==4.9.1) (2.26.0)\n", "Requirement already satisfied: regex!=2019.12.17 in /opt/conda/lib/python3.8/site-packages (from transformers==4.9.1) (2021.8.3)\n", "Requirement already satisfied: typing-extensions in /opt/conda/lib/python3.8/site-packages (from huggingface-hub==0.0.12->transformers==4.9.1) (3.10.0.0)\n", "Requirement already satisfied: pyparsing>=2.0.2 in /opt/conda/lib/python3.8/site-packages (from packaging->transformers==4.9.1) (2.4.7)\n", "Requirement already satisfied: charset-normalizer~=2.0.0 in /opt/conda/lib/python3.8/site-packages (from requests->transformers==4.9.1) (2.0.0)\n", "Requirement already satisfied: idna<4,>=2.5 in /opt/conda/lib/python3.8/site-packages (from requests->transformers==4.9.1) (3.1)\n", "Requirement already satisfied: certifi>=2017.4.17 in /opt/conda/lib/python3.8/site-packages (from requests->transformers==4.9.1) (2021.5.30)\n", "Requirement already satisfied: urllib3<1.27,>=1.21.1 in /opt/conda/lib/python3.8/site-packages (from requests->transformers==4.9.1) (1.26.6)\n", "Requirement already satisfied: joblib in /opt/conda/lib/python3.8/site-packages (from sacremoses->transformers==4.9.1) (1.0.1)\n", "Requirement already satisfied: six in /opt/conda/lib/python3.8/site-packages (from sacremoses->transformers==4.9.1) (1.16.0)\n", "Requirement already satisfied: click in /opt/conda/lib/python3.8/site-packages (from sacremoses->transformers==4.9.1) 
(7.1.2)\n", "Installing collected packages: tokenizers, huggingface-hub, transformers\n", "Successfully installed huggingface-hub-0.0.12 tokenizers-0.10.3 transformers-4.9.1\n", "\u001b[33mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv\u001b[0m\n", "Downloading: 100%|██████████████████████████████| 625/625 [00:00<00:00, 718kB/s]\n", "Downloading: 100%|███████████████████████████| 672M/672M [00:06<00:00, 98.7MB/s]\n", "Some weights of the model checkpoint at bert-base-multilingual-uncased were not used when initializing BertModel: ['cls.predictions.transform.LayerNorm.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.dense.weight']\n", "- This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", "- This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py:1241: UserWarning: No names were found for specified dynamic axes of provided input.Automatically generated names will be applied to each dynamic axes of input token_ids\n", " warnings.warn(\"No names were found for specified dynamic axes of provided input.\"\n", "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py:1241: UserWarning: No names were found for specified dynamic axes of provided input.Automatically generated names will be applied to each dynamic axes of input attn_mask\n", " warnings.warn(\"No names were found for specified dynamic axes of provided input.\"\n", "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py:1241: UserWarning: No names were found for specified dynamic axes of provided input.Automatically generated names will be applied to each dynamic axes of input output\n", " warnings.warn(\"No names were found for specified dynamic axes of provided input.\"\n", "/opt/conda/lib/python3.8/site-packages/transformers/modeling_utils.py:2154: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!\n", " assert all(\n", "Saved model.onnx\n", "Using cuda device\n", "Some weights of the model checkpoint at bert-base-multilingual-uncased were not used when initializing BertModel: ['cls.seq_relationship.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.seq_relationship.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.transform.dense.weight']\n", "- This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", "- This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", "/opt/conda/lib/python3.8/site-packages/transformers/modeling_utils.py:2154: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!\n", " assert all(\n", "Saved model.pt\n", "&&&& RUNNING TensorRT.trtexec [TensorRT v8001] # trtexec --onnx=model.onnx --saveEngine=model_bs16.plan --minShapes=token_ids:1x128,attn_mask:1x128 --optShapes=token_ids:16x128,attn_mask:16x128 --maxShapes=token_ids:128x128,attn_mask:128x128 --fp16 --verbose --workspace=14000\n", "[01/17/2022-21:09:32] [I] === Model Options ===\n", "[01/17/2022-21:09:32] [I] Format: ONNX\n", "[01/17/2022-21:09:32] [I] Model: model.onnx\n", "[01/17/2022-21:09:32] [I] Output:\n", "[01/17/2022-21:09:32] [I] === Build Options ===\n", "[01/17/2022-21:09:32] [I] Max batch: explicit\n", "[01/17/2022-21:09:32] [I] Workspace: 14000 MiB\n", "[01/17/2022-21:09:32] [I] minTiming: 1\n", "[01/17/2022-21:09:32] [I] avgTiming: 8\n", "[01/17/2022-21:09:32] [I] Precision: FP32+FP16\n", "[01/17/2022-21:09:32] [I] Calibration: \n", "[01/17/2022-21:09:32] [I] Refit: Disabled\n", "[01/17/2022-21:09:32] [I] Sparsity: Disabled\n", "[01/17/2022-21:09:32] [I] Safe mode: Disabled\n", "[01/17/2022-21:09:32] [I] Restricted mode: Disabled\n", "[01/17/2022-21:09:32] [I] Save engine: model_bs16.plan\n", "[01/17/2022-21:09:32] [I] Load engine: \n", "[01/17/2022-21:09:32] [I] NVTX verbosity: 0\n", "[01/17/2022-21:09:32] [I] Tactic sources: Using default tactic sources\n", "[01/17/2022-21:09:32] [I] timingCacheMode: local\n", "[01/17/2022-21:09:32] [I] timingCacheFile: \n", "[01/17/2022-21:09:32] [I] Input(s)s format: fp32:CHW\n", "[01/17/2022-21:09:32] [I] Output(s)s format: fp32:CHW\n", "[01/17/2022-21:09:32] [I] Input build shape: attn_mask=1x128+16x128+128x128\n", "[01/17/2022-21:09:32] [I] Input build shape: token_ids=1x128+16x128+128x128\n", "[01/17/2022-21:09:32] [I] Input calibration shapes: model\n", "[01/17/2022-21:09:32] [I] === System Options ===\n", "[01/17/2022-21:09:32] [I] Device: 0\n", "[01/17/2022-21:09:32] [I] DLACore: \n", "[01/17/2022-21:09:32] [I] Plugins:\n", "[01/17/2022-21:09:32] [I] === Inference Options ===\n", "[01/17/2022-21:09:32] [I] Batch: Explicit\n", "[01/17/2022-21:09:32] [I] Input inference shape: token_ids=16x128\n", "[01/17/2022-21:09:32] [I] Input inference shape: attn_mask=16x128\n", "[01/17/2022-21:09:32] [I] Iterations: 10\n", "[01/17/2022-21:09:32] [I] Duration: 3s (+ 200ms warm up)\n", "[01/17/2022-21:09:32] [I] Sleep time: 0ms\n", "[01/17/2022-21:09:32] [I] Streams: 1\n", "[01/17/2022-21:09:32] [I] ExposeDMA: Disabled\n", "[01/17/2022-21:09:32] [I] Data transfers: Enabled\n", "[01/17/2022-21:09:32] [I] Spin-wait: Disabled\n", "[01/17/2022-21:09:32] [I] Multithreading: Disabled\n", "[01/17/2022-21:09:32] [I] CUDA Graph: Disabled\n", "[01/17/2022-21:09:32] [I] Separate profiling: Disabled\n", "[01/17/2022-21:09:32] [I] Time Deserialize: Disabled\n", "[01/17/2022-21:09:32] [I] Time Refit: Disabled\n", "[01/17/2022-21:09:32] [I] Skip inference: Disabled\n", "[01/17/2022-21:09:32] [I] Inputs:\n", 
"[01/17/2022-21:09:32] [I] === Reporting Options ===\n", "[01/17/2022-21:09:32] [I] Verbose: Enabled\n", "[01/17/2022-21:09:32] [I] Averages: 10 inferences\n", "[01/17/2022-21:09:32] [I] Percentile: 99\n", "[01/17/2022-21:09:32] [I] Dump refittable layers:Disabled\n", "[01/17/2022-21:09:32] [I] Dump output: Disabled\n", "[01/17/2022-21:09:32] [I] Profile: Disabled\n", "[01/17/2022-21:09:32] [I] Export timing to JSON file: \n", "[01/17/2022-21:09:32] [I] Export output to JSON file: \n", "[01/17/2022-21:09:32] [I] Export profile to JSON file: \n", "[01/17/2022-21:09:32] [I] \n", "[01/17/2022-21:09:32] [I] === Device Information ===\n", "[01/17/2022-21:09:32] [I] Selected Device: Tesla T4\n", "[01/17/2022-21:09:32] [I] Compute Capability: 7.5\n", "[01/17/2022-21:09:32] [I] SMs: 40\n", "[01/17/2022-21:09:32] [I] Compute Clock Rate: 1.59 GHz\n", "[01/17/2022-21:09:32] [I] Device Global Memory: 15109 MiB\n", "[01/17/2022-21:09:32] [I] Shared Memory per SM: 64 KiB\n", "[01/17/2022-21:09:32] [I] Memory Bus Width: 256 bits (ECC enabled)\n", "[01/17/2022-21:09:32] [I] Memory Clock Rate: 5.001 GHz\n", "[01/17/2022-21:09:32] [I] \n", "[01/17/2022-21:09:32] [I] TensorRT version: 8001\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::GridAnchor_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::GridAnchorRect_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::NMS_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::Reorg_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::Region_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::Clip_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::LReLU_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::PriorBox_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::Normalize_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::ScatterND version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::RPROI_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::BatchedNMS_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::BatchedNMSDynamic_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::FlattenConcat_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::CropAndResize version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::DetectionLayer_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::EfficientNMS_ONNX_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::EfficientNMS_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::Proposal version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::ProposalLayer_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::PyramidROIAlign_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::ResizeNearest_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::Split version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::SpecialSlice_TRT version 1\n", "[01/17/2022-21:09:32] [V] [TRT] Registered plugin creator - ::InstanceNormalization_TRT version 1\n", "[01/17/2022-21:09:33] [I] [TRT] [MemUsageChange] Init CUDA: CPU 
+328, GPU +0, now: CPU 335, GPU 250 (MiB)\n", "[01/17/2022-21:09:33] [I] Start parsing network model\n", "[libprotobuf WARNING google/protobuf/io/coded_stream.cc:604] Reading dangerously large protocol message. If the message turns out to be larger than 2147483647 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.\n", "[libprotobuf WARNING google/protobuf/io/coded_stream.cc:81] The total number of bytes read was 669501704\n", "[01/17/2022-21:09:34] [I] [TRT] ----------------------------------------------------------------\n", "[01/17/2022-21:09:34] [I] [TRT] Input filename: model.onnx\n", "[01/17/2022-21:09:34] [I] [TRT] ONNX IR version: 0.0.6\n", "[01/17/2022-21:09:34] [I] [TRT] Opset version: 10\n", "[01/17/2022-21:09:34] [I] [TRT] Producer name: pytorch\n", "[01/17/2022-21:09:34] [I] [TRT] Producer version: 1.10\n", "[01/17/2022-21:09:34] [I] [TRT] Domain: \n", "[01/17/2022-21:09:34] [I] [TRT] Model version: 0\n", "[01/17/2022-21:09:34] [I] [TRT] Doc string: \n", "[01/17/2022-21:09:34] [I] [TRT] ----------------------------------------------------------------\n", "[libprotobuf WARNING google/protobuf/io/coded_stream.cc:604] Reading dangerously large protocol message. If the message turns out to be larger than 2147483647 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.\n", "[libprotobuf WARNING google/protobuf/io/coded_stream.cc:81] The total number of bytes read was 669501704\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::GridAnchor_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::GridAnchorRect_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::NMS_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::Reorg_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::Region_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::Clip_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::LReLU_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::PriorBox_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::Normalize_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::ScatterND version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::RPROI_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::BatchedNMS_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::BatchedNMSDynamic_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::FlattenConcat_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::CropAndResize version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::DetectionLayer_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::EfficientNMS_ONNX_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::EfficientNMS_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::Proposal version 
1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::ProposalLayer_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::PyramidROIAlign_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::ResizeNearest_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::Split version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::SpecialSlice_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Plugin creator already registered - ::InstanceNormalization_TRT version 1\n", "[01/17/2022-21:09:35] [V] [TRT] Adding network input: token_ids with dtype: int32, dimensions: (-1, -1)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: token_ids for ONNX tensor: token_ids\n", "[01/17/2022-21:09:35] [V] [TRT] Adding network input: attn_mask with dtype: int32, dimensions: (-1, -1)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: attn_mask for ONNX tensor: attn_mask\n", "[W] [01/17/2022-21:09:35] [V] [TRT] Importing initializer: embeddings.position_ids\n", "[01/17/2022-21:09:35] [TRT] onnx2trt_utils.cpp:362: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: embeddings.word_embeddings.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: embeddings.position_embeddings.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: embeddings.token_type_embeddings.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: embeddings.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: embeddings.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.0.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.0.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.0.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.0.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.0.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.0.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.0.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.0.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.0.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.0.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.1.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.1.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.1.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.1.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.1.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.1.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.1.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] 
Importing initializer: encoder.layer.1.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.1.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.1.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.2.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.2.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.2.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.2.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.2.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.2.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.2.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.2.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.2.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.2.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.3.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.3.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.3.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.3.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.3.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.3.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.3.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.3.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.3.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.3.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.4.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.4.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.4.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.4.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.4.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.4.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.4.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.4.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.4.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.4.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.5.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.5.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] 
Importing initializer: encoder.layer.5.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.5.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.5.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.5.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.5.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.5.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.5.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.5.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.6.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.6.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.6.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.6.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.6.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.6.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.6.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.6.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.6.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.6.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.7.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.7.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.7.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.7.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.7.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.7.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.7.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.7.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.7.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.7.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.8.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.8.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.8.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.8.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.8.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.8.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.8.intermediate.dense.bias\n", 
"[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.8.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.8.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.8.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.9.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.9.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.9.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.9.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.9.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.9.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.9.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.9.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.9.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.9.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.10.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.10.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.10.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.10.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.10.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.10.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.10.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.10.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.10.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.10.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.11.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.11.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.11.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.11.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.11.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.11.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.11.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.11.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.11.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: encoder.layer.11.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: pooler.dense.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: pooler.dense.bias\n", "[01/17/2022-21:09:35] [V] 
[TRT] Importing initializer: 1635\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1636\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1637\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1638\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1639\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1640\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1641\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1642\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1643\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1644\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1645\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1646\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1647\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1648\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1649\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1650\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1651\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1652\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1653\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1654\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1655\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1656\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1657\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1658\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1659\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1660\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1661\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1662\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1663\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1664\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1665\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1666\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1667\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1668\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1669\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1670\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1671\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1672\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1673\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1674\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1675\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1676\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1677\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1678\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1679\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1680\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1681\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1682\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1683\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1684\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1685\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1686\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1687\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1688\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1689\n", 
"[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1690\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1691\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1692\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1693\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1694\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1695\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1696\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1697\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1698\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1699\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1700\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1701\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1702\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1703\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1704\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1705\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1706\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1707\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1708\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1709\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1710\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1711\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1712\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1713\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1714\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1715\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1716\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1717\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1718\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1719\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1720\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1721\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1722\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1723\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1724\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1725\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1726\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1727\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1728\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1729\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1730\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1731\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1732\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1733\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1734\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1735\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1736\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1737\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1738\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1739\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1740\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1741\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1742\n", "[01/17/2022-21:09:35] [V] [TRT] Importing initializer: 1743\n", "[01/17/2022-21:09:35] [V] [TRT] Importing 
initializer: 1744\n", "[01/17/2022-21:09:35] [V] [TRT] (verbose TensorRT parser output truncated for readability: the ONNX parser finishes importing the remaining graph initializers and then registers each node of the BERT graph, the Shape/Gather/Unsqueeze/Concat/Reshape ops handling dynamic batch and sequence dimensions, the attention MatMul/Add/Softmax blocks, and the LayerNorm and GELU subgraphs, as TensorRT layers)\n", "[01/17/2022-21:09:35] [V] 
[TRT] Parsing node: Mul_130 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 353\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 358\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_130 [Mul] inputs: [353 -> (-1, -1, 3072)[FLOAT]], [358 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_130 for ONNX node: Mul_130\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 359 for ONNX tensor: 359\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_130 [Mul] outputs: [359 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_131 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_131 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_131 [Constant] outputs: [360 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_132 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 359\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 360\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_132 [Mul] inputs: [359 -> (-1, -1, 3072)[FLOAT]], [360 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 360 for ONNX node: 360\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_132 for ONNX node: Mul_132\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 361 for ONNX tensor: 361\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_132 [Mul] outputs: [361 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_133 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 361\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1651\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_133 [MatMul] inputs: [361 -> (-1, -1, 3072)[FLOAT]], [1651 -> (3072, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1651 for ONNX node: 1651\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_133 for ONNX node: MatMul_133\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 363 for ONNX tensor: 363\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_133 [MatMul] outputs: [363 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_134 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.0.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 363\n", "[01/17/2022-21:09:35] [V] [TRT] Add_134 [Add] inputs: [encoder.layer.0.output.dense.bias -> (768)[FLOAT]], [363 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.0.output.dense.bias for ONNX node: encoder.layer.0.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_134 for ONNX node: Add_134\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 364 for ONNX tensor: 364\n", "[01/17/2022-21:09:35] [V] [TRT] Add_134 [Add] outputs: [364 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_135 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 364\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 350\n", "[01/17/2022-21:09:35] [V] [TRT] Add_135 [Add] inputs: [364 -> (-1, -1, 768)[FLOAT]], [350 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_135 for ONNX node: Add_135\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 365 for ONNX tensor: 365\n", "[01/17/2022-21:09:35] [V] [TRT] Add_135 [Add] outputs: [365 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_136 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] 
Searching for input: 365\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_136 [ReduceMean] inputs: [365 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_136 for ONNX node: ReduceMean_136\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 366 for ONNX tensor: 366\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_136 [ReduceMean] outputs: [366 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_137 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 365\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 366\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_137 [Sub] inputs: [365 -> (-1, -1, 768)[FLOAT]], [366 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_137 for ONNX node: Sub_137\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 367 for ONNX tensor: 367\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_137 [Sub] outputs: [367 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_138 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_138 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_138 [Constant] outputs: [368 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_139 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 367\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 368\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_139 [Pow] inputs: [367 -> (-1, -1, 768)[FLOAT]], [368 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 368 for ONNX node: 368\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_139 for ONNX node: Pow_139\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 369 for ONNX tensor: 369\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_139 [Pow] outputs: [369 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_140 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 369\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_140 [ReduceMean] inputs: [369 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_140 for ONNX node: ReduceMean_140\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 370 for ONNX tensor: 370\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_140 [ReduceMean] outputs: [370 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_141 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_141 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_141 [Constant] outputs: [371 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_142 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 370\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 371\n", "[01/17/2022-21:09:35] [V] [TRT] Add_142 [Add] inputs: [370 -> (-1, -1, 1)[FLOAT]], [371 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 371 for ONNX node: 371\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_142 for ONNX node: Add_142\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 372 for ONNX tensor: 372\n", "[01/17/2022-21:09:35] [V] [TRT] Add_142 [Add] outputs: [372 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_143 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 372\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_143 [Sqrt] inputs: [372 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_143 
for ONNX node: Sqrt_143\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 373 for ONNX tensor: 373\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_143 [Sqrt] outputs: [373 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_144 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 367\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 373\n", "[01/17/2022-21:09:35] [V] [TRT] Div_144 [Div] inputs: [367 -> (-1, -1, 768)[FLOAT]], [373 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_144 for ONNX node: Div_144\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 374 for ONNX tensor: 374\n", "[01/17/2022-21:09:35] [V] [TRT] Div_144 [Div] outputs: [374 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_145 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 374\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.0.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_145 [Mul] inputs: [374 -> (-1, -1, 768)[FLOAT]], [encoder.layer.0.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.0.output.LayerNorm.weight for ONNX node: encoder.layer.0.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_145 for ONNX node: Mul_145\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 375 for ONNX tensor: 375\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_145 [Mul] outputs: [375 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_146 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 375\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.0.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_146 [Add] inputs: [375 -> (-1, -1, 768)[FLOAT]], [encoder.layer.0.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.0.output.LayerNorm.bias for ONNX node: encoder.layer.0.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_146 for ONNX node: Add_146\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 376 for ONNX tensor: 376\n", "[01/17/2022-21:09:35] [V] [TRT] Add_146 [Add] outputs: [376 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_147 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 376\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1652\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_147 [MatMul] inputs: [376 -> (-1, -1, 768)[FLOAT]], [1652 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1652 for ONNX node: 1652\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_147 for ONNX node: MatMul_147\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 378 for ONNX tensor: 378\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_147 [MatMul] outputs: [378 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_148 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.1.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 378\n", "[01/17/2022-21:09:35] [V] [TRT] Add_148 [Add] inputs: [encoder.layer.1.attention.self.query.bias -> (768)[FLOAT]], [378 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.1.attention.self.query.bias for ONNX node: encoder.layer.1.attention.self.query.bias\n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_148 for ONNX node: Add_148\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 379 for ONNX tensor: 379\n", "[01/17/2022-21:09:35] [V] [TRT] Add_148 [Add] outputs: [379 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_149 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 376\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1653\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_149 [MatMul] inputs: [376 -> (-1, -1, 768)[FLOAT]], [1653 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1653 for ONNX node: 1653\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_149 for ONNX node: MatMul_149\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 381 for ONNX tensor: 381\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_149 [MatMul] outputs: [381 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_150 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.1.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 381\n", "[01/17/2022-21:09:35] [V] [TRT] Add_150 [Add] inputs: [encoder.layer.1.attention.self.key.bias -> (768)[FLOAT]], [381 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.1.attention.self.key.bias for ONNX node: encoder.layer.1.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_150 for ONNX node: Add_150\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 382 for ONNX tensor: 382\n", "[01/17/2022-21:09:35] [V] [TRT] Add_150 [Add] outputs: [382 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_151 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 382\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_151 [Shape] inputs: [382 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_151 for ONNX node: Shape_151\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 383 for ONNX tensor: 383\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_151 [Shape] outputs: [383 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_152 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_152 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_152 [Constant] outputs: [384 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_153 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 383\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 384\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_153 [Gather] inputs: [383 -> (3)[INT32]], [384 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 384 for ONNX node: 384\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_153 for ONNX node: Gather_153\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 385 for ONNX tensor: 385\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_153 [Gather] outputs: [385 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_154 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 382\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_154 [Shape] inputs: [382 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_154 for ONNX node: Shape_154\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 386 for ONNX tensor: 386\n", 
"[01/17/2022-21:09:35] [V] [TRT] Shape_154 [Shape] outputs: [386 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_155 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_155 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_155 [Constant] outputs: [387 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_156 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 386\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 387\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_156 [Gather] inputs: [386 -> (3)[INT32]], [387 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 387 for ONNX node: 387\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_156 for ONNX node: Gather_156\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 388 for ONNX tensor: 388\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_156 [Gather] outputs: [388 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_157 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 385\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_157 [Unsqueeze] inputs: [385 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_157 for ONNX node: Unsqueeze_157\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 391 for ONNX tensor: 391\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_157 [Unsqueeze] outputs: [391 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_158 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 388\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_158 [Unsqueeze] inputs: [388 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_158 for ONNX node: Unsqueeze_158\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 392 for ONNX tensor: 392\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_158 [Unsqueeze] outputs: [392 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_159 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 391\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 392\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1654\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1655\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_159 [Concat] inputs: [391 -> (1)[INT32]], [392 -> (1)[INT32]], [1654 -> (1)[INT32]], [1655 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1654 for ONNX node: 1654\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1655 for ONNX node: 1655\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_159 for ONNX node: Concat_159\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 395 for ONNX tensor: 395\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_159 [Concat] outputs: [395 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_160 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 382\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 395\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_160 [Reshape] inputs: [382 -> (-1, -1, 768)[FLOAT]], [395 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_160 for ONNX node: Reshape_160\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 
396 for ONNX tensor: 396\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_160 [Reshape] outputs: [396 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_161 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 376\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1656\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_161 [MatMul] inputs: [376 -> (-1, -1, 768)[FLOAT]], [1656 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1656 for ONNX node: 1656\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_161 for ONNX node: MatMul_161\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 398 for ONNX tensor: 398\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_161 [MatMul] outputs: [398 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_162 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.1.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 398\n", "[01/17/2022-21:09:35] [V] [TRT] Add_162 [Add] inputs: [encoder.layer.1.attention.self.value.bias -> (768)[FLOAT]], [398 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.1.attention.self.value.bias for ONNX node: encoder.layer.1.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_162 for ONNX node: Add_162\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 399 for ONNX tensor: 399\n", "[01/17/2022-21:09:35] [V] [TRT] Add_162 [Add] outputs: [399 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_163 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 399\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_163 [Shape] inputs: [399 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_163 for ONNX node: Shape_163\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 400 for ONNX tensor: 400\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_163 [Shape] outputs: [400 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_164 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_164 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_164 [Constant] outputs: [401 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_165 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 400\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 401\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_165 [Gather] inputs: [400 -> (3)[INT32]], [401 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 401 for ONNX node: 401\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_165 for ONNX node: Gather_165\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 402 for ONNX tensor: 402\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_165 [Gather] outputs: [402 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_166 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 399\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_166 [Shape] inputs: [399 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_166 for ONNX node: Shape_166\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 403 for ONNX tensor: 403\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_166 [Shape] outputs: [403 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing 
node: Constant_167 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_167 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_167 [Constant] outputs: [404 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_168 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 403\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 404\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_168 [Gather] inputs: [403 -> (3)[INT32]], [404 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 404 for ONNX node: 404\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_168 for ONNX node: Gather_168\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 405 for ONNX tensor: 405\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_168 [Gather] outputs: [405 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_169 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 402\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_169 [Unsqueeze] inputs: [402 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_169 for ONNX node: Unsqueeze_169\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 408 for ONNX tensor: 408\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_169 [Unsqueeze] outputs: [408 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_170 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 405\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_170 [Unsqueeze] inputs: [405 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_170 for ONNX node: Unsqueeze_170\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 409 for ONNX tensor: 409\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_170 [Unsqueeze] outputs: [409 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_171 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 408\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 409\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1657\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1658\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_171 [Concat] inputs: [408 -> (1)[INT32]], [409 -> (1)[INT32]], [1657 -> (1)[INT32]], [1658 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1657 for ONNX node: 1657\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1658 for ONNX node: 1658\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_171 for ONNX node: Concat_171\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 412 for ONNX tensor: 412\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_171 [Concat] outputs: [412 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_172 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 399\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 412\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_172 [Reshape] inputs: [399 -> (-1, -1, 768)[FLOAT]], [412 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_172 for ONNX node: Reshape_172\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 413 for ONNX tensor: 413\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_172 [Reshape] outputs: [413 -> (-1, -1, 12, 64)[FLOAT]], 
\n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_173 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 413\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_173 [Transpose] inputs: [413 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_173 for ONNX node: Transpose_173\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 414 for ONNX tensor: 414\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_173 [Transpose] outputs: [414 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_174 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 379\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_174 [Shape] inputs: [379 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_174 for ONNX node: Shape_174\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 415 for ONNX tensor: 415\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_174 [Shape] outputs: [415 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_175 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_175 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_175 [Constant] outputs: [416 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_176 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 415\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 416\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_176 [Gather] inputs: [415 -> (3)[INT32]], [416 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 416 for ONNX node: 416\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_176 for ONNX node: Gather_176\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 417 for ONNX tensor: 417\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_176 [Gather] outputs: [417 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_177 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 379\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_177 [Shape] inputs: [379 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_177 for ONNX node: Shape_177\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 418 for ONNX tensor: 418\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_177 [Shape] outputs: [418 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_178 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_178 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_178 [Constant] outputs: [419 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_179 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 418\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 419\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_179 [Gather] inputs: [418 -> (3)[INT32]], [419 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 419 for ONNX node: 419\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_179 for ONNX node: Gather_179\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 420 for ONNX tensor: 420\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_179 [Gather] outputs: [420 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_180 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 417\n", "[01/17/2022-21:09:35] [V] [TRT] 
Unsqueeze_180 [Unsqueeze] inputs: [417 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_180 for ONNX node: Unsqueeze_180\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 423 for ONNX tensor: 423\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_180 [Unsqueeze] outputs: [423 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_181 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 420\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_181 [Unsqueeze] inputs: [420 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_181 for ONNX node: Unsqueeze_181\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 424 for ONNX tensor: 424\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_181 [Unsqueeze] outputs: [424 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_182 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 423\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 424\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1659\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1660\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_182 [Concat] inputs: [423 -> (1)[INT32]], [424 -> (1)[INT32]], [1659 -> (1)[INT32]], [1660 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1659 for ONNX node: 1659\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1660 for ONNX node: 1660\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_182 for ONNX node: Concat_182\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 427 for ONNX tensor: 427\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_182 [Concat] outputs: [427 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_183 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 379\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 427\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_183 [Reshape] inputs: [379 -> (-1, -1, 768)[FLOAT]], [427 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_183 for ONNX node: Reshape_183\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 428 for ONNX tensor: 428\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_183 [Reshape] outputs: [428 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_184 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 428\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_184 [Transpose] inputs: [428 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_184 for ONNX node: Transpose_184\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 429 for ONNX tensor: 429\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_184 [Transpose] outputs: [429 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_185 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 396\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_185 [Transpose] inputs: [396 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_185 for ONNX node: Transpose_185\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 430 for ONNX tensor: 430\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_185 [Transpose] outputs: [430 -> (-1, 12, 64, -1)[FLOAT]], 
\n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_186 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 429\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 430\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_186 [MatMul] inputs: [429 -> (-1, 12, -1, 64)[FLOAT]], [430 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_186 for ONNX node: MatMul_186\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 431 for ONNX tensor: 431\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_186 [MatMul] outputs: [431 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_187 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_187 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_187 [Constant] outputs: [432 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_188 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 431\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 432\n", "[01/17/2022-21:09:35] [V] [TRT] Div_188 [Div] inputs: [431 -> (-1, 12, -1, -1)[FLOAT]], [432 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 432 for ONNX node: 432\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_188 for ONNX node: Div_188\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 433 for ONNX tensor: 433\n", "[01/17/2022-21:09:35] [V] [TRT] Div_188 [Div] outputs: [433 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_189 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 433\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 234\n", "[01/17/2022-21:09:35] [V] [TRT] Add_189 [Add] inputs: [433 -> (-1, 12, -1, -1)[FLOAT]], [234 -> (-1, 1, 1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_189 for ONNX node: Add_189\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 434 for ONNX tensor: 434\n", "[01/17/2022-21:09:35] [V] [TRT] Add_189 [Add] outputs: [434 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Softmax_190 [Softmax]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 434\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_190 [Softmax] inputs: [434 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Softmax_190 for ONNX node: Softmax_190\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 435 for ONNX tensor: 435\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_190 [Softmax] outputs: [435 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_191 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 435\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 414\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_191 [MatMul] inputs: [435 -> (-1, 12, -1, -1)[FLOAT]], [414 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_191 for ONNX node: MatMul_191\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 436 for ONNX tensor: 436\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_191 [MatMul] outputs: [436 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_192 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 436\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_192 [Transpose] inputs: [436 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_192 for ONNX node: Transpose_192\n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 437 for ONNX tensor: 437\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_192 [Transpose] outputs: [437 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_193 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 437\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_193 [Shape] inputs: [437 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_193 for ONNX node: Shape_193\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 438 for ONNX tensor: 438\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_193 [Shape] outputs: [438 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_194 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_194 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_194 [Constant] outputs: [439 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_195 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 438\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 439\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_195 [Gather] inputs: [438 -> (4)[INT32]], [439 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 439 for ONNX node: 439\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_195 for ONNX node: Gather_195\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 440 for ONNX tensor: 440\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_195 [Gather] outputs: [440 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_196 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 437\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_196 [Shape] inputs: [437 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_196 for ONNX node: Shape_196\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 441 for ONNX tensor: 441\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_196 [Shape] outputs: [441 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_197 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_197 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_197 [Constant] outputs: [442 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_198 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 441\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 442\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_198 [Gather] inputs: [441 -> (4)[INT32]], [442 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 442 for ONNX node: 442\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_198 for ONNX node: Gather_198\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 443 for ONNX tensor: 443\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_198 [Gather] outputs: [443 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_199 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 440\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_199 [Unsqueeze] inputs: [440 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_199 for ONNX node: Unsqueeze_199\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 445 for ONNX tensor: 445\n", "[01/17/2022-21:09:35] 
[V] [TRT] Unsqueeze_199 [Unsqueeze] outputs: [445 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_200 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 443\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_200 [Unsqueeze] inputs: [443 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_200 for ONNX node: Unsqueeze_200\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 446 for ONNX tensor: 446\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_200 [Unsqueeze] outputs: [446 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_201 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 445\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 446\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1661\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_201 [Concat] inputs: [445 -> (1)[INT32]], [446 -> (1)[INT32]], [1661 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1661 for ONNX node: 1661\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_201 for ONNX node: Concat_201\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 448 for ONNX tensor: 448\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_201 [Concat] outputs: [448 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_202 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 437\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 448\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_202 [Reshape] inputs: [437 -> (-1, -1, 12, 64)[FLOAT]], [448 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_202 for ONNX node: Reshape_202\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 449 for ONNX tensor: 449\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_202 [Reshape] outputs: [449 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_203 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 449\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1662\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_203 [MatMul] inputs: [449 -> (-1, -1, 768)[FLOAT]], [1662 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1662 for ONNX node: 1662\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_203 for ONNX node: MatMul_203\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 451 for ONNX tensor: 451\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_203 [MatMul] outputs: [451 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_204 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.1.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 451\n", "[01/17/2022-21:09:35] [V] [TRT] Add_204 [Add] inputs: [encoder.layer.1.attention.output.dense.bias -> (768)[FLOAT]], [451 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.1.attention.output.dense.bias for ONNX node: encoder.layer.1.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_204 for ONNX node: Add_204\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 452 for ONNX tensor: 452\n", "[01/17/2022-21:09:35] [V] [TRT] Add_204 [Add] outputs: [452 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_205 [Add]\n", "[01/17/2022-21:09:35] [V] 
[TRT] Searching for input: 452\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 376\n", "[01/17/2022-21:09:35] [V] [TRT] Add_205 [Add] inputs: [452 -> (-1, -1, 768)[FLOAT]], [376 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_205 for ONNX node: Add_205\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 453 for ONNX tensor: 453\n", "[01/17/2022-21:09:35] [V] [TRT] Add_205 [Add] outputs: [453 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_206 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 453\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_206 [ReduceMean] inputs: [453 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_206 for ONNX node: ReduceMean_206\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 454 for ONNX tensor: 454\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_206 [ReduceMean] outputs: [454 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_207 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 453\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 454\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_207 [Sub] inputs: [453 -> (-1, -1, 768)[FLOAT]], [454 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_207 for ONNX node: Sub_207\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 455 for ONNX tensor: 455\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_207 [Sub] outputs: [455 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_208 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_208 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_208 [Constant] outputs: [456 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_209 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 455\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 456\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_209 [Pow] inputs: [455 -> (-1, -1, 768)[FLOAT]], [456 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 456 for ONNX node: 456\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_209 for ONNX node: Pow_209\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 457 for ONNX tensor: 457\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_209 [Pow] outputs: [457 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_210 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 457\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_210 [ReduceMean] inputs: [457 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_210 for ONNX node: ReduceMean_210\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 458 for ONNX tensor: 458\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_210 [ReduceMean] outputs: [458 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_211 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_211 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_211 [Constant] outputs: [459 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_212 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 458\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 459\n", "[01/17/2022-21:09:35] [V] [TRT] Add_212 [Add] inputs: [458 -> (-1, -1, 1)[FLOAT]], [459 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] 
[TRT] Registering layer: 459 for ONNX node: 459\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_212 for ONNX node: Add_212\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 460 for ONNX tensor: 460\n", "[01/17/2022-21:09:35] [V] [TRT] Add_212 [Add] outputs: [460 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_213 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 460\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_213 [Sqrt] inputs: [460 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_213 for ONNX node: Sqrt_213\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 461 for ONNX tensor: 461\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_213 [Sqrt] outputs: [461 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_214 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 455\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 461\n", "[01/17/2022-21:09:35] [V] [TRT] Div_214 [Div] inputs: [455 -> (-1, -1, 768)[FLOAT]], [461 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_214 for ONNX node: Div_214\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 462 for ONNX tensor: 462\n", "[01/17/2022-21:09:35] [V] [TRT] Div_214 [Div] outputs: [462 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_215 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 462\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.1.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_215 [Mul] inputs: [462 -> (-1, -1, 768)[FLOAT]], [encoder.layer.1.attention.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.1.attention.output.LayerNorm.weight for ONNX node: encoder.layer.1.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_215 for ONNX node: Mul_215\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 463 for ONNX tensor: 463\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_215 [Mul] outputs: [463 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_216 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 463\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.1.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_216 [Add] inputs: [463 -> (-1, -1, 768)[FLOAT]], [encoder.layer.1.attention.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.1.attention.output.LayerNorm.bias for ONNX node: encoder.layer.1.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_216 for ONNX node: Add_216\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 464 for ONNX tensor: 464\n", "[01/17/2022-21:09:35] [V] [TRT] Add_216 [Add] outputs: [464 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_217 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 464\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1663\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_217 [MatMul] inputs: [464 -> (-1, -1, 768)[FLOAT]], [1663 -> (768, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1663 for ONNX node: 1663\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_217 for ONNX node: MatMul_217\n", "[01/17/2022-21:09:35] [V] [TRT] Registering 
tensor: 466 for ONNX tensor: 466\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_217 [MatMul] outputs: [466 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_218 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.1.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 466\n", "[01/17/2022-21:09:35] [V] [TRT] Add_218 [Add] inputs: [encoder.layer.1.intermediate.dense.bias -> (3072)[FLOAT]], [466 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.1.intermediate.dense.bias for ONNX node: encoder.layer.1.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_218 for ONNX node: Add_218\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 467 for ONNX tensor: 467\n", "[01/17/2022-21:09:35] [V] [TRT] Add_218 [Add] outputs: [467 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_219 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_219 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_219 [Constant] outputs: [468 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_220 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 467\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 468\n", "[01/17/2022-21:09:35] [V] [TRT] Div_220 [Div] inputs: [467 -> (-1, -1, 3072)[FLOAT]], [468 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 468 for ONNX node: 468\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_220 for ONNX node: Div_220\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 469 for ONNX tensor: 469\n", "[01/17/2022-21:09:35] [V] [TRT] Div_220 [Div] outputs: [469 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Erf_221 [Erf]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 469\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_221 [Erf] inputs: [469 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Erf_221 for ONNX node: Erf_221\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 470 for ONNX tensor: 470\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_221 [Erf] outputs: [470 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_222 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_222 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_222 [Constant] outputs: [471 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_223 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 470\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 471\n", "[01/17/2022-21:09:35] [V] [TRT] Add_223 [Add] inputs: [470 -> (-1, -1, 3072)[FLOAT]], [471 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 471 for ONNX node: 471\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_223 for ONNX node: Add_223\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 472 for ONNX tensor: 472\n", "[01/17/2022-21:09:35] [V] [TRT] Add_223 [Add] outputs: [472 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_224 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 467\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 472\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_224 [Mul] inputs: [467 -> (-1, -1, 3072)[FLOAT]], [472 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_224 for ONNX node: 
Mul_224\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 473 for ONNX tensor: 473\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_224 [Mul] outputs: [473 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_225 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_225 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_225 [Constant] outputs: [474 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_226 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 473\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 474\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_226 [Mul] inputs: [473 -> (-1, -1, 3072)[FLOAT]], [474 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 474 for ONNX node: 474\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_226 for ONNX node: Mul_226\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 475 for ONNX tensor: 475\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_226 [Mul] outputs: [475 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_227 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 475\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1664\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_227 [MatMul] inputs: [475 -> (-1, -1, 3072)[FLOAT]], [1664 -> (3072, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1664 for ONNX node: 1664\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_227 for ONNX node: MatMul_227\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 477 for ONNX tensor: 477\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_227 [MatMul] outputs: [477 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_228 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.1.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 477\n", "[01/17/2022-21:09:35] [V] [TRT] Add_228 [Add] inputs: [encoder.layer.1.output.dense.bias -> (768)[FLOAT]], [477 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.1.output.dense.bias for ONNX node: encoder.layer.1.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_228 for ONNX node: Add_228\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 478 for ONNX tensor: 478\n", "[01/17/2022-21:09:35] [V] [TRT] Add_228 [Add] outputs: [478 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_229 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 478\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 464\n", "[01/17/2022-21:09:35] [V] [TRT] Add_229 [Add] inputs: [478 -> (-1, -1, 768)[FLOAT]], [464 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_229 for ONNX node: Add_229\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 479 for ONNX tensor: 479\n", "[01/17/2022-21:09:35] [V] [TRT] Add_229 [Add] outputs: [479 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_230 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 479\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_230 [ReduceMean] inputs: [479 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_230 for ONNX node: ReduceMean_230\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 480 for ONNX tensor: 480\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_230 
[ReduceMean] outputs: [480 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_231 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 479\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 480\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_231 [Sub] inputs: [479 -> (-1, -1, 768)[FLOAT]], [480 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_231 for ONNX node: Sub_231\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 481 for ONNX tensor: 481\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_231 [Sub] outputs: [481 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_232 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_232 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_232 [Constant] outputs: [482 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_233 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 481\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 482\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_233 [Pow] inputs: [481 -> (-1, -1, 768)[FLOAT]], [482 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 482 for ONNX node: 482\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_233 for ONNX node: Pow_233\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 483 for ONNX tensor: 483\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_233 [Pow] outputs: [483 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_234 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 483\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_234 [ReduceMean] inputs: [483 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_234 for ONNX node: ReduceMean_234\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 484 for ONNX tensor: 484\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_234 [ReduceMean] outputs: [484 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_235 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_235 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_235 [Constant] outputs: [485 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_236 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 484\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 485\n", "[01/17/2022-21:09:35] [V] [TRT] Add_236 [Add] inputs: [484 -> (-1, -1, 1)[FLOAT]], [485 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 485 for ONNX node: 485\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_236 for ONNX node: Add_236\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 486 for ONNX tensor: 486\n", "[01/17/2022-21:09:35] [V] [TRT] Add_236 [Add] outputs: [486 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_237 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 486\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_237 [Sqrt] inputs: [486 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_237 for ONNX node: Sqrt_237\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 487 for ONNX tensor: 487\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_237 [Sqrt] outputs: [487 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_238 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 481\n", "[01/17/2022-21:09:35] [V] [TRT] 
Searching for input: 487\n", "[01/17/2022-21:09:35] [V] [TRT] Div_238 [Div] inputs: [481 -> (-1, -1, 768)[FLOAT]], [487 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_238 for ONNX node: Div_238\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 488 for ONNX tensor: 488\n", "[01/17/2022-21:09:35] [V] [TRT] Div_238 [Div] outputs: [488 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_239 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 488\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.1.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_239 [Mul] inputs: [488 -> (-1, -1, 768)[FLOAT]], [encoder.layer.1.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.1.output.LayerNorm.weight for ONNX node: encoder.layer.1.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_239 for ONNX node: Mul_239\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 489 for ONNX tensor: 489\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_239 [Mul] outputs: [489 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_240 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 489\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.1.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_240 [Add] inputs: [489 -> (-1, -1, 768)[FLOAT]], [encoder.layer.1.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.1.output.LayerNorm.bias for ONNX node: encoder.layer.1.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_240 for ONNX node: Add_240\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 490 for ONNX tensor: 490\n", "[01/17/2022-21:09:35] [V] [TRT] Add_240 [Add] outputs: [490 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_241 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 490\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1665\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_241 [MatMul] inputs: [490 -> (-1, -1, 768)[FLOAT]], [1665 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1665 for ONNX node: 1665\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_241 for ONNX node: MatMul_241\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 492 for ONNX tensor: 492\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_241 [MatMul] outputs: [492 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_242 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.2.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 492\n", "[01/17/2022-21:09:35] [V] [TRT] Add_242 [Add] inputs: [encoder.layer.2.attention.self.query.bias -> (768)[FLOAT]], [492 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.2.attention.self.query.bias for ONNX node: encoder.layer.2.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_242 for ONNX node: Add_242\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 493 for ONNX tensor: 493\n", "[01/17/2022-21:09:35] [V] [TRT] Add_242 [Add] outputs: [493 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_243 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] 
Searching for input: 490\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1666\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_243 [MatMul] inputs: [490 -> (-1, -1, 768)[FLOAT]], [1666 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1666 for ONNX node: 1666\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_243 for ONNX node: MatMul_243\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 495 for ONNX tensor: 495\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_243 [MatMul] outputs: [495 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_244 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.2.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 495\n", "[01/17/2022-21:09:35] [V] [TRT] Add_244 [Add] inputs: [encoder.layer.2.attention.self.key.bias -> (768)[FLOAT]], [495 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.2.attention.self.key.bias for ONNX node: encoder.layer.2.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_244 for ONNX node: Add_244\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 496 for ONNX tensor: 496\n", "[01/17/2022-21:09:35] [V] [TRT] Add_244 [Add] outputs: [496 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_245 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 496\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_245 [Shape] inputs: [496 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_245 for ONNX node: Shape_245\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 497 for ONNX tensor: 497\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_245 [Shape] outputs: [497 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_246 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_246 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_246 [Constant] outputs: [498 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_247 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 497\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 498\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_247 [Gather] inputs: [497 -> (3)[INT32]], [498 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 498 for ONNX node: 498\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_247 for ONNX node: Gather_247\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 499 for ONNX tensor: 499\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_247 [Gather] outputs: [499 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_248 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 496\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_248 [Shape] inputs: [496 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_248 for ONNX node: Shape_248\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 500 for ONNX tensor: 500\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_248 [Shape] outputs: [500 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_249 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_249 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_249 [Constant] outputs: [501 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: 
Gather_250 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 500\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 501\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_250 [Gather] inputs: [500 -> (3)[INT32]], [501 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 501 for ONNX node: 501\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_250 for ONNX node: Gather_250\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 502 for ONNX tensor: 502\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_250 [Gather] outputs: [502 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_251 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 499\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_251 [Unsqueeze] inputs: [499 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_251 for ONNX node: Unsqueeze_251\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 505 for ONNX tensor: 505\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_251 [Unsqueeze] outputs: [505 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_252 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 502\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_252 [Unsqueeze] inputs: [502 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_252 for ONNX node: Unsqueeze_252\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 506 for ONNX tensor: 506\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_252 [Unsqueeze] outputs: [506 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_253 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 505\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 506\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1667\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1668\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_253 [Concat] inputs: [505 -> (1)[INT32]], [506 -> (1)[INT32]], [1667 -> (1)[INT32]], [1668 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1667 for ONNX node: 1667\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1668 for ONNX node: 1668\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_253 for ONNX node: Concat_253\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 509 for ONNX tensor: 509\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_253 [Concat] outputs: [509 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_254 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 496\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 509\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_254 [Reshape] inputs: [496 -> (-1, -1, 768)[FLOAT]], [509 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_254 for ONNX node: Reshape_254\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 510 for ONNX tensor: 510\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_254 [Reshape] outputs: [510 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_255 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 490\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1669\n", "[01/17/2022-21:09:35] [V] [TRT] 
MatMul_255 [MatMul] inputs: [490 -> (-1, -1, 768)[FLOAT]], [1669 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1669 for ONNX node: 1669\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_255 for ONNX node: MatMul_255\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 512 for ONNX tensor: 512\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_255 [MatMul] outputs: [512 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_256 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.2.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 512\n", "[01/17/2022-21:09:35] [V] [TRT] Add_256 [Add] inputs: [encoder.layer.2.attention.self.value.bias -> (768)[FLOAT]], [512 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.2.attention.self.value.bias for ONNX node: encoder.layer.2.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_256 for ONNX node: Add_256\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 513 for ONNX tensor: 513\n", "[01/17/2022-21:09:35] [V] [TRT] Add_256 [Add] outputs: [513 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_257 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 513\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_257 [Shape] inputs: [513 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_257 for ONNX node: Shape_257\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 514 for ONNX tensor: 514\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_257 [Shape] outputs: [514 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_258 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_258 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_258 [Constant] outputs: [515 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_259 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 514\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 515\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_259 [Gather] inputs: [514 -> (3)[INT32]], [515 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 515 for ONNX node: 515\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_259 for ONNX node: Gather_259\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 516 for ONNX tensor: 516\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_259 [Gather] outputs: [516 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_260 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 513\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_260 [Shape] inputs: [513 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_260 for ONNX node: Shape_260\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 517 for ONNX tensor: 517\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_260 [Shape] outputs: [517 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_261 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_261 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_261 [Constant] outputs: [518 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_262 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 517\n", "[01/17/2022-21:09:35] [V] [TRT] 
Searching for input: 518\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_262 [Gather] inputs: [517 -> (3)[INT32]], [518 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 518 for ONNX node: 518\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_262 for ONNX node: Gather_262\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 519 for ONNX tensor: 519\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_262 [Gather] outputs: [519 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_263 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 516\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_263 [Unsqueeze] inputs: [516 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_263 for ONNX node: Unsqueeze_263\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 522 for ONNX tensor: 522\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_263 [Unsqueeze] outputs: [522 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_264 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 519\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_264 [Unsqueeze] inputs: [519 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_264 for ONNX node: Unsqueeze_264\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 523 for ONNX tensor: 523\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_264 [Unsqueeze] outputs: [523 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_265 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 522\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 523\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1670\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1671\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_265 [Concat] inputs: [522 -> (1)[INT32]], [523 -> (1)[INT32]], [1670 -> (1)[INT32]], [1671 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1670 for ONNX node: 1670\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1671 for ONNX node: 1671\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_265 for ONNX node: Concat_265\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 526 for ONNX tensor: 526\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_265 [Concat] outputs: [526 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_266 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 513\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 526\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_266 [Reshape] inputs: [513 -> (-1, -1, 768)[FLOAT]], [526 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_266 for ONNX node: Reshape_266\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 527 for ONNX tensor: 527\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_266 [Reshape] outputs: [527 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_267 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 527\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_267 [Transpose] inputs: [527 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_267 for ONNX node: Transpose_267\n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 528 for ONNX tensor: 528\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_267 [Transpose] outputs: [528 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_268 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 493\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_268 [Shape] inputs: [493 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_268 for ONNX node: Shape_268\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 529 for ONNX tensor: 529\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_268 [Shape] outputs: [529 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_269 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_269 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_269 [Constant] outputs: [530 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_270 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 529\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 530\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_270 [Gather] inputs: [529 -> (3)[INT32]], [530 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 530 for ONNX node: 530\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_270 for ONNX node: Gather_270\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 531 for ONNX tensor: 531\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_270 [Gather] outputs: [531 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_271 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 493\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_271 [Shape] inputs: [493 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_271 for ONNX node: Shape_271\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 532 for ONNX tensor: 532\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_271 [Shape] outputs: [532 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_272 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_272 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_272 [Constant] outputs: [533 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_273 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 532\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 533\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_273 [Gather] inputs: [532 -> (3)[INT32]], [533 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 533 for ONNX node: 533\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_273 for ONNX node: Gather_273\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 534 for ONNX tensor: 534\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_273 [Gather] outputs: [534 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_274 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 531\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_274 [Unsqueeze] inputs: [531 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_274 for ONNX node: Unsqueeze_274\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 537 for ONNX tensor: 537\n", "[01/17/2022-21:09:35] [V] 
[TRT] Unsqueeze_274 [Unsqueeze] outputs: [537 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_275 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 534\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_275 [Unsqueeze] inputs: [534 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_275 for ONNX node: Unsqueeze_275\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 538 for ONNX tensor: 538\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_275 [Unsqueeze] outputs: [538 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_276 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 537\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 538\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1672\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1673\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_276 [Concat] inputs: [537 -> (1)[INT32]], [538 -> (1)[INT32]], [1672 -> (1)[INT32]], [1673 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1672 for ONNX node: 1672\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1673 for ONNX node: 1673\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_276 for ONNX node: Concat_276\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 541 for ONNX tensor: 541\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_276 [Concat] outputs: [541 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_277 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 493\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 541\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_277 [Reshape] inputs: [493 -> (-1, -1, 768)[FLOAT]], [541 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_277 for ONNX node: Reshape_277\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 542 for ONNX tensor: 542\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_277 [Reshape] outputs: [542 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_278 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 542\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_278 [Transpose] inputs: [542 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_278 for ONNX node: Transpose_278\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 543 for ONNX tensor: 543\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_278 [Transpose] outputs: [543 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_279 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 510\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_279 [Transpose] inputs: [510 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_279 for ONNX node: Transpose_279\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 544 for ONNX tensor: 544\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_279 [Transpose] outputs: [544 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_280 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 543\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 544\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_280 [MatMul] inputs: [543 -> (-1, 12, -1, 64)[FLOAT]], [544 -> (-1, 12, 64, -1)[FLOAT]], \n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_280 for ONNX node: MatMul_280\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 545 for ONNX tensor: 545\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_280 [MatMul] outputs: [545 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_281 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_281 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_281 [Constant] outputs: [546 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_282 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 545\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 546\n", "[01/17/2022-21:09:35] [V] [TRT] Div_282 [Div] inputs: [545 -> (-1, 12, -1, -1)[FLOAT]], [546 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 546 for ONNX node: 546\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_282 for ONNX node: Div_282\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 547 for ONNX tensor: 547\n", "[01/17/2022-21:09:35] [V] [TRT] Div_282 [Div] outputs: [547 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_283 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 547\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 234\n", "[01/17/2022-21:09:35] [V] [TRT] Add_283 [Add] inputs: [547 -> (-1, 12, -1, -1)[FLOAT]], [234 -> (-1, 1, 1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_283 for ONNX node: Add_283\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 548 for ONNX tensor: 548\n", "[01/17/2022-21:09:35] [V] [TRT] Add_283 [Add] outputs: [548 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Softmax_284 [Softmax]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 548\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_284 [Softmax] inputs: [548 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Softmax_284 for ONNX node: Softmax_284\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 549 for ONNX tensor: 549\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_284 [Softmax] outputs: [549 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_285 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 549\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 528\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_285 [MatMul] inputs: [549 -> (-1, 12, -1, -1)[FLOAT]], [528 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_285 for ONNX node: MatMul_285\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 550 for ONNX tensor: 550\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_285 [MatMul] outputs: [550 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_286 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 550\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_286 [Transpose] inputs: [550 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_286 for ONNX node: Transpose_286\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 551 for ONNX tensor: 551\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_286 [Transpose] outputs: [551 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_287 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 551\n", "[01/17/2022-21:09:35] [V] 
[TRT] Shape_287 [Shape] inputs: [551 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_287 for ONNX node: Shape_287\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 552 for ONNX tensor: 552\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_287 [Shape] outputs: [552 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_288 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_288 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_288 [Constant] outputs: [553 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_289 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 552\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 553\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_289 [Gather] inputs: [552 -> (4)[INT32]], [553 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 553 for ONNX node: 553\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_289 for ONNX node: Gather_289\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 554 for ONNX tensor: 554\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_289 [Gather] outputs: [554 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_290 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 551\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_290 [Shape] inputs: [551 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_290 for ONNX node: Shape_290\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 555 for ONNX tensor: 555\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_290 [Shape] outputs: [555 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_291 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_291 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_291 [Constant] outputs: [556 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_292 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 555\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 556\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_292 [Gather] inputs: [555 -> (4)[INT32]], [556 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 556 for ONNX node: 556\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_292 for ONNX node: Gather_292\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 557 for ONNX tensor: 557\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_292 [Gather] outputs: [557 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_293 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 554\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_293 [Unsqueeze] inputs: [554 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_293 for ONNX node: Unsqueeze_293\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 559 for ONNX tensor: 559\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_293 [Unsqueeze] outputs: [559 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_294 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 557\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_294 [Unsqueeze] inputs: [557 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original 
shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_294 for ONNX node: Unsqueeze_294\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 560 for ONNX tensor: 560\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_294 [Unsqueeze] outputs: [560 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_295 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 559\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 560\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1674\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_295 [Concat] inputs: [559 -> (1)[INT32]], [560 -> (1)[INT32]], [1674 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1674 for ONNX node: 1674\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_295 for ONNX node: Concat_295\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 562 for ONNX tensor: 562\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_295 [Concat] outputs: [562 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_296 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 551\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 562\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_296 [Reshape] inputs: [551 -> (-1, -1, 12, 64)[FLOAT]], [562 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_296 for ONNX node: Reshape_296\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 563 for ONNX tensor: 563\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_296 [Reshape] outputs: [563 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_297 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 563\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1675\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_297 [MatMul] inputs: [563 -> (-1, -1, 768)[FLOAT]], [1675 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1675 for ONNX node: 1675\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_297 for ONNX node: MatMul_297\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 565 for ONNX tensor: 565\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_297 [MatMul] outputs: [565 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_298 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.2.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 565\n", "[01/17/2022-21:09:35] [V] [TRT] Add_298 [Add] inputs: [encoder.layer.2.attention.output.dense.bias -> (768)[FLOAT]], [565 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.2.attention.output.dense.bias for ONNX node: encoder.layer.2.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_298 for ONNX node: Add_298\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 566 for ONNX tensor: 566\n", "[01/17/2022-21:09:35] [V] [TRT] Add_298 [Add] outputs: [566 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_299 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 566\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 490\n", "[01/17/2022-21:09:35] [V] [TRT] Add_299 [Add] inputs: [566 -> (-1, -1, 768)[FLOAT]], [490 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_299 for ONNX node: Add_299\n", "[01/17/2022-21:09:35] [V] [TRT] 
Registering tensor: 567 for ONNX tensor: 567\n", "[01/17/2022-21:09:35] [V] [TRT] Add_299 [Add] outputs: [567 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_300 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 567\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_300 [ReduceMean] inputs: [567 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_300 for ONNX node: ReduceMean_300\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 568 for ONNX tensor: 568\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_300 [ReduceMean] outputs: [568 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_301 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 567\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 568\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_301 [Sub] inputs: [567 -> (-1, -1, 768)[FLOAT]], [568 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_301 for ONNX node: Sub_301\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 569 for ONNX tensor: 569\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_301 [Sub] outputs: [569 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_302 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_302 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_302 [Constant] outputs: [570 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_303 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 569\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 570\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_303 [Pow] inputs: [569 -> (-1, -1, 768)[FLOAT]], [570 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 570 for ONNX node: 570\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_303 for ONNX node: Pow_303\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 571 for ONNX tensor: 571\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_303 [Pow] outputs: [571 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_304 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 571\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_304 [ReduceMean] inputs: [571 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_304 for ONNX node: ReduceMean_304\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 572 for ONNX tensor: 572\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_304 [ReduceMean] outputs: [572 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_305 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_305 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_305 [Constant] outputs: [573 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_306 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 572\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 573\n", "[01/17/2022-21:09:35] [V] [TRT] Add_306 [Add] inputs: [572 -> (-1, -1, 1)[FLOAT]], [573 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 573 for ONNX node: 573\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_306 for ONNX node: Add_306\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 574 for ONNX tensor: 574\n", "[01/17/2022-21:09:35] [V] [TRT] Add_306 [Add] outputs: [574 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] 
[TRT] Parsing node: Sqrt_307 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 574\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_307 [Sqrt] inputs: [574 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_307 for ONNX node: Sqrt_307\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 575 for ONNX tensor: 575\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_307 [Sqrt] outputs: [575 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_308 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 569\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 575\n", "[01/17/2022-21:09:35] [V] [TRT] Div_308 [Div] inputs: [569 -> (-1, -1, 768)[FLOAT]], [575 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_308 for ONNX node: Div_308\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 576 for ONNX tensor: 576\n", "[01/17/2022-21:09:35] [V] [TRT] Div_308 [Div] outputs: [576 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_309 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 576\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.2.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_309 [Mul] inputs: [576 -> (-1, -1, 768)[FLOAT]], [encoder.layer.2.attention.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.2.attention.output.LayerNorm.weight for ONNX node: encoder.layer.2.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_309 for ONNX node: Mul_309\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 577 for ONNX tensor: 577\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_309 [Mul] outputs: [577 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_310 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 577\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.2.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_310 [Add] inputs: [577 -> (-1, -1, 768)[FLOAT]], [encoder.layer.2.attention.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.2.attention.output.LayerNorm.bias for ONNX node: encoder.layer.2.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_310 for ONNX node: Add_310\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 578 for ONNX tensor: 578\n", "[01/17/2022-21:09:35] [V] [TRT] Add_310 [Add] outputs: [578 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_311 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 578\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1676\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_311 [MatMul] inputs: [578 -> (-1, -1, 768)[FLOAT]], [1676 -> (768, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1676 for ONNX node: 1676\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_311 for ONNX node: MatMul_311\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 580 for ONNX tensor: 580\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_311 [MatMul] outputs: [580 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_312 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.2.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] 
Searching for input: 580\n", "[01/17/2022-21:09:35] [V] [TRT] Add_312 [Add] inputs: [encoder.layer.2.intermediate.dense.bias -> (3072)[FLOAT]], [580 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.2.intermediate.dense.bias for ONNX node: encoder.layer.2.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_312 for ONNX node: Add_312\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 581 for ONNX tensor: 581\n", "[01/17/2022-21:09:35] [V] [TRT] Add_312 [Add] outputs: [581 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_313 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_313 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_313 [Constant] outputs: [582 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_314 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 581\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 582\n", "[01/17/2022-21:09:35] [V] [TRT] Div_314 [Div] inputs: [581 -> (-1, -1, 3072)[FLOAT]], [582 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 582 for ONNX node: 582\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_314 for ONNX node: Div_314\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 583 for ONNX tensor: 583\n", "[01/17/2022-21:09:35] [V] [TRT] Div_314 [Div] outputs: [583 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Erf_315 [Erf]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 583\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_315 [Erf] inputs: [583 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Erf_315 for ONNX node: Erf_315\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 584 for ONNX tensor: 584\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_315 [Erf] outputs: [584 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_316 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_316 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_316 [Constant] outputs: [585 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_317 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 584\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 585\n", "[01/17/2022-21:09:35] [V] [TRT] Add_317 [Add] inputs: [584 -> (-1, -1, 3072)[FLOAT]], [585 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 585 for ONNX node: 585\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_317 for ONNX node: Add_317\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 586 for ONNX tensor: 586\n", "[01/17/2022-21:09:35] [V] [TRT] Add_317 [Add] outputs: [586 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_318 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 581\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 586\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_318 [Mul] inputs: [581 -> (-1, -1, 3072)[FLOAT]], [586 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_318 for ONNX node: Mul_318\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 587 for ONNX tensor: 587\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_318 [Mul] outputs: [587 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_319 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_319 [Constant] inputs: \n", 
"[01/17/2022-21:09:35] [V] [TRT] Constant_319 [Constant] outputs: [588 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_320 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 587\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 588\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_320 [Mul] inputs: [587 -> (-1, -1, 3072)[FLOAT]], [588 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 588 for ONNX node: 588\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_320 for ONNX node: Mul_320\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 589 for ONNX tensor: 589\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_320 [Mul] outputs: [589 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_321 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 589\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1677\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_321 [MatMul] inputs: [589 -> (-1, -1, 3072)[FLOAT]], [1677 -> (3072, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1677 for ONNX node: 1677\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_321 for ONNX node: MatMul_321\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 591 for ONNX tensor: 591\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_321 [MatMul] outputs: [591 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_322 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.2.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 591\n", "[01/17/2022-21:09:35] [V] [TRT] Add_322 [Add] inputs: [encoder.layer.2.output.dense.bias -> (768)[FLOAT]], [591 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.2.output.dense.bias for ONNX node: encoder.layer.2.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_322 for ONNX node: Add_322\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 592 for ONNX tensor: 592\n", "[01/17/2022-21:09:35] [V] [TRT] Add_322 [Add] outputs: [592 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_323 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 592\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 578\n", "[01/17/2022-21:09:35] [V] [TRT] Add_323 [Add] inputs: [592 -> (-1, -1, 768)[FLOAT]], [578 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_323 for ONNX node: Add_323\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 593 for ONNX tensor: 593\n", "[01/17/2022-21:09:35] [V] [TRT] Add_323 [Add] outputs: [593 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_324 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 593\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_324 [ReduceMean] inputs: [593 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_324 for ONNX node: ReduceMean_324\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 594 for ONNX tensor: 594\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_324 [ReduceMean] outputs: [594 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_325 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 593\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 594\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_325 [Sub] inputs: [593 -> (-1, -1, 768)[FLOAT]], 
[594 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_325 for ONNX node: Sub_325\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 595 for ONNX tensor: 595\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_325 [Sub] outputs: [595 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_326 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_326 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_326 [Constant] outputs: [596 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_327 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 595\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 596\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_327 [Pow] inputs: [595 -> (-1, -1, 768)[FLOAT]], [596 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 596 for ONNX node: 596\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_327 for ONNX node: Pow_327\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 597 for ONNX tensor: 597\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_327 [Pow] outputs: [597 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_328 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 597\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_328 [ReduceMean] inputs: [597 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_328 for ONNX node: ReduceMean_328\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 598 for ONNX tensor: 598\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_328 [ReduceMean] outputs: [598 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_329 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_329 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_329 [Constant] outputs: [599 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_330 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 598\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 599\n", "[01/17/2022-21:09:35] [V] [TRT] Add_330 [Add] inputs: [598 -> (-1, -1, 1)[FLOAT]], [599 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 599 for ONNX node: 599\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_330 for ONNX node: Add_330\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 600 for ONNX tensor: 600\n", "[01/17/2022-21:09:35] [V] [TRT] Add_330 [Add] outputs: [600 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_331 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 600\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_331 [Sqrt] inputs: [600 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_331 for ONNX node: Sqrt_331\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 601 for ONNX tensor: 601\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_331 [Sqrt] outputs: [601 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_332 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 595\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 601\n", "[01/17/2022-21:09:35] [V] [TRT] Div_332 [Div] inputs: [595 -> (-1, -1, 768)[FLOAT]], [601 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_332 for ONNX node: Div_332\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 602 for ONNX tensor: 602\n", 
"[01/17/2022-21:09:35] [V] [TRT] Div_332 [Div] outputs: [602 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_333 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 602\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.2.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_333 [Mul] inputs: [602 -> (-1, -1, 768)[FLOAT]], [encoder.layer.2.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.2.output.LayerNorm.weight for ONNX node: encoder.layer.2.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_333 for ONNX node: Mul_333\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 603 for ONNX tensor: 603\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_333 [Mul] outputs: [603 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_334 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 603\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.2.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_334 [Add] inputs: [603 -> (-1, -1, 768)[FLOAT]], [encoder.layer.2.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.2.output.LayerNorm.bias for ONNX node: encoder.layer.2.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_334 for ONNX node: Add_334\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 604 for ONNX tensor: 604\n", "[01/17/2022-21:09:35] [V] [TRT] Add_334 [Add] outputs: [604 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_335 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 604\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1678\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_335 [MatMul] inputs: [604 -> (-1, -1, 768)[FLOAT]], [1678 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1678 for ONNX node: 1678\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_335 for ONNX node: MatMul_335\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 606 for ONNX tensor: 606\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_335 [MatMul] outputs: [606 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_336 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.3.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 606\n", "[01/17/2022-21:09:35] [V] [TRT] Add_336 [Add] inputs: [encoder.layer.3.attention.self.query.bias -> (768)[FLOAT]], [606 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.3.attention.self.query.bias for ONNX node: encoder.layer.3.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_336 for ONNX node: Add_336\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 607 for ONNX tensor: 607\n", "[01/17/2022-21:09:35] [V] [TRT] Add_336 [Add] outputs: [607 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_337 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 604\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1679\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_337 [MatMul] inputs: [604 -> (-1, -1, 768)[FLOAT]], [1679 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1679 for ONNX node: 1679\n", "[01/17/2022-21:09:35] 
[V] [TRT] Registering layer: MatMul_337 for ONNX node: MatMul_337\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 609 for ONNX tensor: 609\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_337 [MatMul] outputs: [609 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_338 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.3.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 609\n", "[01/17/2022-21:09:35] [V] [TRT] Add_338 [Add] inputs: [encoder.layer.3.attention.self.key.bias -> (768)[FLOAT]], [609 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.3.attention.self.key.bias for ONNX node: encoder.layer.3.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_338 for ONNX node: Add_338\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 610 for ONNX tensor: 610\n", "[01/17/2022-21:09:35] [V] [TRT] Add_338 [Add] outputs: [610 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_339 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 610\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_339 [Shape] inputs: [610 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_339 for ONNX node: Shape_339\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 611 for ONNX tensor: 611\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_339 [Shape] outputs: [611 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_340 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_340 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_340 [Constant] outputs: [612 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_341 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 611\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 612\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_341 [Gather] inputs: [611 -> (3)[INT32]], [612 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 612 for ONNX node: 612\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_341 for ONNX node: Gather_341\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 613 for ONNX tensor: 613\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_341 [Gather] outputs: [613 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_342 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 610\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_342 [Shape] inputs: [610 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_342 for ONNX node: Shape_342\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 614 for ONNX tensor: 614\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_342 [Shape] outputs: [614 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_343 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_343 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_343 [Constant] outputs: [615 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_344 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 614\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 615\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_344 [Gather] inputs: [614 -> (3)[INT32]], [615 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 615 for ONNX node: 
615\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_344 for ONNX node: Gather_344\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 616 for ONNX tensor: 616\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_344 [Gather] outputs: [616 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_345 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 613\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_345 [Unsqueeze] inputs: [613 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_345 for ONNX node: Unsqueeze_345\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 619 for ONNX tensor: 619\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_345 [Unsqueeze] outputs: [619 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_346 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 616\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_346 [Unsqueeze] inputs: [616 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_346 for ONNX node: Unsqueeze_346\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 620 for ONNX tensor: 620\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_346 [Unsqueeze] outputs: [620 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_347 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 619\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 620\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1680\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1681\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_347 [Concat] inputs: [619 -> (1)[INT32]], [620 -> (1)[INT32]], [1680 -> (1)[INT32]], [1681 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1680 for ONNX node: 1680\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1681 for ONNX node: 1681\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_347 for ONNX node: Concat_347\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 623 for ONNX tensor: 623\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_347 [Concat] outputs: [623 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_348 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 610\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 623\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_348 [Reshape] inputs: [610 -> (-1, -1, 768)[FLOAT]], [623 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_348 for ONNX node: Reshape_348\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 624 for ONNX tensor: 624\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_348 [Reshape] outputs: [624 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_349 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 604\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1682\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_349 [MatMul] inputs: [604 -> (-1, -1, 768)[FLOAT]], [1682 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1682 for ONNX node: 1682\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_349 for ONNX node: MatMul_349\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 626 for ONNX 
tensor: 626\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_349 [MatMul] outputs: [626 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_350 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.3.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 626\n", "[01/17/2022-21:09:35] [V] [TRT] Add_350 [Add] inputs: [encoder.layer.3.attention.self.value.bias -> (768)[FLOAT]], [626 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.3.attention.self.value.bias for ONNX node: encoder.layer.3.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_350 for ONNX node: Add_350\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 627 for ONNX tensor: 627\n", "[01/17/2022-21:09:35] [V] [TRT] Add_350 [Add] outputs: [627 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_351 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 627\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_351 [Shape] inputs: [627 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_351 for ONNX node: Shape_351\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 628 for ONNX tensor: 628\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_351 [Shape] outputs: [628 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_352 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_352 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_352 [Constant] outputs: [629 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_353 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 628\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 629\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_353 [Gather] inputs: [628 -> (3)[INT32]], [629 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 629 for ONNX node: 629\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_353 for ONNX node: Gather_353\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 630 for ONNX tensor: 630\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_353 [Gather] outputs: [630 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_354 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 627\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_354 [Shape] inputs: [627 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_354 for ONNX node: Shape_354\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 631 for ONNX tensor: 631\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_354 [Shape] outputs: [631 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_355 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_355 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_355 [Constant] outputs: [632 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_356 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 631\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 632\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_356 [Gather] inputs: [631 -> (3)[INT32]], [632 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 632 for ONNX node: 632\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_356 
for ONNX node: Gather_356\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 633 for ONNX tensor: 633\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_356 [Gather] outputs: [633 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_357 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 630\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_357 [Unsqueeze] inputs: [630 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_357 for ONNX node: Unsqueeze_357\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 636 for ONNX tensor: 636\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_357 [Unsqueeze] outputs: [636 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_358 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 633\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_358 [Unsqueeze] inputs: [633 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_358 for ONNX node: Unsqueeze_358\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 637 for ONNX tensor: 637\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_358 [Unsqueeze] outputs: [637 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_359 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 636\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 637\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1683\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1684\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_359 [Concat] inputs: [636 -> (1)[INT32]], [637 -> (1)[INT32]], [1683 -> (1)[INT32]], [1684 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1683 for ONNX node: 1683\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1684 for ONNX node: 1684\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_359 for ONNX node: Concat_359\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 640 for ONNX tensor: 640\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_359 [Concat] outputs: [640 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_360 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 627\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 640\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_360 [Reshape] inputs: [627 -> (-1, -1, 768)[FLOAT]], [640 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_360 for ONNX node: Reshape_360\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 641 for ONNX tensor: 641\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_360 [Reshape] outputs: [641 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_361 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 641\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_361 [Transpose] inputs: [641 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_361 for ONNX node: Transpose_361\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 642 for ONNX tensor: 642\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_361 [Transpose] outputs: [642 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_362 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 607\n", "[01/17/2022-21:09:35] [V] 
[TRT] Shape_362 [Shape] inputs: [607 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_362 for ONNX node: Shape_362\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 643 for ONNX tensor: 643\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_362 [Shape] outputs: [643 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_363 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_363 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_363 [Constant] outputs: [644 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_364 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 643\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 644\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_364 [Gather] inputs: [643 -> (3)[INT32]], [644 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 644 for ONNX node: 644\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_364 for ONNX node: Gather_364\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 645 for ONNX tensor: 645\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_364 [Gather] outputs: [645 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_365 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 607\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_365 [Shape] inputs: [607 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_365 for ONNX node: Shape_365\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 646 for ONNX tensor: 646\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_365 [Shape] outputs: [646 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_366 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_366 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_366 [Constant] outputs: [647 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_367 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 646\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 647\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_367 [Gather] inputs: [646 -> (3)[INT32]], [647 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 647 for ONNX node: 647\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_367 for ONNX node: Gather_367\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 648 for ONNX tensor: 648\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_367 [Gather] outputs: [648 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_368 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 645\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_368 [Unsqueeze] inputs: [645 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_368 for ONNX node: Unsqueeze_368\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 651 for ONNX tensor: 651\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_368 [Unsqueeze] outputs: [651 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_369 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 648\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_369 [Unsqueeze] inputs: [648 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), 
unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_369 for ONNX node: Unsqueeze_369\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 652 for ONNX tensor: 652\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_369 [Unsqueeze] outputs: [652 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_370 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 651\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 652\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1685\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1686\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_370 [Concat] inputs: [651 -> (1)[INT32]], [652 -> (1)[INT32]], [1685 -> (1)[INT32]], [1686 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1685 for ONNX node: 1685\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1686 for ONNX node: 1686\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_370 for ONNX node: Concat_370\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 655 for ONNX tensor: 655\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_370 [Concat] outputs: [655 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_371 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 607\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 655\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_371 [Reshape] inputs: [607 -> (-1, -1, 768)[FLOAT]], [655 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_371 for ONNX node: Reshape_371\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 656 for ONNX tensor: 656\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_371 [Reshape] outputs: [656 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_372 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 656\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_372 [Transpose] inputs: [656 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_372 for ONNX node: Transpose_372\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 657 for ONNX tensor: 657\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_372 [Transpose] outputs: [657 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_373 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 624\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_373 [Transpose] inputs: [624 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_373 for ONNX node: Transpose_373\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 658 for ONNX tensor: 658\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_373 [Transpose] outputs: [658 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_374 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 657\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 658\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_374 [MatMul] inputs: [657 -> (-1, 12, -1, 64)[FLOAT]], [658 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_374 for ONNX node: MatMul_374\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 659 for ONNX tensor: 659\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_374 [MatMul] outputs: [659 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_375 [Constant]\n", 
"[01/17/2022-21:09:35] [V] [TRT] Constant_375 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_375 [Constant] outputs: [660 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_376 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 659\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 660\n", "[01/17/2022-21:09:35] [V] [TRT] Div_376 [Div] inputs: [659 -> (-1, 12, -1, -1)[FLOAT]], [660 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 660 for ONNX node: 660\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_376 for ONNX node: Div_376\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 661 for ONNX tensor: 661\n", "[01/17/2022-21:09:35] [V] [TRT] Div_376 [Div] outputs: [661 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_377 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 661\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 234\n", "[01/17/2022-21:09:35] [V] [TRT] Add_377 [Add] inputs: [661 -> (-1, 12, -1, -1)[FLOAT]], [234 -> (-1, 1, 1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_377 for ONNX node: Add_377\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 662 for ONNX tensor: 662\n", "[01/17/2022-21:09:35] [V] [TRT] Add_377 [Add] outputs: [662 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Softmax_378 [Softmax]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 662\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_378 [Softmax] inputs: [662 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Softmax_378 for ONNX node: Softmax_378\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 663 for ONNX tensor: 663\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_378 [Softmax] outputs: [663 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_379 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 663\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 642\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_379 [MatMul] inputs: [663 -> (-1, 12, -1, -1)[FLOAT]], [642 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_379 for ONNX node: MatMul_379\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 664 for ONNX tensor: 664\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_379 [MatMul] outputs: [664 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_380 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 664\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_380 [Transpose] inputs: [664 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_380 for ONNX node: Transpose_380\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 665 for ONNX tensor: 665\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_380 [Transpose] outputs: [665 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_381 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 665\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_381 [Shape] inputs: [665 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_381 for ONNX node: Shape_381\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 666 for ONNX tensor: 666\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_381 [Shape] outputs: [666 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] 
[V] [TRT] Parsing node: Constant_382 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_382 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_382 [Constant] outputs: [667 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_383 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 666\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 667\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_383 [Gather] inputs: [666 -> (4)[INT32]], [667 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 667 for ONNX node: 667\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_383 for ONNX node: Gather_383\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 668 for ONNX tensor: 668\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_383 [Gather] outputs: [668 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_384 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 665\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_384 [Shape] inputs: [665 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_384 for ONNX node: Shape_384\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 669 for ONNX tensor: 669\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_384 [Shape] outputs: [669 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_385 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_385 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_385 [Constant] outputs: [670 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_386 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 669\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 670\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_386 [Gather] inputs: [669 -> (4)[INT32]], [670 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 670 for ONNX node: 670\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_386 for ONNX node: Gather_386\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 671 for ONNX tensor: 671\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_386 [Gather] outputs: [671 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_387 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 668\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_387 [Unsqueeze] inputs: [668 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_387 for ONNX node: Unsqueeze_387\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 673 for ONNX tensor: 673\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_387 [Unsqueeze] outputs: [673 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_388 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 671\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_388 [Unsqueeze] inputs: [671 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_388 for ONNX node: Unsqueeze_388\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 674 for ONNX tensor: 674\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_388 [Unsqueeze] outputs: [674 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: 
Concat_389 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 673\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 674\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1687\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_389 [Concat] inputs: [673 -> (1)[INT32]], [674 -> (1)[INT32]], [1687 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1687 for ONNX node: 1687\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_389 for ONNX node: Concat_389\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 676 for ONNX tensor: 676\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_389 [Concat] outputs: [676 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_390 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 665\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 676\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_390 [Reshape] inputs: [665 -> (-1, -1, 12, 64)[FLOAT]], [676 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_390 for ONNX node: Reshape_390\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 677 for ONNX tensor: 677\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_390 [Reshape] outputs: [677 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_391 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 677\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1688\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_391 [MatMul] inputs: [677 -> (-1, -1, 768)[FLOAT]], [1688 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1688 for ONNX node: 1688\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_391 for ONNX node: MatMul_391\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 679 for ONNX tensor: 679\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_391 [MatMul] outputs: [679 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_392 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.3.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 679\n", "[01/17/2022-21:09:35] [V] [TRT] Add_392 [Add] inputs: [encoder.layer.3.attention.output.dense.bias -> (768)[FLOAT]], [679 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.3.attention.output.dense.bias for ONNX node: encoder.layer.3.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_392 for ONNX node: Add_392\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 680 for ONNX tensor: 680\n", "[01/17/2022-21:09:35] [V] [TRT] Add_392 [Add] outputs: [680 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_393 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 680\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 604\n", "[01/17/2022-21:09:35] [V] [TRT] Add_393 [Add] inputs: [680 -> (-1, -1, 768)[FLOAT]], [604 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_393 for ONNX node: Add_393\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 681 for ONNX tensor: 681\n", "[01/17/2022-21:09:35] [V] [TRT] Add_393 [Add] outputs: [681 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_394 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 681\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_394 [ReduceMean] inputs: [681 -> 
(-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_394 for ONNX node: ReduceMean_394\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 682 for ONNX tensor: 682\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_394 [ReduceMean] outputs: [682 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_395 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 681\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 682\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_395 [Sub] inputs: [681 -> (-1, -1, 768)[FLOAT]], [682 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_395 for ONNX node: Sub_395\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 683 for ONNX tensor: 683\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_395 [Sub] outputs: [683 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_396 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_396 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_396 [Constant] outputs: [684 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_397 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 683\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 684\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_397 [Pow] inputs: [683 -> (-1, -1, 768)[FLOAT]], [684 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 684 for ONNX node: 684\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_397 for ONNX node: Pow_397\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 685 for ONNX tensor: 685\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_397 [Pow] outputs: [685 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_398 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 685\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_398 [ReduceMean] inputs: [685 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_398 for ONNX node: ReduceMean_398\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 686 for ONNX tensor: 686\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_398 [ReduceMean] outputs: [686 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_399 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_399 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_399 [Constant] outputs: [687 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_400 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 686\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 687\n", "[01/17/2022-21:09:35] [V] [TRT] Add_400 [Add] inputs: [686 -> (-1, -1, 1)[FLOAT]], [687 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 687 for ONNX node: 687\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_400 for ONNX node: Add_400\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 688 for ONNX tensor: 688\n", "[01/17/2022-21:09:35] [V] [TRT] Add_400 [Add] outputs: [688 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_401 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 688\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_401 [Sqrt] inputs: [688 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_401 for ONNX node: Sqrt_401\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 689 for ONNX tensor: 
689\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_401 [Sqrt] outputs: [689 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_402 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 683\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 689\n", "[01/17/2022-21:09:35] [V] [TRT] Div_402 [Div] inputs: [683 -> (-1, -1, 768)[FLOAT]], [689 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_402 for ONNX node: Div_402\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 690 for ONNX tensor: 690\n", "[01/17/2022-21:09:35] [V] [TRT] Div_402 [Div] outputs: [690 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_403 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 690\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.3.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_403 [Mul] inputs: [690 -> (-1, -1, 768)[FLOAT]], [encoder.layer.3.attention.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.3.attention.output.LayerNorm.weight for ONNX node: encoder.layer.3.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_403 for ONNX node: Mul_403\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 691 for ONNX tensor: 691\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_403 [Mul] outputs: [691 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_404 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 691\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.3.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_404 [Add] inputs: [691 -> (-1, -1, 768)[FLOAT]], [encoder.layer.3.attention.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.3.attention.output.LayerNorm.bias for ONNX node: encoder.layer.3.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_404 for ONNX node: Add_404\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 692 for ONNX tensor: 692\n", "[01/17/2022-21:09:35] [V] [TRT] Add_404 [Add] outputs: [692 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_405 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 692\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1689\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_405 [MatMul] inputs: [692 -> (-1, -1, 768)[FLOAT]], [1689 -> (768, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1689 for ONNX node: 1689\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_405 for ONNX node: MatMul_405\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 694 for ONNX tensor: 694\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_405 [MatMul] outputs: [694 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_406 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.3.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 694\n", "[01/17/2022-21:09:35] [V] [TRT] Add_406 [Add] inputs: [encoder.layer.3.intermediate.dense.bias -> (3072)[FLOAT]], [694 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.3.intermediate.dense.bias for ONNX node: encoder.layer.3.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] 
Registering layer: Add_406 for ONNX node: Add_406\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 695 for ONNX tensor: 695\n", "[01/17/2022-21:09:35] [V] [TRT] Add_406 [Add] outputs: [695 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_407 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_407 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_407 [Constant] outputs: [696 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_408 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 695\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 696\n", "[01/17/2022-21:09:35] [V] [TRT] Div_408 [Div] inputs: [695 -> (-1, -1, 3072)[FLOAT]], [696 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 696 for ONNX node: 696\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_408 for ONNX node: Div_408\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 697 for ONNX tensor: 697\n", "[01/17/2022-21:09:35] [V] [TRT] Div_408 [Div] outputs: [697 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Erf_409 [Erf]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 697\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_409 [Erf] inputs: [697 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Erf_409 for ONNX node: Erf_409\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 698 for ONNX tensor: 698\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_409 [Erf] outputs: [698 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_410 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_410 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_410 [Constant] outputs: [699 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_411 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 698\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 699\n", "[01/17/2022-21:09:35] [V] [TRT] Add_411 [Add] inputs: [698 -> (-1, -1, 3072)[FLOAT]], [699 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 699 for ONNX node: 699\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_411 for ONNX node: Add_411\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 700 for ONNX tensor: 700\n", "[01/17/2022-21:09:35] [V] [TRT] Add_411 [Add] outputs: [700 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_412 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 695\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 700\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_412 [Mul] inputs: [695 -> (-1, -1, 3072)[FLOAT]], [700 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_412 for ONNX node: Mul_412\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 701 for ONNX tensor: 701\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_412 [Mul] outputs: [701 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_413 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_413 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_413 [Constant] outputs: [702 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_414 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 701\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 702\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_414 [Mul] inputs: [701 -> (-1, -1, 3072)[FLOAT]], 
[702 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 702 for ONNX node: 702\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_414 for ONNX node: Mul_414\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 703 for ONNX tensor: 703\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_414 [Mul] outputs: [703 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_415 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 703\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1690\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_415 [MatMul] inputs: [703 -> (-1, -1, 3072)[FLOAT]], [1690 -> (3072, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1690 for ONNX node: 1690\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_415 for ONNX node: MatMul_415\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 705 for ONNX tensor: 705\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_415 [MatMul] outputs: [705 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_416 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.3.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 705\n", "[01/17/2022-21:09:35] [V] [TRT] Add_416 [Add] inputs: [encoder.layer.3.output.dense.bias -> (768)[FLOAT]], [705 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.3.output.dense.bias for ONNX node: encoder.layer.3.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_416 for ONNX node: Add_416\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 706 for ONNX tensor: 706\n", "[01/17/2022-21:09:35] [V] [TRT] Add_416 [Add] outputs: [706 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_417 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 706\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 692\n", "[01/17/2022-21:09:35] [V] [TRT] Add_417 [Add] inputs: [706 -> (-1, -1, 768)[FLOAT]], [692 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_417 for ONNX node: Add_417\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 707 for ONNX tensor: 707\n", "[01/17/2022-21:09:35] [V] [TRT] Add_417 [Add] outputs: [707 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_418 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 707\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_418 [ReduceMean] inputs: [707 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_418 for ONNX node: ReduceMean_418\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 708 for ONNX tensor: 708\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_418 [ReduceMean] outputs: [708 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_419 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 707\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 708\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_419 [Sub] inputs: [707 -> (-1, -1, 768)[FLOAT]], [708 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_419 for ONNX node: Sub_419\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 709 for ONNX tensor: 709\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_419 [Sub] outputs: [709 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_420 
[Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_420 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_420 [Constant] outputs: [710 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_421 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 709\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 710\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_421 [Pow] inputs: [709 -> (-1, -1, 768)[FLOAT]], [710 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 710 for ONNX node: 710\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_421 for ONNX node: Pow_421\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 711 for ONNX tensor: 711\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_421 [Pow] outputs: [711 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_422 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 711\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_422 [ReduceMean] inputs: [711 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_422 for ONNX node: ReduceMean_422\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 712 for ONNX tensor: 712\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_422 [ReduceMean] outputs: [712 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_423 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_423 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_423 [Constant] outputs: [713 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_424 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 712\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 713\n", "[01/17/2022-21:09:35] [V] [TRT] Add_424 [Add] inputs: [712 -> (-1, -1, 1)[FLOAT]], [713 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 713 for ONNX node: 713\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_424 for ONNX node: Add_424\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 714 for ONNX tensor: 714\n", "[01/17/2022-21:09:35] [V] [TRT] Add_424 [Add] outputs: [714 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_425 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 714\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_425 [Sqrt] inputs: [714 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_425 for ONNX node: Sqrt_425\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 715 for ONNX tensor: 715\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_425 [Sqrt] outputs: [715 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_426 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 709\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 715\n", "[01/17/2022-21:09:35] [V] [TRT] Div_426 [Div] inputs: [709 -> (-1, -1, 768)[FLOAT]], [715 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_426 for ONNX node: Div_426\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 716 for ONNX tensor: 716\n", "[01/17/2022-21:09:35] [V] [TRT] Div_426 [Div] outputs: [716 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_427 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 716\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.3.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_427 [Mul] inputs: 
[716 -> (-1, -1, 768)[FLOAT]], [encoder.layer.3.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[... verbose TensorRT ONNX parser log truncated ...]\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_558 
[Concat] inputs: [879 -> (1)[INT32]], [880 -> (1)[INT32]], [1711 -> (1)[INT32]], [1712 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1711 for ONNX node: 1711\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1712 for ONNX node: 1712\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_558 for ONNX node: Concat_558\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 883 for ONNX tensor: 883\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_558 [Concat] outputs: [883 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_559 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 835\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 883\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_559 [Reshape] inputs: [835 -> (-1, -1, 768)[FLOAT]], [883 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_559 for ONNX node: Reshape_559\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 884 for ONNX tensor: 884\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_559 [Reshape] outputs: [884 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_560 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 884\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_560 [Transpose] inputs: [884 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_560 for ONNX node: Transpose_560\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 885 for ONNX tensor: 885\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_560 [Transpose] outputs: [885 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_561 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 852\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_561 [Transpose] inputs: [852 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_561 for ONNX node: Transpose_561\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 886 for ONNX tensor: 886\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_561 [Transpose] outputs: [886 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_562 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 885\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 886\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_562 [MatMul] inputs: [885 -> (-1, 12, -1, 64)[FLOAT]], [886 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_562 for ONNX node: MatMul_562\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 887 for ONNX tensor: 887\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_562 [MatMul] outputs: [887 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_563 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_563 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_563 [Constant] outputs: [888 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_564 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 887\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 888\n", "[01/17/2022-21:09:35] [V] [TRT] Div_564 [Div] inputs: [887 -> (-1, 12, -1, -1)[FLOAT]], [888 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 888 for ONNX node: 888\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_564 for ONNX node: Div_564\n", "[01/17/2022-21:09:35] [V] [TRT] Registering 
tensor: 889 for ONNX tensor: 889\n", "[01/17/2022-21:09:35] [V] [TRT] Div_564 [Div] outputs: [889 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_565 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 889\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 234\n", "[01/17/2022-21:09:35] [V] [TRT] Add_565 [Add] inputs: [889 -> (-1, 12, -1, -1)[FLOAT]], [234 -> (-1, 1, 1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_565 for ONNX node: Add_565\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 890 for ONNX tensor: 890\n", "[01/17/2022-21:09:35] [V] [TRT] Add_565 [Add] outputs: [890 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Softmax_566 [Softmax]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 890\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_566 [Softmax] inputs: [890 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Softmax_566 for ONNX node: Softmax_566\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 891 for ONNX tensor: 891\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_566 [Softmax] outputs: [891 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_567 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 891\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 870\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_567 [MatMul] inputs: [891 -> (-1, 12, -1, -1)[FLOAT]], [870 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_567 for ONNX node: MatMul_567\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 892 for ONNX tensor: 892\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_567 [MatMul] outputs: [892 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_568 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 892\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_568 [Transpose] inputs: [892 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_568 for ONNX node: Transpose_568\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 893 for ONNX tensor: 893\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_568 [Transpose] outputs: [893 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_569 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 893\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_569 [Shape] inputs: [893 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_569 for ONNX node: Shape_569\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 894 for ONNX tensor: 894\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_569 [Shape] outputs: [894 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_570 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_570 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_570 [Constant] outputs: [895 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_571 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 894\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 895\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_571 [Gather] inputs: [894 -> (4)[INT32]], [895 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 895 for ONNX node: 895\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] 
[V] [TRT] Registering layer: Gather_571 for ONNX node: Gather_571\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 896 for ONNX tensor: 896\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_571 [Gather] outputs: [896 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_572 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 893\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_572 [Shape] inputs: [893 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_572 for ONNX node: Shape_572\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 897 for ONNX tensor: 897\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_572 [Shape] outputs: [897 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_573 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_573 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_573 [Constant] outputs: [898 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_574 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 897\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 898\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_574 [Gather] inputs: [897 -> (4)[INT32]], [898 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 898 for ONNX node: 898\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_574 for ONNX node: Gather_574\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 899 for ONNX tensor: 899\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_574 [Gather] outputs: [899 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_575 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 896\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_575 [Unsqueeze] inputs: [896 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_575 for ONNX node: Unsqueeze_575\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 901 for ONNX tensor: 901\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_575 [Unsqueeze] outputs: [901 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_576 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 899\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_576 [Unsqueeze] inputs: [899 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_576 for ONNX node: Unsqueeze_576\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 902 for ONNX tensor: 902\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_576 [Unsqueeze] outputs: [902 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_577 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 901\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 902\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1713\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_577 [Concat] inputs: [901 -> (1)[INT32]], [902 -> (1)[INT32]], [1713 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1713 for ONNX node: 1713\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_577 for ONNX node: Concat_577\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 904 for ONNX tensor: 904\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_577 [Concat] outputs: [904 -> 
(3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_578 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 893\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 904\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_578 [Reshape] inputs: [893 -> (-1, -1, 12, 64)[FLOAT]], [904 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_578 for ONNX node: Reshape_578\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 905 for ONNX tensor: 905\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_578 [Reshape] outputs: [905 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_579 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 905\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1714\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_579 [MatMul] inputs: [905 -> (-1, -1, 768)[FLOAT]], [1714 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1714 for ONNX node: 1714\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_579 for ONNX node: MatMul_579\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 907 for ONNX tensor: 907\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_579 [MatMul] outputs: [907 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_580 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.5.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 907\n", "[01/17/2022-21:09:35] [V] [TRT] Add_580 [Add] inputs: [encoder.layer.5.attention.output.dense.bias -> (768)[FLOAT]], [907 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.5.attention.output.dense.bias for ONNX node: encoder.layer.5.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_580 for ONNX node: Add_580\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 908 for ONNX tensor: 908\n", "[01/17/2022-21:09:35] [V] [TRT] Add_580 [Add] outputs: [908 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_581 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 908\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 832\n", "[01/17/2022-21:09:35] [V] [TRT] Add_581 [Add] inputs: [908 -> (-1, -1, 768)[FLOAT]], [832 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_581 for ONNX node: Add_581\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 909 for ONNX tensor: 909\n", "[01/17/2022-21:09:35] [V] [TRT] Add_581 [Add] outputs: [909 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_582 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 909\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_582 [ReduceMean] inputs: [909 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_582 for ONNX node: ReduceMean_582\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 910 for ONNX tensor: 910\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_582 [ReduceMean] outputs: [910 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_583 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 909\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 910\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_583 [Sub] inputs: [909 -> (-1, -1, 768)[FLOAT]], [910 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering 
layer: Sub_583 for ONNX node: Sub_583\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 911 for ONNX tensor: 911\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_583 [Sub] outputs: [911 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_584 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_584 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_584 [Constant] outputs: [912 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_585 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 911\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 912\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_585 [Pow] inputs: [911 -> (-1, -1, 768)[FLOAT]], [912 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 912 for ONNX node: 912\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_585 for ONNX node: Pow_585\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 913 for ONNX tensor: 913\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_585 [Pow] outputs: [913 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_586 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 913\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_586 [ReduceMean] inputs: [913 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_586 for ONNX node: ReduceMean_586\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 914 for ONNX tensor: 914\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_586 [ReduceMean] outputs: [914 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_587 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_587 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_587 [Constant] outputs: [915 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_588 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 914\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 915\n", "[01/17/2022-21:09:35] [V] [TRT] Add_588 [Add] inputs: [914 -> (-1, -1, 1)[FLOAT]], [915 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 915 for ONNX node: 915\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_588 for ONNX node: Add_588\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 916 for ONNX tensor: 916\n", "[01/17/2022-21:09:35] [V] [TRT] Add_588 [Add] outputs: [916 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_589 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 916\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_589 [Sqrt] inputs: [916 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_589 for ONNX node: Sqrt_589\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 917 for ONNX tensor: 917\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_589 [Sqrt] outputs: [917 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_590 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 911\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 917\n", "[01/17/2022-21:09:35] [V] [TRT] Div_590 [Div] inputs: [911 -> (-1, -1, 768)[FLOAT]], [917 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_590 for ONNX node: Div_590\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 918 for ONNX tensor: 918\n", "[01/17/2022-21:09:35] [V] [TRT] Div_590 [Div] outputs: [918 -> (-1, -1, 768)[FLOAT]], \n", 
"[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_591 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 918\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.5.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_591 [Mul] inputs: [918 -> (-1, -1, 768)[FLOAT]], [encoder.layer.5.attention.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.5.attention.output.LayerNorm.weight for ONNX node: encoder.layer.5.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_591 for ONNX node: Mul_591\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 919 for ONNX tensor: 919\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_591 [Mul] outputs: [919 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_592 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 919\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.5.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_592 [Add] inputs: [919 -> (-1, -1, 768)[FLOAT]], [encoder.layer.5.attention.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.5.attention.output.LayerNorm.bias for ONNX node: encoder.layer.5.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_592 for ONNX node: Add_592\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 920 for ONNX tensor: 920\n", "[01/17/2022-21:09:35] [V] [TRT] Add_592 [Add] outputs: [920 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_593 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 920\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1715\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_593 [MatMul] inputs: [920 -> (-1, -1, 768)[FLOAT]], [1715 -> (768, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1715 for ONNX node: 1715\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_593 for ONNX node: MatMul_593\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 922 for ONNX tensor: 922\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_593 [MatMul] outputs: [922 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_594 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.5.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 922\n", "[01/17/2022-21:09:35] [V] [TRT] Add_594 [Add] inputs: [encoder.layer.5.intermediate.dense.bias -> (3072)[FLOAT]], [922 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.5.intermediate.dense.bias for ONNX node: encoder.layer.5.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_594 for ONNX node: Add_594\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 923 for ONNX tensor: 923\n", "[01/17/2022-21:09:35] [V] [TRT] Add_594 [Add] outputs: [923 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_595 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_595 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_595 [Constant] outputs: [924 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_596 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 923\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 924\n", 
"[01/17/2022-21:09:35] [V] [TRT] Div_596 [Div] inputs: [923 -> (-1, -1, 3072)[FLOAT]], [924 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 924 for ONNX node: 924\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_596 for ONNX node: Div_596\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 925 for ONNX tensor: 925\n", "[01/17/2022-21:09:35] [V] [TRT] Div_596 [Div] outputs: [925 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Erf_597 [Erf]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 925\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_597 [Erf] inputs: [925 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Erf_597 for ONNX node: Erf_597\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 926 for ONNX tensor: 926\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_597 [Erf] outputs: [926 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_598 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_598 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_598 [Constant] outputs: [927 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_599 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 926\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 927\n", "[01/17/2022-21:09:35] [V] [TRT] Add_599 [Add] inputs: [926 -> (-1, -1, 3072)[FLOAT]], [927 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 927 for ONNX node: 927\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_599 for ONNX node: Add_599\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 928 for ONNX tensor: 928\n", "[01/17/2022-21:09:35] [V] [TRT] Add_599 [Add] outputs: [928 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_600 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 923\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 928\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_600 [Mul] inputs: [923 -> (-1, -1, 3072)[FLOAT]], [928 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_600 for ONNX node: Mul_600\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 929 for ONNX tensor: 929\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_600 [Mul] outputs: [929 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_601 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_601 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_601 [Constant] outputs: [930 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_602 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 929\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 930\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_602 [Mul] inputs: [929 -> (-1, -1, 3072)[FLOAT]], [930 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 930 for ONNX node: 930\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_602 for ONNX node: Mul_602\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 931 for ONNX tensor: 931\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_602 [Mul] outputs: [931 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_603 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 931\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1716\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_603 [MatMul] inputs: [931 -> (-1, -1, 3072)[FLOAT]], 
[1716 -> (3072, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1716 for ONNX node: 1716\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_603 for ONNX node: MatMul_603\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 933 for ONNX tensor: 933\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_603 [MatMul] outputs: [933 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_604 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.5.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 933\n", "[01/17/2022-21:09:35] [V] [TRT] Add_604 [Add] inputs: [encoder.layer.5.output.dense.bias -> (768)[FLOAT]], [933 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.5.output.dense.bias for ONNX node: encoder.layer.5.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_604 for ONNX node: Add_604\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 934 for ONNX tensor: 934\n", "[01/17/2022-21:09:35] [V] [TRT] Add_604 [Add] outputs: [934 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_605 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 934\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 920\n", "[01/17/2022-21:09:35] [V] [TRT] Add_605 [Add] inputs: [934 -> (-1, -1, 768)[FLOAT]], [920 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_605 for ONNX node: Add_605\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 935 for ONNX tensor: 935\n", "[01/17/2022-21:09:35] [V] [TRT] Add_605 [Add] outputs: [935 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_606 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 935\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_606 [ReduceMean] inputs: [935 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_606 for ONNX node: ReduceMean_606\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 936 for ONNX tensor: 936\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_606 [ReduceMean] outputs: [936 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_607 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 935\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 936\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_607 [Sub] inputs: [935 -> (-1, -1, 768)[FLOAT]], [936 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_607 for ONNX node: Sub_607\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 937 for ONNX tensor: 937\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_607 [Sub] outputs: [937 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_608 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_608 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_608 [Constant] outputs: [938 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_609 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 937\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 938\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_609 [Pow] inputs: [937 -> (-1, -1, 768)[FLOAT]], [938 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 938 for ONNX node: 938\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_609 for ONNX node: Pow_609\n", "[01/17/2022-21:09:35] 
[V] [TRT] Registering tensor: 939 for ONNX tensor: 939\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_609 [Pow] outputs: [939 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_610 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 939\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_610 [ReduceMean] inputs: [939 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_610 for ONNX node: ReduceMean_610\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 940 for ONNX tensor: 940\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_610 [ReduceMean] outputs: [940 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_611 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_611 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_611 [Constant] outputs: [941 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_612 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 940\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 941\n", "[01/17/2022-21:09:35] [V] [TRT] Add_612 [Add] inputs: [940 -> (-1, -1, 1)[FLOAT]], [941 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 941 for ONNX node: 941\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_612 for ONNX node: Add_612\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 942 for ONNX tensor: 942\n", "[01/17/2022-21:09:35] [V] [TRT] Add_612 [Add] outputs: [942 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_613 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 942\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_613 [Sqrt] inputs: [942 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_613 for ONNX node: Sqrt_613\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 943 for ONNX tensor: 943\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_613 [Sqrt] outputs: [943 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_614 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 937\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 943\n", "[01/17/2022-21:09:35] [V] [TRT] Div_614 [Div] inputs: [937 -> (-1, -1, 768)[FLOAT]], [943 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_614 for ONNX node: Div_614\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 944 for ONNX tensor: 944\n", "[01/17/2022-21:09:35] [V] [TRT] Div_614 [Div] outputs: [944 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_615 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 944\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.5.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_615 [Mul] inputs: [944 -> (-1, -1, 768)[FLOAT]], [encoder.layer.5.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.5.output.LayerNorm.weight for ONNX node: encoder.layer.5.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_615 for ONNX node: Mul_615\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 945 for ONNX tensor: 945\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_615 [Mul] outputs: [945 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_616 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 945\n", "[01/17/2022-21:09:35] [V] 
[TRT] Searching for input: encoder.layer.5.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_616 [Add] inputs: [945 -> (-1, -1, 768)[FLOAT]], [encoder.layer.5.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.5.output.LayerNorm.bias for ONNX node: encoder.layer.5.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_616 for ONNX node: Add_616\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 946 for ONNX tensor: 946\n", "[01/17/2022-21:09:35] [V] [TRT] Add_616 [Add] outputs: [946 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_617 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 946\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1717\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_617 [MatMul] inputs: [946 -> (-1, -1, 768)[FLOAT]], [1717 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1717 for ONNX node: 1717\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_617 for ONNX node: MatMul_617\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 948 for ONNX tensor: 948\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_617 [MatMul] outputs: [948 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_618 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.6.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 948\n", "[01/17/2022-21:09:35] [V] [TRT] Add_618 [Add] inputs: [encoder.layer.6.attention.self.query.bias -> (768)[FLOAT]], [948 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.6.attention.self.query.bias for ONNX node: encoder.layer.6.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_618 for ONNX node: Add_618\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 949 for ONNX tensor: 949\n", "[01/17/2022-21:09:35] [V] [TRT] Add_618 [Add] outputs: [949 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_619 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 946\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1718\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_619 [MatMul] inputs: [946 -> (-1, -1, 768)[FLOAT]], [1718 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1718 for ONNX node: 1718\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_619 for ONNX node: MatMul_619\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 951 for ONNX tensor: 951\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_619 [MatMul] outputs: [951 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_620 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.6.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 951\n", "[01/17/2022-21:09:35] [V] [TRT] Add_620 [Add] inputs: [encoder.layer.6.attention.self.key.bias -> (768)[FLOAT]], [951 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.6.attention.self.key.bias for ONNX node: encoder.layer.6.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_620 for ONNX node: Add_620\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 952 for ONNX tensor: 952\n", "[01/17/2022-21:09:35] [V] [TRT] Add_620 [Add] outputs: [952 -> (-1, -1, 768)[FLOAT]], 
\n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_621 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 952\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_621 [Shape] inputs: [952 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_621 for ONNX node: Shape_621\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 953 for ONNX tensor: 953\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_621 [Shape] outputs: [953 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_622 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_622 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_622 [Constant] outputs: [954 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_623 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 953\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 954\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_623 [Gather] inputs: [953 -> (3)[INT32]], [954 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 954 for ONNX node: 954\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_623 for ONNX node: Gather_623\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 955 for ONNX tensor: 955\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_623 [Gather] outputs: [955 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_624 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 952\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_624 [Shape] inputs: [952 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_624 for ONNX node: Shape_624\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 956 for ONNX tensor: 956\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_624 [Shape] outputs: [956 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_625 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_625 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_625 [Constant] outputs: [957 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_626 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 956\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 957\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_626 [Gather] inputs: [956 -> (3)[INT32]], [957 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 957 for ONNX node: 957\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_626 for ONNX node: Gather_626\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 958 for ONNX tensor: 958\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_626 [Gather] outputs: [958 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_627 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 955\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_627 [Unsqueeze] inputs: [955 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_627 for ONNX node: Unsqueeze_627\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 961 for ONNX tensor: 961\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_627 [Unsqueeze] outputs: [961 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_628 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching 
for input: 958\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_628 [Unsqueeze] inputs: [958 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_628 for ONNX node: Unsqueeze_628\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 962 for ONNX tensor: 962\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_628 [Unsqueeze] outputs: [962 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_629 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 961\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 962\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1719\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1720\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_629 [Concat] inputs: [961 -> (1)[INT32]], [962 -> (1)[INT32]], [1719 -> (1)[INT32]], [1720 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1719 for ONNX node: 1719\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1720 for ONNX node: 1720\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_629 for ONNX node: Concat_629\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 965 for ONNX tensor: 965\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_629 [Concat] outputs: [965 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_630 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 952\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 965\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_630 [Reshape] inputs: [952 -> (-1, -1, 768)[FLOAT]], [965 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_630 for ONNX node: Reshape_630\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 966 for ONNX tensor: 966\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_630 [Reshape] outputs: [966 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_631 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 946\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1721\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_631 [MatMul] inputs: [946 -> (-1, -1, 768)[FLOAT]], [1721 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1721 for ONNX node: 1721\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_631 for ONNX node: MatMul_631\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 968 for ONNX tensor: 968\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_631 [MatMul] outputs: [968 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_632 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.6.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 968\n", "[01/17/2022-21:09:35] [V] [TRT] Add_632 [Add] inputs: [encoder.layer.6.attention.self.value.bias -> (768)[FLOAT]], [968 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.6.attention.self.value.bias for ONNX node: encoder.layer.6.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_632 for ONNX node: Add_632\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 969 for ONNX tensor: 969\n", "[01/17/2022-21:09:35] [V] [TRT] Add_632 [Add] outputs: [969 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_633 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 
969\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_633 [Shape] inputs: [969 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_633 for ONNX node: Shape_633\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 970 for ONNX tensor: 970\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_633 [Shape] outputs: [970 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_634 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_634 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_634 [Constant] outputs: [971 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_635 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 970\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 971\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_635 [Gather] inputs: [970 -> (3)[INT32]], [971 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 971 for ONNX node: 971\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_635 for ONNX node: Gather_635\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 972 for ONNX tensor: 972\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_635 [Gather] outputs: [972 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_636 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 969\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_636 [Shape] inputs: [969 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_636 for ONNX node: Shape_636\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 973 for ONNX tensor: 973\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_636 [Shape] outputs: [973 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_637 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_637 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_637 [Constant] outputs: [974 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_638 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 973\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 974\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_638 [Gather] inputs: [973 -> (3)[INT32]], [974 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 974 for ONNX node: 974\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_638 for ONNX node: Gather_638\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 975 for ONNX tensor: 975\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_638 [Gather] outputs: [975 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_639 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 972\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_639 [Unsqueeze] inputs: [972 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_639 for ONNX node: Unsqueeze_639\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 978 for ONNX tensor: 978\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_639 [Unsqueeze] outputs: [978 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_640 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 975\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_640 [Unsqueeze] inputs: [975 -> ()[INT32]], \n", 
"[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_640 for ONNX node: Unsqueeze_640\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 979 for ONNX tensor: 979\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_640 [Unsqueeze] outputs: [979 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_641 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 978\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 979\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1722\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1723\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_641 [Concat] inputs: [978 -> (1)[INT32]], [979 -> (1)[INT32]], [1722 -> (1)[INT32]], [1723 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1722 for ONNX node: 1722\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1723 for ONNX node: 1723\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_641 for ONNX node: Concat_641\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 982 for ONNX tensor: 982\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_641 [Concat] outputs: [982 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_642 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 969\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 982\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_642 [Reshape] inputs: [969 -> (-1, -1, 768)[FLOAT]], [982 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_642 for ONNX node: Reshape_642\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 983 for ONNX tensor: 983\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_642 [Reshape] outputs: [983 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_643 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 983\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_643 [Transpose] inputs: [983 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_643 for ONNX node: Transpose_643\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 984 for ONNX tensor: 984\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_643 [Transpose] outputs: [984 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_644 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 949\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_644 [Shape] inputs: [949 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_644 for ONNX node: Shape_644\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 985 for ONNX tensor: 985\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_644 [Shape] outputs: [985 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_645 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_645 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_645 [Constant] outputs: [986 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_646 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 985\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 986\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_646 [Gather] inputs: [985 -> (3)[INT32]], [986 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 986 for ONNX node: 986\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_646 for ONNX node: Gather_646\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 987 for ONNX tensor: 987\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_646 [Gather] outputs: [987 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_647 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 949\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_647 [Shape] inputs: [949 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_647 for ONNX node: Shape_647\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 988 for ONNX tensor: 988\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_647 [Shape] outputs: [988 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_648 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_648 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_648 [Constant] outputs: [989 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_649 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 988\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 989\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_649 [Gather] inputs: [988 -> (3)[INT32]], [989 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 989 for ONNX node: 989\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_649 for ONNX node: Gather_649\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 990 for ONNX tensor: 990\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_649 [Gather] outputs: [990 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_650 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 987\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_650 [Unsqueeze] inputs: [987 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_650 for ONNX node: Unsqueeze_650\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 993 for ONNX tensor: 993\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_650 [Unsqueeze] outputs: [993 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_651 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 990\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_651 [Unsqueeze] inputs: [990 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_651 for ONNX node: Unsqueeze_651\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 994 for ONNX tensor: 994\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_651 [Unsqueeze] outputs: [994 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_652 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 993\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 994\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1724\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1725\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_652 [Concat] inputs: [993 -> (1)[INT32]], [994 -> (1)[INT32]], [1724 -> (1)[INT32]], [1725 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1724 for ONNX node: 1724\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1725 for ONNX node: 1725\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_652 for 
ONNX node: Concat_652\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 997 for ONNX tensor: 997\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_652 [Concat] outputs: [997 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_653 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 949\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 997\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_653 [Reshape] inputs: [949 -> (-1, -1, 768)[FLOAT]], [997 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_653 for ONNX node: Reshape_653\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 998 for ONNX tensor: 998\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_653 [Reshape] outputs: [998 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_654 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 998\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_654 [Transpose] inputs: [998 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_654 for ONNX node: Transpose_654\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 999 for ONNX tensor: 999\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_654 [Transpose] outputs: [999 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_655 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 966\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_655 [Transpose] inputs: [966 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_655 for ONNX node: Transpose_655\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1000 for ONNX tensor: 1000\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_655 [Transpose] outputs: [1000 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_656 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 999\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1000\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_656 [MatMul] inputs: [999 -> (-1, 12, -1, 64)[FLOAT]], [1000 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_656 for ONNX node: MatMul_656\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1001 for ONNX tensor: 1001\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_656 [MatMul] outputs: [1001 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_657 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_657 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_657 [Constant] outputs: [1002 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_658 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1001\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1002\n", "[01/17/2022-21:09:35] [V] [TRT] Div_658 [Div] inputs: [1001 -> (-1, 12, -1, -1)[FLOAT]], [1002 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1002 for ONNX node: 1002\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_658 for ONNX node: Div_658\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1003 for ONNX tensor: 1003\n", "[01/17/2022-21:09:35] [V] [TRT] Div_658 [Div] outputs: [1003 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_659 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1003\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 234\n", 
"[01/17/2022-21:09:35] [V] [TRT] Add_659 [Add] inputs: [1003 -> (-1, 12, -1, -1)[FLOAT]], [234 -> (-1, 1, 1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_659 for ONNX node: Add_659\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1004 for ONNX tensor: 1004\n", "[01/17/2022-21:09:35] [V] [TRT] Add_659 [Add] outputs: [1004 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Softmax_660 [Softmax]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1004\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_660 [Softmax] inputs: [1004 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Softmax_660 for ONNX node: Softmax_660\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1005 for ONNX tensor: 1005\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_660 [Softmax] outputs: [1005 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_661 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1005\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 984\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_661 [MatMul] inputs: [1005 -> (-1, 12, -1, -1)[FLOAT]], [984 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_661 for ONNX node: MatMul_661\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1006 for ONNX tensor: 1006\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_661 [MatMul] outputs: [1006 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_662 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1006\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_662 [Transpose] inputs: [1006 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_662 for ONNX node: Transpose_662\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1007 for ONNX tensor: 1007\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_662 [Transpose] outputs: [1007 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_663 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1007\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_663 [Shape] inputs: [1007 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_663 for ONNX node: Shape_663\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1008 for ONNX tensor: 1008\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_663 [Shape] outputs: [1008 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_664 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_664 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_664 [Constant] outputs: [1009 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_665 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1008\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1009\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_665 [Gather] inputs: [1008 -> (4)[INT32]], [1009 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1009 for ONNX node: 1009\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_665 for ONNX node: Gather_665\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1010 for ONNX tensor: 1010\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_665 [Gather] outputs: [1010 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: 
Shape_666 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1007\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_666 [Shape] inputs: [1007 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_666 for ONNX node: Shape_666\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1011 for ONNX tensor: 1011\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_666 [Shape] outputs: [1011 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_667 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_667 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_667 [Constant] outputs: [1012 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_668 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1011\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1012\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_668 [Gather] inputs: [1011 -> (4)[INT32]], [1012 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1012 for ONNX node: 1012\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_668 for ONNX node: Gather_668\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1013 for ONNX tensor: 1013\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_668 [Gather] outputs: [1013 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_669 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1010\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_669 [Unsqueeze] inputs: [1010 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_669 for ONNX node: Unsqueeze_669\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1015 for ONNX tensor: 1015\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_669 [Unsqueeze] outputs: [1015 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_670 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1013\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_670 [Unsqueeze] inputs: [1013 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_670 for ONNX node: Unsqueeze_670\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1016 for ONNX tensor: 1016\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_670 [Unsqueeze] outputs: [1016 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_671 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1015\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1016\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1726\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_671 [Concat] inputs: [1015 -> (1)[INT32]], [1016 -> (1)[INT32]], [1726 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1726 for ONNX node: 1726\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_671 for ONNX node: Concat_671\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1018 for ONNX tensor: 1018\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_671 [Concat] outputs: [1018 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_672 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1007\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1018\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_672 
[Reshape] inputs: [1007 -> (-1, -1, 12, 64)[FLOAT]], [1018 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_672 for ONNX node: Reshape_672\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1019 for ONNX tensor: 1019\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_672 [Reshape] outputs: [1019 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_673 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1019\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1727\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_673 [MatMul] inputs: [1019 -> (-1, -1, 768)[FLOAT]], [1727 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1727 for ONNX node: 1727\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_673 for ONNX node: MatMul_673\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1021 for ONNX tensor: 1021\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_673 [MatMul] outputs: [1021 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_674 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.6.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1021\n", "[01/17/2022-21:09:35] [V] [TRT] Add_674 [Add] inputs: [encoder.layer.6.attention.output.dense.bias -> (768)[FLOAT]], [1021 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.6.attention.output.dense.bias for ONNX node: encoder.layer.6.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_674 for ONNX node: Add_674\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1022 for ONNX tensor: 1022\n", "[01/17/2022-21:09:35] [V] [TRT] Add_674 [Add] outputs: [1022 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_675 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1022\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 946\n", "[01/17/2022-21:09:35] [V] [TRT] Add_675 [Add] inputs: [1022 -> (-1, -1, 768)[FLOAT]], [946 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_675 for ONNX node: Add_675\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1023 for ONNX tensor: 1023\n", "[01/17/2022-21:09:35] [V] [TRT] Add_675 [Add] outputs: [1023 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_676 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1023\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_676 [ReduceMean] inputs: [1023 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_676 for ONNX node: ReduceMean_676\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1024 for ONNX tensor: 1024\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_676 [ReduceMean] outputs: [1024 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_677 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1023\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1024\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_677 [Sub] inputs: [1023 -> (-1, -1, 768)[FLOAT]], [1024 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_677 for ONNX node: Sub_677\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1025 for ONNX tensor: 1025\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_677 [Sub] outputs: [1025 -> (-1, -1, 768)[FLOAT]], \n", 
"[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_678 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_678 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_678 [Constant] outputs: [1026 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_679 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1025\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1026\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_679 [Pow] inputs: [1025 -> (-1, -1, 768)[FLOAT]], [1026 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1026 for ONNX node: 1026\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_679 for ONNX node: Pow_679\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1027 for ONNX tensor: 1027\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_679 [Pow] outputs: [1027 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_680 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1027\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_680 [ReduceMean] inputs: [1027 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_680 for ONNX node: ReduceMean_680\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1028 for ONNX tensor: 1028\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_680 [ReduceMean] outputs: [1028 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_681 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_681 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_681 [Constant] outputs: [1029 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_682 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1028\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1029\n", "[01/17/2022-21:09:35] [V] [TRT] Add_682 [Add] inputs: [1028 -> (-1, -1, 1)[FLOAT]], [1029 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1029 for ONNX node: 1029\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_682 for ONNX node: Add_682\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1030 for ONNX tensor: 1030\n", "[01/17/2022-21:09:35] [V] [TRT] Add_682 [Add] outputs: [1030 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_683 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1030\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_683 [Sqrt] inputs: [1030 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_683 for ONNX node: Sqrt_683\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1031 for ONNX tensor: 1031\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_683 [Sqrt] outputs: [1031 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_684 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1025\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1031\n", "[01/17/2022-21:09:35] [V] [TRT] Div_684 [Div] inputs: [1025 -> (-1, -1, 768)[FLOAT]], [1031 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_684 for ONNX node: Div_684\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1032 for ONNX tensor: 1032\n", "[01/17/2022-21:09:35] [V] [TRT] Div_684 [Div] outputs: [1032 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_685 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1032\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 
encoder.layer.6.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_685 [Mul] inputs: [1032 -> (-1, -1, 768)[FLOAT]], [encoder.layer.6.attention.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.6.attention.output.LayerNorm.weight for ONNX node: encoder.layer.6.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_685 for ONNX node: Mul_685\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1033 for ONNX tensor: 1033\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_685 [Mul] outputs: [1033 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_686 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1033\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.6.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_686 [Add] inputs: [1033 -> (-1, -1, 768)[FLOAT]], [encoder.layer.6.attention.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.6.attention.output.LayerNorm.bias for ONNX node: encoder.layer.6.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_686 for ONNX node: Add_686\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1034 for ONNX tensor: 1034\n", "[01/17/2022-21:09:35] [V] [TRT] Add_686 [Add] outputs: [1034 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_687 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1034\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1728\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_687 [MatMul] inputs: [1034 -> (-1, -1, 768)[FLOAT]], [1728 -> (768, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1728 for ONNX node: 1728\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_687 for ONNX node: MatMul_687\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1036 for ONNX tensor: 1036\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_687 [MatMul] outputs: [1036 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_688 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.6.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1036\n", "[01/17/2022-21:09:35] [V] [TRT] Add_688 [Add] inputs: [encoder.layer.6.intermediate.dense.bias -> (3072)[FLOAT]], [1036 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.6.intermediate.dense.bias for ONNX node: encoder.layer.6.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_688 for ONNX node: Add_688\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1037 for ONNX tensor: 1037\n", "[01/17/2022-21:09:35] [V] [TRT] Add_688 [Add] outputs: [1037 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_689 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_689 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_689 [Constant] outputs: [1038 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_690 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1037\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1038\n", "[01/17/2022-21:09:35] [V] [TRT] Div_690 [Div] inputs: [1037 -> (-1, -1, 3072)[FLOAT]], [1038 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1038 for 
ONNX node: 1038\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_690 for ONNX node: Div_690\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1039 for ONNX tensor: 1039\n", "[01/17/2022-21:09:35] [V] [TRT] Div_690 [Div] outputs: [1039 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Erf_691 [Erf]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1039\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_691 [Erf] inputs: [1039 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Erf_691 for ONNX node: Erf_691\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1040 for ONNX tensor: 1040\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_691 [Erf] outputs: [1040 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_692 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_692 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_692 [Constant] outputs: [1041 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_693 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1040\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1041\n", "[01/17/2022-21:09:35] [V] [TRT] Add_693 [Add] inputs: [1040 -> (-1, -1, 3072)[FLOAT]], [1041 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1041 for ONNX node: 1041\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_693 for ONNX node: Add_693\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1042 for ONNX tensor: 1042\n", "[01/17/2022-21:09:35] [V] [TRT] Add_693 [Add] outputs: [1042 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_694 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1037\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1042\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_694 [Mul] inputs: [1037 -> (-1, -1, 3072)[FLOAT]], [1042 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_694 for ONNX node: Mul_694\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1043 for ONNX tensor: 1043\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_694 [Mul] outputs: [1043 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_695 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_695 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_695 [Constant] outputs: [1044 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_696 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1043\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1044\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_696 [Mul] inputs: [1043 -> (-1, -1, 3072)[FLOAT]], [1044 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1044 for ONNX node: 1044\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_696 for ONNX node: Mul_696\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1045 for ONNX tensor: 1045\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_696 [Mul] outputs: [1045 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_697 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1045\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1729\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_697 [MatMul] inputs: [1045 -> (-1, -1, 3072)[FLOAT]], [1729 -> (3072, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1729 for ONNX node: 1729\n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_697 for ONNX node: MatMul_697\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1047 for ONNX tensor: 1047\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_697 [MatMul] outputs: [1047 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_698 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.6.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1047\n", "[01/17/2022-21:09:35] [V] [TRT] Add_698 [Add] inputs: [encoder.layer.6.output.dense.bias -> (768)[FLOAT]], [1047 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.6.output.dense.bias for ONNX node: encoder.layer.6.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_698 for ONNX node: Add_698\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1048 for ONNX tensor: 1048\n", "[01/17/2022-21:09:35] [V] [TRT] Add_698 [Add] outputs: [1048 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_699 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1048\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1034\n", "[01/17/2022-21:09:35] [V] [TRT] Add_699 [Add] inputs: [1048 -> (-1, -1, 768)[FLOAT]], [1034 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_699 for ONNX node: Add_699\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1049 for ONNX tensor: 1049\n", "[01/17/2022-21:09:35] [V] [TRT] Add_699 [Add] outputs: [1049 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_700 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1049\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_700 [ReduceMean] inputs: [1049 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_700 for ONNX node: ReduceMean_700\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1050 for ONNX tensor: 1050\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_700 [ReduceMean] outputs: [1050 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_701 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1049\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1050\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_701 [Sub] inputs: [1049 -> (-1, -1, 768)[FLOAT]], [1050 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_701 for ONNX node: Sub_701\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1051 for ONNX tensor: 1051\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_701 [Sub] outputs: [1051 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_702 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_702 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_702 [Constant] outputs: [1052 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_703 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1051\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1052\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_703 [Pow] inputs: [1051 -> (-1, -1, 768)[FLOAT]], [1052 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1052 for ONNX node: 1052\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_703 for ONNX node: Pow_703\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1053 for ONNX tensor: 1053\n", "[01/17/2022-21:09:35] 
[V] [TRT] Pow_703 [Pow] outputs: [1053 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_704 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1053\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_704 [ReduceMean] inputs: [1053 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_704 for ONNX node: ReduceMean_704\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1054 for ONNX tensor: 1054\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_704 [ReduceMean] outputs: [1054 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_705 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_705 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_705 [Constant] outputs: [1055 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_706 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1054\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1055\n", "[01/17/2022-21:09:35] [V] [TRT] Add_706 [Add] inputs: [1054 -> (-1, -1, 1)[FLOAT]], [1055 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1055 for ONNX node: 1055\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_706 for ONNX node: Add_706\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1056 for ONNX tensor: 1056\n", "[01/17/2022-21:09:35] [V] [TRT] Add_706 [Add] outputs: [1056 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_707 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1056\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_707 [Sqrt] inputs: [1056 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_707 for ONNX node: Sqrt_707\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1057 for ONNX tensor: 1057\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_707 [Sqrt] outputs: [1057 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_708 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1051\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1057\n", "[01/17/2022-21:09:35] [V] [TRT] Div_708 [Div] inputs: [1051 -> (-1, -1, 768)[FLOAT]], [1057 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_708 for ONNX node: Div_708\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1058 for ONNX tensor: 1058\n", "[01/17/2022-21:09:35] [V] [TRT] Div_708 [Div] outputs: [1058 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_709 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1058\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.6.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_709 [Mul] inputs: [1058 -> (-1, -1, 768)[FLOAT]], [encoder.layer.6.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.6.output.LayerNorm.weight for ONNX node: encoder.layer.6.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_709 for ONNX node: Mul_709\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1059 for ONNX tensor: 1059\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_709 [Mul] outputs: [1059 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_710 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1059\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 
encoder.layer.6.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_710 [Add] inputs: [1059 -> (-1, -1, 768)[FLOAT]], [encoder.layer.6.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.6.output.LayerNorm.bias for ONNX node: encoder.layer.6.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_710 for ONNX node: Add_710\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1060 for ONNX tensor: 1060\n", "[01/17/2022-21:09:35] [V] [TRT] Add_710 [Add] outputs: [1060 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_711 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1060\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1730\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_711 [MatMul] inputs: [1060 -> (-1, -1, 768)[FLOAT]], [1730 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1730 for ONNX node: 1730\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_711 for ONNX node: MatMul_711\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1062 for ONNX tensor: 1062\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_711 [MatMul] outputs: [1062 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_712 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.7.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1062\n", "[01/17/2022-21:09:35] [V] [TRT] Add_712 [Add] inputs: [encoder.layer.7.attention.self.query.bias -> (768)[FLOAT]], [1062 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.7.attention.self.query.bias for ONNX node: encoder.layer.7.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_712 for ONNX node: Add_712\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1063 for ONNX tensor: 1063\n", "[01/17/2022-21:09:35] [V] [TRT] Add_712 [Add] outputs: [1063 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_713 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1060\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1731\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_713 [MatMul] inputs: [1060 -> (-1, -1, 768)[FLOAT]], [1731 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1731 for ONNX node: 1731\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_713 for ONNX node: MatMul_713\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1065 for ONNX tensor: 1065\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_713 [MatMul] outputs: [1065 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_714 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.7.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1065\n", "[01/17/2022-21:09:35] [V] [TRT] Add_714 [Add] inputs: [encoder.layer.7.attention.self.key.bias -> (768)[FLOAT]], [1065 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.7.attention.self.key.bias for ONNX node: encoder.layer.7.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_714 for ONNX node: Add_714\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1066 for ONNX tensor: 1066\n", "[01/17/2022-21:09:35] [V] [TRT] Add_714 [Add] outputs: [1066 -> (-1, -1, 768)[FLOAT]], 
\n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_715 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1066\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_715 [Shape] inputs: [1066 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_715 for ONNX node: Shape_715\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1067 for ONNX tensor: 1067\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_715 [Shape] outputs: [1067 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_716 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_716 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_716 [Constant] outputs: [1068 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_717 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1067\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1068\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_717 [Gather] inputs: [1067 -> (3)[INT32]], [1068 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1068 for ONNX node: 1068\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_717 for ONNX node: Gather_717\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1069 for ONNX tensor: 1069\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_717 [Gather] outputs: [1069 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_718 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1066\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_718 [Shape] inputs: [1066 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_718 for ONNX node: Shape_718\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1070 for ONNX tensor: 1070\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_718 [Shape] outputs: [1070 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_719 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_719 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_719 [Constant] outputs: [1071 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_720 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1070\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1071\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_720 [Gather] inputs: [1070 -> (3)[INT32]], [1071 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1071 for ONNX node: 1071\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_720 for ONNX node: Gather_720\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1072 for ONNX tensor: 1072\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_720 [Gather] outputs: [1072 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_721 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1069\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_721 [Unsqueeze] inputs: [1069 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_721 for ONNX node: Unsqueeze_721\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1075 for ONNX tensor: 1075\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_721 [Unsqueeze] outputs: [1075 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_722 [Unsqueeze]\n", 
"[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1072\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_722 [Unsqueeze] inputs: [1072 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_722 for ONNX node: Unsqueeze_722\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1076 for ONNX tensor: 1076\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_722 [Unsqueeze] outputs: [1076 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_723 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1075\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1076\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1732\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1733\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_723 [Concat] inputs: [1075 -> (1)[INT32]], [1076 -> (1)[INT32]], [1732 -> (1)[INT32]], [1733 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1732 for ONNX node: 1732\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1733 for ONNX node: 1733\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_723 for ONNX node: Concat_723\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1079 for ONNX tensor: 1079\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_723 [Concat] outputs: [1079 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_724 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1066\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1079\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_724 [Reshape] inputs: [1066 -> (-1, -1, 768)[FLOAT]], [1079 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_724 for ONNX node: Reshape_724\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1080 for ONNX tensor: 1080\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_724 [Reshape] outputs: [1080 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_725 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1060\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1734\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_725 [MatMul] inputs: [1060 -> (-1, -1, 768)[FLOAT]], [1734 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1734 for ONNX node: 1734\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_725 for ONNX node: MatMul_725\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1082 for ONNX tensor: 1082\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_725 [MatMul] outputs: [1082 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_726 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.7.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1082\n", "[01/17/2022-21:09:35] [V] [TRT] Add_726 [Add] inputs: [encoder.layer.7.attention.self.value.bias -> (768)[FLOAT]], [1082 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.7.attention.self.value.bias for ONNX node: encoder.layer.7.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_726 for ONNX node: Add_726\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1083 for ONNX tensor: 1083\n", "[01/17/2022-21:09:35] [V] [TRT] Add_726 [Add] outputs: [1083 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: 
Shape_727 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1083\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_727 [Shape] inputs: [1083 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_727 for ONNX node: Shape_727\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1084 for ONNX tensor: 1084\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_727 [Shape] outputs: [1084 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_728 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_728 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_728 [Constant] outputs: [1085 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_729 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1084\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1085\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_729 [Gather] inputs: [1084 -> (3)[INT32]], [1085 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1085 for ONNX node: 1085\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_729 for ONNX node: Gather_729\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1086 for ONNX tensor: 1086\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_729 [Gather] outputs: [1086 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_730 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1083\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_730 [Shape] inputs: [1083 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_730 for ONNX node: Shape_730\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1087 for ONNX tensor: 1087\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_730 [Shape] outputs: [1087 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_731 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_731 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_731 [Constant] outputs: [1088 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_732 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1087\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1088\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_732 [Gather] inputs: [1087 -> (3)[INT32]], [1088 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1088 for ONNX node: 1088\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_732 for ONNX node: Gather_732\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1089 for ONNX tensor: 1089\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_732 [Gather] outputs: [1089 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_733 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1086\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_733 [Unsqueeze] inputs: [1086 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_733 for ONNX node: Unsqueeze_733\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1092 for ONNX tensor: 1092\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_733 [Unsqueeze] outputs: [1092 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_734 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 
1089\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_734 [Unsqueeze] inputs: [1089 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_734 for ONNX node: Unsqueeze_734\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1093 for ONNX tensor: 1093\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_734 [Unsqueeze] outputs: [1093 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_735 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1092\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1093\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1735\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1736\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_735 [Concat] inputs: [1092 -> (1)[INT32]], [1093 -> (1)[INT32]], [1735 -> (1)[INT32]], [1736 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1735 for ONNX node: 1735\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1736 for ONNX node: 1736\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_735 for ONNX node: Concat_735\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1096 for ONNX tensor: 1096\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_735 [Concat] outputs: [1096 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_736 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1083\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1096\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_736 [Reshape] inputs: [1083 -> (-1, -1, 768)[FLOAT]], [1096 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_736 for ONNX node: Reshape_736\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1097 for ONNX tensor: 1097\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_736 [Reshape] outputs: [1097 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_737 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1097\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_737 [Transpose] inputs: [1097 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_737 for ONNX node: Transpose_737\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1098 for ONNX tensor: 1098\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_737 [Transpose] outputs: [1098 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_738 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1063\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_738 [Shape] inputs: [1063 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_738 for ONNX node: Shape_738\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1099 for ONNX tensor: 1099\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_738 [Shape] outputs: [1099 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_739 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_739 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_739 [Constant] outputs: [1100 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_740 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1099\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1100\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_740 [Gather] inputs: [1099 -> (3)[INT32]], [1100 -> ()[INT32]], \n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1100 for ONNX node: 1100\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_740 for ONNX node: Gather_740\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1101 for ONNX tensor: 1101\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_740 [Gather] outputs: [1101 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_741 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1063\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_741 [Shape] inputs: [1063 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_741 for ONNX node: Shape_741\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1102 for ONNX tensor: 1102\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_741 [Shape] outputs: [1102 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_742 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_742 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_742 [Constant] outputs: [1103 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_743 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1102\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1103\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_743 [Gather] inputs: [1102 -> (3)[INT32]], [1103 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1103 for ONNX node: 1103\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_743 for ONNX node: Gather_743\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1104 for ONNX tensor: 1104\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_743 [Gather] outputs: [1104 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_744 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1101\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_744 [Unsqueeze] inputs: [1101 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_744 for ONNX node: Unsqueeze_744\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1107 for ONNX tensor: 1107\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_744 [Unsqueeze] outputs: [1107 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_745 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1104\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_745 [Unsqueeze] inputs: [1104 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_745 for ONNX node: Unsqueeze_745\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1108 for ONNX tensor: 1108\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_745 [Unsqueeze] outputs: [1108 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_746 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1107\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1108\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1737\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1738\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_746 [Concat] inputs: [1107 -> (1)[INT32]], [1108 -> (1)[INT32]], [1737 -> (1)[INT32]], [1738 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1737 
for ONNX node: 1737\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1738 for ONNX node: 1738\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_746 for ONNX node: Concat_746\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1111 for ONNX tensor: 1111\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_746 [Concat] outputs: [1111 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_747 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1063\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1111\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_747 [Reshape] inputs: [1063 -> (-1, -1, 768)[FLOAT]], [1111 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_747 for ONNX node: Reshape_747\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1112 for ONNX tensor: 1112\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_747 [Reshape] outputs: [1112 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_748 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1112\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_748 [Transpose] inputs: [1112 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_748 for ONNX node: Transpose_748\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1113 for ONNX tensor: 1113\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_748 [Transpose] outputs: [1113 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_749 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1080\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_749 [Transpose] inputs: [1080 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_749 for ONNX node: Transpose_749\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1114 for ONNX tensor: 1114\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_749 [Transpose] outputs: [1114 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_750 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1113\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1114\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_750 [MatMul] inputs: [1113 -> (-1, 12, -1, 64)[FLOAT]], [1114 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_750 for ONNX node: MatMul_750\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1115 for ONNX tensor: 1115\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_750 [MatMul] outputs: [1115 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_751 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_751 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_751 [Constant] outputs: [1116 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_752 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1115\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1116\n", "[01/17/2022-21:09:35] [V] [TRT] Div_752 [Div] inputs: [1115 -> (-1, 12, -1, -1)[FLOAT]], [1116 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1116 for ONNX node: 1116\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_752 for ONNX node: Div_752\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1117 for ONNX tensor: 1117\n", "[01/17/2022-21:09:35] [V] [TRT] Div_752 [Div] outputs: [1117 -> (-1, 12, -1, -1)[FLOAT]], \n", 
"[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_753 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1117\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 234\n", "[01/17/2022-21:09:35] [V] [TRT] Add_753 [Add] inputs: [1117 -> (-1, 12, -1, -1)[FLOAT]], [234 -> (-1, 1, 1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_753 for ONNX node: Add_753\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1118 for ONNX tensor: 1118\n", "[01/17/2022-21:09:35] [V] [TRT] Add_753 [Add] outputs: [1118 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Softmax_754 [Softmax]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1118\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_754 [Softmax] inputs: [1118 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Softmax_754 for ONNX node: Softmax_754\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1119 for ONNX tensor: 1119\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_754 [Softmax] outputs: [1119 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_755 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1119\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1098\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_755 [MatMul] inputs: [1119 -> (-1, 12, -1, -1)[FLOAT]], [1098 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_755 for ONNX node: MatMul_755\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1120 for ONNX tensor: 1120\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_755 [MatMul] outputs: [1120 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_756 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1120\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_756 [Transpose] inputs: [1120 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_756 for ONNX node: Transpose_756\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1121 for ONNX tensor: 1121\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_756 [Transpose] outputs: [1121 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_757 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1121\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_757 [Shape] inputs: [1121 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_757 for ONNX node: Shape_757\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1122 for ONNX tensor: 1122\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_757 [Shape] outputs: [1122 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_758 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_758 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_758 [Constant] outputs: [1123 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_759 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1122\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1123\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_759 [Gather] inputs: [1122 -> (4)[INT32]], [1123 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1123 for ONNX node: 1123\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_759 for ONNX node: Gather_759\n", "[01/17/2022-21:09:35] [V] 
[TRT] Registering tensor: 1124 for ONNX tensor: 1124\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_759 [Gather] outputs: [1124 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_760 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1121\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_760 [Shape] inputs: [1121 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_760 for ONNX node: Shape_760\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1125 for ONNX tensor: 1125\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_760 [Shape] outputs: [1125 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_761 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_761 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_761 [Constant] outputs: [1126 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_762 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1125\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1126\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_762 [Gather] inputs: [1125 -> (4)[INT32]], [1126 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1126 for ONNX node: 1126\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_762 for ONNX node: Gather_762\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1127 for ONNX tensor: 1127\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_762 [Gather] outputs: [1127 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_763 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1124\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_763 [Unsqueeze] inputs: [1124 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_763 for ONNX node: Unsqueeze_763\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1129 for ONNX tensor: 1129\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_763 [Unsqueeze] outputs: [1129 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_764 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1127\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_764 [Unsqueeze] inputs: [1127 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_764 for ONNX node: Unsqueeze_764\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1130 for ONNX tensor: 1130\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_764 [Unsqueeze] outputs: [1130 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_765 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1129\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1130\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1739\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_765 [Concat] inputs: [1129 -> (1)[INT32]], [1130 -> (1)[INT32]], [1739 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1739 for ONNX node: 1739\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_765 for ONNX node: Concat_765\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1132 for ONNX tensor: 1132\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_765 [Concat] outputs: [1132 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: 
Reshape_766 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1121\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1132\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_766 [Reshape] inputs: [1121 -> (-1, -1, 12, 64)[FLOAT]], [1132 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_766 for ONNX node: Reshape_766\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1133 for ONNX tensor: 1133\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_766 [Reshape] outputs: [1133 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_767 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1133\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1740\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_767 [MatMul] inputs: [1133 -> (-1, -1, 768)[FLOAT]], [1740 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1740 for ONNX node: 1740\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_767 for ONNX node: MatMul_767\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1135 for ONNX tensor: 1135\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_767 [MatMul] outputs: [1135 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_768 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.7.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1135\n", "[01/17/2022-21:09:35] [V] [TRT] Add_768 [Add] inputs: [encoder.layer.7.attention.output.dense.bias -> (768)[FLOAT]], [1135 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.7.attention.output.dense.bias for ONNX node: encoder.layer.7.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_768 for ONNX node: Add_768\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1136 for ONNX tensor: 1136\n", "[01/17/2022-21:09:35] [V] [TRT] Add_768 [Add] outputs: [1136 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_769 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1136\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1060\n", "[01/17/2022-21:09:35] [V] [TRT] Add_769 [Add] inputs: [1136 -> (-1, -1, 768)[FLOAT]], [1060 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_769 for ONNX node: Add_769\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1137 for ONNX tensor: 1137\n", "[01/17/2022-21:09:35] [V] [TRT] Add_769 [Add] outputs: [1137 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_770 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1137\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_770 [ReduceMean] inputs: [1137 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_770 for ONNX node: ReduceMean_770\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1138 for ONNX tensor: 1138\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_770 [ReduceMean] outputs: [1138 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_771 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1137\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1138\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_771 [Sub] inputs: [1137 -> (-1, -1, 768)[FLOAT]], [1138 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_771 for ONNX node: 
Sub_771\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1139 for ONNX tensor: 1139\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_771 [Sub] outputs: [1139 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_772 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_772 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_772 [Constant] outputs: [1140 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_773 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1139\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1140\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_773 [Pow] inputs: [1139 -> (-1, -1, 768)[FLOAT]], [1140 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1140 for ONNX node: 1140\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_773 for ONNX node: Pow_773\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1141 for ONNX tensor: 1141\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_773 [Pow] outputs: [1141 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_774 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1141\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_774 [ReduceMean] inputs: [1141 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_774 for ONNX node: ReduceMean_774\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1142 for ONNX tensor: 1142\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_774 [ReduceMean] outputs: [1142 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_775 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_775 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_775 [Constant] outputs: [1143 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_776 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1142\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1143\n", "[01/17/2022-21:09:35] [V] [TRT] Add_776 [Add] inputs: [1142 -> (-1, -1, 1)[FLOAT]], [1143 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1143 for ONNX node: 1143\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_776 for ONNX node: Add_776\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1144 for ONNX tensor: 1144\n", "[01/17/2022-21:09:35] [V] [TRT] Add_776 [Add] outputs: [1144 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_777 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1144\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_777 [Sqrt] inputs: [1144 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_777 for ONNX node: Sqrt_777\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1145 for ONNX tensor: 1145\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_777 [Sqrt] outputs: [1145 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_778 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1139\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1145\n", "[01/17/2022-21:09:35] [V] [TRT] Div_778 [Div] inputs: [1139 -> (-1, -1, 768)[FLOAT]], [1145 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_778 for ONNX node: Div_778\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1146 for ONNX tensor: 1146\n", "[01/17/2022-21:09:35] [V] [TRT] Div_778 [Div] outputs: [1146 -> (-1, -1, 768)[FLOAT]], 
\n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_779 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1146\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.7.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_779 [Mul] inputs: [1146 -> (-1, -1, 768)[FLOAT]], [encoder.layer.7.attention.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.7.attention.output.LayerNorm.weight for ONNX node: encoder.layer.7.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_779 for ONNX node: Mul_779\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1147 for ONNX tensor: 1147\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_779 [Mul] outputs: [1147 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_780 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1147\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.7.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_780 [Add] inputs: [1147 -> (-1, -1, 768)[FLOAT]], [encoder.layer.7.attention.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.7.attention.output.LayerNorm.bias for ONNX node: encoder.layer.7.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_780 for ONNX node: Add_780\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1148 for ONNX tensor: 1148\n", "[01/17/2022-21:09:35] [V] [TRT] Add_780 [Add] outputs: [1148 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_781 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1148\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1741\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_781 [MatMul] inputs: [1148 -> (-1, -1, 768)[FLOAT]], [1741 -> (768, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1741 for ONNX node: 1741\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_781 for ONNX node: MatMul_781\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1150 for ONNX tensor: 1150\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_781 [MatMul] outputs: [1150 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_782 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.7.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1150\n", "[01/17/2022-21:09:35] [V] [TRT] Add_782 [Add] inputs: [encoder.layer.7.intermediate.dense.bias -> (3072)[FLOAT]], [1150 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.7.intermediate.dense.bias for ONNX node: encoder.layer.7.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_782 for ONNX node: Add_782\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1151 for ONNX tensor: 1151\n", "[01/17/2022-21:09:35] [V] [TRT] Add_782 [Add] outputs: [1151 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_783 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_783 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_783 [Constant] outputs: [1152 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_784 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1151\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for 
input: 1152\n", "[01/17/2022-21:09:35] [V] [TRT] Div_784 [Div] inputs: [1151 -> (-1, -1, 3072)[FLOAT]], [1152 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1152 for ONNX node: 1152\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_784 for ONNX node: Div_784\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1153 for ONNX tensor: 1153\n", "[01/17/2022-21:09:35] [V] [TRT] Div_784 [Div] outputs: [1153 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Erf_785 [Erf]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1153\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_785 [Erf] inputs: [1153 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Erf_785 for ONNX node: Erf_785\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1154 for ONNX tensor: 1154\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_785 [Erf] outputs: [1154 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_786 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_786 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_786 [Constant] outputs: [1155 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_787 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1154\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1155\n", "[01/17/2022-21:09:35] [V] [TRT] Add_787 [Add] inputs: [1154 -> (-1, -1, 3072)[FLOAT]], [1155 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1155 for ONNX node: 1155\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_787 for ONNX node: Add_787\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1156 for ONNX tensor: 1156\n", "[01/17/2022-21:09:35] [V] [TRT] Add_787 [Add] outputs: [1156 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_788 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1151\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1156\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_788 [Mul] inputs: [1151 -> (-1, -1, 3072)[FLOAT]], [1156 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_788 for ONNX node: Mul_788\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1157 for ONNX tensor: 1157\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_788 [Mul] outputs: [1157 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_789 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_789 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_789 [Constant] outputs: [1158 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_790 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1157\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1158\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_790 [Mul] inputs: [1157 -> (-1, -1, 3072)[FLOAT]], [1158 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1158 for ONNX node: 1158\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_790 for ONNX node: Mul_790\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1159 for ONNX tensor: 1159\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_790 [Mul] outputs: [1159 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_791 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1159\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1742\n", "[01/17/2022-21:09:35] [V] [TRT] 
MatMul_791 [MatMul] inputs: [1159 -> (-1, -1, 3072)[FLOAT]], [1742 -> (3072, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1742 for ONNX node: 1742\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_791 for ONNX node: MatMul_791\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1161 for ONNX tensor: 1161\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_791 [MatMul] outputs: [1161 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_792 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.7.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1161\n", "[01/17/2022-21:09:35] [V] [TRT] Add_792 [Add] inputs: [encoder.layer.7.output.dense.bias -> (768)[FLOAT]], [1161 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.7.output.dense.bias for ONNX node: encoder.layer.7.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_792 for ONNX node: Add_792\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1162 for ONNX tensor: 1162\n", "[01/17/2022-21:09:35] [V] [TRT] Add_792 [Add] outputs: [1162 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_793 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1162\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1148\n", "[01/17/2022-21:09:35] [V] [TRT] Add_793 [Add] inputs: [1162 -> (-1, -1, 768)[FLOAT]], [1148 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_793 for ONNX node: Add_793\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1163 for ONNX tensor: 1163\n", "[01/17/2022-21:09:35] [V] [TRT] Add_793 [Add] outputs: [1163 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_794 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1163\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_794 [ReduceMean] inputs: [1163 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_794 for ONNX node: ReduceMean_794\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1164 for ONNX tensor: 1164\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_794 [ReduceMean] outputs: [1164 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_795 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1163\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1164\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_795 [Sub] inputs: [1163 -> (-1, -1, 768)[FLOAT]], [1164 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_795 for ONNX node: Sub_795\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1165 for ONNX tensor: 1165\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_795 [Sub] outputs: [1165 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_796 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_796 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_796 [Constant] outputs: [1166 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_797 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1165\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1166\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_797 [Pow] inputs: [1165 -> (-1, -1, 768)[FLOAT]], [1166 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1166 for ONNX node: 1166\n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_797 for ONNX node: Pow_797\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1167 for ONNX tensor: 1167\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_797 [Pow] outputs: [1167 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_798 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1167\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_798 [ReduceMean] inputs: [1167 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_798 for ONNX node: ReduceMean_798\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1168 for ONNX tensor: 1168\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_798 [ReduceMean] outputs: [1168 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_799 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_799 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_799 [Constant] outputs: [1169 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_800 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1168\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1169\n", "[01/17/2022-21:09:35] [V] [TRT] Add_800 [Add] inputs: [1168 -> (-1, -1, 1)[FLOAT]], [1169 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1169 for ONNX node: 1169\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_800 for ONNX node: Add_800\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1170 for ONNX tensor: 1170\n", "[01/17/2022-21:09:35] [V] [TRT] Add_800 [Add] outputs: [1170 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_801 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1170\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_801 [Sqrt] inputs: [1170 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_801 for ONNX node: Sqrt_801\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1171 for ONNX tensor: 1171\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_801 [Sqrt] outputs: [1171 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_802 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1165\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1171\n", "[01/17/2022-21:09:35] [V] [TRT] Div_802 [Div] inputs: [1165 -> (-1, -1, 768)[FLOAT]], [1171 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_802 for ONNX node: Div_802\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1172 for ONNX tensor: 1172\n", "[01/17/2022-21:09:35] [V] [TRT] Div_802 [Div] outputs: [1172 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_803 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1172\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.7.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_803 [Mul] inputs: [1172 -> (-1, -1, 768)[FLOAT]], [encoder.layer.7.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.7.output.LayerNorm.weight for ONNX node: encoder.layer.7.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_803 for ONNX node: Mul_803\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1173 for ONNX tensor: 1173\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_803 [Mul] outputs: [1173 -> (-1, -1, 768)[FLOAT]], \n", 
"[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_804 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1173\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.7.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_804 [Add] inputs: [1173 -> (-1, -1, 768)[FLOAT]], [encoder.layer.7.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.7.output.LayerNorm.bias for ONNX node: encoder.layer.7.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_804 for ONNX node: Add_804\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1174 for ONNX tensor: 1174\n", "[01/17/2022-21:09:35] [V] [TRT] Add_804 [Add] outputs: [1174 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_805 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1174\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1743\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_805 [MatMul] inputs: [1174 -> (-1, -1, 768)[FLOAT]], [1743 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1743 for ONNX node: 1743\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_805 for ONNX node: MatMul_805\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1176 for ONNX tensor: 1176\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_805 [MatMul] outputs: [1176 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_806 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.8.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1176\n", "[01/17/2022-21:09:35] [V] [TRT] Add_806 [Add] inputs: [encoder.layer.8.attention.self.query.bias -> (768)[FLOAT]], [1176 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.8.attention.self.query.bias for ONNX node: encoder.layer.8.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_806 for ONNX node: Add_806\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1177 for ONNX tensor: 1177\n", "[01/17/2022-21:09:35] [V] [TRT] Add_806 [Add] outputs: [1177 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_807 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1174\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1744\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_807 [MatMul] inputs: [1174 -> (-1, -1, 768)[FLOAT]], [1744 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1744 for ONNX node: 1744\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_807 for ONNX node: MatMul_807\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1179 for ONNX tensor: 1179\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_807 [MatMul] outputs: [1179 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_808 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.8.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1179\n", "[01/17/2022-21:09:35] [V] [TRT] Add_808 [Add] inputs: [encoder.layer.8.attention.self.key.bias -> (768)[FLOAT]], [1179 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.8.attention.self.key.bias for ONNX node: encoder.layer.8.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_808 for ONNX node: 
Add_808\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1180 for ONNX tensor: 1180\n", "[01/17/2022-21:09:35] [V] [TRT] Add_808 [Add] outputs: [1180 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_809 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1180\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_809 [Shape] inputs: [1180 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_809 for ONNX node: Shape_809\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1181 for ONNX tensor: 1181\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_809 [Shape] outputs: [1181 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_810 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_810 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_810 [Constant] outputs: [1182 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_811 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1181\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1182\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_811 [Gather] inputs: [1181 -> (3)[INT32]], [1182 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1182 for ONNX node: 1182\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_811 for ONNX node: Gather_811\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1183 for ONNX tensor: 1183\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_811 [Gather] outputs: [1183 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_812 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1180\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_812 [Shape] inputs: [1180 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_812 for ONNX node: Shape_812\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1184 for ONNX tensor: 1184\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_812 [Shape] outputs: [1184 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_813 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_813 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_813 [Constant] outputs: [1185 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_814 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1184\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1185\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_814 [Gather] inputs: [1184 -> (3)[INT32]], [1185 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1185 for ONNX node: 1185\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_814 for ONNX node: Gather_814\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1186 for ONNX tensor: 1186\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_814 [Gather] outputs: [1186 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_815 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1183\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_815 [Unsqueeze] inputs: [1183 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_815 for ONNX node: Unsqueeze_815\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1189 for ONNX tensor: 
1189\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_815 [Unsqueeze] outputs: [1189 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_816 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1186\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_816 [Unsqueeze] inputs: [1186 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_816 for ONNX node: Unsqueeze_816\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1190 for ONNX tensor: 1190\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_816 [Unsqueeze] outputs: [1190 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_817 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1189\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1190\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1745\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1746\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_817 [Concat] inputs: [1189 -> (1)[INT32]], [1190 -> (1)[INT32]], [1745 -> (1)[INT32]], [1746 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1745 for ONNX node: 1745\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1746 for ONNX node: 1746\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_817 for ONNX node: Concat_817\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1193 for ONNX tensor: 1193\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_817 [Concat] outputs: [1193 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_818 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1180\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1193\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_818 [Reshape] inputs: [1180 -> (-1, -1, 768)[FLOAT]], [1193 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_818 for ONNX node: Reshape_818\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1194 for ONNX tensor: 1194\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_818 [Reshape] outputs: [1194 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_819 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1174\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1747\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_819 [MatMul] inputs: [1174 -> (-1, -1, 768)[FLOAT]], [1747 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1747 for ONNX node: 1747\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_819 for ONNX node: MatMul_819\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1196 for ONNX tensor: 1196\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_819 [MatMul] outputs: [1196 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_820 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.8.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1196\n", "[01/17/2022-21:09:35] [V] [TRT] Add_820 [Add] inputs: [encoder.layer.8.attention.self.value.bias -> (768)[FLOAT]], [1196 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.8.attention.self.value.bias for ONNX node: encoder.layer.8.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_820 for ONNX node: Add_820\n", "[01/17/2022-21:09:35] [V] [TRT] Registering 
tensor: 1197 for ONNX tensor: 1197\n", "[01/17/2022-21:09:35] [V] [TRT] Add_820 [Add] outputs: [1197 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_821 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1197\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_821 [Shape] inputs: [1197 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_821 for ONNX node: Shape_821\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1198 for ONNX tensor: 1198\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_821 [Shape] outputs: [1198 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_822 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_822 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_822 [Constant] outputs: [1199 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_823 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1198\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1199\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_823 [Gather] inputs: [1198 -> (3)[INT32]], [1199 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1199 for ONNX node: 1199\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_823 for ONNX node: Gather_823\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1200 for ONNX tensor: 1200\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_823 [Gather] outputs: [1200 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_824 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1197\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_824 [Shape] inputs: [1197 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_824 for ONNX node: Shape_824\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1201 for ONNX tensor: 1201\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_824 [Shape] outputs: [1201 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_825 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_825 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_825 [Constant] outputs: [1202 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_826 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1201\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1202\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_826 [Gather] inputs: [1201 -> (3)[INT32]], [1202 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1202 for ONNX node: 1202\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_826 for ONNX node: Gather_826\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1203 for ONNX tensor: 1203\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_826 [Gather] outputs: [1203 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_827 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1200\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_827 [Unsqueeze] inputs: [1200 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_827 for ONNX node: Unsqueeze_827\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1206 for ONNX tensor: 1206\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_827 
[Unsqueeze] outputs: [1206 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_828 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1203\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_828 [Unsqueeze] inputs: [1203 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_828 for ONNX node: Unsqueeze_828\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1207 for ONNX tensor: 1207\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_828 [Unsqueeze] outputs: [1207 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_829 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1206\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1207\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1748\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1749\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_829 [Concat] inputs: [1206 -> (1)[INT32]], [1207 -> (1)[INT32]], [1748 -> (1)[INT32]], [1749 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1748 for ONNX node: 1748\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1749 for ONNX node: 1749\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_829 for ONNX node: Concat_829\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1210 for ONNX tensor: 1210\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_829 [Concat] outputs: [1210 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_830 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1197\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1210\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_830 [Reshape] inputs: [1197 -> (-1, -1, 768)[FLOAT]], [1210 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_830 for ONNX node: Reshape_830\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1211 for ONNX tensor: 1211\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_830 [Reshape] outputs: [1211 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_831 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1211\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_831 [Transpose] inputs: [1211 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_831 for ONNX node: Transpose_831\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1212 for ONNX tensor: 1212\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_831 [Transpose] outputs: [1212 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_832 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1177\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_832 [Shape] inputs: [1177 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_832 for ONNX node: Shape_832\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1213 for ONNX tensor: 1213\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_832 [Shape] outputs: [1213 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_833 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_833 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_833 [Constant] outputs: [1214 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_834 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1213\n", 
"[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1214\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_834 [Gather] inputs: [1213 -> (3)[INT32]], [1214 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1214 for ONNX node: 1214\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_834 for ONNX node: Gather_834\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1215 for ONNX tensor: 1215\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_834 [Gather] outputs: [1215 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_835 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1177\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_835 [Shape] inputs: [1177 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_835 for ONNX node: Shape_835\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1216 for ONNX tensor: 1216\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_835 [Shape] outputs: [1216 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_836 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_836 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_836 [Constant] outputs: [1217 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_837 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1216\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1217\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_837 [Gather] inputs: [1216 -> (3)[INT32]], [1217 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1217 for ONNX node: 1217\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_837 for ONNX node: Gather_837\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1218 for ONNX tensor: 1218\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_837 [Gather] outputs: [1218 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_838 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1215\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_838 [Unsqueeze] inputs: [1215 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_838 for ONNX node: Unsqueeze_838\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1221 for ONNX tensor: 1221\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_838 [Unsqueeze] outputs: [1221 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_839 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1218\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_839 [Unsqueeze] inputs: [1218 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_839 for ONNX node: Unsqueeze_839\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1222 for ONNX tensor: 1222\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_839 [Unsqueeze] outputs: [1222 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_840 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1221\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1222\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1750\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1751\n", "[01/17/2022-21:09:35] [V] [TRT] 
Concat_840 [Concat] inputs: [1221 -> (1)[INT32]], [1222 -> (1)[INT32]], [1750 -> (1)[INT32]], [1751 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1750 for ONNX node: 1750\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1751 for ONNX node: 1751\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_840 for ONNX node: Concat_840\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1225 for ONNX tensor: 1225\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_840 [Concat] outputs: [1225 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_841 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1177\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1225\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_841 [Reshape] inputs: [1177 -> (-1, -1, 768)[FLOAT]], [1225 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_841 for ONNX node: Reshape_841\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1226 for ONNX tensor: 1226\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_841 [Reshape] outputs: [1226 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_842 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1226\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_842 [Transpose] inputs: [1226 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_842 for ONNX node: Transpose_842\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1227 for ONNX tensor: 1227\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_842 [Transpose] outputs: [1227 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_843 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1194\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_843 [Transpose] inputs: [1194 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_843 for ONNX node: Transpose_843\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1228 for ONNX tensor: 1228\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_843 [Transpose] outputs: [1228 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_844 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1227\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1228\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_844 [MatMul] inputs: [1227 -> (-1, 12, -1, 64)[FLOAT]], [1228 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_844 for ONNX node: MatMul_844\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1229 for ONNX tensor: 1229\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_844 [MatMul] outputs: [1229 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_845 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_845 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_845 [Constant] outputs: [1230 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_846 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1229\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1230\n", "[01/17/2022-21:09:35] [V] [TRT] Div_846 [Div] inputs: [1229 -> (-1, 12, -1, -1)[FLOAT]], [1230 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1230 for ONNX node: 1230\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_846 for ONNX node: Div_846\n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1231 for ONNX tensor: 1231\n", "[01/17/2022-21:09:35] [V] [TRT] Div_846 [Div] outputs: [1231 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_847 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1231\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 234\n", "[01/17/2022-21:09:35] [V] [TRT] Add_847 [Add] inputs: [1231 -> (-1, 12, -1, -1)[FLOAT]], [234 -> (-1, 1, 1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_847 for ONNX node: Add_847\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1232 for ONNX tensor: 1232\n", "[01/17/2022-21:09:35] [V] [TRT] Add_847 [Add] outputs: [1232 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Softmax_848 [Softmax]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1232\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_848 [Softmax] inputs: [1232 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Softmax_848 for ONNX node: Softmax_848\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1233 for ONNX tensor: 1233\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_848 [Softmax] outputs: [1233 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_849 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1233\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1212\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_849 [MatMul] inputs: [1233 -> (-1, 12, -1, -1)[FLOAT]], [1212 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_849 for ONNX node: MatMul_849\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1234 for ONNX tensor: 1234\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_849 [MatMul] outputs: [1234 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_850 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1234\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_850 [Transpose] inputs: [1234 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_850 for ONNX node: Transpose_850\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1235 for ONNX tensor: 1235\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_850 [Transpose] outputs: [1235 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_851 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1235\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_851 [Shape] inputs: [1235 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_851 for ONNX node: Shape_851\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1236 for ONNX tensor: 1236\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_851 [Shape] outputs: [1236 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_852 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_852 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_852 [Constant] outputs: [1237 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_853 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1236\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1237\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_853 [Gather] inputs: [1236 -> (4)[INT32]], [1237 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1237 for ONNX node: 1237\n", 
"[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_853 for ONNX node: Gather_853\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1238 for ONNX tensor: 1238\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_853 [Gather] outputs: [1238 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_854 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1235\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_854 [Shape] inputs: [1235 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_854 for ONNX node: Shape_854\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1239 for ONNX tensor: 1239\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_854 [Shape] outputs: [1239 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_855 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_855 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_855 [Constant] outputs: [1240 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_856 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1239\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1240\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_856 [Gather] inputs: [1239 -> (4)[INT32]], [1240 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1240 for ONNX node: 1240\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_856 for ONNX node: Gather_856\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1241 for ONNX tensor: 1241\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_856 [Gather] outputs: [1241 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_857 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1238\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_857 [Unsqueeze] inputs: [1238 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_857 for ONNX node: Unsqueeze_857\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1243 for ONNX tensor: 1243\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_857 [Unsqueeze] outputs: [1243 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_858 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1241\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_858 [Unsqueeze] inputs: [1241 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_858 for ONNX node: Unsqueeze_858\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1244 for ONNX tensor: 1244\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_858 [Unsqueeze] outputs: [1244 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_859 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1243\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1244\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1752\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_859 [Concat] inputs: [1243 -> (1)[INT32]], [1244 -> (1)[INT32]], [1752 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1752 for ONNX node: 1752\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_859 for ONNX node: Concat_859\n", "[01/17/2022-21:09:35] [V] [TRT] Registering 
tensor: 1246 for ONNX tensor: 1246\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_859 [Concat] outputs: [1246 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_860 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1235\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1246\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_860 [Reshape] inputs: [1235 -> (-1, -1, 12, 64)[FLOAT]], [1246 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_860 for ONNX node: Reshape_860\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1247 for ONNX tensor: 1247\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_860 [Reshape] outputs: [1247 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_861 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1247\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1753\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_861 [MatMul] inputs: [1247 -> (-1, -1, 768)[FLOAT]], [1753 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1753 for ONNX node: 1753\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_861 for ONNX node: MatMul_861\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1249 for ONNX tensor: 1249\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_861 [MatMul] outputs: [1249 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_862 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.8.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1249\n", "[01/17/2022-21:09:35] [V] [TRT] Add_862 [Add] inputs: [encoder.layer.8.attention.output.dense.bias -> (768)[FLOAT]], [1249 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.8.attention.output.dense.bias for ONNX node: encoder.layer.8.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_862 for ONNX node: Add_862\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1250 for ONNX tensor: 1250\n", "[01/17/2022-21:09:35] [V] [TRT] Add_862 [Add] outputs: [1250 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_863 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1250\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1174\n", "[01/17/2022-21:09:35] [V] [TRT] Add_863 [Add] inputs: [1250 -> (-1, -1, 768)[FLOAT]], [1174 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_863 for ONNX node: Add_863\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1251 for ONNX tensor: 1251\n", "[01/17/2022-21:09:35] [V] [TRT] Add_863 [Add] outputs: [1251 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_864 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1251\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_864 [ReduceMean] inputs: [1251 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_864 for ONNX node: ReduceMean_864\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1252 for ONNX tensor: 1252\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_864 [ReduceMean] outputs: [1252 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_865 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1251\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1252\n", "[01/17/2022-21:09:35] 
[V] [TRT] Sub_865 [Sub] inputs: [1251 -> (-1, -1, 768)[FLOAT]], [1252 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_865 for ONNX node: Sub_865\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1253 for ONNX tensor: 1253\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_865 [Sub] outputs: [1253 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_866 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_866 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_866 [Constant] outputs: [1254 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_867 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1253\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1254\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_867 [Pow] inputs: [1253 -> (-1, -1, 768)[FLOAT]], [1254 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1254 for ONNX node: 1254\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_867 for ONNX node: Pow_867\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1255 for ONNX tensor: 1255\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_867 [Pow] outputs: [1255 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_868 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1255\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_868 [ReduceMean] inputs: [1255 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_868 for ONNX node: ReduceMean_868\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1256 for ONNX tensor: 1256\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_868 [ReduceMean] outputs: [1256 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_869 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_869 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_869 [Constant] outputs: [1257 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_870 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1256\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1257\n", "[01/17/2022-21:09:35] [V] [TRT] Add_870 [Add] inputs: [1256 -> (-1, -1, 1)[FLOAT]], [1257 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1257 for ONNX node: 1257\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_870 for ONNX node: Add_870\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1258 for ONNX tensor: 1258\n", "[01/17/2022-21:09:35] [V] [TRT] Add_870 [Add] outputs: [1258 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_871 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1258\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_871 [Sqrt] inputs: [1258 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_871 for ONNX node: Sqrt_871\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1259 for ONNX tensor: 1259\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_871 [Sqrt] outputs: [1259 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_872 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1253\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1259\n", "[01/17/2022-21:09:35] [V] [TRT] Div_872 [Div] inputs: [1253 -> (-1, -1, 768)[FLOAT]], [1259 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_872 for ONNX node: 
Div_872\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1260 for ONNX tensor: 1260\n", "[01/17/2022-21:09:35] [V] [TRT] Div_872 [Div] outputs: [1260 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_873 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1260\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.8.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_873 [Mul] inputs: [1260 -> (-1, -1, 768)[FLOAT]], [encoder.layer.8.attention.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.8.attention.output.LayerNorm.weight for ONNX node: encoder.layer.8.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_873 for ONNX node: Mul_873\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1261 for ONNX tensor: 1261\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_873 [Mul] outputs: [1261 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_874 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1261\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.8.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_874 [Add] inputs: [1261 -> (-1, -1, 768)[FLOAT]], [encoder.layer.8.attention.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.8.attention.output.LayerNorm.bias for ONNX node: encoder.layer.8.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_874 for ONNX node: Add_874\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1262 for ONNX tensor: 1262\n", "[01/17/2022-21:09:35] [V] [TRT] Add_874 [Add] outputs: [1262 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_875 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1262\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1754\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_875 [MatMul] inputs: [1262 -> (-1, -1, 768)[FLOAT]], [1754 -> (768, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1754 for ONNX node: 1754\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_875 for ONNX node: MatMul_875\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1264 for ONNX tensor: 1264\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_875 [MatMul] outputs: [1264 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_876 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.8.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1264\n", "[01/17/2022-21:09:35] [V] [TRT] Add_876 [Add] inputs: [encoder.layer.8.intermediate.dense.bias -> (3072)[FLOAT]], [1264 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.8.intermediate.dense.bias for ONNX node: encoder.layer.8.intermediate.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_876 for ONNX node: Add_876\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1265 for ONNX tensor: 1265\n", "[01/17/2022-21:09:35] [V] [TRT] Add_876 [Add] outputs: [1265 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_877 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_877 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_877 [Constant] outputs: [1266 -> ()[FLOAT]], 
\n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_878 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1265\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1266\n", "[01/17/2022-21:09:35] [V] [TRT] Div_878 [Div] inputs: [1265 -> (-1, -1, 3072)[FLOAT]], [1266 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1266 for ONNX node: 1266\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_878 for ONNX node: Div_878\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1267 for ONNX tensor: 1267\n", "[01/17/2022-21:09:35] [V] [TRT] Div_878 [Div] outputs: [1267 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Erf_879 [Erf]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1267\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_879 [Erf] inputs: [1267 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Erf_879 for ONNX node: Erf_879\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1268 for ONNX tensor: 1268\n", "[01/17/2022-21:09:35] [V] [TRT] Erf_879 [Erf] outputs: [1268 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_880 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_880 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_880 [Constant] outputs: [1269 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_881 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1268\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1269\n", "[01/17/2022-21:09:35] [V] [TRT] Add_881 [Add] inputs: [1268 -> (-1, -1, 3072)[FLOAT]], [1269 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1269 for ONNX node: 1269\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_881 for ONNX node: Add_881\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1270 for ONNX tensor: 1270\n", "[01/17/2022-21:09:35] [V] [TRT] Add_881 [Add] outputs: [1270 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_882 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1265\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1270\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_882 [Mul] inputs: [1265 -> (-1, -1, 3072)[FLOAT]], [1270 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_882 for ONNX node: Mul_882\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1271 for ONNX tensor: 1271\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_882 [Mul] outputs: [1271 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_883 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_883 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_883 [Constant] outputs: [1272 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_884 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1271\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1272\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_884 [Mul] inputs: [1271 -> (-1, -1, 3072)[FLOAT]], [1272 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1272 for ONNX node: 1272\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_884 for ONNX node: Mul_884\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1273 for ONNX tensor: 1273\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_884 [Mul] outputs: [1273 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: 
MatMul_885 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1273\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1755\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_885 [MatMul] inputs: [1273 -> (-1, -1, 3072)[FLOAT]], [1755 -> (3072, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1755 for ONNX node: 1755\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_885 for ONNX node: MatMul_885\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1275 for ONNX tensor: 1275\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_885 [MatMul] outputs: [1275 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_886 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.8.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1275\n", "[01/17/2022-21:09:35] [V] [TRT] Add_886 [Add] inputs: [encoder.layer.8.output.dense.bias -> (768)[FLOAT]], [1275 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.8.output.dense.bias for ONNX node: encoder.layer.8.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_886 for ONNX node: Add_886\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1276 for ONNX tensor: 1276\n", "[01/17/2022-21:09:35] [V] [TRT] Add_886 [Add] outputs: [1276 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_887 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1276\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1262\n", "[01/17/2022-21:09:35] [V] [TRT] Add_887 [Add] inputs: [1276 -> (-1, -1, 768)[FLOAT]], [1262 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_887 for ONNX node: Add_887\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1277 for ONNX tensor: 1277\n", "[01/17/2022-21:09:35] [V] [TRT] Add_887 [Add] outputs: [1277 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_888 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1277\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_888 [ReduceMean] inputs: [1277 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_888 for ONNX node: ReduceMean_888\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1278 for ONNX tensor: 1278\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_888 [ReduceMean] outputs: [1278 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sub_889 [Sub]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1277\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1278\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_889 [Sub] inputs: [1277 -> (-1, -1, 768)[FLOAT]], [1278 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sub_889 for ONNX node: Sub_889\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1279 for ONNX tensor: 1279\n", "[01/17/2022-21:09:35] [V] [TRT] Sub_889 [Sub] outputs: [1279 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_890 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_890 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_890 [Constant] outputs: [1280 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Pow_891 [Pow]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1279\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1280\n", "[01/17/2022-21:09:35] [V] 
[TRT] Pow_891 [Pow] inputs: [1279 -> (-1, -1, 768)[FLOAT]], [1280 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1280 for ONNX node: 1280\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Pow_891 for ONNX node: Pow_891\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1281 for ONNX tensor: 1281\n", "[01/17/2022-21:09:35] [V] [TRT] Pow_891 [Pow] outputs: [1281 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_892 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1281\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_892 [ReduceMean] inputs: [1281 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_892 for ONNX node: ReduceMean_892\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1282 for ONNX tensor: 1282\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_892 [ReduceMean] outputs: [1282 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_893 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_893 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_893 [Constant] outputs: [1283 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_894 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1282\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1283\n", "[01/17/2022-21:09:35] [V] [TRT] Add_894 [Add] inputs: [1282 -> (-1, -1, 1)[FLOAT]], [1283 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1283 for ONNX node: 1283\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_894 for ONNX node: Add_894\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1284 for ONNX tensor: 1284\n", "[01/17/2022-21:09:35] [V] [TRT] Add_894 [Add] outputs: [1284 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Sqrt_895 [Sqrt]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1284\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_895 [Sqrt] inputs: [1284 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Sqrt_895 for ONNX node: Sqrt_895\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1285 for ONNX tensor: 1285\n", "[01/17/2022-21:09:35] [V] [TRT] Sqrt_895 [Sqrt] outputs: [1285 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_896 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1279\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1285\n", "[01/17/2022-21:09:35] [V] [TRT] Div_896 [Div] inputs: [1279 -> (-1, -1, 768)[FLOAT]], [1285 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_896 for ONNX node: Div_896\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1286 for ONNX tensor: 1286\n", "[01/17/2022-21:09:35] [V] [TRT] Div_896 [Div] outputs: [1286 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Mul_897 [Mul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1286\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.8.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_897 [Mul] inputs: [1286 -> (-1, -1, 768)[FLOAT]], [encoder.layer.8.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.8.output.LayerNorm.weight for ONNX node: encoder.layer.8.output.LayerNorm.weight\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Mul_897 for ONNX node: Mul_897\n", "[01/17/2022-21:09:35] 
[V] [TRT] Registering tensor: 1287 for ONNX tensor: 1287\n", "[01/17/2022-21:09:35] [V] [TRT] Mul_897 [Mul] outputs: [1287 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_898 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1287\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.8.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Add_898 [Add] inputs: [1287 -> (-1, -1, 768)[FLOAT]], [encoder.layer.8.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.8.output.LayerNorm.bias for ONNX node: encoder.layer.8.output.LayerNorm.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_898 for ONNX node: Add_898\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1288 for ONNX tensor: 1288\n", "[01/17/2022-21:09:35] [V] [TRT] Add_898 [Add] outputs: [1288 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_899 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1288\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1756\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_899 [MatMul] inputs: [1288 -> (-1, -1, 768)[FLOAT]], [1756 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1756 for ONNX node: 1756\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_899 for ONNX node: MatMul_899\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1290 for ONNX tensor: 1290\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_899 [MatMul] outputs: [1290 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_900 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.9.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1290\n", "[01/17/2022-21:09:35] [V] [TRT] Add_900 [Add] inputs: [encoder.layer.9.attention.self.query.bias -> (768)[FLOAT]], [1290 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.9.attention.self.query.bias for ONNX node: encoder.layer.9.attention.self.query.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_900 for ONNX node: Add_900\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1291 for ONNX tensor: 1291\n", "[01/17/2022-21:09:35] [V] [TRT] Add_900 [Add] outputs: [1291 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_901 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1288\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1757\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_901 [MatMul] inputs: [1288 -> (-1, -1, 768)[FLOAT]], [1757 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1757 for ONNX node: 1757\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_901 for ONNX node: MatMul_901\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1293 for ONNX tensor: 1293\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_901 [MatMul] outputs: [1293 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_902 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.9.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1293\n", "[01/17/2022-21:09:35] [V] [TRT] Add_902 [Add] inputs: [encoder.layer.9.attention.self.key.bias -> (768)[FLOAT]], [1293 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 
encoder.layer.9.attention.self.key.bias for ONNX node: encoder.layer.9.attention.self.key.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_902 for ONNX node: Add_902\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1294 for ONNX tensor: 1294\n", "[01/17/2022-21:09:35] [V] [TRT] Add_902 [Add] outputs: [1294 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_903 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1294\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_903 [Shape] inputs: [1294 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_903 for ONNX node: Shape_903\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1295 for ONNX tensor: 1295\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_903 [Shape] outputs: [1295 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_904 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_904 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_904 [Constant] outputs: [1296 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_905 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1295\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1296\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_905 [Gather] inputs: [1295 -> (3)[INT32]], [1296 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1296 for ONNX node: 1296\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_905 for ONNX node: Gather_905\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1297 for ONNX tensor: 1297\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_905 [Gather] outputs: [1297 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_906 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1294\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_906 [Shape] inputs: [1294 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_906 for ONNX node: Shape_906\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1298 for ONNX tensor: 1298\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_906 [Shape] outputs: [1298 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_907 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_907 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_907 [Constant] outputs: [1299 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_908 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1298\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1299\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_908 [Gather] inputs: [1298 -> (3)[INT32]], [1299 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1299 for ONNX node: 1299\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_908 for ONNX node: Gather_908\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1300 for ONNX tensor: 1300\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_908 [Gather] outputs: [1300 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_909 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1297\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_909 [Unsqueeze] inputs: [1297 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_909 for ONNX node: Unsqueeze_909\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1303 for ONNX tensor: 1303\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_909 [Unsqueeze] outputs: [1303 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_910 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1300\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_910 [Unsqueeze] inputs: [1300 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_910 for ONNX node: Unsqueeze_910\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1304 for ONNX tensor: 1304\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_910 [Unsqueeze] outputs: [1304 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_911 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1303\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1304\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1758\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1759\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_911 [Concat] inputs: [1303 -> (1)[INT32]], [1304 -> (1)[INT32]], [1758 -> (1)[INT32]], [1759 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1758 for ONNX node: 1758\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1759 for ONNX node: 1759\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_911 for ONNX node: Concat_911\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1307 for ONNX tensor: 1307\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_911 [Concat] outputs: [1307 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_912 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1294\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1307\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_912 [Reshape] inputs: [1294 -> (-1, -1, 768)[FLOAT]], [1307 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_912 for ONNX node: Reshape_912\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1308 for ONNX tensor: 1308\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_912 [Reshape] outputs: [1308 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_913 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1288\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1760\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_913 [MatMul] inputs: [1288 -> (-1, -1, 768)[FLOAT]], [1760 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1760 for ONNX node: 1760\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_913 for ONNX node: MatMul_913\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1310 for ONNX tensor: 1310\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_913 [MatMul] outputs: [1310 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_914 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.9.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1310\n", "[01/17/2022-21:09:35] [V] [TRT] Add_914 [Add] inputs: [encoder.layer.9.attention.self.value.bias -> (768)[FLOAT]], [1310 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.9.attention.self.value.bias for ONNX node: 
encoder.layer.9.attention.self.value.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_914 for ONNX node: Add_914\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1311 for ONNX tensor: 1311\n", "[01/17/2022-21:09:35] [V] [TRT] Add_914 [Add] outputs: [1311 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_915 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1311\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_915 [Shape] inputs: [1311 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_915 for ONNX node: Shape_915\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1312 for ONNX tensor: 1312\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_915 [Shape] outputs: [1312 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_916 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_916 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_916 [Constant] outputs: [1313 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_917 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1312\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1313\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_917 [Gather] inputs: [1312 -> (3)[INT32]], [1313 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1313 for ONNX node: 1313\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_917 for ONNX node: Gather_917\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1314 for ONNX tensor: 1314\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_917 [Gather] outputs: [1314 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_918 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1311\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_918 [Shape] inputs: [1311 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_918 for ONNX node: Shape_918\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1315 for ONNX tensor: 1315\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_918 [Shape] outputs: [1315 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_919 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_919 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_919 [Constant] outputs: [1316 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_920 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1315\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1316\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_920 [Gather] inputs: [1315 -> (3)[INT32]], [1316 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1316 for ONNX node: 1316\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_920 for ONNX node: Gather_920\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1317 for ONNX tensor: 1317\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_920 [Gather] outputs: [1317 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_921 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1314\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_921 [Unsqueeze] inputs: [1314 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 
Unsqueeze_921 for ONNX node: Unsqueeze_921\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1320 for ONNX tensor: 1320\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_921 [Unsqueeze] outputs: [1320 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_922 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1317\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_922 [Unsqueeze] inputs: [1317 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_922 for ONNX node: Unsqueeze_922\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1321 for ONNX tensor: 1321\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_922 [Unsqueeze] outputs: [1321 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_923 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1320\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1321\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1761\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1762\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_923 [Concat] inputs: [1320 -> (1)[INT32]], [1321 -> (1)[INT32]], [1761 -> (1)[INT32]], [1762 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1761 for ONNX node: 1761\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1762 for ONNX node: 1762\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_923 for ONNX node: Concat_923\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1324 for ONNX tensor: 1324\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_923 [Concat] outputs: [1324 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_924 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1311\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1324\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_924 [Reshape] inputs: [1311 -> (-1, -1, 768)[FLOAT]], [1324 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_924 for ONNX node: Reshape_924\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1325 for ONNX tensor: 1325\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_924 [Reshape] outputs: [1325 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_925 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1325\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_925 [Transpose] inputs: [1325 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_925 for ONNX node: Transpose_925\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1326 for ONNX tensor: 1326\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_925 [Transpose] outputs: [1326 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_926 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1291\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_926 [Shape] inputs: [1291 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_926 for ONNX node: Shape_926\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1327 for ONNX tensor: 1327\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_926 [Shape] outputs: [1327 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_927 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_927 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_927 [Constant] 
outputs: [1328 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_928 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1327\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1328\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_928 [Gather] inputs: [1327 -> (3)[INT32]], [1328 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1328 for ONNX node: 1328\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_928 for ONNX node: Gather_928\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1329 for ONNX tensor: 1329\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_928 [Gather] outputs: [1329 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_929 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1291\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_929 [Shape] inputs: [1291 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_929 for ONNX node: Shape_929\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1330 for ONNX tensor: 1330\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_929 [Shape] outputs: [1330 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_930 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_930 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_930 [Constant] outputs: [1331 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_931 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1330\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1331\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_931 [Gather] inputs: [1330 -> (3)[INT32]], [1331 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1331 for ONNX node: 1331\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_931 for ONNX node: Gather_931\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1332 for ONNX tensor: 1332\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_931 [Gather] outputs: [1332 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_932 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1329\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_932 [Unsqueeze] inputs: [1329 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_932 for ONNX node: Unsqueeze_932\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1335 for ONNX tensor: 1335\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_932 [Unsqueeze] outputs: [1335 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_933 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1332\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_933 [Unsqueeze] inputs: [1332 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_933 for ONNX node: Unsqueeze_933\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1336 for ONNX tensor: 1336\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_933 [Unsqueeze] outputs: [1336 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_934 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1335\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 
1336\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1763\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1764\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_934 [Concat] inputs: [1335 -> (1)[INT32]], [1336 -> (1)[INT32]], [1763 -> (1)[INT32]], [1764 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1763 for ONNX node: 1763\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1764 for ONNX node: 1764\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_934 for ONNX node: Concat_934\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1339 for ONNX tensor: 1339\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_934 [Concat] outputs: [1339 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_935 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1291\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1339\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_935 [Reshape] inputs: [1291 -> (-1, -1, 768)[FLOAT]], [1339 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_935 for ONNX node: Reshape_935\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1340 for ONNX tensor: 1340\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_935 [Reshape] outputs: [1340 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_936 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1340\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_936 [Transpose] inputs: [1340 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_936 for ONNX node: Transpose_936\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1341 for ONNX tensor: 1341\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_936 [Transpose] outputs: [1341 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_937 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1308\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_937 [Transpose] inputs: [1308 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_937 for ONNX node: Transpose_937\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1342 for ONNX tensor: 1342\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_937 [Transpose] outputs: [1342 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_938 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1341\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1342\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_938 [MatMul] inputs: [1341 -> (-1, 12, -1, 64)[FLOAT]], [1342 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_938 for ONNX node: MatMul_938\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1343 for ONNX tensor: 1343\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_938 [MatMul] outputs: [1343 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_939 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_939 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_939 [Constant] outputs: [1344 -> ()[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Div_940 [Div]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1343\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1344\n", "[01/17/2022-21:09:35] [V] [TRT] Div_940 [Div] inputs: [1343 -> (-1, 12, -1, -1)[FLOAT]], [1344 -> ()[FLOAT]], \n", 
"[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1344 for ONNX node: 1344\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Div_940 for ONNX node: Div_940\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1345 for ONNX tensor: 1345\n", "[01/17/2022-21:09:35] [V] [TRT] Div_940 [Div] outputs: [1345 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_941 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1345\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 234\n", "[01/17/2022-21:09:35] [V] [TRT] Add_941 [Add] inputs: [1345 -> (-1, 12, -1, -1)[FLOAT]], [234 -> (-1, 1, 1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_941 for ONNX node: Add_941\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1346 for ONNX tensor: 1346\n", "[01/17/2022-21:09:35] [V] [TRT] Add_941 [Add] outputs: [1346 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Softmax_942 [Softmax]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1346\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_942 [Softmax] inputs: [1346 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Softmax_942 for ONNX node: Softmax_942\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1347 for ONNX tensor: 1347\n", "[01/17/2022-21:09:35] [V] [TRT] Softmax_942 [Softmax] outputs: [1347 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_943 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1347\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1326\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_943 [MatMul] inputs: [1347 -> (-1, 12, -1, -1)[FLOAT]], [1326 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_943 for ONNX node: MatMul_943\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1348 for ONNX tensor: 1348\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_943 [MatMul] outputs: [1348 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Transpose_944 [Transpose]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1348\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_944 [Transpose] inputs: [1348 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Transpose_944 for ONNX node: Transpose_944\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1349 for ONNX tensor: 1349\n", "[01/17/2022-21:09:35] [V] [TRT] Transpose_944 [Transpose] outputs: [1349 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_945 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1349\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_945 [Shape] inputs: [1349 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_945 for ONNX node: Shape_945\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1350 for ONNX tensor: 1350\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_945 [Shape] outputs: [1350 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_946 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_946 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_946 [Constant] outputs: [1351 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_947 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1350\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1351\n", "[01/17/2022-21:09:35] 
[V] [TRT] Gather_947 [Gather] inputs: [1350 -> (4)[INT32]], [1351 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1351 for ONNX node: 1351\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_947 for ONNX node: Gather_947\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1352 for ONNX tensor: 1352\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_947 [Gather] outputs: [1352 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Shape_948 [Shape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1349\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_948 [Shape] inputs: [1349 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Shape_948 for ONNX node: Shape_948\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1353 for ONNX tensor: 1353\n", "[01/17/2022-21:09:35] [V] [TRT] Shape_948 [Shape] outputs: [1353 -> (4)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Constant_949 [Constant]\n", "[01/17/2022-21:09:35] [V] [TRT] Constant_949 [Constant] inputs: \n", "[01/17/2022-21:09:35] [V] [TRT] Constant_949 [Constant] outputs: [1354 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Gather_950 [Gather]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1353\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1354\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_950 [Gather] inputs: [1353 -> (4)[INT32]], [1354 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1354 for ONNX node: 1354\n", "[01/17/2022-21:09:35] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Gather_950 for ONNX node: Gather_950\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1355 for ONNX tensor: 1355\n", "[01/17/2022-21:09:35] [V] [TRT] Gather_950 [Gather] outputs: [1355 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_951 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1352\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_951 [Unsqueeze] inputs: [1352 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_951 for ONNX node: Unsqueeze_951\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1357 for ONNX tensor: 1357\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_951 [Unsqueeze] outputs: [1357 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Unsqueeze_952 [Unsqueeze]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1355\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_952 [Unsqueeze] inputs: [1355 -> ()[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Unsqueeze_952 for ONNX node: Unsqueeze_952\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1358 for ONNX tensor: 1358\n", "[01/17/2022-21:09:35] [V] [TRT] Unsqueeze_952 [Unsqueeze] outputs: [1358 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Concat_953 [Concat]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1357\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1358\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1765\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_953 [Concat] inputs: [1357 -> (1)[INT32]], [1358 -> (1)[INT32]], [1765 -> (1)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 
1765 for ONNX node: 1765\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Concat_953 for ONNX node: Concat_953\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1360 for ONNX tensor: 1360\n", "[01/17/2022-21:09:35] [V] [TRT] Concat_953 [Concat] outputs: [1360 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Reshape_954 [Reshape]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1349\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1360\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_954 [Reshape] inputs: [1349 -> (-1, -1, 12, 64)[FLOAT]], [1360 -> (3)[INT32]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Reshape_954 for ONNX node: Reshape_954\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1361 for ONNX tensor: 1361\n", "[01/17/2022-21:09:35] [V] [TRT] Reshape_954 [Reshape] outputs: [1361 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: MatMul_955 [MatMul]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1361\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1766\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_955 [MatMul] inputs: [1361 -> (-1, -1, 768)[FLOAT]], [1766 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: 1766 for ONNX node: 1766\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: MatMul_955 for ONNX node: MatMul_955\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1363 for ONNX tensor: 1363\n", "[01/17/2022-21:09:35] [V] [TRT] MatMul_955 [MatMul] outputs: [1363 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_956 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: encoder.layer.9.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1363\n", "[01/17/2022-21:09:35] [V] [TRT] Add_956 [Add] inputs: [encoder.layer.9.attention.output.dense.bias -> (768)[FLOAT]], [1363 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: encoder.layer.9.attention.output.dense.bias for ONNX node: encoder.layer.9.attention.output.dense.bias\n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_956 for ONNX node: Add_956\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1364 for ONNX tensor: 1364\n", "[01/17/2022-21:09:35] [V] [TRT] Add_956 [Add] outputs: [1364 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: Add_957 [Add]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1364\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1288\n", "[01/17/2022-21:09:35] [V] [TRT] Add_957 [Add] inputs: [1364 -> (-1, -1, 768)[FLOAT]], [1288 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: Add_957 for ONNX node: Add_957\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1365 for ONNX tensor: 1365\n", "[01/17/2022-21:09:35] [V] [TRT] Add_957 [Add] outputs: [1365 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: ReduceMean_958 [ReduceMean]\n", "[01/17/2022-21:09:35] [V] [TRT] Searching for input: 1365\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_958 [ReduceMean] inputs: [1365 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Registering layer: ReduceMean_958 for ONNX node: ReduceMean_958\n", "[01/17/2022-21:09:35] [V] [TRT] Registering tensor: 1366 for ONNX tensor: 1366\n", "[01/17/2022-21:09:35] [V] [TRT] ReduceMean_958 [ReduceMean] outputs: [1366 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:35] [V] [TRT] Parsing node: 
Sub_959 [Sub]\n", "... (verbose TensorRT ONNX parser output for BERT encoder layers 9-11 truncated) ...\n", "[01/17/2022-21:09:36] [V] 
[TRT] MatMul_1089 [MatMul] inputs: [1516 -> (-1, -1, 768)[FLOAT]], [1783 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1783 for ONNX node: 1783\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: MatMul_1089 for ONNX node: MatMul_1089\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1521 for ONNX tensor: 1521\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1089 [MatMul] outputs: [1521 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1090 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: encoder.layer.11.attention.self.key.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1521\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1090 [Add] inputs: [encoder.layer.11.attention.self.key.bias -> (768)[FLOAT]], [1521 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: encoder.layer.11.attention.self.key.bias for ONNX node: encoder.layer.11.attention.self.key.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1090 for ONNX node: Add_1090\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1522 for ONNX tensor: 1522\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1090 [Add] outputs: [1522 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Shape_1091 [Shape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1522\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1091 [Shape] inputs: [1522 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Shape_1091 for ONNX node: Shape_1091\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1523 for ONNX tensor: 1523\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1091 [Shape] outputs: [1523 -> (3)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1092 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1092 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1092 [Constant] outputs: [1524 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Gather_1093 [Gather]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1523\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1524\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1093 [Gather] inputs: [1523 -> (3)[INT32]], [1524 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1524 for ONNX node: 1524\n", "[01/17/2022-21:09:36] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Gather_1093 for ONNX node: Gather_1093\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1525 for ONNX tensor: 1525\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1093 [Gather] outputs: [1525 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Shape_1094 [Shape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1522\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1094 [Shape] inputs: [1522 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Shape_1094 for ONNX node: Shape_1094\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1526 for ONNX tensor: 1526\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1094 [Shape] outputs: [1526 -> (3)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1095 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1095 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1095 [Constant] outputs: [1527 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Gather_1096 [Gather]\n", "[01/17/2022-21:09:36] [V] [TRT] 
Searching for input: 1526\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1527\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1096 [Gather] inputs: [1526 -> (3)[INT32]], [1527 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1527 for ONNX node: 1527\n", "[01/17/2022-21:09:36] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Gather_1096 for ONNX node: Gather_1096\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1528 for ONNX tensor: 1528\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1096 [Gather] outputs: [1528 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Unsqueeze_1097 [Unsqueeze]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1525\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1097 [Unsqueeze] inputs: [1525 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Unsqueeze_1097 for ONNX node: Unsqueeze_1097\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1531 for ONNX tensor: 1531\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1097 [Unsqueeze] outputs: [1531 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Unsqueeze_1098 [Unsqueeze]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1528\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1098 [Unsqueeze] inputs: [1528 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Unsqueeze_1098 for ONNX node: Unsqueeze_1098\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1532 for ONNX tensor: 1532\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1098 [Unsqueeze] outputs: [1532 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Concat_1099 [Concat]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1531\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1532\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1784\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1785\n", "[01/17/2022-21:09:36] [V] [TRT] Concat_1099 [Concat] inputs: [1531 -> (1)[INT32]], [1532 -> (1)[INT32]], [1784 -> (1)[INT32]], [1785 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1784 for ONNX node: 1784\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1785 for ONNX node: 1785\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Concat_1099 for ONNX node: Concat_1099\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1535 for ONNX tensor: 1535\n", "[01/17/2022-21:09:36] [V] [TRT] Concat_1099 [Concat] outputs: [1535 -> (4)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Reshape_1100 [Reshape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1522\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1535\n", "[01/17/2022-21:09:36] [V] [TRT] Reshape_1100 [Reshape] inputs: [1522 -> (-1, -1, 768)[FLOAT]], [1535 -> (4)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Reshape_1100 for ONNX node: Reshape_1100\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1536 for ONNX tensor: 1536\n", "[01/17/2022-21:09:36] [V] [TRT] Reshape_1100 [Reshape] outputs: [1536 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: MatMul_1101 [MatMul]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1516\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1786\n", "[01/17/2022-21:09:36] [V] [TRT] 
MatMul_1101 [MatMul] inputs: [1516 -> (-1, -1, 768)[FLOAT]], [1786 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1786 for ONNX node: 1786\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: MatMul_1101 for ONNX node: MatMul_1101\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1538 for ONNX tensor: 1538\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1101 [MatMul] outputs: [1538 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1102 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: encoder.layer.11.attention.self.value.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1538\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1102 [Add] inputs: [encoder.layer.11.attention.self.value.bias -> (768)[FLOAT]], [1538 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: encoder.layer.11.attention.self.value.bias for ONNX node: encoder.layer.11.attention.self.value.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1102 for ONNX node: Add_1102\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1539 for ONNX tensor: 1539\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1102 [Add] outputs: [1539 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Shape_1103 [Shape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1539\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1103 [Shape] inputs: [1539 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Shape_1103 for ONNX node: Shape_1103\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1540 for ONNX tensor: 1540\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1103 [Shape] outputs: [1540 -> (3)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1104 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1104 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1104 [Constant] outputs: [1541 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Gather_1105 [Gather]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1540\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1541\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1105 [Gather] inputs: [1540 -> (3)[INT32]], [1541 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1541 for ONNX node: 1541\n", "[01/17/2022-21:09:36] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Gather_1105 for ONNX node: Gather_1105\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1542 for ONNX tensor: 1542\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1105 [Gather] outputs: [1542 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Shape_1106 [Shape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1539\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1106 [Shape] inputs: [1539 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Shape_1106 for ONNX node: Shape_1106\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1543 for ONNX tensor: 1543\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1106 [Shape] outputs: [1543 -> (3)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1107 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1107 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1107 [Constant] outputs: [1544 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Gather_1108 [Gather]\n", "[01/17/2022-21:09:36] [V] [TRT] 
Searching for input: 1543\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1544\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1108 [Gather] inputs: [1543 -> (3)[INT32]], [1544 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1544 for ONNX node: 1544\n", "[01/17/2022-21:09:36] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Gather_1108 for ONNX node: Gather_1108\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1545 for ONNX tensor: 1545\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1108 [Gather] outputs: [1545 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Unsqueeze_1109 [Unsqueeze]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1542\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1109 [Unsqueeze] inputs: [1542 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Unsqueeze_1109 for ONNX node: Unsqueeze_1109\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1548 for ONNX tensor: 1548\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1109 [Unsqueeze] outputs: [1548 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Unsqueeze_1110 [Unsqueeze]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1545\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1110 [Unsqueeze] inputs: [1545 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Unsqueeze_1110 for ONNX node: Unsqueeze_1110\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1549 for ONNX tensor: 1549\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1110 [Unsqueeze] outputs: [1549 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Concat_1111 [Concat]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1548\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1549\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1787\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1788\n", "[01/17/2022-21:09:36] [V] [TRT] Concat_1111 [Concat] inputs: [1548 -> (1)[INT32]], [1549 -> (1)[INT32]], [1787 -> (1)[INT32]], [1788 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1787 for ONNX node: 1787\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1788 for ONNX node: 1788\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Concat_1111 for ONNX node: Concat_1111\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1552 for ONNX tensor: 1552\n", "[01/17/2022-21:09:36] [V] [TRT] Concat_1111 [Concat] outputs: [1552 -> (4)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Reshape_1112 [Reshape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1539\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1552\n", "[01/17/2022-21:09:36] [V] [TRT] Reshape_1112 [Reshape] inputs: [1539 -> (-1, -1, 768)[FLOAT]], [1552 -> (4)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Reshape_1112 for ONNX node: Reshape_1112\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1553 for ONNX tensor: 1553\n", "[01/17/2022-21:09:36] [V] [TRT] Reshape_1112 [Reshape] outputs: [1553 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Transpose_1113 [Transpose]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1553\n", "[01/17/2022-21:09:36] [V] [TRT] Transpose_1113 [Transpose] inputs: [1553 -> (-1, -1, 12, 
64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Transpose_1113 for ONNX node: Transpose_1113\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1554 for ONNX tensor: 1554\n", "[01/17/2022-21:09:36] [V] [TRT] Transpose_1113 [Transpose] outputs: [1554 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Shape_1114 [Shape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1519\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1114 [Shape] inputs: [1519 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Shape_1114 for ONNX node: Shape_1114\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1555 for ONNX tensor: 1555\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1114 [Shape] outputs: [1555 -> (3)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1115 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1115 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1115 [Constant] outputs: [1556 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Gather_1116 [Gather]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1555\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1556\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1116 [Gather] inputs: [1555 -> (3)[INT32]], [1556 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1556 for ONNX node: 1556\n", "[01/17/2022-21:09:36] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Gather_1116 for ONNX node: Gather_1116\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1557 for ONNX tensor: 1557\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1116 [Gather] outputs: [1557 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Shape_1117 [Shape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1519\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1117 [Shape] inputs: [1519 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Shape_1117 for ONNX node: Shape_1117\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1558 for ONNX tensor: 1558\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1117 [Shape] outputs: [1558 -> (3)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1118 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1118 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1118 [Constant] outputs: [1559 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Gather_1119 [Gather]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1558\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1559\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1119 [Gather] inputs: [1558 -> (3)[INT32]], [1559 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1559 for ONNX node: 1559\n", "[01/17/2022-21:09:36] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Gather_1119 for ONNX node: Gather_1119\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1560 for ONNX tensor: 1560\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1119 [Gather] outputs: [1560 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Unsqueeze_1120 [Unsqueeze]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1557\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1120 [Unsqueeze] inputs: [1557 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:36] [V] 
[TRT] Registering layer: Unsqueeze_1120 for ONNX node: Unsqueeze_1120\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1563 for ONNX tensor: 1563\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1120 [Unsqueeze] outputs: [1563 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Unsqueeze_1121 [Unsqueeze]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1560\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1121 [Unsqueeze] inputs: [1560 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Unsqueeze_1121 for ONNX node: Unsqueeze_1121\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1564 for ONNX tensor: 1564\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1121 [Unsqueeze] outputs: [1564 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Concat_1122 [Concat]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1563\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1564\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1789\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1790\n", "[01/17/2022-21:09:36] [V] [TRT] Concat_1122 [Concat] inputs: [1563 -> (1)[INT32]], [1564 -> (1)[INT32]], [1789 -> (1)[INT32]], [1790 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1789 for ONNX node: 1789\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1790 for ONNX node: 1790\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Concat_1122 for ONNX node: Concat_1122\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1567 for ONNX tensor: 1567\n", "[01/17/2022-21:09:36] [V] [TRT] Concat_1122 [Concat] outputs: [1567 -> (4)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Reshape_1123 [Reshape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1519\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1567\n", "[01/17/2022-21:09:36] [V] [TRT] Reshape_1123 [Reshape] inputs: [1519 -> (-1, -1, 768)[FLOAT]], [1567 -> (4)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Reshape_1123 for ONNX node: Reshape_1123\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1568 for ONNX tensor: 1568\n", "[01/17/2022-21:09:36] [V] [TRT] Reshape_1123 [Reshape] outputs: [1568 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Transpose_1124 [Transpose]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1568\n", "[01/17/2022-21:09:36] [V] [TRT] Transpose_1124 [Transpose] inputs: [1568 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Transpose_1124 for ONNX node: Transpose_1124\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1569 for ONNX tensor: 1569\n", "[01/17/2022-21:09:36] [V] [TRT] Transpose_1124 [Transpose] outputs: [1569 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Transpose_1125 [Transpose]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1536\n", "[01/17/2022-21:09:36] [V] [TRT] Transpose_1125 [Transpose] inputs: [1536 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Transpose_1125 for ONNX node: Transpose_1125\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1570 for ONNX tensor: 1570\n", "[01/17/2022-21:09:36] [V] [TRT] Transpose_1125 [Transpose] outputs: [1570 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: MatMul_1126 [MatMul]\n", "[01/17/2022-21:09:36] [V] 
[TRT] Searching for input: 1569\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1570\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1126 [MatMul] inputs: [1569 -> (-1, 12, -1, 64)[FLOAT]], [1570 -> (-1, 12, 64, -1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: MatMul_1126 for ONNX node: MatMul_1126\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1571 for ONNX tensor: 1571\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1126 [MatMul] outputs: [1571 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1127 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1127 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1127 [Constant] outputs: [1572 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Div_1128 [Div]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1571\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1572\n", "[01/17/2022-21:09:36] [V] [TRT] Div_1128 [Div] inputs: [1571 -> (-1, 12, -1, -1)[FLOAT]], [1572 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1572 for ONNX node: 1572\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Div_1128 for ONNX node: Div_1128\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1573 for ONNX tensor: 1573\n", "[01/17/2022-21:09:36] [V] [TRT] Div_1128 [Div] outputs: [1573 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1129 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1573\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 234\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1129 [Add] inputs: [1573 -> (-1, 12, -1, -1)[FLOAT]], [234 -> (-1, 1, 1, -1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1129 for ONNX node: Add_1129\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1574 for ONNX tensor: 1574\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1129 [Add] outputs: [1574 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Softmax_1130 [Softmax]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1574\n", "[01/17/2022-21:09:36] [V] [TRT] Softmax_1130 [Softmax] inputs: [1574 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Softmax_1130 for ONNX node: Softmax_1130\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1575 for ONNX tensor: 1575\n", "[01/17/2022-21:09:36] [V] [TRT] Softmax_1130 [Softmax] outputs: [1575 -> (-1, 12, -1, -1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: MatMul_1131 [MatMul]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1575\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1554\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1131 [MatMul] inputs: [1575 -> (-1, 12, -1, -1)[FLOAT]], [1554 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: MatMul_1131 for ONNX node: MatMul_1131\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1576 for ONNX tensor: 1576\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1131 [MatMul] outputs: [1576 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Transpose_1132 [Transpose]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1576\n", "[01/17/2022-21:09:36] [V] [TRT] Transpose_1132 [Transpose] inputs: [1576 -> (-1, 12, -1, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Transpose_1132 for ONNX node: Transpose_1132\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 
1577 for ONNX tensor: 1577\n", "[01/17/2022-21:09:36] [V] [TRT] Transpose_1132 [Transpose] outputs: [1577 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Shape_1133 [Shape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1577\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1133 [Shape] inputs: [1577 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Shape_1133 for ONNX node: Shape_1133\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1578 for ONNX tensor: 1578\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1133 [Shape] outputs: [1578 -> (4)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1134 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1134 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1134 [Constant] outputs: [1579 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Gather_1135 [Gather]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1578\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1579\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1135 [Gather] inputs: [1578 -> (4)[INT32]], [1579 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1579 for ONNX node: 1579\n", "[01/17/2022-21:09:36] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Gather_1135 for ONNX node: Gather_1135\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1580 for ONNX tensor: 1580\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1135 [Gather] outputs: [1580 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Shape_1136 [Shape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1577\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1136 [Shape] inputs: [1577 -> (-1, -1, 12, 64)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Shape_1136 for ONNX node: Shape_1136\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1581 for ONNX tensor: 1581\n", "[01/17/2022-21:09:36] [V] [TRT] Shape_1136 [Shape] outputs: [1581 -> (4)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1137 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1137 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1137 [Constant] outputs: [1582 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Gather_1138 [Gather]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1581\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1582\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1138 [Gather] inputs: [1581 -> (4)[INT32]], [1582 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1582 for ONNX node: 1582\n", "[01/17/2022-21:09:36] [V] [TRT] Using Gather axis: 0\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Gather_1138 for ONNX node: Gather_1138\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1583 for ONNX tensor: 1583\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1138 [Gather] outputs: [1583 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Unsqueeze_1139 [Unsqueeze]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1580\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1139 [Unsqueeze] inputs: [1580 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Unsqueeze_1139 for ONNX node: Unsqueeze_1139\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1585 for ONNX tensor: 1585\n", 
"[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1139 [Unsqueeze] outputs: [1585 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Unsqueeze_1140 [Unsqueeze]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1583\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1140 [Unsqueeze] inputs: [1583 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Original shape: (), unsqueezing to: (1,)\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Unsqueeze_1140 for ONNX node: Unsqueeze_1140\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1586 for ONNX tensor: 1586\n", "[01/17/2022-21:09:36] [V] [TRT] Unsqueeze_1140 [Unsqueeze] outputs: [1586 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Concat_1141 [Concat]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1585\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1586\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1791\n", "[01/17/2022-21:09:36] [V] [TRT] Concat_1141 [Concat] inputs: [1585 -> (1)[INT32]], [1586 -> (1)[INT32]], [1791 -> (1)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1791 for ONNX node: 1791\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Concat_1141 for ONNX node: Concat_1141\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1588 for ONNX tensor: 1588\n", "[01/17/2022-21:09:36] [V] [TRT] Concat_1141 [Concat] outputs: [1588 -> (3)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Reshape_1142 [Reshape]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1577\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1588\n", "[01/17/2022-21:09:36] [V] [TRT] Reshape_1142 [Reshape] inputs: [1577 -> (-1, -1, 12, 64)[FLOAT]], [1588 -> (3)[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Reshape_1142 for ONNX node: Reshape_1142\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1589 for ONNX tensor: 1589\n", "[01/17/2022-21:09:36] [V] [TRT] Reshape_1142 [Reshape] outputs: [1589 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: MatMul_1143 [MatMul]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1589\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1792\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1143 [MatMul] inputs: [1589 -> (-1, -1, 768)[FLOAT]], [1792 -> (768, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1792 for ONNX node: 1792\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: MatMul_1143 for ONNX node: MatMul_1143\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1591 for ONNX tensor: 1591\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1143 [MatMul] outputs: [1591 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1144 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: encoder.layer.11.attention.output.dense.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1591\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1144 [Add] inputs: [encoder.layer.11.attention.output.dense.bias -> (768)[FLOAT]], [1591 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: encoder.layer.11.attention.output.dense.bias for ONNX node: encoder.layer.11.attention.output.dense.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1144 for ONNX node: Add_1144\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1592 for ONNX tensor: 1592\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1144 [Add] outputs: [1592 -> (-1, -1, 768)[FLOAT]], \n", 
"[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1145 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1592\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1516\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1145 [Add] inputs: [1592 -> (-1, -1, 768)[FLOAT]], [1516 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1145 for ONNX node: Add_1145\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1593 for ONNX tensor: 1593\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1145 [Add] outputs: [1593 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: ReduceMean_1146 [ReduceMean]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1593\n", "[01/17/2022-21:09:36] [V] [TRT] ReduceMean_1146 [ReduceMean] inputs: [1593 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: ReduceMean_1146 for ONNX node: ReduceMean_1146\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1594 for ONNX tensor: 1594\n", "[01/17/2022-21:09:36] [V] [TRT] ReduceMean_1146 [ReduceMean] outputs: [1594 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Sub_1147 [Sub]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1593\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1594\n", "[01/17/2022-21:09:36] [V] [TRT] Sub_1147 [Sub] inputs: [1593 -> (-1, -1, 768)[FLOAT]], [1594 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Sub_1147 for ONNX node: Sub_1147\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1595 for ONNX tensor: 1595\n", "[01/17/2022-21:09:36] [V] [TRT] Sub_1147 [Sub] outputs: [1595 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1148 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1148 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1148 [Constant] outputs: [1596 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Pow_1149 [Pow]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1595\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1596\n", "[01/17/2022-21:09:36] [V] [TRT] Pow_1149 [Pow] inputs: [1595 -> (-1, -1, 768)[FLOAT]], [1596 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1596 for ONNX node: 1596\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Pow_1149 for ONNX node: Pow_1149\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1597 for ONNX tensor: 1597\n", "[01/17/2022-21:09:36] [V] [TRT] Pow_1149 [Pow] outputs: [1597 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: ReduceMean_1150 [ReduceMean]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1597\n", "[01/17/2022-21:09:36] [V] [TRT] ReduceMean_1150 [ReduceMean] inputs: [1597 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: ReduceMean_1150 for ONNX node: ReduceMean_1150\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1598 for ONNX tensor: 1598\n", "[01/17/2022-21:09:36] [V] [TRT] ReduceMean_1150 [ReduceMean] outputs: [1598 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1151 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1151 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1151 [Constant] outputs: [1599 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1152 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1598\n", "[01/17/2022-21:09:36] [V] [TRT] 
Searching for input: 1599\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1152 [Add] inputs: [1598 -> (-1, -1, 1)[FLOAT]], [1599 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1599 for ONNX node: 1599\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1152 for ONNX node: Add_1152\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1600 for ONNX tensor: 1600\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1152 [Add] outputs: [1600 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Sqrt_1153 [Sqrt]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1600\n", "[01/17/2022-21:09:36] [V] [TRT] Sqrt_1153 [Sqrt] inputs: [1600 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Sqrt_1153 for ONNX node: Sqrt_1153\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1601 for ONNX tensor: 1601\n", "[01/17/2022-21:09:36] [V] [TRT] Sqrt_1153 [Sqrt] outputs: [1601 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Div_1154 [Div]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1595\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1601\n", "[01/17/2022-21:09:36] [V] [TRT] Div_1154 [Div] inputs: [1595 -> (-1, -1, 768)[FLOAT]], [1601 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Div_1154 for ONNX node: Div_1154\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1602 for ONNX tensor: 1602\n", "[01/17/2022-21:09:36] [V] [TRT] Div_1154 [Div] outputs: [1602 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Mul_1155 [Mul]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1602\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: encoder.layer.11.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:36] [V] [TRT] Mul_1155 [Mul] inputs: [1602 -> (-1, -1, 768)[FLOAT]], [encoder.layer.11.attention.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: encoder.layer.11.attention.output.LayerNorm.weight for ONNX node: encoder.layer.11.attention.output.LayerNorm.weight\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Mul_1155 for ONNX node: Mul_1155\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1603 for ONNX tensor: 1603\n", "[01/17/2022-21:09:36] [V] [TRT] Mul_1155 [Mul] outputs: [1603 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1156 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1603\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: encoder.layer.11.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1156 [Add] inputs: [1603 -> (-1, -1, 768)[FLOAT]], [encoder.layer.11.attention.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: encoder.layer.11.attention.output.LayerNorm.bias for ONNX node: encoder.layer.11.attention.output.LayerNorm.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1156 for ONNX node: Add_1156\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1604 for ONNX tensor: 1604\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1156 [Add] outputs: [1604 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: MatMul_1157 [MatMul]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1604\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1793\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1157 [MatMul] inputs: [1604 -> (-1, -1, 768)[FLOAT]], [1793 -> (768, 
3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1793 for ONNX node: 1793\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: MatMul_1157 for ONNX node: MatMul_1157\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1606 for ONNX tensor: 1606\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1157 [MatMul] outputs: [1606 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1158 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: encoder.layer.11.intermediate.dense.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1606\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1158 [Add] inputs: [encoder.layer.11.intermediate.dense.bias -> (3072)[FLOAT]], [1606 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: encoder.layer.11.intermediate.dense.bias for ONNX node: encoder.layer.11.intermediate.dense.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1158 for ONNX node: Add_1158\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1607 for ONNX tensor: 1607\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1158 [Add] outputs: [1607 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1159 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1159 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1159 [Constant] outputs: [1608 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Div_1160 [Div]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1607\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1608\n", "[01/17/2022-21:09:36] [V] [TRT] Div_1160 [Div] inputs: [1607 -> (-1, -1, 3072)[FLOAT]], [1608 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1608 for ONNX node: 1608\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Div_1160 for ONNX node: Div_1160\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1609 for ONNX tensor: 1609\n", "[01/17/2022-21:09:36] [V] [TRT] Div_1160 [Div] outputs: [1609 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Erf_1161 [Erf]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1609\n", "[01/17/2022-21:09:36] [V] [TRT] Erf_1161 [Erf] inputs: [1609 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Erf_1161 for ONNX node: Erf_1161\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1610 for ONNX tensor: 1610\n", "[01/17/2022-21:09:36] [V] [TRT] Erf_1161 [Erf] outputs: [1610 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1162 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1162 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1162 [Constant] outputs: [1611 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1163 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1610\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1611\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1163 [Add] inputs: [1610 -> (-1, -1, 3072)[FLOAT]], [1611 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1611 for ONNX node: 1611\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1163 for ONNX node: Add_1163\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1612 for ONNX tensor: 1612\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1163 [Add] outputs: [1612 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Mul_1164 [Mul]\n", 
"[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1607\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1612\n", "[01/17/2022-21:09:36] [V] [TRT] Mul_1164 [Mul] inputs: [1607 -> (-1, -1, 3072)[FLOAT]], [1612 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Mul_1164 for ONNX node: Mul_1164\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1613 for ONNX tensor: 1613\n", "[01/17/2022-21:09:36] [V] [TRT] Mul_1164 [Mul] outputs: [1613 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1165 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1165 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1165 [Constant] outputs: [1614 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Mul_1166 [Mul]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1613\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1614\n", "[01/17/2022-21:09:36] [V] [TRT] Mul_1166 [Mul] inputs: [1613 -> (-1, -1, 3072)[FLOAT]], [1614 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1614 for ONNX node: 1614\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Mul_1166 for ONNX node: Mul_1166\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1615 for ONNX tensor: 1615\n", "[01/17/2022-21:09:36] [V] [TRT] Mul_1166 [Mul] outputs: [1615 -> (-1, -1, 3072)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: MatMul_1167 [MatMul]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1615\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1794\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1167 [MatMul] inputs: [1615 -> (-1, -1, 3072)[FLOAT]], [1794 -> (3072, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1794 for ONNX node: 1794\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: MatMul_1167 for ONNX node: MatMul_1167\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1617 for ONNX tensor: 1617\n", "[01/17/2022-21:09:36] [V] [TRT] MatMul_1167 [MatMul] outputs: [1617 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1168 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: encoder.layer.11.output.dense.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1617\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1168 [Add] inputs: [encoder.layer.11.output.dense.bias -> (768)[FLOAT]], [1617 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: encoder.layer.11.output.dense.bias for ONNX node: encoder.layer.11.output.dense.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1168 for ONNX node: Add_1168\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1618 for ONNX tensor: 1618\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1168 [Add] outputs: [1618 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1169 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1618\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1604\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1169 [Add] inputs: [1618 -> (-1, -1, 768)[FLOAT]], [1604 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1169 for ONNX node: Add_1169\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1619 for ONNX tensor: 1619\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1169 [Add] outputs: [1619 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: ReduceMean_1170 [ReduceMean]\n", 
"[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1619\n", "[01/17/2022-21:09:36] [V] [TRT] ReduceMean_1170 [ReduceMean] inputs: [1619 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: ReduceMean_1170 for ONNX node: ReduceMean_1170\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1620 for ONNX tensor: 1620\n", "[01/17/2022-21:09:36] [V] [TRT] ReduceMean_1170 [ReduceMean] outputs: [1620 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Sub_1171 [Sub]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1619\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1620\n", "[01/17/2022-21:09:36] [V] [TRT] Sub_1171 [Sub] inputs: [1619 -> (-1, -1, 768)[FLOAT]], [1620 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Sub_1171 for ONNX node: Sub_1171\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1621 for ONNX tensor: 1621\n", "[01/17/2022-21:09:36] [V] [TRT] Sub_1171 [Sub] outputs: [1621 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1172 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1172 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1172 [Constant] outputs: [1622 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Pow_1173 [Pow]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1621\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1622\n", "[01/17/2022-21:09:36] [V] [TRT] Pow_1173 [Pow] inputs: [1621 -> (-1, -1, 768)[FLOAT]], [1622 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1622 for ONNX node: 1622\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Pow_1173 for ONNX node: Pow_1173\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1623 for ONNX tensor: 1623\n", "[01/17/2022-21:09:36] [V] [TRT] Pow_1173 [Pow] outputs: [1623 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: ReduceMean_1174 [ReduceMean]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1623\n", "[01/17/2022-21:09:36] [V] [TRT] ReduceMean_1174 [ReduceMean] inputs: [1623 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: ReduceMean_1174 for ONNX node: ReduceMean_1174\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1624 for ONNX tensor: 1624\n", "[01/17/2022-21:09:36] [V] [TRT] ReduceMean_1174 [ReduceMean] outputs: [1624 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1175 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1175 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1175 [Constant] outputs: [1625 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1176 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1624\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1625\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1176 [Add] inputs: [1624 -> (-1, -1, 1)[FLOAT]], [1625 -> ()[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1625 for ONNX node: 1625\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1176 for ONNX node: Add_1176\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1626 for ONNX tensor: 1626\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1176 [Add] outputs: [1626 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Sqrt_1177 [Sqrt]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1626\n", "[01/17/2022-21:09:36] [V] [TRT] Sqrt_1177 [Sqrt] 
inputs: [1626 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Sqrt_1177 for ONNX node: Sqrt_1177\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1627 for ONNX tensor: 1627\n", "[01/17/2022-21:09:36] [V] [TRT] Sqrt_1177 [Sqrt] outputs: [1627 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Div_1178 [Div]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1621\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1627\n", "[01/17/2022-21:09:36] [V] [TRT] Div_1178 [Div] inputs: [1621 -> (-1, -1, 768)[FLOAT]], [1627 -> (-1, -1, 1)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Div_1178 for ONNX node: Div_1178\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1628 for ONNX tensor: 1628\n", "[01/17/2022-21:09:36] [V] [TRT] Div_1178 [Div] outputs: [1628 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Mul_1179 [Mul]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1628\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: encoder.layer.11.output.LayerNorm.weight\n", "[01/17/2022-21:09:36] [V] [TRT] Mul_1179 [Mul] inputs: [1628 -> (-1, -1, 768)[FLOAT]], [encoder.layer.11.output.LayerNorm.weight -> (768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: encoder.layer.11.output.LayerNorm.weight for ONNX node: encoder.layer.11.output.LayerNorm.weight\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Mul_1179 for ONNX node: Mul_1179\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1629 for ONNX tensor: 1629\n", "[01/17/2022-21:09:36] [V] [TRT] Mul_1179 [Mul] outputs: [1629 -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Add_1180 [Add]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1629\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: encoder.layer.11.output.LayerNorm.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1180 [Add] inputs: [1629 -> (-1, -1, 768)[FLOAT]], [encoder.layer.11.output.LayerNorm.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: encoder.layer.11.output.LayerNorm.bias for ONNX node: encoder.layer.11.output.LayerNorm.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Add_1180 for ONNX node: Add_1180\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: output_0 for ONNX tensor: output\n", "[01/17/2022-21:09:36] [V] [TRT] Add_1180 [Add] outputs: [output -> (-1, -1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Constant_1181 [Constant]\n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1181 [Constant] inputs: \n", "[01/17/2022-21:09:36] [V] [TRT] Constant_1181 [Constant] outputs: [1631 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Gather_1182 [Gather]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: output\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1631\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1182 [Gather] inputs: [output -> (-1, -1, 768)[FLOAT]], [1631 -> ()[INT32]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: 1631 for ONNX node: 1631\n", "[01/17/2022-21:09:36] [V] [TRT] Using Gather axis: 1\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Gather_1182 for ONNX node: Gather_1182\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1632 for ONNX tensor: 1632\n", "[01/17/2022-21:09:36] [V] [TRT] Gather_1182 [Gather] outputs: [1632 -> (-1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Gemm_1183 [Gemm]\n", 
"[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1632\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: pooler.dense.weight\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: pooler.dense.bias\n", "[01/17/2022-21:09:36] [V] [TRT] Gemm_1183 [Gemm] inputs: [1632 -> (-1, 768)[FLOAT]], [pooler.dense.weight -> (768, 768)[FLOAT]], [pooler.dense.bias -> (768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] GEMM: using FC layer instead of MM because all criteria were met.\n", "[01/17/2022-21:09:36] [V] [TRT] Original shape: (_, 768), unsqueezing to: (_, _, _, _)\n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Gemm_1183 for ONNX node: Gemm_1183\n", "[01/17/2022-21:09:36] [V] [TRT] Original shape: (_, 768, 1, 1), squeezing to: (_, _)\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1633 for ONNX tensor: 1633\n", "[01/17/2022-21:09:36] [V] [TRT] Gemm_1183 [Gemm] outputs: [1633 -> (-1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Parsing node: Tanh_1184 [Tanh]\n", "[01/17/2022-21:09:36] [V] [TRT] Searching for input: 1633\n", "[01/17/2022-21:09:36] [V] [TRT] Tanh_1184 [Tanh] inputs: [1633 -> (-1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Registering layer: Tanh_1184 for ONNX node: Tanh_1184\n", "[01/17/2022-21:09:36] [V] [TRT] Registering tensor: 1634_1 for ONNX tensor: 1634\n", "[01/17/2022-21:09:36] [V] [TRT] Tanh_1184 [Tanh] outputs: [1634 -> (-1, 768)[FLOAT]], \n", "[01/17/2022-21:09:36] [V] [TRT] Marking output_0 as output: output\n", "[01/17/2022-21:09:36] [V] [TRT] Marking 1634_1 as output: 1634\n", "[W] [01/17/2022-21:09:36] [TRT] Output type must be INT32 for shape outputs\n", "[01/17/2022-21:09:36] [I] Finish parsing network model\n", "[01/17/2022-21:09:36] [I] [TRT] [MemUsageChange] Init CUDA: CPU +0, GPU +0, now: CPU 979, GPU 250 (MiB)\n", "[01/17/2022-21:09:36] [I] [TRT] [MemUsageSnapshot] Builder begin: CPU 979 MiB, GPU 250 MiB\n", "[01/17/2022-21:09:36] [V] [TRT] Applying generic optimizations to the graph for inference.\n", "[01/17/2022-21:09:36] [V] [TRT] Original: 1245 layers\n", "[01/17/2022-21:09:36] [V] [TRT] After dead-layer removal: 1245 layers\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 231 with (Unnamed Layer* 60) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 233 with (Unnamed Layer* 63) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 254 with (Unnamed Layer* 97) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 257 with (Unnamed Layer* 101) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing embeddings.LayerNorm.weight with (Unnamed Layer* 106) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing embeddings.LayerNorm.bias with (Unnamed Layer* 109) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1639 with (Unnamed Layer* 112) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.0.attention.self.query.bias with (Unnamed Layer* 115) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1640 with (Unnamed Layer* 118) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.0.attention.self.key.bias with (Unnamed Layer* 121) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1643 with (Unnamed Layer* 136) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.0.attention.self.value.bias with (Unnamed Layer* 139) [Shuffle]\n", 
"[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 318 with (Unnamed Layer* 170) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1649 with (Unnamed Layer* 202) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.0.attention.output.dense.bias with (Unnamed Layer* 205) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 342 with (Unnamed Layer* 211) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 345 with (Unnamed Layer* 215) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.0.attention.output.LayerNorm.weight with (Unnamed Layer* 220) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.0.attention.output.LayerNorm.bias with (Unnamed Layer* 223) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1650 with (Unnamed Layer* 226) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.0.intermediate.dense.bias with (Unnamed Layer* 229) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 354 with (Unnamed Layer* 232) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 357 with (Unnamed Layer* 236) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 360 with (Unnamed Layer* 240) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1651 with (Unnamed Layer* 243) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.0.output.dense.bias with (Unnamed Layer* 246) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 368 with (Unnamed Layer* 252) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 371 with (Unnamed Layer* 256) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.0.output.LayerNorm.weight with (Unnamed Layer* 261) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.0.output.LayerNorm.bias with (Unnamed Layer* 264) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1652 with (Unnamed Layer* 267) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.1.attention.self.query.bias with (Unnamed Layer* 270) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1653 with (Unnamed Layer* 273) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.1.attention.self.key.bias with (Unnamed Layer* 276) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1656 with (Unnamed Layer* 291) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.1.attention.self.value.bias with (Unnamed Layer* 294) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 432 with (Unnamed Layer* 325) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1662 with (Unnamed Layer* 357) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.1.attention.output.dense.bias with (Unnamed Layer* 360) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 456 with (Unnamed Layer* 366) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 459 with (Unnamed Layer* 370) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.1.attention.output.LayerNorm.weight 
with (Unnamed Layer* 375) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.1.attention.output.LayerNorm.bias with (Unnamed Layer* 378) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1663 with (Unnamed Layer* 381) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.1.intermediate.dense.bias with (Unnamed Layer* 384) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 468 with (Unnamed Layer* 387) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 471 with (Unnamed Layer* 391) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 474 with (Unnamed Layer* 395) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1664 with (Unnamed Layer* 398) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.1.output.dense.bias with (Unnamed Layer* 401) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 482 with (Unnamed Layer* 407) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 485 with (Unnamed Layer* 411) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.1.output.LayerNorm.weight with (Unnamed Layer* 416) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.1.output.LayerNorm.bias with (Unnamed Layer* 419) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1665 with (Unnamed Layer* 422) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.2.attention.self.query.bias with (Unnamed Layer* 425) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1666 with (Unnamed Layer* 428) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.2.attention.self.key.bias with (Unnamed Layer* 431) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1669 with (Unnamed Layer* 446) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.2.attention.self.value.bias with (Unnamed Layer* 449) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 546 with (Unnamed Layer* 480) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1675 with (Unnamed Layer* 512) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.2.attention.output.dense.bias with (Unnamed Layer* 515) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 570 with (Unnamed Layer* 521) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 573 with (Unnamed Layer* 525) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.2.attention.output.LayerNorm.weight with (Unnamed Layer* 530) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.2.attention.output.LayerNorm.bias with (Unnamed Layer* 533) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1676 with (Unnamed Layer* 536) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.2.intermediate.dense.bias with (Unnamed Layer* 539) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 582 with (Unnamed Layer* 542) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 585 with (Unnamed Layer* 546) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] 
ConstShuffleFusion: Fusing 588 with (Unnamed Layer* 550) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1677 with (Unnamed Layer* 553) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.2.output.dense.bias with (Unnamed Layer* 556) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 596 with (Unnamed Layer* 562) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 599 with (Unnamed Layer* 566) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.2.output.LayerNorm.weight with (Unnamed Layer* 571) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.2.output.LayerNorm.bias with (Unnamed Layer* 574) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1678 with (Unnamed Layer* 577) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.3.attention.self.query.bias with (Unnamed Layer* 580) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1679 with (Unnamed Layer* 583) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.3.attention.self.key.bias with (Unnamed Layer* 586) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1682 with (Unnamed Layer* 601) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.3.attention.self.value.bias with (Unnamed Layer* 604) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 660 with (Unnamed Layer* 635) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1688 with (Unnamed Layer* 667) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.3.attention.output.dense.bias with (Unnamed Layer* 670) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 684 with (Unnamed Layer* 676) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 687 with (Unnamed Layer* 680) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.3.attention.output.LayerNorm.weight with (Unnamed Layer* 685) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.3.attention.output.LayerNorm.bias with (Unnamed Layer* 688) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1689 with (Unnamed Layer* 691) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.3.intermediate.dense.bias with (Unnamed Layer* 694) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 696 with (Unnamed Layer* 697) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 699 with (Unnamed Layer* 701) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 702 with (Unnamed Layer* 705) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1690 with (Unnamed Layer* 708) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.3.output.dense.bias with (Unnamed Layer* 711) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 710 with (Unnamed Layer* 717) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 713 with (Unnamed Layer* 721) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.3.output.LayerNorm.weight with (Unnamed Layer* 726) [Shuffle]\n", 
"[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.3.output.LayerNorm.bias with (Unnamed Layer* 729) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1691 with (Unnamed Layer* 732) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.4.attention.self.query.bias with (Unnamed Layer* 735) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1692 with (Unnamed Layer* 738) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.4.attention.self.key.bias with (Unnamed Layer* 741) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1695 with (Unnamed Layer* 756) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.4.attention.self.value.bias with (Unnamed Layer* 759) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 774 with (Unnamed Layer* 790) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1701 with (Unnamed Layer* 822) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.4.attention.output.dense.bias with (Unnamed Layer* 825) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 798 with (Unnamed Layer* 831) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 801 with (Unnamed Layer* 835) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.4.attention.output.LayerNorm.weight with (Unnamed Layer* 840) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.4.attention.output.LayerNorm.bias with (Unnamed Layer* 843) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1702 with (Unnamed Layer* 846) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.4.intermediate.dense.bias with (Unnamed Layer* 849) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 810 with (Unnamed Layer* 852) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 813 with (Unnamed Layer* 856) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 816 with (Unnamed Layer* 860) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1703 with (Unnamed Layer* 863) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.4.output.dense.bias with (Unnamed Layer* 866) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 824 with (Unnamed Layer* 872) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 827 with (Unnamed Layer* 876) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.4.output.LayerNorm.weight with (Unnamed Layer* 881) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.4.output.LayerNorm.bias with (Unnamed Layer* 884) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1704 with (Unnamed Layer* 887) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.5.attention.self.query.bias with (Unnamed Layer* 890) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1705 with (Unnamed Layer* 893) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.5.attention.self.key.bias with (Unnamed Layer* 896) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] 
ConstShuffleFusion: Fusing 1708 with (Unnamed Layer* 911) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.5.attention.self.value.bias with (Unnamed Layer* 914) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 888 with (Unnamed Layer* 945) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1714 with (Unnamed Layer* 977) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.5.attention.output.dense.bias with (Unnamed Layer* 980) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 912 with (Unnamed Layer* 986) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 915 with (Unnamed Layer* 990) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.5.attention.output.LayerNorm.weight with (Unnamed Layer* 995) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.5.attention.output.LayerNorm.bias with (Unnamed Layer* 998) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1715 with (Unnamed Layer* 1001) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.5.intermediate.dense.bias with (Unnamed Layer* 1004) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 924 with (Unnamed Layer* 1007) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 927 with (Unnamed Layer* 1011) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 930 with (Unnamed Layer* 1015) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1716 with (Unnamed Layer* 1018) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.5.output.dense.bias with (Unnamed Layer* 1021) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 938 with (Unnamed Layer* 1027) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 941 with (Unnamed Layer* 1031) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.5.output.LayerNorm.weight with (Unnamed Layer* 1036) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.5.output.LayerNorm.bias with (Unnamed Layer* 1039) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1717 with (Unnamed Layer* 1042) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.6.attention.self.query.bias with (Unnamed Layer* 1045) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1718 with (Unnamed Layer* 1048) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.6.attention.self.key.bias with (Unnamed Layer* 1051) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1721 with (Unnamed Layer* 1066) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.6.attention.self.value.bias with (Unnamed Layer* 1069) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1002 with (Unnamed Layer* 1100) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1727 with (Unnamed Layer* 1132) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.6.attention.output.dense.bias with (Unnamed Layer* 1135) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1026 with (Unnamed Layer* 
1141) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1029 with (Unnamed Layer* 1145) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.6.attention.output.LayerNorm.weight with (Unnamed Layer* 1150) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.6.attention.output.LayerNorm.bias with (Unnamed Layer* 1153) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1728 with (Unnamed Layer* 1156) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.6.intermediate.dense.bias with (Unnamed Layer* 1159) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1038 with (Unnamed Layer* 1162) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1041 with (Unnamed Layer* 1166) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1044 with (Unnamed Layer* 1170) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1729 with (Unnamed Layer* 1173) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.6.output.dense.bias with (Unnamed Layer* 1176) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1052 with (Unnamed Layer* 1182) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1055 with (Unnamed Layer* 1186) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.6.output.LayerNorm.weight with (Unnamed Layer* 1191) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.6.output.LayerNorm.bias with (Unnamed Layer* 1194) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1730 with (Unnamed Layer* 1197) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.7.attention.self.query.bias with (Unnamed Layer* 1200) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1731 with (Unnamed Layer* 1203) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.7.attention.self.key.bias with (Unnamed Layer* 1206) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1734 with (Unnamed Layer* 1221) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.7.attention.self.value.bias with (Unnamed Layer* 1224) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1116 with (Unnamed Layer* 1255) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1740 with (Unnamed Layer* 1287) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.7.attention.output.dense.bias with (Unnamed Layer* 1290) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1140 with (Unnamed Layer* 1296) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1143 with (Unnamed Layer* 1300) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.7.attention.output.LayerNorm.weight with (Unnamed Layer* 1305) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.7.attention.output.LayerNorm.bias with (Unnamed Layer* 1308) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1741 with (Unnamed Layer* 1311) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.7.intermediate.dense.bias with 
(Unnamed Layer* 1314) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1152 with (Unnamed Layer* 1317) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1155 with (Unnamed Layer* 1321) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1158 with (Unnamed Layer* 1325) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1742 with (Unnamed Layer* 1328) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.7.output.dense.bias with (Unnamed Layer* 1331) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1166 with (Unnamed Layer* 1337) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1169 with (Unnamed Layer* 1341) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.7.output.LayerNorm.weight with (Unnamed Layer* 1346) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.7.output.LayerNorm.bias with (Unnamed Layer* 1349) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1743 with (Unnamed Layer* 1352) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.8.attention.self.query.bias with (Unnamed Layer* 1355) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1744 with (Unnamed Layer* 1358) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.8.attention.self.key.bias with (Unnamed Layer* 1361) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1747 with (Unnamed Layer* 1376) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.8.attention.self.value.bias with (Unnamed Layer* 1379) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1230 with (Unnamed Layer* 1410) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1753 with (Unnamed Layer* 1442) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.8.attention.output.dense.bias with (Unnamed Layer* 1445) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1254 with (Unnamed Layer* 1451) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1257 with (Unnamed Layer* 1455) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.8.attention.output.LayerNorm.weight with (Unnamed Layer* 1460) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.8.attention.output.LayerNorm.bias with (Unnamed Layer* 1463) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1754 with (Unnamed Layer* 1466) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.8.intermediate.dense.bias with (Unnamed Layer* 1469) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1266 with (Unnamed Layer* 1472) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1269 with (Unnamed Layer* 1476) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1272 with (Unnamed Layer* 1480) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1755 with (Unnamed Layer* 1483) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.8.output.dense.bias with (Unnamed Layer* 1486) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] 
ConstShuffleFusion: Fusing 1280 with (Unnamed Layer* 1492) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1283 with (Unnamed Layer* 1496) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.8.output.LayerNorm.weight with (Unnamed Layer* 1501) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.8.output.LayerNorm.bias with (Unnamed Layer* 1504) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1756 with (Unnamed Layer* 1507) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.9.attention.self.query.bias with (Unnamed Layer* 1510) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1757 with (Unnamed Layer* 1513) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.9.attention.self.key.bias with (Unnamed Layer* 1516) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1760 with (Unnamed Layer* 1531) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.9.attention.self.value.bias with (Unnamed Layer* 1534) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1344 with (Unnamed Layer* 1565) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1766 with (Unnamed Layer* 1597) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.9.attention.output.dense.bias with (Unnamed Layer* 1600) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1368 with (Unnamed Layer* 1606) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1371 with (Unnamed Layer* 1610) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.9.attention.output.LayerNorm.weight with (Unnamed Layer* 1615) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.9.attention.output.LayerNorm.bias with (Unnamed Layer* 1618) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1767 with (Unnamed Layer* 1621) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.9.intermediate.dense.bias with (Unnamed Layer* 1624) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1380 with (Unnamed Layer* 1627) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1383 with (Unnamed Layer* 1631) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1386 with (Unnamed Layer* 1635) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1768 with (Unnamed Layer* 1638) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.9.output.dense.bias with (Unnamed Layer* 1641) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1394 with (Unnamed Layer* 1647) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1397 with (Unnamed Layer* 1651) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.9.output.LayerNorm.weight with (Unnamed Layer* 1656) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.9.output.LayerNorm.bias with (Unnamed Layer* 1659) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1769 with (Unnamed Layer* 1662) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 
encoder.layer.10.attention.self.query.bias with (Unnamed Layer* 1665) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1770 with (Unnamed Layer* 1668) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.10.attention.self.key.bias with (Unnamed Layer* 1671) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1773 with (Unnamed Layer* 1686) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.10.attention.self.value.bias with (Unnamed Layer* 1689) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1458 with (Unnamed Layer* 1720) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1779 with (Unnamed Layer* 1752) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.10.attention.output.dense.bias with (Unnamed Layer* 1755) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1482 with (Unnamed Layer* 1761) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1485 with (Unnamed Layer* 1765) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.10.attention.output.LayerNorm.weight with (Unnamed Layer* 1770) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.10.attention.output.LayerNorm.bias with (Unnamed Layer* 1773) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1780 with (Unnamed Layer* 1776) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.10.intermediate.dense.bias with (Unnamed Layer* 1779) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1494 with (Unnamed Layer* 1782) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1497 with (Unnamed Layer* 1786) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1500 with (Unnamed Layer* 1790) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1781 with (Unnamed Layer* 1793) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.10.output.dense.bias with (Unnamed Layer* 1796) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1508 with (Unnamed Layer* 1802) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1511 with (Unnamed Layer* 1806) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.10.output.LayerNorm.weight with (Unnamed Layer* 1811) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.10.output.LayerNorm.bias with (Unnamed Layer* 1814) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1782 with (Unnamed Layer* 1817) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.11.attention.self.query.bias with (Unnamed Layer* 1820) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1783 with (Unnamed Layer* 1823) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.11.attention.self.key.bias with (Unnamed Layer* 1826) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1786 with (Unnamed Layer* 1841) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.11.attention.self.value.bias with (Unnamed Layer* 1844) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] 
ConstShuffleFusion: Fusing 1572 with (Unnamed Layer* 1875) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1792 with (Unnamed Layer* 1907) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.11.attention.output.dense.bias with (Unnamed Layer* 1910) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1596 with (Unnamed Layer* 1916) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1599 with (Unnamed Layer* 1920) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.11.attention.output.LayerNorm.weight with (Unnamed Layer* 1925) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.11.attention.output.LayerNorm.bias with (Unnamed Layer* 1928) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1793 with (Unnamed Layer* 1931) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.11.intermediate.dense.bias with (Unnamed Layer* 1934) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1608 with (Unnamed Layer* 1937) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1611 with (Unnamed Layer* 1941) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1614 with (Unnamed Layer* 1945) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1794 with (Unnamed Layer* 1948) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.11.output.dense.bias with (Unnamed Layer* 1951) [Shuffle]\n", "[01/17/2022-21:09:36] [V] [TRT] ConstShuffleFusion: Fusing 1622 with (Unnamed Layer* 1957) [Shuffle]\n", "[01/17/2022-21:09:37] [V] [TRT] ConstShuffleFusion: Fusing 1625 with (Unnamed Layer* 1961) [Shuffle]\n", "[01/17/2022-21:09:37] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.11.output.LayerNorm.weight with (Unnamed Layer* 1966) [Shuffle]\n", "[01/17/2022-21:09:37] [V] [TRT] ConstShuffleFusion: Fusing encoder.layer.11.output.LayerNorm.bias with (Unnamed Layer* 1969) [Shuffle]\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Unsqueeze_22 with Unsqueeze_23\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_89 with Transpose_90\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_66 with Transpose_91\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_78 with Transpose_79\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_98 with Reshape_108\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_183 with Transpose_184\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_160 with Transpose_185\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_172 with Transpose_173\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_192 with Reshape_202\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_277 with Transpose_278\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_254 with Transpose_279\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_266 with Transpose_267\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_286 with Reshape_296\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_371 with Transpose_372\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing 
Reshape_348 with Transpose_373\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_360 with Transpose_361\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_380 with Reshape_390\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_465 with Transpose_466\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_442 with Transpose_467\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_454 with Transpose_455\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_474 with Reshape_484\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_559 with Transpose_560\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_536 with Transpose_561\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_548 with Transpose_549\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_568 with Reshape_578\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_653 with Transpose_654\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_630 with Transpose_655\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_642 with Transpose_643\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_662 with Reshape_672\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_747 with Transpose_748\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_724 with Transpose_749\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_736 with Transpose_737\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_756 with Reshape_766\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_841 with Transpose_842\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_818 with Transpose_843\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_830 with Transpose_831\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_850 with Reshape_860\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_935 with Transpose_936\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_912 with Transpose_937\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_924 with Transpose_925\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_944 with Reshape_954\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_1029 with Transpose_1030\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_1006 with Transpose_1031\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_1018 with Transpose_1019\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_1038 with Reshape_1048\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_1123 with Transpose_1124\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_1100 with Transpose_1125\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_1112 with Transpose_1113\n", "[01/17/2022-21:09:37] [V] [TRT] ShuffleShuffleFusion: Fusing Transpose_1132 with Reshape_1142\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_55 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_53 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] 
[V] [TRT] Found MatMul_67 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_97 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_96 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_149 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_147 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_161 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_191 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_190 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_243 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_241 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_255 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_285 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_284 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_337 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_335 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_349 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_379 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_378 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_431 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_429 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_443 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_473 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_472 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_525 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_523 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_537 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_567 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_566 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_619 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_617 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_631 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_661 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_660 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_713 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_711 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_725 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_755 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_754 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_807 to be part of 
self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_805 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_819 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_849 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_848 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_901 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_899 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_913 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_943 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_942 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_995 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_993 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_1007 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_1037 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_1036 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_1089 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_1087 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_1101 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found MatMul_1131 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found Softmax_1130 to be part of self-attention pattern.\n", "[01/17/2022-21:09:37] [V] [TRT] Found and reassigned Myelin backends for Self-Attention nodes\n", "[01/17/2022-21:09:37] [V] [TRT] After Myelin optimization: 18 layers\n", "[01/17/2022-21:09:37] [V] [TRT] Convert layer type of Gemm_1183 from FULLY_CONNECTED to CONVOLUTION\n", "[01/17/2022-21:09:37] [V] [TRT] Removing shuffle_between_(Unnamed Layer* 1978) [Shuffle]_output_and_Gemm_1183\n", "[01/17/2022-21:09:37] [V] [TRT] After scale fusion: 18 layers\n", "[01/17/2022-21:09:37] [V] [TRT] Swap the layer type of Tanh_1184 from ACTIVATION to POINTWISE\n", "[01/17/2022-21:09:37] [V] [TRT] After vertical fusions: 18 layers\n", "[01/17/2022-21:09:37] [V] [TRT] After dupe layer removal: 18 layers\n", "[01/17/2022-21:09:37] [V] [TRT] After final dead-layer removal: 18 layers\n", "[01/17/2022-21:09:37] [V] [TRT] After tensor merging: 18 layers\n", "[01/17/2022-21:09:37] [V] [TRT] Replacing slice Slice_9 with copy from (Unnamed Layer* 7) [Constant]_output to 215\n", "[01/17/2022-21:09:37] [V] [TRT] Replacing slice Slice_36 with copy from (Unnamed Layer* 71) [Constant]_output to 246\n", "[01/17/2022-21:09:37] [V] [TRT] After concat removal: 18 layers\n", "[01/17/2022-21:09:37] [V] [TRT] Graph construction and optimization completed in 0.582211 seconds.\n", "[01/17/2022-21:09:37] [V] [TRT] Using cublasLt a tactic source\n", "[01/17/2022-21:09:37] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +491, GPU +212, now: CPU 1471, GPU 462 (MiB)\n", "[01/17/2022-21:09:37] [V] [TRT] Using cuDNN as a tactic source\n", "[01/17/2022-21:09:38] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +286, GPU +198, now: CPU 1757, GPU 660 (MiB)\n", "[W] [01/17/2022-21:09:38] [TRT] Detected invalid timing cache, setup a local cache instead\n", "[01/17/2022-21:09:38] [V] 
[TRT] Constructing optimization profile number 0 [1/1].\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: -> Int32(), Int32(), Int32(), Int32(), Int32(), Int32() ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: [HostToDeviceCopy] (ShapeHostToDevice)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 0 is the only option, timing skipped\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 0 Time: 0\n", "[01/17/2022-21:09:38] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: ShapeHostToDevice Tactic: 0\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: -> Float(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: -> Half(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: -> Int32(512,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning Reformat:Int32(512,1) -> Int32(128,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Slice_9 (Reformat)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1002 Time: 0.009784\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 0 Time: 0.00674\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 0 Time: 0.00674\n", "[01/17/2022-21:09:38] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Reformat Tactic: 0\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: Int32(128,1) -> Int32(128,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Expand_21 (Slice)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 0 is the only option, timing skipped\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 0 Time: 0\n", "[01/17/2022-21:09:38] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Slice Tactic: 0\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: -> Float(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: -> Half(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: -> Int32(512,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning Reformat:Int32(512,1) -> Int32(128,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Slice_36 (Reformat)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1002 Time: 0.009548\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 0 Time: 0.006532\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 0 Time: 0.006532\n", "[01/17/2022-21:09:38] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Reformat Tactic: 0\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: -> Float(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: -> Half(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning Reformat:Float(768,1) -> Half(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1002 Time: 3.15527\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 0 Time: 4.59106\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 1002 Time: 3.15527\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning Reformat:Half(768,1) -> Float(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", 
"[01/17/2022-21:09:38] [V] [TRT] Tactic: 1002 Time: 3.20506\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 0 Time: 4.34695\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 1002 Time: 3.20506\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: Float(768,1), Int32(128,1) -> Float(98304,768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Gather_37 (Gather)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1 Time: 0.174444\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 2 Time: 0.288608\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 3 Time: 1.29116\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 4 Time: 0.255012\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 6 Time: 2.67281\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 7 Time: 0.058412\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 8 Time: 0.060728\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 9 Time: 0.059884\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 7 Time: 0.058412\n", "[01/17/2022-21:09:38] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Gather Tactic: 7\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: Half(768,1), Int32(128,1) -> Half(98304,768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Gather_37 (Gather)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1 Time: 0.155624\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 2 Time: 0.266524\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 3 Time: 1.26509\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 4 Time: 0.224388\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 6 Time: 2.65316\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 7 Time: 0.027088\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 8 Time: 0.027152\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 9 Time: 0.028152\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 7 Time: 0.027088\n", "[01/17/2022-21:09:38] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Gather Tactic: 7\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning Reformat:Float(768,1) -> Half(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1002 Time: 0.0105\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 0 Time: 0.012372\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 1002 Time: 0.0105\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning Reformat:Half(768,1) -> Float(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1002 Time: 0.010856\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 0 Time: 0.012228\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 1002 Time: 0.010856\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: Float(768,1), Int32(128,1) -> Float(98304,768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Gather_40 (Gather)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1 Time: 0.00912\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 2 Time: 0.01168\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 3 Time: 0.072396\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 4 Time: 0.095168\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 6 Time: 0.167728\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 7 Time: 0.005748\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 8 Time: 0.004496\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 9 Time: 0.004592\n", "[01/17/2022-21:09:38] 
[V] [TRT] Fastest Tactic: 8 Time: 0.004496\n", "[01/17/2022-21:09:38] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Gather Tactic: 8\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: Half(768,1), Int32(128,1) -> Half(98304,768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Gather_40 (Gather)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1 Time: 0.009064\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 2 Time: 0.010628\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 3 Time: 0.071108\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 4 Time: 0.093244\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 6 Time: 0.165924\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 7 Time: 0.00556\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 8 Time: 0.004204\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 9 Time: 0.004284\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 8 Time: 0.004204\n", "[01/17/2022-21:09:38] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Gather Tactic: 8\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning Reformat:Float(768,1) -> Half(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1002 Time: 0.005088\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 0 Time: 0.004272\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 0 Time: 0.004272\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning Reformat:Half(768,1) -> Float(768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1002 Time: 0.0049\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 0 Time: 0.003892\n", "[01/17/2022-21:09:38] [V] [TRT] Fastest Tactic: 0 Time: 0.003892\n", "[01/17/2022-21:09:38] [V] [TRT] *************** Autotuning format combination: Float(768,1), Int32(128,1) -> Float(98304,768,1) ***************\n", "[01/17/2022-21:09:38] [V] [TRT] --------------- Timing Runner: Gather_38 (Gather)\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 1 Time: 0.12386\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 2 Time: 0.258972\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 3 Time: 1.26036\n", "[01/17/2022-21:09:38] [V] [TRT] Tactic: 4 Time: 0.147452\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 6 Time: 2.59121\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 7 Time: 0.032652\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 8 Time: 0.031688\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 9 Time: 0.031456\n", "[01/17/2022-21:09:39] [V] [TRT] Fastest Tactic: 9 Time: 0.031456\n", "[01/17/2022-21:09:39] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Gather Tactic: 9\n", "[01/17/2022-21:09:39] [V] [TRT] *************** Autotuning format combination: Half(768,1), Int32(128,1) -> Half(98304,768,1) ***************\n", "[01/17/2022-21:09:39] [V] [TRT] --------------- Timing Runner: Gather_38 (Gather)\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 1 Time: 0.105056\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 2 Time: 0.241728\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 3 Time: 1.24466\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 4 Time: 0.132364\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 6 Time: 2.56686\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 7 Time: 0.008944\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 8 Time: 0.007972\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 9 Time: 0.0082\n", "[01/17/2022-21:09:39] [V] [TRT] Fastest Tactic: 8 Time: 0.007972\n", "[01/17/2022-21:09:39] 
[V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Gather Tactic: 8\n", "[01/17/2022-21:09:39] [V] [TRT] *************** Autotuning Reformat:Float(98304,768,1) -> Half(98304,768,1) ***************\n", "[01/17/2022-21:09:39] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 1002 Time: 0.051208\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 0 Time: 0.054308\n", "[01/17/2022-21:09:39] [V] [TRT] Fastest Tactic: 1002 Time: 0.051208\n", "[01/17/2022-21:09:39] [V] [TRT] *************** Autotuning Reformat:Half(98304,768,1) -> Float(98304,768,1) ***************\n", "[01/17/2022-21:09:39] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 1002 Time: 0.058168\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 0 Time: 0.06406\n", "[01/17/2022-21:09:39] [V] [TRT] Fastest Tactic: 1002 Time: 0.058168\n", "[01/17/2022-21:09:39] [V] [TRT] *************** Autotuning Reformat:Float(98304,768,1) -> Half(98304,768,1) ***************\n", "[01/17/2022-21:09:39] [V] [TRT] *************** Autotuning Reformat:Half(98304,768,1) -> Float(98304,768,1) ***************\n", "[01/17/2022-21:09:39] [V] [TRT] *************** Autotuning Reformat:Float(98304,768,1) -> Half(98304,768,1) ***************\n", "[01/17/2022-21:09:39] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 1002 Time: 0.006152\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 0 Time: 0.006072\n", "[01/17/2022-21:09:39] [V] [TRT] Fastest Tactic: 0 Time: 0.006072\n", "[01/17/2022-21:09:39] [V] [TRT] *************** Autotuning Reformat:Half(98304,768,1) -> Float(98304,768,1) ***************\n", "[01/17/2022-21:09:39] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 1002 Time: 0.006256\n", "[01/17/2022-21:09:39] [V] [TRT] Tactic: 0 Time: 0.005876\n", "[01/17/2022-21:09:39] [V] [TRT] Fastest Tactic: 0 Time: 0.005876\n", "[01/17/2022-21:09:39] [V] [TRT] *************** Autotuning format combination: Float(98304,768,1), Float(98304,768,1), Float(98304,768,1), Int32(128,1), Int32(), Int32(), Int32(), Int32(), Int32(), Int32() -> Float(98304,768,1) ***************\n", "[01/17/2022-21:09:39] [V] [TRT] --------------- Timing Runner: {ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]} (Myelin)\n", "[01/17/2022-21:09:39] [W] [TRT] Myelin graph with multiple dynamic values may have poor performance if they differ. 
Dynamic values are: \n", "[W] [TRT] (# 0 (SHAPE attn_mask))\n", "[01/17/2022-21:09:39] [W] [TRT] (# 0 (SHAPE token_ids))\n", "[01/17/2022-21:09:39] [01/17/2022-21:13:31] [V] [TRT] myelinAllocCb allocated GPU (data-constants) 12 bytes at 0x302000000.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated GPU 4325888 bytes at 0x33395dd00.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated GPU 9438208 bytes at 0x33395dd00.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated GPU 18876416 bytes at 0x33395dd00.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated GPU 37752832 bytes at 0x33395dd00.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated GPU 81813504 bytes at 0x33395dd00.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x55cbddf7af80.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated GPU 163627008 bytes at 0x33395dd00.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x55cbddf7dd80.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated GPU 327254016 bytes at 0x33395dd00.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x55cbddf7f480.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated GPU 654508032 bytes at 0x33395dd00.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x55cbddf81580.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated GPU 81805312 bytes at 0x33395dd00.\n", "[01/17/2022-21:13:33] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x55cbddf82c80.\n", "[01/17/2022-21:13:34] [V] [TRT] Tactic: 0 Time: 79.6402\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing GPU at 0x33395dd00.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing GPU at 0x33395dd00.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing GPU at 0x33395dd00.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing GPU at 0x33395dd00.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing GPU at 0x33395dd00.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing CPU at 0x55cbddf7af80.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing GPU at 0x33395dd00.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing CPU at 0x55cbddf7dd80.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing GPU at 0x33395dd00.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing CPU at 0x55cbddf7f480.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing GPU at 0x33395dd00.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing CPU at 0x55cbddf81580.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing GPU at 0x33395dd00.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing CPU at 0x55cbddf82c80.\n", "[01/17/2022-21:13:34] [V] [TRT] myelinFreeCb freeing GPU at 0x302000000.\n", "[01/17/2022-21:13:34] [V] [TRT] Fastest Tactic: 0 Time: 79.6402\n", "[01/17/2022-21:13:34] [V] [TRT] *************** Autotuning format combination: Half(98304,768,1), Half(98304,768,1), Half(98304,768,1), Int32(128,1), Int32(), Int32(), Int32(), Int32(), Int32(), Int32() -> Half(98304,768,1) ***************\n", "[01/17/2022-21:13:34] [V] [TRT] --------------- Timing Runner: {ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]} (Myelin)\n", "[W] [01/17/2022-21:13:34] [TRT] Myelin graph with multiple dynamic values may have poor performance if they differ. 
Dynamic values are: \n", "[W] [TRT] (# 0 (SHAPE attn_mask))\n", "[01/17/2022-21:13:34] [W] [01/17/2022-21:13:34] [TRT] (# 0 (SHAPE token_ids))\n", "[01/17/2022-21:16:49] [V] [TRT] myelinAllocCb allocated GPU (data-constants) 12 bytes at 0x302000000.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated GPU 3342848 bytes at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated GPU 4719616 bytes at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated GPU 9439232 bytes at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated GPU 18878464 bytes at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated GPU 40910848 bytes at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x55cbda3f5d80.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated GPU 81821696 bytes at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x55cbda3f8c00.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated GPU 163643392 bytes at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x55cbda3fa380.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated GPU 327286784 bytes at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x55cbda3fc400.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated GPU 40902656 bytes at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x55cbda3fdb80.\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 0 Time: 17.4547\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing GPU at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing GPU at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing GPU at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing GPU at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing GPU at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing CPU at 0x55cbda3f5d80.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing GPU at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing CPU at 0x55cbda3f8c00.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing GPU at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing CPU at 0x55cbda3fa380.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing GPU at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing CPU at 0x55cbda3fc400.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing GPU at 0x31acb7500.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing CPU at 0x55cbda3fdb80.\n", "[01/17/2022-21:16:51] [V] [TRT] myelinFreeCb freeing GPU at 0x302000000.\n", "[01/17/2022-21:16:51] [V] [TRT] Fastest Tactic: 0 Time: 17.4547\n", "[01/17/2022-21:16:51] [V] [TRT] *************** Autotuning Reformat:Half(98304,768,1) -> Float(98304,768,1) ***************\n", "[01/17/2022-21:16:51] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 1002 Time: 0.058212\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 0 Time: 0.063928\n", "[01/17/2022-21:16:51] [V] [TRT] Fastest Tactic: 1002 Time: 0.058212\n", "[01/17/2022-21:16:51] [V] [TRT] *************** Autotuning format combination: -> Int32() ***************\n", "[01/17/2022-21:16:51] [V] [TRT] *************** Autotuning Reformat:Float(98304,768,1) -> Half(98304,768,1) 
***************\n", "[01/17/2022-21:16:51] [V] [TRT] *************** Autotuning format combination: Float(98304,768,1), Int32() -> Float(768,1) ***************\n", "[01/17/2022-21:16:51] [V] [TRT] --------------- Timing Runner: Gather_1182 (Gather)\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 5 Time: 0.065988\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 6 Time: 0.019732\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 7 Time: 0.0047\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 8 Time: 0.004176\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 9 Time: 0.004096\n", "[01/17/2022-21:16:51] [V] [TRT] Fastest Tactic: 9 Time: 0.004096\n", "[01/17/2022-21:16:51] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Gather Tactic: 9\n", "[01/17/2022-21:16:51] [V] [TRT] *************** Autotuning format combination: Half(98304,768,1), Int32() -> Half(768,1) ***************\n", "[01/17/2022-21:16:51] [V] [TRT] --------------- Timing Runner: Gather_1182 (Gather)\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 5 Time: 0.06466\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 6 Time: 0.019296\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 7 Time: 0.004832\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 8 Time: 0.00408\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 9 Time: 0.00408\n", "[01/17/2022-21:16:51] [V] [TRT] Fastest Tactic: 9 Time: 0.00408\n", "[01/17/2022-21:16:51] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Gather Tactic: 9\n", "[01/17/2022-21:16:51] [V] [TRT] *************** Autotuning Reformat:Float(768,1) -> Half(768,1) ***************\n", "[01/17/2022-21:16:51] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 1002 Time: 0.004924\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 0 Time: 0.004248\n", "[01/17/2022-21:16:51] [V] [TRT] Fastest Tactic: 0 Time: 0.004248\n", "[01/17/2022-21:16:51] [V] [TRT] *************** Autotuning Reformat:Half(768,1) -> Float(768,1) ***************\n", "[01/17/2022-21:16:51] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 1002 Time: 0.004924\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 0 Time: 0.004076\n", "[01/17/2022-21:16:51] [V] [TRT] Fastest Tactic: 0 Time: 0.004076\n", "[01/17/2022-21:16:51] [V] [TRT] *************** Autotuning format combination: Float(768,1) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:51] [V] [TRT] --------------- Timing Runner: (Unnamed Layer* 1978) [Shuffle] (Shuffle)\n", "[01/17/2022-21:16:51] [V] [TRT] Tactic: 0 Time: 0.003796\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1 Time: 0.007404\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.003796\n", "[01/17/2022-21:16:52] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Shuffle Tactic: 0\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning format combination: Half(768,1) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: (Unnamed Layer* 1978) [Shuffle] (Shuffle)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.003812\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1 Time: 0.008396\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.003812\n", "[01/17/2022-21:16:52] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Shuffle Tactic: 0\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Float(768,1,1,1) -> Float(768,1,768,768) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] 
Tactic: 1002 Time: 0.00504\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004284\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004284\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Float(768,1,1,1) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.005032\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004232\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004232\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Float(768,1,1,1) -> Half(384,1:2,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.008304\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.003996\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.003996\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Float(768,1,1,1) -> Half(96,1:8,96,96) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.004916\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004156\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004156\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Float(768,1,768,768) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.005016\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004148\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004148\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Float(768,1,768,768) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.005024\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.00414\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.00414\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Float(768,1,768,768) -> Half(384,1:2,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.008204\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004252\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004252\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Float(768,1,768,768) -> Half(96,1:8,96,96) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.0051\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004228\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004228\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(768,1,1,1) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.004908\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004052\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004052\n", 
"[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(768,1,1,1) -> Float(768,1,768,768) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.004972\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004216\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004216\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(768,1,1,1) -> Half(384,1:2,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.008484\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.003992\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.003992\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(768,1,1,1) -> Half(96,1:8,96,96) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.005232\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004064\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004064\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(384,1:2,1,1) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.008472\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.003944\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.003944\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(384,1:2,1,1) -> Float(768,1,768,768) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.00526\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004204\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004204\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(384,1:2,1,1) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.008428\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.003976\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.003976\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(384,1:2,1,1) -> Half(96,1:8,96,96) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.00476\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.00682\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 1002 Time: 0.00476\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(96,1:8,96,96) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.008452\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.00392\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.00392\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(96,1:8,96,96) -> Float(768,1,768,768) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] 
--------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.004968\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.004212\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.004212\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(96,1:8,96,96) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.008332\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.003612\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.003612\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning Reformat:Half(96,1:8,96,96) -> Half(384,1:2,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1002 Time: 0.008408\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.006744\n", "[01/17/2022-21:16:52] [V] [TRT] Fastest Tactic: 0 Time: 0.006744\n", "[01/17/2022-21:16:52] [V] [TRT] *************** Autotuning format combination: Float(768,1,1,1) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Gemm_1183 (FusedConvActConvolution)\n", "[01/17/2022-21:16:52] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CudaDepthwiseConvolution)\n", "[01/17/2022-21:16:52] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:52] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CudnnConvolution)\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 0 Time: 0.061184\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 1 Time: 0.056548\n", "[01/17/2022-21:16:52] [V] [TRT] Tactic: 2 Time: 0.126956\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 4 Time: 9.82836\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 5 Time: 1.06056\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 56 Time: 0.06464\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 57 Time: 0.059312\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 58 Time: 0.132724\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 60 Time: 9.84149\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 61 Time: 1.04484\n", "[01/17/2022-21:16:53] [V] [TRT] Fastest Tactic: 1 Time: 0.056548\n", "[01/17/2022-21:16:53] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CublasConvolution)\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 0 Time: 0.046\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 1 Time: 0.045096\n", "[01/17/2022-21:16:53] [V] [TRT] Fastest Tactic: 1 Time: 0.045096\n", "[01/17/2022-21:16:53] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CaskConvolution)\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x128_relu_interior_nn_v1 Tactic: 1754569683116234317\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 1754569683116234317 Time: 0.134744\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x128_relu_medium_nn_v1 Tactic: 1825138533642645384\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 1825138533642645384 Time: 0.135492\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x32_relu_interior_nn_v1 Tactic: 2733356012094739613\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 2733356012094739613 Time: 0.090328\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: 
volta_scudnn_128x128_relu_small_nn_v1 Tactic: 3915320020053085238\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 3915320020053085238 Time: 0.133856\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x64_relu_small_nn_v1 Tactic: 6808617066150061604\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 6808617066150061604 Time: 0.083104\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x64_relu_interior_nn_v1 Tactic: 9091006216302412844\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 9091006216302412844 Time: 0.080028\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x64_relu_medium_nn_v1 Tactic: -8060443123034038864\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: -8060443123034038864 Time: 0.088948\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x32_relu_medium_nn_v1 Tactic: -4420849921117327522\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: -4420849921117327522 Time: 0.079104\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x32_relu_small_nn_v1 Tactic: -3946921629105938337\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: -3946921629105938337 Time: 0.092104\n", "[01/17/2022-21:16:53] [V] [TRT] Fastest Tactic: -4420849921117327522 Time: 0.079104\n", "[01/17/2022-21:16:53] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CublasConvolution Tactic: 1\n", "[01/17/2022-21:16:53] [V] [TRT] *************** Autotuning format combination: Float(768,1,768,768) -> Float(768,1,768,768) ***************\n", "[01/17/2022-21:16:53] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CudnnConvolution)\n", "[01/17/2022-21:16:53] [V] [TRT] CudnnConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:53] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CublasConvolution)\n", "[01/17/2022-21:16:53] [V] [TRT] CublasConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:53] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CaskConvolution)\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x128_relu_exp_medium_nhwc_tn_v1 Tactic: 861694390046228376\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 861694390046228376 Time: 0.132908\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x64_sliced1x2_ldg4_relu_exp_small_nhwc_tn_v1 Tactic: 5258189349241541167\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 5258189349241541167 Time: 0.073472\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x128_relu_exp_small_nhwc_tn_v1 Tactic: 5821621277990374316\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 5821621277990374316 Time: 0.132496\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x32_sliced1x4_ldg4_relu_exp_small_nhwc_tn_v1 Tactic: 5863767799113001648\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 5863767799113001648 Time: 0.04358\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x128_ldg4_relu_exp_medium_nhwc_tn_v1 Tactic: -9147980667639709536\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: -9147980667639709536 Time: 0.131948\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x128_ldg4_relu_exp_interior_nhwc_tn_v1 Tactic: -8892196987859366827\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: -8892196987859366827 Time: 0.1328\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x64_sliced1x2_ldg4_relu_exp_medium_nhwc_tn_v1 Tactic: -8850904373104590857\n", 
"[01/17/2022-21:16:53] [V] [TRT] Tactic: -8850904373104590857 Time: 0.073692\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x32_sliced1x4_ldg4_relu_exp_interior_nhwc_tn_v1 Tactic: -8010679767156598961\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: -8010679767156598961 Time: 0.042488\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x128_ldg4_relu_exp_small_nhwc_tn_v1 Tactic: -7751035352149795660\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: -7751035352149795660 Time: 0.132664\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x128_relu_exp_interior_nhwc_tn_v1 Tactic: -5115676123557684531\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: -5115676123557684531 Time: 0.13192\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x64_sliced1x2_ldg4_relu_exp_interior_nhwc_tn_v1 Tactic: -493597327599791285\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: -493597327599791285 Time: 0.070988\n", "[01/17/2022-21:16:53] [V] [TRT] Gemm_1183 Set Tactic Name: volta_scudnn_128x32_sliced1x4_ldg4_relu_exp_medium_nhwc_tn_v1 Tactic: -423878181466897819\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: -423878181466897819 Time: 0.04354\n", "[01/17/2022-21:16:53] [V] [TRT] Fastest Tactic: -8010679767156598961 Time: 0.042488\n", "[01/17/2022-21:16:53] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8010679767156598961\n", "[01/17/2022-21:16:53] [V] [TRT] *************** Autotuning format combination: Half(768,1,1,1) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:53] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CudnnConvolution)\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 0 Time: 0.130736\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 1 Time: 0.09084\n", "[01/17/2022-21:16:53] [V] [TRT] Tactic: 2 Time: 0.12926\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 4 Time: 9.75246\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 5 Time: 0.960072\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 56 Time: 0.139268\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 58 Time: 0.13558\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 60 Time: 9.76153\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 61 Time: 1.00739\n", "[01/17/2022-21:16:54] [V] [TRT] Fastest Tactic: 1 Time: 0.09084\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CublasConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 0 Time: 0.015372\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 1 Time: 0.015104\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 2 Time: 0.015404\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 3 Time: 0.01492\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 4 Time: 0.024944\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 5 Time: 0.023556\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 6 Time: 0.015752\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 7 Time: 0.013556\n", "[01/17/2022-21:16:54] [V] [TRT] Fastest Tactic: 7 Time: 0.013556\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CaskConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] CaskConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:54] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CublasConvolution Tactic: 7\n", "[01/17/2022-21:16:54] [V] [TRT] *************** Autotuning format combination: Half(384,1:2,1,1) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CaskConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] CaskConvolution has no valid 
tactics for this config, skipping\n", "[01/17/2022-21:16:54] [V] [TRT] *************** Autotuning format combination: Half(384,1:2,1,1) -> Half(384,1:2,1,1) ***************\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (FusedConvActConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CudnnConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] CudnnConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CublasConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] CublasConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CaskConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: turing_fp16x2_hcudnn_fp16x2_128x64_relu_medium_nn_v1 Tactic: 1651411198763708804\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 1651411198763708804 Time: 0.049296\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: volta_fp16x2_hcudnn_fp16x2_128x64_relu_medium_nn_v1 Tactic: 2418518597804310654\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 2418518597804310654 Time: 0.047872\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: turing_fp16x2_hcudnn_fp16x2_128x64_relu_small_nn_v1 Tactic: 4318470497547290900\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 4318470497547290900 Time: 0.048036\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: turing_fp16x2_hcudnn_fp16x2_128x32_relu_medium_nn_v1 Tactic: 4930470141256631146\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 4930470141256631146 Time: 0.052088\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: volta_fp16x2_hcudnn_fp16x2_128x32_relu_medium_nn_v1 Tactic: 8292881859266835088\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 8292881859266835088 Time: 0.050588\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: volta_fp16x2_hcudnn_fp16x2_128x64_relu_small_nn_v1 Tactic: 8401509141903434922\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 8401509141903434922 Time: 0.045032\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: volta_fp16x2_hcudnn_fp16x2_128x128_relu_small_nn_v1 Tactic: -8654297089785671176\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -8654297089785671176 Time: 0.071008\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: turing_fp16x2_hcudnn_fp16x2_128x32_relu_interior_nn_v1 Tactic: -7516584506774355935\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -7516584506774355935 Time: 0.046004\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: volta_fp16x2_hcudnn_fp16x2_128x64_relu_interior_nn_v1 Tactic: -7140760933967189247\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -7140760933967189247 Time: 0.043408\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: turing_fp16x2_hcudnn_fp16x2_128x32_relu_small_nn_v1 Tactic: -6004726995029373073\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -6004726995029373073 Time: 0.04836\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: turing_fp16x2_hcudnn_fp16x2_128x128_relu_small_nn_v1 Tactic: -5719726816705110014\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -5719726816705110014 Time: 0.071096\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: volta_fp16x2_hcudnn_fp16x2_128x32_relu_interior_nn_v1 Tactic: -4097850214384059472\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 
-4097850214384059472 Time: 0.044288\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: turing_fp16x2_hcudnn_fp16x2_128x64_relu_interior_nn_v1 Tactic: -3717489476759089008\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -3717489476759089008 Time: 0.043948\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: volta_fp16x2_hcudnn_fp16x2_128x128_relu_medium_nn_v1 Tactic: -3689982367035295496\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -3689982367035295496 Time: 0.070696\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: turing_fp16x2_hcudnn_fp16x2_128x128_relu_interior_nn_v1 Tactic: -2640575123064142123\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -2640575123064142123 Time: 0.070556\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: volta_fp16x2_hcudnn_fp16x2_128x128_relu_interior_nn_v1 Tactic: -2534402059426524406\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -2534402059426524406 Time: 0.070464\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: volta_fp16x2_hcudnn_fp16x2_128x32_relu_small_nn_v1 Tactic: -2027588946874785071\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -2027588946874785071 Time: 0.050204\n", "[01/17/2022-21:16:54] [V] [TRT] Gemm_1183 Set Tactic Name: turing_fp16x2_hcudnn_fp16x2_128x128_relu_medium_nn_v1 Tactic: -1968398013367819764\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: -1968398013367819764 Time: 0.071776\n", "[01/17/2022-21:16:54] [V] [TRT] Fastest Tactic: -7140760933967189247 Time: 0.043408\n", "[01/17/2022-21:16:54] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -7140760933967189247\n", "[01/17/2022-21:16:54] [V] [TRT] *************** Autotuning format combination: Half(96,1:8,96,96) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CudnnConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] CudnnConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CublasConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] CublasConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CaskConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] CaskConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:54] [V] [TRT] *************** Autotuning format combination: Half(96,1:8,96,96) -> Half(96,1:8,96,96) ***************\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CudaDepthwiseConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:54] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CudnnConvolution)\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 0 Time: 0.13148\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 1 Time: 0.091344\n", "[01/17/2022-21:16:54] [V] [TRT] Tactic: 2 Time: 0.129476\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 4 Time: 9.74942\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 5 Time: 0.988076\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 56 Time: 0.142716\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 58 Time: 0.137084\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 60 Time: 9.77079\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 61 Time: 0.970292\n", "[01/17/2022-21:16:55] [V] [TRT] Fastest Tactic: 1 Time: 0.091344\n", "[01/17/2022-21:16:55] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CublasConvolution)\n", "[01/17/2022-21:16:55] [V] 
[TRT] CublasConvolution has no valid tactics for this config, skipping\n", "[01/17/2022-21:16:55] [V] [TRT] --------------- Timing Runner: Gemm_1183 (CaskConvolution)\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x32x32_stage1_warpsize4x1x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 83696452256923412\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 83696452256923412 Time: 0.015088\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x32x64_stage1_warpsize2x1x2_g1_tensor16x8x8_simple_t1r1s1 Tactic: 106549059816437840\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 106549059816437840 Time: 0.011156\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_256x64_sliced1x2_ldg8_relu_exp_interior_nhwc_tn_v1 Tactic: 1179757074518529353\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 1179757074518529353 Time: 0.025384\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_256x64_ldg8_relu_exp_interior_nhwc_tn_v1 Tactic: 2105695814191699972\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 2105695814191699972 Time: 0.040064\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_256x64_ldg8_relu_exp_small_nhwc_tn_v1 Tactic: 2148106709480872763\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 2148106709480872763 Time: 0.026296\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_256x128_ldg8_relu_exp_small_nhwc_tn_v1 Tactic: 2410442691266548717\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 2410442691266548717 Time: 0.04114\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x256x64_stage1_warpsize1x4x2_g1_tensor16x8x8_t1r1s1 Tactic: 2511830168590723349\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 2511830168590723349 Time: 0.025296\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x128x32_stage1_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 2634905271404611895\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 2634905271404611895 Time: 0.018364\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x256x32_stage1_warpsize2x4x1_g1_tensor8x8x4_simple_t1r1s1 Tactic: 2689212690707793357\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 2689212690707793357 Time: 0.071948\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x128x32_stage1_warpsize2x2x1_g1_tensor8x8x4_t1r1s1 Tactic: 2798075085844016892\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 2798075085844016892 Time: 0.040068\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_128x128_ldg8_relu_exp_interior_nhwc_tn_v1 Tactic: 3041642431972138763\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 3041642431972138763 Time: 0.024572\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x128x32_stage1_warpsize2x2x1_g1_tensor8x8x4_simple_t1r1s1 Tactic: 3091156937974993800\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 3091156937974993800 Time: 0.039924\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x256x64_stage1_warpsize1x4x2_g1_tensor16x8x8_simple_t1r1s1 Tactic: 3199589679702517123\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 3199589679702517123 Time: 0.025544\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_128x128_ldg8_relu_exp_small_nhwc_tn_v1 Tactic: 3754069740140581927\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 3754069740140581927 Time: 0.041648\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x4_simple_t1r1s1 Tactic: 3932578551652369355\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 3932578551652369355 Time: 0.014232\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x128x32_stage1_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: 4149021101886580762\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 4149021101886580762 Time: 0.01828\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_256x64_ldg8_relu_exp_interior_nhwc_tn_v1 Tactic: 4555462412611657028\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 4555462412611657028 Time: 0.026616\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x32x32_stage1_warpsize4x1x1_g1_tensor16x8x8_t1r1s1 Tactic: 4749226340913476230\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 4749226340913476230 Time: 0.014892\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x64x32_stage1_warpsize2x2x1_g1_tensor8x8x4_t1r1s1 Tactic: 5483093640784800285\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 5483093640784800285 Time: 0.022744\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_128x128_ldg8_relu_exp_interior_nhwc_tn_v1 Tactic: 5666160310350604399\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 5666160310350604399 Time: 0.041588\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x32x32_stage1_warpsize4x1x1_g1_tensor8x8x4_t1r1s1 Tactic: 5900614001783877430\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 5900614001783877430 Time: 0.02688\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_256x128_ldg8_relu_exp_medium_nhwc_tn_v1 Tactic: 5925270497649423688\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 5925270497649423688 Time: 0.071348\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x128x32_stage1_warpsize2x2x1_g1_tensor8x8x4_t1r1s1 Tactic: 5999406432703271895\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 5999406432703271895 Time: 0.023356\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_256x64_sliced1x2_ldg8_relu_exp_medium_nhwc_tn_v1 Tactic: 6680916730816870145\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 6680916730816870145 Time: 0.041104\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 7107292614492808590\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 7107292614492808590 Time: 0.011236\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: 
volta_h884cudnn_256x64_ldg8_relu_exp_medium_nhwc_tn_v1 Tactic: 7158029511300006471\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 7158029511300006471 Time: 0.039796\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_256x64_sliced1x2_ldg8_relu_exp_small_nhwc_tn_v1 Tactic: 7859952145590271433\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 7859952145590271433 Time: 0.039924\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_128x128_ldg8_relu_exp_medium_nhwc_tn_v1 Tactic: 8283847742354150423\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 8283847742354150423 Time: 0.041424\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x128x32_stage1_warpsize4x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 8468288610222482742\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 8468288610222482742 Time: 0.040232\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x256x32_stage1_warpsize2x4x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 8620567263556985011\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 8620567263556985011 Time: 0.040912\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x4_simple_t1r1s1 Tactic: 8642279798680442080\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 8642279798680442080 Time: 0.039692\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x32x32_stage1_warpsize4x1x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 8980274178270132023\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 8980274178270132023 Time: 0.019588\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x128x32_stage1_warpsize2x2x1_g1_tensor8x8x4_simple_t1r1s1 Tactic: 9108067304506990859\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 9108067304506990859 Time: 0.0227\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x32x64_stage1_warpsize2x1x2_g1_tensor8x8x4_simple_t1r1s1 Tactic: -9104099172933216230\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -9104099172933216230 Time: 0.012056\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_256x64_ldg8_relu_exp_medium_nhwc_tn_v1 Tactic: -8992262742606384444\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -8992262742606384444 Time: 0.028628\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x32x32_stage1_warpsize4x1x1_g1_tensor8x8x4_simple_t1r1s1 Tactic: -8956720569082607796\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -8956720569082607796 Time: 0.026512\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x64x32_stage1_warpsize4x1x1_g1_tensor8x8x4_t1r1s1 Tactic: -8952042869709043207\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -8952042869709043207 Time: 0.039796\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x4_t1r1s1 Tactic: -8898856569474934280\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 
-8898856569474934280 Time: 0.03972\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x256x64_stage1_warpsize1x4x2_g1_tensor8x8x4_t1r1s1 Tactic: -8774805574135441656\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -8774805574135441656 Time: 0.038836\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x128x32_stage1_warpsize4x2x1_g1_tensor16x8x8_t1r1s1 Tactic: -8749513212655756001\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -8749513212655756001 Time: 0.04062\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x4_t1r1s1 Tactic: -8520017388966620486\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -8520017388966620486 Time: 0.014252\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x256x32_stage1_warpsize2x4x1_g1_tensor8x8x4_t1r1s1 Tactic: -8487084252145372186\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -8487084252145372186 Time: 0.070864\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_256x128_ldg8_relu_exp_interior_nhwc_tn_v1 Tactic: -8391760416076885205\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -8391760416076885205 Time: 0.06958\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x32x32_stage1_warpsize4x1x1_g1_tensor8x8x4_simple_t1r1s1 Tactic: -7990268040387498660\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -7990268040387498660 Time: 0.018204\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x128x32_stage1_warpsize4x2x1_g1_tensor8x8x4_t1r1s1 Tactic: -7849113095413980300\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -7849113095413980300 Time: 0.070416\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x64x32_stage1_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: -7533167286135592323\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -7533167286135592323 Time: 0.017352\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_256x64_sliced1x2_ldg8_relu_exp_medium_nhwc_tn_v1 Tactic: -6273232454637933930\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -6273232454637933930 Time: 0.025384\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_256x64_sliced1x2_ldg8_relu_exp_small_nhwc_tn_v1 Tactic: -5818527483287834165\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -5818527483287834165 Time: 0.02472\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x64x32_stage1_warpsize2x2x1_g1_tensor8x8x4_simple_t1r1s1 Tactic: -5590418898350402100\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -5590418898350402100 Time: 0.022412\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: -5505475137955795830\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -5505475137955795830 Time: 0.02578\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: 
sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x128x32_stage1_warpsize4x2x1_g1_tensor8x8x4_simple_t1r1s1 Tactic: -5389631537202601150\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -5389631537202601150 Time: 0.070112\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x64x32_stage1_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: -5332866838585594777\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -5332866838585594777 Time: 0.016544\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x128x32_stage1_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: -5121883532434354186\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -5121883532434354186 Time: 0.025336\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x64x32_stage1_warpsize4x1x1_g1_tensor16x8x8_t1r1s1 Tactic: -5006039300385557796\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -5006039300385557796 Time: 0.025984\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_256x128_ldg8_relu_exp_small_nhwc_tn_v1 Tactic: -4534876761957424274\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -4534876761957424274 Time: 0.069552\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: -4352168563838861262\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -4352168563838861262 Time: 0.025656\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x256x32_stage1_warpsize2x4x1_g1_tensor16x8x8_t1r1s1 Tactic: -4109084522508697633\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -4109084522508697633 Time: 0.04056\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_256x64_ldg8_relu_exp_small_nhwc_tn_v1 Tactic: -3237051169894153788\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -3237051169894153788 Time: 0.038716\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_128x128_ldg8_relu_exp_small_nhwc_tn_v1 Tactic: -3136088851200285532\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -3136088851200285532 Time: 0.024448\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x32x64_stage1_warpsize2x1x2_g1_tensor16x8x8_t1r1s1 Tactic: -2827934362840121038\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -2827934362840121038 Time: 0.010592\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x32x32_stage1_warpsize4x1x1_g1_tensor8x8x4_t1r1s1 Tactic: -2676138141351394855\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -2676138141351394855 Time: 0.018272\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize128x128x32_stage1_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: -2601537631049973288\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -2601537631049973288 Time: 0.025492\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_256x128_ldg8_relu_exp_medium_nhwc_tn_v1 Tactic: -2586046817576862066\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -2586046817576862066 Time: 
0.04092\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x32x32_stage1_warpsize4x1x1_g1_tensor16x8x8_t1r1s1 Tactic: -2569977342077121032\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -2569977342077121032 Time: 0.020356\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: volta_h884cudnn_256x64_sliced1x2_ldg8_relu_exp_interior_nhwc_tn_v1 Tactic: -2422160065350346448\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -2422160065350346448 Time: 0.040016\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x256x64_stage1_warpsize1x4x2_g1_tensor8x8x4_simple_t1r1s1 Tactic: -2125188058121029448\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -2125188058121029448 Time: 0.038936\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x64x32_stage1_warpsize4x1x1_g1_tensor8x8x4_simple_t1r1s1 Tactic: -2123887091022542343\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -2123887091022542343 Time: 0.039684\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: -1838109259315759592\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -1838109259315759592 Time: 0.0112\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_256x128_ldg8_relu_exp_interior_nhwc_tn_v1 Tactic: -1216445540764179377\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -1216445540764179377 Time: 0.040112\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm70_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x32x64_stage1_warpsize2x1x2_g1_tensor8x8x4_t1r1s1 Tactic: -539379305772590030\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -539379305772590030 Time: 0.012136\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize256x64x32_stage1_warpsize4x1x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: -288413895057594820\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -288413895057594820 Time: 0.025232\n", "[01/17/2022-21:16:55] [V] [TRT] Gemm_1183 Set Tactic Name: turing_h1688cudnn_128x128_ldg8_relu_exp_medium_nhwc_tn_v1 Tactic: -229563042944049199\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: -229563042944049199 Time: 0.025264\n", "[01/17/2022-21:16:55] [V] [TRT] Fastest Tactic: -2827934362840121038 Time: 0.010592\n", "[01/17/2022-21:16:55] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -2827934362840121038\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning Reformat:Float(768,1,1,1) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning Reformat:Float(768,1,768,768) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning Reformat:Float(768,1,768,768) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning Reformat:Half(768,1,1,1) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning Reformat:Half(384,1:2,1,1) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning Reformat:Half(384,1:2,1,1) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning 
Reformat:Half(96,1:8,96,96) -> Float(768,1,1,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning Reformat:Half(96,1:8,96,96) -> Half(768,1,1,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning format combination: Float(768,1,1,1) -> Float(768,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] --------------- Timing Runner: (Unnamed Layer* 1983) [Shuffle] (Shuffle)\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 0 Time: 0.003844\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 1 Time: 0.007316\n", "[01/17/2022-21:16:55] [V] [TRT] Fastest Tactic: 0 Time: 0.003844\n", "[01/17/2022-21:16:55] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Shuffle Tactic: 0\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning format combination: Half(768,1,1,1) -> Half(768,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] --------------- Timing Runner: (Unnamed Layer* 1983) [Shuffle] (Shuffle)\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 0 Time: 0.003796\n", "[01/17/2022-21:16:55] [V] [TRT] Tactic: 1 Time: 0.008308\n", "[01/17/2022-21:16:55] [V] [TRT] Fastest Tactic: 0 Time: 0.003796\n", "[01/17/2022-21:16:55] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Shuffle Tactic: 0\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning Reformat:Float(768,1) -> Half(768,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning Reformat:Half(768,1) -> Float(768,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] *************** Autotuning format combination: Float(768,1) -> Float(768,1) ***************\n", "[01/17/2022-21:16:55] [V] [TRT] --------------- Timing Runner: PWN(Tanh_1184) (PointWiseV2)\n", "[01/17/2022-21:16:56] [V] [TRT] Tactic: 0 Time: 0.00408\n", "[01/17/2022-21:16:56] [V] [TRT] Tactic: 1 Time: 0.004112\n", "[01/17/2022-21:16:56] [V] [TRT] Tactic: 2 Time: 0.004064\n", "[01/17/2022-21:16:56] [V] [TRT] Tactic: 3 Time: 0.004272\n", "[01/17/2022-21:16:57] [V] [TRT] Tactic: 4 Time: 0.004112\n", "[01/17/2022-21:16:57] [V] [TRT] Tactic: 5 Time: 0.006672\n", "[01/17/2022-21:16:57] [V] [TRT] Tactic: 6 Time: 0.009428\n", "[01/17/2022-21:16:57] [V] [TRT] Tactic: 7 Time: 0.008316\n", "[01/17/2022-21:16:57] [V] [TRT] Tactic: 8 Time: 0.007468\n", "[01/17/2022-21:16:58] [V] [TRT] Tactic: 9 Time: 0.00718\n", "[01/17/2022-21:16:58] [V] [TRT] Tactic: 28 Time: 0.006404\n", "[01/17/2022-21:16:58] [V] [TRT] Fastest Tactic: 2 Time: 0.004064\n", "[01/17/2022-21:16:58] [V] [TRT] --------------- Timing Runner: PWN(Tanh_1184) (PointWise)\n", "[01/17/2022-21:16:58] [V] [TRT] Tactic: 128 Time: 0.00822\n", "[01/17/2022-21:16:58] [V] [TRT] Tactic: 256 Time: 0.008284\n", "[01/17/2022-21:16:58] [V] [TRT] Tactic: 512 Time: 0.008224\n", "[01/17/2022-21:16:58] [V] [TRT] Fastest Tactic: 128 Time: 0.00822\n", "[01/17/2022-21:16:58] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: PointWiseV2 Tactic: 2\n", "[01/17/2022-21:16:58] [V] [TRT] *************** Autotuning format combination: Half(768,1) -> Half(768,1) ***************\n", "[01/17/2022-21:16:58] [V] [TRT] --------------- Timing Runner: PWN(Tanh_1184) (PointWiseV2)\n", "[01/17/2022-21:16:58] [V] [TRT] Tactic: 0 Time: 0.006528\n", "[01/17/2022-21:16:58] [V] [TRT] Tactic: 1 Time: 0.007292\n", "[01/17/2022-21:16:58] [V] [TRT] Tactic: 2 Time: 0.006644\n", "[01/17/2022-21:16:59] [V] [TRT] Tactic: 3 Time: 0.008196\n", "[01/17/2022-21:16:59] [V] [TRT] Tactic: 4 Time: 0.007256\n", "[01/17/2022-21:16:59] [V] [TRT] Tactic: 5 Time: 0.006556\n", "[01/17/2022-21:16:59] [V] [TRT] Tactic: 6 Time: 
0.009608\n", "[01/17/2022-21:17:00] [V] [TRT] Tactic: 7 Time: 0.008232\n", "[01/17/2022-21:17:00] [V] [TRT] Tactic: 8 Time: 0.00776\n", "[01/17/2022-21:17:00] [V] [TRT] Tactic: 9 Time: 0.007208\n", "[01/17/2022-21:17:00] [V] [TRT] Tactic: 28 Time: 0.006592\n", "[01/17/2022-21:17:00] [V] [TRT] Fastest Tactic: 0 Time: 0.006528\n", "[01/17/2022-21:17:00] [V] [TRT] --------------- Timing Runner: PWN(Tanh_1184) (PointWise)\n", "[01/17/2022-21:17:00] [V] [TRT] Tactic: 128 Time: 0.012256\n", "[01/17/2022-21:17:00] [V] [TRT] Tactic: 256 Time: 0.012472\n", "[01/17/2022-21:17:00] [V] [TRT] Tactic: 512 Time: 0.013432\n", "[01/17/2022-21:17:00] [V] [TRT] Fastest Tactic: 128 Time: 0.012256\n", "[01/17/2022-21:17:00] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: PointWiseV2 Tactic: 0\n", "[01/17/2022-21:17:00] [V] [TRT] *************** Autotuning Reformat:Half(768,1) -> Float(768,1) ***************\n", "[01/17/2022-21:17:00] [V] [TRT] --------------- Timing Runner: Optimizer Reformat (Reformat)\n", "[01/17/2022-21:17:00] [V] [TRT] Tactic: 1002 Time: 0.01026\n", "[01/17/2022-21:17:00] [V] [TRT] Tactic: 0 Time: 0.006648\n", "[01/17/2022-21:17:00] [V] [TRT] Fastest Tactic: 0 Time: 0.006648\n", "[01/17/2022-21:17:00] [V] [TRT] Adding reformat layer: Reformatted Output Tensor 0 to {ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]} (output) from Half(98304,768,1) to Float(98304,768,1)\n", "[01/17/2022-21:17:00] [V] [TRT] Adding reformat layer: Reformatted Input Tensor 0 to Gemm_1183 ((Unnamed Layer* 1978) [Shuffle]_output) from Float(768,1,1,1) to Half(96,1:8,96,96)\n", "[01/17/2022-21:17:00] [V] [TRT] Adding reformat layer: Reformatted Input Tensor 0 to (Unnamed Layer* 1983) [Shuffle] ((Unnamed Layer* 1979) [Fully Connected]_output) from Half(96,1:8,96,96) to Float(768,1,1,1)\n", "[01/17/2022-21:17:00] [V] [TRT] For layer Gather_1182 a non-conforming implementation was chosen than was requested i.e. requested layer computation precision and output precision types were ignored because it resulted in faster network performance. Enable strict mode to try force choose a conforming implementation.\n", "[01/17/2022-21:17:00] [V] [TRT] For layer (Unnamed Layer* 1978) [Shuffle] a non-conforming implementation was chosen than was requested i.e. requested layer computation precision and output precision types were ignored because it resulted in faster network performance. Enable strict mode to try force choose a conforming implementation.\n", "[01/17/2022-21:17:00] [V] [TRT] For layer (Unnamed Layer* 1983) [Shuffle] a non-conforming implementation was chosen than was requested i.e. requested layer computation precision and output precision types were ignored because it resulted in faster network performance. Enable strict mode to try force choose a conforming implementation.\n", "[01/17/2022-21:17:00] [V] [TRT] For layer PWN(Tanh_1184) a non-conforming implementation was chosen than was requested i.e. requested layer computation precision and output precision types were ignored because it resulted in faster network performance. 
Enable strict mode to try force choose a conforming implementation.\n", "[01/17/2022-21:17:00] [V] [TRT] Formats and tactics selection completed in 442.646 seconds.\n", "[01/17/2022-21:17:00] [V] [TRT] After reformat layers: 22 layers\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 14680064000\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 25165824\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 25165824\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 25165824\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 196608\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 512\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 512\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 512\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 512\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 512\n", "[01/17/2022-21:17:00] [V] [TRT] Block size 512\n", "[01/17/2022-21:17:00] [V] [TRT] Total Activation Memory: 14755761152\n", "[01/17/2022-21:17:00] [I] [TRT] Detected 2 inputs and 2 output network tensors.\n", "[W] [TRT] Myelin graph with multiple dynamic values may have poor performance if they differ. Dynamic values are: \n", "[01/17/2022-21:17:02] [W] [TRT] (# 0 (SHAPE attn_mask))\n", "[01/17/2022-21:17:02] [W] [TRT] (# 0 (SHAPE token_ids))\n", "[01/17/2022-21:17:02] [01/17/2022-21:20:15] [V] [TRT] Gemm_1183 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_f16f16_f16f16_f16_nhwckrsc_nhwc_tilesize64x32x64_stage1_warpsize2x1x2_g1_tensor16x8x8_t1r1s1 Tactic: -2827934362840121038\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: [HostToDeviceCopy] HostPersistent: 24 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: embeddings.token_type_embeddings.weight HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: 208 HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Slice_9 HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Expand_21 HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: embeddings.position_embeddings.weight HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: embeddings.position_ids HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Slice_36 HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: embeddings.word_embeddings.weight HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Gather_37 HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Gather_40 HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Gather_38 HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: {ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]} HostPersistent: 288 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Reformatting CopyNode for Output Tensor 0 to {ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]} HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: 1631 HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Gather_1182 HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Reformatting CopyNode for Input Tensor 0 to Gemm_1183 HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Gemm_1183 HostPersistent: 3776 DevicePersistent: 1184256\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: Reformatting CopyNode for Input Tensor 0 to (Unnamed Layer* 1983) [Shuffle] 
HostPersistent: 0 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [V] [TRT] Layer: PWN(Tanh_1184) HostPersistent: 244 DevicePersistent: 0\n", "[01/17/2022-21:20:17] [I] [TRT] Total Host Persistent Memory: 4352\n", "[01/17/2022-21:20:17] [I] [TRT] Total Device Persistent Memory: 1184256\n", "[01/17/2022-21:20:17] [I] [TRT] Total Scratch Memory: 327287516\n", "[01/17/2022-21:20:17] [I] [TRT] [MemUsageStats] Peak memory usage of TRT CPU/GPU memory allocators: CPU 166 MiB, GPU 4 MiB\n", "[01/17/2022-21:20:17] [V] [TRT] Using cublasLt a tactic source\n", "[01/17/2022-21:20:17] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +8, now: CPU 2168, GPU 1322 (MiB)\n", "[01/17/2022-21:20:17] [V] [TRT] Using cuDNN as a tactic source\n", "[01/17/2022-21:20:17] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU +8, now: CPU 2168, GPU 1330 (MiB)\n", "[01/17/2022-21:20:17] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +0, now: CPU 2168, GPU 1314 (MiB)\n", "[01/17/2022-21:20:17] [V] [TRT] Engine generation completed in 640.38 seconds.\n", "[01/17/2022-21:20:17] [V] [TRT] Deleting timing cache: 41 entries, 13 hits\n", "[01/17/2022-21:20:17] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +0, now: CPU 2167, GPU 1296 (MiB)\n", "[01/17/2022-21:20:17] [V] [TRT] Engine Layer Information:\n", "Layer(ShapeHostToDevice): [HostToDeviceCopy], Tactic: 0, -> 247[implicit padding 0][Int32()], attn_mask[implicit padding 1][Int32()], 315'{ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}[shuffle input][implicit padding 2][Int32()], (Unnamed Layer* 184) [Shuffle]_output'{ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}[implicit padding 3][Int32()], 321'{ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}[implicit padding 4][Int32()], 429'{ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}[shuffle input][implicit padding 5][Int32()]\n", "Layer(Constant): embeddings.token_type_embeddings.weight, Tactic: 0, -> (Unnamed Layer* 88) [Constant]_output[Half(2,768)]\n", "Layer(Constant): 208, Tactic: 0, -> (Unnamed Layer* 7) [Constant]_output[Int32(1,512)]\n", "Layer(Reformat): Slice_9, Tactic: 0, (Unnamed Layer* 7) [Constant]_output[Int32(1,128)] -> 215[Int32(1,128)]\n", "Layer(Slice): Expand_21, Tactic: 0, 215[Int32(1,128)] -> 227[Int32(-5,128)]\n", "Layer(Constant): embeddings.position_embeddings.weight, Tactic: 0, -> (Unnamed Layer* 91) [Constant]_output[Half(512,768)]\n", "Layer(Constant): embeddings.position_ids, Tactic: 0, -> (Unnamed Layer* 71) [Constant]_output[Int32(1,512)]\n", "Layer(Reformat): Slice_36, Tactic: 0, (Unnamed Layer* 71) [Constant]_output[Int32(1,128)] -> 246[Int32(1,128)]\n", "Layer(Constant): embeddings.word_embeddings.weight, Tactic: 0, -> (Unnamed Layer* 86) [Constant]_output[Half(105879,768)]\n", "Layer(Gather): Gather_37, Tactic: 7, (Unnamed Layer* 86) [Constant]_output[Half(105879,768)], token_ids[Int32(-5,128)] -> 247[Half(-5,128,768)]\n", "Layer(Gather): Gather_40, Tactic: 8, (Unnamed Layer* 91) [Constant]_output[Half(512,768)], 246[Int32(1,128)] -> 250[Half(1,128,768)]\n", "Layer(Gather): Gather_38, Tactic: 8, (Unnamed Layer* 88) [Constant]_output[Half(2,768)], 227[Int32(-5,128)] -> 248[Half(-5,128,768)]\n", "Layer(Myelin): {ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}, Tactic: 0, 247[Half(-5,128,768)], 248[Half(-5,128,768)], 250[Half(1,128,768)], attn_mask[Int32(-7,128)], 247[implicit padding 0][Int32()], attn_mask[implicit padding 1][Int32()], 315'{ForeignNode[1643 + (Unnamed Layer* 136) 
[Shuffle]...Add_1180]}[shuffle input][implicit padding 2][Int32()], (Unnamed Layer* 184) [Shuffle]_output'{ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}[implicit padding 3][Int32()], 321'{ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}[implicit padding 4][Int32()], 429'{ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}[shuffle input][implicit padding 5][Int32()] -> Reformatted Output Tensor 0 to {ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}[Half(-9,128,768)]\n", "Layer(Reformat): Reformatting CopyNode for Output Tensor 0 to {ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}, Tactic: 1002, Reformatted Output Tensor 0 to {ForeignNode[1643 + (Unnamed Layer* 136) [Shuffle]...Add_1180]}[Half(-9,128,768)] -> output[Float(-9,128,768)]\n", "Layer(Constant): 1631, Tactic: 0, -> (Unnamed Layer* 1971) [Constant]_output[Int32()]\n", "Layer(Gather): Gather_1182, Tactic: 9, output[Float(-9,128,768)], (Unnamed Layer* 1971) [Constant]_output[Int32()] -> 1632[Float(-9,768)]\n", "Layer(Reformat): Reformatting CopyNode for Input Tensor 0 to Gemm_1183, Tactic: 0, (Unnamed Layer* 1978) [Shuffle]_output[Float(-9,768,1,1)] -> Reformatted Input Tensor 0 to Gemm_1183[Half(-9,768,1,1)]\n", "Layer(CaskConvolution): Gemm_1183, Tactic: -2827934362840121038, Reformatted Input Tensor 0 to Gemm_1183[Half(-9,768,1,1)] -> (Unnamed Layer* 1979) [Fully Connected]_output[Half(-9,768,1,1)]\n", "Layer(Reformat): Reformatting CopyNode for Input Tensor 0 to (Unnamed Layer* 1983) [Shuffle], Tactic: 0, (Unnamed Layer* 1979) [Fully Connected]_output[Half(-9,768,1,1)] -> Reformatted Input Tensor 0 to (Unnamed Layer* 1983) [Shuffle][Float(-9,768,1,1)]\n", "Layer(PointWiseV2): PWN(Tanh_1184), Tactic: 2, 1633[Float(-9,768)] -> 1634[Float(-9,768)]\n", "[01/17/2022-21:20:17] [I] [TRT] [MemUsageSnapshot] Builder end: CPU 2166 MiB, GPU 1296 MiB\n", "[01/17/2022-21:20:19] [I] [TRT] Loaded engine size: 518 MB\n", "[01/17/2022-21:20:19] [I] [TRT] [MemUsageSnapshot] deserializeCudaEngine begin: CPU 2676 MiB, GPU 812 MiB\n", "[01/17/2022-21:20:21] [V] [TRT] Using cublasLt a tactic source\n", "[01/17/2022-21:20:21] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +10, now: CPU 2684, GPU 1304 (MiB)\n", "[01/17/2022-21:20:21] [V] [TRT] Using cuDNN as a tactic source\n", "[01/17/2022-21:20:21] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU +8, now: CPU 2684, GPU 1312 (MiB)\n", "[01/17/2022-21:20:21] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +0, now: CPU 2684, GPU 1294 (MiB)\n", "[01/17/2022-21:20:21] [V] [TRT] Deserialization required 1601544 microseconds.\n", "[01/17/2022-21:20:21] [I] [TRT] [MemUsageSnapshot] deserializeCudaEngine end: CPU 2684 MiB, GPU 1294 MiB\n", "[01/17/2022-21:20:27] [I] Engine built in 654.341 sec.\n", "[01/17/2022-21:20:27] [I] [TRT] [MemUsageSnapshot] ExecutionContext creation begin: CPU 1522 MiB, GPU 1294 MiB\n", "[01/17/2022-21:20:27] [V] [TRT] Using cublasLt a tactic source\n", "[01/17/2022-21:20:27] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +10, now: CPU 1522, GPU 1304 (MiB)\n", "[01/17/2022-21:20:27] [V] [TRT] Using cuDNN as a tactic source\n", "[01/17/2022-21:20:27] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU +8, now: CPU 1522, GPU 1312 (MiB)\n", "[01/17/2022-21:20:27] [V] [TRT] Total per-runner device memory is 1184256\n", "[01/17/2022-21:20:27] [V] [TRT] Total per-runner host memory is 4352\n", "[01/17/2022-21:20:27] [V] [TRT] Allocated activation device memory of size 
402984960\n", "[01/17/2022-21:20:27] [V] [TRT] myelinAllocCb allocated GPU (data-constants) 12 bytes at 0x30bcfb900.\n", "[01/17/2022-21:20:29] [I] [TRT] [MemUsageSnapshot] ExecutionContext creation end: CPU 1576 MiB, GPU 1702 MiB\n", "[01/17/2022-21:20:29] [I] Created input binding for token_ids with dimensions 16x128\n", "[01/17/2022-21:20:29] [I] Created input binding for attn_mask with dimensions 16x128\n", "[01/17/2022-21:20:29] [I] Created output binding for output with dimensions 16x128x768\n", "[01/17/2022-21:20:29] [I] Created output binding for 1634 with dimensions 16x768\n", "[01/17/2022-21:20:29] [I] Starting inference\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated GPU 3342848 bytes at 0x320c00000.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated GPU 6685696 bytes at 0x320c00000.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated GPU 9439232 bytes at 0x320c00000.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated GPU 18878464 bytes at 0x320c00000.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated GPU 40910848 bytes at 0x320c00000.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x7f112c003500.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated GPU 81821696 bytes at 0x320c00000.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x7f112c004c80.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated GPU 163643392 bytes at 0x320c00000.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x7f112c006380.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated GPU 327286784 bytes at 0x320c00000.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x7f112c007b00.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated GPU 40902656 bytes at 0x320c00000.\n", "[01/17/2022-21:20:29] [V] [TRT] myelinAllocCb allocated CPU 5764 bytes at 0x7f112c009200.\n", "[01/17/2022-21:20:32] [I] Warmup completed 7 queries over 200 ms\n", "[01/17/2022-21:20:32] [I] Timing trace has 168 queries over 3.02514 s\n", "[01/17/2022-21:20:32] [I] \n", "[01/17/2022-21:20:32] [I] === Trace details ===\n", "[01/17/2022-21:20:32] [I] Trace averages of 10 runs:\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 19.237 ms - Host latency: 20.2651 ms (end to end 20.2765 ms, enqueue 19.2234 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.8785 ms - Host latency: 18.9116 ms (end to end 18.9227 ms, enqueue 17.8665 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.7844 ms - Host latency: 18.8122 ms (end to end 18.8239 ms, enqueue 17.7711 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.7652 ms - Host latency: 18.7956 ms (end to end 18.8044 ms, enqueue 17.7528 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.7342 ms - Host latency: 18.7657 ms (end to end 18.778 ms, enqueue 17.7245 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 18.1462 ms - Host latency: 19.1782 ms (end to end 19.1909 ms, enqueue 18.1349 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.8361 ms - Host latency: 18.8657 ms (end to end 18.878 ms, enqueue 17.8222 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.801 ms - Host latency: 18.8345 ms (end to end 18.8473 ms, enqueue 17.7865 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 18.0522 ms - Host latency: 19.0841 ms (end to end 
19.095 ms, enqueue 18.0386 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.818 ms - Host latency: 18.8505 ms (end to end 18.8618 ms, enqueue 17.804 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.9759 ms - Host latency: 19.0073 ms (end to end 19.0205 ms, enqueue 17.9625 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 18.0702 ms - Host latency: 19.1039 ms (end to end 19.1167 ms, enqueue 18.0589 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.7052 ms - Host latency: 18.7375 ms (end to end 18.7495 ms, enqueue 17.6918 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.8359 ms - Host latency: 18.8684 ms (end to end 18.8801 ms, enqueue 17.8235 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.9514 ms - Host latency: 18.9839 ms (end to end 18.9964 ms, enqueue 17.9389 ms)\n", "[01/17/2022-21:20:32] [I] Average on 10 runs - GPU latency: 17.7322 ms - Host latency: 18.7659 ms (end to end 18.7769 ms, enqueue 17.717 ms)\n", "[01/17/2022-21:20:32] [I] \n", "[01/17/2022-21:20:32] [I] === Performance summary ===\n", "[01/17/2022-21:20:32] [I] Throughput: 55.5346 qps\n", "[01/17/2022-21:20:32] [I] Latency: min = 18.547 ms, max = 32.5411 ms, mean = 18.9891 ms, median = 18.874 ms, percentile(99%) = 19.7075 ms\n", "[01/17/2022-21:20:32] [I] End-to-End Host Latency: min = 18.5607 ms, max = 32.5616 ms, mean = 19.0009 ms, median = 18.8845 ms, percentile(99%) = 19.7186 ms\n", "[01/17/2022-21:20:32] [I] Enqueue Time: min = 17.5049 ms, max = 31.4764 ms, mean = 17.9446 ms, median = 17.8306 ms, percentile(99%) = 18.6627 ms\n", "[01/17/2022-21:20:32] [I] H2D Latency: min = 0.0102692 ms, max = 0.0169678 ms, mean = 0.0135384 ms, median = 0.0134277 ms, percentile(99%) = 0.0150757 ms\n", "[01/17/2022-21:20:32] [I] GPU Compute Time: min = 17.5187 ms, max = 31.5155 ms, mean = 17.9575 ms, median = 17.8443 ms, percentile(99%) = 18.6834 ms\n", "[01/17/2022-21:20:32] [I] D2H Latency: min = 0.97998 ms, max = 1.05371 ms, mean = 1.01799 ms, median = 1.01627 ms, percentile(99%) = 1.04321 ms\n", "[01/17/2022-21:20:32] [I] Total Host Walltime: 3.02514 s\n", "[01/17/2022-21:20:32] [I] Total GPU Compute Time: 3.01686 s\n", "[W] [01/17/2022-21:20:32] * Throughput may be bound by Enqueue Time rather than GPU Compute and the GPU may be under-utilized.\n", "[01/17/2022-21:20:32] [W] If not already in use, --useCudaGraph (utilize CUDA graphs where possible) may increase the throughput.\n", "[01/17/2022-21:20:32] [I] Explanations of the performance metrics are printed in the verbose logs.\n", "[01/17/2022-21:20:32] [V] \n", "[01/17/2022-21:20:32] [V] === Explanations of the performance metrics ===\n", "[01/17/2022-21:20:32] [V] Total Host Walltime: the host walltime from when the first query (after warmups) is enqueued to when the last query is completed.\n", "[01/17/2022-21:20:32] [V] GPU Compute Time: the GPU latency to execute the kernels for a query.\n", "[01/17/2022-21:20:32] [V] Total GPU Compute Time: the summation of the GPU Compute Time of all the queries. If this is significantly shorter than Total Host Walltime, the GPU may be under-utilized because of host-side overheads or data transfers.\n", "[01/17/2022-21:20:32] [V] Throughput: the observed throughput computed by dividing the number of queries by the Total Host Walltime. 
If this is significantly lower than the reciprocal of GPU Compute Time, the GPU may be under-utilized because of host-side overheads or data transfers.\n", "[01/17/2022-21:20:32] [V] Enqueue Time: the host latency to enqueue a query. If this is longer than GPU Compute Time, the GPU may be under-utilized.\n", "[01/17/2022-21:20:32] [V] H2D Latency: the latency for host-to-device data transfers for input tensors of a single query.\n", "[01/17/2022-21:20:32] [V] D2H Latency: the latency for device-to-host data transfers for output tensors of a single query.\n", "[01/17/2022-21:20:32] [V] Latency: the summation of H2D Latency, GPU Compute Time, and D2H Latency. This is the latency to infer a single query.\n", "[01/17/2022-21:20:32] [V] End-to-End Host Latency: the duration from when the H2D of a query is called to when the D2H of the same query is completed, which includes the latency to wait for the completion of the previous query. This is the latency of a query if multiple queries are enqueued consecutively.\n", "[01/17/2022-21:20:32] [I] \n", "&&&& PASSED TensorRT.trtexec [TensorRT v8001] # trtexec --onnx=model.onnx --saveEngine=model_bs16.plan --minShapes=token_ids:1x128,attn_mask:1x128 --optShapes=token_ids:16x128,attn_mask:16x128 --maxShapes=token_ids:128x128,attn_mask:128x128 --fp16 --verbose --workspace=14000\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing GPU at 0x320c00000.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing GPU at 0x320c00000.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing GPU at 0x320c00000.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing GPU at 0x320c00000.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing GPU at 0x320c00000.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing CPU at 0x7f112c003500.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing GPU at 0x320c00000.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing CPU at 0x7f112c004c80.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing GPU at 0x320c00000.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing CPU at 0x7f112c006380.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing GPU at 0x320c00000.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing CPU at 0x7f112c007b00.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing GPU at 0x320c00000.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing CPU at 0x7f112c009200.\n", "[01/17/2022-21:20:32] [V] [TRT] myelinFreeCb freeing GPU at 0x30bcfb900.\n", "[01/17/2022-21:20:32] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +0, now: CPU 1522, GPU 1680 (MiB)\n" ] } ], "source": [ "!docker run --gpus=all --rm -it \\\n", " -v `pwd`/workspace:/workspace nvcr.io/nvidia/pytorch:21.08-py3 \\\n", " /bin/bash generate_models.sh" ] }, { "cell_type": "markdown", "id": "527e9eef", "metadata": {}, "source": [ "## PyTorch NLP-Bert\n", "\n", "For a simple use case we will take the pre-trained NLP Bert model from [Hugging Face](https://huggingface.co/transformers/model_doc/bert.html) and deploy it on SageMaker with Triton as the model server. The script for exporting this model can be found [here](./workspace/pt_exporter.py). This is run as part of the `generate_models.sh` script from the previous cell. After the model is serialized we package it into the format that Triton and SageMaker expect it to be. 
We use the pre-configured `config.pbtxt` file provided with this repo [here](./triton-serve-pt/bert/config.pbtxt) to specify the model [configuration](https://github.com/triton-inference-server/server/blob/main/docs/model_configuration.md) which Triton uses to load the model. We tar the model directory and upload it to s3 to later create a [SageMaker Model](https://sagemaker.readthedocs.io/en/stable/api/inference/model.html).\n", "\n", "**Note**: SageMaker expects the model tarball file to have a top level directory with the same name as the model defined in the `config.pbtxt`.\n", "\n", "```\n", "bert\n", "├── 1\n", "│ └── model.pt\n", "└── config.pbtxt\n", "```" ] }, { "cell_type": "markdown", "id": "30061e77", "metadata": {}, "source": [ "### PyTorch: Packaging model files and uploading to s3" ] }, { "cell_type": "code", "execution_count": 74, "id": "a5caf869", "metadata": {}, "outputs": [], "source": [ "!mkdir -p triton-serve-pt/bert/1/\n", "!cp workspace/model.pt triton-serve-pt/bert/1/\n", "!tar -C triton-serve-pt/ -czf model.tar.gz bert\n", "model_uri = sagemaker_session.upload_data(path=\"model.tar.gz\", key_prefix=\"triton-serve-pt\")" ] }, { "cell_type": "markdown", "id": "0ff72cfe", "metadata": {}, "source": [ "### PyTorch: Create SageMaker Endpoint\n", "\n", "We start off by creating a [sagemaker model](https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_CreateModel.html) from the model files we uploaded to s3 in the previous step.\n", "\n", "In this step we also provide an additional environment variable, `SAGEMAKER_TRITON_DEFAULT_MODEL_NAME`, which specifies the name of the model to be loaded by Triton. **The value of this key should match the folder name in the model package uploaded to s3**. This variable is optional in the case of a single model. For ensemble models, this key **has to be** specified for Triton to start up in SageMaker.\n", "\n", "Additionally, customers can set `SAGEMAKER_TRITON_BUFFER_MANAGER_THREAD_COUNT` and `SAGEMAKER_TRITON_THREAD_COUNT` to optimize the thread counts.\n", "\n", "*Note*: The current release of Triton (21.08-py3) on SageMaker doesn't support running instances of different models on the same server, except in the case of [ensembles](https://github.com/triton-inference-server/server/blob/main/docs/architecture.md#ensemble-models). Only multiple model instances of the same model are supported, which can be specified under the [instance-groups](https://github.com/triton-inference-server/server/blob/main/docs/model_configuration.md#instance-groups) section of the config.pbtxt file."
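, "\n", "\n", "For illustration, a container definition that also sets the optional thread-count variables could look like the sketch below. The thread-count values are placeholders to show the shape of the `Environment` map, not tuned recommendations; the cells that follow only set `SAGEMAKER_TRITON_DEFAULT_MODEL_NAME`.\n", "\n", "```\n", "# Sketch only: the container definition from the next cell, with the optional tuning knobs added.\n", "container = {\n", "    \"Image\": triton_image_uri,\n", "    \"ModelDataUrl\": model_uri,\n", "    \"Environment\": {\n", "        \"SAGEMAKER_TRITON_DEFAULT_MODEL_NAME\": \"bert\",\n", "        # Placeholder values -- tune for your own workload\n", "        \"SAGEMAKER_TRITON_BUFFER_MANAGER_THREAD_COUNT\": \"4\",\n", "        \"SAGEMAKER_TRITON_THREAD_COUNT\": \"8\",\n", "    },\n", "}\n", "```"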
] }, { "cell_type": "code", "execution_count": 75, "id": "e2008819", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Model Arn: arn:aws:sagemaker:us-east-1:474422712127:model/triton-nlp-bert-pt-2022-01-17-21-22-32\n" ] } ], "source": [ "sm_model_name = \"triton-nlp-bert-pt-\" + time.strftime(\"%Y-%m-%d-%H-%M-%S\", time.gmtime())\n", "\n", "container = {\n", " \"Image\": triton_image_uri,\n", " \"ModelDataUrl\": model_uri,\n", " \"Environment\": {\"SAGEMAKER_TRITON_DEFAULT_MODEL_NAME\": \"bert\"},\n", "}\n", "\n", "create_model_response = sm.create_model(\n", " ModelName=sm_model_name, ExecutionRoleArn=role, PrimaryContainer=container\n", ")\n", "\n", "print(\"Model Arn: \" + create_model_response[\"ModelArn\"])" ] }, { "cell_type": "markdown", "id": "6fa3bc28", "metadata": {}, "source": [ "Using the model above, we create an [endpoint configuration](https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_CreateEndpointConfig.html) where we can specify the type and number of instances we want in the endpoint." ] }, { "cell_type": "code", "execution_count": 76, "id": "1c0f6ae1", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Endpoint Config Arn: arn:aws:sagemaker:us-east-1:474422712127:endpoint-config/triton-nlp-bert-pt-2022-01-17-21-22-36\n" ] } ], "source": [ "endpoint_config_name = \"triton-nlp-bert-pt-\" + time.strftime(\"%Y-%m-%d-%H-%M-%S\", time.gmtime())\n", "\n", "create_endpoint_config_response = sm.create_endpoint_config(\n", " EndpointConfigName=endpoint_config_name,\n", " ProductionVariants=[\n", " {\n", " \"InstanceType\": \"ml.g4dn.16xlarge\",\n", " \"InitialVariantWeight\": 1,\n", " \"InitialInstanceCount\": 1,\n", " \"ModelName\": sm_model_name,\n", " \"VariantName\": \"AllTraffic\",\n", " }\n", " ],\n", ")\n", "\n", "print(\"Endpoint Config Arn: \" + create_endpoint_config_response[\"EndpointConfigArn\"])" ] }, { "cell_type": "markdown", "id": "2ef7a7fc", "metadata": {}, "source": [ "Using the above endpoint configuration we create a new sagemaker endpoint and wait for the deployment to finish. The status will change to **InService** once the deployment is successful." 
] }, { "cell_type": "code", "execution_count": 77, "id": "b72e317b", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Endpoint Arn: arn:aws:sagemaker:us-east-1:474422712127:endpoint/triton-nlp-bert-pt-multilingual2022-01-17-21-22-39\n" ] } ], "source": [ "endpoint_name = \"triton-nlp-bert-pt-multilingual\" + time.strftime(\"%Y-%m-%d-%H-%M-%S\", time.gmtime())\n", "\n", "create_endpoint_response = sm.create_endpoint(\n", " EndpointName=endpoint_name, EndpointConfigName=endpoint_config_name\n", ")\n", "\n", "print(\"Endpoint Arn: \" + create_endpoint_response[\"EndpointArn\"])" ] }, { "cell_type": "code", "execution_count": 78, "id": "3d459b27", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Status: Creating\n", "Status: Creating\n", "Status: Creating\n", "Status: Creating\n", "Status: Creating\n", "Status: Creating\n", "Status: InService\n", "Arn: arn:aws:sagemaker:us-east-1:474422712127:endpoint/triton-nlp-bert-pt-multilingual2022-01-17-21-22-39\n", "Status: InService\n" ] } ], "source": [ "resp = sm.describe_endpoint(EndpointName=endpoint_name)\n", "status = resp[\"EndpointStatus\"]\n", "print(\"Status: \" + status)\n", "\n", "while status == \"Creating\":\n", " time.sleep(60)\n", " resp = sm.describe_endpoint(EndpointName=endpoint_name)\n", " status = resp[\"EndpointStatus\"]\n", " print(\"Status: \" + status)\n", "\n", "print(\"Arn: \" + resp[\"EndpointArn\"])\n", "print(\"Status: \" + status)" ] }, { "cell_type": "markdown", "id": "ccebe031", "metadata": {}, "source": [ "### PyTorch: Run inference\n", "\n", "Once we have the endpoint running we can use a sample text to do an inference using json as the payload format. For inference request format, Triton uses the KFServing community standard [inference protocols](https://github.com/triton-inference-server/server/blob/main/docs/protocol/README.md)." ] }, { "cell_type": "code", "execution_count": 81, "id": "f9533527", "metadata": {}, "outputs": [], "source": [ "text_triton = \"Triton Inference Server provides a cloud and edge inferencing solution optimized for both CPUs and GPUs.\"\n", "input_ids, attention_mask = tokenize_text(text_triton)\n", "\n", "payload = {\n", " \"inputs\": [\n", " {\"name\": \"INPUT__0\", \"shape\": [1, 128], \"datatype\": \"INT32\", \"data\": input_ids},\n", " {\"name\": \"INPUT__1\", \"shape\": [1, 128], \"datatype\": \"INT32\", \"data\": attention_mask},\n", " ]\n", "}\n", "\n", "response = client.invoke_endpoint(\n", " EndpointName=endpoint_name, ContentType=\"application/octet-stream\", Body=json.dumps(payload)\n", ")\n", "\n", "#print(json.loads(response[\"Body\"].read().decode(\"utf8\")))" ] }, { "cell_type": "markdown", "id": "e18d857b", "metadata": {}, "source": [ "We can also use binary+json as the payload format to get better performance for the inference call. The specification of this format is provided [here](https://github.com/triton-inference-server/server/blob/main/docs/protocol/extension_binary_data.md).\n", "\n", "**Note:** With the `binary+json` format, we have to specify the length of the request metadata in the header to allow Triton to correctly parse the binary payload. This is done using a custom Content-Type header `application/vnd.sagemaker-triton.binary+json;json-header-size={}`.\n", "\n", "Please not, this is different from using `Inference-Header-Content-Length` header on a stand-alone Triton server since custom headers are not allowed in SageMaker." 
] }, { "cell_type": "code", "execution_count": 82, "id": "638d2a4d", "metadata": {}, "outputs": [], "source": [ "text_sm = \"Amazon SageMaker helps data scientists and developers to prepare, build, train, and deploy high-quality machine learning (ML) models quickly by bringing together a broad set of capabilities purpose-built for ML.\"\n", "request_body, header_length = get_sample_tokenized_text_binary_pt(text_sm)\n" ] }, { "cell_type": "code", "execution_count": 84, "id": "34f2e79b", "metadata": {}, "outputs": [], "source": [ "response = client.invoke_endpoint(\n", " EndpointName=endpoint_name,\n", " ContentType=\"application/vnd.sagemaker-triton.binary+json;json-header-size={}\".format(\n", " header_length\n", " ),\n", " Body=request_body,\n", ")\n", "# Parse json header size length from the response\n", "header_length_prefix = \"application/vnd.sagemaker-triton.binary+json;json-header-size=\"\n", "header_length_str = response[\"ContentType\"][len(header_length_prefix) :]\n", "\n", "# Read response body\n", "result = httpclient.InferenceServerClient.parse_response_body(\n", " response[\"Body\"].read(), header_length=int(header_length_str)\n", ")\n", "output0_data = result.as_numpy(\"OUTPUT__0\")\n", "output1_data = result.as_numpy(\"1634__1\")\n", "#output0_data" ] }, { "cell_type": "code", "execution_count": 102, "id": "26f91d07", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "Errors - 0.0000 out of 100.0000 total runs | 0.0000% in 1.5954 seconds \n", "\n", "\n", "TPS: 62.6818\n", "Client end-to-end latency percentiles:\n", "Avg | P50 | P90 | P95 | P100\n", "14.5439 | 14.3293 | 14.7731 | 15.0954 | 27.8587 \n", "\n" ] } ], "source": [ "import numpy as np \n", "import datetime\n", "import math\n", "import time\n", "import random\n", "import json\n", "\n", "total_runs = 100\n", "client_times = []\n", "errors_list = []\n", "\n", "cw_start = datetime.datetime.utcnow()\n", "errors = 0\n", "\n", "endpoint_name = \"triton-nlp-bert-pt-multilingual2022-01-17-21-22-39\" #hard coded to display\n", "for _ in range(total_runs):\n", " client_start = time.time()\n", " response = client.invoke_endpoint(\n", " EndpointName=endpoint_name,\n", " ContentType=\"application/vnd.sagemaker-triton.binary+json;json-header-size={}\".format(\n", " header_length\n", " ),\n", " Body=request_body)\n", " client_end = time.time()\n", " client_times.append((client_end - client_start)*1000)\n", " skunk = response['Body'].read()\n", "\n", "cw_end = datetime.datetime.utcnow() \n", "\n", "cw_duration = cw_end - cw_start \n", "duration_in_s = cw_duration.total_seconds() \n", "\n", "tps = total_runs/duration_in_s\n", "\n", "print('\\nErrors - {:.4f} out of {:.4f} total runs | {:.4f}% in {:.4f} seconds \\n'.format(errors, total_runs, (errors/total_runs)*100, duration_in_s))\n", "errors = 0\n", "\n", "print('\\nTPS: {:.4f}'.format(tps))\n", " \n", "print('Client end-to-end latency percentiles:')\n", "client_avg = np.mean(client_times)\n", "client_p50 = np.percentile(client_times, 50)\n", "client_p90 = np.percentile(client_times, 90)\n", "client_p95 = np.percentile(client_times, 95)\n", "client_p100 = np.percentile(client_times, 100)\n", "print('Avg | P50 | P90 | P95 | P100')\n", "print('{:.4f} | {:.4f} | {:.4f} | {:.4f} | {:.4f} \\n'.format(client_avg, client_p50, client_p90, client_p95, client_p100))\n", "\n", "# Give 5 minute buffer to end\n", "cw_end += datetime.timedelta(minutes=5)" ] }, { "cell_type": "markdown", "id": "17a74307", "metadata": {}, "source": [ "### PyTorch: 
Terminate endpoint and clean up artifacts" ] }, { "cell_type": "code", "execution_count": 30, "id": "14705e8f", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'ResponseMetadata': {'RequestId': '08494bca-3a5f-4d0d-8b55-fc9490527570',\n", " 'HTTPStatusCode': 200,\n", " 'HTTPHeaders': {'x-amzn-requestid': '08494bca-3a5f-4d0d-8b55-fc9490527570',\n", " 'content-type': 'application/x-amz-json-1.1',\n", " 'content-length': '0',\n", " 'date': 'Mon, 17 Jan 2022 20:17:02 GMT'},\n", " 'RetryAttempts': 0}}" ] }, "execution_count": 30, "metadata": {}, "output_type": "execute_result" } ], "source": [ "sm.delete_model(ModelName=sm_model_name)\n", "sm.delete_endpoint_config(EndpointConfigName=endpoint_config_name)\n", "sm.delete_endpoint(EndpointName=endpoint_name)" ] }, { "cell_type": "markdown", "id": "3d1d0951", "metadata": {}, "source": [ "## TensorRT NLP-Bert\n", "\n", "Another way to improve performance is to convert the PyTorch NLP-Bert model to a TensorRT plan and use it natively to run inferences on Triton. By using the [onnx_exporter.py](./workspace/onnx_exporter.py) script and `trtexec`, we create a TensorRT plan from the pre-trained PyTorch NLP-Bert model. This is already done as part of the `generate_models.sh` script that we ran earlier in this notebook. We'll package the model and the provided `config.pbtxt` according to the Triton model specification and upload them to s3 for creating a SageMaker model and endpoint." ] }, { "cell_type": "markdown", "id": "22a9d650", "metadata": {}, "source": [ "### TensorRT: Packaging model files and uploading to s3" ] }, { "cell_type": "code", "execution_count": 88, "id": "a45e5b0b", "metadata": {}, "outputs": [], "source": [ "!mkdir -p triton-serve-trt/bert/1/\n", "!cp workspace/model_bs16.plan triton-serve-trt/bert/1/model.plan\n", "!tar -C triton-serve-trt/ -czf model.tar.gz bert\n", "model_uri = sagemaker_session.upload_data(path=\"model.tar.gz\", key_prefix=\"triton-serve-trt\")" ] }, { "cell_type": "markdown", "id": "2c8d74c1", "metadata": {}, "source": [ "### TensorRT: Create SageMaker Endpoint" ] }, { "cell_type": "code", "execution_count": 89, "id": "242a2016", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Model Arn: arn:aws:sagemaker:us-east-1:474422712127:model/triton-nlp-bert-trt-2022-01-17-21-35-00\n" ] } ], "source": [ "sm_model_name = \"triton-nlp-bert-trt-\" + time.strftime(\"%Y-%m-%d-%H-%M-%S\", time.gmtime())\n", "\n", "container = {\n", " \"Image\": triton_image_uri,\n", " \"ModelDataUrl\": model_uri,\n", " \"Environment\": {\"SAGEMAKER_TRITON_DEFAULT_MODEL_NAME\": \"bert\"},\n", "}\n", "\n", "create_model_response = sm.create_model(\n", " ModelName=sm_model_name, ExecutionRoleArn=role, PrimaryContainer=container\n", ")\n", "\n", "print(\"Model Arn: \" + create_model_response[\"ModelArn\"])" ] }, { "cell_type": "code", "execution_count": 90, "id": "0496483d", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Endpoint Config Arn: arn:aws:sagemaker:us-east-1:474422712127:endpoint-config/triton-nlp-bert-trt-2022-01-17-21-35-03\n" ] } ], "source": [ "endpoint_config_name = \"triton-nlp-bert-trt-\" + time.strftime(\"%Y-%m-%d-%H-%M-%S\", time.gmtime())\n", "\n", "create_endpoint_config_response = sm.create_endpoint_config(\n", " EndpointConfigName=endpoint_config_name,\n", " ProductionVariants=[\n", " {\n", " \"InstanceType\": \"ml.g4dn.16xlarge\",\n", " \"InitialVariantWeight\": 1,\n", " \"InitialInstanceCount\": 1,\n", " \"ModelName\": sm_model_name,\n", " " 
\"VariantName\": \"AllTraffic\",\n", " }\n", " ],\n", ")\n", "\n", "print(\"Endpoint Config Arn: \" + create_endpoint_config_response[\"EndpointConfigArn\"])" ] }, { "cell_type": "code", "execution_count": 92, "id": "c531ceda", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Endpoint Arn: arn:aws:sagemaker:us-east-1:474422712127:endpoint/triton-nlp-bert-trt-multingual2022-01-17-21-37-48\n" ] } ], "source": [ "endpoint_name = \"triton-nlp-bert-trt-multingual\" + time.strftime(\"%Y-%m-%d-%H-%M-%S\", time.gmtime())\n", "\n", "create_endpoint_response = sm.create_endpoint(\n", " EndpointName=endpoint_name, EndpointConfigName=endpoint_config_name\n", ")\n", "\n", "print(\"Endpoint Arn: \" + create_endpoint_response[\"EndpointArn\"])" ] }, { "cell_type": "code", "execution_count": 93, "id": "a65c80a3", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Status: Creating\n", "Status: Creating\n", "Status: Creating\n", "Status: Creating\n", "Status: Creating\n", "Status: Creating\n", "Status: InService\n", "Arn: arn:aws:sagemaker:us-east-1:474422712127:endpoint/triton-nlp-bert-trt-multingual2022-01-17-21-37-48\n", "Status: InService\n" ] } ], "source": [ "resp = sm.describe_endpoint(EndpointName=endpoint_name)\n", "status = resp[\"EndpointStatus\"]\n", "print(\"Status: \" + status)\n", "\n", "while status == \"Creating\":\n", " time.sleep(60)\n", " resp = sm.describe_endpoint(EndpointName=endpoint_name)\n", " status = resp[\"EndpointStatus\"]\n", " print(\"Status: \" + status)\n", "\n", "print(\"Arn: \" + resp[\"EndpointArn\"])\n", "print(\"Status: \" + status)" ] }, { "cell_type": "markdown", "id": "ae4fe0bd", "metadata": {}, "source": [ "### TensorRT: Run inference\n", "\n", "Once we have the endpoint running we can run the inference both using a json payload and binary+json payload as described in the standard PyTorch deployment section." 
] }, { "cell_type": "code", "execution_count": 94, "id": "e5cfc757", "metadata": {}, "outputs": [], "source": [ "input_ids, attention_mask = tokenize_text(text_triton)\n", "\n", "payload = {\n", " \"inputs\": [\n", " {\"name\": \"token_ids\", \"shape\": [1, 128], \"datatype\": \"INT32\", \"data\": input_ids},\n", " {\"name\": \"attn_mask\", \"shape\": [1, 128], \"datatype\": \"INT32\", \"data\": attention_mask},\n", " ]\n", "}" ] }, { "cell_type": "code", "execution_count": 95, "id": "677f7839", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "CPU times: user 10.7 ms, sys: 0 ns, total: 10.7 ms\n", "Wall time: 133 ms\n" ] } ], "source": [ "%%time\n", "response = client.invoke_endpoint(\n", " EndpointName=endpoint_name, ContentType=\"application/octet-stream\", Body=json.dumps(payload)\n", ")" ] }, { "cell_type": "code", "execution_count": 97, "id": "5dca6fd0", "metadata": {}, "outputs": [], "source": [ "#print(json.loads(response[\"Body\"].read().decode(\"utf8\")))" ] }, { "cell_type": "code", "execution_count": 99, "id": "3c106716", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "Errors - 0.0000 out of 100.0000 total runs | 0.0000% in 3.8443 seconds \n", "\n", "\n", "TPS: 26.0124\n", "Client end-to-end latency percentiles:\n", "Avg | P50 | P90 | P95 | P100\n", "30.9116 | 30.1847 | 31.2072 | 32.4075 | 73.1871 \n", "\n" ] } ], "source": [ "import numpy as np \n", "import datetime\n", "import math\n", "import time\n", "import random\n", "import json\n", "\n", "runtime_sm_client = boto3.client(service_name='sagemaker-runtime')\n", "total_runs = 100\n", "content_type = \"application/octet-stream\" \n", "client_times = []\n", "errors_list = []\n", "\n", "cw_start = datetime.datetime.utcnow()\n", "errors = 0\n", "\n", "for _ in range(total_runs):\n", " client_start = time.time()\n", " response = runtime_sm_client.invoke_endpoint(\n", " EndpointName=endpoint_name,\n", " ContentType=content_type,\n", " Body=json.dumps(payload))\n", " client_end = time.time()\n", " client_times.append((client_end - client_start)*1000)\n", " skunk = response['Body'].read()\n", "\n", "cw_end = datetime.datetime.utcnow() \n", "\n", "cw_duration = cw_end - cw_start \n", "duration_in_s = cw_duration.total_seconds() \n", "\n", "tps = total_runs/duration_in_s\n", "\n", "print('\\nErrors - {:.4f} out of {:.4f} total runs | {:.4f}% in {:.4f} seconds \\n'.format(errors, total_runs, (errors/total_runs)*100, duration_in_s))\n", "errors = 0\n", "\n", "print('\\nTPS: {:.4f}'.format(tps))\n", " \n", "print('Client end-to-end latency percentiles:')\n", "client_avg = np.mean(client_times)\n", "client_p50 = np.percentile(client_times, 50)\n", "client_p90 = np.percentile(client_times, 90)\n", "client_p95 = np.percentile(client_times, 95)\n", "client_p100 = np.percentile(client_times, 100)\n", "print('Avg | P50 | P90 | P95 | P100')\n", "print('{:.4f} | {:.4f} | {:.4f} | {:.4f} | {:.4f} \\n'.format(client_avg, client_p50, client_p90, client_p95, client_p100))\n", "\n", "# Give 5 minute buffer to end\n", "cw_end += datetime.timedelta(minutes=5)" ] }, { "cell_type": "code", "execution_count": 48, "id": "4de1b0eb", "metadata": {}, "outputs": [], "source": [ "request_body, header_length = get_sample_tokenized_text_binary_trt(text_sm)" ] }, { "cell_type": "code", "execution_count": 56, "id": "ee531cfc", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "CPU times: user 10.9 ms, sys: 0 ns, total: 10.9 ms\n", "Wall time: 51 ms\n" ] 
} ], "source": [ "%%time\n", "response = client.invoke_endpoint(\n", " EndpointName=endpoint_name,\n", " ContentType=\"application/vnd.sagemaker-triton.binary+json;json-header-size={}\".format(\n", " header_length\n", " ),\n", " Body=request_body,\n", ")" ] }, { "cell_type": "code", "execution_count": null, "id": "03a42ebf", "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "markdown", "id": "9f5f2412", "metadata": {}, "source": [ "### TensorRT: Terminate endpoint and clean up artifacts" ] }, { "cell_type": "code", "execution_count": null, "id": "c8d6020d", "metadata": {}, "outputs": [], "source": [ "sm.delete_endpoint(EndpointName=endpoint_name)\n", "sm.delete_endpoint_config(EndpointConfigName=endpoint_config_name)\n", "sm.delete_model(ModelName=sm_model_name)" ] } ], "metadata": { "kernelspec": { "display_name": "conda_python3", "language": "python", "name": "conda_python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.13" } }, "nbformat": 4, "nbformat_minor": 5 }