GroupNormalizationPlugin not found
#4312
Comments
If you have built nvinfer_plugin.so, you can add some print statements in your code (for example around https://github.com/NVIDIA/TensorRT/blob/release/10.7/plugin/groupNormalizationPlugin/groupNormalizationPlugin.cpp#L350) to check that the right version of your plugin is being loaded.
Hi @lix19937
I was able to bit-match the results with PyTorch and the ONNX model as well.
@hrishi121 You are welcome!
Description
I have a simple test model in PyTorch with two conv layers and a group norm, and I want to serialize it to TensorRT. I converted the PyTorch model to ONNX opset 20. The GroupNorm layer is decomposed into InstanceNorm plus a few other operations in the ONNX graph. I have been following the onnx_packnet example, which suggests replacing the InstanceNorm nodes with a "GroupNormalizationPlugin" node (as shown in the sample post_processing.py script). I followed the build instructions for "Native build on Jetson (aarch64)" and was able to build the TensorRT OSS repo on Jetson AGX Orin.
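As background on why the exporter emits this pattern: group normalization over an (N, C, H, W) tensor is exactly instance normalization applied to the input reshaped to (N, G, C//G * H * W), followed by a per-channel affine step, which is why the ONNX graph shows an InstanceNormalization node surrounded by Reshape/Mul/Add nodes. A minimal NumPy sketch of the decomposition (the `group_norm` helper is illustrative, not from the issue):

```python
import numpy as np

def group_norm(x, num_groups, gamma, beta, eps=1e-5):
    """Group norm on NCHW input, written the way ONNX exporters
    decompose it: reshape to (N, G, -1), normalize each group like an
    InstanceNorm channel, reshape back, then apply the per-channel
    scale and bias."""
    n, c, h, w = x.shape
    # (N, C, H, W) -> (N, G, C//G * H * W): each group becomes one
    # "instance" channel for the InstanceNormalization step.
    xg = x.reshape(n, num_groups, -1)
    mean = xg.mean(axis=-1, keepdims=True)
    var = xg.var(axis=-1, keepdims=True)
    xg = (xg - mean) / np.sqrt(var + eps)
    x = xg.reshape(n, c, h, w)
    # Per-channel affine parameters, broadcast over H and W.
    return x * gamma.reshape(1, c, 1, 1) + beta.reshape(1, c, 1, 1)

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8, 4, 4)).astype(np.float32)
y = group_norm(x, num_groups=4,
               gamma=np.ones(8, np.float32), beta=np.zeros(8, np.float32))
# With unit gamma / zero beta, every group ends up with ~zero mean
# and ~unit variance.
grouped = y.reshape(2, 4, -1)
```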
However, when I try to parse the ONNX model using trtexec, I keep getting the error that "GroupNormalizationPlugin" was not found.
These are the commands that I am running to parse the ONNX model using trtexec, where /home/nvidia/TensorRT/ refers to the TensorRT OSS path (its out directory contains the compiled .so files) and /home/nvidia/TensorRT-10.7.0.23/lib is the path to the TensorRT 10.7 GA release downloaded from the NVIDIA website.
Environment
TensorRT Version: 10.7.0.23
NVIDIA GPU: Jetson AGX Orin
NVIDIA Driver Version:
CUDA Version: 12.6
CUDNN Version:
Operating System: Jetson native build
Python Version (if applicable):
Tensorflow Version (if applicable):
PyTorch Version (if applicable): 2.5.1
Baremetal or Container (if so, version):
Relevant Files
Model link: replaced_plugin.onnx
Steps To Reproduce
Commands or scripts:
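A sketch of the kind of trtexec invocation described above, using the paths given in this issue; the exact .so location under the OSS build tree is an assumption, and `--staticPlugins` is the TensorRT 10 trtexec option for preloading a plugin library so its creators are registered before the ONNX parser runs:

```shell
# Make the GA release libraries visible (path from the issue).
export LD_LIBRARY_PATH=/home/nvidia/TensorRT-10.7.0.23/lib:$LD_LIBRARY_PATH

# Preload the OSS-built nvinfer_plugin so GroupNormalizationPlugin is
# registered; the .so path below is an assumed build-output location.
trtexec \
    --onnx=replaced_plugin.onnx \
    --staticPlugins=/home/nvidia/TensorRT/build/out/libnvinfer_plugin.so
```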
Have you tried the latest release?: Yes
Can this model run on other frameworks? For example, run the ONNX model with ONNXRuntime (polygraphy run <model.onnx> --onnxrt):