Your ONNX model has been generated with INT64 weights #173

Open
opened 2021-12-08 18:58:00 +01:00 by nonlin · 0 comments
nonlin commented 2021-12-08 18:58:00 +01:00 (Migrated from github.com)

[W:onnxruntime:Default, tensorrt_execution_provider.h:53 onnxruntime::TensorrtLogger::log] [2021-12-08 16:53:27 WARNING] D:\a_work\1\s\cmake\external\onnx-tensorrt\onnx2trt_utils.cpp:364: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.

Any way around this?

It still functions, although with a slight delay at the start.
