Error when running #215

Open
opened 2022-03-22 08:27:34 +01:00 by theoldsong · 2 comments
theoldsong commented 2022-03-22 08:27:34 +01:00 (Migrated from github.com)

ValueError: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'], ...)
I get this error when running. What is the cause? I'm on Windows 10.
Thanks for taking the time to answer.

ghost commented 2022-03-22 17:56:01 +01:00 (Migrated from github.com)
see https://github.com/neuralchen/SimSwap/issues/176#issuecomment-991705311
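For reference, the fix discussed in that issue is to pass the `providers` argument explicitly when creating the `InferenceSession`, which ORT 1.9+ requires. A minimal sketch (the helper `pick_providers` is a hypothetical name, not part of SimSwap or onnxruntime; the model path is a placeholder):

```python
# Hypothetical helper: keep only the preferred providers that this ORT build
# actually supports, falling back to CPU so the list is never empty.
def pick_providers(available, preferred=("CUDAExecutionProvider", "CPUExecutionProvider")):
    chosen = [p for p in preferred if p in available]
    return chosen or ["CPUExecutionProvider"]

# Usage sketch (requires onnxruntime and a real .onnx model file):
# import onnxruntime as ort
# sess = ort.InferenceSession(
#     "model.onnx",
#     providers=pick_providers(ort.get_available_providers()),
# )
```

`ort.get_available_providers()` returns the providers compiled into the installed build, so the session is constructed only with providers that can actually load.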
theoldsong commented 2022-03-23 01:25:01 +01:00 (Migrated from github.com)

> see #176 (comment)

Thanks. I followed those steps but still get the error.
