Problem ValueError with SimSwap #445

Closed
opened 2023-09-20 04:16:26 +02:00 by planettich · 4 comments
planettich commented 2023-09-20 04:16:26 +02:00 (Migrated from github.com)

It gives me this error at the end :

ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)

Can anyone help me? thank you very much anyone can answer me

woctezuma commented 2023-09-20 23:05:17 +02:00 (Migrated from github.com)
See:

- https://github.com/neuralchen/SimSwap/issues/316#issuecomment-1728448536
TransAmMan commented 2023-09-27 02:57:37 +02:00 (Migrated from github.com)

> See:
>
> * [ValueError: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled. #316 (comment)](https://github.com/neuralchen/SimSwap/issues/316#issuecomment-1728448536)

Can someone help me implement this fix? I use a hosted GPU runtime, and the fix assumes a local runtime.
neuralchen commented 2023-09-27 03:45:53 +02:00 (Migrated from github.com)

This bug is caused by timm.
We have updated the installation instructions; please reinstall timm with the pinned version:
timm==0.5.4

TransAmMan commented 2023-09-27 14:27:15 +02:00 (Migrated from github.com)

Thanks for your reply. I installed timm==0.5.4 as you suggested, trying it in different positions in the installation list:

## !pip install timm==0.5.4

!pip install insightface==0.2.1 onnxruntime moviepy
!pip install timm==0.5.4
!pip install googledrivedownloader
!pip install imageio==2.4.1

## !pip install timm==0.5.4

I used the released version of SimSwap:

https://colab.research.google.com/github/neuralchen/SimSwap/blob/main/SimSwap%20colab.ipynb#scrollTo=Y5K4au_UCkKn

and I disconnected and deleted the runtime before each attempt. All attempts resulted in the same error:

ValueError Traceback (most recent call last)
in <cell line: 30>()
28 ## model.eval()
29
---> 30 app = Face_detect_crop(name='antelope', root='./insightface_func/models')
31 app.prepare(ctx_id= 0, det_thresh=0.6, det_size=(640,640),mode=mode)
32

5 frames
/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py in _create_inference_session(self, providers, provider_options, disabled_optimizers)
449 if not providers and len(available_providers) > 1:
450 self.disable_fallback()
--> 451 raise ValueError(
452 f"This ORT build has {available_providers} enabled. "
453 "Since ORT 1.9, you are required to explicitly set "

ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
