THRESHOLD ERROR #139

Closed
opened 2021-10-31 23:54:40 +01:00 by seonake · 5 comments
seonake commented 2021-10-31 23:54:40 +01:00 (Migrated from github.com)

Hi, I am getting this error.
Does anyone know how to fix it?
Many thanks in advance.

Traceback (most recent call last):
  File "test_video_swapspecific.py", line 50, in <module>
    img_a_align_crop, _ = app.get(img_a_whole,crop_size)
  File "H:\faceswap\insightface_func\face_detect_crop_multi.py", line 55, in get
    bboxes, kpss = self.det_model.detect(img,
TypeError: detect() got an unexpected keyword argument 'threshold'
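The error means the installed insightface's `detect()` no longer accepts a `threshold` keyword; the API changed in releases after the 0.2.x series that SimSwap's `insightface_func` code was written against. A minimal sketch of a version check you could run before deciding to downgrade (the `simswap_compatible` helper is hypothetical, not part of either project):

```python
def simswap_compatible(version: str) -> bool:
    """Return True if this insightface version predates the detect() API change.

    Assumption (based on this thread): SimSwap's bundled insightface_func code
    targets insightface 0.2.x, where detect() still accepted `threshold`.
    """
    major, minor = (int(x) for x in version.split(".")[:2])
    return (major, minor) <= (0, 2)

# Example of checking the installed version (Python 3.8+):
# import importlib.metadata
# simswap_compatible(importlib.metadata.version("insightface"))
```

If the check returns False, downgrading as suggested in the next comment should restore the expected `detect()` signature.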

NNNNAI commented 2021-11-01 08:33:48 +01:00 (Migrated from github.com)

The version of your insightface is too high. Uninstall insightface and reinstall it with `pip install insightface==0.2.1`.

seonake commented 2021-11-01 11:43:46 +01:00 (Migrated from github.com)

OK. Many thanks

NNNNAI commented 2021-11-01 13:00:28 +01:00 (Migrated from github.com)

Have a nice day.

woctezuma commented 2023-09-20 23:43:38 +02:00 (Migrated from github.com)

For info, there is an issue with version 0.2.1 now.

Traceback (most recent call last):
  File "/content/SimSwap/test_wholeimage_swapsingle.py", line 55, in <module>
    app = Face_detect_crop(name='antelope', root='./insightface_func/models')
  File "/content/SimSwap/insightface_func/face_detect_crop_single.py", line 40, in __init__
    model = model_zoo.get_model(onnx_file)
  File "/usr/local/lib/python3.10/dist-packages/insightface/model_zoo/model_zoo.py", line 56, in get_model
    model = router.get_model()
  File "/usr/local/lib/python3.10/dist-packages/insightface/model_zoo/model_zoo.py", line 23, in get_model
    session = onnxruntime.InferenceSession(self.onnx_file, None)
  File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 432, in __init__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 451, in _create_inference_session
    raise ValueError(
ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
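The ValueError spells out the fix: since onnxruntime 1.9, `InferenceSession` must be given an explicit `providers` list. A minimal sketch of a patch along those lines (the `pick_providers` helper is hypothetical, not part of insightface or onnxruntime):

```python
def pick_providers(available):
    """Build an explicit providers list for onnxruntime.InferenceSession.

    Prefer CUDA when the ORT build supports it, otherwise fall back to CPU.
    """
    preferred = ["CUDAExecutionProvider", "CPUExecutionProvider"]
    chosen = [p for p in preferred if p in available]
    return chosen or ["CPUExecutionProvider"]

# The failing call in insightface's model_zoo.py would then become, e.g.:
# import onnxruntime
# session = onnxruntime.InferenceSession(
#     self.onnx_file,
#     providers=pick_providers(onnxruntime.get_available_providers()),
# )
```

This keeps the old single-argument behavior on CPU-only builds while satisfying the ORT 1.9+ requirement.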
- https://github.com/neuralchen/SimSwap/issues/445
woctezuma commented 2023-09-21 11:36:22 +02:00 (Migrated from github.com)
Related:
- #407

I can fix this with:
- https://github.com/neuralchen/SimSwap/pull/447