Describe the issue

I'm using kokoro-onnx for TTS generation with the CoreML execution provider on macOS (M1), and session creation fails with the following error:

```
2025-01-06 20:04:29.684416 [W:onnxruntime:, helper.cc:88 IsInputSupported] CoreML does not support shapes with dimension values of 0. Input:/Slice_1_output_0, shape: {0}
2025-01-06 20:04:29.684759 [W:onnxruntime:, helper.cc:88 IsInputSupported] CoreML does not support shapes with dimension values of 0. Input:/decoder/generator/m_source/l_sin_gen/Slice_output_0, shape: {0}
2025-01-06 20:04:29.685270 [W:onnxruntime:, helper.cc:82 IsInputSupported] CoreML does not support input dim > 16384. Input:decoder.generator.stft.stft.window_sum, shape: {5000015}
2025-01-06 20:04:29.686710 [W:onnxruntime:, coreml_execution_provider.cc:115 GetCapability] CoreMLExecutionProvider::GetCapability, number of partitions supported by CoreML: 123 number of nodes in the graph: 2361 number of nodes supported by CoreML: 949
Traceback (most recent call last):
  File "/Volumes/Internal/audio/kokoro-onnx/examples/with_session.py", line 14, in <module>
    session = InferenceSession("kokoro-v0_19.onnx", providers=["CoreMLExecutionProvider"])
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Volumes/Internal/audio/kokoro-onnx/.venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 465, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/Volumes/Internal/audio/kokoro-onnx/.venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 537, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : model_builder.cc:768 RegisterModelInputOutput Unable to get shape for output: /Squeeze_output_0
```
Related:
To reproduce
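A minimal reproduction, reconstructed from the traceback above (it points at `examples/with_session.py` in the kokoro-onnx repo). It assumes `kokoro-v0_19.onnx`, downloaded from the kokoro-onnx releases, is in the current working directory; the failure occurs while the session is being initialized, before any inference runs:

```python
import onnxruntime as ort

# Sanity check: CoreML should be listed on a macOS build of onnxruntime.
print(ort.get_available_providers())

# Raises onnxruntime.capi.onnxruntime_pybind11_state.Fail during
# session initialization when only the CoreML EP is requested.
session = ort.InferenceSession(
    "kokoro-v0_19.onnx",
    providers=["CoreMLExecutionProvider"],
)
```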
Urgency
Not blocking, but inference is too slow on the CPU execution provider.
Platform
Mac
OS Version
14.5 (23F79)
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
onnxruntime v1.20.1
ONNX Runtime API
Python
Architecture
ARM64
Execution Provider
CoreML
Execution Provider Library Version
No response