[Web] Onnxruntime web result compare to Python Onnx #22953
Comments
Example code for how you're running the model in Python and ORT web would be helpful.
https://github.com/ueihgnurt/test_onnx_web/blob/main/python_examples/python_example.ipynb is using OpenCV, not onnxruntime, to run the model as far as I can see, so there's no direct comparison to ORT web. I would suggest checking that the pre- and post-processing you're doing on the input/output is correct for the ONNX model. Additionally, compare the raw float32 values produced for a given set of input values to remove any pre/post-processing differences from the equation.
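One way to do that numeric comparison is to feed a fixed, deterministic input to the Python side and diff the raw float32 outputs against what ORT web returns for the same input. A minimal sketch (the model path, input shape, and saved-output filename are assumptions, not from the issue):

```python
import numpy as np

def max_abs_diff(a, b):
    """Element-wise max absolute difference between two float32 arrays."""
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    assert a.shape == b.shape, f"shape mismatch: {a.shape} vs {b.shape}"
    return float(np.max(np.abs(a - b)))

# On the Python side, the reference output would come from onnxruntime,
# e.g. (uncomment when onnxruntime is installed; names are hypothetical):
# import onnxruntime as ort
# sess = ort.InferenceSession("model.onnx")
# x = np.full((1, 3, 224, 224), 0.5, dtype=np.float32)  # fixed input
# ref = sess.run(None, {sess.get_inputs()[0].name: x})[0]
# np.save("ref_output.npy", ref)  # diff these values against ORT web's output

# Demo with synthetic data: identical arrays diff to exactly 0
ref = np.linspace(0, 1, 12, dtype=np.float32).reshape(3, 4)
web = ref.copy()
print(max_abs_diff(ref, web))  # -> 0.0
```

If the max absolute difference on the raw tensors is tiny (order 1e-6 for float32), the discrepancy is in pre/post-processing rather than in the runtime.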
So I tried using this code and the result is the same as the OpenCV one. I didn't even process anything.
I mean, all I did was divide the RGB values by 255.0 and then feed them into the model. How could the difference be this big between the web version and the Python version?
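For reference, the usual preprocessing for a float32 conv model is to scale the RGB bytes by 255 and transpose the image from HWC to NCHW layout. A minimal sketch, with the assumption that the model expects NCHW input in [0, 1] (the actual model isn't shown in the thread):

```python
import numpy as np

def preprocess(rgb_hwc: np.ndarray) -> np.ndarray:
    """uint8 HxWx3 RGB image -> float32 1x3xHxW tensor scaled to [0, 1]."""
    x = rgb_hwc.astype(np.float32) / 255.0  # scale bytes to [0, 1]
    x = np.transpose(x, (2, 0, 1))          # HWC -> CHW
    return x[np.newaxis, ...]               # add batch dim -> NCHW

img = np.zeros((4, 4, 3), dtype=np.uint8)
img[..., 0] = 255  # pure red
out = preprocess(img)
print(out.shape)        # (1, 3, 4, 4)
print(out[0, 0].max())  # 1.0
```

A common web-side pitfall: canvas `ImageData` is RGBA and row-major HWC, so forgetting to drop the alpha channel or to transpose to CHW leaves the conv layer consuming misaligned data, which produces exactly this kind of large divergence from the Python result.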
Describe the issue
Onnx Web Inference.
Python + Onnx Inference.
This is what happens if I remove the Conv2d layer from the model:
As you can see, the input is right, so the output is still right when there is no Conv2d layer.
The problem appears when I put the Conv2d layer back in. All of the tensors use float32.
I wonder what is causing this issue.
To reproduce
The Model:
Urgency
No response
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.20.1
Execution Provider
'wasm'/'cpu' (WebAssembly CPU)