
[Web] Onnxruntime web result compare to Python Onnx #22953

Open
ueihgnurt opened this issue Nov 27, 2024 · 6 comments
Labels
platform:web issues related to ONNX Runtime web; typically submitted using template

Comments

@ueihgnurt

Describe the issue

Onnx Web Inference.
Image
Python + Onnx Inference.
Image
This is what happens if I remove the Conv2d layer from the model:
Image

As you can see, the input is right, and the output is still right if there is no Conv2d layer.
The problem appears when I put the Conv2d layer in. All tensors use float32.
I wonder what is causing this issue.

To reproduce

The Model:
Image

Urgency

No response

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.20.1

Execution Provider

'wasm'/'cpu' (WebAssembly CPU)

@ueihgnurt ueihgnurt added the platform:web issues related to ONNX Runtime web; typically submitted using template label Nov 27, 2024
@skottmckay
Contributor

Example code for how you're running the model in python and ORT web would be helpful.

@ueihgnurt
Author

The dummy code is here: git@github.com:ueihgnurt/test_onnx_web.git

@skottmckay
Contributor

https://github.com/ueihgnurt/test_onnx_web/blob/main/python_examples/python_example.ipynb is using opencv not onnxruntime to run the model afaics so there's no direct comparison to ORT web. I would suggest checking that the pre and post processing you're doing of the input/output is correct for the onnx model.

Additionally, compare the raw float32 values produced for a given set of input values to remove any pre/post processing differences from the equation.
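The raw-value comparison suggested above can be sketched in a few lines of stdlib-only Python. The helper names here, and the idea of dumping the ORT web output buffer to a raw binary file, are illustrative assumptions, not part of either runtime's API:

```python
import struct

def max_abs_diff(a, b):
    """Largest element-wise absolute difference between two
    equal-length sequences of floats."""
    if len(a) != len(b):
        raise ValueError("outputs differ in length: %d vs %d" % (len(a), len(b)))
    return max(abs(x - y) for x, y in zip(a, b))

def load_f32(path):
    """Load a raw little-endian float32 dump (e.g. the bytes of an
    ORT web output tensor's Float32Array, saved to a file)."""
    with open(path, "rb") as f:
        raw = f.read()
    return struct.unpack("<%df" % (len(raw) // 4), raw)

# Identical outputs should give a difference of exactly 0.0;
# a small epsilon is normal, a huge difference points at a real bug.
web_out = [0.12, -3.4, 5.0]
py_out = [0.12, -3.4, 5.0]
print(max_abs_diff(web_out, py_out))  # 0.0
```

Feeding both runtimes the same fixed tensor (e.g. all ones) and comparing with `max_abs_diff` removes image decoding and pre/post processing from the equation entirely.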

@ueihgnurt
Author


So I tried using this code

Image

The result is the same as the OpenCV one, and I didn't do any processing at all.
The error only occurs if I add Conv2d into the model. You can test with this model: https://github.com/ueihgnurt/test_onnx_web/blob/main/vueexamples/public/onnxdinov2/test_empty_model.onnx
It contains no Conv2d and just returns the original image.
With this ONNX model, both the web version and the Python version return the same result.

@ueihgnurt
Author

I mean, all I did was divide the RGB values by 255.0 and feed them into the model. How could the difference between the web version and the Python version be this big?
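For reference, the preprocessing described here usually also needs a layout conversion: canvas/ImageData gives an interleaved HWC buffer, while most ONNX conv models expect planar NCHW input. A minimal stdlib-only sketch of that step (the function name and shapes are hypothetical, not taken from the repo above):

```python
def hwc_to_nchw(pixels, h, w, c=3):
    """pixels: flat list of length h*w*c in HWC order (R,G,B,R,G,B,...).
    Returns a flat list in CHW order (all R, then all G, then all B),
    with each value divided by 255.0."""
    out = [0.0] * (c * h * w)
    for y in range(h):
        for x in range(w):
            for ch in range(c):
                out[ch * h * w + y * w + x] = pixels[(y * w + x) * c + ch] / 255.0
    return out

# 2x2 RGB image with pixel values 0..11 in HWC order.
hwc = list(range(12))
chw = hwc_to_nchw(hwc, 2, 2)
# The red plane comes first: source pixels 0, 3, 6, 9 scaled by 1/255.
print(chw[:4])
```

If the Python pipeline transposes to CHW but the web pipeline feeds the interleaved buffer directly (or vice versa), the Conv2d sees scrambled channels even though the "input image" looks correct when displayed.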

@ueihgnurt
Author


I don't think the result above has much to do with float32 precision,
since you can clearly see the Conv2d output looks like the original image cut into a 3x3 grid with the parts shuffled around. That is clearly not what a Conv2d does, or at least not how I understand it to work.
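A "cut into tiles and shuffled" appearance is a classic symptom of a tensor-layout mismatch rather than a broken convolution. This is speculation about the cause, but the effect is easy to reproduce in plain Python: read a flat buffer with the wrong stride convention and every value survives, just in the wrong place.

```python
def reinterpret_chw_as_hwc(buf, h, w, c=3):
    """Read a CHW buffer as if it were HWC, mimicking a producer/consumer
    layout mismatch. Returns the (wrongly) reassembled CHW data."""
    out = [0] * len(buf)
    for y in range(h):
        for x in range(w):
            for ch in range(c):
                # The consumer believes element (y, x, ch) lives at the
                # HWC offset, so it reads from the wrong position.
                out[ch * h * w + y * w + x] = buf[(y * w + x) * c + ch]
    return out

chw = list(range(12))            # a correct 3-channel 2x2 CHW tensor
wrong = reinterpret_chw_as_hwc(chw, 2, 2)
print(wrong != chw)              # True: the data is scrambled...
print(sorted(wrong) == chw)      # True: ...but no value is lost
```

Every pixel value is intact, only the positions are permuted, which on a real image shows up as displaced blocks rather than numeric noise.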
