big difference in results using saved_model.tflite when compared to nsfwjs.com #109
Comments
While using tflite on Android, the input RGB is normalised from [0,255] to [0,1].
Tried the predict.classify() JavaScript function of this project with v1.1.0's saved_model.h5. It gave exactly the same results as my Android code (using saved_model.tflite). Wondering if the model used on nsfwjs.com is the same as v1.1.0 of this repo or something else. @GantMan can you please clarify?
The v1.1.0 model is much worse than nsfwjs.com. No idea why.
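One common cause of this kind of divergence is an input-scaling mismatch: the app may feed [0,1] while the model was trained on a different range (e.g. [-1,1], the MobileNet convention). A minimal sketch of the two conventions, in plain Python for illustration only; whether either matches nsfwjs.com's preprocessing is an assumption to verify:

```python
# Two common input-scaling conventions; feeding a model the wrong one
# can drastically change its predictions.
def scale_zero_one(pixels):
    """Map 8-bit pixel values [0, 255] -> [0.0, 1.0]."""
    return [p / 255.0 for p in pixels]

def scale_minus_one_one(pixels):
    """Map 8-bit pixel values [0, 255] -> [-1.0, 1.0] (MobileNet-style)."""
    return [p / 127.5 - 1.0 for p in pixels]

row = [0, 128, 255]
print(scale_zero_one(row))       # [0.0, 0.50196..., 1.0]
print(scale_minus_one_one(row))  # [-1.0, 0.00392..., 1.0]
```

If the Android code and nsfwjs.com disagree on this step, identical models will still give different outputs on the same JPEG.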
If anyone finds this thread in the future: I ended up using Bumble's Private Detector for my NSFW-FLASK service. Bumble's model has higher precision and accuracy, though it is about 400 MB compressed while this one is 135 MB.
Wow, thanks!
I have used saved_model.tflite from v1.1.0 and v1.2.0. The image used is 224x224, the same JPEG in both tests. Here are the results:
nsfwjs.com
Neutral - 67.35%
Porn - 31.40%
Drawing - 0.79%
Hentai - 0.26%
Sexy - 0.19%
saved_model.tflite (v1.1.0, on android with tensorflow lite)
porn: 0.8919
sexy: 0.0455
hentai: 0.0355
drawings: 0.0209
neutral: 0.0062
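To make the "big difference" concrete, a small sketch comparing the top-1 labels from the two runs reported above (the probability values are taken from this thread; the dictionary names are my own):

```python
# Reported class probabilities from the two runs in this thread.
nsfwjs_com = {"neutral": 0.6735, "porn": 0.3140, "drawing": 0.0079,
              "hentai": 0.0026, "sexy": 0.0019}
tflite_v110 = {"porn": 0.8919, "sexy": 0.0455, "hentai": 0.0355,
               "drawings": 0.0209, "neutral": 0.0062}

def top_label(scores):
    """Return the label with the highest reported probability."""
    return max(scores, key=scores.get)

print(top_label(nsfwjs_com))   # neutral
print(top_label(tflite_v110))  # porn
```

The two runs disagree even on the top-1 class, which points to a preprocessing or model-version mismatch rather than ordinary floating-point noise.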
The JPEG used is here (safe-for-work):
I am new to this, so correct me if I am wrong; I have simply used this tflite model without any changes. Am I using the correct version?