After synchronizing the local 'compreface-postgres-db' database to the server's 'compreface-postgres-db' with third-party software, the images and subject lists display correctly. However, when I run RECOGNITION (face recognition), faces are identified correctly locally but not on the online server.
After some experimenting, running 'docker-compose restart compreface-api' solves the problem and face recognition works correctly on the server.
My understanding is that the compreface-api service loads face data from PostgreSQL into RAM as a cache. When faces are added locally through the SDK or the REST API, does the code update this cache?
However, synchronizing only the database to the online server, without updating the cache, means the corresponding faces are not recognized. Is there a way to trigger a cache update when the database is updated, or a command to refresh the cache manually?
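For context, this is roughly how I add faces locally; a minimal sketch with Python's requests library against the recognition service's add-example endpoint, where the host, port, and API key are placeholders for my setup:

```python
import requests

# Placeholders for my deployment -- adjust host/port and key as needed.
COMPREFACE_URL = "http://localhost:8000"
API_KEY = "<recognition-service-api-key>"

def add_face(subject: str, image_path: str) -> dict:
    """Add one example image for a subject via the recognition REST API."""
    with open(image_path, "rb") as f:
        response = requests.post(
            f"{COMPREFACE_URL}/api/v1/recognition/faces",
            params={"subject": subject},
            headers={"x-api-key": API_KEY},
            files={"file": f},
        )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(add_face("subject_001", "subject_001_01.jpg"))
```

When faces are added this way on the local instance, recognition picks them up immediately, which is why I suspect the add endpoint also refreshes the in-memory cache.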
Additionally, if one face image is 200 KB and a subject has 6 images, how many subjects can be stored?
Does the API load all faces into RAM? Could RAM run out, and is there a limit on the number of subjects?
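To make the capacity question concrete, here is a back-of-envelope estimate using my own assumptions (200 KB per image, 6 images per subject) and the worst case that full images, not just embeddings, are held in RAM; the 4 GB figure is only an example, not a CompreFace limit:

```python
# Back-of-envelope estimate under the assumptions stated above.
IMAGE_SIZE_KB = 200        # assumed size of one face image
IMAGES_PER_SUBJECT = 6     # assumed images per subject
AVAILABLE_RAM_GB = 4       # example figure only

bytes_per_subject = IMAGE_SIZE_KB * 1024 * IMAGES_PER_SUBJECT
available_bytes = AVAILABLE_RAM_GB * 1024 ** 3

max_subjects = available_bytes // bytes_per_subject
print(f"~{max_subjects:,} subjects fit in {AVAILABLE_RAM_GB} GB "
      f"if full images were cached")  # ~3,495 with these numbers
```

If the API caches embeddings rather than the raw images, the real capacity should be much higher; that is exactly what I would like to confirm.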