I am planning to use CompreFace to build a face recognition system, mainly for recognizing faces in videos. The total duration of the videos is estimated at several thousand hours, with an expected number of recognized face images in the millions. I have a few areas of confusion and would appreciate your assistance:
1. I noticed a mention of 200,000 data per application. Does the 200,000 refer to 200,000 subjects or 200,000 face images?
2. In the face detection application, is there a configuration option to set a minimum face size requirement, such as 40×40 pixels or larger? I only see a configuration for the maximum size (max_detect_size).
3. Other discussions assume a face library on the order of several million. Would a single server with 64 GB of memory and an RTX 4080 or 4090 GPU be sufficient? Is there corresponding test data, such as how much memory and GPU capacity are needed for 1 million face images?
4. I noticed you have a version with a vector database. Is it possible to replace the existing PostgreSQL version with Redis to improve performance?
5. Is there an update, or a plan to update, the face detection model, especially for Asian face recognition? I have tested some Asian face images, and there seem to be no issues with resolution or lighting. The recognition rate is very high in images containing up to 10 faces, but beyond 20 faces the accuracy noticeably decreases. In the photo below, only 18 of 32 faces were recognized, using the SubCenter-ArcFace-r100-gpu model.
6. For the configurations compreface_api_java_options=-Xmx8g and compreface_admin_java_options=-Xmx8g, if the memory is increased, for example to 16 GB or 24 GB, would that support a greater number of faces?
These are my current areas of confusion, and I hope to receive assistance. Thank you.
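Regarding question 2: since I have not found a minimum-size option, my current workaround is to filter detection results client-side by bounding-box size. A minimal sketch in Python, assuming the detection response has the usual `{"result": [{"box": {...}}]}` shape with `x_min`/`y_min`/`x_max`/`y_max` coordinates (please correct me if a built-in option exists):

```python
def filter_small_faces(detection_result, min_size=40):
    """Drop detected faces whose bounding box is smaller than
    min_size x min_size pixels. Assumes a detection response of
    the shape {"result": [{"box": {"x_min", "y_min", "x_max",
    "y_max"}, ...}]}."""
    kept = []
    for face in detection_result.get("result", []):
        box = face["box"]
        width = box["x_max"] - box["x_min"]
        height = box["y_max"] - box["y_min"]
        # Keep only faces at least min_size pixels in both dimensions
        if width >= min_size and height >= min_size:
            kept.append(face)
    return kept
```

A native configuration option would of course be preferable, since filtering after detection still pays the detection cost for small faces.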