We should implement a limit on the maximum number of files per storage request. On biigle.de, a user created several requests with tens of thousands or even more than 100k images each. Maybe we could implement a 10k limit. If users want to upload more, they have to split the files across several storage requests.
One issue with too many files is excessively long run times for the queue jobs. Users could also theoretically spam the service with millions of small files, since staying within the total size quota is easy when each file is tiny. A sketch of such a limit is below.
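As a rough illustration of what the check could look like (not BIIGLE's actual API; `MAX_FILES_PER_REQUEST`, `StorageRequestError`, and `validate_storage_request` are made-up names for this sketch):

```python
# Illustrative sketch only: MAX_FILES_PER_REQUEST and
# StorageRequestError are hypothetical names, not BIIGLE's actual API.

MAX_FILES_PER_REQUEST = 10_000


class StorageRequestError(ValueError):
    """Raised when a storage request violates a limit."""


def validate_storage_request(files: list[str]) -> None:
    """Reject requests that contain more files than the cap allows."""
    if len(files) > MAX_FILES_PER_REQUEST:
        raise StorageRequestError(
            f"A storage request may contain at most {MAX_FILES_PER_REQUEST} "
            f"files, got {len(files)}. Please split the upload into "
            "several storage requests."
        )
```

The cap would be enforced when the request is submitted, so oversized requests fail fast instead of clogging the queue.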
Wouldn't it make more sense to chunk the storage request into multiple queue jobs instead of limiting it on the user side? In the end, both solutions would produce the same outcome (if the user submits multiple requests in your case), but keeping related data together in a single storage request seems more manageable.
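Something along these lines, to illustrate the idea: split the file list of one request into fixed-size batches and run one short-lived job per batch. This is only a sketch under assumed names; `BATCH_SIZE` and `process_batch` are hypothetical, and a real implementation would enqueue jobs on the actual queue backend rather than calling them synchronously.

```python
# Illustrative sketch only: BATCH_SIZE and process_batch are made-up
# names; a real implementation would dispatch actual queue jobs.
from itertools import islice
from typing import Iterable, Iterator

BATCH_SIZE = 1_000


def batched(files: Iterable[str], size: int) -> Iterator[list[str]]:
    """Yield fixed-size chunks from an arbitrarily long file list."""
    it = iter(files)
    while batch := list(islice(it, size)):
        yield batch


def process_batch(request_id: int, batch: list[str]) -> None:
    """Stand-in for the per-batch queue job."""
    print(f"request {request_id}: processing {len(batch)} files")


def dispatch_storage_request(request_id: int, files: list[str]) -> None:
    """Split one storage request into many short-lived jobs."""
    for batch in batched(files, BATCH_SIZE):
        # In a real queue system, each call here would enqueue a job
        # instead of running it synchronously.
        process_batch(request_id, batch)
```

That way even a 100k-file request never produces a single long-running job, and a failure in one batch doesn't affect the others.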