When trying to run bsdiff4.file_diff on a large file (19 GB), it sits for a while, then crashes with:
File "/usr/lib/python3.11/site-packages/bsdiff4/format.py", line 85, in file_diff
dst = read_data(dst_path)
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/site-packages/bsdiff4/format.py", line 64, in read_data
data = fi.read()
^^^^^^^^^
OSError: [Errno 5] Input/output error
It seems to work fine on smaller files. Do I have to chunk large files manually?
Actually, it turns out the crash was caused by an unrelated filesystem corruption issue I was having. After fixing that, I've found instead that bsdiff4 simply keeps allocating memory until it runs out, at which point the system either hangs or kills the process. So chunking the file, or streaming it if that's possible, seems to be required for files that exceed available memory.
Running version 1.2.4, installed on Arch Linux via the AUR package: https://aur.archlinux.org/packages/python-bsdiff4
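For reference, here is a minimal sketch of what manual chunking could look like, using the library's in-memory bsdiff4.diff / bsdiff4.patch API on fixed-size chunks. The chunked_file_diff / chunked_file_patch helpers, the CHUNK_SIZE constant, and the 8-byte length framing are all hypothetical, not part of bsdiff4; and because chunks are diffed in aligned pairs, matches that cross chunk boundaries are missed, so the resulting patch will be larger than what a whole-file bsdiff4.file_diff would produce.

```python
import bsdiff4

# Hypothetical chunk size; tune to available memory. Must be the same
# value when creating and when applying the patch.
CHUNK_SIZE = 64 * 1024 * 1024  # 64 MiB

def chunked_file_diff(src_path, dst_path, patch_path):
    """Diff two large files chunk by chunk, never holding a whole file in memory."""
    with open(src_path, "rb") as src, open(dst_path, "rb") as dst, \
            open(patch_path, "wb") as out:
        while True:
            src_chunk = src.read(CHUNK_SIZE)
            dst_chunk = dst.read(CHUNK_SIZE)
            if not src_chunk and not dst_chunk:
                break  # both files exhausted
            # bsdiff4.diff(old_bytes, new_bytes) -> patch bytes; assumed to
            # also handle an empty chunk when the files differ in length.
            patch = bsdiff4.diff(src_chunk, dst_chunk)
            # Length-prefix each per-chunk patch so the applier can frame it.
            out.write(len(patch).to_bytes(8, "big"))
            out.write(patch)

def chunked_file_patch(src_path, patch_path, dst_path):
    """Rebuild the destination file by applying the per-chunk patches in order."""
    with open(src_path, "rb") as src, open(patch_path, "rb") as pf, \
            open(dst_path, "wb") as out:
        while True:
            header = pf.read(8)
            if not header:
                break  # no more framed patches
            patch = pf.read(int.from_bytes(header, "big"))
            src_chunk = src.read(CHUNK_SIZE)
            # bsdiff4.patch(old_bytes, patch_bytes) -> new bytes
            out.write(bsdiff4.patch(src_chunk, patch))
```

A streaming mode inside the library would still be preferable, since it could match data across the whole file instead of within aligned chunks.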