Reading the gzipped metadata file in chunks could prevent a Python MemoryError
I hit a MemoryError when Pulp generated the checksum for metadata. I assume it could be solved if the content of the file is read in chunks. You can find the suggested change here:
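The idea above can be sketched roughly as follows; this is a minimal illustration of chunked checksumming, not the actual patch, and the function name, chunk size, and default algorithm are assumptions:

```python
import hashlib

# Assumed chunk size; any fixed, moderate value keeps memory use bounded.
CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB per read

def checksum_file(path, algorithm="sha256"):
    """Return the hex digest of the file at `path`, reading it chunk by
    chunk instead of loading the whole (possibly large) file into memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as fp:
        # iter() with a sentinel yields chunks until read() returns b"".
        for chunk in iter(lambda: fp.read(CHUNK_SIZE), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Because only one chunk is held in memory at a time, the peak memory footprint stays near `CHUNK_SIZE` regardless of how large the metadata file is.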
Updated by bmbouter over 3 years ago
- Status changed from NEW to CLOSED - WONTFIX
Pulp 2 is approaching maintenance mode, and this Pulp 2 ticket is not being actively worked on, so it is being closed as WONTFIX. Pulp 2 is still accepting contributions, though, so if you want to contribute a fix for this ticket, please reopen or comment on it. If you don't have permission to reopen this ticket, or you want to discuss an issue, please reach out via the developer mailing list.