Issue #1155
CLOSED: Read gzipped metadata file by chunks to prevent Python MemoryError
Description
I hit a MemoryError when Pulp generated the checksum for metadata. I assume it could be solved if the content of the file is read in chunks. You can find the suggested change here:
https://github.com/release-engineering/pulp_rpm/commit/5d23b55d4327c1452be6d5845def084939bf5326
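The idea in the linked commit is to feed the hash object fixed-size chunks instead of reading the whole file into memory. A minimal sketch of that pattern (the function name, chunk size, and default algorithm here are illustrative, not Pulp's actual code):

```python
import hashlib


def file_checksum(path, algorithm="sha256", chunk_size=8 * 1024 * 1024):
    """Compute a file's checksum without loading it into memory at once."""
    hasher = hashlib.new(algorithm)
    with open(path, "rb") as f:
        # Read fixed-size chunks so memory use stays bounded
        # regardless of file size; iter() stops at EOF (empty bytes).
        for chunk in iter(lambda: f.read(chunk_size), b""):
            hasher.update(chunk)
    return hasher.hexdigest()
```

The result is identical to hashing the whole file at once, so this is a drop-in replacement wherever the full-read version caused the MemoryError.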
Updated by ipanova@redhat.com over 8 years ago
- Project changed from Pulp to RPM Support
Updated by mhrivnak over 8 years ago
- Triaged changed from No to Yes
- Tags Easy Fix added
This could be fixed on 2.6-dev in case we can get it into another 2.6 release.
Updated by amacdona@redhat.com over 8 years ago
There is a PR open for this: https://github.com/pulp/pulp_rpm/pull/707
Updated by ipanova@redhat.com about 6 years ago
- Status changed from ASSIGNED to NEW
Updated by bmbouter about 5 years ago
- Status changed from NEW to CLOSED - WONTFIX
Pulp 2 is approaching maintenance mode, and this Pulp 2 ticket is not being actively worked on. As such, it is being closed as WONTFIX. Pulp 2 is still accepting contributions though, so if you want to contribute a fix for this ticket, please reopen or comment on it. If you don't have permissions to reopen this ticket, or you want to discuss an issue, please reach out via the developer mailing list.