Issue #7480

Pulp worker can consume high memory when publishing a repository with large metadata files

Added by hyu about 4 years ago. Updated about 4 years ago.

Status:
CLOSED - CURRENTRELEASE
Priority:
Normal
Assignee:
-
Sprint/Milestone:
-
Start date:
Due date:
Estimated time:
Severity:
2. Medium
Version:
Platform Release:
2.21.4
OS:
Triaged:
No
Groomed:
No
Sprint Candidate:
No
Tags:
Katello, Performance, Pulp 2
Sprint:
Quarter:

Description

Clone from bugzilla 1876782

Description of problem: When publishing a repository with a large metadata file (such as the others.xml.gz file in rhel-7-server-rpms), the Pulp worker can consume more than 3 GB of RAM for a few minutes. After that, the memory is freed and usage returns to normal, which is OK.

When calculating the open-size of a metadata file, Pulp opens the gzip file and reads the entire decompressed contents into memory.

plugins/distributors/yum/metadata/repomd.py

    if file_path.endswith('.gz'):
        open_size_element = ElementTree.SubElement(data_element, 'open-size')

        open_checksum_attributes = {'type': self.checksum_type}
        open_checksum_element = ElementTree.SubElement(data_element, 'open-checksum',
                                                       open_checksum_attributes)

        try:
            file_handle = gzip.open(file_path, 'r')

        except:
            # cannot have an else clause to the try without an except clause
            raise

        else:
            try:
                content = file_handle.read()   # <== Here: the whole decompressed file is read into memory
                open_size_element.text = str(len(content))
                open_checksum_element.text = self.checksum_constructor(content).hexdigest()

            finally:
                file_handle.close()

This is not much of an issue if the user is syncing only a few repos. In the case of Satellite, users may sync many large repositories at the same time, such as during an Optimized Capsule sync. If a Capsule has 8 workers and each worker consumes 4 GB+ of memory, then the Capsule will run out of memory.

Steps to Reproduce:

  1. Set Pulp to use only 1 worker so that the progress can be monitored easily.
  2. Force a full publish of a rhel-7-server-rpms repository.
  3. Use the following command to monitor the memory usage:

watch 'ps aux | grep reserved_resource_worker-0'

  4. The high memory consumption happens when Pulp is finalizing the others.xml.gz file. You can use the following commands to monitor the Pulp working directory:

cd /var/cache/pulp/reserved_resource_worker-0@//
watch 'ls -alrth'

Added by hyu about 4 years ago

Revision 5b546ca0 | View on GitHub

Fix high memory issue when calculating metadata open checksum

closes #7480 https://pulp.plan.io/issues/7480

#1

Updated by pulpbot about 4 years ago

  • Status changed from NEW to POST
#3

Updated by ggainey about 4 years ago

Katello would like to see a Pulp 2 build containing this fix towards the end of September.

#4

Updated by hyu about 4 years ago

  • Status changed from POST to MODIFIED
#5

Updated by ggainey about 4 years ago

  • Platform Release set to 2.21.4

Added by hyu about 4 years ago

Revision 1069d42d | View on GitHub

Fix high memory issue when calculating metadata open checksum

closes #7480 https://pulp.plan.io/issues/7480

(cherry picked from commit 5b546ca0d8c972b210bc75d59a069d22034ee741)

#6

Updated by ggainey about 4 years ago

  • Status changed from MODIFIED to CLOSED - CURRENTRELEASE
