Issue #9370

After removing: distribution, publication, repository, remote - space is not reclaimed. GET /pulp/content continues.

Added by janr7 3 months ago. Updated 2 months ago.

Status:
CLOSED - NOTABUG
Priority:
Normal
Assignee:
-
Category:
-
Sprint/Milestone:
-
Start date:
Due date:
Estimated time:
Severity:
2. Medium
Version:
Master
Platform Release:
OS:
Triaged:
No
Groomed:
No
Sprint Candidate:
No
Tags:
Dev Environment
Sprint:
Quarter:

Description

Hello.

We are testing a fresh install of Pulp 3 in a Docker container. We started with 3.14.2 and hit problems that were resolved in https://pulp.plan.io/issues/9196.

Current: "version": "3.15.2"

7 repositories were set up successfully, using 145GB of filesystem space.

However, repository refreshes (syncs) were plagued with:

backoff:INFO: Backing off download_wrapper(...) for 0.3s (aiohttp.client_exceptions.ClientPayloadError: Response payload is not completed)

even after going down to "download_concurrency": 1.

In 3.14.5 we saw that syncing a repository that did not need updating was quick. With 3.15.2 the sync of such a static repository took long, and extra filesystem space was used during the sync and only returned afterwards.

This is where the decision was made to wipe the 7 repositories and start again, using pulp-squeezer as before.

The problem: after removing everything (distribution, publication, repository, remote), the space in media/artifact/ is not reclaimed. The command 'pulp task' displays no active tasks.

  • How can we clean up to reclaim the space?
  • How can we get rid of messages in the pulp log where pulp tries to access a removed base_path during distribution creation?

There is no need to recover anything; everything was removed intentionally.

Please see attached log.

Thank you.

Jan

  "versions": [
    {
      "component": "core",
      "version": "3.15.2"
    },
    {
      "component": "rpm",
      "version": "3.15.0"
    },
    {
      "component": "python",
      "version": "3.5.0"
    },
    {
      "component": "file",
      "version": "1.9.1"
    },
    {
      "component": "deb",
      "version": "2.15.0"
    },
    {
      "component": "container",
      "version": "2.8.0"
    },
    {
      "component": "certguard",
      "version": "1.5.0"
    },
    {
      "component": "ansible",
      "version": "0.10.0"
    }
  ]



pip freeze
appdirs==1.4.3
Babel==2.8.0
certifi==2021.5.30
chardet==4.0.0
click==8.0.1
idna==2.10
importlib-metadata==4.6.1
iotop==0.6
isc==2.0
Jinja2==2.10.1
MarkupSafe==1.0
packaging==20.3
ply==3.10
pulp-cli==0.10.1
pyparsing==2.4.7
pytz==2020.5
PyYAML==5.4.1
requests==2.25.1
six==1.14.0
toml==0.10.2
typing-extensions==3.10.0.0
urllib3==1.26.6
zipp==3.5.0

Attachment: docker_pulp_log.txt (22.1 KB), "base_path referenced", janr7, 09/10/2021 09:59 AM

History

#1 Updated by gerrod 3 months ago

@janr7, have you tried calling the orphan cleanup endpoint after deleting all remotes, repositories, publications and distributions? Since content in Pulp is immutable, the only way for it to be removed is for the content to become an orphan (not part of any repository version). Here is the documentation for the endpoint: https://docs.pulpproject.org/pulpcore/restapi.html#operation/orphans_cleanup_cleanup.
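As a sketch of the request bodies this endpoint accepts (the UUID below is a placeholder, not a real content unit): omitting content_hrefs requests cleanup of all orphaned content, while listing HREF strings restricts the cleanup to those units; entries must be strings, never null.

```python
import json

# Cleanup of ALL orphans: content_hrefs is simply omitted;
# an orphan_protection_time of 0 removes orphans immediately.
cleanup_all = {"orphan_protection_time": 0}

# Targeted cleanup: entries must be HREF strings, never null.
# The UUID here is a placeholder, not a real content unit.
cleanup_some = {
    "content_hrefs": [
        "/pulp/api/v3/content/file/files/00000000-0000-0000-0000-000000000000/"
    ],
    "orphan_protection_time": 0,
}

print(json.dumps(cleanup_all))
```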

#2 Updated by janr7 3 months ago

Thank you so much.

#3 Updated by janr7 3 months ago

Hello Team

Please assist. I am getting a "URI not valid" error from the orphan cleanup call; the status call works:

http http://ltpulp01.group.net:8080/pulp/api/v3/status/

HTTP/1.1 200 OK
Access-Control-Expose-Headers: Correlation-ID
Allow: GET, HEAD, OPTIONS
Connection: keep-alive
Content-Length: 1050
Content-Type: application/json
Correlation-ID: baa812af2be2408a9b867cdebcf078cd
Date: Thu, 16 Sep 2021 11:57:22 GMT
Referrer-Policy: same-origin
Server: nginx/1.14.1
Vary: Accept
X-Content-Type-Options: nosniff
X-Frame-Options: DENY

{
    "database_connection": {
        "connected": true
    },
    "online_content_apps": [
        {
            "last_heartbeat": "2021-09-16T11:57:18.971446Z",
            "name": "15401@pulp"
        },

http POST http://ltpulp01.group.net:8080/pulp/api/v3/orphans/cleanup/ < orphans_cleanup.json

cat orphans_cleanup.json

{
  "content_hrefs": [
    null
  ],
  "orphan_protection_time": 0
}

http --auth admin:pwd POST http://ltpulp01.group.net:8080/pulp/api/v3/orphans/cleanup/ < orphans_cleanup.json

HTTP/1.1 400 Bad Request
Access-Control-Expose-Headers: Correlation-ID
Allow: POST, OPTIONS
Connection: keep-alive
Content-Length: 41
Content-Type: application/json
Correlation-ID: 050f15c5a67841bc92d87348c6a076a6
Date: Thu, 16 Sep 2021 11:57:54 GMT
Referrer-Policy: same-origin
Server: nginx/1.14.1
Vary: Accept, Cookie
X-Content-Type-Options: nosniff
X-Frame-Options: DENY

{
    "content_hrefs": [
        "URI not valid: None"
    ]
}

docker log

127.0.0.1 - - [16/Sep/2021:11:57:21 +0000] "GET /pulp/api/v3/status/ HTTP/1.0" 200 1050 "-" "HTTPie/1.0.3"
pulp [050f15c5a67841bc92d87348c6a076a6]: django.request:WARNING: Bad Request: /pulp/api/v3/orphans/cleanup/
127.0.0.1 - admin [16/Sep/2021:11:57:54 +0000] "POST /pulp/api/v3/orphans/cleanup/ HTTP/1.0" 400 41 "-" "HTTPie/1.0.3"

Thank you so much.

#4 Updated by janr7 3 months ago

Hello

I called the cleanup endpoint with no payload and it worked 100%.

Thanks so much.

All disk space regained.

http --auth admin:pwd POST http://ltpulp01.group.net:8080/pulp/api/v3/orphans/cleanup/
HTTP/1.1 202 Accepted
Access-Control-Expose-Headers: Correlation-ID
Allow: POST, OPTIONS
Connection: keep-alive
Content-Length: 67
Content-Type: application/json
Correlation-ID: ff4643d7d2e342fa974565629a8333a6
Date: Thu, 16 Sep 2021 12:16:47 GMT
Referrer-Policy: same-origin
Server: nginx/1.14.1
Vary: Accept, Cookie
X-Content-Type-Options: nosniff
X-Frame-Options: DENY

{
    "task": "/pulp/api/v3/tasks/27ee5dab-6948-4aef-b447-2c5e654b8d8e/"
}

docker log:

127.0.0.1 - admin [16/Sep/2021:12:16:47 +0000] "POST /pulp/api/v3/orphans/cleanup/ HTTP/1.0" 202 67 "-" "HTTPie/1.0.3"
pulp [ff4643d7d2e342fa974565629a8333a6]: pulpcore.tasking.pulpcore_worker:INFO: Starting task 27ee5dab-6948-4aef-b447-2c5e654b8d8e

pulp [ff4643d7d2e342fa974565629a8333a6]: pulpcore.tasking.pulpcore_worker:INFO: Task completed 27ee5dab-6948-4aef-b447-2c5e654b8d8e

#5 Updated by janr7 3 months ago

Hello

Please provide a properly formatted JSON payload for content_hrefs, to get rid of messages like:

127.0.0.1 [16/Sep/2021:14:20:57 +0000] "GET /pulp/content/SLE/SLES12/SP5/product//repodata/repomd.xml.asc HTTP/1.0" 404

Appreciated. Thank you.

#6 Updated by janr7 3 months ago

Hi

To rule out confusion from the previous question: I am asking for an example of a working payload JSON, not for what mine should be.

Regards.

#7 Updated by ipanova@redhat.com 3 months ago

  • Category deleted (Operator - Moved to Github Issues)

#8 Updated by ipanova@redhat.com 3 months ago

janr7 wrote:

Hello

Please provide a properly formatted JSON payload for content_hrefs, to get rid of messages like:

127.0.0.1 [16/Sep/2021:14:20:57 +0000] "GET /pulp/content/SLE/SLES12/SP5/product//repodata/repomd.xml.asc HTTP/1.0" 404

Appreciated. Thank you.

Hi, the 404 that you see in the logs is not related to the content_hrefs JSON payload. I suggest you open an issue in the rpm tracker: https://pulp.plan.io/projects/pulp_rpm/issues/new

In any case, here is an example of a correct orphan cleanup call for when you want to specify a set of content to be removed:

$ http POST :24817/pulp/api/v3/orphans/cleanup/ content_hrefs:=[\"/pulp/api/v3/content/file/files/6f64347a-4edd-49b3-8c66-1bf82b59d682/\"]
HTTP/1.1 202 Accepted
Access-Control-Expose-Headers: Correlation-ID
Allow: POST, OPTIONS
Connection: close
Content-Length: 67
Content-Type: application/json
Correlation-ID: b12bb411e19049e6ab0181f50b8edbf8
Date: Tue, 21 Sep 2021 15:00:40 GMT
Referrer-Policy: same-origin
Server: gunicorn
Vary: Accept, Cookie
X-Content-Type-Options: nosniff
X-Frame-Options: DENY

{
    "task": "/pulp/api/v3/tasks/651429ef-00fe-4462-9664-984a099ec698/"
}
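Tying the thread together, a small client-side sketch (validate_content_hrefs is a hypothetical helper, not part of any Pulp client) shows why the null entry from comment #3 was rejected while a payload of HREF strings succeeds:

```python
import json

def validate_content_hrefs(payload):
    """Reject entries that are not HREF strings, mimicking the server's
    'URI not valid: None' 400 response (client-side sketch only)."""
    bad = [h for h in payload.get("content_hrefs", []) if not isinstance(h, str)]
    if bad:
        raise ValueError(f"URI not valid: {bad[0]}")
    return json.dumps(payload)

# The payload from comment #3 fails client-side the same way it failed server-side.
try:
    validate_content_hrefs({"content_hrefs": [None], "orphan_protection_time": 0})
except ValueError as e:
    print(e)  # URI not valid: None

# A payload of concrete HREF strings, as in the worked example, passes.
ok = validate_content_hrefs({
    "content_hrefs": [
        "/pulp/api/v3/content/file/files/6f64347a-4edd-49b3-8c66-1bf82b59d682/"
    ]
})
print(ok)
```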

#9 Updated by ipanova@redhat.com 3 months ago

  • Status changed from NEW to CLOSED - NOTABUG

#10 Updated by janr7 2 months ago

Thank you so much. A great help!
