Backport #9334
Closed
Backport 2.8: race condition syncing repositories with the same tags results in bad data within the database
Start date:
Due date:
% Done: 100%
Estimated time:
Triaged: Yes
Sprint Candidate: No
Tags: Katello
Sprint:
Quarter:
Description
On a 4-core VM with 12 GB of RAM, I synced two repos at the same time:
http://perf54.perf.lab.eng.bos.redhat.com:5000
test_repo11 & test_repo12
(apologies for these being internal only). After syncing, we can use the following query to identify tags shared across repositories:
select container_tag.name, container_tag.content_ptr_id
from core_repositorycontent
inner join container_tag
    on container_tag.content_ptr_id = core_repositorycontent.content_id
inner join core_repositorycontent as core_repositorycontent2
    on core_repositorycontent.content_id = core_repositorycontent2.content_id
    and core_repositorycontent.repository_id != core_repositorycontent2.repository_id;
In my case I had 106 of them; here's one:
ver154 | 5ba8ddfc-fde7-4485-8880-fae8f3901d8e
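To count the affected tags directly instead of counting rows by hand, a DISTINCT count over the same join should work (an untested sketch; the join produces one row per repository pair, so the distinct count of content_ptr_id reflects the number of affected tags):
-- sketch: count distinct tags that appear in more than one repository
select count(distinct container_tag.content_ptr_id)
from core_repositorycontent
inner join container_tag
    on container_tag.content_ptr_id = core_repositorycontent.content_id
inner join core_repositorycontent as core_repositorycontent2
    on core_repositorycontent.content_id = core_repositorycontent2.content_id
    and core_repositorycontent.repository_id != core_repositorycontent2.repository_id;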
Let's examine the ver154 tag from above:
pulpcore=# select * from container_tag where content_ptr_id = '5ba8ddfc-fde7-4485-8880-fae8f3901d8e';
content_ptr_id | name | tagged_manifest_id
--------------------------------------+--------+--------------------------------------
5ba8ddfc-fde7-4485-8880-fae8f3901d8e | ver154 | 8b3c4916-0012-443b-b2a7-3421e65a51d7
Let's see which repos it is in:
pulpcore=# select repository_id, content_id from core_repositorycontent where content_id = '5ba8ddfc-fde7-4485-8880-fae8f3901d8e' order by repository_id;
repository_id | content_id
--------------------------------------+--------------------------------------
16cc1c09-4d64-4b43-a138-250074f56ef7 | 5ba8ddfc-fde7-4485-8880-fae8f3901d8e
60aa08e6-5997-4103-91c3-1b0730d16024 | 5ba8ddfc-fde7-4485-8880-fae8f3901d8e
(2 rows)
It's in two repos! This is unexpected. Is its manifest in both?
pulpcore=# select repository_id, content_id from core_repositorycontent where content_id = '8b3c4916-0012-443b-b2a7-3421e65a51d7' order by repository_id;
repository_id | content_id
--------------------------------------+--------------------------------------
60aa08e6-5997-4103-91c3-1b0730d16024 | 8b3c4916-0012-443b-b2a7-3421e65a51d7
(1 row)
No, its manifest is only in one. This isn't right.
If I sync these sequentially, the problem doesn't seem to occur.
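To check all tags at once instead of one by one, something like the query below should list every tag that sits in a repository without its tagged manifest. This is a rough sketch against the same tables; it does not account for repository versions, so treat any hits as candidates to inspect rather than definitive corruption:
-- sketch: tags whose tagged manifest is missing from the same repository
select ct.name, rc.repository_id, ct.content_ptr_id, ct.tagged_manifest_id
from container_tag as ct
inner join core_repositorycontent as rc
    on rc.content_id = ct.content_ptr_id
where not exists (
    select 1
    from core_repositorycontent as rc2
    where rc2.content_id = ct.tagged_manifest_id
      and rc2.repository_id = rc.repository_id
);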
Related issues
Updated by mdellweg over 3 years ago
- Copied from Issue #9292: race condition syncing repositories with the same tags results in bad data within the database added
Updated by pulpbot over 3 years ago
- Status changed from NEW to POST
Added by mdellweg over 3 years ago
Added by mdellweg over 3 years ago
Updated by mdellweg over 3 years ago
- Status changed from POST to MODIFIED
- % Done changed from 0 to 100
Applied in changeset 69b410055adaf274f213fd1d3a1063b38945fbd9.
Updated by pulpbot over 3 years ago
- Status changed from MODIFIED to CLOSED - CURRENTRELEASE
Refactor sync pipeline with content resolution
This will no longer create hollow tags that will be updated later, but wait for their creation until the Manifest is in the database.
backports #9292
fixes #9334
(cherry picked from commit 7285b7ab28c652b508b03f453ffcb7c80d0923f1)