Issue #5573
Closed: Publish won't create multiple checksummed copies of primary.xml, filelists.xml etc. even when in fast-forward mode
Description
I had noticed that, even though I was publishing a repository after copying a unit into it (i.e. no removal), I was still not getting multiple copies of primary.xml.gz.
Background: repositories with frequent publishes may cause yum clients to issue 404 errors if the copy of repomd.xml points to metadata files that disappeared. So it's desirable for the metadata files to stick around for a while.
ttereshc found these tickets somewhat related:
https://pulp.plan.io/issues/1684
https://pulp.plan.io/issues/2788
https://pulp.plan.io/issues/3551
https://pulp.plan.io/issues/3816
However, none of them addressed the behavior I was seeing: the FastForwardXmlFileContext was picking the latest file (of which there was only one), moving it, uncompressing it for fast-forward operations, and then removing it. So, in my observations, we were starting with a list of one file and ending with a list of one (usually different) file.
I tracked the code in Pulp core, and here is the change I am proposing, which seems to keep the file around:
diff --git a/server/pulp/plugins/util/metadata_writer.py b/server/pulp/plugins/util/metadata_writer.py
index 5c459fad5..225528473 100644
--- a/server/pulp/plugins/util/metadata_writer.py
+++ b/server/pulp/plugins/util/metadata_writer.py
@@ -341,7 +341,7 @@ class FastForwardXmlFileContext(XmlFileContext):
# move the file so that we can still process it if the name is the same
if self.existing_file:
new_file_name = 'original.%s' % self.existing_file
- shutil.move(os.path.join(working_dir, self.existing_file),
+ shutil.copy(os.path.join(working_dir, self.existing_file),
os.path.join(working_dir, new_file_name))
self.existing_file = new_file_name
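The behavioral difference the one-line change relies on can be shown in isolation. This is a standalone sketch (paths and file names are made up), not the Pulp code itself: shutil.move leaves nothing behind at the source path, while shutil.copy keeps the original alongside the "original.<name>" working copy, so the previously published file survives.

```python
import os
import shutil
import tempfile

# Simulate the publish working directory holding one existing metadata file.
working_dir = tempfile.mkdtemp()
existing_file = 'primary.xml.gz'
src = os.path.join(working_dir, existing_file)
with open(src, 'wb') as f:
    f.write(b'old metadata')

# The proposed change: copy instead of move, so src is preserved.
new_file_name = 'original.%s' % existing_file
shutil.copy(src, os.path.join(working_dir, new_file_name))

# After copy, both files exist; after move, only the renamed one would.
print(os.path.exists(src))
print(os.path.exists(os.path.join(working_dir, new_file_name)))
```

With shutil.move in place of shutil.copy, the first check would print False, which matches the observed behavior of ending each fast-forward publish with only a single metadata file.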
Related issues
Preserve older copies of metadata files (primary.xml, etc)
closes #5573