[epic] As a user, I can export the content of a RepositoryVersion from one Pulp3 system and import it on an air-gapped Pulp3 system
The Pulp3-to-Pulp3 sync mechanisms typically involve one Pulp syncing from another Pulp over the network. When the two Pulp3 systems are air-gapped, this sync mechanism won't work.
Plugins could each implement their own import/export functions, but can pulpcore provide this once for all Content types?
Use Case: Full Export+Import
As a Pulp3 user, I can export all of the content of a RepositoryVersion to disk, carry that disk into the air-gapped environment, and mount it there. The RepositoryVersion is then added to the air-gapped Pulp as part of a specified, existing Repository.
Use Case: Incremental Export+Import
As a Pulp3 user who has exported a RepositoryVersion earlier, I can specify that RepositoryVersion, and the export will contain only the content that differs between that RepositoryVersion and the latest version of the Repository.
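Conceptually, an incremental export is a set difference between two versions' content. A minimal sketch, using plain Python sets of illustrative natural keys in place of pulpcore's actual RepositoryVersion content querysets:

```python
def incremental_content(prev_exported, latest):
    """Content units present in the latest version but missing from the
    previously exported version (plain set difference by natural key)."""
    return latest - prev_exported


# Illustrative natural keys, not real package identifiers.
prev_exported = {"pkg-a-1.0", "pkg-b-2.0"}
latest = {"pkg-a-1.0", "pkg-b-2.1", "pkg-c-0.3"}

print(sorted(incremental_content(prev_exported, latest)))
# → ['pkg-b-2.1', 'pkg-c-0.3']
```

In a real implementation this difference would be computed over Content querysets, but the semantics are the same: only units new since the previous export land in the archive.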
Artifact and ContentArtifact objects
Importing Content units also needs to create their corresponding Artifact and ContentArtifact objects. Artifacts that already exist on the importing system should be reused, not recreated or disturbed.
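The reuse rule above amounts to a get-or-create keyed on the artifact's digest. A minimal sketch; the names here are illustrative stand-ins, not pulpcore's actual models:

```python
class ArtifactRegistry:
    """Stand-in for the importing system's Artifact table, keyed by sha256."""

    def __init__(self):
        self._by_digest = {}

    def get_or_create(self, sha256, path):
        """Return (artifact, created). An existing artifact with the same
        digest is returned untouched rather than duplicated."""
        if sha256 in self._by_digest:
            return self._by_digest[sha256], False
        artifact = {"sha256": sha256, "path": path}
        self._by_digest[sha256] = artifact
        return artifact, True


registry = ArtifactRegistry()
a1, created1 = registry.get_or_create("abc123", "/var/lib/pulp/abc123")
a2, created2 = registry.get_or_create("abc123", "/tmp/import/abc123")
print(created1, created2, a1 is a2)  # → True False True
```

The second call leaves the pre-existing artifact (including its original path) untouched, which is the "not disturbed but used" behavior described above.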
Foreign Keys to/from Content Model
Content units sometimes have other non-Content models related to them; e.g. in pulp_rpm the UpdateRecord has related UpdateCollection and UpdateCollectionPackage records. These related models may or may not need to be exported as well, and handling them likely requires plugin-writer involvement.
Add an export API that takes a 'repository_version' and dispatches a task that creates the export archive for the user. All Content types in the repository version, along with their Artifact and ContentArtifact objects, are exported with django-import-export. The resulting file is then made available for download, likely via the pulp-content app, as a single archive.
Additionally, the export API takes an optional 'prev_export_repository_version' parameter; when given, only the content that differs between 'prev_export_repository_version' and 'repository_version' is exported.
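A hypothetical request body for the proposed export call might look like the following; the parameter names are the ones proposed above, and the hrefs are illustrative, not an existing API:

```python
def build_export_payload(repository_version, prev_export_repository_version=None):
    """Body for the proposed export call; including the 'prev' version href
    turns a full export into an incremental one."""
    payload = {"repository_version": repository_version}
    if prev_export_repository_version is not None:
        payload["prev_export_repository_version"] = prev_export_repository_version
    return payload


# Full export of a repository version:
full = build_export_payload("/pulp/api/v3/repositories/1/versions/3/")

# Incremental export against an earlier, previously exported version:
incr = build_export_payload(
    "/pulp/api/v3/repositories/1/versions/3/",
    prev_export_repository_version="/pulp/api/v3/repositories/1/versions/1/",
)
```

Either payload would dispatch a task; the caller then polls the task for the archive's location.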
On the importing system, the tarball can be uploaded efficiently in parallel chunks using the upload API. An import API then takes a 'repository' and dispatches a task that imports the archive into it for the user.
To use django-import-export, plugin writers would provide a customized Resource for their Content model, placed in a resources.py module that the Django plugin loader imports for each plugin.
If the plugin writer doesn't provide one and the generic one works for them, they do nothing, and pulpcore will create a generic one at runtime.
pulpcore would also provide Resources for Artifact and ContentArtifact, which the import task would handle as well.
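The fallback behavior described above, use a plugin-supplied resource when one was registered and otherwise generate a generic one at runtime, can be sketched without Django. The registry and names here are illustrative, not pulpcore's actual loader:

```python
_REGISTERED_RESOURCES = {}  # model name -> plugin-provided resource class


def register_resource(model_name, resource_cls):
    """Called as each plugin's resources.py is imported by the plugin loader."""
    _REGISTERED_RESOURCES[model_name] = resource_cls


def resource_for(model_name):
    """Plugin-provided resource if one exists, else a generic resource class
    created at runtime (stand-in for a django-import-export ModelResource)."""
    if model_name in _REGISTERED_RESOURCES:
        return _REGISTERED_RESOURCES[model_name]
    return type(f"Generic{model_name}Resource", (), {"model_name": model_name})


class PackageResource:
    """A plugin's customized resource for its Package content type."""
    model_name = "Package"


register_resource("Package", PackageResource)

print(resource_for("Package").__name__)      # → PackageResource
print(resource_for("FileContent").__name__)  # → GenericFileContentResource
```

The import task would look up a resource the same way for every Content type in the archive, so plugins only need code where the generic behavior is insufficient.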
Open question: how will users specify at import time whether they want a dry run?