Pulp: Issues
https://pulp.plan.io/
2021-09-08T14:45:07Z
Pulp - Refactor #9354 (CLOSED - CURRENTRELEASE): Disable DJANGO_ALLOW_ASYNC_UNSAFE for content app
https://pulp.plan.io/issues/9354 2021-09-08T14:45:07Z dalley (dalley@redhat.com)
<p>Plugins will need to adapt their code to be async, and APIs may need to be changed to facilitate this.</p> Pulp - Story #8446 (CLOSED - CURRENTRELEASE): As an installer user, I can install object storage...
https://pulp.plan.io/issues/8446 2021-03-24T17:15:32Z mdepaulo@redhat.com
<p>pulpcore's setup.py now contains the following:</p>
<pre><code>extras_require={
    "s3": ["django-storages[boto3]"],
    "azure": ["django-storages[azure]"],
    "prometheus": ["django-prometheus"],
    "test": test_requirements,
},
</code></pre>
<p>So one can install, for example,
<code>pip install pulpcore[s3]</code></p>
<p>The installer should take advantage of this interface, perhaps with a variable like
<code>pulp_install_object_storage</code> (with "s3" or "azure" as possible values),
or
<code>pulp_install_extras</code> (which would accept a list of strings).</p>
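For illustration, the second option could map a list of extras onto a single pip requirement string. The helper below is a hypothetical sketch of that mapping, not existing installer code; `pulp_install_extras` is the proposed variable name from this ticket.

```python
def pip_requirement(package, extras):
    """Build a pip requirement string such as 'pulpcore[s3,azure]'.
    `extras` mirrors the proposed pulp_install_extras list variable."""
    if not extras:
        return package
    return "{}[{}]".format(package, ",".join(extras))

print(pip_requirement("pulpcore", ["s3"]))           # pulpcore[s3]
print(pip_requirement("pulpcore", ["s3", "azure"]))  # pulpcore[s3,azure]
```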
<p>The installer docs should also be updated, including an example list of PULP_SETTINGS variables for using object storage, like the <a href="https://docs.pulpproject.org/pulpcore/installation/storage.html" class="external">pulpcore docs have for manual installs</a>.</p> Pulp - Story #7712 (CLOSED - DUPLICATE): As a developer, pulp_installer will provide me with pos...
https://pulp.plan.io/issues/7712 2020-10-14T19:55:40Z dalley (dalley@redhat.com)
<p><strong>Ticket moved to GitHub</strong>: "pulp/pulpcore/1938":<a href="https://github.com/pulp/pulpcore/issues/1938" class="external">https://github.com/pulp/pulpcore/issues/1938</a></p>
<hr>
<p>Something like pg_stat_monitor or pg_stat_statements, which keeps track of which queries are called most frequently and how much time is spent executing them, would be extremely useful. It's actually quite difficult to do from the Django side.</p>
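As a sketch of what this enables: once the extension is loaded (via `shared_preload_libraries`), the hottest queries can be read with plain SQL. The helper below assumes a DB-API connection (e.g. psycopg2) and uses the PostgreSQL 13+ column names.

```python
# Column names total_exec_time/mean_exec_time are PostgreSQL 13+;
# older releases call them total_time/mean_time.
TOP_QUERIES_SQL = """
SELECT query, calls, total_exec_time, mean_exec_time
  FROM pg_stat_statements
 ORDER BY total_exec_time DESC
 LIMIT %s;
"""

def top_queries(conn, limit=10):
    """Return the `limit` most time-consuming normalized queries."""
    with conn.cursor() as cur:
        cur.execute(TOP_QUERIES_SQL, (limit,))
        return cur.fetchall()
```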
<p><a href="https://www.postgresql.org/docs/current/pgstatstatements.html" class="external">https://www.postgresql.org/docs/current/pgstatstatements.html</a>
<a href="https://github.com/percona/pg_stat_monitor" class="external">https://github.com/percona/pg_stat_monitor</a></p> Pulp - Task #7476 (CLOSED - DUPLICATE): [Docs] Improve plugin API reference section of the guide
https://pulp.plan.io/issues/7476 2020-09-09T02:51:15Z dalley (dalley@redhat.com)
<p><strong>Ticket moved to GitHub</strong>: "pulp/pulpcore/1933":<a href="https://github.com/pulp/pulpcore/issues/1933" class="external">https://github.com/pulp/pulpcore/issues/1933</a></p>
<hr>
<p><a href="https://docs.pulpproject.org/pulpcore/plugins/index.html#plugin-writer-s-guide" class="external">https://docs.pulpproject.org/pulpcore/plugins/index.html#plugin-writer-s-guide</a></p>
<p>Feedback from Gerrod:</p>
<blockquote>
<p>Knowing how Pulp works in order to make changes is definitely the most challenging aspect of contributing. I think the writer’s guide does a good job describing all the different aspects of Pulp and plugins. The plugin api reference section is a bit barren though.</p>
</blockquote>
<p>I think what he's referring to is that several sections, like the models and viewsets, have almost no description, while other sections, such as downloaders, have a lot of detail.</p> Pulp - Task #7474 (CLOSED - DUPLICATE): [Docs] Improve developer environment guide
https://pulp.plan.io/issues/7474 2020-09-09T02:43:55Z dalley (dalley@redhat.com)
<p><strong>Ticket moved to GitHub</strong>: "pulp/pulpcore/1932":<a href="https://github.com/pulp/pulpcore/issues/1932" class="external">https://github.com/pulp/pulpcore/issues/1932</a></p>
<hr>
<p><a href="https://docs.pulpproject.org/pulpcore/en/master/nightly/contributing/dev-setup.html" class="external">https://docs.pulpproject.org/pulpcore/en/master/nightly/contributing/dev-setup.html</a></p>
<p>Feedback from Gerrod:</p>
<blockquote>
<p>I think setting up the pulplift environment is really easy but maybe there should be more info on interacting with elements in the environment like how it's useful to add port tunnels for 24817, 24816 on the vagrant ssh or more info about useful commands inside the environment like 'phelp, pbindings, etc..'</p>
</blockquote> Docker Support - Story #6933 (CLOSED - WONTFIX): As a user I can opt out of mirroring source con...
https://pulp.plan.io/issues/6933 2020-06-09T10:39:34Z ipanova@redhat.com
<p>Description of problem:
Once source containers are enabled, they will be included as regular image types in the repositories.
Source container images are quite big; they cannot be pulled (podman/docker pull will fail) or run.
Those containers can only be fetched with the skopeo copy command.</p>
<p>The Pulp Registry won't have an issue mirroring such content; however, the sync time might be much longer because source container images are quite large. This might degrade the user experience.</p>
<p>Solution: add an option to opt out of mirroring source container images.</p>
<p>Every source container will be tagged with the following convention:</p>
<p>Source container images are named based on the binary containers they represent. For example, for a particular standard RHEL UBI 8 container registry.access.redhat.com/ubi8:8.1-397 append -source to get the source container image (registry.access.redhat.com/ubi8:8.1-397-source).</p>
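Given that naming convention, the opt-out could be a negated regular-expression match against tag names. `filter_tags` below is a hypothetical helper sketching the proposed behaviour, not Pulp code.

```python
import re

def filter_tags(tags, pattern, negate=False):
    """Keep tags matching `pattern`; with negate=True, keep the rest instead."""
    rx = re.compile(pattern)
    return [t for t in tags if bool(rx.search(t)) != negate]

tags = ["8.1-397", "8.1-397-source", "latest"]
# Opt out of source container images by negating a '-source$' match:
print(filter_tags(tags, r"-source$", negate=True))  # ['8.1-397', 'latest']
```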
<p>We already have a whitelist option; adjust the code so it can handle regular expressions and so a negation of the search pattern can be used.</p> File Support - Story #5351 (CLOSED - DUPLICATE): As a user I can do a chunked one shot upload
https://pulp.plan.io/issues/5351 2019-08-26T16:05:13Z ipanova@redhat.com
<p>Add the ability to provide a file which, behind the scenes, will be divided and uploaded in chunks, with a content unit created and added to the repo.</p> Pulp - Story #5280 (CLOSED - WONTFIX): As a user, I can rsync deduplicated content to the remote...
https://pulp.plan.io/issues/5280 2019-08-14T12:21:16Z ipanova@redhat.com
<p>If some content is shared between repos, it will be rsynced as many times as it is present in the repos.<br>
We need to find a way to de-duplicate content when rsyncing to the remote server.</p> Container Support - Story #4947 (CLOSED - CURRENTRELEASE): As a user I can add tags to a reposit...
https://pulp.plan.io/issues/4947 2019-06-10T17:09:50Z amacdona@redhat.com (austin@redhat.com)
<p>This user story is doable by the initial design (<a href="https://pulp.plan.io/issues/3405" class="external">https://pulp.plan.io/issues/3405</a>), but it currently requires users to make a number of preliminary calls to pulp to gather information that could be replaced with more arguments to a single call. My guess is that adding units by tag name and source repo will be a common use case.</p>
<p>This story is to accept repository version hrefs and tag names. If they are provided, they should be used to generate a list of tag content units to add to a repository, and this list should be passed to the rest of the work implemented in 3405.</p>
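Conceptually, the recursive copy walks from each named tag down to its manifests and blobs. The sketch below models that walk over a plain dict graph with hypothetical unit names, not Pulp's actual models.

```python
def collect_units(tag, children):
    """Return the tag plus everything reachable from it (depth-first).
    `children` maps a unit href to the hrefs it references."""
    units, stack = set(), [tag]
    while stack:
        unit = stack.pop()
        if unit not in units:
            units.add(unit)
            stack.extend(children.get(unit, ()))
    return units

children = {
    "tag:latest": ["manifest:abc"],
    "manifest:abc": ["blob:1", "blob:2"],
}
print(sorted(collect_units("tag:latest", children)))
# ['blob:1', 'blob:2', 'manifest:abc', 'tag:latest']
```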
<p><span>Endpoint</span></p>
<pre><code>POST v3/docker/tags/copy repo_source='' repo_dest='' name=X   # recursively copies everything under tag X
POST v3/docker/tags/copy repo_source='' repo_dest=''          # recursively copies all the tags
</code></pre> Container Support - Task #4935 (CLOSED - WONTFIX): As a user, I can validate blobs and manifests...
https://pulp.plan.io/issues/4935 2019-06-07T17:14:29Z ipanova@redhat.com
<p>Test that, if the validate option is set to true, the size and checksum of the incoming content are incrementally calculated and verified against the repo metadata.</p> RPM Support - Story #4812 (CLOSED - CURRENTRELEASE): As a user, I can publish a Yum repository t...
https://pulp.plan.io/issues/4812 2019-05-10T15:07:36Z dalley (dalley@redhat.com)
<p>(Clone of Pulp 2 issue <a href="https://pulp.plan.io/issues/3055" class="external">https://pulp.plan.io/issues/3055</a>)</p>
<p>To allow a Yum repository to be used with Yum clients that have repo_gpgcheck=1 configured in /etc/yum.conf:</p>
<ol>
<li>Create a new GPG signing key that can be used by Pulp worker processes without a password. (Documentation provides example procedures.)</li>
<li>Append the public key associated with the new GPG signing key to the gpgkey file specified in the distributor config for the Yum repository in Pulp.</li>
<li>Set gpg_sign_metadata to True in the distributor config for the Yum repository in Pulp.</li>
</ol>
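As a sketch of step 3, the detached signature could be produced by a gpg invocation like the one built below. This only constructs the command; the key id is a placeholder, and (per step 1) the key must be usable without a password by the worker process.

```python
def sign_repomd_cmd(repomd_path, key_id):
    """Build the gpg command that writes a detached ASCII-armored
    signature (repomd.xml.asc) next to repomd.xml."""
    return [
        "gpg", "--batch", "--yes",     # non-interactive; overwrite an old signature
        "--local-user", key_id,        # the passwordless signing key
        "--detach-sign", "--armor",    # detached, ASCII-armored output
        "--output", repomd_path + ".asc",
        repomd_path,
    ]

print(" ".join(sign_repomd_cmd("repodata/repomd.xml", "pulp-signing-key")))
```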
<p>See also <a href="https://access.redhat.com/solutions/2850911" class="external">https://access.redhat.com/solutions/2850911</a></p>
<p>More detailed description from Neal Gompa (Conan_Kudo, Fedora contributor):</p>
<p>Signed repositories (for RPM repos) are when the `repomd.xml` file (the index file referencing all other parts of the RPM metadata) is signed using <em>a</em> GPG key (but does not necessarily have to be the same key as the packages, though usually is) in the form of a detached signature (`repomd.xml.asc`) that is placed next to the `repomd.xml` file. Package managers like DNF, Zypper, and YUM can use this when `repo_gpgcheck=1` is set in the .repo file to validate the XML before reading it. SUSE systems <em>require</em> this by default and will not normally fetch repos that are not signed. If the GPG key for the repository metadata differs from the packages' GPG key, its public key must <em>also</em> be present in the `gpgkey=` list in the .repo file.</p> Container Support - Story #4717 (CLOSED - CURRENTRELEASE): As a user, I can sync from registries...
https://pulp.plan.io/issues/4717 2019-04-18T12:26:48Z ipanova@redhat.com
<p>Parse the 401 WWW-Authenticate response header to find out whether the scheme is basic or token.</p>
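A minimal sketch of that parsing (assuming well-formed headers without commas inside quoted values), not the actual plugin implementation:

```python
def parse_www_authenticate(value):
    """Split a WWW-Authenticate header value into (scheme, params)."""
    scheme, _, rest = value.partition(" ")
    params = {}
    for part in rest.split(","):
        if "=" in part:
            key, _, val = part.strip().partition("=")
            params[key] = val.strip('"')
    return scheme.lower(), params

hdr = 'Bearer realm="https://auth.docker.io/token",service="registry.docker.io"'
scheme, params = parse_www_authenticate(hdr)
print(scheme, params["realm"])  # bearer https://auth.docker.io/token
```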
<p>This pulp2 ticket might contain more info: <a href="https://pulp.plan.io/issues/2956" class="external">https://pulp.plan.io/issues/2956</a></p> Pulp - Task #2946 (CLOSED - WONTFIX): As a plugin writer, I know how to publish docs to RTD
https://pulp.plan.io/issues/2946 2017-07-31T17:40:02Z ttereshc (ttereshc@redhat.com)
<p>Here are some points to take into account:</p>
<ul>
<li>we want plugin docs to be built independently from the core ones</li>
<li>not to be blocked from building docs due to an error in plugin docs (this is especially relevant to the community plugins, where we potentially can do nothing to fix the docs)</li>
<li>plugin docs versions can't be defined by core; plugins depend more on a plugin API version than on the core one</li>
<li>unified search (across core and all the plugins) would be nice, if possible</li>
<li>but it's likely that every plugin will have its own site for docs</li>
</ul>
<p>Here is also a relevant IRC discussion:</p>
<pre><code>[17:17:21] <bmbouter> we need to have each docs be it's own JJB job, not one epic docs builder
[17:17:52] <bmbouter> also we can't have our docs blocked from building due to an error in plugin docs, so we need some kind of two-stage build or something
[17:18:10] <bizhang> bmbouter, I think the reason it's one epic builder is so search works across all docs
[17:19:14] <bmbouter> originally it was because we had 6 RTD sites to maintain
[17:19:42] <bmbouter> unified search is also nice
[17:20:41] <asmacdo> unifiedsearch++
[17:20:50] <bmbouter> I'm conflicted on it because hosting lots of docs is nice but also hard
[17:21:32] <asmacdo> We could revisit the idea of a docs repository
[17:21:56] <asmacdo> it would split the PRs, but that doesn't seem like that big of a deal to me
[17:22:19] <bmbouter> s/ideas/problems/
[17:25:29] <bmbouter> asmacdo: the issue with one repo is that it doesn't allow for docs hosting as versioned by plugins
[17:26:07] <asmacdo> That's a deal breaker for me...
[17:26:08] <bmbouter> which when you think about it, the way we would host "plugin docs" won't work even because it assumes the plugin's docs are appropriate as versioned by core
[17:26:27] <bmbouter> yeah it keeps bringing me back to, plugin writers should just use RTD
[17:27:14] <asmacdo> perhaps we could host them, but as separate projects
[17:28:02] <bmbouter> yes perhaps
[17:28:05] <asmacdo> python.pulpproject.org
[17:28:26] <ttereshc> if we have some unified automatic way to add plugins to our docs, we can try to build docs and include them only if there were no errors during build process
[17:28:31] <bmbouter> we could also just point a DNS name like that at RTD
[17:28:53] <bmbouter> ttereshc: yes but we still have "the versions problem"
[17:29:29] <bmbouter> as in the python plugin is at 1.8.3 (as an example), yet it's being browsed at http://docs.pulpproject.org/en/2.13/
[17:29:49] <ttereshc> :(
[17:31:01] <asmacdo> such an ugly problem
[17:32:28] <asmacdo> I think the uglier solution is that our core docs should be very very light
[17:34:11] <asmacdo> And each plugin can link to core docs for concepts, installation, etc but each plugin will need fairly comprehensive docs
[17:35:03] <bmbouter> I think having each plugin have its own site for docs is inevitable
[17:35:38] <bmbouter> and the uniformity of the pulp2 plugin docs made sense since they were uniformly produced, but with pulp3 that won't be the case, each plugin will likely release separately
[17:35:52] <bmbouter> thanks to the plugin api
</code></pre> Pulp - Story #2843 (CLOSED - CURRENTRELEASE): As an authenticated user, I can create an Artifac...
https://pulp.plan.io/issues/2843 2017-06-26T17:22:30Z dkliban@redhat.com
<p>For an API user to create an Artifact, Pulp 3 needs to have the following:</p>
<p>- an updated Artifact[0] model without a 'content' foreign key or the 'relative_path' field. A uniqueness constraint should be added on the sha256 field.<br>
- a viewset that can handle CRD operations for Artifacts. It most likely needs to use the FileUploadParser[1] and a custom Django upload handler[2].<br>
- a serializer for the viewset which will return all serialized fields of the Artifact model. The file field should be its full path on disk.<br>
- API endpoint at /api/v3/content/artifacts<br>
- POST request to the /api/v3/content/artifacts/ endpoint creates an Artifact. The body of the request contains multipart form data with the following:</p>
<blockquote>
<p>file - The file being uploaded<br>
size - The size of the file in bytes.<br>
md5 - The MD5 checksum of the file.<br>
sha1 - The SHA-1 checksum of the file.<br>
sha224 - The SHA-224 checksum of the file.<br>
sha256 - The SHA-256 checksum of the file.<br>
sha384 - The SHA-384 checksum of the file.<br>
sha512 - The SHA-512 checksum of the file.</p>
</blockquote>
<p>Before the model is saved, a SHA-256 checksum (digest) of the uploaded file is generated.<br>
If a sha256 was specified in POST parameters, the generated hash is validated against the value specified as the POST parameter. If the values don't match a validation exception is raised.<br>
If an Artifact with the same sha256 checksum already exists, a 400 response is returned to the user.<br>
When the model is saved, the file is written to MEDIA_ROOT/content/units/digest[0:2]/digest[2:]<br>
After a successful save, a serialized version of the Artifact is returned.</p>
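A sketch of the digest handling described above (a hypothetical helper, not pulpcore code): compute the sha256, compare it with the optional client-supplied value, and derive the storage path.

```python
import hashlib

def artifact_digest_and_path(data, media_root, expected_sha256=None):
    """Validate an optional client-supplied sha256 and build the
    MEDIA_ROOT/content/units/digest[0:2]/digest[2:] storage path."""
    digest = hashlib.sha256(data).hexdigest()
    if expected_sha256 is not None and digest != expected_sha256:
        raise ValueError("sha256 mismatch")  # maps to a 400 validation error
    path = "{}/content/units/{}/{}".format(media_root, digest[:2], digest[2:])
    return digest, path

digest, path = artifact_digest_and_path(b"hello", "/var/lib/pulp/media")
print(path.startswith("/var/lib/pulp/media/content/units/" + digest[:2]))  # True
```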
<p>- A GET request to the /api/v3/content/artifacts/&lt;artifact id&gt;/ endpoint returns the serialized Artifact.</p>
<p>- A DELETE request to the /api/v3/content/artifacts/&lt;artifact id&gt;/ endpoint raises an exception if the Artifact is still associated with any ContentUnit. The exception should provide a list of content units still associated with the Artifact. If an exception is not raised, the Artifact is removed from the database and the file is removed from disk.</p>
<p>[0] <a href="https://github.com/pulp/pulp/blob/3.0-dev/platform/pulpcore/app/models/content.py#L72" class="external">https://github.com/pulp/pulp/blob/3.0-dev/platform/pulpcore/app/models/content.py#L72</a><br>
[1] <a href="http://www.django-rest-framework.org/api-guide/parsers/#fileuploadparser" class="external">http://www.django-rest-framework.org/api-guide/parsers/#fileuploadparser</a><br>
[2] <a href="https://docs.djangoproject.com/en/1.11/topics/http/file-uploads/#upload-handlers" class="external">https://docs.djangoproject.com/en/1.11/topics/http/file-uploads/#upload-handlers</a></p> Pulp - Story #1677 (CLOSED - WONTFIX): 'pulp-admin repo history sync' needs more filters to be us...
https://pulp.plan.io/issues/1677 2016-02-15T17:36:05Z kfiresmith (kfiresmith@gmail.com)
<p>Right now I can iterate over a full list of repos to get a snapshot of the state of repo syncs across my pulp server by running this:</p>
<pre><code>for i in $(pulp-admin repo list | grep 'Id:' | awk '{print $2}'); do
    pulp-admin repo history sync --limit=1 --repo-id=$i
done
</code></pre>
<p>Unfortunately that will return text for all repos regardless of whether they have ever been synced (some repos originate from within pulp and thus never sync), and it cannot be told to return nothing when the last sync was successful.</p>
<p>I would very much like to be able to use pulp-admin to return only a list of repos that had errors syncing, so that I can limit the noise in my logging emails.</p>
<p>A setting that would allow me to filter based on the "Result:" field would be swell and could work like this:</p>
<p><code>pulp-admin repo history sync --limit=1 --repo-id=EPEL7 --result={error,failed,success}</code></p>
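Client-side, the proposed --result filter amounts to selecting history entries by their result field. A hypothetical illustration over plain dicts, not pulp-admin code:

```python
def filter_history(entries, results):
    """Keep sync-history entries whose 'result' is in `results`."""
    return [e for e in entries if e.get("result") in results]

history = [
    {"repo_id": "EPEL7", "result": "failed"},
    {"repo_id": "rhel7", "result": "success"},
]
# Only repos with errors, to limit the noise in logging emails:
print(filter_history(history, {"error", "failed"}))
```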