Pulp: Issues (https://pulp.plan.io/, 2021-12-08)
RPM Support - Issue #9619 (CLOSED - DUPLICATE): OpenAPI schema for ModulemdDefault is incorrect (https://pulp.plan.io/issues/9619, 2021-12-08, dkliban@redhat.com)
<p><strong>Ticket moved to GitHub</strong>: "pulp/pulp_rpm/2307":<a href="https://github.com/pulp/pulp_rpm/issues/2307" class="external">https://github.com/pulp/pulp_rpm/issues/2307</a></p>
<hr>
<p>The OpenAPI schema for ModulemdDefaults is incorrect. When trying to generate a client with openapi-generator-cli 5.3.0, the following exception is emitted:</p>
<pre><code>Exception in thread "main" java.lang.RuntimeException: Could not generate api file for 'ContentModulemdDefaults'
</code></pre>
RPM Support - Issue #9337 (CLOSED - DUPLICATE): Dependency solving does not pull all stream depen... (https://pulp.plan.io/issues/9337, 2021-09-02, dalley@redhat.com)
<p><strong>Ticket moved to GitHub</strong>: "pulp/pulp_rpm/2299":<a href="https://github.com/pulp/pulp_rpm/issues/2299" class="external">https://github.com/pulp/pulp_rpm/issues/2299</a></p>
RPM Support - Issue #9336 (CLOSED - DUPLICATE): Assertion failure when performing depsolving-enab... (https://pulp.plan.io/issues/9336, 2021-09-02, dalley@redhat.com)
RPM Support - Issue #9331 (CLOSED - DUPLICATE): Dependency solver takes an extremely long time to... (https://pulp.plan.io/issues/9331, 2021-09-02, dalley@redhat.com)
<p><strong>Ticket moved to GitHub</strong>: "pulp/pulp_rpm/2298":<a href="https://github.com/pulp/pulp_rpm/issues/2298" class="external">https://github.com/pulp/pulp_rpm/issues/2298</a></p>
RPM Support - Issue #9219 (CLOSED - DUPLICATE): Mirrored .treeinfo metadata needs to be rewritten... (https://pulp.plan.io/issues/9219, 2021-08-09, dalley@redhat.com)
<p>When .treeinfo contains relative paths to a location outside of the repository, as is the case with CentOS 8, Pulp cannot serve those sub-repos precisely as they are. So it syncs all of them and publishes all of them into one repository with subdirectories for the sub-repos, and writes the locations of these sub-repos into the .treeinfo metadata.</p>
<p>In the mirrored metadata case, the .treeinfo file will be pointing to the wrong locations, so we need to rewrite the .treeinfo file just like we do during a standard publish.</p>
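<p>Since .treeinfo is INI-formatted, the rewrite step can be sketched with the standard library alone. A sketch only: the <code>variant-*</code> section names follow the productmd convention, and the sub-repo directory layout passed in is an assumption for illustration, not Pulp's actual publish code.</p>

```python
# Sketch: point each variant's paths at the directory its sub-repo was
# actually published into, rather than the original relative locations.
from configparser import ConfigParser

def rewrite_treeinfo(path, sub_repo_dirs):
    """Rewrite 'repository' and 'packages' in each [variant-*] section.

    sub_repo_dirs maps a variant id (e.g. 'BaseOS') to a path relative
    to the repository root (e.g. 'BaseOS').
    """
    cfg = ConfigParser()
    cfg.optionxform = str  # keep .treeinfo keys case-sensitive
    cfg.read(path)
    for section in cfg.sections():
        if not section.startswith("variant-"):
            continue
        variant = section[len("variant-"):]
        if variant in sub_repo_dirs:
            cfg[section]["repository"] = sub_repo_dirs[variant]
            cfg[section]["packages"] = sub_repo_dirs[variant] + "/Packages"
    with open(path, "w") as f:
        cfg.write(f)
```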
<p>Since .treeinfo isn't checksummed or signed, nothing prevents us from doing this.</p>
RPM Support - Issue #9133 (CLOSED - NOTABUG): RPM repository sync error (https://pulp.plan.io/issues/9133, 2021-07-23, gvde)
<p>I have been collecting all information on this problem here: <a href="https://community.theforeman.org/t/centos-8-4-baseos-sync-error" class="external">https://community.theforeman.org/t/centos-8-4-baseos-sync-error</a></p>
<p>In short: occasionally my sync of the CentOS 8 BaseOS repository from centos.org ends with errors:</p>
<pre><code>Error message: the server returns an error
HTTP status code: 400
Response headers: {"date"=>"Thu, 22 Jul 2021 20:36:46 GMT", "server"=>"gunicorn", "content-type"=>"application/json", "vary"=>"Accept,Cookie", "allow"=>"GET, POST, HEAD, OPTIONS", "x-frame-options"=>"SAMEORIGIN", "content-length"=>"67", "correlation-id"=>"ded10b33-c063-4471-9182-3b62facbd36b", "access-control-expose-headers"=>"Correlation-ID", "via"=>"1.1 foreman.example.com", "connection"=>"close"}
Response body: {"repository_version":["Invalid hyperlink - Incorrect URL match."]}Error message: the server returns an error
HTTP status code: 400
Response headers: {"date"=>"Thu, 22 Jul 2021 20:36:47 GMT", "server"=>"gunicorn", "content-type"=>"application/json", "vary"=>"Accept,Cookie", "allow"=>"GET, POST, HEAD, OPTIONS", "x-frame-options"=>"SAMEORIGIN", "content-length"=>"112", "correlation-id"=>"ded10b33-c063-4471-9182-3b62facbd36b", "access-control-expose-headers"=>"Correlation-ID", "via"=>"1.1 foreman.example.com", "connection"=>"close"}
Response body: ["URI /pulp/api/v3/publications/rpm/rpm/1ad1ad0e-c9c1-42ef-94f6-0b188c29f72d/ not found for repositoryversion."]
</code></pre>
<p>I can access the mentioned publication URI via the API:</p>
<pre><code># curl -s --cert /etc/pki/katello/certs/pulp-client.crt --key /etc/pki/katello/private/pulp-client.key 'https://foreman.dkrz.de/pulp/api/v3/publications/rpm/rpm/1ad1ad0e-c9c1-42ef-94f6-0b188c29f72d/' | python -m json.tool
{
"gpgcheck": 0,
"metadata_checksum_type": "unknown",
"package_checksum_type": "unknown",
"pulp_created": "2021-07-22T20:36:04.962028Z",
"pulp_href": "/pulp/api/v3/publications/rpm/rpm/1ad1ad0e-c9c1-42ef-94f6-0b188c29f72d/",
"repo_gpgcheck": 1,
"repository": "/pulp/api/v3/repositories/rpm/rpm/3f9dc526-a51c-4a25-9547-95f82bedb3ee/",
"repository_version": "/pulp/api/v3/repositories/rpm/rpm/3f9dc526-a51c-4a25-9547-95f82bedb3ee/versions/4/",
"sqlite_metadata": true
}
</code></pre>
<p>However, the repository it references is not found:</p>
<pre><code># curl -s --cert /etc/pki/katello/certs/pulp-client.crt --key /etc/pki/katello/private/pulp-client.key 'https://foreman.dkrz.de/pulp/api/v3/repositories/rpm/rpm/3f9dc526-a51c-4a25-9547-95f82bedb3ee/' | python -m json.tool
{
"detail": "Not found."
}
</code></pre>
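<p>The mismatch (a publication that resolves while its repository 404s) can be cross-checked in a short script. A minimal sketch, assuming only the JSON layout shown above; stripping the trailing <code>versions/&lt;n&gt;/</code> component is pulpcore's standard href scheme:</p>

```python
# Sketch: given a publication record as returned by the API, derive the
# repository and repository-version hrefs that must also resolve for the
# publication's content to be servable. GETting each href and flagging a
# 404 on the repository while the publication returns 200 reproduces the
# state shown in the curl output above.
def hrefs_to_check(publication):
    version_href = publication["repository_version"]
    # '/pulp/api/v3/repositories/.../versions/4/' -> '/pulp/api/v3/repositories/.../'
    repo_href = version_href.rsplit("versions/", 1)[0]
    return [repo_href, version_href]
```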
<p>Checking the pulpcore database, I can find both:</p>
<pre><code>pulpcore=# select * from rpm_rpmpublication where publication_ptr_id = '1ad1ad0e-c9c1-42ef-94f6-0b188c29f72d';
publication_ptr_id | metadata_checksum_type | package_checksum_type | gpgcheck | repo_gpgcheck | sqlite_metadata
--------------------------------------+------------------------+-----------------------+----------+---------------+-----------------
1ad1ad0e-c9c1-42ef-94f6-0b188c29f72d | unknown | unknown | 0 | 1 | t
(1 row)
pulpcore=# select * from rpm_rpmrepository where repository_ptr_id = '3f9dc526-a51c-4a25-9547-95f82bedb3ee';
repository_ptr_id           | 3f9dc526-a51c-4a25-9547-95f82bedb3ee
sub_repo                    | t
metadata_signing_service_id |
last_sync_remote_id         | 69abed5c-b1e1-452a-b58b-f0cc23548924
last_sync_repo_version      | 4
last_sync_revision_number   | 8.4.2105
original_checksum_types     | {"group": "sha256", "other": "sha256", "modules": "sha256", "primary": "sha256", "group_xz": "sha256", "other_db": "sha256", "filelists": "sha256", "primary_db": "sha256", "filelists_db": "sha256"}
retain_package_versions     | 0
last_sync_repomd_checksum   | c21c5d2410544fccf2dcc78ce0f472fd8ed9d8c3fc0de1a8f8061a72629a5c7e
autopublish                 | f
gpgcheck                    | 0
metadata_checksum_type      | sha256
package_checksum_type       | sha256
repo_gpgcheck               | 0
sqlite_metadata             | f
(1 row)
</code></pre>
RPM Support - Story #9131 (CLOSED - DUPLICATE): As an administrator, I'd like RPM repository sync... (https://pulp.plan.io/issues/9131, 2021-07-23, wibbit)
<p><strong>Ticket moved to GitHub</strong>: "pulp/pulp_rpm/2286":<a href="https://github.com/pulp/pulp_rpm/issues/2286" class="external">https://github.com/pulp/pulp_rpm/issues/2286</a></p>
<hr>
<p>Though Pulp3 supports mirror lists, it does not currently support retrying against a different host in the mirror list when a package sync fails.</p>
<p>While using mirror lists to sync fedora34 updates, I was not able to get to version 1 of the repository even after ~15 attempts: each attempt would pick a new mirror, and each time some package would fail.</p>
<p>Anecdotally, I see this a lot when running a dnf update/upgrade: packages fail and DNF happily goes off and tries a new mirror. Without this logic, I'm unsure of the value of supporting mirror lists for larger repositories that may have a lot of change.</p>
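<p>The requested behavior can be sketched generically. A sketch only, not Pulp's downloader API: <code>fetch</code> is a placeholder for whatever download call is in use, and the broad exception handler would be narrowed to real download errors in practice.</p>

```python
# Sketch: try each mirror in turn for a single artifact, the way dnf does,
# and fail only once every mirror has been exhausted.
def fetch_with_fallback(mirrors, relative_path, fetch):
    """Return the first successful download of relative_path."""
    errors = []
    for base in mirrors:
        url = base.rstrip("/") + "/" + relative_path
        try:
            return fetch(url)
        except Exception as exc:  # narrow to real download errors in practice
            errors.append((url, exc))
    raise RuntimeError(f"all {len(mirrors)} mirrors failed: {errors}")
```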
<p>This happens during a Pulp 3 to Pulp 3 sync.
It is not clear how reproducible it is. Check the related BZ for the repo list.</p>
<p>Variations seen:</p>
<pre><code>deadlock detected
DETAIL: Process 21456 waits for ShareLock on transaction 25847; blocked by process 21471.
Process 21471 waits for ShareLock on transaction 25727; blocked by process 21456.
HINT: See server log for query details.
CONTEXT: while inserting index tuple (0,2) in relation "rpm_repometadatafile_data_type_checksum_relat_c9d7364a_uniq"
</code></pre>
<pre><code>deadlock detected
DETAIL: Process 35582 waits for ShareLock on transaction 218181; blocked by process 35563.
Process 35563 waits for ShareLock on transaction 218140; blocked by process 35582.
HINT: See server log for query details.
CONTEXT: while inserting index tuple (32,1) in relation "rpm_package_pkgId_key"
</code></pre>
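<p>A common generic mitigation for transient deadlocks like these is to retry the failing operation a bounded number of times with backoff. This is a sketch of the pattern only, not the fix pulpcore ultimately shipped; <code>is_deadlock</code> is a caller-supplied predicate so the sketch does not depend on Django.</p>

```python
import time

# Sketch: rerun a failing operation when the error is a recoverable
# deadlock, with linear backoff, re-raising anything else (or the final
# deadlock once attempts are exhausted).
def retry_on_deadlock(operation, is_deadlock, attempts=3, delay=0.5):
    for attempt in range(attempts):
        try:
            return operation()
        except Exception as exc:
            if not is_deadlock(exc) or attempt == attempts - 1:
                raise
            time.sleep(delay * (attempt + 1))
```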
<p>Full traceback for one</p>
<pre><code>pulpcore-worker-2[5987]: pulp [75194b26-9465-42fe-97fc-23b4d0d33c7b]: rq.worker:ERROR: Traceback (most recent call last):
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 84, in _execute
pulpcore-worker-2[5987]: return self.cursor.execute(sql, params)
pulpcore-worker-2[5987]: psycopg2.errors.DeadlockDetected: deadlock detected
pulpcore-worker-2[5987]: DETAIL: Process 8294 waits for ShareLock on transaction 4698; blocked by process 8300.
pulpcore-worker-2[5987]: Process 8300 waits for ShareLock on transaction 4629; blocked by process 8294.
pulpcore-worker-2[5987]: HINT: See server log for query details.
pulpcore-worker-2[5987]: CONTEXT: while inserting index tuple (0,2) in relation "rpm_repometadatafile_data_type_checksum_relat_c9d7364a_uniq"
pulpcore-worker-2[5987]: The above exception was the direct cause of the following exception:
pulpcore-worker-2[5987]: Traceback (most recent call last):
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/rq/worker.py", line 975, in perform_job
pulpcore-worker-2[5987]: rv = job.perform()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/rq/job.py", line 696, in perform
pulpcore-worker-2[5987]: self._result = self._execute()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/rq/job.py", line 719, in _execute
pulpcore-worker-2[5987]: return self.func(*self.args, **self.kwargs)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulp_rpm/app/tasks/synchronizing.py", line 269, in synchronize
pulpcore-worker-2[5987]: dv.create()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/declarative_version.py", line 147, in create
pulpcore-worker-2[5987]: loop.run_until_complete(pipeline)
pulpcore-worker-2[5987]: File "/usr/lib64/python3.6/asyncio/base_events.py", line 484, in run_until_complete
pulpcore-worker-2[5987]: return future.result()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 225, in create_pipeline
pulpcore-worker-2[5987]: await asyncio.gather(*futures)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 43, in __call__
pulpcore-worker-2[5987]: await self.run()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/content_stages.py", line 95, in run
pulpcore-worker-2[5987]: d_content.content.save()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulpcore/app/models/base.py", line 149, in save
pulpcore-worker-2[5987]: return super().save(*args, **kwargs)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django_lifecycle/mixins.py", line 134, in save
pulpcore-worker-2[5987]: save(*args, **kwargs)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/base.py", line 744, in save
pulpcore-worker-2[5987]: force_update=force_update, update_fields=update_fields)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/base.py", line 782, in save_base
pulpcore-worker-2[5987]: force_update, using, update_fields,
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/base.py", line 873, in _save_table
pulpcore-worker-2[5987]: result = self._do_insert(cls._base_manager, using, fields, update_pk, raw)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/base.py", line 911, in _do_insert
pulpcore-worker-2[5987]: using=using, raw=raw)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/manager.py", line 82, in manager_method
pulpcore-worker-2[5987]: return getattr(self.get_queryset(), name)(*args, **kwargs)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/query.py", line 1186, in _insert
pulpcore-worker-2[5987]: return query.get_compiler(using=using).execute_sql(return_id)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/sql/compiler.py", line 1377, in execute_sql
pulpcore-worker-2[5987]: cursor.execute(sql, params)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 67, in execute
pulpcore-worker-2[5987]: return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 76, in _execute_with_wrappers
pulpcore-worker-2[5987]: return executor(sql, params, many, context)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 84, in _execute
pulpcore-worker-2[5987]: return self.cursor.execute(sql, params)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/utils.py", line 89, in __exit__
pulpcore-worker-2[5987]: raise dj_exc_value.with_traceback(traceback) from exc_value
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 84, in _execute
pulpcore-worker-2[5987]: return self.cursor.execute(sql, params)
pulpcore-worker-2[5987]: django.db.utils.OperationalError: deadlock detected
pulpcore-worker-2[5987]: DETAIL: Process 8294 waits for ShareLock on transaction 4698; blocked by process 8300.
pulpcore-worker-2[5987]: Process 8300 waits for ShareLock on transaction 4629; blocked by process 8294.
pulpcore-worker-2[5987]: HINT: See server log for query details.
pulpcore-worker-2[5987]: CONTEXT: while inserting index tuple (0,2) in relation "rpm_repometadatafile_data_type_checksum_relat_c9d7364a_uniq"
pulpcore-worker-2[5987]: Traceback (most recent call last):
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 84, in _execute
pulpcore-worker-2[5987]: return self.cursor.execute(sql, params)
pulpcore-worker-2[5987]: psycopg2.errors.DeadlockDetected: deadlock detected
pulpcore-worker-2[5987]: DETAIL: Process 8294 waits for ShareLock on transaction 4698; blocked by process 8300.
pulpcore-worker-2[5987]: Process 8300 waits for ShareLock on transaction 4629; blocked by process 8294.
pulpcore-worker-2[5987]: HINT: See server log for query details.
pulpcore-worker-2[5987]: CONTEXT: while inserting index tuple (0,2) in relation "rpm_repometadatafile_data_type_checksum_relat_c9d7364a_uniq"
pulpcore-worker-2[5987]: The above exception was the direct cause of the following exception:
pulpcore-worker-2[5987]: Traceback (most recent call last):
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/rq/worker.py", line 975, in perform_job
pulpcore-worker-2[5987]: rv = job.perform()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/rq/job.py", line 696, in perform
pulpcore-worker-2[5987]: self._result = self._execute()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/rq/job.py", line 719, in _execute
pulpcore-worker-2[5987]: return self.func(*self.args, **self.kwargs)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulp_rpm/app/tasks/synchronizing.py", line 269, in synchronize
pulpcore-worker-2[5987]: dv.create()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/declarative_version.py", line 147, in create
pulpcore-worker-2[5987]: loop.run_until_complete(pipeline)
pulpcore-worker-2[5987]: File "/usr/lib64/python3.6/asyncio/base_events.py", line 484, in run_until_complete
pulpcore-worker-2[5987]: return future.result()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 225, in create_pipeline
pulpcore-worker-2[5987]: await asyncio.gather(*futures)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 43, in __call__
pulpcore-worker-2[5987]: await self.run()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/content_stages.py", line 95, in run
pulpcore-worker-2[5987]: d_content.content.save()
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/pulpcore/app/models/base.py", line 149, in save
pulpcore-worker-2[5987]: return super().save(*args, **kwargs)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django_lifecycle/mixins.py", line 134, in save
pulpcore-worker-2[5987]: save(*args, **kwargs)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/base.py", line 744, in save
pulpcore-worker-2[5987]: force_update=force_update, update_fields=update_fields)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/base.py", line 782, in save_base
pulpcore-worker-2[5987]: force_update, using, update_fields,
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/base.py", line 873, in _save_table
pulpcore-worker-2[5987]: result = self._do_insert(cls._base_manager, using, fields, update_pk, raw)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/base.py", line 911, in _do_insert
pulpcore-worker-2[5987]: using=using, raw=raw)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/manager.py", line 82, in manager_method
pulpcore-worker-2[5987]: return getattr(self.get_queryset(), name)(*args, **kwargs)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/query.py", line 1186, in _insert
pulpcore-worker-2[5987]: return query.get_compiler(using=using).execute_sql(return_id)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/models/sql/compiler.py", line 1377, in execute_sql
pulpcore-worker-2[5987]: cursor.execute(sql, params)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 67, in execute
pulpcore-worker-2[5987]: return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 76, in _execute_with_wrappers
pulpcore-worker-2[5987]: return executor(sql, params, many, context)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 84, in _execute
pulpcore-worker-2[5987]: return self.cursor.execute(sql, params)
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/utils.py", line 89, in __exit__
pulpcore-worker-2[5987]: raise dj_exc_value.with_traceback(traceback) from exc_value
pulpcore-worker-2[5987]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 84, in _execute
pulpcore-worker-2[5987]: return self.cursor.execute(sql, params)
pulpcore-worker-2[5987]: django.db.utils.OperationalError: deadlock detected
pulpcore-worker-2[5987]: DETAIL: Process 8294 waits for ShareLock on transaction 4698; blocked by process 8300.
pulpcore-worker-2[5987]: Process 8300 waits for ShareLock on transaction 4629; blocked by process 8294.
pulpcore-worker-2[5987]: HINT: See server log for query details.
pulpcore-worker-2[5987]: CONTEXT: while inserting index tuple (0,2) in relation "rpm_repometadatafile_data_type_checksum_relat_c9d7364a_uniq"
</code></pre>
RPM Support - Issue #8967 (CLOSED - DUPLICATE): "duplicate key value violates unique constraint" ... (https://pulp.plan.io/issues/8967, 2021-06-24, wilful)
<p><strong>Ticket moved to GitHub</strong>: "pulp/pulp_rpm/2278":<a href="https://github.com/pulp/pulp_rpm/issues/2278" class="external">https://github.com/pulp/pulp_rpm/issues/2278</a></p>
<hr>
<p>The original issue is no longer easy to reproduce, but there are similar issues that can be; see <a href="https://pulp.plan.io/issues/8967#note-16" class="external">https://pulp.plan.io/issues/8967#note-16</a></p>
<p>========================</p>
<p>Hi all!</p>
<p>I need to add two repositories to the Pulp server:</p>
<p><a href="http://downloads.linux.hpe.com/SDR/repo/spp/redhat/7/x86_64/current/" class="external">http://downloads.linux.hpe.com/SDR/repo/spp/redhat/7/x86_64/current/</a></p>
<p><a href="http://downloads.linux.hpe.com/SDR/repo/mcp/CentOS/7/x86_64/current/" class="external">http://downloads.linux.hpe.com/SDR/repo/mcp/CentOS/7/x86_64/current/</a></p>
<p>But I can't do it, because:</p>
<pre><code class="text syntaxhl" data-language="text"> "description": "duplicate key value violates unique constraint \"rpm_package_pkgId_key\"\nDETAIL: Key (\"pkgId\")=(ebf96fb31b880280a25d07c596bde204df50d140) already exists.\
n"
</code></pre>
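<p>One way to locate the offending package is a direct query against the pulpcore database, joining <code>rpm_package</code> through the content/repository tables. A sketch: table and column names are assumed from pulpcore 3's default schema (the same tables quoted in the error above), and the mixed-case <code>pkgId</code> column must be double-quoted in PostgreSQL.</p>

```python
# Sketch: list every repository that contains a package with a given pkgId.
FIND_REPOS_BY_PKGID = """
SELECT r.name
FROM rpm_package p
JOIN core_repositorycontent rc ON rc.content_id = p.content_ptr_id
JOIN core_repository r ON r.pulp_id = rc.repository_id
WHERE p."pkgId" = %s;
"""

def find_repos(cursor, pkg_id):
    """Run the lookup on an open DB-API cursor and return repository names."""
    cursor.execute(FIND_REPOS_BY_PKGID, (pkg_id,))
    return [row[0] for row in cursor.fetchall()]
```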
<p>How can I find out which repository this package is in?</p>
RPM Support - Issue #8700 (CLOSED - CURRENTRELEASE): CentOS 8 stream repositories fails to synchr... (https://pulp.plan.io/issues/8700, 2021-05-05, adam.tkac@gooddata.com)
<p>Hello pulp upstream,</p>
<p>we are regularly mirroring various CentOS stream repos, but recently "sync" tasks started to fail with the following traceback:</p>
<pre><code class="text syntaxhl" data-language="text">May 5 12:20:22 pulp32 rq[2083851]: Traceback (most recent call last):
May 5 12:20:22 pulp32 rq[2083851]: File "/opt/gdc/pulp3/lib64/python3.6/site-packages/rq/worker.py", line 1008, in perform_job
May 5 12:20:22 pulp32 rq[2083851]: rv = job.perform()
May 5 12:20:22 pulp32 rq[2083851]: File "/opt/gdc/pulp3/lib64/python3.6/site-packages/rq/job.py", line 706, in perform
May 5 12:20:22 pulp32 rq[2083851]: self._result = self._execute()
May 5 12:20:22 pulp32 rq[2083851]: File "/opt/gdc/pulp3/lib64/python3.6/site-packages/rq/job.py", line 729, in _execute
May 5 12:20:22 pulp32 rq[2083851]: result = self.func(*self.args, **self.kwargs)
May 5 12:20:22 pulp32 rq[2083851]: File "/opt/gdc/pulp3/lib64/python3.6/site-packages/pulp_rpm/app/tasks/synchronizing.py", line 255, in synchronize
May 5 12:20:22 pulp32 rq[2083851]: dv.create()
May 5 12:20:22 pulp32 rq[2083851]: File "/opt/gdc/pulp3/lib64/python3.6/site-packages/pulpcore/plugin/stages/declarative_version.py", line 149, in create
May 5 12:20:22 pulp32 rq[2083851]: loop.run_until_complete(pipeline)
May 5 12:20:22 pulp32 rq[2083851]: File "/usr/lib64/python3.6/asyncio/base_events.py", line 484, in run_until_complete
May 5 12:20:22 pulp32 rq[2083851]: return future.result()
May 5 12:20:22 pulp32 rq[2083851]: File "/opt/gdc/pulp3/lib64/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 225, in create_pipeline
May 5 12:20:22 pulp32 rq[2083851]: await asyncio.gather(*futures)
May 5 12:20:22 pulp32 rq[2083851]: File "/opt/gdc/pulp3/lib64/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 43, in __call__
May 5 12:20:22 pulp32 rq[2083851]: await self.run()
May 5 12:20:22 pulp32 rq[2083851]: File "/opt/gdc/pulp3/lib64/python3.6/site-packages/pulp_rpm/app/tasks/synchronizing.py", line 439, in run
May 5 12:20:22 pulp32 rq[2083851]: await self.parse_modules_metadata()
May 5 12:20:22 pulp32 rq[2083851]: File "/opt/gdc/pulp3/lib64/python3.6/site-packages/pulp_rpm/app/tasks/synchronizing.py", line 495, in parse_modules_metadata
May 5 12:20:22 pulp32 rq[2083851]: modules_metadata_parser.parse()
May 5 12:20:22 pulp32 rq[2083851]: File "/opt/gdc/pulp3/lib64/python3.6/site-packages/pulp_rpm/app/tasks/synchronizing.py", line 747, in parse
May 5 12:20:22 pulp32 rq[2083851]: content = moduleyaml.read()
May 5 12:20:22 pulp32 rq[2083851]: File "/usr/lib64/python3.6/codecs.py", line 321, in decode
May 5 12:20:22 pulp32 rq[2083851]: (result, consumed) = self._buffer_decode(data, self.errors, final)
May 5 12:20:22 pulp32 rq[2083851]: UnicodeDecodeError: 'utf-8' codec can't decode byte 0xfd in position 0: invalid start byte
</code></pre>
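<p>Byte <code>0xfd</code> at position 0 is the first byte of the xz magic number, which suggests the modules document reached the UTF-8 decoder still compressed. A sketch of sniffing the compression by magic bytes before decoding; this illustrates the failure mode and is not the actual pulp_rpm fix.</p>

```python
import bz2
import gzip
import lzma

# Magic prefixes for the compression formats commonly used for repo metadata.
MAGICS = [
    (b"\xfd7zXZ\x00", lzma.decompress),  # xz (note the leading 0xfd)
    (b"\x1f\x8b", gzip.decompress),
    (b"BZh", bz2.decompress),
]

def read_modules_yaml(raw: bytes) -> str:
    """Decompress raw bytes if a known magic is found, then decode UTF-8."""
    for magic, decompress in MAGICS:
        if raw.startswith(magic):
            raw = decompress(raw)
            break
    return raw.decode("utf-8")
```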
<p>We are using the following versions:</p>
<pre><code class="text syntaxhl" data-language="text">[root@pulp32:~] pulp status
{
"versions": [
{
"component": "core",
"version": "3.12.1"
},
{
"component": "rpm",
"version": "3.10.0"
},
{
"component": "file",
"version": "1.7.0"
}
</code></pre>
<p>Remote configuration:</p>
<pre><code class="text syntaxhl" data-language="text">[root@pulp32:~] pulp rpm remote show --name c8s-baseos-remote
{
"pulp_href": "/pulp/api/v3/remotes/rpm/rpm/dc58e56e-4a00-47f1-af0d-a9b6786af51c/",
"pulp_created": "2021-05-05T02:44:02.345456Z",
"name": "c8s-baseos-remote",
"url": "http://mirror.centos.org/centos/8-stream/BaseOS/x86_64/os/",
"ca_cert": null,
"client_cert": null,
"tls_validation": true,
"proxy_url": null,
"pulp_labels": {},
"pulp_last_updated": "2021-05-05T02:44:02.345474Z",
"download_concurrency": 10,
"policy": "immediate",
"total_timeout": null,
"connect_timeout": null,
"sock_connect_timeout": null,
"sock_read_timeout": null,
"headers": null,
"rate_limit": null,
"sles_auth_token": null
}
</code></pre>
<p>How to reproduce:</p>
<pre><code class="text syntaxhl" data-language="text">[root@pulp32:~] pulp rpm repository sync --name c8s-baseos --remote c8s-baseos-remote
Started background task /pulp/api/v3/tasks/e50f608f-9bf7-48a9-ad6a-03efb34014d6/
.Error: Task /pulp/api/v3/tasks/e50f608f-9bf7-48a9-ad6a-03efb34014d6/ failed: ''utf-8' codec can't decode byte 0xfd in position 0: invalid start byte'
</code></pre>
<p>Please let me know if you need any more details. Thank you in advance!</p>
RPM Support - Issue #7046 (CLOSED - CURRENTRELEASE): Adding a distribution tree to another repo d... (https://pulp.plan.io/issues/7046, 2020-06-24, daviddavis)
<p>Steps to recreate:</p>
<ol>
<li>Create a repo A and sync down a distribution tree such as CentOS 8</li>
<li>Create a new repo B</li>
<li>Add the distribution tree to your new repo B (<code>http :/pulp/api/v3/repositories/rpm/rpm/.../modify/ add_content_units:="..."</code>)</li>
</ol>
<p>Result:</p>
<p>The new repo B now has a distribution tree whose BaseOS variant points to the original repo A that synced down the tree. Moreover, repo A shows it has a number of packages/etc while B only has one content unit.</p>
<p>I think this could lead to strange situations where, for example, repo A is deleted, which would mess up the kickstart.</p>
RPM Support - Issue #6440 (CLOSED - CURRENTRELEASE): Modular advisory is published without module... (https://pulp.plan.io/issues/6440, 2020-04-03, ttereshc@redhat.com)
<p>Module information is present in the database but not published.</p>
<p>It is missing here: <a href="https://github.com/pulp/pulp_rpm/blob/09187398a3636257fe0c4dcb7f0aa01889c2c284/pulp_rpm/app/models/advisory.py#L278-L293" class="external">https://github.com/pulp/pulp_rpm/blob/09187398a3636257fe0c4dcb7f0aa01889c2c284/pulp_rpm/app/models/advisory.py#L278-L293</a></p>
RPM Support - Issue #5509 (CLOSED - CURRENTRELEASE): Can't sync with 3.0.0b5 release of pulp_rpm (https://pulp.plan.io/issues/5509, 2019-09-27, daviddavis)
<p>Error:</p>
<pre><code> "description": "ProgressBar() got an unexpected keyword argument 'code'",
"traceback": " File \"/usr/local/lib/pulp/lib64/python3.7/site-packages/rq/worker.py\", line 822, in perform_job\n rv = job.perform()\n File \"/usr/local/lib/pulp/lib64/python3.7/site-packages/rq/job.py\", line 605, in perform\n self._result = self._execute()\n File \"/usr/local/lib/pulp/lib64/python3.7/site-packages/rq/job.py\", line 611, in _execute\n return self.func(*self.args, **self.kwargs)\n File \"/usr/local/lib/pulp/lib64/python3.7/site-packages/pulp_rpm/app/tasks/synchronizing.py\", line 105, in synchronize\n dv.create()\n File \"/usr/local/lib/pulp/lib64/python3.7/site-packages/pulpcore/plugin/stages/declarative_version.py\", line 169, in create\n loop.run_until_complete(pipeline)\n File \"/usr/lib64/python3.7/asyncio/base_events.py\", line 579, in run_until_complete\n return future.result()\n File \"/usr/local/lib/pulp/lib64/python3.7/site-packages/pulpcore/plugin/stages/api.py\", line 209, in create_pipeline\n await asyncio.gather(*futures)\n File \"/usr/local/lib/pulp/lib64/python3.7/site-packages/pulpcore/plugin/stages/api.py\", line 43, in __call__\n await self.run()\n File \"/usr/local/lib/pulp/lib64/python3.7/site-packages/pulp_rpm/app/tasks/synchronizing.py\", line 264, in run\n packages_pb = ProgressBar(message='Parsed Packages', code='parsing.packages')\n File \"/usr/local/lib/pulp/lib64/python3.7/site-packages/django/db/models/base.py\", line 501, in __init__\n raise TypeError(\"%s() got an unexpected keyword argument '%s'\" % (cls.__name__, kwarg))\n"
</code></pre>
<p>We had the same issue in pulp_ansible and had to fix it with:</p>
<p><a href="https://github.com/pulp/pulp_ansible/commit/00bec5339eaa091d63ae7be76464bc68ed040327" class="external">https://github.com/pulp/pulp_ansible/commit/00bec5339eaa091d63ae7be76464bc68ed040327</a></p>
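<p>Generically, a plugin can tolerate both core versions by attempting the newer signature and falling back when the installed core rejects it. A sketch of the pattern only (the helper name is hypothetical; the linked pulp_ansible commit shows the fix actually taken there):</p>

```python
# Sketch: construct a progress record with the newer 'code' keyword when
# the installed core supports it, falling back to the older signature
# when it raises the "unexpected keyword argument" TypeError seen above.
def make_progress_bar(progress_bar_cls, message, code=None):
    try:
        return progress_bar_cls(message=message, code=code)
    except TypeError:
        # older core: ProgressBar() has no 'code' keyword
        return progress_bar_cls(message=message)
```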
<p>The other option would be to release a new version of core.</p>
RPM Support - Test #5320 (CLOSED - WONTFIX): Module Streams not copying correctly with recursive ... (https://pulp.plan.io/issues/5320, 2019-08-21, bherring)
<ol>
<li>
<p>Create and sync the following yum repo (Source) -> <a href="https://partha.fedorapeople.org/test-repos/pteradactyl/" class="external">https://partha.fedorapeople.org/test-repos/pteradactyl/</a></p>
</li>
<li>
<p>Create another repo Dest which will serve as the destination repo</p>
</li>
<li>
<p>Go to mongo and pick up a UUID for the pteradactly:2 module stream. This stream will be copied from Source to Dest.</p>
</li>
<li>
<p>run the following command</p>
<pre><code>https://<fqdn>/pulp/api/v2/repositories/Dest/actions/associate/: {"source_repo_id":"Source","criteria":{"type_ids":["modulemd"],"filters":{"association":{"unit_id":{"$in":[<$MODULE UUID>]}}}},"override_config":{"recursive":true}}: {"content_type"=>"application/json", "accept"=>"application/json"}
</code></pre>
</li>
<li>
<p>Run <code>pulp-admin rpm repo list</code> and check the number of modulemds copied over by the above call.</p>
</li>
<li>
<p>Notice that with recursive set to true, all of the pteradactyl module streams get copied over, instead of just pteradactly:2 and the packages belonging to it.</p>
</li>
<li>
<p>The behavior is similar for recursive conservative.</p>
</li>
</ol>
RPM Support - Test #4730 (CLOSED - WONTFIX): incremental publish of yum_repo_metadata_repo fails ... (https://pulp.plan.io/issues/4730, 2019-04-23, bherring)
<p>Steps to reproduce:<br>
1. Use Pulp 2.18<br>
2. Sync a repo with a yum_repo_metadata_file (e.g. any RHEL repo; it contains a productid, which is of the type we need)<br>
3. Publish it (in the publish directory there should be a symlink to /var/lib/pulp/published/../../../productid)<br>
4. Upgrade to the 2-master (commit c86c4339b9c1b4f158af1e961e8d68492dd2a760)<br>
5. Upload any rpm to the repo (to make the publish incremental)<br>
6. Publish the repo and see the error below:</p>
<pre><code>Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) Traceback (most recent call last):
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/usr/lib/python2.7/site-packages/celery/app/trace.py", line 367, in trace_task
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) R = retval = fun(*args, **kwargs)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp/server/pulp/server/async/tasks.py", line 529, in __call__
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) return super(Task, self).__call__(*args, **kwargs)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp/server/pulp/server/async/tasks.py", line 107, in __call__
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) return super(PulpTask, self).__call__(*args, **kwargs)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/usr/lib/python2.7/site-packages/celery/app/trace.py", line 622, in __protected_call__
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) return self.run(*args, **kwargs)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp/server/pulp/server/controllers/repository.py", line 1110, in publish
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) result = check_publish(repo_obj, dist_id, dist_inst, transfer_repo, conduit, call_config)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp/server/pulp/server/controllers/repository.py", line 1207, in check_publish
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) result = _do_publish(repo_obj, dist_id, dist_inst, transfer_repo, conduit, call_config)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp/server/pulp/server/controllers/repository.py", line 1259, in _do_publish
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) publish_report = publish_repo(transfer_repo, conduit, call_config)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp/server/pulp/server/async/tasks.py", line 737, in wrap_f
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) return f(*args, **kwargs)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp_rpm/plugins/pulp_rpm/plugins/distributors/yum/distributor.py", line 174, in publish_repo
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) return self._publisher.process_lifecycle()
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp/server/pulp/plugins/util/publish_step.py", line 572, in process_lifecycle
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) super(PluginStep, self).process_lifecycle()
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp/server/pulp/plugins/util/publish_step.py", line 163, in process_lifecycle
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) step.process()
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp/server/pulp/plugins/util/publish_step.py", line 239, in process
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) self._process_block(item=item)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp/server/pulp/plugins/util/publish_step.py", line 301, in _process_block
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) self.process_main(item=item)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/home/vagrant/devel/pulp_rpm/plugins/pulp_rpm/plugins/distributors/yum/publish.py", line 527, in process_main
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) shutil.copy2(unit._storage_path, file_path)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/usr/lib64/python2.7/shutil.py", line 144, in copy2
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) copyfile(src, dst)
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) File "/usr/lib64/python2.7/shutil.py", line 83, in copyfile
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) raise Error("`%s` and `%s` are the same file" % (src, dst))
Apr 10 14:45:22 pulp2.dev pulp[4045]: celery.app.trace:ERROR: [7669f0fa] (4045-84096) Error: `/var/lib/pulp/content/units/yum_repo_metadata_file/46/f013ec598b38b306dfd761b41a3ebf1c496f09f440679a1d7b2d4188145fda/ba86625b825e4bea5f6ab2b3e83c2cb076087507815be7e35da6d8bf697829dd-productid.gz` and `/var/cache/pulp/reserved_resource_worker-0@pulp2.dev/7669f0fa-1fc0-49a9-b834-dda548f0da0f/repodata/ba86625b825e4bea5f6ab2b3e83c2cb076087507815be7e35da6d8bf697829dd-productid.gz` are the same file
</code></pre>
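<p>The last frame shows the failure mode: <code>shutil.copy2()</code> raises when the destination is a symlink that resolves to the source file, which is exactly the state left behind by the earlier symlink-based publish. The snippet below reproduces that pattern in isolation and shows a hypothetical <code>os.path.samefile()</code> guard; it is a minimal sketch of the error, not the actual Pulp fix:</p>

```python
import os
import shutil
import tempfile

# Reproduce the failing pattern: dst is a symlink resolving to src,
# as with the productid.gz symlink left by the previous publish.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "productid.gz")
with open(src, "wb") as f:
    f.write(b"data")
dst = os.path.join(workdir, "productid-link.gz")
os.symlink(src, dst)

try:
    shutil.copy2(src, dst)
    raised = False
except shutil.Error:
    # "`src` and `dst` are the same file"
    raised = True

# Hypothetical guard: skip the copy when the target already resolves
# to the stored unit instead of unconditionally calling copy2().
if not (os.path.exists(dst) and os.path.samefile(src, dst)):
    shutil.copy2(src, dst)
```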