Pulp: Issues (https://pulp.plan.io/, 2021-06-21T22:37:08Z)
RPM Support - Issue #8944 (CLOSED - CURRENTRELEASE): Cannot sync RHEL6 repos from cdn.redhat.com
https://pulp.plan.io/issues/8944 (2021-06-21T22:37:08Z, sskracic@redhat.com)
<ul>
<li>a containerized pulp based on pulp-ci-centos image with:</li>
</ul>
<p>pulpcore-3.13.0
pulp-rpm-3.13.0</p>
<p>I am not sure whether this is a Pulp bug or a CDN data error, but syncing the RHEL6 i386 and x86_64
repos repeatedly fails with the error below. The repositories are synced with mirror=True,
and the distribution is set to point at the repository (so that the latest repository version is always distributed).</p>
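<p>To narrow down whether the CDN metadata itself is inconsistent, here is a minimal, hypothetical stdlib-only sketch (not Pulp's actual implementation) that cross-checks the pkgid between a repo's primary.xml and filelists.xml:</p>

```python
import xml.etree.ElementTree as ET

def pkgids_from_primary(xml_text):
    """Map package name -> pkgid (the checksum flagged pkgid="YES") from primary.xml."""
    ids = {}
    for pkg in ET.fromstring(xml_text):
        name = pkgid = None
        for child in pkg:
            tag = child.tag.split('}')[-1]  # ignore XML namespaces
            if tag == 'name':
                name = child.text
            elif tag == 'checksum' and child.get('pkgid') == 'YES':
                pkgid = child.text
        if name:
            ids[name] = pkgid
    return ids

def pkgids_from_filelists(xml_text):
    """Map package name -> pkgid attribute from filelists.xml."""
    return {p.get('name'): p.get('pkgid') for p in ET.fromstring(xml_text)}

def mismatches(primary_xml, filelists_xml):
    """Return {name: (primary_pkgid, filelists_pkgid)} for packages that disagree."""
    primary = pkgids_from_primary(primary_xml)
    filelists = pkgids_from_filelists(filelists_xml)
    return {name: (primary[name], filelists.get(name))
            for name in primary if primary[name] != filelists.get(name)}
```

<p>Running this against the repo's downloaded metadata would show whether the checksums really diverge upstream or only inside Pulp's sync pipeline.</p>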
<pre><code>Repo: Red Hat Enterprise Linux 6 Server from RHUI (RPMs) (6Server-i386)
Start Time: 2021-06-21 22:24:04
Finish Time: 2021-06-21 22:28:18
Elapsed Time: 0:04:13.581718
Result: Error
Exception: Package id from primary metadata (42003b3621ebf25f9b5fe25b0d2c4f2024958a9d), does not match package id from filelists, other metadata (dcacfd1c930cba9ee112c1055e7a1e94bedd7f3a)
Traceback: File "/usr/local/lib/python3.6/site-packages/rq/worker.py", line 1013, in perform_job
rv = job.perform()
File "/usr/local/lib/python3.6/site-packages/rq/job.py", line 709, in perform
self._result = self._execute()
File "/usr/local/lib/python3.6/site-packages/rq/job.py", line 732, in _execute
result = self.func(*self.args, **self.kwargs)
File "/src/pulp-rpm/pulp_rpm/app/tasks/synchronizing.py", line 395, in synchronize
version = dv.create()
File "/src/pulpcore/pulpcore/plugin/stages/declarative_version.py", line 149, in create
loop.run_until_complete(pipeline)
File "/usr/lib64/python3.6/asyncio/base_events.py", line 484, in run_until_complete
return future.result()
File "/src/pulpcore/pulpcore/plugin/stages/api.py", line 225, in create_pipeline
await asyncio.gather(*futures)
File "/src/pulpcore/pulpcore/plugin/stages/api.py", line 43, in __call__
await self.run()
File "/src/pulp-rpm/pulp_rpm/app/tasks/synchronizing.py", line 674, in run
await self.parse_repository_metadata(repomd, repomd_files, file_extension)
File "/src/pulp-rpm/pulp_rpm/app/tasks/synchronizing.py", line 719, in parse_repository_metadata
file_extension=file_extension,
File "/src/pulp-rpm/pulp_rpm/app/tasks/synchronizing.py", line 987, in parse_packages
).format(pkgid, pkgid_extra)
</code></pre>
Debian Support - Issue #8307 (CLOSED - CURRENTRELEASE): Signing Service is broken by pulpcore change
https://pulp.plan.io/issues/8307 (2021-02-25T12:30:31Z, quba42)
<p>I need to implement usage of the new Signing Service fields from pulpcore and fix the tests.</p>
Pulp - Issue #7867 (CLOSED - WONTFIX): Issue #7796 : Pulp Version 2.20 - Mongod and pulp_celerybe...
https://pulp.plan.io/issues/7867 (2020-11-19T11:53:20Z, tosif.meman@kindredgroup.com)
<p>Hello Team,</p>
<p>We have Pulp 2.20 configured in our environment and have run into a problem: mongodb randomly stops working, and as a result pulp_celerybeat cannot connect to the database and fails as well.</p>
<p>mongodb version - mongodb-org-server-3.2.22-1.el7.x86_64</p>
<p>The logs did not help me either, as they do not show sufficient information to troubleshoot.</p>
<p>localhost: pulp: kombu.async.hub:ERROR: (23150-55872) File "/usr/lib/python2.7/site-packages/pulp/server/async/app.py", line 119, in _record_heartbeat pulp: kombu.async.hub:ERROR: (23141-69152) File "/usr/lib/python2.7/site-packages/pulp/server/async/worker_watcher.py", line 39, in handle_worker_heartbeat pulp: kombu.async.hub:ERROR: (23240-19648) File "/usr/lib/python2.7/site-packages/pulp/server/async/worker_watcher.py", line 39, in handle_worker_heartbeat</p>
<p>MongoDB logs:</p>
<p>2020-11-04T12:19:29.375+0100 I CONTROL [initandlisten] db version v3.2.22
2020-11-04T12:19:29.375+0100 I CONTROL [initandlisten] git version: 105acca0d443f9a47c1a5bd608fd7133840a58dd
2020-11-04T12:19:29.375+0100 I CONTROL [initandlisten] OpenSSL version: OpenSSL 1.0.1e-fips 11 Feb 2013
2020-11-04T12:19:29.375+0100 I CONTROL [initandlisten] allocator: tcmalloc
2020-11-04T12:19:29.375+0100 I CONTROL [initandlisten] modules: none
2020-11-04T12:19:29.375+0100 I CONTROL [initandlisten] build environment:
2020-11-04T12:19:29.375+0100 I CONTROL [initandlisten] distmod: rhel70
2020-11-04T12:19:29.375+0100 I CONTROL [initandlisten] distarch: x86_64
2020-11-04T12:19:29.375+0100 I CONTROL [initandlisten] target_arch: x86_64
2020-11-04T12:19:29.375+0100 I CONTROL [initandlisten] options: { config: "/etc/mongod.conf", net: { bindIp: "127.0.0.1" }, processManagement: { fork: true, pidFilePath: "/var/run/mongodb/mongod.pid" }, security: { authorization: "disabled" }, storage: { dbPath: "/var/lib/mongodb", journal: { enabled: true } }, systemLog: { destination: "file", logAppend: true, path: "/var/log/mongodb/mongodb.log" } }
2020-11-04T12:19:29.404+0100 I - [initandlisten] Detected data files in /var/lib/mongodb created by the 'wiredTiger' storage engine, so setting the active storage engine to 'wiredTiger'.
2020-11-04T12:19:29.404+0100 W - [initandlisten] Detected unclean shutdown - /var/lib/mongodb/mongod.lock is not empty.
2020-11-04T12:19:29.404+0100 W STORAGE [initandlisten] Recovering data from the last clean checkpoint.
2020-11-04T12:19:29.404+0100 I STORAGE [initandlisten] wiredtiger_open config: create,cache_size=18G,session_max=20000,eviction=(threads_min=4,threads_max=4),config_base=false,statistics=(fast),log=(enabled=true,archive=true,path=journal,compressor=snappy),file_manager=(close_idle_time=100000),checkpoint=(wait=60,log_size=2GB),statistics_log=(wait=0),verbose=(recovery_progress),</p>
<p>2020-11-04T12:19:30.164+0100 I CONTROL [initandlisten]
2020-11-04T12:19:30.164+0100 I CONTROL [initandlisten] ** WARNING: /sys/kernel/mm/transparent_hugepage/enabled is 'always'.
2020-11-04T12:19:30.164+0100 I CONTROL [initandlisten] ** We suggest setting it to 'never'
2020-11-04T12:19:30.164+0100 I CONTROL [initandlisten]
2020-11-04T12:19:30.164+0100 I CONTROL [initandlisten] ** WARNING: /sys/kernel/mm/transparent_hugepage/defrag is 'always'.
2020-11-04T12:19:30.164+0100 I CONTROL [initandlisten] ** We suggest setting it to 'never'
2020-11-04T12:19:30.165+0100 I CONTROL [initandlisten]
2020-11-04T12:19:30.165+0100 I CONTROL [initandlisten] ** WARNING: soft rlimits too low. rlimits set to 4096 processes, 64000 files. Number of processes should be at least 32000 : 0.5 times number of files.</p>
Pulp - Issue #7754 (CLOSED - CURRENTRELEASE): PostgreSQL extension uuid-ossp is missing in pulp/p...
https://pulp.plan.io/issues/7754 (2020-10-27T14:22:55Z, osapryki)
<p>Automation Hub requires PostgreSQL extension "uuid-ossp" to perform a data migration <a href="https://github.com/ansible/galaxy_ng/pull/528" class="external">https://github.com/ansible/galaxy_ng/pull/528</a>, however Travis CI fails with the following error:</p>
<pre><code>django.db.utils.OperationalError: could not open extension control file "/usr/share/pgsql/extension/uuid-ossp.control": No such file or directory
</code></pre>
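<p>As a quick pre-flight check (a hedged sketch, not part of Pulp or galaxy_ng), one can test for exactly the file PostgreSQL complains about; the <code>sharedir</code> default is an assumption matching the error message:</p>

```python
import os

def has_pg_extension(name, sharedir="/usr/share/pgsql"):
    """Return True if the extension's control file exists in PostgreSQL's share dir."""
    return os.path.exists(os.path.join(sharedir, "extension", f"{name}.control"))
```

<p>On an image built with the contrib extensions, <code>has_pg_extension("uuid-ossp")</code> should return True before the migration is attempted.</p>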
<p>This error occurs because the <code>pulp/pulp-ci</code> image has PostgreSQL installed without the compiled standard extensions (specifically "uuid-ossp").</p>
Pulp - Issue #7663 (CLOSED - CURRENTRELEASE): SELinux policies not being applied when using RPM b...
https://pulp.plan.io/issues/7663 (2020-10-07T11:16:21Z, spredzy)
<p>As a user I am deploying pulpcore using the pulp_installer (3.7.1) and the upstream packages located at <a href="https://yum.theforeman.org/pulpcore/3.7/el7/x86_64/" class="external">https://yum.theforeman.org/pulpcore/3.7/el7/x86_64/</a></p>
<p>After installation the services are started as <code>unconfined_service_t</code> rather than <code>pulpcore_t</code> (as per the pulpcore-selinux definition <a href="https://github.com/pulp/pulpcore-selinux/blob/master/pulpcore.te#L8" class="external">https://github.com/pulp/pulpcore-selinux/blob/master/pulpcore.te#L8</a>)</p>
<pre><code>[root@localhost vagrant]# rpm -qa | grep pulpcore-selinux
pulpcore-selinux-1.1.1-1.el7.x86_64
[root@localhost vagrant]# semodule -l | grep pulp
pulpcore 1.1.1
pulpcore_port 1.1.1
pulpcore_rhsmcertd 1.1.1
[root@localhost vagrant]# ps -Z fauxwww | grep pulpcore-api
unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 root 3533 0.0 0.1 12532 984 pts/1 S+ 11:07 0:00 \_ grep --color=auto pulpcore-api
system_u:system_r:unconfined_service_t:s0 pulp 2986 0.0 1.0 122984 5220 ? Ss 09:29 0:02 /usr/bin/python3 /usr/bin/gunicorn pulpcore.app.wsgi:application --bind unix:/var/run/pulpcore-api/pulpcore-api.sock --workers 4 --access-logfile -
system_u:system_r:unconfined_service_t:s0 pulp 2989 0.0 6.9 276252 34836 ? S 09:29 0:01 \_ /usr/bin/python3 /usr/bin/gunicorn pulpcore.app.wsgi:application --bind unix:/var/run/pulpcore-api/pulpcore-api.sock --workers 4 --access-logfile -
system_u:system_r:unconfined_service_t:s0 pulp 2991 0.0 6.9 276252 34500 ? S 09:29 0:01 \_ /usr/bin/python3 /usr/bin/gunicorn pulpcore.app.wsgi:application --bind unix:/var/run/pulpcore-api/pulpcore-api.sock --workers 4 --access-logfile -
system_u:system_r:unconfined_service_t:s0 pulp 2992 0.0 6.5 276252 32848 ? S 09:29 0:01 \_ /usr/bin/python3 /usr/bin/gunicorn pulpcore.app.wsgi:application --bind unix:/var/run/pulpcore-api/pulpcore-api.sock --workers 4 --access-logfile -
system_u:system_r:unconfined_service_t:s0 pulp 2995 0.0 6.1 276344 30620 ? S 09:29 0:01 \_ /usr/bin/python3 /usr/bin/gunicorn pulpcore.app.wsgi:application --bind unix:/var/run/pulpcore-api/pulpcore-api.sock --workers 4 --access-logfile -
[root@localhost vagrant]#
</code></pre>
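<p>A small, hypothetical helper for spotting this condition from <code>ps -Z</code>-style output (the expected <code>pulpcore_t</code> type is taken from the pulpcore-selinux policy linked above; the helper itself is illustrative, not part of any Pulp tooling):</p>

```python
def mislabeled_processes(ps_lines, expected_type="pulpcore_t", needle="gunicorn"):
    """Return (actual_type, line) pairs for matching processes whose SELinux type differs."""
    bad = []
    for line in ps_lines:
        if needle not in line:
            continue
        context = line.split()[0]            # e.g. system_u:system_r:unconfined_service_t:s0
        actual_type = context.split(":")[2]  # third field of the context is the type
        if actual_type != expected_type:
            bad.append((actual_type, line))
    return bad
```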
<p>The expected behavior is to have everything properly labeled.</p>
Debian Support - Issue #7190 (CLOSED - CURRENTRELEASE): Pulp 3 - pulp-deb : APT client installing...
https://pulp.plan.io/issues/7190 (2020-07-22T14:11:24Z, swisscom)
<p>Dear support,</p>
<p>We have the same issue as reported in <a href="https://pulp.plan.io/issues/6982" class="external">https://pulp.plan.io/issues/6982</a>, but for the Multi-Arch field (it was fixed for the Installed-Size field in <a href="https://github.com/pulp/pulp_deb/pull/184" class="external">https://github.com/pulp/pulp_deb/pull/184</a>).</p>
<p>Would it be possible to add the missing Multi-Arch field?
In addition, if possible, please do not put the SHA1 field in the Packages file when the SHA1 value is empty.</p>
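<p>The desired behavior for the empty SHA1 field could look like this minimal sketch (a hypothetical helper, not pulp_deb's actual code): simply skip fields whose value is empty when rendering a Packages stanza:</p>

```python
def render_stanza(fields):
    """Render deb822-style fields as "Key: value" lines, omitting empty values."""
    return "\n".join(f"{key}: {value}" for key, value in fields.items() if value)
```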
<p>Pulp components version :</p>
<pre><code class="text syntaxhl" data-language="text">(pulp) -bash-4.2$ pip list | grep pulp
pulp-deb 2.6.0b1.dev0 /var/lib/pulp/pulp_deb
pulp-rpm 3.4.2
pulpcore 3.5.0
</code></pre>
<p>Thanks in advance</p>
Debian Support - Issue #6982 (CLOSED - CURRENTRELEASE): Pulp 3 - pulp-deb : APT client installing...
https://pulp.plan.io/issues/6982 (2020-06-16T12:46:52Z, swisscom)
<p>Dear support team,
When I set up an APT client to connect to Pulp for its content, I discovered that it installs the same patches again and again. For example, if I run "apt-get upgrade -y" twice, it installs 339 packages both times.</p>
<p>The issue is that the "Installed-Size" field is missing from the Packages file produced by Pulp. "Installed-Size" is one of the mandatory fields the APT client uses to check whether a package is already installed.</p>
<p>More information here: <a href="https://github.com/Debian/apt/issues/23" class="external">https://github.com/Debian/apt/issues/23</a>
The following fields must be the same in the Packages file and in /var/lib/dpkg/status: "Installed-Size",
"Depends", "Pre-Depends", "Conflicts", "Breaks", "Replaces". All spaces are ignored, all values are transformed to lowercase, and <= and >= are normalized into < and >.</p>
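<p>The comparison rules quoted above can be sketched as follows (a hypothetical helper mirroring the described normalization, not apt's actual code):</p>

```python
CRITICAL_FIELDS = ("Installed-Size", "Depends", "Pre-Depends",
                   "Conflicts", "Breaks", "Replaces")

def normalize(value):
    """Apply the normalization described above: drop spaces, lowercase, >= -> >, <= -> <."""
    return value.replace(" ", "").lower().replace(">=", ">").replace("<=", "<")

def stanzas_match(packages_fields, status_fields):
    """True if every critical field matches between a Packages stanza and dpkg's status."""
    return all(normalize(packages_fields.get(f, "")) == normalize(status_fields.get(f, ""))
               for f in CRITICAL_FIELDS)
```

<p>With "Installed-Size" missing from the Packages stanza, such a comparison always fails, which is exactly why apt reinstalls the packages on every upgrade run.</p>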
<p>Would it be possible to fix this issue?</p>
<p>Thanks a lot</p>
Debian Support - Issue #6876 (CLOSED - CURRENTRELEASE): Pulp 3 - pulp-deb : Wrong paths in Releas...
https://pulp.plan.io/issues/6876 (2020-06-02T10:23:11Z, swisscom)
<p>Dear support team,</p>
<p>I discovered a major issue in the Release file produced by pulp-deb publications.
The paths are wrong: they must be relative to the location of the Release file. With wrong paths in the Release file, the repo cannot be used (the apt client does not see any packages coming from it).</p>
<p>Example of what I see in a Release file produced by Pulp:</p>
<pre><code class="python syntaxhl" data-language="python"><span class="n">MD5sum</span><span class="p">:</span>
<span class="n">aa84d85627f4b4d1e1d8072b238150ca</span> <span class="mi">39099687</span> <span class="n">dists</span><span class="o">/</span><span class="n">buster</span><span class="o">/</span><span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">s390x</span><span class="o">/</span><span class="n">Packages</span>
<span class="mi">422</span><span class="n">f6f847b971d36d26d724d04e8a0f2</span> <span class="mi">11994723</span> <span class="n">dists</span><span class="o">/</span><span class="n">buster</span><span class="o">/</span><span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">s390x</span><span class="o">/</span><span class="n">Packages</span><span class="p">.</span><span class="n">gz</span>
<span class="mf">4e01</span><span class="n">fb37905070bf47a186b9321aa507</span> <span class="mi">39689978</span> <span class="n">dists</span><span class="o">/</span><span class="n">buster</span><span class="o">/</span><span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">ppc64el</span><span class="o">/</span><span class="n">Packages</span>
<span class="mi">15681</span><span class="n">a4fb0e26463cd7eca6e4c8c4082</span> <span class="mi">12128141</span> <span class="n">dists</span><span class="o">/</span><span class="n">buster</span><span class="o">/</span><span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">ppc64el</span><span class="o">/</span><span class="n">Packages</span><span class="p">.</span><span class="n">gz</span>
<span class="mi">57439</span><span class="n">b8db44e619293e6df3bc0666de0</span> <span class="mi">39371394</span> <span class="n">dists</span><span class="o">/</span><span class="n">buster</span><span class="o">/</span><span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">mipsel</span><span class="o">/</span><span class="n">Packages</span>
</code></pre>
<p>What it must be :</p>
<pre><code class="python syntaxhl" data-language="python"><span class="n">MD5sum</span><span class="p">:</span>
<span class="n">aa84d85627f4b4d1e1d8072b238150ca</span> <span class="mi">39099687</span> <span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">s390x</span><span class="o">/</span><span class="n">Packages</span>
<span class="mi">422</span><span class="n">f6f847b971d36d26d724d04e8a0f2</span> <span class="mi">11994723</span> <span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">s390x</span><span class="o">/</span><span class="n">Packages</span><span class="p">.</span><span class="n">gz</span>
<span class="mf">4e01</span><span class="n">fb37905070bf47a186b9321aa507</span> <span class="mi">39689978</span> <span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">ppc64el</span><span class="o">/</span><span class="n">Packages</span>
<span class="mi">15681</span><span class="n">a4fb0e26463cd7eca6e4c8c4082</span> <span class="mi">12128141</span> <span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">ppc64el</span><span class="o">/</span><span class="n">Packages</span><span class="p">.</span><span class="n">gz</span>
<span class="mi">57439</span><span class="n">b8db44e619293e6df3bc0666de0</span> <span class="mi">39371394</span> <span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">mipsel</span><span class="o">/</span><span class="n">Packages</span>
</code></pre>
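<p>In other words, the fix amounts to stripping the <code>dists/&lt;codename&gt;/</code> prefix when writing index paths into the Release file; a minimal illustrative sketch (not pulp_deb's actual code):</p>

```python
def relative_index_path(path, codename):
    """Strip the dists/<codename>/ prefix so the path is relative to the Release file."""
    prefix = f"dists/{codename}/"
    return path[len(prefix):] if path.startswith(prefix) else path
```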
<p>Example of a working Release file : <a href="http://ftp.debian.org/debian/dists/buster/Release" class="external">http://ftp.debian.org/debian/dists/buster/Release</a></p>
<p>FYI : I already applied this fix : <a href="https://github.com/pulp/pulp_deb/pull/173" class="external">https://github.com/pulp/pulp_deb/pull/173</a></p>
<p>Thanks for your help</p>
Debian Support - Issue #6873 (CLOSED - DUPLICATE): Pulp 3 - pulp-deb : APT client not working du...
https://pulp.plan.io/issues/6873 (2020-06-01T19:09:23Z, swisscom)
<p>Dear support team,</p>
<p>The paths in Release files created by Pulp (pulp-deb plugin) are wrong. They contain the full path (starting with dists/...)
instead of a path relative to the Release file.</p>
<p>Example from <a href="http://ftp.debian.org/debian/dists/buster/Release" class="external">http://ftp.debian.org/debian/dists/buster/Release</a> :</p>
<pre><code class="python syntaxhl" data-language="python"><span class="n">MD5Sum</span><span class="p">:</span>
<span class="mi">11</span><span class="n">bc5601662d8b6f5b24a92d28150fee</span> <span class="mi">1363066</span> <span class="n">contrib</span><span class="o">/</span><span class="n">Contents</span><span class="o">-</span><span class="n">amd64</span>
<span class="mi">3059916</span><span class="n">a1fef8d912df26e4e537c87f0</span> <span class="mi">103346</span> <span class="n">contrib</span><span class="o">/</span><span class="n">Contents</span><span class="o">-</span><span class="n">amd64</span><span class="p">.</span><span class="n">gz</span>
<span class="n">ccfc6bac526797636a618f18cdce393c</span> <span class="mi">1081641</span> <span class="n">contrib</span><span class="o">/</span><span class="n">Contents</span><span class="o">-</span><span class="n">arm64</span>
<span class="n">e4e018e33daf3f4f24732834af309669</span> <span class="mi">84796</span> <span class="n">contrib</span><span class="o">/</span><span class="n">Contents</span><span class="o">-</span><span class="n">arm64</span><span class="p">.</span><span class="n">gz</span>
</code></pre>
<p>Example from repo coming from Pulp :</p>
<pre><code class="python syntaxhl" data-language="python"><span class="n">Codename</span><span class="p">:</span> <span class="n">buster</span>
<span class="n">Date</span><span class="p">:</span> <span class="n">Sat</span><span class="p">,</span> <span class="mi">09</span> <span class="n">May</span> <span class="mi">2020</span> <span class="mi">09</span><span class="p">:</span><span class="mi">51</span><span class="p">:</span><span class="mi">02</span> <span class="n">UTC</span>
<span class="n">Architectures</span><span class="p">:</span> <span class="n">s390x</span> <span class="n">ppc64el</span> <span class="n">mipsel</span> <span class="n">mips64el</span> <span class="n">mips</span> <span class="n">i386</span> <span class="n">armhf</span> <span class="n">armel</span> <span class="n">arm64</span> <span class="n">amd64</span>
<span class="n">MD5sum</span><span class="p">:</span>
<span class="n">aa84d85627f4b4d1e1d8072b238150ca</span> <span class="mi">39099687</span> <span class="n">dists</span><span class="o">/</span><span class="n">buster</span><span class="o">/</span><span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">s390x</span><span class="o">/</span><span class="n">Packages</span>
<span class="mi">422</span><span class="n">f6f847b971d36d26d724d04e8a0f2</span> <span class="mi">11994723</span> <span class="n">dists</span><span class="o">/</span><span class="n">buster</span><span class="o">/</span><span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">s390x</span><span class="o">/</span><span class="n">Packages</span><span class="p">.</span><span class="n">gz</span>
<span class="mf">4e01</span><span class="n">fb37905070bf47a186b9321aa507</span> <span class="mi">39689978</span> <span class="n">dists</span><span class="o">/</span><span class="n">buster</span><span class="o">/</span><span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">ppc64el</span><span class="o">/</span><span class="n">Packages</span>
<span class="mi">15681</span><span class="n">a4fb0e26463cd7eca6e4c8c4082</span> <span class="mi">12128141</span> <span class="n">dists</span><span class="o">/</span><span class="n">buster</span><span class="o">/</span><span class="n">main</span><span class="o">/</span><span class="n">binary</span><span class="o">-</span><span class="n">ppc64el</span><span class="o">/</span><span class="n">Packages</span><span class="p">.</span><span class="n">gz</span>
</code></pre>
<p>"dist/buster/" should not appear here as the path here must be relative from the place where the Release file is, to work prpoerly with the apt client.</p> Debian Support - Issue #6593 (CLOSED - NOTABUG): No plugin found: deb_distributorhttps://pulp.plan.io/issues/65932020-04-28T20:12:14Zymadav
<p>I have pulp 2.18 running on RHEL 7.7 with all the pulp Debian packages and plugin packages installed. I was able to create the repo, but when I run the sync command it says no plugin found or missing resources; deleting the repo fails the same way. Below are my pulp packages. Can someone help me here?</p>
<p>python-pulp-rpm-common-2.18.1-1.el7.noarch
pulp-docker-admin-extensions-3.2.2-1.el7.noarch
pulp-deb-admin-extensions-1.8.0-1.el7.noarch
python-pulp-client-lib-2.18.1-2.el7.noarch
pulp-puppet-plugins-2.18.1-1.el7.noarch
python-pulp-docker-common-3.2.2-1.el7.noarch
pulp-server-2.18.1-2.el7.noarch
python-pulp-repoauth-2.18.1-2.el7.noarch
pulp-rpm-admin-extensions-2.18.1-1.el7.noarch
python-pulp-bindings-2.18.1-2.el7.noarch
pulp-rpm-plugins-2.18.1-1.el7.noarch
python-pulp-deb-common-1.8.0-1.el7.noarch
python-pulp-puppet-common-2.18.1-1.el7.noarch
pulp-selinux-2.18.1-2.el7.noarch
pulp-puppet-admin-extensions-2.18.1-1.el7.noarch
python-pulp-common-2.18.1-2.el7.noarch
pulp-admin-client-2.18.1-2.el7.noarch
pulp-docker-plugins-3.2.2-1.el7.noarch
python-pulp-oid_validation-2.18.1-2.el7.noarch
pulp-deb-plugins-1.8.0-1.el7.noarch</p>
<p>The errors I am getting are as below.</p>
<p>pulp-admin deb repo publish run --repo-id xenial-backports-amd64
+----------------------------------------------------------------------+
Publishing Repository [xenial-backports-amd64]
+----------------------------------------------------------------------+</p>
<p>This command may be exited via ctrl+c without affecting the request.</p>
<p>Task Failed</p>
<p>No plugin found: deb_distributor</p>
<p>pulp-admin deb repo delete --repo-id xenial-newrelic-test
This command may be exited via ctrl+c without affecting the request.
[]
Running...
Task Failed
Pulp exception occurred: PulpExecutionException
No plugin found: deb_importer
No plugin found: deb_distributor</p>
Pulp - Issue #6154 (CLOSED - CURRENTRELEASE): Redirect to S3 in pulp-content-app is broken in 3.1.0
https://pulp.plan.io/issues/6154 (2020-02-13T14:29:13Z, osapryki)
<p>pulp-content-app returns a redirect to an invalid S3 URL:</p>
<pre><code>https://bucket.s3.amazonaws.com/artifact/xx/xxx?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=********&X-Amz-Date=20200213T141229Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=********?response-content-disposition=attachment;+filename=pulp-artifact-filename
</code></pre>
<p><code>response-content-disposition</code> is appended to a URL that already has a query string, which results in an invalid URL.</p>
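<p>Appending a parameter to a URL that may already carry a query string is safer with the standard library's URL helpers; a hedged sketch of the idea:</p>

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def add_query_param(url, key, value):
    """Append key=value to the URL's query string, preserving existing parameters."""
    parts = urlsplit(url)
    query = parse_qsl(parts.query, keep_blank_values=True)
    query.append((key, value))
    return urlunsplit(parts._replace(query=urlencode(query)))
```

<p>This joins the new parameter with <code>&amp;</code> instead of introducing a second <code>?</code>, which is the malformation shown above.</p>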
<p>Please refer to the following for a correct example of setting the Response-Content-Disposition parameter with django-storages:</p>
<p><a href="https://github.com/ansible/galaxy/blob/devel/galaxy/api/download/views.py#L64" class="external">https://github.com/ansible/galaxy/blob/devel/galaxy/api/download/views.py#L64</a></p>
Pulp - Issue #5673 (CLOSED - CURRENTRELEASE): Resource reservations are not cleaned up if worker ...
https://pulp.plan.io/issues/5673 (2019-11-06T17:29:15Z, osapryki)
<p>If a pulp worker is killed while executing a task that has reserved resources, the resources are not cleaned up.<br>
All subsequent tasks that use any of the reserved resources are assigned to the same (dead) worker.</p>
<p>Steps to reproduce:</p>
<p>1. Spawn an import_collection task from pulp_ansible (T1).<br>
2. While the task is running, kill the worker (W1).<br>
3. Start another worker (W2).<br>
4. Spawn another import_collection task from pulp_ansible (T2).</p>
<p>Expected behavior:</p>
<p>Task T2 is assigned to worker W2, or cancelled if it was assigned to W1 before cleanup was performed.</p>
<p>Actual behavior:</p>
<p>Task T2 is assigned to worker W1 and remains in the waiting state forever.<br>
Resources are not cleaned up.</p>
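<p>One common way to avoid this class of problem is heartbeat-based cleanup: if a worker misses heartbeats beyond a timeout, its reservations are released. A hypothetical sketch (the names and timeout are assumptions, not Pulp's implementation):</p>

```python
import time

WORKER_TIMEOUT = 30  # seconds without a heartbeat before a worker is presumed dead

def stale_workers(heartbeats, now=None, timeout=WORKER_TIMEOUT):
    """heartbeats maps worker name -> last heartbeat (epoch seconds); return dead workers."""
    now = time.time() if now is None else now
    return [worker for worker, last_seen in heartbeats.items() if now - last_seen > timeout]
```

<p>A periodic task could call this and release the reservations held by any returned worker, so T2 would be reassigned instead of waiting forever.</p>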
<p>Environment:</p>
<p>pulpcore + pulp_ansible.<br>
1 worker</p>
Pulp - Issue #4929 (CLOSED - WORKSFORME): Pulp 2.18 having issues to start workers celery and com...
https://pulp.plan.io/issues/4929 (2019-06-07T04:36:54Z, ymadav)
<p>Team,</p>
<p>We upgraded our pulp version from 2.12 to 2.18 and all the services are up and running, but the worker services on the worker server fail to start and cannot communicate with the rabbitmq server. We have the package versions below installed on the worker for celery and pulp, and I am also posting the pulp errors. Please help us fix this; this is production and we are running out of time. I uploaded the messages file for reference; please let me know if anything else is required.</p>
<p>We previously had python2-celery-4.2.1-3.el7.noarch installed; even with that we were unable to start the services, so we are using python2-celery-4.0.2-7.el7.noarch. The server is running RHEL 7.6.</p>
<p>Mostly I see these errors:</p>
<p>Jun 07 04:17:21 ip-10-12-111-237. celery[319]: from billiard.compat import get_fdmax, close_open_fds<br>
Jun 07 04:17:21 ip-10-12-111-237. celery[319]: ImportError: cannot import name get_fdmax</p>
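<p>A quick, generic probe for this kind of version mismatch (a hedged helper, not part of celery): check that the attribute celery expects actually exists in the installed billiard before starting the workers.</p>

```python
import importlib

def module_provides(module_name, attr):
    """True if module_name imports cleanly and exposes attr."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, attr)
```

<p>Here, <code>module_provides("billiard.compat", "get_fdmax")</code> returning False would point at an incompatible billiard build for the installed celery.</p>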
<p>Installed packages<br>
rpm -qa | grep -i celery<br>
python2-celery-4.0.2-7.el7.noarch<br>
[ymadav@ip-10-12-111-237 ~]$ rpm -qa | egrep -i "amqp|celery|billiard|kombu"<br>
python2-celery-4.0.2-7.el7.noarch<br>
python2-kombu-4.0.2-11.el7.noarch<br>
python-gofer-amqp-2.12.5-1.el7.noarch<br>
python2-amqp-2.2.2-5.el7.noarch<br>
python2-billiard-3.5.0.3-4.el7.x86_64<br>
[ymadav@ip-10-12-111-237 ~]$ rpm -qa | grep -i pulp<br>
python-pulp-python-common-2.0.3-1.el7.noarch<br>
python-pulp-rpm-common-2.18.1-1.el7.noarch<br>
pulp-rpm-plugins-2.18.1-1.el7.noarch<br>
python-pulp-puppet-common-2.18.1-1.el7.noarch<br>
pulp-selinux-2.18.1-2.el7.noarch<br>
pulp-puppet-tools-2.18.1-1.el7.noarch<br>
python-pulp-manifest-2.16.4-1.el7.noarch<br>
python-pulp-docker-common-3.2.2-1.el7.noarch<br>
pulp-docker-plugins-3.2.2-1.el7.noarch<br>
pulp-python-plugins-2.0.3-1.el7.noarch<br>
python-pulp-deb-common-1.8.0-1.el7.noarch<br>
python-pulp-repoauth-2.18.1-2.el7.noarch<br>
python2-solv-0.6.34-2.pulp.el7.x86_64<br>
pulp-deb-plugins-1.8.0-1.el7.noarch<br>
python-pulp-ostree-common-1.4.0-1.el7.noarch<br>
pulp-puppet-plugins-2.18.1-1.el7.noarch<br>
python-pulp-oid_validation-2.18.1-2.el7.noarch<br>
pulp-server-2.18.1-2.el7.noarch<br>
python-pulp-common-2.18.1-2.el7.noarch<br>
libsolv-0.6.34-2.pulp.el7.x86_64<br>
pulp-ostree-plugins-1.4.0-1.el7.noarch</p>
<pre><code>sudo systemctl list-units | grep -i pulp
● pulp_worker-0.service loaded failed failed Pulp Worker #0
● pulp_worker-1.service loaded failed failed Pulp Worker #1
● pulp_worker-2.service loaded failed failed Pulp Worker #2
● pulp_worker-3.service loaded failed failed Pulp Worker #3
pulp_workers.service loaded active exited Pulp Celery Workers
</code></pre>
<p>Jun 07 04:17:21 ip-10-12-111-237. celery[319]: File "/usr/lib/python2.7/site-packages/celery/utils/log.py", line 19, in <module><br>
Jun 07 04:17:21 ip-10-12-111-237. celery[319]: from .term import colored<br>
Jun 07 04:17:21 ip-10-12-111-237. celery[319]: File "/usr/lib/python2.7/site-packages/celery/utils/term.py", line 11, in <module><br>
Jun 07 04:17:21 ip-10-12-111-237. celery[319]: from celery.platforms import isatty<br>
Jun 07 04:17:21 ip-10-12-111-237. celery[319]: File "/usr/lib/python2.7/site-packages/celery/platforms.py", line 21, in <module><br>
Jun 07 04:17:21 ip-10-12-111-237. celery[319]: from billiard.compat import get_fdmax, close_open_fds<br>
Jun 07 04:17:21 ip-10-12-111-237. celery[319]: ImportError: cannot import name get_fdmax<br>
Jun 07 04:17:21 ip-10-12-111-237. systemd[1]: pulp_worker-0.service: main process exited, code=exited, status=1/FAILURE<br>
Jun 07 04:17:21 ip-10-12-111-237. systemd[1]: Unit pulp_worker-0.service entered failed state.<br>
Jun 07 04:17:21 ip-10-12-111-237. systemd[1]: pulp_worker-0.service failed.<br>
<img src="https://pulp.plan.io/attachments/download/518583/clipboard-201906070629-xt6og.png" alt=""></p>
<p>Thanks,<br>
-Yash</p>
Pulp - Issue #4226 (CLOSED - DUPLICATE): Upgrade from 2.17 to 2.18 failed
https://pulp.plan.io/issues/4226 (2018-12-05T07:19:37Z, Poil)
<p>Hi,</p>
<p>There is nothing in the release notes about how to upgrade, so I first did just a "yum update".<br>
When starting the service I got this error:</p>
<pre><code>Dec 5 07:14:53 repos pulp: pulp.server.webservices.application:ERROR: (5296-72992) The database has not been migrated to the current version. Run pulp-manage-db and restart the application.
Dec 5 07:14:53 repos pulp: pulp.server.webservices.application:ERROR: (5296-72992) Traceback (most recent call last):
Dec 5 07:14:53 repos pulp: pulp.server.webservices.application:ERROR: (5296-72992) File "/usr/lib/python2.7/site-packages/pulp/server/webservices/application.py", line 111, in wsgi_application
Dec 5 07:14:53 repos pulp: pulp.server.webservices.application:ERROR: (5296-72992) _initialize_web_services()
Dec 5 07:14:53 repos pulp: pulp.server.webservices.application:ERROR: (5296-72992) File "/usr/lib/python2.7/site-packages/pulp/server/webservices/application.py", line 74, in _initialize_web_services
Dec 5 07:14:53 repos pulp: pulp.server.webservices.application:ERROR: (5296-72992) migration_models.check_package_versions()
Dec 5 07:14:53 repos pulp: pulp.server.webservices.application:ERROR: (5296-72992) File "/usr/lib/python2.7/site-packages/pulp/server/db/migrate/models.py", line 314, in check_package_versions
Dec 5 07:14:53 repos pulp: pulp.server.webservices.application:ERROR: (5296-72992) raise Exception(error_message)
Dec 5 07:14:53 repos pulp: pulp.server.webservices.application:ERROR: (5296-72992) InitializationException: The database has not been migrated to the current version. Run pulp-manage-db and restart the application
</code></pre>
<p>So I ran "sudo -u apache pulp-manage-db"<br>
and got this error:</p>
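<p>As an aside, the failing drop_index step could be made tolerant of an already-missing index; a hedged sketch with a hypothetical <code>collection</code> object modeled loosely on pymongo's interface (not the actual migration code):</p>

```python
def drop_index_if_exists(collection, index_name):
    """Drop the named index, but treat 'index not found' as success instead of halting."""
    try:
        collection.drop_index(index_name)
    except Exception as exc:
        if "index not found" not in str(exc):
            raise
```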
<pre><code>Applying pulp.server.db.migrations version 29
*******************************************************************************
Applying migration pulp.server.db.migrations.0029_applicability_schema_change failed.
Halting migrations due to a migration failure.
command SON([('dropIndexes', u'repo_profile_applicability'), ('index', 'profile_hash_-1_repo_id_-1')]) on namespace pulp_database.$cmd failed: index not found with name [profile_hash_-1_repo_id_-1]
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/pulp/server/db/manage.py", line 239, in main
return _auto_manage_db(options)
File "/usr/lib/python2.7/site-packages/pulp/server/db/manage.py", line 306, in _auto_manage_db
migrate_database(options)
File "/usr/lib/python2.7/site-packages/pulp/server/db/manage.py", line 135, in migrate_database
update_current_version=not options.test)
File "/usr/lib/python2.7/site-packages/pulp/server/db/migrate/models.py", line 189, in apply_migration
migration.migrate()
File "/usr/lib/python2.7/site-packages/pulp/server/db/migrations/0029_applicability_schema_change.py", line 52, in migrate
rpa_collection.drop_index("profile_hash_-1_repo_id_-1")
File "/usr/lib64/python2.7/site-packages/pymongo/collection.py", line 1456, in drop_index
allowable_errors=["ns not found"])
File "/usr/lib64/python2.7/site-packages/pymongo/collection.py", line 205, in _command
read_concern=read_concern)
File "/usr/lib64/python2.7/site-packages/pymongo/pool.py", line 211, in command
read_concern)
File "/usr/lib64/python2.7/site-packages/pymongo/network.py", line 100, in command
helpers._check_command_response(response_doc, msg, allowable_errors)
File "/usr/lib64/python2.7/site-packages/pymongo/helpers.py", line 196, in _check_command_response
raise OperationFailure(msg % errmsg, code, response)
OperationFailure: command SON([('dropIndexes', u'repo_profile_applicability'), ('index', 'profile_hash_-1_repo_id_-1')]) on namespace pulp_database.$cmd failed: index not found with name [profile_hash_-1_repo_id_-1]
</code></pre>
Pulp - Issue #3611 (CLOSED - NOTABUG): Pulp Data Migrated but failed to access through URL
https://pulp.plan.io/issues/3611 (2018-04-27T04:32:27Z, rams)
<p>Hi,</p>
<p>We migrated our pulp-server from the NTT environment to the AWS environment.<br>
We backed up the data in the following way:<br>
- /var/lib/pulp: tarred this data and transferred it<br>
- /var/lib/mongodb: tarred this data and transferred it</p>
<p>On the AWS instance we installed pulp-server (2.6.2), mongodb (version 2.4.14) and python (2.6.6),<br>
untarred the content into the same file system as in the previous environment, and started all services.<br>
We ran the "sudo -u apache pulp-manage-db" command on the empty db,<br>
then restored the data.</p>
<p>But we are not able to access the repositories via a web browser.</p>
<p>The httpd, qpidd, pulp_workers, pulp_celerybeat, and pulp_resource_manager services are up and running.</p>