Issue #8377

Content Migration to Pulp 3 with Katello fails with message "No declared artifact"

Added by fbachmann over 3 years ago. Updated over 3 years ago.

Status:
CLOSED - CURRENTRELEASE
Priority:
Normal
Assignee:
-
Category:
-
Sprint/Milestone:
Start date:
Due date:
Estimated time:
Severity:
2. Medium
Version:
Platform Release:
OS:
CentOS 7
Triaged:
Yes
Groomed:
No
Sprint Candidate:
No
Tags:
Katello
Sprint:
Sprint 93
Quarter:

Description

Greetings.

Members of the Foreman/Katello development team asked me to open an issue here.

Content migration to Pulp 3 using foreman-maintain content prepare fails with the following error message:

ForemanTasks::TaskError: Task 5dfd06d7-4d68-4977-8b2f-63445abc8e25: Katello::Errors::Pulp3Error: No declared artifact with relative path "images/boot.iso" for content "<DistributionTree: pk=6995fa03-af4e-4f86-b324-a9888309a5e7>"

The corresponding entry in Pulp 3’s PostgreSQL database (table rpm_distributiontree) looks like this:

pulpcore=# select * from rpm_distributiontree where content_ptr_id='6995fa03-af4e-4f86-b324-a9888309a5e7';
-[ RECORD 1 ]--------+-------------------------------------
content_ptr_id       | 6995fa03-af4e-4f86-b324-a9888309a5e7
header_version       | 1.2
release_name         | CentOS Linux
release_short        | CentOS
release_version      | 8
release_is_layered   | f
base_product_name    |
base_product_short   |
base_product_version |
arch                 | x86_64
build_timestamp      | 0
instimage            |
mainimage            | images/install.img
discnum              |
totaldiscs           |

I am not sure what to make of this. My synchronized CentOS 8 "os" repository does seem to include an images/boot.iso file.

I have synchronized the CentOS 8 repository from http://mirror.centos.org/centos/7/BaseOS/x86_64/os/.

The following Pulp modules are installed on my system:

pulp-2to3-migration (0.8.0)
pulp-certguard (1.0.3)
pulp-container (2.1.0)
pulp-file (1.3.0)
pulp-rpm (3.7.0)
pulpcore (3.7.3)

With kind regards, Florian


Related issues

Related to Pulp - Issue #8514: Foreman Katello: Unable to migrate pulp2 to pulp3 (CLOSED - DUPLICATE, 04/07/2021)
Actions #1

Updated by iballou over 3 years ago

Hi Florian, did you mean "http://mirror.centos.org/centos/8/BaseOS/x86_64/os/" for the repo URL?

Also here's the link to the Foreman community forum thread that preceded this bug: https://community.theforeman.org/t/katello-3-18-1-content-migration-to-pulp-3-fails-with-no-declared-artifact-message/22645/2

Actions #2

Updated by fbachmann over 3 years ago

Oops, I indeed meant to write "http://mirror.centos.org/centos/8/BaseOS/x86_64/os/" for the repo URL. Sorry...

Actions #3

Updated by fao89 over 3 years ago

  • Project changed from Pulp to Migration Plugin
  • Triaged changed from No to Yes
  • Sprint set to Sprint 92
Actions #4

Updated by ggainey over 3 years ago

@fbachmann - what does the following show on the pulp3 side of the world?

select pulp_id, path, platforms from rpm_image where distribution_tree_id = '6995fa03-af4e-4f86-b324-a9888309a5e7';

On my pulp3 instance after sync'ing CentOS8-Base, I see the following:

pulp=> select pulp_id, path, platforms from rpm_image where distribution_tree_id = '6995fa03-af4e-4f86-b324-a9888309a5e7';
               pulp_id                |           path            |  platforms  
--------------------------------------+---------------------------+-------------
 1360f187-5e59-4337-8368-2859d3fc83d7 | images/pxeboot/vmlinuz    | x86_64, xen
 99b1e311-4fa3-4661-a7ce-6586366d8b16 | images/pxeboot/initrd.img | x86_64, xen
 5e29d099-ff40-4203-acde-b76442af5dea | images/efiboot.img        | x86_64
 83f2e0f1-6f05-4618-be98-232078df5180 | images/boot.iso           | x86_64
(4 rows)

pulp=> 

I'm going to add "sync to pulp2, and migrate" into the experiment next.

Also - on the Pulp2 side of things - is your CentOS8-Base sync'd immediate or on_demand?

Actions #5

Updated by ggainey over 3 years ago

ggainey wrote:

@fbachmann - what does the following show on the pulp3 side of the world?

select pulp_id, path, platforms from rpm_image where distribution_tree_id = '0de942aa-2116-454c-9235-bbdcad17b488';

On my pulp3 instance after sync'ing CentOS8-Base, I see the following:

pulp=> select pulp_id, path, platforms from rpm_image where distribution_tree_id = '0de942aa-2116-454c-9235-bbdcad17b488';
               pulp_id                |           path            |  platforms  
--------------------------------------+---------------------------+-------------
 1360f187-5e59-4337-8368-2859d3fc83d7 | images/pxeboot/vmlinuz    | x86_64, xen
 99b1e311-4fa3-4661-a7ce-6586366d8b16 | images/pxeboot/initrd.img | x86_64, xen
 5e29d099-ff40-4203-acde-b76442af5dea | images/efiboot.img        | x86_64
 83f2e0f1-6f05-4618-be98-232078df5180 | images/boot.iso           | x86_64
(4 rows)

pulp=> 

I'm going to add "sync to pulp2, and migrate" into the experiment next.

Also - on the Pulp2 side of things - is your CentOS8-Base sync'd immediate or on_demand?

One last thing - what do you get when you issue

select * from core_contentartifact where content_id = '0de942aa-2116-454c-9235-bbdcad17b488';

If you've sync'd CentOS8 'on_demand' in Pulp2, the only artifact guaranteed to be there is .treeinfo. On my system, post-migration, as an example:

pulp=> select * from core_contentartifact where content_id = 'fe6d3d49-559a-461a-8348-1fb7ea1e5cc1';
               pulp_id                |         pulp_created          |       pulp_last_updated       |       relative_path       |             artifact_id              |              content_id
--------------------------------------+-------------------------------+-------------------------------+---------------------------+--------------------------------------+--------------------------------------
 45710880-a9ff-4ca3-b5c4-674b9a7e3146 | 2021-03-15 18:22:22.314304+00 | 2021-03-15 18:22:22.314315+00 | images/boot.iso           |                                      | fe6d3d49-559a-461a-8348-1fb7ea1e5cc1
 97249018-ec6c-4f27-abf6-f9b3b08d59df | 2021-03-15 18:22:22.314337+00 | 2021-03-15 18:22:22.314345+00 | images/efiboot.img        |                                      | fe6d3d49-559a-461a-8348-1fb7ea1e5cc1
 ca09fcf1-6574-4869-85c7-1769994b2d16 | 2021-03-15 18:22:22.314362+00 | 2021-03-15 18:22:22.314368+00 | images/pxeboot/initrd.img |                                      | fe6d3d49-559a-461a-8348-1fb7ea1e5cc1
 f896ed00-8b07-4c5b-b7d8-754aa7f9c292 | 2021-03-15 18:22:22.314384+00 | 2021-03-15 18:22:22.314391+00 | images/pxeboot/vmlinuz    |                                      | fe6d3d49-559a-461a-8348-1fb7ea1e5cc1
 3ec58c03-3c03-45eb-a119-a8c872bc89f2 | 2021-03-15 18:22:22.314407+00 | 2021-03-15 18:22:22.314413+00 | images/install.img        |                                      | fe6d3d49-559a-461a-8348-1fb7ea1e5cc1
 0189d273-0a39-4a2d-baba-f8c39894b5f5 | 2021-03-15 18:22:22.314428+00 | 2021-03-15 18:22:22.314434+00 | .treeinfo                 | 90e18a4a-58f9-4ab7-acb0-5664f6f3ff82 | fe6d3d49-559a-461a-8348-1fb7ea1e5cc1
(6 rows)

pulp=> 

Do you still have, or can you reproduce, the logs from around when this failure happens? Is it pulp complaining, or katello?

Actions #6

Updated by fbachmann over 3 years ago

Hi, thanks for looking into this problem.

ggainey wrote:

ggainey wrote:

@fbachmann - what does the following show on the pulp3 side of the world?

select pulp_id, path, platforms from rpm_image where distribution_tree_id = '0de942aa-2116-454c-9235-bbdcad17b488';

Looks like what you posted above:

# select pulp_id, path, platforms from rpm_image where distribution_tree_id = '6995fa03-af4e-4f86-b324-a9888309a5e7';
               pulp_id                |           path            |  platforms
--------------------------------------+---------------------------+-------------
 134ebee3-5637-4827-9617-e598796d96b3 | images/boot.iso           | x86_64
 e52d6b37-dcec-4532-83ee-2df4b775bdd0 | images/efiboot.img        | x86_64
 74bb8798-1c6e-4976-8d0e-0293fdbd43d8 | images/pxeboot/initrd.img | x86_64, xen
 a5a23885-e830-4433-9ad1-fa3e92c491f6 | images/pxeboot/vmlinuz    | x86_64, xen
(4 rows)

Also - on the Pulp2 side of things - is your CentOS8-Base sync'd immediate or on_demand?

Immediate

One last thing - what do you get when you issue

select * from core_contentartifact where content_id = '0de942aa-2116-454c-9235-bbdcad17b488';
# select * from core_contentartifact where content_id = '6995fa03-af4e-4f86-b324-a9888309a5e7';
               pulp_id                |         pulp_created          |       pulp_last_updated       |       relative_path       |             artifact_id              |              content_id
--------------------------------------+-------------------------------+-------------------------------+---------------------------+--------------------------------------+--------------------------------------
 323d4eb1-3871-4f96-a36a-d11c99499aa3 | 2021-03-04 11:25:54.692017+01 | 2021-03-04 11:25:54.692029+01 | images/boot.iso           | a28bb69d-d01b-41d8-8837-ed59d9e79f60 | 6995fa03-af4e-4f86-b324-a9888309a5e7
 3bde7ca5-91ed-44d0-a77d-f9b0908f940e | 2021-03-04 11:25:54.692061+01 | 2021-03-04 11:25:54.692073+01 | images/efiboot.img        | 4768891c-2c7e-479a-af26-b339556ce05c | 6995fa03-af4e-4f86-b324-a9888309a5e7
 621c1310-df64-40d3-8427-73c59a574686 | 2021-03-04 11:25:54.692193+01 | 2021-03-04 11:25:54.692206+01 | images/install.img        | 9928d035-9614-4940-a9bd-b47215f56c61 | 6995fa03-af4e-4f86-b324-a9888309a5e7
 7dfee54c-6412-42f8-a722-5a132671b36c | 2021-03-04 11:25:54.692105+01 | 2021-03-04 11:25:54.692118+01 | images/pxeboot/initrd.img | c4088184-50a9-480b-b5ec-24c949ddd825 | 6995fa03-af4e-4f86-b324-a9888309a5e7
 f493333d-79fa-49ef-8056-c365ff0b1804 | 2021-03-04 11:25:54.692149+01 | 2021-03-04 11:25:54.692162+01 | images/pxeboot/vmlinuz    | 7526306a-1454-4f11-a9bc-42e17dd6797f | 6995fa03-af4e-4f86-b324-a9888309a5e7
 978d8ce0-7a98-4d76-93ba-3a8cbca96c7b | 2021-03-04 11:25:54.692238+01 | 2021-03-04 11:25:54.69225+01  | .treeinfo                 | 11693a6b-56c6-4371-81e9-97ed753027c8 | 6995fa03-af4e-4f86-b324-a9888309a5e7
(6 rows)

Do you still have, or can you reproduce, the logs from around when this failure happens? Is it pulp complaining, or katello?

I am afraid I do not know enough about the inner workings of either Katello or Pulp to answer this question with confidence.

This is from pulp.log:

Mar  4 11:26:00 foreman pulpcore-api: - - [04/Mar/2021:10:26:00 +0000] "GET /pulp/api/v3/task-groups/5c7decea-f2be-4d4c-9b78-b1d19fc9f875/ HTTP/1.1" 200 440 "-" "OpenAPI-Generator/3.7.1/ruby"
Mar  4 11:26:05 foreman pulpcore-worker-4: pulp: rq.worker:ERROR: Traceback (most recent call last):
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/rq/worker.py", line 936, in perform_job
Mar  4 11:26:05 foreman pulpcore-worker-4: rv = job.perform()
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/rq/job.py", line 684, in perform
Mar  4 11:26:05 foreman pulpcore-worker-4: self._result = self._execute()
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/rq/job.py", line 690, in _execute
Mar  4 11:26:05 foreman pulpcore-worker-4: return self.func(*self.args, **self.kwargs)
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulp_2to3_migration/app/tasks/migrate.py", line 81, in migrate_from_pulp2
Mar  4 11:26:05 foreman pulpcore-worker-4: migrate_content(plan, skip_corrupted=skip_corrupted)
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulp_2to3_migration/app/migration.py", line 47, in migrate_content
Mar  4 11:26:05 foreman pulpcore-worker-4: plugin.migrator.migrate_content_to_pulp3(skip_corrupted=skip_corrupted)
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulp_2to3_migration/app/plugin/rpm/migrator.py", line 145, in migrate_content_to_pulp3
Mar  4 11:26:05 foreman pulpcore-worker-4: loop.run_until_complete(dm.create())
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib64/python3.6/asyncio/base_events.py", line 484, in run_until_complete
Mar  4 11:26:05 foreman pulpcore-worker-4: return future.result()
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulp_2to3_migration/app/plugin/content.py", line 90, in create
Mar  4 11:26:05 foreman pulpcore-worker-4: await pipeline
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 225, in create_pipeline
Mar  4 11:26:05 foreman pulpcore-worker-4: await asyncio.gather(*futures)
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 43, in __call__
Mar  4 11:26:05 foreman pulpcore-worker-4: await self.run()
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/artifact_stages.py", line 244, in run
Mar  4 11:26:05 foreman pulpcore-worker-4: RemoteArtifact.objects.bulk_get_or_create(self._needed_remote_artifacts(batch))
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/artifact_stages.py", line 293, in _needed_remote_artifacts
Mar  4 11:26:05 foreman pulpcore-worker-4: msg.format(rp=content_artifact.relative_path, c=d_content.content)
Mar  4 11:26:05 foreman pulpcore-worker-4: ValueError: No declared artifact with relative path "images/boot.iso" for content "<DistributionTree: pk=6995fa03-af4e-4f86-b324-a9888309a5e7>"
Mar  4 11:26:05 foreman pulpcore-worker-4: Traceback (most recent call last):
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/rq/worker.py", line 936, in perform_job
Mar  4 11:26:05 foreman pulpcore-worker-4: rv = job.perform()
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/rq/job.py", line 684, in perform
Mar  4 11:26:05 foreman pulpcore-worker-4: self._result = self._execute()
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/rq/job.py", line 690, in _execute
Mar  4 11:26:05 foreman pulpcore-worker-4: return self.func(*self.args, **self.kwargs)
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulp_2to3_migration/app/tasks/migrate.py", line 81, in migrate_from_pulp2
Mar  4 11:26:05 foreman pulpcore-worker-4: migrate_content(plan, skip_corrupted=skip_corrupted)
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulp_2to3_migration/app/migration.py", line 47, in migrate_content
Mar  4 11:26:05 foreman pulpcore-worker-4: plugin.migrator.migrate_content_to_pulp3(skip_corrupted=skip_corrupted)
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulp_2to3_migration/app/plugin/rpm/migrator.py", line 145, in migrate_content_to_pulp3
Mar  4 11:26:05 foreman pulpcore-worker-4: loop.run_until_complete(dm.create())
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib64/python3.6/asyncio/base_events.py", line 484, in run_until_complete
Mar  4 11:26:05 foreman pulpcore-worker-4: return future.result()
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulp_2to3_migration/app/plugin/content.py", line 90, in create
Mar  4 11:26:05 foreman pulpcore-worker-4: await pipeline
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 225, in create_pipeline
Mar  4 11:26:05 foreman pulpcore-worker-4: await asyncio.gather(*futures)
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 43, in __call__
Mar  4 11:26:05 foreman pulpcore-worker-4: await self.run()
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/artifact_stages.py", line 244, in run
Mar  4 11:26:05 foreman pulpcore-worker-4: RemoteArtifact.objects.bulk_get_or_create(self._needed_remote_artifacts(batch))
Mar  4 11:26:05 foreman pulpcore-worker-4: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/artifact_stages.py", line 293, in _needed_remote_artifacts
Mar  4 11:26:05 foreman pulpcore-worker-4: msg.format(rp=content_artifact.relative_path, c=d_content.content)
Mar  4 11:26:05 foreman pulpcore-worker-4: ValueError: No declared artifact with relative path "images/boot.iso" for content "<DistributionTree: pk=6995fa03-af4e-4f86-b324-a9888309a5e7>"
Mar  4 11:26:05 foreman pulpcore-worker-4: pulp: rq.worker:INFO: Cleaning registries for queue: 17850@foreman.ubl-is.de
Mar  4 11:26:05 foreman pulpcore-worker-4: pulp: rq.worker:INFO: 17850@foreman.ubl-is.de: a567276e-d8d8-4f68-994c-a2b05b12e052
Mar  4 11:26:05 foreman pulpcore-worker-4: pulp: rq.worker:INFO: 17850@foreman.ubl-is.de: Job OK (a567276e-d8d8-4f68-994c-a2b05b12e052)
Mar  4 11:26:16 foreman pulpcore-api: - - [04/Mar/2021:10:26:16 +0000] "GET /pulp/api/v3/tasks/4ad09597-f027-404e-b2e2-4072b41cc2b1/ HTTP/1.1" 200 7314 "-" "OpenAPI-Generator/3.7.1/ruby"

This is from Foreman's production.log:

2021-03-04T11:26:13 [I|app|5d4d3b08] Completed 200 OK in 50ms (Views: 35.0ms | ActiveRecord: 9.0ms | Allocations: 33231)
2021-03-04T11:26:15 [I|app|217664cb] Started GET "/notification_recipients" for 127.0.0.1 at 2021-03-04 11:26:15 +0100
2021-03-04T11:26:15 [I|app|217664cb] Processing by NotificationRecipientsController#index as JSON
2021-03-04T11:26:15 [I|app|217664cb] Completed 200 OK in 8ms (Views: 0.1ms | ActiveRecord: 2.3ms | Allocations: 2425)
2021-03-04T11:26:16 [E|bac|] No declared artifact with relative path "images/boot.iso" for content "<DistributionTree: pk=6995fa03-af4e-4f86-b324-a9888309a5e7>" (Katello::Errors::Pulp3Error)
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/lib/actions/pulp3/abstract_async_task.rb:102:in `block in check_for_errors'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/lib/actions/pulp3/abstract_async_task.rb:100:in `each'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/lib/actions/pulp3/abstract_async_task.rb:100:in `check_for_errors'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/lib/actions/pulp3/abstract_async_task.rb:133:in `poll_external_task'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action/polling.rb:100:in `poll_external_task_with_rescue'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action/polling.rb:22:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action/cancellable.rb:14:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/lib/actions/pulp3/abstract_async_task.rb:10:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:571:in `block (3 levels) in execute_run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:32:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/lib/actions/middleware/remote_action.rb:16:in `block in run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/lib/actions/middleware/remote_action.rb:40:in `block in as_remote_user'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/models/katello/concerns/user_extensions.rb:21:in `cp_config'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/lib/actions/middleware/remote_action.rb:27:in `as_cp_user'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/lib/actions/middleware/remote_action.rb:39:in `as_remote_user'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1/app/lib/actions/middleware/remote_action.rb:16:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/rails_executor_wrap.rb:14:in `block in run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/execution_wrapper.rb:88:in `wrap'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/rails_executor_wrap.rb:13:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action/progress.rb:31:in `with_progress_calculation'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action/progress.rb:17:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_request_id.rb:15:in `block in run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_request_id.rb:49:in `restore_current_request_id'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_request_id.rb:15:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_timezone.rb:15:in `block in run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_timezone.rb:44:in `restore_curent_timezone'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_timezone.rb:15:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_user.rb:15:in `block in run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_user.rb:44:in `restore_curent_user'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_user.rb:15:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_taxonomies.rb:15:in `block in run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_taxonomies.rb:45:in `restore_current_taxonomies'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-3.0.3/app/lib/actions/middleware/keep_current_taxonomies.rb:15:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:32:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/world.rb:31:in `execute'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:570:in `block (2 levels) in execute_run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:569:in `catch'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:569:in `block in execute_run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `block in with_error_handling'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `catch'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `with_error_handling'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:564:in `execute_run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:285:in `execute'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:18:in `block (2 levels) in execute'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/abstract.rb:167:in `with_meta_calculation'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:17:in `block in execute'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:32:in `open_action'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:16:in `execute'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/director.rb:93:in `execute'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/executors/sidekiq/worker_jobs.rb:11:in `block (2 levels) in perform'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/executors.rb:18:in `run_user_code'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/executors/sidekiq/worker_jobs.rb:9:in `block in perform'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/executors/sidekiq/worker_jobs.rb:25:in `with_telemetry'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/executors/sidekiq/worker_jobs.rb:8:in `perform'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/executors/sidekiq/serialization.rb:27:in `perform'
 | [ sidekiq ]
 | [ concurrent-ruby ]
2021-03-04T11:26:16 [I|bac|] Task {label: Actions::Pulp3::ContentMigration, id: 5dfd06d7-4d68-4977-8b2f-63445abc8e25, execution_plan_id: 2a385a9b-4109-4297-8aa9-13db7f126edb} state changed: stopped  result: warning
2021-03-04T11:26:16 [I|bac|] Task {label: Actions::Pulp3::ContentMigration, id: 5dfd06d7-4d68-4977-8b2f-63445abc8e25, execution_plan_id: 2a385a9b-4109-4297-8aa9-13db7f126edb} state changed: stopped  result: warning
2021-03-04T11:26:18 [I|app|bb246384] Started GET "/foreman_tasks/api/tasks/5dfd06d7-4d68-4977-8b2f-63445abc8e25/details?include_permissions" for 127.0.0.1 at 2021-03-04 11:26:18 +0100


Please let me know if you need anything else.

Kind regards, Florian

Actions #7

Updated by ggainey over 3 years ago

Hm. OK, well this is a lot of data for me to dig into, thanks much! I'll see if I can figure out what's going on here - stay tuned!

Actions #8

Updated by rchan over 3 years ago

  • Sprint changed from Sprint 92 to Sprint 93
Actions #9

Updated by ggainey over 3 years ago

@fbachmann I have tried a number of different scenarios to reproduce this, and have had zero luck to date. I'm running on CentOS 7, pulp 2.21.5, pulpcore 3.11dev, pulp_rpm 3.10dev, migration 0.10.0dev. While we've done some work since 3.7 in the area that generates the exception, none of it looks like it would address this particular failure.

Is there any chance you could upgrade to pulpcore 3.9 or 3.11 and retry? RPMs are available from https://yum.theforeman.org/pulpcore/ . I'm kind of out of ideas at this point :(

Actions #10

Updated by ggainey over 3 years ago

Just to get the various experiments down in one place - I've tried various combinations of immediate and on_demand on the Pulp2 side, including starting with one, migrating, switching to the other and re-syncing, and then re-migrating. None of these combinations has reproduced the failure so far.

Starting with clean DBs on both sides, here's a set of commands to do the basic sync-immediate-and-migrate:

pulp-admin login -u admin -p admin
pulp-admin rpm repo create --repo-id c8 --feed http://mirror.centos.org/centos/8/BaseOS/x86_64/os/ --download-policy immediate
pulp-admin rpm repo sync run --repo-id c8

pulp migration plan create --plan '{"plugins": [{"type": "rpm"}]}'
pulp migration plan run --href HREF-FROM-ABOVE

Between experiments, clean out both sides:

pulp-admin rpm repo delete --repo-id c8
pulp-admin orphan remove --all
pclean  # dev-env alias that drops/recreates Pulp3 db

To test on_demand, replace the repo-create line above (the one with "--download-policy immediate") with:

pulp-admin rpm repo create --repo-id c8 --feed http://mirror.centos.org/centos/8/BaseOS/x86_64/os/ --download-policy on_demand

Do the first experiment, then switch Pulp2 from immediate to on_demand, resync and redo the migration:

pulp-admin login -u admin -p admin
pulp-admin rpm repo create --repo-id c8 --feed http://mirror.centos.org/centos/8/BaseOS/x86_64/os/ --download-policy immediate
pulp-admin rpm repo sync run --repo-id c8

pulp migration plan create --plan '{"plugins": [{"type": "rpm"}]}'
pulp migration plan run --href HREF-FROM-ABOVE

pulp-admin rpm repo update --repo-id c8 --download-policy on_demand
pulp-admin rpm repo sync run --repo-id c8
pulp migration plan run --href HREF-FROM-ABOVE

Repeat the last test, starting with on_demand and switching to immediate.

Digging into the CentOS remote repo at http://mirror.centos.org/centos/8/BaseOS/x86_64/os/, everything looks fine. It was last updated on 16-MAR, and OP didn't hit the problem until 22-MAR, so it's unlikely (or at least less likely) that some oddness in the remote is causing us trouble.
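
For anyone who wants to eyeball that remote metadata themselves, here's a minimal Python sketch (the URL is the one from this thread; .treeinfo is INI-formatted, so configparser can read it directly):

# Fetch a .treeinfo file and dump every section and key it contains.
import configparser
import urllib.request

URL = "http://mirror.centos.org/centos/8/BaseOS/x86_64/os/.treeinfo"

with urllib.request.urlopen(URL) as response:
    raw = response.read().decode("utf-8")

parser = configparser.ConfigParser()
parser.read_string(raw)

for section in parser.sections():
    print(f"[{section}]")
    for key, value in parser.items(section):
        print(f"  {key} = {value}")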

OP's db queries on the Pulp3 side show the data I would expect for a distribution-tree.

Hopefully we can get some more eyes on the above that can spot a hole I have missed.

Actions #11

Updated by ttereshc over 3 years ago

ggainey, I have one unconfirmed theory, a bit painful one, so bear with me.

What if... it's the AppStream kickstart repo interfering with the BaseOS one?
If both are present in pulp2 and one of them is migrated first...
They both would be treated as the same distribution tree, I think.
I still can't fully explain how to get to what OP sees, but my first instinct is to test those two together.

See the CentOS 8 treeinfos:
http://mirror.centos.org/centos/8/BaseOS/x86_64/os/.treeinfo
http://mirror.centos.org/centos/8/AppStream/x86_64/kickstart/.treeinfo
And the uniqueness constraint for a distribution tree: https://github.com/pulp/pulp_rpm/blob/master/pulp_rpm/app/models/distribution.py#L133-L140
And now I'll tell you that at migration time build_timestamp is set to 0, to let Pulp 3 bring in addons on the upcoming sync.
(Maybe it would be better to set build_timestamp to build_timestamp minus 1, instead of zero?)

Not sure if it's the reason for this specific ticket but maybe still worth testing.
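
To make the suspected collision concrete, here is a toy Python sketch (the field subset and timestamp values are illustrative, not the verbatim constraint from the distribution.py link above): once migration zeroes build_timestamp, the only field that distinguished the two trees is gone and they dedupe into one DistributionTree.

# Illustrative subset of the DistributionTree uniqueness fields.
UNIQUE_FIELDS = (
    "header_version", "release_name", "release_short",
    "release_version", "arch", "build_timestamp",
)

def natural_key(tree):
    # Two trees with equal natural keys are treated as the same content unit.
    return tuple(tree[field] for field in UNIQUE_FIELDS)

base_os = {
    "header_version": "1.2", "release_name": "CentOS Linux",
    "release_short": "CentOS", "release_version": "8",
    "arch": "x86_64", "build_timestamp": 1111111111,  # made-up value
}
appstream_kickstart = dict(base_os, build_timestamp=2222222222)  # made-up value

# Before migration: the timestamps differ, so the trees are distinct.
assert natural_key(base_os) != natural_key(appstream_kickstart)

# Migration sets build_timestamp to 0 on both (see above) ...
for tree in (base_os, appstream_kickstart):
    tree["build_timestamp"] = 0

# ... and now they collapse into a single DistributionTree.
assert natural_key(base_os) == natural_key(appstream_kickstart)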

Actions #12

Updated by ggainey over 3 years ago

ttereshc wrote:

ggainey, I have one unconfirmed theory, a bit painful one, so bear with me.

That's a great idea!

I have access to an empty 3.7 system. While I'm getting it set up, I can run this test in my dev-env; maybe it'll uncover something.

More eyes are always better!

Actions #13

Updated by ggainey over 3 years ago

Aaaand ttereshc wins the prize!

If you have CentOS8-Base and CentOS8-AppStream-Kickstart sync'd, and you migrate them, you trigger this failure. I've recreated this in our dev-env with current master, which makes debugging MUCH MUCH easier.

@fbachmann - there is Hope! Also, I suspect that removing the 8-AppStream-Kickstart tree from your Pulp2 would be a workaround, if you can't wait.

Actions #14

Updated by ttereshc over 3 years ago

ggainey, great news. Please consider testing RHEL8 repos as well, since the treeinfo files might slightly differ. It would be good to understand if it's CentOS only, or RHEL as well. Thanks for working on it!

Actions #15

Updated by ggainey over 3 years ago

commit 072677 in https://github.com/pulp/pulpcore-plugin introduced an optimization that may be faster, but is (in this specific case) wrong. AppStream has a .treeinfo, but no artifacts. The prefetch logic added in 072677 results in code that attempts to find the remote for artifacts associated with the BaseOS tree, which AppStream doesn't know about, and so we throw the error.

See #4404 for details on the (substantial) performance impact. Investigation into the right way to address this, without losing the performance-gain, continues.
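
For readers following along, a toy Python illustration of the failure mode (names here are simplified stand-ins, not the real pulpcore stages API): the batch-wide prefetch hands the shared DistributionTree's full artifact list to the declared-content entry that arrived via AppStream, which only declared .treeinfo, so the lookup raises.

class DeclaredContent:
    """Stand-in for a declarative-content entry in the migration pipeline."""
    def __init__(self, name, declared_paths):
        self.name = name
        self.declared_paths = set(declared_paths)

def needed_remote_artifacts(d_content, prefetched_paths):
    # Mimics the check that raises in _needed_remote_artifacts().
    for relative_path in prefetched_paths:
        if relative_path not in d_content.declared_paths:
            raise ValueError(
                f'No declared artifact with relative path "{relative_path}" '
                f'for content "{d_content.name}"'
            )
        # ... otherwise a RemoteArtifact would be created here ...

# The dist-tree content unit is shared, so the prefetch returns the full
# artifact list that was recorded when BaseOS was migrated first:
shared_paths = [".treeinfo", "images/boot.iso", "images/efiboot.img"]

appstream_entry = DeclaredContent("DistributionTree (via AppStream kickstart)",
                                  [".treeinfo"])
needed_remote_artifacts(appstream_entry, shared_paths)  # raises ValueError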

Actions #16

Updated by ggainey over 3 years ago

  • Project changed from Migration Plugin to Pulp
Actions #17

Updated by pulpbot over 3 years ago

  • Status changed from NEW to POST

Added by ggainey over 3 years ago

Revision 48b365d0 | View on GitHub

Fixed artifact_stages edge case for multi-artifact/multi-remote batches.

Only encountered when 2to3-migrating specific content in specific orders.

fixes #8377 [nocoverage]

Actions #18

Updated by ggainey over 3 years ago

  • Status changed from POST to MODIFIED
Actions #20

Updated by mdellweg over 3 years ago

  • Sprint/Milestone set to 3.12.0
Actions #21

Updated by pulpbot over 3 years ago

  • Status changed from MODIFIED to CLOSED - CURRENTRELEASE
Actions #22

Updated by ggainey over 3 years ago

  • Related to Issue #8514: Foreman Katello: Unable to migrate pulp2 to pulp3 added
