Story #5522

Story #5517: [EPIC] Automation Hub Release Blockers

As a user, I can filter ImportCollection response messages by date to limit the set of messages returned

Added by bmbouter over 4 years ago. Updated over 3 years ago.

Status: CLOSED - CURRENTRELEASE
Priority: Normal
Assignee:
Sprint/Milestone: -
Start date:
Due date:
% Done: 100%
Estimated time:
Platform Release:
Groomed: Yes
Sprint Candidate: No
Tags:
Sprint: Sprint 60
Quarter:

Description

Problem

Each time the user fetches logging output from GALAXY_API_ROOT/v3/imports/collections/<uuid:pk>/, it is served by this viewset.

The issue is that the user only wants to receive the new log messages, but they get all of the log messages on every request.
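For illustration only, the polling pattern looks roughly like this (a sketch using the requests library; the URL, the UUID, and the "messages" key in the response body are assumptions, not the documented API):

```python
import requests

# Hypothetical endpoint; GALAXY_API_ROOT and the import UUID depend on the deployment.
url = "http://localhost/api/galaxy/v3/imports/collections/8f7e9d1c-1234-4f6a-9876-0123456789ab/"

# Poll twice while the import task is still emitting log messages.
first = requests.get(url).json()
second = requests.get(url).json()

# Today both responses carry the complete message list, so the client keeps
# re-downloading and re-processing messages it has already seen.
assert len(second["messages"]) >= len(first["messages"])
```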

Solution

Restructure the CollectionImport model as follows (a rough sketch follows the list):

1. Remove the "messages" attribute.
2. Create three new attributes:

  • message <--- long string, required
  • level <--- a choice field enumerating the standard Python logging levels
  • time <--- configured to auto-record at save time using the Django ORM option for that (auto_now_add)

3. Remove the "add_log_record" method.
4. Port logutils.CollectionImportHandler to create a CollectionImport for each call to emit().
5. Rename CollectionImport to CollectionImportMessage, and update migrations accordingly.
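Here is a rough sketch of what the restructured model and ported handler from the list above might look like. The field names follow the list; the logging-level choices, the relation back to the task, and everything else are assumptions rather than the actual implementation:

```python
import logging

from django.db import models


class CollectionImportMessage(models.Model):
    """One log record emitted during a collection import (sketch only)."""

    LEVEL_CHOICES = [
        (logging.getLevelName(lvl), logging.getLevelName(lvl))
        for lvl in (logging.DEBUG, logging.INFO, logging.WARNING, logging.ERROR, logging.CRITICAL)
    ]

    # Assumed relation back to the import task; the real relation may differ.
    task = models.ForeignKey("core.Task", on_delete=models.CASCADE, related_name="messages")

    message = models.TextField()                                    # long string, required
    level = models.CharField(max_length=10, choices=LEVEL_CHOICES)  # standard Python logging levels
    time = models.DateTimeField(auto_now_add=True)                  # auto-recorded at save


class CollectionImportHandler(logging.Handler):
    """Ported handler: persist one CollectionImportMessage per emit() call."""

    def __init__(self, task, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.task = task

    def emit(self, record):
        CollectionImportMessage.objects.create(
            task=self.task,
            message=self.format(record),
            level=record.levelname,
            # `time` is filled in automatically by auto_now_add.
        )
```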

Actions #1

Updated by bmbouter over 4 years ago

  • Parent issue set to #5517
Actions #2

Updated by bmbouter over 4 years ago

  • Subject changed from As a user, I can provide a 'since' attribute to filter response data for ImportCollection to As a user, I can filter ImportCollection response messages by date to limit the set of messages returned
  • Description updated (diff)
Actions #3

Updated by bmbouter over 4 years ago

  • Description updated (diff)
Actions #4

Updated by daviddavis over 4 years ago

  • Groomed changed from No to Yes
  • Sprint set to Sprint 60
Actions #5

Updated by daviddavis over 4 years ago

  • Status changed from NEW to ASSIGNED
  • Assignee set to daviddavis
Actions #6

Updated by daviddavis over 4 years ago

For the time field, it looks like we currently use the log record's created field[0]. Will using the db record creation timestamp cause different values/ordering?

[0] https://github.com/pulp/pulp_ansible/blob/055305d7b928f9cef84bacb6b086ea9a41b23a89/pulp_ansible/app/models.py#L70

Actions #7

Updated by bmbouter over 4 years ago

daviddavis wrote:

For the time field, it looks like we currently use the log record's created field[0]. Will using the db record creation timestamp cause different values/ordering?

[0] https://github.com/pulp/pulp_ansible/blob/055305d7b928f9cef84bacb6b086ea9a41b23a89/pulp_ansible/app/models.py#L70

It likely will change the values but not the ordering (I think). Messages created one after the other would still be time-sorted correctly if they are sorted on that field by default.

The values changing is, I think, OK because two representations of the same moment in time can be translated. I'm making this up, but say the logger uses an epoch timestamp while the DB uses a strftime()-style timestamp: the user would submit whatever type the API requires, and the internals would use its datetime equivalent when querying the DB, so it should "just work".

Maybe I'm missing something though; I'm not working on the code directly.
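For illustration, converting a log record's epoch timestamp into the timezone-aware datetime the Django ORM stores (and back) preserves the instant, which is why the ordering should survive the switch. A minimal sketch, not tied to the actual code:

```python
import logging
from datetime import datetime, timezone

# A throwaway log record just to get a `created` epoch timestamp.
record = logging.LogRecord("pulp_ansible", logging.INFO, __file__, 0, "importing", None, None)

# Epoch float -> timezone-aware datetime (what the ORM would store).
as_datetime = datetime.fromtimestamp(record.created, tz=timezone.utc)

# Round-tripping keeps the same instant, so sorting/filtering by either
# representation yields the same order.
assert abs(as_datetime.timestamp() - record.created) < 1e-6
```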

Actions #8

Updated by daviddavis over 4 years ago

Follow up question: do we need a time field since it duplicates the pulp_created field?

Actions #9

Updated by bmbouter over 4 years ago

daviddavis wrote:

Follow up question: do we need a time field since it duplicates the pulp_created field?

I don't think we need it, it would be redundant.

Actions #10

Updated by daviddavis over 4 years ago

This is a bit more complex than I thought. Currently, there is a collection imports endpoint[0]. I thought I could "fake" a CollectionImport in the API by using the collection import Task instead. However, the design is complex and not very straightforward. Also, it would mean that if galaxy ever wanted to store collection import data, they would have no place to store it. I think it might be better to keep the CollectionImport model instead and create a new CollectionImportMessage model.

[0] https://github.com/pulp/pulp_ansible/blob/078942a1e695135af09fcd0f4759824d8a3df498/pulp_ansible/app/urls.py#L67-L71

Actions #11

Updated by daviddavis over 4 years ago

I added a since filter to just filter the messages as they are now. I think this is probably the easiest/simplest solution:

https://github.com/pulp/pulp_ansible/pull/229
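For reference, a minimal sketch of what such a since filter could look like, assuming each stored message is a dict with a float epoch "time" key (see the linked PR for the actual implementation):

```python
from datetime import datetime


def filter_messages(messages, since=None):
    """Return only the messages logged at or after `since` (an ISO 8601 string).

    Sketch only; the message structure and parameter handling are assumptions.
    """
    if since is None:
        return messages
    cutoff = datetime.fromisoformat(since).timestamp()
    return [m for m in messages if m["time"] >= cutoff]


# A client polls and only processes messages it has not seen yet.
messages = [
    {"level": "INFO", "message": "starting import", "time": 1570000000.0},
    {"level": "INFO", "message": "import finished", "time": 1570000060.0},
]
print(filter_messages(messages, since="2019-10-02T07:07:00+00:00"))
# -> [{'level': 'INFO', 'message': 'import finished', 'time': 1570000060.0}]
```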

Added by daviddavis over 4 years ago

Revision 395ba61e | View on GitHub

Add since filter for collection import messages

fixes #5522 https://pulp.plan.io/issues/5522

Actions #12

Updated by daviddavis over 4 years ago

  • Status changed from ASSIGNED to MODIFIED
  • % Done changed from 0 to 100
Actions #13

Updated by bmbouter over 3 years ago

  • Status changed from MODIFIED to CLOSED - CURRENTRELEASE
