Story #5522
Parent: Story #5517 (closed): [EPIC] Automation Hub Release Blockers
As a user, I can filter ImportCollection response messages by date to limit the set of messages returned
Description
Problem
Each time the user fetches logging output from GALAXY_API_ROOT/v3/imports/collections/<uuid:pk>/, it comes from this viewset.
The issue is that the user wants to receive only the new log messages, but they get all of the logs on every request.
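For illustration, a rough sketch of that polling pattern; the host, the uuid value, and the "messages" key in the response body are assumptions, not taken from the pulp_ansible code.

    # Hypothetical sketch of the current behavior: every poll of the import
    # detail endpoint returns the full log, not just messages since the last poll.
    import requests

    GALAXY_API_ROOT = "https://galaxy.example.com/api/"  # placeholder host
    url = GALAXY_API_ROOT + "v3/imports/collections/8a9d41c3-0000-0000-0000-000000000000/"

    for _ in range(2):
        data = requests.get(url).json()
        # Each poll prints the complete message list seen so far, not just new entries.
        print(len(data["messages"]))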
Solution
Rework the CollectionImport model (see the sketch after this list) by:
1. removing the "messages" attribute
2. creating three new attributes:
- message <--- a long string, required
- level <--- a choice field with the standard Python logging levels enumerated
- time <--- auto-recorded at save time, using the Django ORM syntax for that
3. The "add_log_record" method should be removed
4. Port the logutils.CollectionImportHandler to create a CollectionImport for each call to emit()
5. Rename CollectionImport to CollectionImportMessage, and update migrations accordingly.
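To make the proposed shape concrete, here is a minimal sketch of steps 2-4, assuming Django's ORM and the standard logging module. The field types, the choices wiring, and the handler body are assumptions rather than the final implementation; the renamed model from step 5 is used directly.

    import logging

    from django.db import models

    # Assumed choice list built from the standard Python logging levels (step 2).
    LOG_LEVEL_CHOICES = [
        (logging.getLevelName(lvl), logging.getLevelName(lvl))
        for lvl in (logging.DEBUG, logging.INFO, logging.WARNING,
                    logging.ERROR, logging.CRITICAL)
    ]

    class CollectionImportMessage(models.Model):
        # One log record emitted during a collection import (proposed name from step 5).
        message = models.TextField()                    # long string, required
        level = models.CharField(max_length=16, choices=LOG_LEVEL_CHOICES)
        time = models.DateTimeField(auto_now_add=True)  # auto-recorded at save

    class CollectionImportHandler(logging.Handler):
        # Ported handler (step 4): persist one row per emit() call.
        def emit(self, record):
            CollectionImportMessage.objects.create(
                message=self.format(record),
                level=record.levelname,
            )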
Updated by bmbouter about 5 years ago
- Subject changed from As a user, I can provide a 'since' attribute to filter response data for ImportCollection to As a user, I can filter ImportCollection response messages by date to limit the set of messages returned
- Description updated (diff)
Updated by daviddavis about 5 years ago
- Groomed changed from No to Yes
- Sprint set to Sprint 60
Updated by daviddavis about 5 years ago
- Status changed from NEW to ASSIGNED
- Assignee set to daviddavis
Updated by daviddavis about 5 years ago
For the time field, it looks like we currently use the log record's created field[0]. Will using the db record creation timestamp cause different values/ordering?
Updated by bmbouter about 5 years ago
daviddavis wrote:
For the time field, it looks like we currently use the log record's created field[0]. Will using the db record creation timestamp cause different values/ordering?
It likely will change the values but not the ordering (I think). Messages created one after the other would still be time-sorted correctly if they are sorted on that field by default.
The values changing is, I think, OK because two representations of the same moment in time can be translated between each other. I'm making this up, but say the logger uses epoch time while the DB uses a strfmt()-based timestamp: the user would submit the type required by the API and the internals would use its datetime equivalent when querying the DB, so it should "just work".
Maybe I'm missing something though, since I'm not working on the code directly.
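For context on the two timestamp sources being compared here: logging.LogRecord.created is a Unix epoch float set when the record is created, while a Django DateTimeField(auto_now_add=True) stores the moment the row is saved. A small illustration (values are examples only):

    import logging
    from datetime import datetime, timezone

    record = logging.LogRecord(
        name="pulp_ansible", level=logging.INFO, pathname=__file__, lineno=0,
        msg="importing collection", args=(), exc_info=None,
    )
    print(record.created)  # epoch seconds, e.g. 1571757000.123
    # The same moment expressed as the datetime a DateTimeField would store:
    print(datetime.fromtimestamp(record.created, tz=timezone.utc))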
Updated by daviddavis about 5 years ago
Follow up question: do we need a time field since it duplicates the pulp_created field?
Updated by bmbouter about 5 years ago
daviddavis wrote:
Follow up question: do we need a time field since it duplicates the pulp_created field?
I don't think we need it, it would be redundant.
Updated by daviddavis about 5 years ago
This is a bit more complex than I thought. Currently, there is a collection imports endpoint[0]. I thought I could "fake" a CollectionImport in the API by using the collection import Task instead. However, the design is complex and not very straightforward. Also, it would mean that if galaxy ever wanted to store collection import data, they would have no place to store it. I think it might be better to keep the CollectionImport model instead and create a new CollectionImportMessage model.
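A rough sketch of the two-model shape being suggested here, with hypothetical names and fields and assuming Django's ORM; this approach was not the one ultimately merged.

    from django.db import models

    class CollectionImport(models.Model):
        # Kept as the place to store per-import data (e.g. a link to the task).
        task_id = models.UUIDField()  # hypothetical field

    class CollectionImportMessage(models.Model):
        # New model: one row per log message, linked back to its import.
        collection_import = models.ForeignKey(
            CollectionImport, related_name="messages", on_delete=models.CASCADE
        )
        message = models.TextField()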
Updated by daviddavis about 5 years ago
I added a since filter to just filter the messages as they are now. I think this is probably the easiest/simplest solution:
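A minimal, hypothetical sketch of that idea, assuming the stored messages are dicts whose "time" value is the log record's epoch timestamp (per the discussion above); the actual change is in the changeset referenced below.

    from datetime import datetime, timezone

    def messages_since(messages, since_iso):
        # Keep only messages newer than the ISO-8601 `since_iso` cutoff.
        cutoff = datetime.fromisoformat(since_iso).replace(tzinfo=timezone.utc)
        return [
            m for m in messages
            if datetime.fromtimestamp(m["time"], tz=timezone.utc) > cutoff
        ]

    # Example: a client that already saw everything up to 15:30 UTC polls again.
    all_messages = [
        {"level": "INFO", "message": "starting import", "time": 1571757000.0},
        {"level": "INFO", "message": "import finished", "time": 1571759400.0},
    ]
    print(messages_since(all_messages, "2019-10-22T15:30:00"))  # only the newer message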
Added by daviddavis about 5 years ago
Revision pulp_ansible|395ba61ed0d499285e2fe204d51307ca7c9f9251:
Add since filter for collection import messages
fixes #5522 https://pulp.plan.io/issues/5522
Updated by daviddavis about 5 years ago
- Status changed from ASSIGNED to MODIFIED
- % Done changed from 0 to 100
Applied in changeset pulp_ansible|395ba61ed0d499285e2fe204d51307ca7c9f9251.
Updated by bmbouter about 4 years ago
- Status changed from MODIFIED to CLOSED - CURRENTRELEASE