Issue #4989

API bindings for Python use unnecessarily verbose action names

Added by mdellweg almost 5 years ago. Updated over 4 years ago.

Status: CLOSED - CURRENTRELEASE
Priority: Normal
Category: -
Sprint/Milestone: 3.0.0
Start date:
Due date:
Estimated time:
Severity: 2. Medium
Version:
Platform Release:
OS:
Triaged: No
Groomed: No
Sprint Candidate: No
Tags:
Sprint:
Quarter:

Description

The action names are namespaced by the endpoint identifier, which leads to code like:

repository = repositories.repositories_create(repository_data)
sync_response = fileremotes.remotes_file_file_sync(file_remote.href, repository_sync_data)

It is not only unnecessary to namespace the actions by the endpoint identifier, it also prevents writing a generic function that decides whether to call update or create on an endpoint that is passed in as a parameter.

I would expect the above code to look like:

repository = repositories.create(repository_data)
sync_response = fileremotes.sync(file_remote.href, repository_sync_data)
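
For illustration, uniform action names would make the generic function described above trivial to write. A minimal sketch, assuming the proposed short names and an update method that takes the href followed by the data; the helper name create_or_update is hypothetical and not part of the proposal:

def create_or_update(api, data, href=None):
    """Create the resource on any endpoint, or update it if an href is given."""
    # Works for any *Api instance because every endpoint would expose the
    # same short method names (create/update/read/...).
    if href is None:
        return api.create(data)
    return api.update(href, data)

# e.g. create_or_update(repositories, repository_data)
#      create_or_update(fileremotes, remote_data, href=file_remote.href)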
#1

Updated by dkliban@redhat.com almost 5 years ago

  • Status changed from NEW to ASSIGNED
  • Assignee set to dkliban@redhat.com

It's possible to generate bindings like the ones described in this issue. Here is what the diff looks like for a script updated from the current bindings to these potential new bindings.

diff --git a/test_bindings.py b/test_bindings.py
index 4f0685e..748ac6a 100755
--- a/test_bindings.py
+++ b/test_bindings.py
@@ -26,10 +26,10 @@ def monitor_task(task_href):

     """
     completed = ['completed', 'failed', 'canceled']
-    task = tasks.tasks_read(task_href)
+    task = tasks.read(task_href)
     while task.state not in completed:
         sleep(2)
-        task = tasks.tasks_read(task_href)
+        task = tasks.read(task_href)
     pprint(task)
     if task.state == 'completed':
         print("The task was successfful.")
@@ -63,12 +63,12 @@ def upload_file_in_chunks(file_path):
             with NamedTemporaryFile() as file_chunk:
                 file_chunk.write(chunk)
                 if not offset:
-                    upload = uploads.uploads_create(file=file_chunk.name, content_range=content_range)
+                    upload = uploads.start_upload(file=file_chunk.name, content_range=content_range)
                 else:
-                    upload = uploads.uploads_update(upload_href=upload.href, file=file_chunk.name, content_range=content_range)
+                    upload = uploads.continue_upload(upload_href=upload.href, file=file_chunk.name, content_range=content_range)
             offset += chunk_size
             md5hasher.update(chunk)
-        uploads.uploads_finish(upload_href=upload.href, md5=md5hasher.hexdigest())
+        uploads.finish_upload(upload_href=upload.href, md5=md5hasher.hexdigest())
     return upload

 # Configure HTTP basic authorization: basic
@@ -99,59 +99,59 @@ def upload_file_in_chunks(file_path):
     downloaded_file.write(response.content)
     upload = upload_file_in_chunks(downloaded_file.name)
     pprint(upload)
-    artifact = artifacts.artifacts_create(upload=upload.href)
+    artifact = artifacts.create(upload=upload.href)
     pprint(artifact)

 # Create a File Remote
 remote_url = 'https://repos.fedorapeople.org/pulp/pulp/demo_repos/test_file_repo/PULP_MANIFEST'
 remote_data = FileRemote(name='bar15', url=remote_url)
-file_remote = fileremotes.remotes_file_file_create(remote_data)
+file_remote = fileremotes.create(remote_data)
 pprint(file_remote)

 # Create a Repository
 repository_data = Repository(name='foo15')
-repository = repositories.repositories_create(repository_data)
+repository = repositories.create(repository_data)
 pprint(repository)

 # Sync a Repository
 repository_sync_data = RepositorySyncURL(repository=repository.href)
-sync_response = fileremotes.remotes_file_file_sync(file_remote.href, repository_sync_data)
+sync_response = fileremotes.sync(file_remote.href, repository_sync_data)

 pprint(sync_response)

 # Monitor the sync task
 created_resources = monitor_task(sync_response.task)

-repository_version_1 = repositories.repositories_versions_read(created_resources[0])
+repository_version_1 = repositories.read_repository_version(created_resources[0])
 pprint(repository_version_1)

 # Create an artifact from a local file
-artifact = artifacts.artifacts_create(file='test_bindings.py')
+artifact = artifacts.create(file='test_bindings.py')
 pprint(artifact)

 # Create a FileContent from the artifact
 file_data = FileContent(relative_path='foo.tar.gz', artifact=artifact.href)
-filecontent = filecontent.content_file_files_create(file_data)
+filecontent = filecontent.create(file_data)
 pprint(filecontent)

 # Add the new FileContent to a repository version
 repo_version_data = {'add_content_units': [filecontent.href]}
-repo_version_response = repositories.repositories_versions_create(repository.href, repo_version_data)
+repo_version_response = repositories.create_repository_version(repository.href, repo_version_data)

 # Monitor the repo version creation task
 created_resources = monitor_task(repo_version_response.task)

-repository_version_2 = repositories.repositories_versions_read(created_resources[0])
+repository_version_2 = repositories.read_repository_version(created_resources[0])
 pprint(repository_version_2)

 # Create a publication from the latest version of the repository
 publish_data = FilePublication(repository=repository.href)
-publish_response = filepublications.publications_file_file_create(publish_data)
+publish_response = filepublications.create(publish_data)

 # Monitor the publish task
 created_resources = monitor_task(publish_response.task)
 publication_href = created_resources[0]

 distribution_data = FileDistribution(name='baz15', base_path='foo15', publication=publication_href)
-distribution = filedistributions.distributions_file_file_create(distribution_data)
+distribution = filedistributions.create(distribution_data)
 pprint(distribution)

Added by dkliban@redhat.com almost 5 years ago

Revision 72edf72e

Problem: OpenAPI schema contains long Operation Ids

Solution: request OpenAPI schema with 'bindings' query parameter

This patch takes advantage of a new feature in pulpcore that allows generating the OpenAPI schema with short Operation Ids.

Required PR: https://github.com/pulp/pulpcore/pull/178

re: #4989 https://pulp.plan.io/issues/4989
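
As a rough illustration of the behavior this commit relies on, a client can request the schema with the extra query parameter and inspect the shortened Operation Ids. A minimal sketch in Python against a local Pulp instance; the schema path, port, and credentials below are assumptions for illustration, not taken from the commit:

import requests

# Assumed location of the OpenAPI schema on a local Pulp instance.
SCHEMA_URL = 'http://localhost:24817/pulp/api/v3/docs/api.json'

# Passing the 'bindings' query parameter asks pulpcore for short Operation Ids.
response = requests.get(SCHEMA_URL, params={'bindings': ''}, auth=('admin', 'password'))
response.raise_for_status()
schema = response.json()

# List the (now short) operationId of every operation in the schema.
for path, methods in schema.get('paths', {}).items():
    for verb, operation in methods.items():
        if isinstance(operation, dict) and 'operationId' in operation:
            print(verb.upper(), path, '->', operation['operationId'])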

#2

Updated by dkliban@redhat.com almost 5 years ago

After making a few more changes, the diff for the bindings usage looks like this:

diff --git a/test_bindings.py b/test_bindings.py
index 4f0685e..483a54f 100755
--- a/test_bindings.py
+++ b/test_bindings.py
@@ -1,10 +1,10 @@
-from pulpcore.client.pulpcore import (ApiClient as CoreApiClient, Artifact, ArtifactsApi,
-                                      Configuration, Repository,
-                                      RepositoriesApi, TasksApi, UploadsApi)
-from pulpcore.client.pulp_file import (ApiClient as FileApiClient, ContentApi as FileContentApi, FileContent,
-                                       DistributionsApi as FileDistributionsApi, FileDistribution,
-                                       PublicationsApi as FilePublicationsApi,
-                                       RemotesApi as FileRemotesApi, FileRemote, RepositorySyncURL,
+from pulpcore.client.pulpcore import (ApiClient as CoreApiClient, ArtifactsApi, Configuration,
+                                      Repository, RepositoriesApi, RepositoriesVersionsApi,
+                                      TasksApi, UploadsApi)
+from pulpcore.client.pulp_file import (ApiClient as FileApiClient, ContentFilesApi,
+                                       FileContent, DistributionsFileApi,
+                                       FileDistribution, PublicationsFileApi,
+                                       RemotesFileApi, FileRemote, RepositorySyncURL,
                                        FilePublication)
 from pprint import pprint
 from time import sleep
@@ -13,6 +13,7 @@ import os
 import requests
 from tempfile import NamedTemporaryFile

+
 def monitor_task(task_href):
     """Polls the Task API until the task is in a completed state.

@@ -26,10 +27,10 @@ def monitor_task(task_href):

     """
     completed = ['completed', 'failed', 'canceled']
-    task = tasks.tasks_read(task_href)
+    task = tasks.read(task_href)
     while task.state not in completed:
         sleep(2)
-        task = tasks.tasks_read(task_href)
+        task = tasks.read(task_href)
     pprint(task)
     if task.state == 'completed':
         print("The task was successfful.")
@@ -38,6 +39,7 @@ def monitor_task(task_href):
         print("The task did not finish successfully.")
         exit()

+
 def upload_file_in_chunks(file_path):
     """Uploads a file using the Uploads API

@@ -59,18 +61,23 @@ def upload_file_in_chunks(file_path):
             if not chunk:
                 break
             actual_chunk_size = len(chunk)
-            content_range = 'bytes {start}-{end}/{size}'.format(start=offset, end=offset+actual_chunk_size-1, size=size)
+            content_range = 'bytes {start}-{end}/{size}'.format(start=offset,
+                                                                end=offset+actual_chunk_size-1,
+                                                                size=size)
             with NamedTemporaryFile() as file_chunk:
                 file_chunk.write(chunk)
                 if not offset:
-                    upload = uploads.uploads_create(file=file_chunk.name, content_range=content_range)
+                    upload = uploads.start_upload(file=file_chunk.name, content_range=content_range)
                 else:
-                    upload = uploads.uploads_update(upload_href=upload.href, file=file_chunk.name, content_range=content_range)
+                    upload = uploads.continue_upload(upload_href=upload.href,
+                                                     file=file_chunk.name,
+                                                     content_range=content_range)
             offset += chunk_size
             md5hasher.update(chunk)
-        uploads.uploads_finish(upload_href=upload.href, md5=md5hasher.hexdigest())
+        uploads.finish_upload(upload_href=upload.href, md5=md5hasher.hexdigest())
     return upload

+
 # Configure HTTP basic authorization: basic
 configuration = Configuration()
 configuration.username = 'admin'
@@ -83,10 +90,11 @@ file_client = FileApiClient(configuration)
 # Create api clients for all resource types
 artifacts = ArtifactsApi(core_client)
 repositories = RepositoriesApi(core_client)
-filecontent = FileContentApi(file_client)
-filedistributions = FileDistributionsApi(core_client)
-filepublications = FilePublicationsApi(file_client)
-fileremotes = FileRemotesApi(file_client)
+repo_versions = RepositoriesVersionsApi(core_client)
+filecontent = ContentFilesApi(file_client)
+filedistributions = DistributionsFileApi(core_client)
+filepublications = PublicationsFileApi(file_client)
+fileremotes = RemotesFileApi(file_client)
 tasks = TasksApi(core_client)
 uploads = UploadsApi(core_client)

@@ -94,64 +102,65 @@ uploads = UploadsApi(core_client)
 # Test creating an Artifact from a 1mb file uploaded in 200kb chunks
 with NamedTemporaryFile() as downloaded_file:
     response = requests.get(
-        'https://repos.fedorapeople.org/repos/pulp/pulp/demo_repos/test_bandwidth_repo/pulp-large_1mb_test-packageA-0.1.1-1.fc14.noarch.rpm')
+        'https://repos.fedorapeople.org/repos/pulp/pulp/demo_repos/test_bandwidth_repo/'
+        'pulp-large_1mb_test-packageA-0.1.1-1.fc14.noarch.rpm')
     response.raise_for_status()
     downloaded_file.write(response.content)
     upload = upload_file_in_chunks(downloaded_file.name)
     pprint(upload)
-    artifact = artifacts.artifacts_create(upload=upload.href)
+    artifact = artifacts.create(upload=upload.href)
     pprint(artifact)

 # Create a File Remote
 remote_url = 'https://repos.fedorapeople.org/pulp/pulp/demo_repos/test_file_repo/PULP_MANIFEST'
 remote_data = FileRemote(name='bar15', url=remote_url)
-file_remote = fileremotes.remotes_file_file_create(remote_data)
+file_remote = fileremotes.create(remote_data)
 pprint(file_remote)

 # Create a Repository
 repository_data = Repository(name='foo15')
-repository = repositories.repositories_create(repository_data)
+repository = repositories.create(repository_data)
 pprint(repository)

 # Sync a Repository
 repository_sync_data = RepositorySyncURL(repository=repository.href)
-sync_response = fileremotes.remotes_file_file_sync(file_remote.href, repository_sync_data)
+sync_response = fileremotes.sync(file_remote.href, repository_sync_data)

 pprint(sync_response)

 # Monitor the sync task
 created_resources = monitor_task(sync_response.task)

-repository_version_1 = repositories.repositories_versions_read(created_resources[0])
+repository_version_1 = repo_versions.read(created_resources[0])
 pprint(repository_version_1)

 # Create an artifact from a local file
-artifact = artifacts.artifacts_create(file='test_bindings.py')
+artifact = artifacts.create(file='test_bindings.py')
 pprint(artifact)

 # Create a FileContent from the artifact
 file_data = FileContent(relative_path='foo.tar.gz', artifact=artifact.href)
-filecontent = filecontent.content_file_files_create(file_data)
+filecontent = filecontent.create(file_data)
 pprint(filecontent)

 # Add the new FileContent to a repository version
 repo_version_data = {'add_content_units': [filecontent.href]}
-repo_version_response = repositories.repositories_versions_create(repository.href, repo_version_data)
+repo_version_response = repo_versions.create(repository.href, repo_version_data)

 # Monitor the repo version creation task
 created_resources = monitor_task(repo_version_response.task)

-repository_version_2 = repositories.repositories_versions_read(created_resources[0])
+repository_version_2 = repo_versions.read(created_resources[0])
 pprint(repository_version_2)

 # Create a publication from the latest version of the repository
 publish_data = FilePublication(repository=repository.href)
-publish_response = filepublications.publications_file_file_create(publish_data)
+publish_response = filepublications.create(publish_data)

 # Monitor the publish task
 created_resources = monitor_task(publish_response.task)
 publication_href = created_resources[0]

 distribution_data = FileDistribution(name='baz15', base_path='foo15', publication=publication_href)
-distribution = filedistributions.distributions_file_file_create(distribution_data)
+distribution = filedistributions.create(distribution_data)
 pprint(distribution)

Added by dkliban@redhat.com almost 5 years ago

Revision 7748ffde

Problem: Pulp's OpenAPI schema has verbose Operation Ids

Solution: add ability to request shorter operation ids in the schema

This patch adds the ability to specify a 'bindings' query parameter when requesting the OpenAPI schema. When it is specified, a short Operation Id is used for each operation.

This patch also changes how the OpenAPI schema operations are grouped together. Each resource now has its own tag associated with it. This results in better REST API docs and a better experience for users of the bindings.

The tag for a resource is determined from its URL. The plugin name is omitted from the tag. For example:

URL: /pulp/api/v3/repositories/ Tag: Repositories

URL: /pulp/api/v3/repositories//versions/ Tag: Repositories: Versions

URL: /pulp/api/v3/content/file/files/ Tag: Content: Files

URL: /pulp/api/v3/remotes/file/file/ Tag: Remotes: File

Required PR: https://github.com/pulp/pulp-openapi-generator/pull/18

re: #4989 https://pulp.plan.io/issues/4989
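
For illustration, the tags listed above line up with the generated Python client classes that appear in the diff in comment #2: drop the colon and spaces from the tag and append 'Api'. This mirrors the observed names only and is not the generator's actual code:

# Observed tag -> class correspondence, taken from the diff in comment #2.
TAG_TO_CLASS = {
    'Repositories': 'RepositoriesApi',                    # /pulp/api/v3/repositories/
    'Repositories: Versions': 'RepositoriesVersionsApi',  # /pulp/api/v3/repositories//versions/
    'Content: Files': 'ContentFilesApi',                  # /pulp/api/v3/content/file/files/
    'Remotes: File': 'RemotesFileApi',                    # /pulp/api/v3/remotes/file/file/
}

def class_name_for(tag):
    """Illustrative rule: strip ': ' from the tag and append 'Api'."""
    return tag.replace(': ', '') + 'Api'

assert all(class_name_for(tag) == cls for tag, cls in TAG_TO_CLASS.items())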

#3

Updated by dkliban@redhat.com almost 5 years ago

  • Status changed from ASSIGNED to MODIFIED
#4

Updated by bmbouter over 4 years ago

  • Sprint/Milestone set to 3.0.0
#5

Updated by bmbouter over 4 years ago

  • Status changed from MODIFIED to CLOSED - CURRENTRELEASE
