Story #3421

closed

As a plugin writer I have HTTPDownloaders which provide exponential backoff for HTTP 429 errors

Added by daviddavis about 6 years ago. Updated over 4 years ago.

Status: CLOSED - CURRENTRELEASE
Priority: Normal
% Done: 100%
Groomed: Yes
Sprint Candidate: No
Sprint: Sprint 35

Description

When HTTP downloads fail with a 429 response (Too Many Requests), the HttpDownloader could wait and retry, possibly succeeding later. This would allow downloads to complete over time instead of the first few succeeding and then all subsequent requests being rejected. For example, this is how GitHub rate-limits you if you try to download too many files at once.

Use Cases

  • the downloaders will back off exponentially and retry when an HTTP 429 response is encountered
  • As a plugin writer, I can disable the exponential backoff-and-retry feature

Implementation Options

Having a coordinated rate-limiting implementation is straightforward, but it would then need to be configured, probably site by site, by either users or plugin writers. That is a pain and prone to many problems. Instead, a simple exponential backoff behavior can cause rejected requests to retry automatically, up to 10 times, when the server is overloaded.

We could use something like backoff-async and pass those options through to the plugin writer, but I'm hoping for something really simple, hence the hardcoded 10 tries.
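The behavior described above could be sketched roughly like this (the names `download_with_backoff`, `retry_backoff`, and `TooManyRequests` are illustrative, not Pulp's actual API; the real downloader is async, but the retry logic is the same):

```python
import time


class TooManyRequests(Exception):
    """Stand-in for an HTTP 429 response."""


def download_with_backoff(fetch, retry_backoff=True, max_tries=10, sleep=time.sleep):
    """Call fetch(); on a 429, wait exponentially longer and try again.

    retry_backoff=False lets a plugin writer opt out of the feature;
    after max_tries failures the last exception propagates.
    """
    for attempt in range(max_tries):
        try:
            return fetch()
        except TooManyRequests:
            if not retry_backoff or attempt == max_tries - 1:
                raise
            sleep(2 ** attempt)  # 1 s, 2 s, 4 s, ... before each retry
```

The `sleep` parameter is injectable only so the behavior is testable; in practice the delay would be an awaited `asyncio.sleep`.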

When to finally fail?

After 10 tries. If it backs off exponentially, it would back off for roughly 17 minutes in total.
