feat: add backoff_factor and max_backoff for customizable retry backoff#3045

Open
crawfordxx wants to merge 1 commit into openai:main from crawfordxx:feat-customizable-backoff
Conversation

@crawfordxx

Summary

Adds two new optional parameters to the OpenAI and AsyncOpenAI client constructors:

  • backoff_factor (float | None) — initial delay, in seconds, for the exponential backoff between retries (default: 0.5, matching current behavior)
  • max_backoff (float | None) — maximum wait time, in seconds, between retries (default: 8.0, matching current behavior)

These parameters are threaded through BaseClient, SyncAPIClient, AsyncAPIClient, and the copy()/with_options() methods so they work consistently everywhere.
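For context, the delay between retries grows exponentially from the initial factor up to the cap. A minimal sketch of how such a computation could look, assuming the two parameters from this PR (the helper name and the jitter details are illustrative, not the SDK's exact code):

```python
import random

def calculate_retry_timeout(
    remaining_retries: int,
    max_retries: int = 2,
    backoff_factor: float = 0.5,  # default matches INITIAL_RETRY_DELAY
    max_backoff: float = 8.0,     # default matches MAX_RETRY_DELAY
) -> float:
    nb_retries = max_retries - remaining_retries
    # Exponential backoff: backoff_factor * 2^n, capped at max_backoff
    sleep_seconds = min(backoff_factor * pow(2.0, nb_retries), max_backoff)
    # Apply up to 25% jitter so concurrent clients do not retry in lockstep
    jitter = 1 - 0.25 * random.random()
    return sleep_seconds * jitter
```

With backoff_factor=2.0 and max_backoff=60.0, successive waits are roughly 2s, 4s, 8s, 16s, ... capped at 60s (minus jitter).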

Usage

# Default behavior (unchanged)
client = OpenAI()

# Longer backoff for rate-limited workloads
client = OpenAI(
    max_retries=5,
    backoff_factor=2.0,   # start at 2s instead of 0.5s
    max_backoff=60.0,     # allow up to 60s between retries
)

# Also works with with_options/copy
slow_client = client.with_options(backoff_factor=5.0, max_backoff=120.0)

Backward compatibility

When backoff_factor and max_backoff are not provided (the default), the existing constants INITIAL_RETRY_DELAY (0.5) and MAX_RETRY_DELAY (8.0) are used, so there is zero behavior change for existing users.

Closes #1102

feat: add backoff_factor and max_backoff for customizable retry backoff

Allow users to customize the exponential backoff behavior when using
max_retries. The new backoff_factor parameter controls the initial delay
multiplier (default 0.5s) and max_backoff controls the maximum wait time
between retries (default 8s). Existing behavior is fully preserved when
parameters are not provided.

Closes openai#1102
@crawfordxx crawfordxx requested a review from a team as a code owner April 1, 2026 15:10

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: d5d2ca507a

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

Comment on lines +423 to +424
backoff_factor=backoff_factor if backoff_factor is not None else self._backoff_factor,
max_backoff=max_backoff if max_backoff is not None else self._max_backoff,
P1: Avoid forwarding backoff kwargs into incompatible subclasses

OpenAI.copy() now always forwards backoff_factor/max_backoff into self.__class__(...), which breaks subclasses whose __init__ signatures do not accept those keywords. In this repo, AzureOpenAI.copy() and AsyncAzureOpenAI.copy() delegate to super().copy(), so calling copy()/with_options() on either Azure client now raises TypeError for unexpected keyword arguments, regressing an existing workflow for Azure users.
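One way to address this review comment would be to forward the new kwargs only when the subclass constructor actually accepts them. A sketch of that idea (this is a suggested fix, not code from the PR):

```python
import inspect

def build_copy_kwargs(cls, backoff_factor, max_backoff):
    """Return only the backoff kwargs that cls.__init__ can accept,
    so copy()/with_options() does not raise TypeError on subclasses
    (e.g. Azure clients) whose signatures lack these parameters."""
    params = inspect.signature(cls.__init__).parameters
    kwargs = {}
    if "backoff_factor" in params:
        kwargs["backoff_factor"] = backoff_factor
    if "max_backoff" in params:
        kwargs["max_backoff"] = max_backoff
    return kwargs
```

Alternatively, the Azure clients' __init__ signatures could be extended to accept and forward the same parameters.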


Development

Successfully merging this pull request may close these issues.

Functionality to adjust exponential backoff associated with max_retries option