fix: correct prompt_cache_retention literal type from "in-memory" to "in_memory"#3003

Open
alvinttang wants to merge 1 commit into openai:main from alvinttang:fix/prompt-cache-retention-type

Conversation

@alvinttang

Summary

Fixes #2883.

The `prompt_cache_retention` parameter's type declaration used `"in-memory"` (hyphen) as a valid `Literal` value, but the OpenAI API expects `"in_memory"` (underscore). Passing `"in-memory"` results in a 400 error from the API.

Changes

  • src/openai/types/chat/completion_create_params.py — `Literal["in-memory", "24h"]` → `Literal["in_memory", "24h"]`
  • src/openai/types/responses/response_create_params.py — same fix
  • src/openai/types/responses/response.py — same fix
  • src/openai/types/responses/responses_client_event_param.py — same fix
  • src/openai/types/responses/responses_client_event.py — same fix
  • src/openai/resources/responses/responses.py — all overloads updated
  • src/openai/resources/chat/completions/completions.py — all overloads updated
  • tests/api_resources/test_responses.py — test values updated to match
  • tests/api_resources/chat/test_completions.py — test values updated to match
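
The corrected annotation amounts to swapping one literal member. A minimal sketch of the idea (not the SDK's actual module; `PromptCacheRetention` and `is_valid_retention` are hypothetical names used for illustration):

```python
from typing import Literal, get_args

# Hypothetical alias mirroring the corrected Literal in completion_create_params.py
PromptCacheRetention = Literal["in_memory", "24h"]

def is_valid_retention(value: str) -> bool:
    """Check a value against the allowed literal members."""
    return value in get_args(PromptCacheRetention)

print(is_valid_retention("in_memory"))  # underscore form: accepted
print(is_valid_retention("in-memory"))  # hyphen form: not a member
```

Note that `Literal` members are only enforced by static type checkers; at runtime the string is passed through to the API, which is why the hyphenated value surfaced as a 400 rather than a local error.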

Reproduction

```python
from openai import OpenAI

client = OpenAI()

# Before fix: type checkers accepted "in-memory", but the API rejects it
response = client.responses.create(
    model="gpt-4o",
    input="hello",
    prompt_cache_key="my-key",
    prompt_cache_retention="in-memory",  # wrong: API returns 400
)

# After fix: use "in_memory" (underscore)
response = client.responses.create(
    model="gpt-4o",
    input="hello",
    prompt_cache_key="my-key",
    prompt_cache_retention="in_memory",  # correct
)
```

🤖 Generated with Claude Code

The Literal type for `prompt_cache_retention` used "in-memory" (hyphen)
but the OpenAI API expects "in_memory" (underscore), causing 400 errors.

Fixes openai#2883

Co-Authored-By: Claude Sonnet 4.6 <[email protected]>
@alvinttang alvinttang requested a review from a team as a code owner March 24, 2026 05:11

Development

Successfully merging this pull request may close these issues.

prompt_cache_retention type declares "in-memory" but API expects "in_memory"