The current OpenAPI schema describes the two valid enum variants for `prompt_cache_retention` as `in-memory` and `24h`:
```yaml
prompt_cache_retention:
  anyOf:
    - type: string
      enum:
        - in-memory
        - 24h
      description: >
        The retention policy for the prompt cache. Set to `24h` to enable extended prompt caching,
        which keeps cached prefixes active for longer, up to a maximum of 24 hours. [Learn
        more](https://platform.openai.com/docs/guides/prompt-caching#prompt-cache-retention).
    - type: 'null'
```
The docs at https://platform.openai.com/docs/guides/prompt-caching#configure-per-request, however, specify the valid options as `24h` and `in_memory` (note the underscore instead of the dash):

> If you don’t specify a retention policy, the default is `in_memory`. Allowed values are `in_memory` and `24h`.
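As a quick standalone illustration (a sketch comparing the two published value sets, not part of either spec), checking the documented values against the schema's enum shows the only divergence is the `in-memory`/`in_memory` spelling:

```python
# Values as listed in the OpenAPI schema's enum.
schema_enum = ["in-memory", "24h"]

# Values as listed in the prompt-caching docs.
documented = ["in_memory", "24h"]

# Documented values that a schema-driven validator would reject.
mismatched = [v for v in documented if v not in schema_enum]
print(mismatched)  # ['in_memory']
```

In practice this means a client generated strictly from the schema would consider the documented default spelling invalid, and vice versa.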