
Conversation

Contributor

@mshr-h mshr-h commented Dec 14, 2025

Passing an invalid memory_max_entries to Cache can crash inside cachetools with unclear errors. This PR validates the value upfront and fails with a clearer message.

Repros:

```python
from dataclasses import dataclass
from dspy.clients.cache import Cache

@dataclass
class DummyResponse:
    message: str
    usage: dict

if __name__ == "__main__":
    memory_cache = Cache(
        enable_disk_cache=False,
        enable_memory_cache=True,
        disk_cache_dir="",
        disk_size_limit_bytes=0,
        memory_max_entries=-1,  # negative max entries
    )

    request = {"prompt": "Hello", "model": "openai/gpt-4o-mini", "temperature": 0.7}
    value = DummyResponse(message="This is a test response", usage={"prompt_tokens": 10, "completion_tokens": 20})
    memory_cache.put(request, value)  # -> ValueError: value too large
```

```python
from dataclasses import dataclass
from dspy.clients.cache import Cache

@dataclass
class DummyResponse:
    message: str
    usage: dict

if __name__ == "__main__":
    memory_cache = Cache(
        enable_disk_cache=False,
        enable_memory_cache=True,
        disk_cache_dir="",
        disk_size_limit_bytes=0,
        memory_max_entries=None,  # None is not a valid entry count
    )

    request = {"prompt": "Hello", "model": "openai/gpt-4o-mini", "temperature": 0.7}
    value = DummyResponse(message="This is a test response", usage={"prompt_tokens": 10, "completion_tokens": 20})
    memory_cache.put(request, value)  # -> TypeError: '>' not supported between instances of 'int' and 'NoneType'
```
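
The fix described above can be sketched as a fail-fast check before the cache is built. This is a minimal sketch only, assuming a standalone helper name (`validate_memory_max_entries`); the actual validation lives inside `dspy.clients.cache.Cache` and may differ in detail:

```python
import math

def validate_memory_max_entries(value):
    """Hypothetical sketch: reject invalid limits upfront with a clear message,
    instead of letting cachetools raise a cryptic ValueError/TypeError later."""
    # bool is a subclass of int, so exclude it explicitly
    if isinstance(value, bool) or not isinstance(value, (int, float)):
        raise ValueError(f"memory_max_entries must be a positive number, got {value!r}")
    if value <= 0:
        raise ValueError(f"memory_max_entries must be positive, got {value!r}")
    return value

validate_memory_max_entries(1_000_000)  # accepted: finite limit
validate_memory_max_entries(math.inf)   # accepted: unbounded cache (float)
for bad in (None, -1, 0):
    try:
        validate_memory_max_entries(bad)
    except ValueError as e:
        print(e)
```

Validating at construction time surfaces the bad argument where it was passed, rather than on the first `put` call.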

@mshr-h mshr-h changed the title from "Fix memory_max_entries enforce positive integer constraint" to "Fix: enforce positive memory_max_entries for in-memory cache" Dec 14, 2025
Collaborator

@chenmoneygithub chenmoneygithub left a comment


Thanks for the PR! Two minor comments, then we should be good to go!

mshr-h and others added 2 commits December 22, 2025 11:28
Co-authored-by: Chen Qian <qianchen94era@gmail.com>
@mshr-h mshr-h force-pushed the fix-memory_max_entries branch from 0f7a312 to c0a85cc Compare December 22, 2025 03:08
Collaborator

@chenmoneygithub chenmoneygithub left a comment


Thanks for addressing the comments! LGTM with 2 minor comments.

```diff
 disk_cache_dir: str | None = DISK_CACHE_DIR,
 disk_size_limit_bytes: int | None = DISK_CACHE_LIMIT,
-memory_max_entries: int | None = 1000000,
+memory_max_entries: int | float = 1000000,
```
Contributor Author

@mshr-h mshr-h Dec 22, 2025


FYI: math.inf is a float, so the annotation should also accept it.

Collaborator


Yes, this definitely reads weird as a type hint, but it sticks with cachetools' convention, so it's acceptable.

Contributor Author


OK, reverted. @chenmoneygithub

Contributor Author

@mshr-h mshr-h Dec 22, 2025


nit: Maybe plain `float` is better than `int | float` or `int`? That's what the typeshed stubs for cachetools use:
https://github.com/python/typeshed/blob/main/stubs/cachetools/cachetools/__init__.pyi#L20

```python
def __init__(self, maxsize: float, getsizeof: None = None) -> None: ...
```
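
For context on the annotation discussion above, `math.inf` really is a plain `float`, so annotating the limit as `float` covers the unbounded case, and finite `int` limits remain valid at runtime (PEP 484 treats `int` as acceptable where `float` is expected):

```python
import math

# math.inf is a float, which is why cachetools' stubs annotate maxsize as float
print(type(math.inf))        # <class 'float'>
print(math.inf > 10**100)    # True: larger than any finite entry count
print(isinstance(1_000_000, int))  # True: finite limits can stay plain ints
```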

Contributor Author

mshr-h commented Dec 22, 2025

@chenmoneygithub
Thank you for your review! I updated the code accordingly.

@chenmoneygithub chenmoneygithub merged commit 252878d into stanfordnlp:main Dec 23, 2025
12 checks passed