
Conversation


@wschurman wschurman commented Nov 26, 2025

Why

We've fallen too far behind on lru-cache. We previously decided to lock it at 6.x in #170 because our use case couldn't efficiently pre-allocate all the data structures ahead of time.

But while reading the thread for that decision, I noticed the author also maintains a ttl-cache: isaacs/node-lru-cache#208 (comment). It is much closer to our needs.

We essentially simulate an LRU cache by updating the TTL on each get: https://github.com/isaacs/ttlcache?tab=readme-ov-file#cachegetkey-updateageonget-checkageonget-ttl--

Because our TTL is short and we need to store N caches, ttl-cache is the better fit for our use case.
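To make the "LRU via TTL refresh" idea concrete, here is a minimal Map-based sketch of the technique: every read pushes the entry's expiry forward, so frequently-read entries survive while idle ones lapse. This is a hypothetical illustration of the behavior described above (ttlcache's `get(key, { updateAgeOnGet: true })`), not the actual ttlcache internals; the class name and injectable clock are illustrative.

```typescript
// Hypothetical sketch: an expiring cache where a read refreshes the
// entry's TTL, approximating LRU eviction for hot keys.
class TtlRefreshCache<K, V> {
  private entries = new Map<K, { value: V; expiresAt: number }>();

  // `now` is injectable so the behavior can be tested deterministically.
  constructor(
    private ttlMs: number,
    private now: () => number = Date.now,
  ) {}

  set(key: K, value: V): void {
    this.entries.set(key, { value, expiresAt: this.now() + this.ttlMs });
  }

  // Mirrors ttlcache's updateAgeOnGet option: a hit resets the entry's
  // remaining TTL back to the full ttlMs.
  get(key: K): V | undefined {
    const entry = this.entries.get(key);
    if (entry === undefined) return undefined;
    if (entry.expiresAt <= this.now()) {
      this.entries.delete(key); // lazily evict expired entries on read
      return undefined;
    }
    entry.expiresAt = this.now() + this.ttlMs; // refresh on read
    return entry.value;
  }
}
```

With a short TTL, entries that stop being read expire quickly on their own, which is why pre-allocating LRU bookkeeping structures (the concern from #170) is unnecessary here.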

Ref: ENG-18193

How

Switch to ttl-cache, but only as a dev dependency, and make the exported API an interface so that consumers can choose between the old lru-cache and ttl-cache.
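A rough sketch of what such an exported interface could look like; the names here are illustrative assumptions, not the package's actual exports. The point is that the library depends only on the interface, and consumers supply whichever backing store they prefer (lru-cache, ttlcache, or a plain Map, all of which expose compatible get/set/delete methods).

```typescript
// Hypothetical shape of the exported local-cache interface.
interface ILocalCache<V> {
  get(key: string): V | undefined;
  set(key: string, value: V): void;
  delete(key: string): boolean;
  clear(): void;
}

// Example consumer-provided implementation backed by a plain Map.
// An lru-cache or ttlcache instance could be adapted the same way.
class MapLocalCache<V> implements ILocalCache<V> {
  private map = new Map<string, V>();
  get(key: string): V | undefined {
    return this.map.get(key);
  }
  set(key: string, value: V): void {
    this.map.set(key, value);
  }
  delete(key: string): boolean {
    return this.map.delete(key);
  }
  clear(): void {
    this.map.clear();
  }
}
```

Keeping the concrete cache as a dev dependency means the library itself stays backend-agnostic and consumers pull in only the cache implementation they actually use.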

Test Plan

Run all tests.


wschurman commented Nov 26, 2025

This stack of pull requests is managed by Graphite. Learn more about stacking.


linear bot commented Nov 26, 2025


codecov bot commented Nov 26, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 100.00%. Comparing base (9140938) to head (3d3c5a4).
⚠️ Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff            @@
##              main      #322   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files           87        87           
  Lines        12174     12187   +13     
  Branches       619       637   +18     
=========================================
+ Hits         12174     12187   +13     
Flag          Coverage            Δ
integration   7.52%  <0.00%>      (-0.01%) ⬇️
unittest      96.96% <100.00%>    (+<0.01%) ⬆️

Flags with carried forward coverage won't be shown.


@wschurman wschurman force-pushed the wschurman/11-26-chore_upgrade_lru-cache branch 2 times, most recently from 5e5347a to 106aab4 Compare November 26, 2025 23:31
@wschurman wschurman changed the title chore: upgrade lru-cache chore: switch to ttl-cache Nov 26, 2025
@wschurman wschurman force-pushed the wschurman/11-26-chore_upgrade_lru-cache branch from 106aab4 to 5777c85 Compare December 1, 2025 19:21
@wschurman wschurman changed the title chore: switch to ttl-cache feat: switch local cache to an interface Dec 1, 2025
@wschurman wschurman force-pushed the wschurman/11-26-chore_upgrade_lru-cache branch from 5777c85 to 3d3c5a4 Compare December 1, 2025 19:46
@wschurman wschurman marked this pull request as ready for review December 1, 2025 19:47
@wschurman wschurman requested a review from quinlanj December 1, 2025 19:47
@wschurman wschurman merged commit c453726 into main Dec 9, 2025
4 checks passed
@wschurman wschurman deleted the wschurman/11-26-chore_upgrade_lru-cache branch December 9, 2025 01:40
